Sample records for rule-based inference system

  1. Algorithm Optimally Orders Forward-Chaining Inference Rules

    NASA Technical Reports Server (NTRS)

    James, Mark

    2008-01-01

    People typically develop knowledge bases in a somewhat ad hoc manner, incrementally adding rules with no specific organization. This often results in very inefficient execution of those rules, since they are frequently order sensitive. This is relevant to tasks like Deep Space Network operations in that it allows a knowledge base to be developed incrementally and then ordered automatically for efficiency. Although data-flow analysis was first developed for use in compilers to produce optimal code sequences, its usefulness is now recognized in many software systems, including knowledge-based systems. However, this approach of exhaustively computing data-flow information cannot be applied directly to inference systems because of the ubiquitous execution of the rules. An algorithm is presented that efficiently performs a complete producer/consumer analysis for each antecedent and consequent clause in a knowledge base and optimally orders the rules such that independent inference cycle executions are minimized, resulting in significantly faster execution. This algorithm was integrated into the JPL tool Spacecraft Health Inference Engine (SHINE) for verification, where it yielded a significant reduction in inference cycles for what was previously considered an ordered knowledge base. For a knowledge base that is completely unordered, the improvement is much greater.
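
    A minimal sketch of the producer/consumer ordering idea (not the paper's actual algorithm): treat each rule's antecedents as consumed facts and its consequents as produced facts, then order the rules so producers precede consumers. The rules and facts below are hypothetical, and real rule bases may contain cycles that this naive topological sort cannot handle.

      from graphlib import TopologicalSorter

      # Hypothetical rules: name -> (facts consumed, facts produced)
      rules = {
          "r1": ({"b"}, {"c"}),
          "r2": ({"a"}, {"b"}),
          "r3": ({"c"}, {"d"}),
      }

      # A rule depends on every rule that produces one of its consumed facts
      deps = {name: set() for name in rules}
      for consumer, (needs, _) in rules.items():
          for producer, (_, makes) in rules.items():
              if producer != consumer and makes & needs:
                  deps[consumer].add(producer)

      # Producers come first, so a single forward pass fires each rule once
      print(list(TopologicalSorter(deps).static_order()))  # ['r2', 'r1', 'r3']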

  2. How to select combination operators for fuzzy expert systems using CRI

    NASA Technical Reports Server (NTRS)

    Turksen, I. B.; Tian, Y.

    1992-01-01

    A method to select combination operators for fuzzy expert systems using the Compositional Rule of Inference (CRI) is proposed. First, fuzzy inference processes based on CRI are classified into three categories in terms of their inference results: the Expansion Type Inference, the Reduction Type Inference, and Other Type Inferences. Further, implication operators under Sup-T composition are classified as the Expansion Type Operator, the Reduction Type Operator, and the Other Type Operators. Finally, the combination of rules or their consequences is investigated for inference processes based on CRI.
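
    As a worked illustration of CRI with the Sup-T composition instantiated as sup-min (all membership values below are invented):

      # B'(y) = sup over x of min(A'(x), R(x, y))
      A_prime = {"x1": 0.2, "x2": 0.8}            # fuzzy input A'
      R = {("x1", "y1"): 0.5, ("x1", "y2"): 1.0,  # fuzzy relation R(x, y)
           ("x2", "y1"): 0.9, ("x2", "y2"): 0.3}

      B_prime = {}
      for (x, y), r in R.items():
          B_prime[y] = max(B_prime.get(y, 0.0), min(A_prime[x], r))

      print(B_prime)  # {'y1': 0.8, 'y2': 0.3}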

  3. Common-Sense Rule Inference

    NASA Astrophysics Data System (ADS)

    Lombardi, Ilaria; Console, Luca

    In the paper we show how rule-based inference can be made more flexible by exploiting semantic information associated with the concepts involved in the rules. We introduce flexible forms of common sense reasoning in which whenever no rule applies to a given situation, the inference engine can fire rules that apply to more general or to similar situations. This can be obtained by defining new forms of match between rules and the facts in the working memory and new forms of conflict resolution. We claim that in this way we can overcome some of the brittleness problems that are common in rule-based systems.
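
    One way to picture the fallback match (a hypothetical sketch, not the authors' engine): when no rule applies to a fact, retry with progressively more general concepts from a taxonomy.

      # Hypothetical taxonomy: concept -> its more general parent concept
      parent = {"espresso": "coffee", "coffee": "hot_drink"}

      # Hypothetical rules keyed by the concept they apply to
      rules = {"hot_drink": "serve in a heat-resistant cup"}

      def infer(concept):
          # Climb the taxonomy until some rule applies (common-sense fallback)
          while concept is not None:
              if concept in rules:
                  return concept, rules[concept]
              concept = parent.get(concept)
          return None

      print(infer("espresso"))  # ('hot_drink', 'serve in a heat-resistant cup')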

  4. Recommendation System Based On Association Rules For Distributed E-Learning Management Systems

    NASA Astrophysics Data System (ADS)

    Mihai, Gabroveanu

    2015-09-01

    Traditional Learning Management Systems are installed on a single server where learning materials and user data are kept. To increase performance, the Learning Management System can be installed on multiple servers; learning materials and user data can be distributed across these servers, yielding a Distributed Learning Management System. This paper proposes a prototype recommendation system based on association rules for a Distributed Learning Management System. Information from LMS databases is analyzed using distributed data mining algorithms to extract association rules. The extracted rules are then used as inference rules to provide personalized recommendations. The quality of the recommendations is improved because the rules used to make the inferences are more accurate, since they aggregate knowledge from all e-Learning systems included in the Distributed Learning Management System.
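
    The final recommendation step might look like the following toy sketch, assuming the association rules (antecedent, consequent, confidence) have already been mined; the distributed mining itself is outside this snippet.

      # Hypothetical mined association rules
      rules = [
          ({"lesson_algebra"}, "lesson_calculus", 0.9),
          ({"lesson_algebra", "quiz_1"}, "lesson_geometry", 0.7),
      ]

      def recommend(history, min_conf=0.6):
          # Fire every rule whose antecedent is contained in the learner's history
          hits = [(c, conf) for ante, c, conf in rules
                  if ante <= history and c not in history and conf >= min_conf]
          return sorted(hits, key=lambda h: -h[1])

      print(recommend({"lesson_algebra", "quiz_1"}))
      # [('lesson_calculus', 0.9), ('lesson_geometry', 0.7)]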

  5. A Logical Framework for Service Migration Based Survivability

    DTIC Science & Technology

    2016-06-24

    platforms; Service Migration Strategy Fuzzy Inference System Knowledge Base Fuzzy rules representing domain expert knowledge about implications of...service migration strategy. Our approach uses expert knowledge as linguistic reasoning rules and takes service programs damage assessment, service...programs complexity, and available network capability as input. The fuzzy inference system includes four components as shown in Figure 5: (1) a knowledge

  6. On implementing clinical decision support: achieving scalability and maintainability by combining business rules and ontologies.

    PubMed

    Kashyap, Vipul; Morales, Alfredo; Hongsermeier, Tonya

    2006-01-01

    We present an approach and architecture for implementing scalable and maintainable clinical decision support at Partners HealthCare System. The architecture integrates a business rules engine that executes declarative if-then rules stored in a rule base referencing objects and methods in a business object model. The rules engine executes object methods by invoking services implemented on the clinical data repository. Specialized inferences that support the classification of data and instances into classes are identified, and an approach to implementing these inferences using an OWL-based ontology engine is presented. Alternative representations of these specialized inferences as if-then rules or OWL axioms are explored, and their impact on the scalability and maintainability of the system is assessed. Architectural alternatives for integrating clinical decision support functionality with the invoking application and the underlying clinical data repository, and their associated trade-offs, are also discussed.

  7. On Inference Rules of Logic-Based Information Retrieval Systems.

    ERIC Educational Resources Information Center

    Chen, Patrick Shicheng

    1994-01-01

    Discussion of relevance and the needs of the users in information retrieval focuses on a deductive object-oriented approach and suggests eight inference rules for the deduction. Highlights include characteristics of a deductive object-oriented system, database and data modeling language, implementation, and user interface. (Contains 24…

  8. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  9. Intelligent control for modeling of real-time reservoir operation, part II: artificial neural network with operating rule curves

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Ting; Chang, Li-Chiu; Chang, Fi-John

    2005-04-01

    To bridge the gap between academic research and actual operation, we propose an intelligent control system for reservoir operation. The methodology includes two major processes: knowledge acquisition and implementation, and the inference system. In this study, a genetic algorithm (GA) and a fuzzy rule base (FRB) are used to extract knowledge from the historical inflow data with a design objective function and from the operating rule curves, respectively. The adaptive network-based fuzzy inference system (ANFIS) is then used to implement the knowledge, to create the fuzzy inference system, and to estimate the optimal reservoir operation. To investigate its applicability and practicability, the Shihmen reservoir, Taiwan, is used as a case study. For the purpose of comparison, a simulation of the currently used M-5 operating rule curve is also performed. The results demonstrate that (1) the GA is an efficient way to search for optimal input-output patterns, (2) the FRB can extract the knowledge from the operating rule curves, and (3) the ANFIS models built on different types of knowledge can produce much better performance than the traditional M-5 curves in real-time reservoir operation. Moreover, we show that the model can be made more intelligent for reservoir operation if more information (or knowledge) is incorporated.

  10. An Ada inference engine for expert systems

    NASA Technical Reports Server (NTRS)

    Lavallee, David B.

    1986-01-01

    The purpose is to investigate the feasibility of using Ada for rule-based expert systems with real-time performance requirements. This includes exploring the Ada features which give improved performance to expert systems as well as optimizing the tradeoffs or workarounds that the use of Ada may require. A prototype inference engine was built using Ada, and rule firing rates in excess of 500 per second were demonstrated on a single MC68000 processor. The knowledge base uses a directed acyclic graph to represent production lines. The graph allows the use of AND, OR, and NOT logical operators. The inference engine uses a combination of both forward and backward chaining in order to reach goals as quickly as possible. Future efforts will include additional investigation of multiprocessing to improve performance and creating a user interface allowing rule input in an Ada-like syntax. Investigation of multitasking and alternate knowledge base representations will help to analyze some of the performance issues as they relate to larger problems.
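
    The AND/OR structure can be approximated in a few lines (a sketch under invented rules, not the Ada implementation, and omitting the NOT operator): each rule ANDs its antecedents, alternative rules for the same consequent act as OR branches, and forward chaining iterates to a fixpoint.

      # Hypothetical rules: (set of antecedents ANDed together, consequent).
      # Two rules sharing a consequent behave as an OR.
      rules = [({"p", "q"}, "r"), ({"s"}, "r"), ({"r"}, "goal")]

      def forward_chain(facts):
          facts = set(facts)
          changed = True
          while changed:                 # repeat until no rule adds a new fact
              changed = False
              for antecedents, consequent in rules:
                  if antecedents <= facts and consequent not in facts:
                      facts.add(consequent)
                      changed = True
          return facts

      print(forward_chain({"s"}))  # {'s', 'r', 'goal'}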

  11. Hierarchy-associated semantic-rule inference framework for classifying indoor scenes

    NASA Astrophysics Data System (ADS)

    Yu, Dan; Liu, Peng; Ye, Zhipeng; Tang, Xianglong; Zhao, Wei

    2016-03-01

    Typically, the initial task of classifying indoor scenes is challenging, because the spatial layout and decoration of a scene can vary considerably. Recent efforts at classifying object relationships commonly depend on the results of scene annotation and predefined rules, making classification inflexible. Furthermore, annotation results are easily affected by external factors. Inspired by human cognition, a scene-classification framework was proposed using the empirically based annotation (EBA) and a match-over rule-based (MRB) inference system. The semantic hierarchy of images is exploited by EBA to construct rules empirically for MRB classification. The problem of scene classification is divided into low-level annotation and high-level inference from a macro perspective. Low-level annotation involves detecting the semantic hierarchy and annotating the scene with a deformable-parts model and a bag-of-visual-words model. In high-level inference, hierarchical rules are extracted to train the decision tree for classification. The categories of testing samples are generated from the parts to the whole. Compared with traditional classification strategies, the proposed semantic hierarchy and corresponding rules reduce the effect of a variable background and improve the classification performance. The proposed framework was evaluated on a popular indoor scene dataset, and the experimental results demonstrate its effectiveness.

  12. A novel prosodic-information synthesizer based on recurrent fuzzy neural network for the Chinese TTS system.

    PubMed

    Lin, Chin-Teng; Wu, Rui-Cheng; Chang, Jyh-Yeong; Liang, Sheng-Fu

    2004-02-01

    In this paper, a new technique for the Chinese text-to-speech (TTS) system is proposed. Our major effort focuses on prosodic information generation. New methodologies are developed for constructing fuzzy rules in a prosodic model that simulates human pronunciation rules. The proposed Recurrent Fuzzy Neural Network (RFNN) is a multilayer recurrent neural network (RNN) which integrates a Self-cOnstructing Neural Fuzzy Inference Network (SONFIN) into a recurrent connectionist structure. The RFNN can be functionally divided into two parts. The first part adopts the SONFIN as a prosodic model to explore the relationship between high-level linguistic features and prosodic information based on fuzzy inference rules. Compared to conventional neural networks, the SONFIN can always construct itself with an economical network size at high learning speed. The second part employs a five-layer network to generate all prosodic parameters by directly using the prosodic fuzzy rules inferred from the first part as well as other important features of syllables. The TTS system combined with the proposed method can reproduce not only sandhi rules but also the other prosodic phenomena present in traditional TTS systems. Moreover, the proposed scheme can even discover new rules about prosodic phrase structure. The performance of the proposed RFNN-based prosodic model is verified by embedding it into a Chinese TTS system with a Chinese monosyllable database based on the time-domain pitch synchronous overlap add (TD-PSOLA) method. Our experimental results show that the proposed RFNN can generate proper prosodic parameters including pitch means, pitch shapes, maximum energy levels, syllable durations, and pause durations. Some synthetic sounds are available online for demonstration.

  13. Dynamic Querying of Mass-Storage RDF Data with Rule-Based Entailment Regimes

    NASA Astrophysics Data System (ADS)

    Ianni, Giovambattista; Krennwallner, Thomas; Martello, Alessandra; Polleres, Axel

    RDF Schema (RDFS) as a lightweight ontology language is gaining popularity and, consequently, tools for scalable RDFS inference and querying are needed. SPARQL has recently become a W3C standard for querying RDF data, but it mostly provides means for querying simple RDF graphs only, whereas querying with respect to RDFS or other entailment regimes is left outside the current specification. In this paper, we show that SPARQL faces certain unwanted ramifications when querying ontologies in conjunction with RDF datasets that comprise multiple named graphs, and we provide an extension for SPARQL that remedies these effects. Moreover, since RDFS inference has a close relationship with logic rules, we generalize our approach to select a custom ruleset for specifying inferences to be taken into account in a SPARQL query. We show that our extensions are technically feasible by providing benchmark results for RDFS querying in our prototype system GiaBATA, which uses Datalog coupled with a persistent relational database as a back-end for implementing SPARQL with dynamic rule-based inference. By employing different optimization techniques like magic set rewriting, our system remains competitive with state-of-the-art RDFS querying systems.
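
    For a flavor of rule-based RDFS entailment (a hand-rolled fixpoint over the standard subclass rules, not GiaBATA's Datalog machinery; the triples are invented):

      RDF_TYPE, SUBCLASS = "rdf:type", "rdfs:subClassOf"

      triples = {("ex:Cat", SUBCLASS, "ex:Mammal"),
                 ("ex:Mammal", SUBCLASS, "ex:Animal"),
                 ("ex:felix", RDF_TYPE, "ex:Cat")}

      def rdfs_closure(triples):
          triples = set(triples)
          while True:
              new = set()
              for s, p, o in triples:
                  for s2, _, o2 in [t for t in triples if t[1] == SUBCLASS]:
                      if o == s2 and p == SUBCLASS:
                          new.add((s, SUBCLASS, o2))   # rdfs11: transitivity
                      if o == s2 and p == RDF_TYPE:
                          new.add((s, RDF_TYPE, o2))   # rdfs9: type propagation
              if new <= triples:
                  return triples
              triples |= new

      for t in sorted(rdfs_closure(triples)):
          print(t)  # ex:felix ends up typed as Cat, Mammal, and Animal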

  14. An Expert-System Engine With Operative Probabilities

    NASA Technical Reports Server (NTRS)

    Orlando, N. E.; Palmer, M. T.; Wallace, R. S.

    1986-01-01

    Program enables proof-of-concepts tests of expert systems under development. AESOP is rule-based inference engine for expert system, which makes decisions about particular situation given user-supplied hypotheses, rules, and answers to questions drawn from rules. If knowledge base containing hypotheses and rules governing environment is available to AESOP, almost any situation within that environment resolved by answering questions asked by AESOP. Questions answered with YES, NO, MAYBE, DON'T KNOW, DON'T CARE, or with probability factor ranging from 0 to 10. AESOP written in Franz LISP for interactive execution.

  15. Genie Inference Engine Rule Writer’s Guide.

    DTIC Science & Technology

    1987-08-01

    APPENDIX D. Animal Bootstrap File...APPENDIX E. Sample Run of Animal Identification Expert System...APPENDIX F. Numeric Test Knowledge Base...and other data structures stored in the knowledge base (KB), queries the user for input, and draws conclusions. Genie (GENeric Inference Engine) is

  16. SIRE: A Simple Interactive Rule Editor for NICBES

    NASA Technical Reports Server (NTRS)

    Bykat, Alex

    1988-01-01

    To support the evolution of domain expertise, and its representation in an expert system knowledge base, a user-friendly rule base editor is mandatory. The Nickel Cadmium Battery Expert System (NICBES), a prototype expert system for the Hubble Space Telescope power storage management system, does not provide such an editor. In the following, a Simple Interactive Rule Base Editor (SIRE) for NICBES is described. SIRE provides a consistent internal representation of the NICBES knowledge base. It supports knowledge presentation and provides a user-friendly, code-language-independent medium for rule addition and modification. SIRE is integrated with NICBES via an interface module, which translates the internal representation into Prolog-type rules (Horn clauses), asserts those rules, and provides a simple mechanism for selecting rules for the Prolog inference engine.

  17. Rule groupings in expert systems using nearest neighbour decision rules, and convex hulls

    NASA Technical Reports Server (NTRS)

    Anastasiadis, Stergios

    1991-01-01

    Expert system shells are lacking in many areas of software engineering. Large rule-based systems are not semantically comprehensible, are difficult to debug, and are impossible to modify or validate. Partitioning a set of rules found in CLIPS (C Language Integrated Production System) into groups of rules that reflect the underlying semantic subdomains of the problem adequately addresses the concerns stated above. Techniques are introduced to structure a CLIPS rule base into groups of rules that inherently share common semantic information. The concepts involved are imported from the fields of AI, pattern recognition, and statistical inference. The techniques focus on feature selection, classification, and a criterion, based on Bayesian decision theory, for how 'good' a classification technique is. A variety of distance metrics for measuring the 'closeness' of CLIPS rules are discussed, and various nearest-neighbour classification algorithms based on these metrics are described.
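
    As one concrete instance of such a metric (hypothetical rules; the paper surveys several alternatives), Jaccard distance over each rule's token set supports a simple nearest-neighbour grouping:

      # Hypothetical CLIPS-like rules reduced to their token sets
      rules = {
          "r-valve-open":  {"valve", "pressure", "open"},
          "r-valve-close": {"valve", "pressure", "close"},
          "r-log-event":   {"log", "event", "timestamp"},
      }

      def jaccard_distance(a, b):
          return 1.0 - len(a & b) / len(a | b)

      def nearest(rule_name):
          # The rule minimizing distance is the candidate group-mate
          others = [r for r in rules if r != rule_name]
          return min(others,
                     key=lambda r: jaccard_distance(rules[rule_name], rules[r]))

      print(nearest("r-valve-open"))  # 'r-valve-close'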

  18. Poisson-Based Inference for Perturbation Models in Adaptive Spelling Training

    ERIC Educational Resources Information Center

    Baschera, Gian-Marco; Gross, Markus

    2010-01-01

    We present an inference algorithm for perturbation models based on Poisson regression. The algorithm is designed to handle unclassified input with multiple errors described by independent mal-rules. This knowledge representation provides an intelligent tutoring system with local and global information about a student, such as error classification…

  19. An Investigation and Interpretation of Selected Topics in Uncertainty Reasoning

    DTIC Science & Technology

    1989-12-01

    Characterizing secondary uncertainty as spurious evidence and including it in the inference process. It was shown that probability ratio graphs are a...in the inference process has great impact on the computational complexity of an inference process. An Investigation and Interpretation of...Systems," he outlines a five-step process that incorporates Bayesian reasoning in the development of the expert system rule base: 1. A group of

  20. COMPUTERIZED RISK AND BIOACCUMULATION SYSTEM (VERSION 1.0)

    EPA Science Inventory

    CRABS is a combination of a rule-based expert system and more traditional procedural programming techniques. Rule-based expert systems attempt to emulate the decision making process of human experts within a clearly defined subject area. Expert systems consist of an "inference engi...

  1. Robust Strategy for Rocket Engine Health Monitoring

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    2001-01-01

    Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.

  2. Architecture For The Optimization Of A Machining Process In Real Time Through Rule-Based Expert System

    NASA Astrophysics Data System (ADS)

    Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús

    2009-11-01

    Under the project SENSOR-IA, which received financial support from the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the optimization of a machining process in real time through a rule-based expert system has been developed. The architecture consists of an acquisition and sensor data processing engine (SATD) and a rule-based expert system (SE) that communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens rule model of goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in an SE. The tests have been done with approximate rules. Future work includes an exhaustive collection of data with different tool materials and geometries in a database to extract more precise rules.

  3. Role of Utility and Inference in the Evolution of Functional Information

    PubMed Central

    Sharov, Alexei A.

    2009-01-01

    Functional information means an encoded network of functions in living organisms, from molecular signaling pathways to an organism's behavior. It is represented by two components: code and an interpretation system, which together form a self-sustaining semantic closure. Semantic closure allows some freedom between components because small variations of the code are still interpretable. The interpretation system consists of inference rules that control the correspondence between the code and the function (phenotype) and determines the shape of the fitness landscape. The utility factor operates at multiple time scales: short-term selection drives evolution towards higher survival and reproduction rates within a given fitness landscape, and long-term selection favors those fitness landscapes that support adaptability and lead to evolutionary expansion of certain lineages. Inference rules make short-term selection possible by shaping the fitness landscape and defining possible directions of evolution, but they are under control of the long-term selection of lineages. Communication normally occurs within a set of agents with compatible interpretation systems, which I call a communication system. Functional information cannot be directly transferred between communication systems with incompatible inference rules. Each biological species is a genetic communication system that carries unique functional information together with inference rules that determine evolutionary directions and constraints. This view of the relation between utility and inference can resolve the conflict between realism/positivism and pragmatism. Realism overemphasizes the role of inference in the evolution of human knowledge because it assumes that logic is embedded in reality. Pragmatism substitutes usefulness for truth and therefore ignores the advantage of inference. The proposed concept of evolutionary pragmatism rejects the idea that logic is embedded in reality; instead, inference rules are constructed within each communication system to represent reality, and they evolve towards higher adaptability on a long time scale. PMID:20160960

  4. Automatic inference of indexing rules for MEDLINE

    PubMed Central

    Névéol, Aurélie; Shooshan, Sonya E; Claveau, Vincent

    2008-01-01

    Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI. PMID:19025687

  5. Automatic inference of indexing rules for MEDLINE.

    PubMed

    Névéol, Aurélie; Shooshan, Sonya E; Claveau, Vincent

    2008-11-19

    Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  6. A method and knowledge base for automated inference of patient problems from structured data in an electronic medical record.

    PubMed

    Wright, Adam; Pang, Justine; Feblowitz, Joshua C; Maloney, Francine L; Wilcox, Allison R; Ramelson, Harley Z; Schneider, Louise I; Bates, David W

    2011-01-01

    Accurate knowledge of a patient's medical problems is critical for clinical decision making, quality measurement, research, billing and clinical decision support. Common structured sources of problem information include the patient problem list and billing data; however, these sources are often inaccurate or incomplete. Our objective was to develop and validate methods of automatically inferring patient problems from clinical and billing data, and to provide a knowledge base for inferring problems. We identified 17 target conditions and designed and validated a set of rules for identifying patient problems based on medications, laboratory results, billing codes, and vital signs. A panel of physicians provided input on a preliminary set of rules. Based on this input, we tested candidate rules on a sample of 100,000 patient records to assess their performance compared to gold standard manual chart review. The physician panel selected a final rule for each condition, which was validated on an independent sample of 100,000 records to assess its accuracy. Seventeen rules were developed for inferring patient problems. Analysis using a validation set of 100,000 randomly selected patients showed high sensitivity (range: 62.8-100.0%) and positive predictive value (range: 79.8-99.6%) for most rules. Overall, the inference rules performed better than using either the problem list or billing data alone. We developed and validated a set of rules for inferring patient problems. These rules have a variety of applications, including clinical decision support, care improvement, augmentation of the problem list, and identification of patients for research cohorts.
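
    In spirit, such a rule combines evidence from several structured sources, as in the sketch below; the thresholds, codes, and two-of-three voting are illustrative inventions, not the study's validated rules.

      def infer_diabetes(patient):
          # Illustrative only -- NOT the validated rule from the study
          on_metformin = "metformin" in patient.get("medications", [])
          high_a1c = patient.get("hba1c", 0.0) >= 6.5         # assumed lab cutoff
          billed = "E11" in patient.get("billing_codes", [])  # assumed code prefix
          return sum([on_metformin, high_a1c, billed]) >= 2   # any two signals

      print(infer_diabetes({"medications": ["metformin"], "hba1c": 7.1}))  # True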

  7. CLIPS: A tool for corn disease diagnostic system and an aid to neural network for automated knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Wu, Cathy; Taylor, Pam; Whitson, George; Smith, Cathy

    1990-01-01

    This paper describes the building of a corn disease diagnostic expert system using CLIPS, and the development of a neural expert system using the fact representation method of CLIPS for automated knowledge acquisition. The CLIPS corn expert system diagnoses 21 diseases from 52 symptoms and signs with certainty factors. CLIPS has several unique features: it allows the facts in rules to be broken down into object-attribute-value (OAV) triples, allows rule grouping, and fires rules based on pattern matching. These features, combined with the chained inference engine, result in a natural user query system and speedy execution. In order to develop a method for automated knowledge acquisition, an Artificial Neural Expert System (ANES) is developed by a direct mapping from the CLIPS system. The ANES corn expert system uses the same OAV triples as the CLIPS system for its facts. The LHS and RHS facts of the CLIPS rules are mapped into the input and output layers of the ANES, respectively, and the inference engine of the rules is embedded in the hidden layer. The fact representation by OAV triples gives a natural grouping of the rules. These features allow the ANES system to automate rule generation, and make it efficient to execute and easy to expand for a large and complex domain.

  8. Experiments on neural network architectures for fuzzy logic

    NASA Technical Reports Server (NTRS)

    Keller, James M.

    1991-01-01

    The use of fuzzy logic to model and manage uncertainty in a rule-based system places high computational demands on an inference engine. In an earlier paper, the authors introduced a trainable neural network structure for fuzzy logic. These networks can learn and extrapolate complex relationships between possibility distributions for the antecedents and consequents in the rules. Here, the power of these networks is further explored. The insensitivity of the output to noisy input distributions (which are likely if the clauses are generated from real data) is demonstrated as well as the ability of the networks to internalize multiple conjunctive clause and disjunctive clause rules. Since different rules with the same variables can be encoded in a single network, this approach to fuzzy logic inference provides a natural mechanism for rule conflict resolution.

  9. An architecture for rule based system explanation

    NASA Technical Reports Server (NTRS)

    Fennel, T. R.; Johannes, James D.

    1990-01-01

    A system architecture is presented which incorporates both graphics and text into explanations provided by rule-based expert systems. This architecture facilitates explanation of the knowledge base content, the control strategies employed by the system, and the conclusions made by the system. The suggested approach combines hypermedia and inference engine capabilities. Advantages include: closer integration of the user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and more direct control of explanation depth and duration by the user. User models are suggested to control the type, amount, and order of information presented.

  10. A method and knowledge base for automated inference of patient problems from structured data in an electronic medical record

    PubMed Central

    Pang, Justine; Feblowitz, Joshua C; Maloney, Francine L; Wilcox, Allison R; Ramelson, Harley Z; Schneider, Louise I; Bates, David W

    2011-01-01

    Background Accurate knowledge of a patient's medical problems is critical for clinical decision making, quality measurement, research, billing and clinical decision support. Common structured sources of problem information include the patient problem list and billing data; however, these sources are often inaccurate or incomplete. Objective To develop and validate methods of automatically inferring patient problems from clinical and billing data, and to provide a knowledge base for inferring problems. Study design and methods We identified 17 target conditions and designed and validated a set of rules for identifying patient problems based on medications, laboratory results, billing codes, and vital signs. A panel of physicians provided input on a preliminary set of rules. Based on this input, we tested candidate rules on a sample of 100 000 patient records to assess their performance compared to gold standard manual chart review. The physician panel selected a final rule for each condition, which was validated on an independent sample of 100 000 records to assess its accuracy. Results Seventeen rules were developed for inferring patient problems. Analysis using a validation set of 100 000 randomly selected patients showed high sensitivity (range: 62.8–100.0%) and positive predictive value (range: 79.8–99.6%) for most rules. Overall, the inference rules performed better than using either the problem list or billing data alone. Conclusion We developed and validated a set of rules for inferring patient problems. These rules have a variety of applications, including clinical decision support, care improvement, augmentation of the problem list, and identification of patients for research cohorts. PMID:21613643

  11. System and method for responding to ground and flight system malfunctions

    NASA Technical Reports Server (NTRS)

    Anderson, Julie J. (Inventor); Fussell, Ronald M. (Inventor)

    2010-01-01

    A system for on-board anomaly resolution for a vehicle has a data repository. The data repository stores data related to different systems, subsystems, and components of the vehicle. The data stored is encoded in a tree-based structure. A query engine is coupled to the data repository. The query engine provides a user and automated interface and provides contextual query to the data repository. An inference engine is coupled to the query engine. The inference engine compares current anomaly data to contextual data stored in the data repository using inference rules. The inference engine generates a potential solution to the current anomaly by referencing the data stored in the data repository.

  12. Evaluation of fuzzy inference systems using fuzzy least squares

    NASA Technical Reports Server (NTRS)

    Barone, Joseph M.

    1992-01-01

    Efforts to develop evaluation methods for fuzzy inference systems which are not based on crisp, quantitative data or processes (i.e., where the phenomenon the system is built to describe or control is inherently fuzzy) are just beginning. This paper suggests that the method of fuzzy least squares can be used to perform such evaluations. Regressing the desired outputs onto the inferred outputs can provide both global and local measures of success. The global measures have some value in an absolute sense, but they are particularly useful when competing solutions (e.g., different numbers of rules, different fuzzy input partitions) are being compared. The local measure described here can be used to identify specific areas of poor fit where special measures (e.g., the use of emphatic or suppressive rules) can be applied. Several examples are discussed which illustrate the applicability of the method as an evaluation tool.
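
    A crisp analogue of the global measure (plain least squares standing in for fuzzy least squares, with invented data; requires Python 3.10+ for these statistics functions):

      from statistics import correlation, linear_regression

      inferred = [0.2, 0.4, 0.5, 0.8, 0.9]       # outputs of the fuzzy system
      desired  = [0.25, 0.35, 0.55, 0.75, 0.95]  # desired outputs

      # Regress desired outputs onto inferred outputs
      slope, intercept = linear_regression(inferred, desired)
      r2 = correlation(inferred, desired) ** 2   # global goodness-of-fit

      print(f"slope={slope:.3f} intercept={intercept:.3f} R^2={r2:.3f}")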

  13. Program for Experimentation With Expert Systems

    NASA Technical Reports Server (NTRS)

    Engle, S. W.

    1986-01-01

    CERBERUS is forward-chaining, knowledge-based system program useful for experimentation with expert systems. Inference-engine mechanism performs deductions according to user-supplied rule set. Information stored in intermediate area, and user interrogated only when no applicable data found in storage. Each assertion posed by CERBERUS answered with certainty ranging from 0 to 100 percent. Rule processor stops investigating applicable rules when goal reaches certainty of 95 percent or higher. Capable of operating for wide variety of domains. Sample rule files included for animal identification, pixel classification in image processing, and rudimentary car repair for novice mechanic. User supplies set of end goals or actions. System complexity decided by user's rule file. CERBERUS written in FORTRAN 77.
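
    The 95-percent cutoff can be sketched with a MYCIN-style certainty combination (rules and user answers invented; CERBERUS itself is FORTRAN 77):

      # Hypothetical rules: goal <- (antecedent, rule certainty)
      rules = {"leak": [("pressure_drop", 0.9), ("hiss_heard", 0.7)]}
      answers = {"pressure_drop": 1.0, "hiss_heard": 0.8}  # user certainties

      def certainty(goal, threshold=0.95):
          cf = 0.0
          for antecedent, rule_cf in rules.get(goal, []):
              evidence = answers.get(antecedent, 0.0) * rule_cf
              cf = cf + evidence * (1.0 - cf)   # combine positive certainties
              if cf >= threshold:               # stop once goal is near-certain
                  break
          return cf

      print(round(certainty("leak"), 3))  # 0.956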

  14. An Architecture for Performance Optimization in a Collaborative Knowledge-Based Approach for Wireless Sensor Networks

    PubMed Central

    Gadeo-Martos, Manuel Angel; Fernandez-Prieto, Jose Angel; Canada-Bago, Joaquin; Velasco, Juan Ramon

    2011-01-01

    Over the past few years, Intelligent Spaces (ISs) have received the attention of many Wireless Sensor Network researchers. Recently, several studies have been devoted to identify their common capacities and to set up ISs over these networks. However, little attention has been paid to integrating Fuzzy Rule-Based Systems into collaborative Wireless Sensor Networks for the purpose of implementing ISs. This work presents a distributed architecture proposal for collaborative Fuzzy Rule-Based Systems embedded in Wireless Sensor Networks, which has been designed to optimize the implementation of ISs. This architecture includes the following: (a) an optimized design for the inference engine; (b) a visual interface; (c) a module to reduce the redundancy and complexity of the knowledge bases; (d) a module to evaluate the accuracy of the new knowledge base; (e) a module to adapt the format of the rules to the structure used by the inference engine; and (f) a communications protocol. As a real-world application of this architecture and the proposed methodologies, we show an application to the problem of modeling two plagues of the olive tree: prays (olive moth, Prays oleae Bern.) and repilo (caused by the fungus Spilocaea oleagina). The results show that the architecture presented in this paper significantly decreases the consumption of resources (memory, CPU and battery) without a substantial decrease in the accuracy of the inferred values. PMID:22163687

  15. An architecture for performance optimization in a collaborative knowledge-based approach for wireless sensor networks.

    PubMed

    Gadeo-Martos, Manuel Angel; Fernandez-Prieto, Jose Angel; Canada-Bago, Joaquin; Velasco, Juan Ramon

    2011-01-01

    Over the past few years, Intelligent Spaces (ISs) have received the attention of many Wireless Sensor Network researchers. Recently, several studies have been devoted to identify their common capacities and to set up ISs over these networks. However, little attention has been paid to integrating Fuzzy Rule-Based Systems into collaborative Wireless Sensor Networks for the purpose of implementing ISs. This work presents a distributed architecture proposal for collaborative Fuzzy Rule-Based Systems embedded in Wireless Sensor Networks, which has been designed to optimize the implementation of ISs. This architecture includes the following: (a) an optimized design for the inference engine; (b) a visual interface; (c) a module to reduce the redundancy and complexity of the knowledge bases; (d) a module to evaluate the accuracy of the new knowledge base; (e) a module to adapt the format of the rules to the structure used by the inference engine; and (f) a communications protocol. As a real-world application of this architecture and the proposed methodologies, we show an application to the problem of modeling two plagues of the olive tree: prays (olive moth, Prays oleae Bern.) and repilo (caused by the fungus Spilocaea oleagina). The results show that the architecture presented in this paper significantly decreases the consumption of resources (memory, CPU and battery) without a substantial decrease in the accuracy of the inferred values.

  16. Decision Support Systems for Launch and Range Operations Using Jess

    NASA Technical Reports Server (NTRS)

    Thirumalainambi, Rajkumar

    2007-01-01

    The virtual test bed for launch and range operations developed at NASA Ames Research Center consists of various independent expert systems advising on weather effects, toxic gas dispersions, and human health risk assessment during space-flight operations. An individual dedicated server supports each expert system, and the master system gathers information from the dedicated servers to support the launch decision-making process. Since the test bed is a web-based system, reducing network traffic and optimizing the knowledge base are critical to the success of real-time or near real-time operations. Jess, a fast rule engine and powerful scripting environment developed at Sandia National Laboratory, has been adopted to build the expert systems, providing robustness and scalability. Jess also supports XML representation of the knowledge base with forward- and backward-chaining inference mechanisms. Facts added to working memory during run-time operations facilitate analyses of multiple scenarios. The knowledge base can be distributed, with one inference engine performing the inference process. This paper discusses details of the knowledge base and inference engine using Jess for a launch and range virtual test bed.

  17. Knowledge-based reasoning in the Paladin tactical decision generation system

    NASA Technical Reports Server (NTRS)

    Chappell, Alan R.

    1993-01-01

    A real-time tactical decision generation system for air combat engagements, Paladin, has been developed. A pilot's job in air combat includes tasks that are largely symbolic. These symbolic tasks are generally performed through the application of experience and training (i.e. knowledge) gathered over years of flying a fighter aircraft. Two such tasks, situation assessment and throttle control, are identified and broken out in Paladin to be handled by specialized knowledge based systems. Knowledge pertaining to these tasks is encoded into rule-bases to provide the foundation for decisions. Paladin uses a custom built inference engine and a partitioned rule-base structure to give these symbolic results in real-time. This paper provides an overview of knowledge-based reasoning systems as a subset of rule-based systems. The knowledge used by Paladin in generating results as well as the system design for real-time execution is discussed.

  18. Inferring the Limit Behavior of Some Elementary Cellular Automata

    NASA Astrophysics Data System (ADS)

    Ruivo, Eurico L. P.; de Oliveira, Pedro P. B.

    Cellular automata locally define dynamical systems, discrete in space, time and in the state variables, capable of displaying arbitrarily complex global emergent behavior. One core question in the study of cellular automata refers to their limit behavior, that is, to the global dynamical features in an infinite time evolution. Previous works have shown that for finite time evolutions, the dynamics of one-dimensional cellular automata can be described by regular languages and, therefore, by finite automata. Such studies have shown the existence of growth patterns in the evolution of such finite automata for some elementary cellular automata rules and also inferred the limit behavior of such rules based upon the growth patterns; however, the results on the limit behavior were obtained manually, by direct inspection of the structures that arise during the time evolution. Here we present the formalization of an automatic method to compute such structures. Based on this, the rules of the elementary cellular automata space were classified according to the existence of a growth pattern in their finite automata. Also, we present a method to infer the limit graph of some elementary cellular automata rules, derived from the analysis of the regular expressions that describe their behavior in finite time. Finally, we analyze some attractors of two rules for which we could not compute the whole limit set.
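
    For reference, the finite-time evolution being analyzed is cheap to generate (standard Wolfram rule numbering; the initial configuration is arbitrary):

      def eca_step(cells, rule=110):
          # One synchronous update of an elementary CA, periodic boundary
          n = len(cells)
          return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                            + cells[(i + 1) % n])) & 1 for i in range(n)]

      config = [0, 0, 0, 1, 0, 0, 0, 0]
      for _ in range(4):            # a short finite-time evolution
          print(config)
          config = eca_step(config)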

  19. An intelligent knowledge-based and customizable home care system framework with ubiquitous patient monitoring and alerting techniques.

    PubMed

    Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for determining efficiently and rapidly the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions.

  20. An Intelligent Knowledge-Based and Customizable Home Care System Framework with Ubiquitous Patient Monitoring and Alerting Techniques

    PubMed Central

    Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for determining efficiently and rapidly the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions. PMID:23112650

  1. Toward Webscale, Rule-Based Inference on the Semantic Web Via Data Parallelism

    DTIC Science & Technology

    2013-02-01

    Another work distinct from its peers is the work on approximate reasoning by Rudolph et al. [34] in which multiple inference systems were combined not...Workshop Scalable Semantic Web Knowledge Base Systems, 2010, pp. 17–31. [34] S. Rudolph, T. Tserendorj, and P. Hitzler, "What is approximate reasoning...2013] [55] M. Duerst and M. Suignard. (2005, Jan.). RFC 3987 – internationalized resource identifiers (IRIs). IETF. [Online]. Available: http

  2. Faults Discovery By Using Mined Data

    NASA Technical Reports Server (NTRS)

    Lee, Charles

    2005-01-01

    Fault discovery in complex systems draws on model-based reasoning, fault tree analysis, rule-based inference methods, and other approaches. Model-based reasoning builds models of systems either from mathematical formulations or from experimental models. Fault tree analysis shows the possible causes of a system malfunction by enumerating the suspect components and their respective failure modes that may have induced the problem. Rule-based inference builds the model from expert knowledge. These models and methods have one thing in common: they presume certain prior conditions. Complex systems often use fault trees to analyze faults. When an error occurs, fault diagnosis is performed by engineers and analysts through extensive examination of all data gathered during the mission. The International Space Station (ISS) control center operates on data fed back from the system, and decisions are made based on threshold values using fault trees. Since these decision-making tasks are safety critical and must be done promptly, the engineers who manually analyze the data face a time challenge. To automate this process, this paper presents an approach that uses decision trees to discover faults from data in real time and captures the contents of fault trees as the initial state of the trees.
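
    A minimal version of the decision-tree step using scikit-learn (the feature names, readings, and labels are invented; real ISS telemetry is far richer):

      from sklearn.tree import DecisionTreeClassifier, export_text

      # Invented telemetry: [temperature, pressure] -> fault (1) or nominal (0)
      X = [[70, 14.7], [72, 14.5], [95, 14.6], [98, 9.8], [71, 9.9], [69, 14.6]]
      y = [0, 0, 1, 1, 1, 0]

      tree = DecisionTreeClassifier(max_depth=2, random_state=0).fit(X, y)
      print(export_text(tree, feature_names=["temperature", "pressure"]))
      print(tree.predict([[99, 10.0]]))  # classify a new reading -> [1]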

  3. Web-based Weather Expert System (WES) for Space Shuttle Launch

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Rajkumar, T.

    2003-01-01

    The Web-based Weather Expert System (WES) is a critical module of the Virtual Test Bed development to support 'go/no go' decisions for Space Shuttle operations in the Intelligent Launch and Range Operations program of NASA. The weather rules characterize certain aspects of the environment related to the launching or landing site, the time of day or night, the pad or runway conditions, the mission duration, the runway equipment, and the landing type. Expert system rules are derived from weather contingency rules, which were developed over years by NASA. Backward chaining, a goal-directed inference method, is adopted: a particular consequence or goal clause is evaluated first and then chained backward through the rules. Once a rule is satisfied, that rule is fired and the decision is expressed. The expert system continuously verifies the rules against the past one-hour weather conditions, and decisions are made. The normal procedure of operations requires a formal pre-launch weather briefing held on Launch minus 1 day, which is a specific weather briefing for all areas of Space Shuttle launch operations. In this paper, the Web-based Weather Expert System of the Intelligent Launch and Range Operations program is presented.
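
    The goal-directed flavor can be shown in miniature (the go/no-go rules below are invented for illustration, not NASA's weather contingency rules):

      # Hypothetical rules: goal <- list of alternative bodies (subgoals ANDed)
      rules = {"go_for_launch": [["winds_ok", "no_lightning"]],
               "winds_ok": [["surface_wind_below_limit"]]}

      facts = {"surface_wind_below_limit", "no_lightning"}  # past-hour data

      def prove(goal):
          # Backward chaining: evaluate the goal first, then chain through rules
          if goal in facts:
              return True
          return any(all(prove(sub) for sub in body)
                     for body in rules.get(goal, []))

      print(prove("go_for_launch"))  # True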

  4. Efficient probabilistic inference in generic neural networks trained with non-probabilistic feedback.

    PubMed

    Orhan, A Emin; Ma, Wei Ji

    2017-07-26

    Animals perform near-optimal probabilistic inference in a wide range of psychophysical tasks. Probabilistic inference requires trial-to-trial representation of the uncertainties associated with task variables and subsequent use of this representation. Previous work has implemented such computations using neural networks with hand-crafted and task-dependent operations. We show that generic neural networks trained with a simple error-based learning rule perform near-optimal probabilistic inference in nine common psychophysical tasks. In a probabilistic categorization task, error-based learning in a generic network simultaneously explains a monkey's learning curve and the evolution of qualitative aspects of its choice behavior. In all tasks, the number of neurons required for a given level of performance grows sublinearly with the input population size, a substantial improvement on previous implementations of probabilistic inference. The trained networks develop a novel sparsity-based probabilistic population code. Our results suggest that probabilistic inference emerges naturally in generic neural networks trained with error-based learning rules. Behavioural tasks often require probability distributions to be inferred about task-specific variables. Here, the authors demonstrate that generic neural networks can be trained using a simple error-based learning rule to perform such probabilistic computations efficiently without any need for task-specific operations.

  5. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    PubMed

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system based on intuitionistic fuzzy sets for a process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of every process in the ready queue. Once the dp of every process is calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
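
    A toy rendering of the scheduling step (the membership functions and dp formula are invented stand-ins, not the authors' inference engine):

      # Hypothetical processes: (name, nice value 0-40, burst time in ms)
      ready_queue = [("p1", 5, 300), ("p2", 20, 50), ("p3", 35, 120)]

      def ifs_pair(value, lo, hi, hesitation=0.1):
          # Invented intuitionistic fuzzification: membership mu, non-membership nu
          mu = max(0.0, min(1.0, (hi - value) / (hi - lo))) * (1.0 - hesitation)
          return mu, 1.0 - hesitation - mu

      def dynamic_priority(nice, burst):
          mu_n, nu_n = ifs_pair(nice, 0, 40)    # favor low nice values
          mu_b, nu_b = ifs_pair(burst, 0, 500)  # favor short bursts
          return (mu_n + mu_b) - (nu_n + nu_b)  # invented combined dp score

      ready_queue.sort(key=lambda p: -dynamic_priority(p[1], p[2]))
      print([name for name, _, _ in ready_queue])  # ['p2', 'p1', 'p3']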

  6. Accident/Mishap Investigation System

    NASA Technical Reports Server (NTRS)

    Keller, Richard; Wolfe, Shawn; Gawdiak, Yuri; Carvalho, Robert; Panontin, Tina; Williams, James; Sturken, Ian

    2007-01-01

    InvestigationOrganizer (IO) is a Web-based collaborative information system that integrates the generic functionality of a database, a document repository, a semantic hypermedia browser, and a rule-based inference system with specialized modeling and visualization functionality to support accident/mishap investigation teams. This accessible, online structure is designed to support investigators by allowing them to make explicit, shared, and meaningful links among evidence, causal models, findings, and recommendations.

  7. Theory of mind broad and narrow: reasoning about social exchange engages ToM areas, precautionary reasoning does not.

    PubMed

    Ermer, Elsa; Guerin, Scott A; Cosmides, Leda; Tooby, John; Miller, Michael B

    2006-01-01

    Baron-Cohen (1995) proposed that the theory of mind (ToM) inference system evolved to promote strategic social interaction. Social exchange--a form of co-operation for mutual benefit--involves strategic social interaction and requires ToM inferences about the contents of other individuals' mental states, especially their desires, goals, and intentions. There are behavioral and neuropsychological dissociations between reasoning about social exchange and reasoning about equivalent problems tapping other, more general content domains. It has therefore been proposed that social exchange behavior is regulated by social contract algorithms: a domain-specific inference system that is functionally specialized for reasoning about social exchange. We report an fMRI study using the Wason selection task that provides further support for this hypothesis. Precautionary rules share so many properties with social exchange rules--they are conditional, deontic, and involve subjective utilities--that most reasoning theories claim they are processed by the same neurocomputational machinery. Nevertheless, neuroimaging shows that reasoning about social exchange activates brain areas not activated by reasoning about precautionary rules, and vice versa. As predicted, neural correlates of ToM (anterior and posterior temporal cortex) were activated when subjects interpreted social exchange rules, but not precautionary rules (where ToM inferences are unnecessary). We argue that the interaction between ToM and social contract algorithms can be reciprocal: social contract algorithms require ToM inferences, but their functional logic also allows ToM inferences to be made. By considering interactions between ToM in the narrower sense (belief-desire reasoning) and all the social inference systems that create the logic of human social interaction--ones that enable as well as use inferences about the content of mental states--a broader conception of ToM may emerge: a computational model embodying a Theory of Human Nature (ToHN).

  8. A Fuzzy Logic Based Controller for the Automated Alignment of a Laser-beam-smoothing Spatial Filter

    NASA Technical Reports Server (NTRS)

    Krasowski, M. J.; Dickens, D. E.

    1992-01-01

    A fuzzy logic based controller for a laser-beam-smoothing spatial filter is described. It is demonstrated that a human operator's alignment actions can easily be described by a system of fuzzy rules of inference. The final configuration uses inexpensive, off-the-shelf hardware and allows for a compact, readily implemented embedded control system.

  9. C Language Integrated Production System, Ada Version

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; Riley, Gary; Savely, Robert T.; Melebeck, Clovis J.; White, Wesley A.; Mcgregor, Terry L.; Ferguson, Melisa; Razavipour, Reza

    1992-01-01

    CLIPS/Ada provides capabilities of CLIPS v4.3 but uses Ada as source language for CLIPS executable code. Implements forward-chaining rule-based language. Program contains inference engine and language syntax providing framework for construction of expert-system program. Also includes features for debugging application program. Based on Rete algorithm which provides efficient method for performing repeated matching of patterns. Written in Ada.

  10. An expert system for diagnosing environmentally induced spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark; Lauriente, Michael; Koons, Harry C.; Gorney, David

    1992-01-01

    A new rule-based, machine independent analytical tool was designed for diagnosing spacecraft anomalies using an expert system. Expert systems provide an effective method for saving knowledge, allow computers to sift through large amounts of data pinpointing significant parts, and most importantly, use heuristics in addition to algorithms, which allow approximate reasoning and inference and the ability to attack problems not rigidly defined. The knowledge base consists of over two-hundred (200) rules and provides links to historical and environmental databases. The environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The modularity of the expert system allows for easy updates and modifications. It not only provides scientists with needed risk analysis and confidence not found in algorithmic programs, but is also an effective learning tool, and the window implementation makes it very easy to use. The system currently runs on a Micro VAX II at Goddard Space Flight Center (GSFC). The inference engine used is NASA's C Language Integrated Production System (CLIPS).

  11. Intelligent Diagnostic Assistant for Complicated Skin Diseases through C5's Algorithm.

    PubMed

    Jeddi, Fatemeh Rangraz; Arabfard, Masoud; Kermany, Zahra Arab

    2017-09-01

    An intelligent diagnostic assistant can support the complicated diagnosis of skin diseases, which are among the most common causes of disability. The aim of this study was to design and implement a computerized intelligent diagnostic assistant for complicated skin diseases using the C5 algorithm. An applied-developmental study was done in 2015. The knowledge base was developed from interviews with dermatologists using questionnaires and checklists. Knowledge representation was obtained from the training data in the database using Microsoft Excel. Clementine software and the C5 algorithm were applied to draw the decision tree. Analysis of test accuracy was performed based on rules extracted using inference chains. The rules extracted from the decision tree were entered into the CLIPS programming environment, where the intelligent diagnostic assistant was then implemented; the rules were defined as RULEs using the forward-chaining inference technique. The accuracy and error rates obtained in the training phase from the decision tree were 99.56% and 0.44%, respectively. The accuracy of the decision tree was 98% and the error was 2% in the test phase. The intelligent diagnostic assistant can be used as a reliable system with high accuracy, sensitivity, specificity, and agreement.
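
    A hedged sketch of the central step, turning decision-tree paths into IF-THEN rules that a forward-chaining engine such as CLIPS could consume; it uses scikit-learn's generic decision tree rather than C5, and the features, data, and labels are invented:

    ```python
    from sklearn.tree import DecisionTreeClassifier

    X = [[1, 0], [1, 1], [0, 1], [0, 0]]           # e.g. [itching, scaling]
    y = ["eczema", "eczema", "psoriasis", "none"]  # hypothetical labels
    clf = DecisionTreeClassifier().fit(X, y)

    def tree_to_rules(tree, feature_names, class_names, node=0, conds=()):
        """Walk the fitted tree and emit one IF-THEN rule per leaf path."""
        t = tree.tree_
        if t.children_left[node] == -1:  # leaf node
            label = class_names[t.value[node][0].argmax()]
            print("IF " + " AND ".join(conds or ("TRUE",)) + f" THEN {label}")
            return
        name, thr = feature_names[t.feature[node]], t.threshold[node]
        tree_to_rules(tree, feature_names, class_names,
                      t.children_left[node], conds + (f"{name} <= {thr:.2f}",))
        tree_to_rules(tree, feature_names, class_names,
                      t.children_right[node], conds + (f"{name} > {thr:.2f}",))

    tree_to_rules(clf, ["itching", "scaling"], clf.classes_)
    ```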

  12. Inference of cancer-specific gene regulatory networks using soft computing rules.

    PubMed

    Wang, Xiaosheng; Gotoh, Osamu

    2010-03-24

    Perturbations of gene regulatory networks are essentially responsible for oncogenesis. Therefore, inferring the gene regulatory networks is a key step to overcoming cancer. In this work, we propose a method for inferring directed gene regulatory networks based on soft computing rules, which can identify important cause-effect regulatory relations of gene expression. First, we identify important genes associated with a specific cancer (colon cancer) using a supervised learning approach. Next, we reconstruct the gene regulatory networks by inferring the regulatory relations among the identified genes, and their regulated relations by other genes within the genome. We obtain two meaningful findings. One is that upregulated genes are regulated by more genes than downregulated ones, while downregulated genes regulate more genes than upregulated ones. The other one is that tumor suppressors suppress tumor activators and activate other tumor suppressors strongly, while tumor activators activate other tumor activators and suppress tumor suppressors weakly, indicating the robustness of biological systems. These findings provide valuable insights into the pathogenesis of cancer.

  13. Single board system for fuzzy inference

    NASA Technical Reports Server (NTRS)

    Symon, James R.; Watanabe, Hiroyuki

    1991-01-01

    The very large scale integration (VLSI) implementation of a fuzzy logic inference mechanism allows the use of rule-based control and decision making in demanding real-time applications. Researchers designed a full custom VLSI inference engine. The chip was fabricated using CMOS technology. The chip consists of 688,000 transistors of which 476,000 are used for RAM memory. The fuzzy logic inference engine board system incorporates the custom designed integrated circuit into a standard VMEbus environment. The Fuzzy Logic system uses Transistor-Transistor Logic (TTL) parts to provide the interface between the Fuzzy chip and a standard, double height VMEbus backplane, allowing the chip to perform application process control through the VMEbus host. High level C language functions hide details of the hardware system interface from the applications level programmer. The first version of the board was installed on a robot at Oak Ridge National Laboratory in January of 1990.

  14. A machine independent expert system for diagnosing environmentally induced spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark J.

    1991-01-01

    A new rule-based, machine-independent analytical tool for diagnosing spacecraft anomalies, the EnviroNET expert system, was developed. Expert systems provide an effective method for storing knowledge, allow computers to sift through large amounts of data pinpointing significant parts, and, most importantly, use heuristics in addition to algorithms, which allow approximate reasoning and inference and the ability to attack problems not rigidly defined. The EnviroNET expert system knowledge base currently contains over two hundred rules, and links to databases which include past environmental data, satellite data, and previously known anomalies. The environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose.

  15. Performance evaluation of the machine learning algorithms used in inference mechanism of a medical decision support system.

    PubMed

    Bal, Mert; Amasyali, M Fatih; Sever, Hayri; Kose, Guven; Demirhan, Ayse

    2014-01-01

    Decision support systems increasingly aid the decision-making process in cases of uncertainty and lack of information, and they are widely used in various fields such as engineering, finance, and medicine. Medical decision support systems help healthcare personnel to select the optimal method during the treatment of patients. Decision support systems are intelligent software systems that support decision makers in their decisions. The design of decision support systems consists of four main components: the inference mechanism, the knowledge base, the explanation module, and active memory. The inference mechanism constitutes the basis of decision support systems. Various methods can be used in these mechanisms, including decision trees, artificial neural networks, statistical methods, and rule-based methods; in decision support systems, those methods can be used separately or combined into hybrid systems. In this study, synthetic data with 10, 100, 1000, and 2000 records have been produced to reflect the probabilities on the ALARM network. The accuracy of 11 machine learning methods for the inference mechanism of a medical decision support system is compared on various data sets.

  16. Development of the Diagnostic Expert System for Tea Processing

    NASA Astrophysics Data System (ADS)

    Yoshitomi, Hitoshi; Yamaguchi, Yuichi

    A diagnostic expert system for tea processing that can infer the cause of defects in processed tea was developed to contribute to the improvement of tea processing. The system, which consists of several programs, can be used through the Internet. The inference engine, the core of the system, adopts a production system approach widely used in artificial intelligence, and is coded in Prolog as an artificial-intelligence-oriented language. At present, 176 rules for inference have been registered in the system. The system will make better inferences as more rules are added.

  17. Equations for Scoring Rules When Data Are Missing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A document presents equations for scoring rules in a diagnostic and/or prognostic artificial-intelligence software system of the rule-based inference-engine type. The equations define a set of metrics that characterize the evaluation of a rule when data required for the antecedent clause(s) of the rule are missing. The metrics include a primary measure denoted the rule completeness metric (RCM) plus a number of subsidiary measures that contribute to the RCM. The RCM is derived from an analysis of a rule with respect to its truth and a measure of the completeness of its input data. The derivation is such that the truth value of an antecedent is independent of the measure of its completeness. The RCM can be used to compare the degree of completeness of two or more rules with respect to a given set of data. Hence, the RCM can be used as a guide to choosing among rules during the rule-selection phase of operation of the artificial-intelligence system.
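
    The memo's actual equations are not reproduced in this summary, so the following is only one plausible reading under stated assumptions, with hypothetical names: completeness as the fraction of antecedent clauses whose data are present, tracked separately from the truth of the clauses that can be evaluated:

    ```python
    # Illustrative sketch, not the memo's exact RCM equations.
    def rule_completeness(antecedents, data):
        """antecedents: data keys the rule needs; data: key -> bool or None.
        Returns (completeness, truth over the known clauses)."""
        known = [a for a in antecedents if data.get(a) is not None]
        completeness = len(known) / len(antecedents)
        truth = all(data[a] for a in known) if known else None
        return completeness, truth

    data = {"sensor_a_ok": True, "sensor_b_ok": None, "sensor_c_ok": True}
    print(rule_completeness(["sensor_a_ok", "sensor_b_ok", "sensor_c_ok"], data))
    # (0.666..., True): true on the known data, but only 2/3 complete
    ```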

  18. Structure identification in fuzzy inference using reinforcement learning

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Khedkar, Pratap

    1993-01-01

    In our previous work on the GARIC architecture, we have shown that the system can start with surface structure of the knowledge base (i.e., the linguistic expression of the rules) and learn the deep structure (i.e., the fuzzy membership functions of the labels used in the rules) by using reinforcement learning. Assuming the surface structure, GARIC refines the fuzzy membership functions used in the consequents of the rules using a gradient descent procedure. This hybrid fuzzy logic and reinforcement learning approach can learn to balance a cart-pole system and to backup a truck to its docking location after a few trials. In this paper, we discuss how to do structure identification using reinforcement learning in fuzzy inference systems. This involves identifying both surface as well as deep structure of the knowledge base. The term set of fuzzy linguistic labels used in describing the values of each control variable must be derived. In this process, splitting a label refers to creating new labels which are more granular than the original label and merging two labels creates a more general label. Splitting and merging of labels directly transform the structure of the action selection network used in GARIC by increasing or decreasing the number of hidden layer nodes.

  19. Rule-based expert system for maritime anomaly detection

    NASA Astrophysics Data System (ADS)

    Roy, Jean

    2010-04-01

    Maritime domain operators/analysts have a mandate to be aware of all that is happening within their areas of responsibility. This mandate derives from the needs to defend sovereignty, protect infrastructures, counter terrorism, detect illegal activities, etc., and it has become more challenging in the past decade, as commercial shipping turned into a potential threat. In particular, a huge portion of the data and information made available to the operators/analysts is mundane, from maritime platforms going about normal, legitimate activities, and it is very challenging for them to detect and identify the non-mundane. To achieve such anomaly detection, they must establish numerous relevant situational facts from a variety of sensor data streams. Unfortunately, many of the facts of interest just cannot be observed; the operators/analysts thus use their knowledge of the maritime domain and their reasoning faculties to infer these facts. As they are often overwhelmed by the large amount of data and information, automated reasoning tools could be used to support them by inferring the necessary facts, ultimately providing indications and warning on a small number of anomalous events worthy of their attention. Along this line of thought, this paper describes a proof-of-concept prototype of a rule-based expert system implementing automated rule-based reasoning in support of maritime anomaly detection.

  1. A novel on-line spatial-temporal k-anonymity method for location privacy protection from sequence rules-based inference attacks

    PubMed Central

    Wu, Chenxue; Liu, Zhao; Zhu, Yunhong

    2017-01-01

    Analyzing large-scale spatial-temporal k-anonymity datasets recorded in location-based service (LBS) application servers can benefit some LBS applications. However, such analyses can allow adversaries to make inference attacks that cannot be handled by spatial-temporal k-anonymity methods or other methods for protecting sensitive knowledge. In response to this challenge, first we defined a destination location prediction attack model based on privacy-sensitive sequence rules mined from large scale anonymity datasets. Then we proposed a novel on-line spatial-temporal k-anonymity method that can resist such inference attacks. Our anti-attack technique generates new anonymity datasets with awareness of privacy-sensitive sequence rules. The new datasets extend the original sequence database of anonymity datasets to hide the privacy-sensitive rules progressively. The process includes two phases: off-line analysis and on-line application. In the off-line phase, sequence rules are mined from an original sequence database of anonymity datasets, and privacy-sensitive sequence rules are developed by correlating privacy-sensitive spatial regions with spatial grid cells among the sequence rules. In the on-line phase, new anonymity datasets are generated upon LBS requests by adopting specific generalization and avoidance principles to hide the privacy-sensitive sequence rules progressively from the extended sequence anonymity datasets database. We conducted extensive experiments to test the performance of the proposed method, and to explore the influence of the parameter K value. The results demonstrated that our proposed approach is faster and more effective at hiding privacy-sensitive sequence rules, in terms of the ratio of sensitive rules hidden, and thus at eliminating inference attacks. Our method also had fewer side effects than the traditional spatial-temporal k-anonymity method in terms of the ratio of newly generated sensitive rules, and essentially the same side effects in terms of the variation ratio of non-sensitive rules. Furthermore, we characterized how performance varies with the parameter K, which can help hide the maximum number of original sensitive rules while generating a minimum of new sensitive rules and affecting a minimum number of non-sensitive rules. PMID:28767687

  2. Functional networks inference from rule-based machine learning models.

    PubMed

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

    Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables that are often different from, or complementary to, those captured by similarity-based methods. We propose a protocol, called FuNeL, to infer functional networks from machine learning models. It assumes that genes used together within a rule-based machine learning model to classify samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods, and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process. The implementation of our network inference protocol is available at: http://ico2s.org/software/funel.html.

  3. Inference of combinatorial Boolean rules of synergistic gene sets from cancer microarray datasets.

    PubMed

    Park, Inho; Lee, Kwang H; Lee, Doheon

    2010-06-15

    Gene set analysis has become an important tool for the functional interpretation of high-throughput gene expression datasets. Moreover, pattern analyses based on inferred gene set activities of individual samples have shown the ability to identify more robust disease signatures than individual gene-based pattern analyses. Although a number of approaches have been proposed for gene set-based pattern analysis, the combinatorial influence of deregulated gene sets on disease phenotype classification has not been studied sufficiently. We propose a new approach for inferring combinatorial Boolean rules of gene sets for a better understanding of cancer transcriptome and cancer classification. To reduce the search space of the possible Boolean rules, we identify small groups of gene sets that synergistically contribute to the classification of samples into their corresponding phenotypic groups (such as normal and cancer). We then measure the significance of the candidate Boolean rules derived from each group of gene sets; the level of significance is based on the class entropy of the samples selected in accordance with the rules. By applying the present approach to publicly available prostate cancer datasets, we identified 72 significant Boolean rules. Finally, we discuss several identified Boolean rules, such as the rule of glutathione metabolism (down) and prostaglandin synthesis regulation (down), which are consistent with known prostate cancer biology. Scripts written in Python and R are available at http://biosoft.kaist.ac.kr/~ihpark/. The refined gene sets and the full list of the identified Boolean rules are provided in the Supplementary Material. Supplementary data are available at Bioinformatics online.
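
    A toy illustration of the scoring idea under invented data: a candidate Boolean rule over discretised gene-set activities selects a subset of samples, and the rule looks significant when that subset has low class entropy:

    ```python
    # Toy scoring of one candidate Boolean rule over discretised gene-set
    # activities; the samples, sets, and rule are invented.
    from math import log2

    samples = [  # (glutathione_down, prostaglandin_down, class label)
        (True, True, "cancer"), (True, True, "cancer"),
        (True, False, "cancer"), (False, False, "normal"),
        (False, True, "normal"), (False, False, "normal"),
    ]

    def class_entropy(labels):
        n = len(labels)
        return -sum((labels.count(c) / n) * log2(labels.count(c) / n)
                    for c in set(labels))

    rule = lambda gsh_dn, pg_dn: gsh_dn and pg_dn  # "both sets down"
    selected = [lbl for g, p, lbl in samples if rule(g, p)]
    print(selected, class_entropy(selected))  # pure 'cancer' subset -> 0.0
    ```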

  4. Construction of a Clinical Decision Support System for Undergoing Surgery Based on Domain Ontology and Rules Reasoning

    PubMed Central

    Bau, Cho-Tsan; Huang, Chung-Yi

    2014-01-01

    Objective: To construct a clinical decision support system (CDSS) for undergoing surgery based on domain ontology and rules reasoning in the setting of hospitalized diabetic patients. Materials and Methods: The ontology was created with a modified ontology development method, including specification and conceptualization, formalization, implementation, and evaluation and maintenance. The Protégé–Web Ontology Language editor was used to implement the ontology. Embedded clinical knowledge was elicited to complement the domain ontology with formal concept analysis. The decision rules were translated into JENA format, which JENA can use to infer recommendations based on patient clinical situations. Results: The ontology includes 31 classes and 13 properties, plus 38 JENA rules that were built to generate recommendations. The evaluation studies confirmed the correctness of the ontology, acceptance of recommendations, satisfaction with the system, and usefulness of the ontology for glycemic management of diabetic patients undergoing surgery, especially for domain experts. Conclusions: The contribution of this research is to set up an evidence-based hybrid ontology and an evaluation method for CDSS. The system can help clinicians to achieve inpatient glycemic control in diabetic patients undergoing surgery while avoiding hypoglycemia. PMID:24730353

  5. Construction of a clinical decision support system for undergoing surgery based on domain ontology and rules reasoning.

    PubMed

    Bau, Cho-Tsan; Chen, Rung-Ching; Huang, Chung-Yi

    2014-05-01

    To construct a clinical decision support system (CDSS) for undergoing surgery based on domain ontology and rules reasoning in the setting of hospitalized diabetic patients. The ontology was created with a modified ontology development method, including specification and conceptualization, formalization, implementation, and evaluation and maintenance. The Protégé-Web Ontology Language editor was used to implement the ontology. Embedded clinical knowledge was elicited to complement the domain ontology with formal concept analysis. The decision rules were translated into JENA format, which JENA can use to infer recommendations based on patient clinical situations. The ontology includes 31 classes and 13 properties, plus 38 JENA rules that were built to generate recommendations. The evaluation studies confirmed the correctness of the ontology, acceptance of recommendations, satisfaction with the system, and usefulness of the ontology for glycemic management of diabetic patients undergoing surgery, especially for domain experts. The contribution of this research is to set up an evidence-based hybrid ontology and an evaluation method for CDSS. The system can help clinicians to achieve inpatient glycemic control in diabetic patients undergoing surgery while avoiding hypoglycemia.

  6. Discrimination of Human Forearm Motions on the Basis of Myoelectric Signals by Using Adaptive Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Kiso, Atsushi; Seki, Hirokazu

    This paper describes a method for discriminating human forearm motions from myoelectric signals using an adaptive fuzzy inference system. In conventional studies, neural networks are often used to estimate motion intention from myoelectric signals and achieve high discrimination precision. This study instead uses fuzzy inference for forearm motion discrimination based on the myoelectric signals. The membership functions and fuzzy rules are designed using the average value and the standard deviation of the root mean square of the myoelectric potential for every channel of each motion. In addition, the characteristics of the myoelectric potential gradually change as a result of muscle fatigue, so motion discrimination should take muscle fatigue into consideration. This study therefore proposes a method to redesign the fuzzy inference system so that dynamic changes in the myoelectric potential caused by muscle fatigue are taken into account. Experiments carried out using a myoelectric hand simulator show the effectiveness of the proposed motion discrimination method.
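
    A small sketch of the feature and membership design described above, with hypothetical calibration statistics: the per-channel RMS of an EMG window is scored by a Gaussian membership built from a motion's mean and standard deviation:

    ```python
    # RMS feature plus Gaussian membership; the calibration statistics
    # (per-channel mean and std for a motion) are invented.
    import math

    def rms(window):
        return math.sqrt(sum(x * x for x in window) / len(window))

    def gaussian_membership(value, mean, std):
        return math.exp(-((value - mean) ** 2) / (2 * std ** 2))

    mean_grasp, std_grasp = 0.42, 0.05   # hypothetical channel statistics
    emg_window = [0.38, -0.45, 0.41, -0.40, 0.44, -0.39]
    feature = rms(emg_window)
    print(gaussian_membership(feature, mean_grasp, std_grasp))  # ~0.99
    ```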

  7. Learning and tuning fuzzy logic controllers through reinforcements.

    PubMed

    Berenji, H R; Khedkar, P

    1992-01-01

    A method for learning and tuning a fuzzy logic controller based on reinforcements from a dynamic system is presented. It is shown that the generalized approximate-reasoning-based intelligent control (GARIC) architecture: learns and tunes a fuzzy logic controller even when only weak reinforcement, such as a binary failure signal, is available; introduces a new conjunction operator in computing the rule strengths of fuzzy control rules; introduces a new localized mean of maximum (LMOM) method for combining the conclusions of several firing control rules; and learns to produce real-valued control actions. Learning is achieved by integrating fuzzy inference into a feedforward network, which can then adaptively improve performance by using gradient descent methods. The GARIC architecture is applied to a cart-pole balancing system and demonstrates significant improvements over previous schemes in terms of learning speed and robustness to changes in the dynamic system's parameters.

  8. [Concepts of rational taxonomy].

    PubMed

    Pavlinov, I Ia

    2011-01-01

    The problems are discussed related to development of concepts of rational taxonomy and rational classifications (taxonomic systems) in biology. Rational taxonomy is based on the assumption that the key characteristic of rationality is deductive inference of certain partial judgments about reality under study from other judgments taken as more general and a priori true. Respectively, two forms of rationality are discriminated--ontological and epistemological ones. The former implies inference of classifications properties from general (essential) properties of the reality being investigated. The latter implies inference of the partial rules of judgments about classifications from more general (formal) rules. The following principal concepts of ontologically rational biological taxonomy are considered: "crystallographic" approach, inference of the orderliness of organismal diversity from general laws of Nature, inference of the above orderliness from the orderliness of ontogenetic development programs, based on the concept of natural kind and Cassirer's series theory, based on the systemic concept, based on the idea of periodic systems. Various concepts of ontologically rational taxonomy can be generalized by an idea of the causal taxonomy, according to which any biologically sound classification is founded on a contentwise model of biological diversity that includes explicit indication of general causes responsible for that diversity. It is asserted that each category of general causation and respective background model may serve as a basis for a particular ontologically rational taxonomy as a distinctive research program. Concepts of epistemologically rational taxonomy and classifications (taxonomic systems) can be interpreted in terms of application of certain epistemological criteria of substantiation of scientific status of taxonomy in general and of taxonomic systems in particular. These concepts include: consideration of taxonomy consistency from the standpoint of inductive and hypothetico-deductive argumentation schemes and such fundamental criteria of classifications naturalness as their prognostic capabilities; foundation of a theory of "general taxonomy" as a "general logic", including elements of the axiomatic method. The latter concept constitutes a core of the program of general classiology; it is inconsistent due to absence of anything like "general logic". It is asserted that elaboration of a theory of taxonomy as a biological discipline based on the formal principles of epistemological rationality is not feasible. Instead, it is to be elaborated as an ontologically rational one based on biologically sound metatheories about biological diversity causes.

  9. Adult Age Differences in Categorization and Multiple-Cue Judgment

    ERIC Educational Resources Information Center

    Mata, Rui; von Helversen, Bettina; Karlsson, Linnea; Cupper, Lutz

    2012-01-01

    We often need to infer unknown properties of objects from observable ones, just like detectives must infer guilt from observable clues and behavior. But how do inferential processes change with age? We examined young and older adults' reliance on rule-based and similarity-based processes in an inference task that can be considered either a…

  10. Forward-Chaining Versus A Graph Approach As The Inference Engine In Expert Systems

    NASA Astrophysics Data System (ADS)

    Neapolitan, Richard E.

    1986-03-01

    Rule-based expert systems are those in which a certain number of IF-THEN rules are assumed to be true. Based on the verity of some assertions, the rules deduce as many new conclusions as possible. A standard technique used to make these deductions is forward-chaining. In forward-chaining, the program or 'inference engine' cycles through the rules. At each rule, the premises for the rule are checked against the current true assertions. If all the premises are found, the conclusion is added to the list of true assertions. At that point it is necessary to start over at the first rule, since the new conclusion may be a premise in a rule already checked. Therefore, each time a new conclusion is deduced it is necessary to start the rule checking procedure over. This process continues until no new conclusions are added and the end of the list of rules is reached. The above process, although quite costly in terms of CPU cycles due to the necessity of repeatedly starting the process over, is necessary if the rules contain 'pattern variables'. An example of such a rule is 'IF X IS A BACTERIA, THEN X CAN BE TREATED WITH ANTIBIOTICS'. Since the rule can lead to conclusions for many values of X, it is necessary to check each premise in the rule against every true assertion, producing an association list to be used in the checking of the next premise. However, if the rule does not contain variable data, as is the case in many current expert systems, then a rule can lead to only one conclusion. In this case, the rules can be stored in a graph, and the true assertions in an assertion list. The assertion list is traversed only once; at each assertion a premise is triggered in all the rules which have that assertion as a premise. When all premises for a rule trigger, the rule's conclusion is added to the END of the list of assertions. It must be added at the end so that it will eventually be used to make further deductions. In the current paper, the two methods are described in detail, the relative advantages of each are discussed, and a benchmark comparing the CPU cycles consumed by each is included. It is also shown that, in the case of reasoning under uncertainty, it is possible to properly combine the certainties derived from rules arguing for the same conclusion when the graph approach is used.
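
    The contrast is easy to sketch for variable-free rules; the toy rule set below is invented. The naive chainer restarts its scan of the rules after every new conclusion, while the graph-style approach indexes rules by premise and processes each assertion exactly once:

    ```python
    from collections import defaultdict

    rules = [({"a", "b"}, "c"), ({"c"}, "d"), ({"d", "a"}, "e")]

    def naive_forward_chain(facts):
        facts = set(facts)
        changed = True
        while changed:                      # rescan rules on any new fact
            changed = False
            for premises, conclusion in rules:
                if premises <= facts and conclusion not in facts:
                    facts.add(conclusion)
                    changed = True
        return facts

    def indexed_forward_chain(facts):
        by_premise = defaultdict(list)
        remaining = []                      # unmet premises per rule
        for i, (premises, _) in enumerate(rules):
            remaining.append(set(premises))
            for p in premises:
                by_premise[p].append(i)
        agenda, seen = list(facts), set(facts)
        while agenda:                       # each assertion handled once
            fact = agenda.pop()
            for i in by_premise[fact]:
                remaining[i].discard(fact)
                if not remaining[i]:        # all premises triggered
                    concl = rules[i][1]
                    if concl not in seen:
                        seen.add(concl)
                        agenda.append(concl)
        return seen

    print(naive_forward_chain({"a", "b"}) == indexed_forward_chain({"a", "b"}))
    ```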

  11. Development of a coupled expert system for the spacecraft attitude control problem

    NASA Technical Reports Server (NTRS)

    Kawamura, K.; Beale, G.; Schaffer, J.; Hsieh, B.-J.; Padalkar, S.; Rodriguezmoscoso, J.; Vinz, F.; Fernandez, K.

    1987-01-01

    A majority of the current expert systems focus on the symbolic-oriented logic and inference mechanisms of artificial intelligence (AI). Common rule-based systems employ empirical associations and are not well suited to deal with problems often arising in engineering. Described is a prototype expert system which combines both symbolic and numeric computing. The expert system's configuration is presented and its application to a spacecraft attitude control problem is discussed.

  12. Incremental Learning of Context Free Grammars by Parsing-Based Rule Generation and Rule Set Search

    NASA Astrophysics Data System (ADS)

    Nakamura, Katsuhiko; Hoshina, Akemi

    This paper discusses recent improvements and extensions in the Synapse system for inductive inference of context-free grammars (CFGs) from sample strings. Synapse uses incremental learning, rule generation based on bottom-up parsing, and search over rule sets. The form of production rules in the previous system is extended from revised Chomsky normal form A→βγ, where each of β and γ is either a terminal or a nonterminal symbol, to extended Chomsky normal form, which also includes rules of the form A→B. From the result of bottom-up parsing, a rule generation mechanism synthesizes the minimum production rules required for parsing positive samples. Instead of the inductive CYK algorithm of the previous version of Synapse, the improved version uses a novel rule generation method, called ``bridging,'' which bridges the missing part of the derivation tree for a positive string. The improved version also employs a novel search strategy, called serial search, in addition to minimum rule set search. The synthesis of grammars by serial search is faster than by minimum set search in most cases. On the other hand, the size of the generated CFGs is generally larger than with minimum set search, and for some CFLs the serial search finds no appropriate grammar. The paper shows experimental results of incremental learning of several fundamental CFGs and compares the methods of rule generation and the search strategies.

  13. Neural networks and logical reasoning systems: a translation table.

    PubMed

    Martins, J; Mendes, R V

    2001-04-01

    A correspondence is established between the basic elements of logic reasoning systems (knowledge bases, rules, inference and queries) and the structure and dynamical evolution laws of neural networks. The correspondence is pictured as a translation dictionary which might allow one to go back and forth between symbolic and network formulations, a desirable step in learning-oriented systems and multicomputer networks. In the framework of Horn clause logics, it is found that atomic propositions with n arguments correspond to nodes with nth order synapses, rules to synaptic intensity constraints, forward chaining to synaptic dynamics and queries either to simple node activation or to a query tensor dynamics.
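
    One entry of the translation table can be made concrete in a toy form, setting aside the paper's higher-order synapses: a Horn rule A & B -> C becomes a threshold node fed by A and B, and forward chaining becomes repeated network updates:

    ```python
    # Toy rendering only; node order is [A, B, C] and the weight row
    # for C encodes the invented rule A & B -> C.
    import numpy as np

    W = np.array([[0, 0, 0],
                  [0, 0, 0],
                  [1, 1, 0]], dtype=float)
    threshold = np.array([1.0, 1.0, 1.9])  # C needs both inputs active

    state = np.array([1.0, 1.0, 0.0])      # facts: A and B are true
    for _ in range(3):                     # dynamics = forward chaining
        state = np.maximum(state, (W @ state >= threshold).astype(float))
    print(state)  # [1. 1. 1.] -> C has been inferred
    ```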

  14. Another expert system rule inference based on DNA molecule logic gates

    NASA Astrophysics Data System (ADS)

    Wąsiewicz, Piotr

    2013-10-01

    With the help of the silicon industry, microfluidic processors were invented that utilize nano membrane valves, pumps, and microreactors. These so-called labs-on-a-chip, combined with molecular computing, create molecular systems-on-a-chip. This work presents a new approach to the implementation of molecular inference systems. It requires a unique representation of signals by DNA molecules. The main part of this work covers the concept of logic gates based on typical genetic engineering reactions. The presented method allows for constructing logic gates with many inputs and for executing them with the same number of elementary operations regardless of the number of input signals. Every microreactor of the lab-on-a-chip performs one unique operation on input molecules and can be connected by dataflow output-input connections to other microreactors.

  15. Adaptive Critic-based Neurofuzzy Controller for the Steam Generator Water Level

    NASA Astrophysics Data System (ADS)

    Fakhrazari, Amin; Boroushaki, Mehrdad

    2008-06-01

    In this paper, an adaptive critic-based neurofuzzy controller is presented for water level regulation of nuclear steam generators. The problem has been of great concern for many years, as the steam generator is a highly nonlinear system showing inverse response dynamics, especially at low operating power levels. Fuzzy critic-based learning is a reinforcement learning method based on dynamic programming. The only information available to the critic agent is the system feedback, which is interpreted as the last action the controller has performed in the previous state. The signal produced by the critic agent is used alongside the error backpropagation algorithm to tune online the conclusion parts of the fuzzy inference rules. The critic agent here has a proportional-derivative structure and the fuzzy rule base has nine rules. The proposed controller shows satisfactory transient responses, disturbance rejection, and robustness to model uncertainty. Its simple design procedure and structure nominate it as a suitable controller design for steam generator water level control in the nuclear power plant industry.

  16. Combining human and machine intelligence to derive agents' behavioral rules for groundwater irrigation

    NASA Astrophysics Data System (ADS)

    Hu, Yao; Quinn, Christopher J.; Cai, Ximing; Garfinkle, Noah W.

    2017-11-01

    For agent-based modeling, the major challenges in deriving agents' behavioral rules arise from agents' bounded rationality and data scarcity. This study proposes a "gray box" approach to address the challenge by incorporating expert domain knowledge (i.e., human intelligence) with machine learning techniques (i.e., machine intelligence). Specifically, we propose using directed information graphs (DIG), boosted regression trees (BRT), and domain knowledge to infer causal factors and identify behavioral rules from data. A case study is conducted to investigate farmers' pumping behavior in the Midwest, U.S.A. Results show that four factors identified by the DIG algorithm--corn price, underlying groundwater level, monthly mean temperature, and precipitation--have the main causal influences on agents' decisions on monthly groundwater irrigation depth. The agent-based model is then developed based on the behavioral rules represented by three DIGs and modeled by BRTs, and coupled with a physically-based groundwater model to investigate the impacts of agents' pumping behavior on the underlying groundwater system in the context of coupled human and environmental systems.
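
    The "machine intelligence" half might look like the following sketch, with synthetic data standing in for the real pumping records: boosted regression trees map the four identified drivers to monthly irrigation depth:

    ```python
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    n = 200
    X = np.column_stack([
        rng.uniform(3, 8, n),      # corn price
        rng.uniform(5, 30, n),     # depth to groundwater
        rng.uniform(10, 35, n),    # monthly mean temperature
        rng.uniform(0, 150, n),    # monthly precipitation
    ])
    # Invented behavioural tendency: pump more when prices are high,
    # temperatures are high, and rainfall is low.
    y = 0.5 * X[:, 0] + 0.2 * X[:, 2] - 0.03 * X[:, 3] + rng.normal(0, 0.5, n)

    brt = GradientBoostingRegressor().fit(X, y)
    print(brt.predict([[6.0, 12.0, 30.0, 20.0]]))  # predicted irrigation depth
    ```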

  17. CDMBE: A Case Description Model Based on Evidence

    PubMed Central

    Zhu, Jianlin; Yang, Xiaoping; Zhou, Jing

    2015-01-01

    By combining the advantages of argument maps and Bayesian networks, a case description model based on evidence (CDMBE), suitable for the continental law system, is proposed to describe criminal cases. The logic of the model adopts credibility-based logical reasoning and derives evidence-based conclusions quantitatively from the evidence. To be consistent with practical inference rules, five types of relationship and a set of rules are defined to calculate the credibility of assumptions based on the credibility and supportability of the related evidence. Experiments show that the model can capture users' ideas in a diagram and that the results calculated from CDMBE are in line with those from a Bayesian model. PMID:26421006

  18. TMS for Instantiating a Knowledge Base With Incomplete Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A computer program that belongs to the class known among software experts as output truth-maintenance-systems (output TMSs) has been devised as one of a number of software tools for reducing the size of the knowledge base that must be searched during execution of artificial-intelligence software of the rule-based inference-engine type in a case in which data are missing. This program determines whether the consequences of activation of two or more rules can be combined without causing a logical inconsistency. For example, in a case involving hypothetical scenarios that could lead to turning a given device on or off, the program determines whether a scenario involving a given combination of rules could lead to turning the device both on and off at the same time, in which case that combination of rules would not be included in the scenario.
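
    A minimal sketch of the consistency test under an assumed literal naming convention ('!' marks negation): a combination of rule consequences is rejected when it asserts a proposition and its negation, such as a device being on and off at once:

    ```python
    # Hypothetical literals: 'valve_on' asserts, '!valve_on' negates.
    def consistent(consequences):
        """Reject scenarios asserting both a proposition and its negation."""
        return not any("!" + c in consequences
                       for c in consequences if not c.startswith("!"))

    print(consistent({"valve_on", "heater_off", "!valve_on"}))  # False
    print(consistent({"valve_on", "heater_off"}))               # True
    ```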

  19. Using Fuzzy Gaussian Inference and Genetic Programming to Classify 3D Human Motions

    NASA Astrophysics Data System (ADS)

    Khoury, Mehdi; Liu, Honghai

    This research introduces and builds on the concept of Fuzzy Gaussian Inference (FGI) (Khoury and Liu in Proceedings of UKCI, 2008 and IEEE Workshop on Robotic Intelligence in Informationally Structured Space (RiiSS 2009), 2009) as a novel way to build Fuzzy Membership Functions that map to hidden Probability Distributions underlying human motions. This method is now combined with a Genetic Programming Fuzzy rule-based system in order to classify boxing moves from natural human Motion Capture data. In this experiment, FGI alone is able to recognise seven different boxing stances simultaneously with an accuracy superior to a GMM-based classifier. Results seem to indicate that adding an evolutionary Fuzzy Inference Engine on top of FGI improves the accuracy of the classifier in a consistent way.

  20. Knowledge representation for fuzzy inference aided medical image interpretation.

    PubMed

    Gal, Norbert; Stoicu-Tivadar, Vasile

    2012-01-01

    Knowledge defines how an automated system transforms data into information. This paper suggests a representation method of medical imaging knowledge using fuzzy inference systems coded in XML files. The imaging knowledge incorporates features of the investigated objects in linguistic form and inference rules that can transform the linguistic data into information about a possible diagnosis. A fuzzy inference system is used to model the vagueness of the linguistic medical imaging terms. XML files are used to facilitate easy manipulation and deployment of the knowledge into the imaging software. Preliminary results are presented.

  1. Design and implementation of the tree-based fuzzy logic controller.

    PubMed

    Liu, B D; Huang, C Y

    1997-01-01

    In this paper, a tree-based approach is proposed to design the fuzzy logic controller. Based on the proposed methodology, the fuzzy logic controller has the following merits: the fuzzy control rules can be extracted automatically from the input-output data of the system, and the extraction can be done in one pass; owing to the fuzzy tree inference structure, the search spaces of the fuzzy inference process are greatly reduced; the inference process can be simplified to a one-dimensional matrix operation because of the fuzzy tree approach; and the controller has regular and modular properties, so it is easy to implement in hardware. Furthermore, the proposed fuzzy tree approach has been applied to design a color reproduction system to verify the methodology. The color reproduction system is mainly used to obtain a color image through the printer that is identical to the original one. In addition to the software simulation, an FPGA is used to implement the prototype hardware system for real-time application. Experimental results show that the effect of color correction is quite good and that the prototype hardware system can operate correctly at a 30 MHz clock rate.

  2. An expert system for choosing the best combination of options in a general purpose program for automated design synthesis

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.; Barthelemy, J.-F. M.

    1986-01-01

    An expert system called EXADS has been developed to aid users of the Automated Design Synthesis (ADS) general-purpose optimization program. ADS has approximately 100 combinations of strategy, optimizer, and one-dimensional search options from which to choose, and it is difficult for a nonexpert to make this choice. This expert system aids the user in choosing the best combination of options based on the user's knowledge of the problem and the expert knowledge stored in the knowledge base. The knowledge base is divided into three categories: constrained problems, unconstrained problems, and constrained problems being treated as unconstrained problems. The system is written in LISP, contains about 200 rules, and executes on DEC-VAX (with Franz-LISP) and IBM PC (with IQ-LISP) computers.

  3. Ozone levels in the Empty Quarter of Saudi Arabia--application of adaptive neuro-fuzzy model.

    PubMed

    Rahman, Syed Masiur; Khondaker, A N; Khan, Rouf Ahmad

    2013-05-01

    In arid regions, primary pollutants may contribute to increased ozone levels and cause negative effects on biotic health. This study investigates the use of an adaptive neuro-fuzzy inference system (ANFIS) for ozone prediction. The initial fuzzy inference system is developed using the fuzzy C-means (FCM) and subtractive clustering (SC) algorithms, which determine the important rules, increase the generalization capability of the fuzzy inference system, reduce computational needs, and ensure speedy model development. The study area is located in the Empty Quarter of Saudi Arabia, which is considered a source of huge potential for oil and gas field development. The developed clustering-algorithm-based ANFIS models used meteorological data and derived meteorological data, along with NO and NO₂ concentrations and their transformations, as inputs. The root mean square error and Willmott's index of agreement of the FCM- and SC-based ANFIS models are 3.5 ppbv and 0.99, and 8.9 ppbv and 0.95, respectively. Based on the analysis of the performance measures and regression error characteristic curves, it is concluded that the FCM-based ANFIS model outperforms the SC-based ANFIS model.

  4. An ontology for Autism Spectrum Disorder (ASD) to infer ASD phenotypes from Autism Diagnostic Interview-Revised data.

    PubMed

    Mugzach, Omri; Peleg, Mor; Bagley, Steven C; Guter, Stephen J; Cook, Edwin H; Altman, Russ B

    2015-08-01

    Our goal is to create an ontology that will allow data integration and reasoning with subject data to classify subjects, and based on this classification, to infer new knowledge on Autism Spectrum Disorder (ASD) and related neurodevelopmental disorders (NDD). We take a first step toward this goal by extending an existing autism ontology to allow automatic inference of ASD phenotypes and Diagnostic & Statistical Manual of Mental Disorders (DSM) criteria based on subjects' Autism Diagnostic Interview-Revised (ADI-R) assessment data. Knowledge regarding diagnostic instruments, ASD phenotypes and risk factors was added to augment an existing autism ontology via Ontology Web Language class definitions and semantic web rules. We developed a custom Protégé plugin for enumerating combinatorial OWL axioms to support the many-to-many relations of ADI-R items to diagnostic categories in the DSM. We utilized a reasoner to infer whether 2642 subjects, whose data was obtained from the Simons Foundation Autism Research Initiative, meet DSM-IV-TR (DSM-IV) and DSM-5 diagnostic criteria based on their ADI-R data. We extended the ontology by adding 443 classes and 632 rules that represent phenotypes, along with their synonyms, environmental risk factors, and frequency of comorbidities. Applying the rules on the data set showed that the method produced accurate results: the true positive and true negative rates for inferring autistic disorder diagnosis according to DSM-IV criteria were 1 and 0.065, respectively; the true positive rate for inferring ASD based on DSM-5 criteria was 0.94. The ontology allows automatic inference of subjects' disease phenotypes and diagnosis with high accuracy. The ontology may benefit future studies by serving as a knowledge base for ASD. In addition, by adding knowledge of related NDDs, commonalities and differences in manifestations and risk factors could be automatically inferred, contributing to the understanding of ASD pathophysiology. Copyright © 2015 Elsevier Inc. All rights reserved.

  5. An HL7-CDA wrapper for facilitating semantic interoperability to rule-based Clinical Decision Support Systems.

    PubMed

    Sáez, Carlos; Bresó, Adrián; Vicente, Javier; Robles, Montserrat; García-Gómez, Juan Miguel

    2013-03-01

    The success of Clinical Decision Support Systems (CDSS) greatly depends on its capability of being integrated in Health Information Systems (HIS). Several proposals have been published up to date to permit CDSS gathering patient data from HIS. Some base the CDSS data input on the HL7 reference model, however, they are tailored to specific CDSS or clinical guidelines technologies, or do not focus on standardizing the CDSS resultant knowledge. We propose a solution for facilitating semantic interoperability to rule-based CDSS focusing on standardized input and output documents conforming an HL7-CDA wrapper. We define the HL7-CDA restrictions in a HL7-CDA implementation guide. Patient data and rule inference results are mapped respectively to and from the CDSS by means of a binding method based on an XML binding file. As an independent clinical document, the results of a CDSS can present clinical and legal validity. The proposed solution is being applied in a CDSS for providing patient-specific recommendations for the care management of outpatients with diabetes mellitus. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  6. Eliminative Argumentation: A Basis for Arguing Confidence in System Properties

    DTIC Science & Technology

    2015-02-01

    errors to acceptable system reliability is unsound. But this is not an acceptable undercutting defeater; it does not put the conclusion about system... first to note sources of unsoundness in arguments, namely, questionable inference rules and weaknesses in proffered evidence. However, the notions of... This material is based upon work funded and supported by the Department of Defense under Contract No. FA8721-05-C-0003 with Carnegie Mellon University.

  7. Design a Fuzzy Rule-based Expert System to Aid Earlier Diagnosis of Gastric Cancer.

    PubMed

    Safdari, Reza; Arpanahi, Hadi Kazemi; Langarizadeh, Mostafa; Ghazisaiedi, Marjan; Dargahi, Hossein; Zendehdel, Kazem

    2018-01-01

    Screening and health check-up programs are among the most important public health priorities and should be undertaken to control dangerous diseases such as gastric cancer, which is affected by many factors. More than 50% of gastric cancer diagnoses are made during the advanced stage, and there is currently no systematic approach for the early diagnosis of gastric cancer. The aim was to develop a fuzzy expert system that can identify gastric cancer risk levels in individuals. The system was implemented in MATLAB, the Mamdani inference technique was applied to simulate the reasoning of experts in the field, and a total of 67 fuzzy rules were extracted as a rule base from medical experts' opinions. Fifty case scenarios were used to evaluate the system: the information from each case report was given to the system to find its risk level, and the obtained results were compared with the experts' diagnoses. Results revealed a sensitivity of 92.1% and a specificity of 83.1%. The results show that it is possible to develop a system that can identify high-risk individuals for gastric cancer. Such a system can lead to earlier diagnosis, which may facilitate early treatment and reduce the gastric cancer mortality rate.
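
    A minimal Mamdani-style sketch with two invented risk factors and two rules, far smaller than the 67-rule system described above: rule strengths clip the output membership functions, the clipped sets are aggregated by max, and the centroid gives a crisp risk score:

    ```python
    # Two-rule Mamdani sketch with invented inputs (age, H. pylori test
    # confidence); memberships, rules, and scales are illustrative only.
    import numpy as np

    def ramp_up(x, lo, hi):
        return np.clip((x - lo) / (hi - lo), 0.0, 1.0)

    def ramp_down(x, lo, hi):
        return np.clip((hi - x) / (hi - lo), 0.0, 1.0)

    risk = np.linspace(0, 100, 501)          # output universe (risk score)
    low_risk_mf = ramp_down(risk, 0, 100)
    high_risk_mf = ramp_up(risk, 0, 100)

    def assess(age, h_pylori):
        older = ramp_up(age, 30, 70)
        w_high = min(older, h_pylori)        # IF older AND h_pylori THEN high
        w_low = 1.0 - older                  # IF NOT older THEN low
        # Clip each consequent set, aggregate with max, defuzzify by centroid.
        agg = np.maximum(np.minimum(w_high, high_risk_mf),
                         np.minimum(w_low, low_risk_mf))
        return float((risk * agg).sum() / agg.sum())

    print(assess(age=62, h_pylori=0.9))      # crisp risk score on 0-100
    ```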

  8. Challenges in Requirements Engineering: A Research Agenda for Conceptual Modeling

    NASA Astrophysics Data System (ADS)

    March, Salvatore T.; Allen, Gove N.

    Domains for which information systems are developed deal primarily with social constructions—conceptual objects and attributes created by human intentions and for human purposes. Information systems play an active role in these domains. They document the creation of new conceptual objects, record and ascribe values to their attributes, initiate actions within the domain, track activities performed, and infer conclusions based on the application of rules that govern how the domain is affected when socially-defined and identified causal events occur. Emerging applications of information technologies evaluate such business rules, learn from experience, and adapt to changes in the domain. Conceptual modeling grammars aimed at representing their system requirements must include conceptual objects, socially-defined events, and the rules pertaining to them. We identify challenges to conceptual modeling research and pose an ontology of the artificial as a step toward meeting them.

  9. HyFIS: adaptive neuro-fuzzy inference systems and their application to nonlinear dynamical systems.

    PubMed

    Kim, J; Kasabov, N

    1999-11-01

    This paper proposes an adaptive neuro-fuzzy system, HyFIS (Hybrid neural Fuzzy Inference System), for building and optimising fuzzy models. The proposed model introduces the learning power of neural networks to fuzzy logic systems and provides linguistic meaning to the connectionist architectures. Heuristic fuzzy logic rules and input-output fuzzy membership functions can be optimally tuned from training examples by a hybrid learning scheme comprising two phases: a rule generation phase from data, and a rule tuning phase using an error backpropagation learning scheme for a neural fuzzy system. To illustrate the performance and applicability of the proposed neuro-fuzzy hybrid model, extensive simulation studies of nonlinear complex dynamic systems are carried out. The proposed method can be applied to on-line incremental adaptive learning for the prediction and control of nonlinear dynamical systems. Two benchmark case studies are used to demonstrate that the proposed HyFIS system is a superior neuro-fuzzy modelling technique.

  10. Land cover classification of Landsat 8 satellite data based on Fuzzy Logic approach

    NASA Astrophysics Data System (ADS)

    Taufik, Afirah; Sakinah Syed Ahmad, Sharifah

    2016-06-01

    The aim of this paper is to propose a method to classify the land covers of a satellite image based on a fuzzy rule-based system approach. The study uses bands of Landsat 8 and other indices, such as the Normalized Difference Water Index (NDWI), the Normalized Difference Built-up Index (NDBI), and the Normalized Difference Vegetation Index (NDVI), as input for the fuzzy inference system. The three selected indices represent our three main classes: water, built-up land, and vegetation. The combination of the original multispectral bands and the selected indices provides more information about the image. The parameter selection of the fuzzy memberships is performed by using a supervised method known as ANFIS (adaptive neuro-fuzzy inference system) training. The fuzzy system is tested for classification on a land cover image that covers the Klang Valley area. The results showed that the fuzzy system approach is effective and can be explored and implemented for other areas of Landsat data.
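
    The three indices are simple normalized band ratios. The sketch below computes them, assuming the usual Landsat 8 band roles (green = B3, red = B4, NIR = B5, SWIR1 = B6); the reflectance values are toy data:

```python
# The three spectral indices used as inputs to the fuzzy system.
import numpy as np

def ndvi(nir, red):     # vegetation: high NIR, low red
    return (nir - red) / (nir + red + 1e-9)

def ndwi(green, nir):   # water: high green, low NIR
    return (green - nir) / (green + nir + 1e-9)

def ndbi(swir1, nir):   # built-up land: high SWIR, low NIR
    return (swir1 - nir) / (swir1 + nir + 1e-9)

# Toy 2x2 reflectance arrays standing in for calibrated band rasters
green = np.array([[0.10, 0.12], [0.30, 0.08]])
red   = np.array([[0.08, 0.10], [0.25, 0.07]])
nir   = np.array([[0.40, 0.45], [0.28, 0.05]])
swir1 = np.array([[0.20, 0.22], [0.35, 0.04]])

print(ndvi(nir, red))    # near +1 over vegetation pixels
print(ndwi(green, nir))  # positive over water pixels
print(ndbi(swir1, nir))  # positive over built-up pixels
```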

  11. Learning and tuning fuzzy logic controllers through reinforcements

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Khedkar, Pratap

    1992-01-01

    A new method for learning and tuning a fuzzy logic controller based on reinforcements from a dynamic system is presented. In particular, our Generalized Approximate Reasoning-based Intelligent Control (GARIC) architecture: (1) learns and tunes a fuzzy logic controller even when only weak reinforcements, such as a binary failure signal, are available; (2) introduces a new conjunction operator in computing the rule strengths of fuzzy control rules; (3) introduces a new localized mean of maximum (LMOM) method in combining the conclusions of several firing control rules; and (4) learns to produce real-valued control actions. Learning is achieved by integrating fuzzy inference into a feedforward network, which can then adaptively improve performance by using gradient descent methods. We extend the AHC algorithm of Barto, Sutton, and Anderson to include the prior control knowledge of human operators. The GARIC architecture is applied to a cart-pole balancing system and has demonstrated significant improvements in terms of the speed of learning and robustness to changes in the dynamic system's parameters over previous schemes for cart-pole balancing.

  12. Optical Generation of Fuzzy-Based Rules

    NASA Astrophysics Data System (ADS)

    Gur, Eran; Mendlovic, David; Zalevsky, Zeev

    2002-08-01

    In the last third of the 20th century, fuzzy logic rose from a mathematical concept to an applicable approach in soft computing. Today, fuzzy logic is used in control systems for various applications, such as washing machines, train-brake systems, automatic automobile gears, and so forth. The approach of optically implementing fuzzy inferencing was presented by the authors in previous papers, with extra emphasis on applications with two dominant inputs. In this paper the authors introduce a real-time optical rule generator for the dual-input fuzzy-inference engine. The paper briefly reviews the dual-input optical implementation of fuzzy-logic inferencing. Then, the concept of constructing a set of rules from given data is discussed. Next, the authors show ways to implement this procedure optically. The discussion is accompanied by an example that illustrates the transformation from raw data into fuzzy set rules.

  13. Expert systems for automated correlation and interpretation of wireline logs

    USGS Publications Warehouse

    Olea, R.A.

    1994-01-01

    CORRELATOR is an interactive computer program for lithostratigraphic correlation of wireline logs, able to store correlations in a data base with a consistency, accuracy, speed, and resolution that are difficult to obtain manually. The automatic determination of correlations is based on the maximization of a weighted correlation coefficient using two wireline logs per well. CORRELATOR has an expert system to scan and flag incongruous correlations in the data base. The user has the option to accept or disregard the advice offered by the system. The expert system represents knowledge through production rules. The inference system is goal-driven and uses backward chaining to scan through the rules. Work in progress is used to illustrate the potential that a second expert system with a similar architecture for interpreting dip diagrams could have to identify episodes, such as those of interest in sequence stratigraphy and fault detection, and annotate them in the stratigraphic column. Several examples illustrate the presentation. © 1994 International Association for Mathematical Geology.

  14. Integration of object-oriented knowledge representation with the CLIPS rule based system

    NASA Technical Reports Server (NTRS)

    Logie, David S.; Kamil, Hasan

    1990-01-01

    The paper describes a portion of the work aimed at developing an integrated, knowledge-based environment for the development of engineering-oriented applications. An Object Representation Language (ORL) was implemented in C++ and is used to build and modify an object-oriented knowledge base. The ORL was designed in such a way as to be easily integrated with other representation schemes that could effectively reason with the object base. Specifically, the integration of the ORL with the rule-based system C Language Integrated Production System (CLIPS), developed at the NASA Johnson Space Center, is discussed. The object-oriented knowledge representation provides a natural means of representing problem data as a collection of related objects. Objects are comprised of descriptive properties and interrelationships. The object-oriented model promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects. Data is inherited through an object network via the relationship links. Together, the two schemes complement each other in that the object-oriented approach efficiently handles problem data while the rule-based knowledge is used to simulate the reasoning process. Alone, the object-based knowledge is little more than an object-oriented data storage scheme; however, the CLIPS inference engine adds the mechanism to directly and automatically reason with that knowledge. In this hybrid scheme, the expert system dynamically queries for data and can modify the object base with complete access to all the functionality of the ORL from rules.
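
    The division of labor (objects hold inheritable data, rules query and modify them) can be sketched in a few lines. This is plain Python with invented names, not the paper's C++ ORL or its CLIPS integration:

```python
# Objects hold the problem data, with inheritance over relationship links;
# a rule queries the object base and writes its conclusion back into it.
class Obj:
    def __init__(self, name, parent=None, **props):
        self.name, self.parent, self.props = name, parent, props

    def get(self, key):
        """Property lookup with inheritance through the parent link."""
        if key in self.props:
            return self.props[key]
        return self.parent.get(key) if self.parent else None

beam = Obj("beam", material="steel")                 # class-like object
b17  = Obj("beam-17", parent=beam, stress=310.0)     # instance inherits material

# Rule: IF an object's stress exceeds the yield limit of its material
#       THEN flag the object.
YIELD_LIMIT = {"steel": 250.0}

def overstress_rule(obj):
    material, stress = obj.get("material"), obj.get("stress")
    if material in YIELD_LIMIT and stress and stress > YIELD_LIMIT[material]:
        obj.props["flagged"] = True

overstress_rule(b17)
print(b17.get("flagged"), b17.get("material"))   # -> True steel
```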

  15. Plausible inference: A multi-valued logic for problem solving

    NASA Technical Reports Server (NTRS)

    Friedman, L.

    1979-01-01

    A new logic is developed which permits continuously variable strength of belief in the truth of assertions. Four inference rules result, with formal logic as a limiting case. Quantification of belief is defined. Propagation of belief to linked assertions results from dependency-based techniques of truth maintenance, so that local consistency is achieved or a contradiction is discovered in problem solving. Rules for combining, confirming, or disconfirming beliefs are given, and several heuristics are suggested that apply to revising already-formed beliefs in the light of new evidence. The strength of belief that results from such revisions based on conflicting evidence is a highly subjective phenomenon. Certain quantification rules appear to reflect an orderliness in the subjectivity. Several examples of reasoning by plausible inference are given, including a legal example and one from robot learning. Propagation of belief takes place in directions forbidden in formal logic, and this results in conclusions becoming possible for a given set of assertions that are not reachable by formal logic.

  16. Deduction of reservoir operating rules for application in global hydrological models

    NASA Astrophysics Data System (ADS)

    Coerver, Hubertus M.; Rutten, Martine M.; van de Giesen, Nick C.

    2018-01-01

    A big challenge in constructing global hydrological models is the inclusion of anthropogenic impacts on the water cycle, such as those caused by dams. Dam operators make decisions based on experience and often uncertain information. In this study, information generally available to dam operators, such as inflow into the reservoir and storage levels, was used to derive fuzzy rules describing the way a reservoir is operated. Using an artificial neural network capable of mimicking fuzzy logic, the adaptive-network-based fuzzy inference system (ANFIS), fuzzy rules linking inflow and storage with reservoir release were determined for 11 reservoirs in central Asia, the US, and Vietnam. By varying the input variables of the neural network, different configurations of fuzzy rules were created and tested. It was found that the release from relatively large reservoirs was significantly dependent on information concerning recent storage levels, while release from smaller reservoirs was more dependent on reservoir inflows. Subsequently, the derived rules were used to simulate reservoir release with an average Nash-Sutcliffe coefficient of 0.81.

  17. An Expert System for Diagnosis of Sleep Disorder Using Fuzzy Rule-Based Classification Systems

    NASA Astrophysics Data System (ADS)

    Septem Riza, Lala; Pradini, Mila; Fitrajaya Rahman, Eka; Rasim

    2017-03-01

    Sleep disorder is an anomaly that can disrupt a person's sleeping pattern. It has become a growing issue, since people are increasingly busy and have little time to visit a doctor. This research therefore aims to develop a system for the diagnosis of sleep disorder using a Fuzzy Rule-Based Classification System (FRBCS). FRBCS is a method based on fuzzy set concepts. It consists of two steps: (i) constructing a model/knowledge base involving a rule base and a database, and (ii) prediction over new data. In this case, the knowledge is obtained from experts, whereas in the prediction stage we perform fuzzification, inference, and classification. A platform implementing the method is built with a combination of PHP and the R programming language, using the "Shiny" package. To validate the system, experiments were conducted using data from a psychiatric hospital in West Java, Indonesia. The accuracy of the results and the computation time are 84.85% and 0.0133 seconds, respectively.

  18. Situation-Assessment And Decision-Aid Production-Rule Analysis System For Nuclear Plant Monitoring And Emergency Preparedness

    NASA Astrophysics Data System (ADS)

    Gvillo, D.; Ragheb, M.; Parker, M.; Swartz, S.

    1987-05-01

    A Production-Rule Analysis System is developed for Nuclear Plant Monitoring. The signals generated by the Zion-1 Plant are considered. A Situation-Assessment and Decision-Aid capability is provided for monitoring the integrity of the Plant Radiation, the Reactor Coolant, the Fuel Clad, and the Containment Systems. A total of 41 signals are currently fed as facts to an Inference Engine functioning in the backward-chaining mode and built along the same structure as the E-Mycin system. The Goal-Tree constituting the Knowledge Base was generated using a representation in the form of Fault Trees deduced from plant procedures information. The system is constructed in support of the Data Analysis and Emergency Preparedness tasks at the Illinois Radiological Emergency Assessment Center (REAC).

  19. A Rational Analysis of Rule-Based Concept Learning

    ERIC Educational Resources Information Center

    Goodman, Noah D.; Tenenbaum, Joshua B.; Feldman, Jacob; Griffiths, Thomas L.

    2008-01-01

    This article proposes a new model of human concept learning that provides a rational analysis of learning feature-based concepts. This model is built upon Bayesian inference for a grammatically structured hypothesis space--a concept language of logical rules. This article compares the model predictions to human generalization judgments in several…

  20. Starmind: A Fuzzy Logic Knowledge-Based System for the Automated Classification of Stars in the MK System

    NASA Astrophysics Data System (ADS)

    Manteiga, M.; Carricajo, I.; Rodríguez, A.; Dafonte, C.; Arcay, B.

    2009-02-01

    Astrophysics is evolving toward a more rational use of costly observational data by intelligently exploiting large terrestrial and space astronomical databases. In this paper, we present a study showing the suitability of an expert system for performing the classification of stellar spectra in the Morgan and Keenan (MK) system. Using the formalism of artificial intelligence for the development of such a system, we propose a rule base that contains classification criteria and confidence grades, all integrated in an inference engine that emulates human reasoning by means of a hierarchical decision-rule tree that also considers the uncertainty factors associated with rules. Our main objective is to illustrate the formulation and development of such a system for an astrophysical classification problem. An extensive database of MK standard spectra has been collected and used as a reference to determine the spectral indexes that are suitable for classification in the MK system. It is shown that by considering 30 spectral indexes and associating them with uncertainty factors, we can find an accurate diagnosis in MK types for a particular spectrum. The system was evaluated against the NOAO-INDO-US spectral catalog.

  1. Towards a Semantically-Enabled Control Strategy for Building Simulations: Integration of Semantic Technologies and Model Predictive Control

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Delgoshaei, Parastoo; Austin, Mark A.; Pertzborn, Amanda J.

    State-of-the-art building simulation control methods incorporate physical constraints into their mathematical models, but omit implicit constraints associated with policies of operation and dependency relationships among rules representing those constraints. To overcome these shortcomings, there is a recent trend in enabling the control strategies with inference-based rule checking capabilities. One solution is to exploit semantic web technologies in building simulation control. Such approaches provide the tools for semantic modeling of domains, and the ability to deduce new information based on the models through use of Description Logic (DL). In a step toward enabling this capability, this paper presents a cross-disciplinary data-driven control strategy for building energy management simulation that integrates semantic modeling and formal rule checking mechanisms into a Model Predictive Control (MPC) formulation. The results show that MPC provides superior levels of performance when initial conditions and inputs are derived from inference-based rules.

  2. Travel Time Estimation Using Freeway Point Detector Data Based on Evolving Fuzzy Neural Inference System.

    PubMed

    Tang, Jinjun; Zou, Yajie; Ash, John; Zhang, Shen; Liu, Fang; Wang, Yinhai

    2016-01-01

    Travel time is an important measurement used to evaluate the extent of congestion within road networks. This paper presents a new method to estimate the travel time based on an evolving fuzzy neural inference system. The input variables in the system are traffic flow data (volume, occupancy, and speed) collected from loop detectors located at points both upstream and downstream of a given link, and the output variable is the link travel time. A first order Takagi-Sugeno fuzzy rule set is used to complete the inference. For training the evolving fuzzy neural network (EFNN), two learning processes are proposed: (1) a K-means method is employed to partition input samples into different clusters, and a Gaussian fuzzy membership function is designed for each cluster to measure the membership degree of samples to the cluster centers. As the number of input samples increases, the cluster centers are modified and membership functions are also updated; (2) a weighted recursive least squares estimator is used to optimize the parameters of the linear functions in the Takagi-Sugeno type fuzzy rules. Testing datasets consisting of actual and simulated data are used to test the proposed method. Three common criteria including mean absolute error (MAE), root mean square error (RMSE), and mean absolute relative error (MARE) are utilized to evaluate the estimation performance. Estimation results demonstrate the accuracy and effectiveness of the EFNN method through comparison with existing methods including: multiple linear regression (MLR), instantaneous model (IM), linear model (LM), neural network (NN), and cumulative plots (CP).
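
    The inference step described above can be sketched compactly: Gaussian memberships around cluster centres produce rule firing strengths, which weight first-order Takagi-Sugeno consequents. The centres, widths, and coefficients below are made up for the demo; in the paper they come from K-means clustering and weighted recursive least squares.

```python
# First-order Takagi-Sugeno inference with Gaussian rule memberships.
import numpy as np

centers = np.array([[0.2, 0.3, 0.4],     # per-rule cluster centres over
                    [0.7, 0.8, 0.6]])    # (volume, occupancy, speed)
sigma   = 0.25                           # shared membership width
coeffs  = np.array([[2.0, 1.0, 0.5, 10.0],   # [a1, a2, a3, bias] per rule
                    [4.0, 0.5, 1.5, 30.0]])

def estimate_travel_time(x):
    """x = [volume, occupancy, speed], scaled to [0, 1]."""
    # Firing strength of each rule: product-style Gaussian membership
    w = np.exp(-np.sum((x - centers) ** 2, axis=1) / (2 * sigma ** 2))
    # First-order TS consequent: a linear function of the inputs per rule
    y = coeffs[:, :3] @ x + coeffs[:, 3]
    return float(w @ y / w.sum())        # normalised weighted average

print(estimate_travel_time(np.array([0.6, 0.7, 0.5])))
```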

  3. Travel Time Estimation Using Freeway Point Detector Data Based on Evolving Fuzzy Neural Inference System

    PubMed Central

    Tang, Jinjun; Zou, Yajie; Ash, John; Zhang, Shen; Liu, Fang; Wang, Yinhai

    2016-01-01

    Travel time is an important measurement used to evaluate the extent of congestion within road networks. This paper presents a new method to estimate the travel time based on an evolving fuzzy neural inference system. The input variables in the system are traffic flow data (volume, occupancy, and speed) collected from loop detectors located at points both upstream and downstream of a given link, and the output variable is the link travel time. A first order Takagi-Sugeno fuzzy rule set is used to complete the inference. For training the evolving fuzzy neural network (EFNN), two learning processes are proposed: (1) a K-means method is employed to partition input samples into different clusters, and a Gaussian fuzzy membership function is designed for each cluster to measure the membership degree of samples to the cluster centers. As the number of input samples increases, the cluster centers are modified and membership functions are also updated; (2) a weighted recursive least squares estimator is used to optimize the parameters of the linear functions in the Takagi-Sugeno type fuzzy rules. Testing datasets consisting of actual and simulated data are used to test the proposed method. Three common criteria including mean absolute error (MAE), root mean square error (RMSE), and mean absolute relative error (MARE) are utilized to evaluate the estimation performance. Estimation results demonstrate the accuracy and effectiveness of the EFNN method through comparison with existing methods including: multiple linear regression (MLR), instantaneous model (IM), linear model (LM), neural network (NN), and cumulative plots (CP). PMID:26829639

  4. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can thus greatly reduce software development cost and time. The use of a formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to the component-based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing component composition efforts. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key to inferring appropriate component compositions. We use code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  5. Prediction on carbon dioxide emissions based on fuzzy rules

    NASA Astrophysics Data System (ADS)

    Pauzi, Herrini; Abdullah, Lazim

    2014-06-01

    There are several ways to predict air quality, varying from simple regression to models based on artificial intelligence. Most conventional methods are not sufficiently able to provide good forecasting performance due to problems with the non-linearity, uncertainty, and complexity of the data. Artificial intelligence techniques are successfully used in modeling air quality in order to cope with these problems. This paper describes a fuzzy inference system (FIS) to predict CO2 emissions in Malaysia. Furthermore, an adaptive neuro-fuzzy inference system (ANFIS) is used to compare prediction performance. Data on five variables (energy use, gross domestic product per capita, population density, combustible renewables and waste, and CO2 intensity) are employed in this comparative study. The results from the two proposed models are compared, and it is clearly shown that ANFIS outperforms FIS in CO2 prediction.

  6. Qualitative Discovery in Medical Databases

    NASA Technical Reports Server (NTRS)

    Maluf, David A.

    2000-01-01

    Implication rules have been used in uncertainty reasoning systems to confirm and draw hypotheses or conclusions. However, a major bottleneck in developing such systems lies in the elicitation of these rules. This paper empirically examines the performance of evidential inferencing with implication networks generated using a rule induction tool called KAT. KAT utilizes an algorithm for the statistical analysis of empirical case data, and hence reduces the knowledge engineering effort and the biases in subjective implication certainty assignment. The paper describes several experiments in which real-world diagnostic problems were investigated, namely medical diagnostics. In particular, it attempts to show that: (1) with a limited number of case samples, KAT is capable of inducing implication networks useful for making evidential inferences based on partial observations, and (2) observation driven by a network entropy optimization mechanism is effective in reducing the uncertainty of predicted events.

  7. Using fuzzy logic to integrate neural networks and knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Yen, John

    1991-01-01

    Outlined here is a novel hybrid architecture that uses fuzzy logic to integrate neural networks and knowledge-based systems. The author's approach offers important synergistic benefits to neural nets, approximate reasoning, and symbolic processing. Fuzzy inference rules extend symbolic systems with approximate reasoning capabilities, which are used for integrating and interpreting the outputs of neural networks. The symbolic system captures meta-level information about neural networks and defines its interaction with neural networks through a set of control tasks. Fuzzy action rules provide a robust mechanism for recognizing the situations in which neural networks require certain control actions. The neural nets, on the other hand, offer flexible classification and adaptive learning capabilities, which are crucial for dynamic and noisy environments. By combining neural nets and symbolic systems at their system levels through the use of fuzzy logic, the author's approach alleviates current difficulties in reconciling differences between low-level data processing mechanisms of neural nets and artificial intelligence systems.

  8. Inference in fuzzy rule bases with conflicting evidence

    NASA Technical Reports Server (NTRS)

    Koczy, Laszlo T.

    1992-01-01

    Inference based on fuzzy 'If ... then' rules has played a very important role since Zadeh proposed the Compositional Rule of Inference (CRI) and, especially, since the first successful application presented by Mamdani. From the mid-1980s, when the 'fuzzy boom' started in Japan, numerous industrial applications appeared, all using simplified techniques because of the high levels of computational complexity. Another feature is that antecedents in the rules are distributed densely in the input space, so the conclusion can be calculated by some weighted combination of the consequents of the matching (fired) rules. The CRI works in the following way: if R is a rule and A* is an observation, the conclusion is computed by B* = R o A* (where o stands for the max-min composition). Algorithms implementing this idea directly have exponential time complexity (the problem may be NP-hard), as the rules are relations in X x Y, where X is k1-dimensional and Y is k2-dimensional. The simplified techniques usually decompose the relation into k1 projections onto the X_i and measure in some way the degree of similarity between observation and antecedent by some parameter of the overlapping. These parameters are aggregated to a single value in (0,1), which is applied as a resulting weight for the given rule. The projections of rules in dimensions Y_i are weighted by these aggregated values and then combined in order to obtain a resulting conclusion separately in every dimension. This method is inapplicable to sparse bases, as there is no guarantee that an arbitrary observation matches any of the antecedents. In that case, the degree of similarity is 0 and all consequents are weighted by 0. Some considerations for such a situation are summarized in the next sections.
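
    Over discrete universes, the composition B* = R o A* reduces to B*(y) = max_x min(A*(x), R(x, y)), which a few lines of numpy make concrete (the membership values below are arbitrary):

```python
# The compositional rule of inference with the max-min composition.
import numpy as np

A_star = np.array([0.2, 1.0, 0.5])        # observation over X = {x1, x2, x3}
R = np.array([[1.0, 0.3],                 # rule relation over X x Y
              [0.6, 0.9],
              [0.2, 0.7]])

# For each y, take the best x-wise minimum: B*(y) = max_x min(A*(x), R(x, y))
B_star = np.max(np.minimum(A_star[:, None], R), axis=0)
print(B_star)   # conclusion over Y = {y1, y2} -> [0.6, 0.9]
```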

  9. An approach to combining heuristic and qualitative reasoning in an expert system

    NASA Technical Reports Server (NTRS)

    Jiang, Wei-Si; Han, Chia Yung; Tsai, Lian Cheng; Wee, William G.

    1988-01-01

    An approach to combining heuristic reasoning from shallow knowledge and qualitative reasoning from deep knowledge is described. The shallow knowledge is represented in production rules and is under the direct control of the inference engine. The deep knowledge is represented in frames, which may be stored in a relational database management system. This approach takes advantage of both reasoning schemes and results in improved efficiency as well as expanded problem-solving ability.

  10. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics

    PubMed Central

    2017-01-01

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques, and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, in which the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy in the inference of anomalous traffic volumes based on a simple configuration. PMID:29065473

  11. Reasoning and Knowledge Acquisition Framework for 5G Network Analytics.

    PubMed

    Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier

    2017-10-21

    Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model, and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques, and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use case methodology, in which the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy in the inference of anomalous traffic volumes based on a simple configuration.

  12. Inferring rules of lineage commitment in haematopoiesis.

    PubMed

    Pina, Cristina; Fugazza, Cristina; Tipping, Alex J; Brown, John; Soneji, Shamit; Teles, Jose; Peterson, Carsten; Enver, Tariq

    2012-02-19

    How the molecular programs of differentiated cells develop as cells transit from multipotency through lineage commitment remains unexplored. This reflects the inability to access cells undergoing commitment or located in the immediate vicinity of commitment boundaries. It remains unclear whether commitment constitutes a gradual process, or else represents a discrete transition. Analyses of in vitro self-renewing multipotent systems have revealed cellular heterogeneity with individual cells transiently exhibiting distinct biases for lineage commitment. Such systems can be used to molecularly interrogate early stages of lineage affiliation and infer rules of lineage commitment. In haematopoiesis, population-based studies have indicated that lineage choice is governed by global transcriptional noise, with self-renewing multipotent cells reversibly activating transcriptome-wide lineage-affiliated programs. We examine this hypothesis through functional and molecular analysis of individual blood cells captured from self-renewal cultures, during cytokine-driven differentiation and from primary stem and progenitor bone marrow compartments. We show dissociation between self-renewal potential and transcriptome-wide activation of lineage programs, and instead suggest that multipotent cells experience independent activation of individual regulators resulting in a low probability of transition to the committed state.

  13. Learning and tuning fuzzy logic controllers through reinforcements

    NASA Technical Reports Server (NTRS)

    Berenji, Hamid R.; Khedkar, Pratap

    1992-01-01

    This paper presents a new method for learning and tuning a fuzzy logic controller based on reinforcements from a dynamic system. In particular, our generalized approximate reasoning-based intelligent control (GARIC) architecture (1) learns and tunes a fuzzy logic controller even when only weak reinforcement, such as a binary failure signal, is available; (2) introduces a new conjunction operator in computing the rule strengths of fuzzy control rules; (3) introduces a new localized mean of maximum (LMOM) method in combining the conclusions of several firing control rules; and (4) learns to produce real-valued control actions. Learning is achieved by integrating fuzzy inference into a feedforward neural network, which can then adaptively improve performance by using gradient descent methods. We extend the AHC algorithm of Barto et al. (1983) to include the prior control knowledge of human operators. The GARIC architecture is applied to a cart-pole balancing system and demonstrates significant improvements in terms of the speed of learning and robustness to changes in the dynamic system's parameters over previous schemes for cart-pole balancing.

  14. On Some Assumptions of the Null Hypothesis Statistical Testing

    ERIC Educational Resources Information Center

    Patriota, Alexandre Galvão

    2017-01-01

    Bayesian and classical statistical approaches are based on different types of logical principles. In order to avoid mistaken inferences and misguided interpretations, the practitioner must respect the inference rules embedded into each statistical method. Ignoring these principles leads to the paradoxical conclusions that the hypothesis…

  15. Lithology identification of aquifers from geophysical well logs and fuzzy logic analysis: Shui-Lin Area, Taiwan

    NASA Astrophysics Data System (ADS)

    Hsieh, Bieng-Zih; Lewis, Charles; Lin, Zsay-Shing

    2005-04-01

    The purpose of this study is to construct a fuzzy lithology system from well logs to identify the formation lithology of a groundwater aquifer system. The aim is to better apply conventional well-logging interpretation in hydrogeologic studies, because the well-log responses of aquifers sometimes differ from those of conventional oil and gas reservoirs. The input variables for this system are the gamma-ray log reading, the separation between the spherically focused resistivity and the deep very-enhanced resistivity curves, and the borehole compensated sonic log reading. The output variable is groundwater formation lithology. All linguistic variables are based on five linguistic terms with a trapezoidal membership function. In this study, 50 data sets are clustered into 40 training sets and 10 testing sets for constructing the fuzzy lithology system and validating the ability of system prediction, respectively. The rule-based database containing 12 fuzzy lithology rules is developed from the training data sets, and the rule strength is weighted. A Mamdani inference system and the bisector-of-area defuzzification method are used for fuzzy inference and defuzzification. The success of training performance and the prediction ability were both 90%, with the calculated correlations of training and testing equal to 0.925 and 0.928, respectively. Well logs and core data from a clastic aquifer (depths 100-198 m) in the Shui-Lin area of west-central Taiwan are used for testing the system's construction. Comparison of results from core analysis, well logging, and the fuzzy lithology system indicates that even though the well-logging method can easily define a permeable sand formation, distinguishing between silts and sands and determining grain-size variation in sands is more subjective. These shortcomings can be improved by a fuzzy lithology system that is able to yield more objective decisions than some conventional methods of log interpretation.
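
    The trapezoidal membership function underlying the five linguistic terms per variable is a standard piecewise-linear shape; the breakpoints below are illustrative, not the calibrated values from the Shui-Lin logs:

```python
# Trapezoidal membership: 0 below a, rises to 1 on [b, c], falls to 0 at d.
import numpy as np

def trapezoid(x, a, b, c, d):
    x = np.asarray(x, dtype=float)
    rising  = np.clip((x - a) / (b - a + 1e-9), 0.0, 1.0)
    falling = np.clip((d - x) / (d - c + 1e-9), 0.0, 1.0)
    return np.minimum(rising, falling)

gamma_api = np.array([20.0, 55.0, 95.0])       # example gamma-ray readings
# Linguistic term "low gamma ray" (clean sand tends to read low)
print(trapezoid(gamma_api, 0, 0, 30, 60))      # -> [1.0, ~0.17, 0.0]
```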

  16. A real-time expert system for self-repairing flight control

    NASA Technical Reports Server (NTRS)

    Gaither, S. A.; Agarwal, A. K.; Shah, S. C.; Duke, E. L.

    1989-01-01

    An integrated environment for specifying, prototyping, and implementing a self-repairing flight-control (SRFC) strategy is described. At an interactive workstation, the user can select paradigms such as rule-based expert systems, state-transition diagrams, and signal-flow graphs and hierarchically nest them, assign timing and priority attributes, establish blackboard-type communication, and specify concurrent execution on single or multiple processors. High-fidelity nonlinear simulations of aircraft and SRFC systems can be performed off-line, with the possibility of changing SRFC rules, inference strategies, and other heuristics to correct for control deficiencies. Finally, the off-line-generated SRFC can be transformed into highly optimized application-specific real-time C-language code. An application of this environment to the design of aircraft fault detection, isolation, and accommodation algorithms is presented in detail.

  17. Building distributed rule-based systems using the AI Bus

    NASA Technical Reports Server (NTRS)

    Schultz, Roger D.; Stobie, Iain C.

    1990-01-01

    The AI Bus software architecture was designed to support the construction of large-scale, production-quality applications in areas of high technology flux, running in heterogeneous distributed environments and utilizing a mix of knowledge-based and conventional components. These goals led to its current development as a layered, object-oriented library for cooperative systems. This paper describes the concepts and design of the AI Bus and its implementation status as a library of reusable and customizable objects, structured by layers from operating system interfaces up to high-level knowledge-based agents. Each agent is a semi-autonomous process with specialized expertise, and consists of a number of knowledge sources (a knowledge base and inference engine). Inter-agent communication mechanisms are based on blackboards and Actors-style acquaintances. As a conservative first implementation, we used C++ on top of Unix and wrapped an embedded CLIPS with methods for the knowledge source class. This involved designing standard protocols for communication and functions that use these protocols in rules. Embedding several CLIPS objects within a single process was an unexpected problem because of global variables, whose solution involved constructing and recompiling a C++ version of CLIPS. We are currently working on a more radical approach to incorporating CLIPS, by separating out its pattern matcher, rule and fact representations, and other components as true object-oriented modules.

  18. Recognition of Handwritten Arabic words using a neuro-fuzzy network

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boukharouba, Abdelhak; Bennia, Abdelhak

    We present a new method for the recognition of handwritten Arabic words based on a neuro-fuzzy hybrid network. As a first step, connected components (CCs) of black pixels are detected. Then the system determines which CCs are sub-words and which are stress marks. The stress marks are then isolated and identified separately, and the sub-words are segmented into graphemes. Each grapheme is described by topological and statistical features. Fuzzy rules are extracted from training examples by a hybrid learning scheme comprising two phases: a rule generation phase from data using fuzzy c-means, and a rule parameter tuning phase using gradient descent learning. After learning, the network encodes in its topology the essential design parameters of a fuzzy inference system. The contribution of this technique is shown through the significant tests performed on a handwritten Arabic words database.

  19. A knowledge-base generating hierarchical fuzzy-neural controller.

    PubMed

    Kandadai, R M; Tien, J M

    1997-01-01

    We present an innovative fuzzy-neural architecture that is able to automatically generate a knowledge base, in an extractable form, for use in hierarchical knowledge-based controllers. The knowledge base is in the form of a linguistic rule base appropriate for a fuzzy inference system. First, we modify Berenji and Khedkar's (1992) GARIC architecture to enable it to automatically generate a knowledge base; a pseudosupervised learning scheme using reinforcement learning and error backpropagation is employed. Next, we further extend this architecture to a hierarchical controller that is able to generate its own knowledge base. Example applications are provided to underscore its viability.

  20. Characterising bias in regulatory risk and decision analysis: An analysis of heuristics applied in health technology appraisal, chemicals regulation, and climate change governance.

    PubMed

    MacGillivray, Brian H

    2017-08-01

    In many environmental and public health domains, heuristic methods of risk and decision analysis must be relied upon, either because problem structures are ambiguous, reliable data is lacking, or decisions are urgent. This introduces an additional source of uncertainty beyond model and measurement error - uncertainty stemming from relying on inexact inference rules. Here we identify and analyse heuristics used to prioritise risk objects, to discriminate between signal and noise, to weight evidence, to construct models, to extrapolate beyond datasets, and to make policy. Some of these heuristics are based on causal generalisations, yet can misfire when these relationships are presumed rather than tested (e.g. surrogates in clinical trials). Others are conventions designed to confer stability to decision analysis, yet which may introduce serious error when applied ritualistically (e.g. significance testing). Some heuristics can be traced back to formal justifications, but only subject to strong assumptions that are often violated in practical applications. Heuristic decision rules (e.g. feasibility rules) in principle act as surrogates for utility maximisation or distributional concerns, yet in practice may neglect costs and benefits, be based on arbitrary thresholds, and be prone to gaming. We highlight the problem of rule-entrenchment, where analytical choices that are in principle contestable are arbitrarily fixed in practice, masking uncertainty and potentially introducing bias. Strategies for making risk and decision analysis more rigorous include: formalising the assumptions and scope conditions under which heuristics should be applied; testing rather than presuming their underlying empirical or theoretical justifications; using sensitivity analysis, simulations, multiple bias analysis, and deductive systems of inference (e.g. directed acyclic graphs) to characterise rule uncertainty and refine heuristics; adopting "recovery schemes" to correct for known biases; and basing decision rules on clearly articulated values and evidence, rather than convention. Copyright © 2017. Published by Elsevier Ltd.

  1. Fuzzy Computing Model of Activity Recognition on WSN Movement Data for Ubiquitous Healthcare Measurement.

    PubMed

    Chiang, Shu-Yin; Kan, Yao-Chiang; Chen, Yun-Shan; Tu, Ying-Ching; Lin, Hsueh-Chun

    2016-12-03

    Ubiquitous health care (UHC) helps patients complete therapeutic exercises through self-management at home. We designed a fuzzy computing model that enables recognition of assigned movements in UHC while preserving privacy. The movements are measured by a self-developed body motion sensor, which combines accelerometer and gyroscope chips to make an inertial sensing node compliant with a wireless sensor network (WSN). A fuzzy logic process was devised to compute, from the sensor signals, the necessary features of static postures and dynamic motions. Combinations of the features were studied, and the proper feature sets were chosen with compatible fuzzy rules. A fuzzy inference system (FIS) can then be generated to recognize the assigned movements based on the rules. We thus implemented both fuzzy and adaptive neuro-fuzzy inference systems in the model to distinguish static and dynamic movements. The proposed model can effectively reach the recognition scope of the assigned activity. Furthermore, two upper-limb flexion exercises from physical therapy were applied to the model, for which the recognition rate can stand for the passing rate of the assigned motions. Finally, a web-based interface was developed to help remotely measure movement in physical therapy for UHC.

  2. Fuzzy Computing Model of Activity Recognition on WSN Movement Data for Ubiquitous Healthcare Measurement

    PubMed Central

    Chiang, Shu-Yin; Kan, Yao-Chiang; Chen, Yun-Shan; Tu, Ying-Ching; Lin, Hsueh-Chun

    2016-01-01

    Ubiquitous health care (UHC) helps patients complete therapeutic exercises through self-management at home. We designed a fuzzy computing model that enables recognition of assigned movements in UHC while preserving privacy. The movements are measured by a self-developed body motion sensor, which combines accelerometer and gyroscope chips to make an inertial sensing node compliant with a wireless sensor network (WSN). A fuzzy logic process was devised to compute, from the sensor signals, the necessary features of static postures and dynamic motions. Combinations of the features were studied, and the proper feature sets were chosen with compatible fuzzy rules. A fuzzy inference system (FIS) can then be generated to recognize the assigned movements based on the rules. We thus implemented both fuzzy and adaptive neuro-fuzzy inference systems in the model to distinguish static and dynamic movements. The proposed model can effectively reach the recognition scope of the assigned activity. Furthermore, two upper-limb flexion exercises from physical therapy were applied to the model, for which the recognition rate can stand for the passing rate of the assigned motions. Finally, a web-based interface was developed to help remotely measure movement in physical therapy for UHC. PMID:27918482

  3. Learning control of inverted pendulum system by neural network driven fuzzy reasoning: The learning function of NN-driven fuzzy reasoning under changes of reasoning environment

    NASA Technical Reports Server (NTRS)

    Hayashi, Isao; Nomura, Hiroyoshi; Wakami, Noboru

    1991-01-01

    Whereas conventional fuzzy reasoning is associated with tuning problems, namely the lack of systematic membership function and inference rule design, a neural network driven fuzzy reasoning (NDF) capable of determining membership functions by neural network is formulated. In the antecedent parts of the neural network driven fuzzy reasoning, the optimum membership function is determined by a neural network, while in the consequent parts, the amount of control for each rule is determined by other plural neural networks. By introducing an algorithm of neural network driven fuzzy reasoning, inference rules for making a pendulum stand up from its lowest suspended point are determined to verify the usefulness of the algorithm.

  4. The impact of category structure and training methodology on learning and generalizing within-category representations.

    PubMed

    Ell, Shawn W; Smith, David B; Peralta, Gabriela; Hélie, Sébastien

    2017-08-01

    When interacting with categories, representations focused on within-category relationships are often learned, but the conditions promoting within-category representations and their generalizability are unclear. We report the results of three experiments investigating the impact of category structure and training methodology on the learning and generalization of within-category representations (i.e., correlational structure). Participants were trained on either rule-based or information-integration structures using classification (Is the stimulus a member of Category A or Category B?), concept (e.g., Is the stimulus a member of Category A, Yes or No?), or inference (infer the missing component of the stimulus from a given category) and then tested on either an inference task (Experiments 1 and 2) or a classification task (Experiment 3). For the information-integration structure, within-category representations were consistently learned, could be generalized to novel stimuli, and could be generalized to support inference at test. For the rule-based structure, extended inference training resulted in generalization to novel stimuli (Experiment 2) and inference training resulted in generalization to classification (Experiment 3). These data help to clarify the conditions under which within-category representations can be learned. Moreover, these results make an important contribution in highlighting the impact of category structure and training methodology on the generalization of categorical knowledge.

  5. Induction of belief decision trees from data

    NASA Astrophysics Data System (ADS)

    AbuDahab, Khalil; Xu, Dong-ling; Keane, John

    2012-09-01

    In this paper, a method for acquiring belief rule bases by inductive inference from data is described and evaluated. Existing methods inductively extract traditional rules from data, with consequents that are believed to be either 100% true or 100% false. Belief rules can capture uncertain or incomplete knowledge using uncertain belief degrees in consequents. Instead of using single-valued consequents, each belief rule deals with a set of collectively exhaustive and mutually exclusive consequents. The proposed method extracts belief rules from data which contain uncertain or incomplete knowledge.
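
    A minimal sketch of the representation: instead of a single consequent, each rule carries belief degrees over an exhaustive, mutually exclusive consequent set. The rule contents and the crisp matching below are invented for illustration; real belief-rule systems grade the antecedent match.

```python
# A belief rule with a belief distribution over its consequent set.
belief_rule = {
    "antecedent": {"temperature": "high", "vibration": "medium"},
    # Belief degrees sum to at most 1; any remainder is unassigned belief,
    # representing the incompleteness of the extracted knowledge.
    "consequents": {"fault": 0.70, "degraded": 0.20, "normal": 0.05},
}

def matches(rule, observation):
    """Crisp antecedent match, for simplicity."""
    return all(observation.get(k) == v for k, v in rule["antecedent"].items())

obs = {"temperature": "high", "vibration": "medium"}
if matches(belief_rule, obs):
    for outcome, belief in belief_rule["consequents"].items():
        print(f"{outcome}: belief {belief:.2f}")
    unassigned = 1.0 - sum(belief_rule["consequents"].values())
    print(f"unassigned (incomplete knowledge): {unassigned:.2f}")
```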

  6. Genie: An Inference Engine with Applications to Vulnerability Analysis.

    DTIC Science & Technology

    1986-06-01

    [Fragmented OCR excerpt from the report's documentation page; only partial content is recoverable. Indexed keywords include Expert Systems, Artificial Intelligence, and Vulnerability Analysis; the references include a 1976 Stanford Artificial Intelligence Laboratory report and D. A. Waterman and F. Hayes-Roth, eds., Pattern-Directed Inference Systems, Academic Press. The abstract fragment notes that deduction is used wherever possible in data-driven mode (forward chaining) with production rules.]

  7. ELIPS: Toward a Sensor Fusion Processor on a Chip

    NASA Technical Reports Server (NTRS)

    Daud, Taher; Stoica, Adrian; Tyson, Thomas; Li, Wei-te; Fabunmi, James

    1998-01-01

    The paper presents the concept and initial tests from the hardware implementation of a low-power, high-speed reconfigurable sensor fusion processor. The Extended Logic Intelligent Processing System (ELIPS) processor is developed to seamlessly combine rule-based systems, fuzzy logic, and neural networks to achieve parallel fusion of sensor data in compact, low-power VLSI. The first demonstration of the ELIPS concept targets interceptor functionality; other applications, mainly in robotics and autonomous systems, are considered for the future. The main assumption behind ELIPS is that fuzzy, rule-based, and neural forms of computation can serve as the main primitives of an "intelligent" processor. Thus, in the same way that classic processors are designed to optimize the hardware implementation of a set of fundamental operations, ELIPS is developed as an efficient implementation of computational intelligence primitives, and relies on a set of fuzzy set, fuzzy inference, and neural modules built in programmable analog hardware. The hardware programmability allows the processor to reconfigure into different machines, taking the most efficient hardware implementation during each phase of information processing. Following software demonstrations on several interceptor data sets, three important ELIPS building blocks (a fuzzy set preprocessor, a rule-based fuzzy system, and a neural network) have been fabricated in analog VLSI hardware and demonstrated microsecond processing times.

  8. An inference engine for embedded diagnostic systems

    NASA Technical Reports Server (NTRS)

    Fox, Barry R.; Brewster, Larry T.

    1987-01-01

    The implementation of an inference engine for embedded diagnostic systems is described. The system consists of two distinct parts. The first is an off-line compiler which accepts a propositional logic statement of the relationship between facts and conclusions and produces the data structures required by the on-line inference engine. The second part consists of the inference engine and interface routines, which accept assertions of fact and return the conclusions that necessarily follow. Given a set of assertions, it will generate exactly the conclusions which logically follow. At the same time, it will detect any inconsistencies which may propagate from an inconsistent set of assertions or a poorly formulated set of rules. The memory requirements are fixed and the worst-case execution times are bounded at compile time. The data structures and inference algorithms are very simple and well understood, and both are described in detail. The system has been implemented in Lisp, Pascal, and Modula-2.
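
    The online half of such a design can be sketched as fixed-point propagation over precompiled rule tables. The sketch below uses Horn-style rules with invented contents; the real compiler handles full propositional statements.

```python
# Rules "compiled" offline into fixed tables; the online engine only
# propagates assertions to a fixed point and checks for inconsistency.
RULES = [
    ({"overheat", "low_coolant"}, "pump_fault"),    # antecedent set -> conclusion
    ({"pump_fault"}, "shutdown_required"),
]
CONTRADICTIONS = [("pump_fault", "pump_ok")]        # pairs that may not coexist

def propagate(asserted):
    """Return all conclusions that necessarily follow from the assertions."""
    facts = set(asserted)
    changed = True
    while changed:                                   # bounded by the rule count
        changed = False
        for antecedents, conclusion in RULES:
            if antecedents <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    for a, b in CONTRADICTIONS:                      # inconsistency detection
        if a in facts and b in facts:
            raise ValueError(f"inconsistent assertions: {a} vs {b}")
    return facts

print(propagate({"overheat", "low_coolant"}))
```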

  9. Knowledge-based control of an adaptive interface

    NASA Technical Reports Server (NTRS)

    Lachman, Roy

    1989-01-01

    The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited with a small set of commands. The program's complexity then can be increased incrementally. The intelligent interface includes the operator's behavior and three types of instructions to the underlying application software are included in the rule base. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.

  10. An adaptive neuro-fuzzy inference system controlled space vector pulse width modulation based HVDC Light transmission system under AC fault conditions

    NASA Astrophysics Data System (ADS)

    Ajay Kumar, M.; Srikanth, N. V.

    2014-03-01

    In HVDC Light transmission systems, converter control is one of the major fields of present-day research. In this paper, a fuzzy logic controller is utilized for controlling both converters of the space vector pulse width modulation (SVPWM) based HVDC Light transmission system. Due to the complexity of rule-base formation, an intelligent controller known as an adaptive neuro-fuzzy inference system (ANFIS) controller is also introduced in this paper. The proposed ANFIS controller changes the PI gains automatically for different operating conditions. A hybrid learning method, which combines and exploits the best features of both the backpropagation algorithm and the least squares estimation method, is used to train the 5-layer ANFIS controller. The performance of the proposed ANFIS controller is compared and validated against the fuzzy logic controller and also against the fixed-gain conventional PI controller. The simulations are carried out in the MATLAB/SIMULINK environment. The results reveal that the proposed ANFIS controller reduces power fluctuations at both converters. It also improves the dynamic performance of the test power system effectively when tested under various AC fault conditions.

  11. Estimation of tool wear length in finish milling using a fuzzy inference algorithm

    NASA Astrophysics Data System (ADS)

    Ko, Tae Jo; Cho, Dong Woo

    1993-10-01

    The geometric accuracy and surface roughness are mainly affected by flank wear at the minor cutting edge in finish machining. A fuzzy estimator, obtained by a fuzzy inference algorithm with a max-min composition rule, is introduced to evaluate the minor flank wear length in finish milling. The features sensitive to minor flank wear are extracted from the dispersion analysis of a time series AR model of the feed-directional acceleration of the spindle housing. Linguistic rules for fuzzy estimation are constructed using these features, and fuzzy inferences are then carried out with test data sets under various cutting conditions. The proposed system turns out to be effective for estimating minor flank wear length, with a mean error of less than 12%.
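
    The max-min compositional rule of inference at the heart of the estimator is compact enough to state directly: the membership of each output class is the maximum over inputs of the minimum of the input membership and the relation strength. The membership values and relation matrix below are illustrative placeholders, not the paper's rules.

      import numpy as np

      # B = A o R with max-min composition: B_j = max_i min(A_i, R_ij)
      def max_min_composition(a, R):
          return np.max(np.minimum(a[:, None], R), axis=0)

      a = np.array([0.2, 0.9, 0.4])          # fuzzified feature memberships
      R = np.array([[0.1, 0.7, 0.3],         # illustrative rule relation matrix
                    [0.8, 0.5, 0.2],
                    [0.4, 0.6, 0.9]])
      print(max_min_composition(a, R))       # fuzzy estimate over wear classes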

  12. Willpower and Personal Rules.

    ERIC Educational Resources Information Center

    Benabou, Roland; Tirole, Jean

    2004-01-01

    We develop a theory of internal commitments or "personal rules" based on self-reputation over one's willpower, which transforms lapses into precedents that undermine future self-restraint. The foundation for this mechanism is the imperfect recall of past motives and feelings, leading people to draw inferences from their past actions. The degree of…

  13. Model authoring system for fail safe analysis

    NASA Technical Reports Server (NTRS)

    Sikora, Scott E.

    1990-01-01

    The Model Authoring System is a prototype software application for generating fault tree analyses and failure mode and effects analyses for circuit designs. Utilizing established artificial intelligence and expert system techniques, the circuits are modeled as a frame-based knowledge base in an expert system shell, which allows the use of object-oriented programming and an inference engine. The behavior of the circuit is then captured through IF-THEN rules, which are then searched to generate either a graphical fault tree analysis or a failure modes and effects analysis. Sophisticated authoring techniques allow the circuit to be easily modeled, permit its behavior to be quickly defined, and provide abstraction features to deal with complexity.

  14. EXADS - EXPERT SYSTEM FOR AUTOMATED DESIGN SYNTHESIS

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    The expert system called EXADS was developed to aid users of the Automated Design Synthesis (ADS) general-purpose optimization program. Because of the general-purpose nature of ADS, it is difficult for a nonexpert to select the best choice of strategy, optimizer, and one-dimensional search options from the one hundred or so combinations that are available. EXADS aids engineers in determining the best combination based on their knowledge of the problem and the expert knowledge previously stored by the experts who developed ADS. EXADS is a customized application of the AESOP artificial intelligence program (the general version of AESOP is available separately from COSMIC, as is the ADS program). The expert system consists of two main components. The knowledge base contains about 200 rules and is divided into three categories: constrained, unconstrained, and constrained treated as unconstrained. The EXADS inference engine is rule-based and makes decisions about a particular situation using hypotheses (potential solutions), rules, and answers to questions drawn from the rule base. EXADS is backward-chaining; that is, it works from hypotheses to facts. The rule base was compiled from sources such as literature searches, ADS documentation, and engineer surveys. EXADS will accept answers such as yes, no, maybe, likely, and don't know, or a certainty factor ranging from 0 to 10. When any hypothesis reaches a confidence level of 90% or more, it is deemed the best choice and displayed to the user. If no hypothesis is confirmed, the user can examine explanations of why the hypotheses failed to reach the 90% level. The IBM PC version of EXADS is written in IQ-LISP for execution under DOS 2.0 or higher with a central memory requirement of approximately 512K 8-bit bytes. This program was developed in 1986.
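
    Backward chaining from hypotheses to facts with certainty factors and a 90% confirmation threshold can be sketched as follows; the rule, the fact certainties, and the min/product certainty arithmetic are hypothetical stand-ins rather than the actual EXADS rule base.

      FACTS = {"constrained": 1.0, "smooth_gradients": 0.9}   # answered questions
      RULES = {"use_slp_strategy": [(["constrained", "smooth_gradients"], 0.95)]}
      THRESHOLD = 0.9

      def certainty(goal):
          if goal in FACTS:                       # leaf goal: ask the user
              return FACTS[goal]
          best = 0.0
          for premises, rule_cf in RULES.get(goal, []):
              # rule certainty scaled by the weakest premise
              cf = rule_cf * min(certainty(p) for p in premises)
              best = max(best, cf)                # strongest supporting rule wins
          return best

      for hypothesis in RULES:
          cf = certainty(hypothesis)
          print(hypothesis, round(cf, 3),
                "confirmed" if cf >= THRESHOLD else "not confirmed")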

  15. Evidence Accumulation and Change Rate Inference in Dynamic Environments.

    PubMed

    Radillo, Adrian E; Veliz-Cuba, Alan; Josić, Krešimir; Kilpatrick, Zachary P

    2017-06-01

    In a constantly changing world, animals must account for environmental volatility when making decisions. To appropriately discount older, irrelevant information, they need to learn the rate at which the environment changes. We develop an ideal observer model capable of inferring the present state of the environment along with its rate of change. Key to this computation is an update of the posterior probability of all possible change point counts. This computation can be challenging, as the number of possibilities grows rapidly with time. However, we show how the computations can be simplified in the continuum limit by a moment closure approximation. The resulting low-dimensional system can be used to infer the environmental state and change rate with accuracy comparable to the ideal observer. The approximate computations can be performed by a neural network model via a rate-correlation-based plasticity rule. We thus show how optimal observers accumulate evidence in changing environments and map this computation to reduced models that perform inference using plausible neural mechanisms.
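
    The core computation, a joint posterior over the current state and the number of change points with the hazard rate estimated from that count, can be sketched for a two-state Gaussian environment. This is a simplified discrete version in the spirit of the model; the Beta(1,1) hazard prior, the observation parameters, and the simulated data are illustrative assumptions, not the authors' moment-closure reduction.

      import numpy as np

      MU, SIG = {0: -1.0, 1: 1.0}, 0.8            # state-conditioned Gaussians

      def lik(x, s):
          return np.exp(-0.5 * ((x - MU[s]) / SIG) ** 2)

      def ideal_observer(xs):
          T = len(xs)
          p = np.zeros((2, T + 1))                # p[s, a]: state s, a change points
          p[:, 0] = 0.5
          for t, x in enumerate(xs):
              if t > 0:                           # transition with learned hazard
                  q = np.zeros_like(p)
                  for s in (0, 1):
                      for a in range(T):
                          h = (a + 1.0) / (t + 1.0)   # Beta(1,1) posterior mean
                          q[s, a] += (1 - h) * p[s, a]
                          q[1 - s, a + 1] += h * p[s, a]
                  p = q
              p[0] *= lik(x, 0)                   # condition on the observation
              p[1] *= lik(x, 1)
              p /= p.sum()
          state_post = p.sum(axis=1)              # marginal over change counts
          mean_changes = (p.sum(axis=0) * np.arange(T + 1)).sum()
          return state_post, mean_changes / max(T - 1, 1)

      rng = np.random.default_rng(0)
      xs = np.concatenate([rng.normal(-1, 0.8, 30), rng.normal(1, 0.8, 30)])
      post, rate = ideal_observer(xs)
      print("P(state=1):", round(post[1], 3), " change rate:", round(rate, 3))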

  16. Analyzing spacecraft configurations through specialization and default reasoning

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Lowe, Carlyle M.

    1990-01-01

    For an intelligent system to describe a real-world situation using as few statements as possible, it is necessary to make inferences based on observed data and to incorporate general knowledge of the reasoning domain into the description. These reasoning processes must reduce several levels of specific descriptions into only those few that most precisely describe the situation. Moreover, the system must be able to generate descriptions in the absence of data, as instructed by certain rules of inference. The deductions applied by the system, then, generate a high-level description from the low-level evidence provided by the real and default data sources. An implementation of these ideas in a real-world situation is described. The application concerns evaluation of Space Shuttle electromechanical system configurations by console operators in the Mission Control Center. A production system provides the reasoning mechanism through which the default assignments and specializations occur. Examples are provided within this domain for each type of inference, and the suitability of each toward achieving the goal of describing a situation in the fewest statements possible is discussed. Finally, several enhancements are suggested that will further increase the intelligence of similar spacecraft monitoring applications.

  17. The use of natural language processing on pediatric diagnostic radiology reports in the electronic health record to identify deep venous thrombosis in children.

    PubMed

    Gálvez, Jorge A; Pappas, Janine M; Ahumada, Luis; Martin, John N; Simpao, Allan F; Rehman, Mohamed A; Witmer, Char

    2017-10-01

    Venous thromboembolism (VTE) is a potentially life-threatening condition that includes both deep vein thrombosis (DVT) and pulmonary embolism. We sought to improve detection and reporting of children with a new diagnosis of VTE by applying natural language processing (NLP) tools to radiologists' reports. We validated the performance of an NLP tool, Reveal NLP (Health Fidelity Inc., San Mateo, CA), and an inference rules engine in identifying reports with deep venous thrombosis using a curated set of ultrasound reports. We then configured the NLP tool to scan all available radiology reports on a daily basis for studies that met criteria for VTE between July 1, 2015, and March 31, 2016. The NLP tool and inference rules engine correctly identified 140 out of 144 reports with positive DVT findings and 98 out of 106 negative reports in the validation set. The tool's sensitivity was 97.2% (95% CI 93-99.2%) and its specificity was 92.5% (95% CI 85.7-96.7%). Subsequently, the NLP tool and inference rules engine processed 6373 radiology reports from 3371 hospital encounters, identifying 178 positive reports and 3193 negative reports with a sensitivity of 82.9% (95% CI 74.8-89.2) and a specificity of 97.5% (95% CI 96.9-98). The system functions well as a safety net to screen patients for hospital-acquired VTE (HA-VTE) on a daily basis and offers value as an automated, redundant system. To our knowledge, this is the first pediatric study to apply NLP technology in a prospective manner for HA-VTE identification.

  18. On the inherent competition between valid and spurious inductive inferences in Boolean data

    NASA Astrophysics Data System (ADS)

    Andrecut, M.

    Inductive inference is the process of extracting general rules from specific observations. This problem also arises in the analysis of biological networks, such as genetic regulatory networks, where the interactions are complex and the observations are incomplete. A typical task in these problems is to extract general interaction rules, as combinations of Boolean covariates, that explain a measured response variable. The inductive inference process can be considered as an incompletely specified Boolean function synthesis problem. This incompleteness will also generate spurious inferences, which are a serious threat to valid inductive inference rules. Using random Boolean data as a null model, here we attempt to measure the competition between valid and spurious inductive inference rules in a given data set. We formulate two greedy search algorithms, which synthesize a given Boolean response variable in a sparse disjunctive normal form and a sparse generalized algebraic normal form of the variables from the observation data, respectively, and we evaluate their performance numerically.
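
    A greedy synthesis of a sparse disjunctive normal form along the lines described can be sketched as follows: each step adds the conjunction that covers no negative samples and the most still-uncovered positive ones. The data and the two-literal cap are illustrative; the paper's exact algorithms and its algebraic-normal-form variant are not reproduced here.

      import numpy as np
      from itertools import combinations, product

      def greedy_dnf(X, y, max_literals=2):
          n, m = X.shape
          pos = y.astype(bool)
          uncovered = pos.copy()
          terms = []
          while uncovered.any():
              best, best_cover, best_term = None, 0, None
              for k in range(1, max_literals + 1):
                  for idxs in combinations(range(m), k):
                      for signs in product((0, 1), repeat=k):
                          cols = [X[:, j] if s else 1 - X[:, j]
                                  for j, s in zip(idxs, signs)]
                          term = np.logical_and.reduce(cols).astype(bool)
                          if term[~pos].any():        # covers a negative: spurious
                              continue
                          cover = (term & uncovered).sum()
                          if cover > best_cover:
                              best, best_cover, best_term = (idxs, signs), cover, term
              if best is None:                        # remaining positives not coverable
                  break
              terms.append(best)
              uncovered &= ~best_term
          return terms

      X = np.array([[1, 0], [1, 1], [0, 1], [0, 0]])
      y = np.array([1, 1, 0, 0])
      print(greedy_dnf(X, y))    # [((0,), (1,))], i.e. the single literal "x0"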

  19. Multi-user investigation organizer

    NASA Technical Reports Server (NTRS)

    Panontin, Tina L. (Inventor); Williams, James F. (Inventor); Carvalho, Robert E. (Inventor); Sturken, Ian (Inventor); Wolfe, Shawn R. (Inventor); Gawdiak, Yuri O. (Inventor); Keller, Richard M. (Inventor)

    2009-01-01

    A system that allows a team of geographically dispersed users to collaboratively analyze a mishap event. The system includes a reconfigurable ontology, including instances that are related to and characterize the mishap, a semantic network that receives, indexes and stores, for retrieval, viewing and editing, the instances and links between the instances, a network browser interface for retrieving and viewing screens that present the instances and links to other instances and that allow editing thereof, and a rule-based inference engine, including a collection of rules associated with establishment of links between the instances. A possible conclusion arising from analysis of the mishap event may be characterized as one or more of: not a credible conclusion; an unlikely conclusion; a credible conclusion; conclusion needs analysis; conclusion needs supporting data; conclusion proposed to be closed; and an un-reviewed conclusion.

  20. Natural language processing and inference rules as strategies for updating problem list in an electronic health record.

    PubMed

    Plazzotta, Fernando; Otero, Carlos; Luna, Daniel; de Quiros, Fernan Gonzalez Bernaldo

    2013-01-01

    Physicians do not always keep the problem list accurate, complete and updated. Objective: to analyze natural language processing (NLP) techniques and inference rules as strategies to maintain completeness and accuracy of the problem list in EHRs. Methods: a non-systematic literature review in PubMed covering the last 10 years. Strategies to maintain the EHR problem list were analyzed in two ways: inputting and removing problems from the problem list. Results: NLP and inference rules have acceptable performance for inputting problems into the problem list; no studies using these techniques for removing problems have been published. Conclusion: both NLP and inference rules have shown acceptable results as tools for maintaining the completeness and accuracy of the problem list.

  1. Signal Processing Expert Code (SPEC)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ames, H.S.

    1985-12-01

    The purpose of this paper is to describe a prototype expert system called SPEC, which was developed to demonstrate the utility of providing an intelligent interface for users of SIG, a general-purpose signal processing code. The expert system is written in NIL, runs on a VAX 11/750, and consists of a backward-chaining inference engine and an English-like parser. The inference engine uses knowledge encoded as rules about the formats of SIG commands and about how to perform frequency analyses using SIG. The system demonstrated that expert systems can be used to control existing codes.

  2. RISMA: A Rule-based Interval State Machine Algorithm for Alerts Generation, Performance Analysis and Monitoring Real-Time Data Processing

    NASA Astrophysics Data System (ADS)

    Laban, Shaban; El-Desouky, Aly

    2013-04-01

    The monitoring of real-time systems is a challenging and complicated process, so there is a continuous need to improve the monitoring process through new intelligent techniques and algorithms for detecting exceptions and anomalous behaviours and for generating the necessary alerts during workflow monitoring of such systems. Interval-based or period-based theorems have been discussed, analysed, and used by many researchers in Artificial Intelligence (AI), philosophy, and linguistics. As explained by Allen, there are 13 relations between any two intervals, and there have been many studies of interval-based temporal reasoning and logics over the past decades. Interval-based theorems can be used for monitoring real-time interval-based data processing. However, increasing the number of processed intervals makes the implementation of such theorems a complex and time-consuming process, as the number of relationships between the intervals grows rapidly. To overcome this problem, this paper presents a Rule-based Interval State Machine Algorithm (RISMA) for processing, monitoring, and analysing the behaviour of interval-based data received from real-time sensors. The proposed intelligent algorithm uses the Interval State Machine (ISM) approach to model any number of interval-based data into well-defined states, as well as to infer them. An interval-based state transition model and methodology are presented to identify the relationships between the different states of the proposed algorithm. By using such a model, the unlimited number of relationships between large numbers of intervals can be reduced to only 18 direct relationships using the proposed well-defined states. For testing, the necessary inference rules and code were designed and applied to the continuous data received in near real-time from the stations of the International Monitoring System (IMS) by the International Data Centre (IDC) of the Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO). The CLIPS expert system shell was used as the main rule engine, and the Python module PyCLIPS was used to build the necessary implementation code. More than 1.7 million intervals constituting the Concise List of Frames (CLF) from 20 different seismic stations were used to evaluate the proposed algorithm and to assess station behaviour and performance. The initial results showed that the proposed algorithm can help in better understanding the operation and performance of those stations. Important information, such as alerts and station performance parameters, can be derived from it: for IMS interval-based data over any period of time, it is possible to analyze station behaviour, determine missing data, generate the necessary alerts, and measure some station performance attributes. The details of the proposed algorithm, methodology, implementation, experimental results, advantages, and limitations of this research are presented, and future directions and recommendations are discussed.
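
    The pairwise classification underlying such interval reasoning, Allen's 13 relations, is straightforward to compute for numeric intervals (a1, a2) with a1 < a2; the sketch below shows only that classification, not RISMA's reduction to 18 well-defined states.

      def allen_relation(a, b):
          """Classify the Allen relation between intervals a=(a1,a2), b=(b1,b2)."""
          (a1, a2), (b1, b2) = a, b
          if a2 < b1:  return "before"
          if b2 < a1:  return "after"
          if a2 == b1: return "meets"
          if b2 == a1: return "met-by"
          if a1 == b1 and a2 == b2: return "equals"
          if a1 == b1: return "starts" if a2 < b2 else "started-by"
          if a2 == b2: return "finishes" if a1 > b1 else "finished-by"
          if b1 < a1 and a2 < b2: return "during"
          if a1 < b1 and b2 < a2: return "contains"
          return "overlaps" if a1 < b1 else "overlapped-by"

      print(allen_relation((0, 5), (3, 9)))   # overlaps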

  3. Advancing reservoir operation description in physically based hydrological models

    NASA Astrophysics Data System (ADS)

    Anghileri, Daniela; Giudici, Federico; Castelletti, Andrea; Burlando, Paolo

    2016-04-01

    Recent decades have seen significant advances in our capacity to characterize and reproduce hydrological processes within physically based models. Yet, when the human component is considered (e.g. reservoirs, water distribution systems), the associated decisions are generally modeled with very simplistic rules, which might underperform in reproducing the actual operators' behaviour on a daily or sub-daily basis. For example, reservoir operations are usually described by a target-level rule curve, which represents the level that the reservoir should track during normal operating conditions. The associated release decision is determined by the current state of the reservoir relative to the rule curve. This modeling approach can reasonably reproduce the seasonal water volume shift due to reservoir operation. Still, it cannot capture more complex decision-making processes in response to, e.g., fluctuations of energy prices and demands, the temporal unavailability of power plants, or the varying amount of snow accumulated in the basin. In this work, we link a physically explicit hydrological model with detailed hydropower behavioural models describing the decision-making process of the dam operator. In particular, we consider two categories of behavioural models: explicit or rule-based behavioural models, where reservoir operating rules are empirically inferred from observational data, and implicit or optimization-based behavioural models, where, following a normative economic approach, the decision maker is represented as a rational agent maximising a utility function. We compare these two alternative modelling approaches on the real-world water system of the Lake Como catchment in the Italian Alps. The water system is characterized by the presence of 18 artificial hydropower reservoirs generating almost 13% of the Italian hydropower production. Results show the extent to which the hydrological regime in the catchment is affected by the different behavioural models and reservoir operating strategies.

  4. DELTA: An Expert System for Diesel Electric Locomotive Repair

    DTIC Science & Technology

    1984-06-01

    Rules and Inference Mechanisms. AD-P003 943: The ACE (Automated Cable Expert) Experiment: Initial Evaluation of an Expert System for Preventive... The first field prototype expert system, designated CATS-1 (Computer-Aided Troubleshooting System, Version 1), was delivered in July 1983 and is

  5. A Legal Negotiation Support System Based on a Diagram

    NASA Astrophysics Data System (ADS)

    Nitta, Katsumi; Shibasaki, Masato; Yasumura, Yoshiaki; Hasegawa, Ryuzo; Fujita, Hiroshi; Koshimura, Miyuki; Inoue, Katsumi; Shirai, Yasuyuki; Komatsu, Hiroshi

    We present an overview of a legal negotiation support system, ANS (Argumentation based Negotiation support System). ANS consists of a user interface, three inference engines, a database of old cases, and two decision support modules. ANS users negotiate or dispute with others via a computer network. The negotiation status is managed in the form of a negotiation diagram, an extension of Toulmin’s argument diagram that contains all arguments asserted by the participants. The negotiation protocols are defined as operations on the negotiation diagram, which grows as the participants exchange counterarguments. Nonmonotonic reasoning using rule priorities is applied to the negotiation diagram.

  6. An Ambulatory System for Gait Monitoring Based on Wireless Sensorized Insoles.

    PubMed

    González, Iván; Fontecha, Jesús; Hervás, Ramón; Bravo, José

    2015-07-09

    A new gait phase detection system for continuous monitoring based on wireless sensorized insoles is presented. The system can be used in gait analysis mobile applications, and it is designed for real-time demarcation of gait phases. The system employs pressure sensors to assess the force exerted by each foot during walking. A fuzzy rule-based inference algorithm is implemented on a smartphone and used to detect each of the gait phases based on the sensor signals. Additionally, to provide a solution that is insensitive to perturbations caused by non-walking activities, a probabilistic classifier is employed to discriminate walking forward from other low-level activities, such as turning, walking backwards, lateral walking, etc. The combination of these two algorithms constitutes the first approach towards a continuous gait assessment system, by means of the avoidance of non-walking influences.

  7. The Multi-Intelligence Tools Suite - Supporting Research and Development in Information and Knowledge Exploitation

    DTIC Science & Technology

    2011-06-01

    Each atom definition specifies how to build a membership fact, fixing the precise order of its pieces; each argument has a label and a type. Figure 2 shows the inference rule editor, with fields for the rule name, the rule premises, and the rule conclusions. Each premise in the rule premises list represents a fact condition that must be found in the fact base for a conclusion to be created using the rule.

  8. Identification of rheumatoid arthritis and osteoarthritis patients by transcriptome-based rule set generation

    PubMed Central

    2014-01-01

    Introduction Discrimination of rheumatoid arthritis (RA) patients from patients with other inflammatory or degenerative joint diseases or healthy individuals purely on the basis of genes differentially expressed in high-throughput data has proven very difficult. Thus, the present study sought to achieve such discrimination by employing a novel unbiased approach using rule-based classifiers. Methods Three multi-center genome-wide transcriptomic data sets (Affymetrix HG-U133 A/B) from a total of 79 individuals, including 20 healthy controls (control group - CG), as well as 26 osteoarthritis (OA) and 33 RA patients, were used to infer rule-based classifiers to discriminate the disease groups. The rules were ranked with respect to Kiendl’s statistical relevance index, and the resulting rule set was optimized by pruning. The rule sets were inferred separately from data of one of three centers and applied to the two remaining centers for validation. All rules from the optimized rule sets of all centers were used to analyze their biological relevance applying the software Pathway Studio. Results The optimized rule sets for the three centers contained a total of 29, 20, and 8 rules (including 10, 8, and 4 rules for ‘RA’), respectively. The mean sensitivity for the prediction of RA based on six center-to-center tests was 96% (range 90% to 100%), that for OA 86% (range 40% to 100%). The mean specificity for RA prediction was 94% (range 80% to 100%), that for OA 96% (range 83.3% to 100%). The average overall accuracy of the three different rule-based classifiers was 91% (range 80% to 100%). Unbiased analyses by Pathway Studio of the gene sets obtained by discrimination of RA from OA and CG with rule-based classifiers resulted in the identification of the pathogenetically and/or therapeutically relevant interferon-gamma and GM-CSF pathways. Conclusion First-time application of rule-based classifiers for the discrimination of RA resulted in high performance, with means for all assessment parameters close to or higher than 90%. In addition, this unbiased, new approach resulted in the identification not only of pathways known to be critical to RA, but also of novel molecules such as serine/threonine kinase 10. PMID:24690414

  9. System and method for creating expert systems

    NASA Technical Reports Server (NTRS)

    Hughes, Peter M. (Inventor); Luczak, Edward C. (Inventor)

    1998-01-01

    A system and method provides for the creation of a highly graphical expert system without the need for programming in code. An expert system is created by initially building a data interface, defining appropriate Mission, User-Defined, Inferred, and externally-generated GenSAA (EGG) data variables whose data values will be updated and input into the expert system. Next, rules of the expert system are created by building appropriate conditions of the rules which must be satisfied and then by building appropriate actions of rules which are to be executed upon corresponding conditions being satisfied. Finally, an appropriate user interface is built which can be highly graphical in nature and which can include appropriate message display and/or modification of display characteristics of a graphical display object, to visually alert a user of the expert system of varying data values, upon conditions of a created rule being satisfied. The data interface building, rule building, and user interface building are done in an efficient manner and can be created without the need for programming in code.

  10. Rule acquisition in formal decision contexts based on formal, object-oriented and property-oriented concept lattices.

    PubMed

    Ren, Yue; Li, Jinhai; Aswani Kumar, Cherukuri; Liu, Wenqi

    2014-01-01

    Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: "if conditions 1,2,…, and m hold, then decisions hold." In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency.

  11. Rule Acquisition in Formal Decision Contexts Based on Formal, Object-Oriented and Property-Oriented Concept Lattices

    PubMed Central

    Ren, Yue; Aswani Kumar, Cherukuri; Liu, Wenqi

    2014-01-01

    Rule acquisition is one of the main purposes in the analysis of formal decision contexts. Up to now, there have been several types of rules in formal decision contexts such as decision rules, decision implications, and granular rules, which can be viewed as ∧-rules since all of them have the following form: “if conditions 1,2,…, and m hold, then decisions hold.” In order to enrich the existing rule acquisition theory in formal decision contexts, this study puts forward two new types of rules which are called ∨-rules and ∨-∧ mixed rules based on formal, object-oriented, and property-oriented concept lattices. Moreover, a comparison of ∨-rules, ∨-∧ mixed rules, and ∧-rules is made from the perspectives of inclusion and inference relationships. Finally, some real examples and numerical experiments are conducted to compare the proposed rule acquisition algorithms with the existing one in terms of the running efficiency. PMID:25165744

  12. Sensor-based activity recognition using extended belief rule-based inference methodology.

    PubMed

    Calzada, A; Liu, J; Nugent, C D; Wang, H; Martinez, L

    2014-01-01

    The recently developed extended belief rule-based inference methodology (RIMER+) recognizes the need to model the different types of information and uncertainty that usually coexist in real environments. A home setting with sensors located in different rooms and on different appliances can be considered a particularly relevant example of such an environment, and it brings a range of challenges for sensor-based activity recognition. Although RIMER+ has been designed as a generic decision model that could be applied in a wide range of situations, this paper discusses how the methodology can be adapted to recognize human activities using binary sensors within smart environments. The evaluation of RIMER+ against other state-of-the-art classifiers showed significant advantages in terms of accuracy, efficiency and applicability, especially in situations of input data incompleteness; these results demonstrate the potential of the methodology and underpin further research on the topic.

  13. DNA biosensors that reason.

    PubMed

    Sainz de Murieta, Iñaki; Rodríguez-Patón, Alfonso

    2012-08-01

    Despite the many designs of devices operating by DNA strand displacement, surprisingly none has been explicitly devoted to the implementation of logical deductions. The present article introduces a new model of biosensor device that uses nucleic acid strands to encode simple rules such as "IF DNA_strand(1) is present THEN disease(A)" or "IF DNA_strand(1) AND DNA_strand(2) are present THEN disease(B)". Taking advantage of the strand displacement operation, our model makes these simple rules interact with input signals (either DNA or any type of RNA) to generate an output signal (in the form of nucleotide strands). This output signal represents a diagnosis, which can be measured using FRET techniques, cascaded as the input of another logical deduction with different rules, or even be a drug that is administered in response to a set of symptoms. The encoding introduces an implicit error cancellation mechanism, which increases the system's scalability, enabling longer inference cascades with a bounded and controllable signal-noise relation. It also allows the same rule to be used in forward inference or backward inference, providing the option of validly outputting negated propositions (e.g. "diagnosis A excluded"). The models presented in this paper can be used to implement smart logical DNA devices that perform genetic diagnosis in vitro.

  14. A New Fuzzy-Evidential Controller for Stabilization of the Planar Inverted Pendulum System

    PubMed Central

    Tang, Yongchuan; Zhou, Deyun

    2016-01-01

    In order to realize the stability control of the planar inverted pendulum system, which is a typical multi-variable and strong coupling system, a new fuzzy-evidential controller based on fuzzy inference and evidential reasoning is proposed. Firstly, for each axis, a fuzzy nine-point controller for the rod and a fuzzy nine-point controller for the cart are designed. Then, in order to coordinate these two controllers of each axis, a fuzzy-evidential coordinator is proposed. In this new fuzzy-evidential controller, the empirical knowledge for stabilization of the planar inverted pendulum system is expressed by fuzzy rules, while the coordinator of different control variables in each axis is built incorporated with the dynamic basic probability assignment (BPA) in the frame of fuzzy inference. The fuzzy-evidential coordinator makes the output of the control variable smoother, and the control effect of the new controller is better compared with some other work. The experiment in MATLAB shows the effectiveness and merit of the proposed method. PMID:27482707

  15. A New Fuzzy-Evidential Controller for Stabilization of the Planar Inverted Pendulum System.

    PubMed

    Tang, Yongchuan; Zhou, Deyun; Jiang, Wen

    2016-01-01

    In order to realize the stability control of the planar inverted pendulum system, which is a typical multi-variable and strong coupling system, a new fuzzy-evidential controller based on fuzzy inference and evidential reasoning is proposed. Firstly, for each axis, a fuzzy nine-point controller for the rod and a fuzzy nine-point controller for the cart are designed. Then, in order to coordinate these two controllers of each axis, a fuzzy-evidential coordinator is proposed. In this new fuzzy-evidential controller, the empirical knowledge for stabilization of the planar inverted pendulum system is expressed by fuzzy rules, while the coordinator of different control variables in each axis is built incorporated with the dynamic basic probability assignment (BPA) in the frame of fuzzy inference. The fuzzy-evidential coordinator makes the output of the control variable smoother, and the control effect of the new controller is better compared with some other work. The experiment in MATLAB shows the effectiveness and merit of the proposed method.

  16. Natural frequencies facilitate diagnostic inferences of managers

    PubMed Central

    Hoffrage, Ulrich; Hafenbrädl, Sebastian; Bouquet, Cyril

    2015-01-01

    In Bayesian inference tasks, information about base rates as well as hit rate and false-alarm rate needs to be integrated according to Bayes’ rule once the result of a diagnostic test becomes known. Numerous studies have found that presenting information in a Bayesian inference task in terms of natural frequencies leads to better performance than variants with information presented in terms of probabilities or percentages. Natural frequencies are the tallies in a natural sample in which hit rate and false-alarm rate are not normalized with respect to base rates. The present research replicates the beneficial effect of natural frequencies with four tasks from the domain of management, with management students as well as experienced executives as participants. The percentage of Bayesian responses was almost twice as high when information was presented in natural frequencies as when it was presented in terms of percentages. In contrast to most tasks previously studied, the majority of numerical responses were lower than the Bayesian solutions. Having heard of Bayes’ rule prior to the study did not affect Bayesian performance. An implication of our work is that textbooks explaining Bayes’ rule should teach how to represent information in terms of natural frequencies instead of how to plug probabilities or percentages into a formula. PMID:26157397
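
    The advantage of the representation is easy to see in code: with natural frequencies the posterior is a simple ratio of two tallies, with no explicit Bayes' rule formula to plug into. The rates below are illustrative, not those of the study's four management tasks.

      # Out of 1000 cases: a 4% base rate -> 40 with the condition; a 75% hit
      # rate -> 30 of them test positive; a 10% false-alarm rate -> 96 of the
      # remaining 960 also test positive. (Illustrative rates.)
      population = 1000
      base_rate, hit_rate, false_alarm = 0.04, 0.75, 0.10
      true_pos = population * base_rate * hit_rate              # 30 cases
      false_pos = population * (1 - base_rate) * false_alarm    # 96 cases
      posterior = true_pos / (true_pos + false_pos)             # 30 / 126
      print(round(posterior, 3))   # 0.238 = P(condition | positive test)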

  17. Preschoolers can infer general rules governing fantastical events in fiction.

    PubMed

    Van de Vondervoort, Julia W; Friedman, Ori

    2014-05-01

    Young children are frequently exposed to fantastic fiction. How do they make sense of the unrealistic and impossible events that occur in such fiction? Although children could view such events as isolated episodes, the present experiments suggest that children use such events to infer general fantasy rules. In 2 experiments, 2- to 4-year-olds were shown scenarios in which 2 animals behaved unrealistically (N = 78 in Experiment 1, N = 94 in Experiment 2). When asked to predict how other animals in the fiction would behave, children predicted novel behaviors consistent with the nature of the fiction. These findings suggest that preschoolers can infer the general rules that govern the events and entities in fantastic fiction and can use these rules to predict what events will happen in the fiction. The findings also provide evidence that children may infer fantasy rules at a more superordinate level than the basic level.

  18. Determining rules for closing customer service centers: A public utility company's fuzzy decision

    NASA Technical Reports Server (NTRS)

    Dekorvin, Andre; Shipley, Margaret F.; Lea, Robert N.

    1992-01-01

    In the present work, we consider the general problem of knowledge acquisition under uncertainty. Simply stated, the problem reduces to the following: how can we capture the knowledge of an expert when the expert is unable to clearly formulate how he or she arrives at a decision? A commonly used method is to learn by examples. We observe how the expert solves specific cases and from this infer some rules by which the decision may have been made. Unique to our work is the fuzzy set representation of the conditions or attributes upon which the expert may possibly base his fuzzy decision. From our examples, we infer certain and possible fuzzy rules for closing a customer service center and illustrate the importance of having the decision closely relate to the conditions under consideration.

  19. Method to integrate clinical guidelines into the electronic health record (EHR) by applying the archetypes approach.

    PubMed

    Garcia, Diego; Moro, Claudia Maria Cabral; Cicogna, Paulo Eduardo; Carvalho, Deborah Ribeiro

    2013-01-01

    Clinical guidelines are documents that assist healthcare professionals, facilitating and standardizing diagnosis, management, and treatment in specific areas. Computerized guidelines as decision support systems (DSS) attempt to increase the performance of tasks and facilitate the use of guidelines. Most DSS are not integrated into the electronic health record (EHR), forcing some degree of rework, especially related to data collection. This study's objective was to present a method for integrating clinical guidelines into the EHR. The study first developed a way to identify the data and rules contained in a guideline, and then incorporated the rules into an archetype-based EHR. The proposed method was tested on the anemia treatment section of the Chronic Kidney Disease Guideline. The phases of the method are: data and rules identification; archetype elaboration; rule definition and inclusion in an inference engine; and DSS-EHR integration and validation. The main feature of the proposed method is that it is generic and can be applied to any type of guideline.

  20. Approximate reasoning using terminological models

    NASA Technical Reports Server (NTRS)

    Yen, John; Vaidya, Nitin

    1992-01-01

    Term Subsumption Systems (TSS) form a knowledge-representation scheme in AI that can express the defining characteristics of concepts through a formal language that has a well-defined semantics and incorporates a reasoning mechanism that can deduce whether one concept subsumes another. However, TSSs have very limited ability to deal with the issue of uncertainty in knowledge bases. The objective of this research is to address issues in combining approximate reasoning with term subsumption systems. To do this, we have extended an existing AI architecture (CLASP) that is built on top of a term subsumption system (LOOM). First, the assertional component of LOOM has been extended for asserting and representing uncertain propositions. Second, we have extended the pattern matcher of CLASP for plausible rule-based inferences. Third, an approximate reasoning model has been added to facilitate various kinds of approximate reasoning. And finally, the issue of inconsistency in truth values due to inheritance is addressed using justification of those values. This architecture enhances the reasoning capabilities of expert systems by providing support for reasoning under uncertainty using knowledge captured in TSS. Also, as definitional knowledge is explicit and separate from heuristic knowledge for plausible inferences, the maintainability of expert systems could be improved.

  1. Inference for Transition Network Grammars,

    DTIC Science & Technology

    1976-01-01

    If the arc is followed... A language L(G) is said to be structurally complete if each rewriting rule... The power of an augmented transition network (ATN) is... Clearly, a context-sensitive grammar can be represented as a context-free grammar (base) plus a set of transformational rules... are the foundations of grammars of different complexities. The CSL is obtained by appl

  2. CORMIX1: AN EXPERT SYSTEM FOR MIXING ZONE ANALYSIS OF TOXIC AND CONVENTIONAL, SINGLE PORT AQUATIC DISCHARGES

    EPA Science Inventory

    An expert system, CORMIX1, was developed to predict the dilution and trajectory of a single buoyant discharge into an unstratified aquatic environment with and without crossflow. The system uses knowledge and inference rules obtained from hydrodynamic experts to classify and pred...

  3. The design and application of a Transportable Inference Engine (TIE1)

    NASA Technical Reports Server (NTRS)

    Mclean, David R.

    1986-01-01

    A Transportable Inference Engine (TIE1) system has been developed by the author as part of the Interactive Experimenter Planning System (IEPS) task which is involved with developing expert systems in support of the Spacecraft Control Programs Branch at Goddard Space Flight Center in Greenbelt, Maryland. Unlike traditional inference engines, TIE1 is written in the C programming language. In the TIE1 system, knowledge is represented by a hierarchical network of objects which have rule frames. The TIE1 search algorithm uses a set of strategies, including backward chaining, to obtain the values of goals. The application of TIE1 to a spacecraft scheduling problem is described. This application involves the development of a strategies interpreter which uses TIE1 to do constraint checking.

  4. An Ambulatory System for Gait Monitoring Based on Wireless Sensorized Insoles

    PubMed Central

    González, Iván; Fontecha, Jesús; Hervás, Ramón; Bravo, José

    2015-01-01

    A new gait phase detection system for continuous monitoring based on wireless sensorized insoles is presented. The system can be used in gait analysis mobile applications, and it is designed for real-time demarcation of gait phases. The system employs pressure sensors to assess the force exerted by each foot during walking. A fuzzy rule-based inference algorithm is implemented on a smartphone and used to detect each of the gait phases based on the sensor signals. Additionally, to provide a solution that is insensitive to perturbations caused by non-walking activities, a probabilistic classifier is employed to discriminate walking forward from other low-level activities, such as turning, walking backwards, lateral walking, etc. The combination of these two algorithms constitutes the first approach towards a continuous gait assessment system, by means of the avoidance of non-walking influences. PMID:26184199

  5. Knowledge acquisition and representation using fuzzy evidential reasoning and dynamic adaptive fuzzy Petri nets.

    PubMed

    Liu, Hu-Chen; Liu, Long; Lin, Qing-Lian; Liu, Nan

    2013-06-01

    The two most important issues for expert systems are the acquisition of domain experts' professional knowledge, and the representation of, and reasoning with, the knowledge rules that have been identified. First, during expert knowledge acquisition, the domain expert panel often demonstrates different experience and knowledge from one another and produces different types of knowledge information, such as complete and incomplete, precise and imprecise, and known and unknown, because of its cross-functional and multidisciplinary nature. Second, as a promising tool for knowledge representation and reasoning, fuzzy Petri nets (FPNs) still suffer from a couple of deficiencies. The parameters in current FPN models cannot accurately represent increasingly complex knowledge-based systems, and the rules in most existing knowledge inference frameworks cannot be dynamically adjusted according to propositions' variation, as in human cognition and thinking. In this paper, we present a knowledge acquisition and representation approach using the fuzzy evidential reasoning approach and dynamic adaptive FPNs to solve the problems mentioned above. As illustrated by a numerical example, the proposed approach can capture experts' diverse experience well, enhance the knowledge representation power, and reason over the rule-based knowledge more intelligently.

  6. Is awareness necessary for true inference?

    PubMed

    Leo, Peter D; Greene, Anthony J

    2008-09-01

    In transitive inference, participants learn a set of context-dependent discriminations that can be organized into a hierarchy that supports inference. Several studies show that inference occurs with or without task awareness. However, some studies assert that without awareness, performance is attributable to pseudoinference. By this account, inference-like performance is achieved by differential stimulus weighting according to the stimuli's proximity to the end items of the hierarchy. We implement an inference task that cannot be based on differential stimulus weighting. The design itself rules out pseudoinference strategies. Success on the task without evidence of deliberative strategies would therefore suggest that true inference can be achieved implicitly. We found that accurate performance on the inference task was not dependent on explicit awareness. The finding is consistent with a growing body of evidence that indicates that forms of learning and memory supporting inference and flexibility do not necessarily depend on task awareness.

  7. Linguistic Summarization of Video for Fall Detection Using Voxel Person and Fuzzy Logic

    PubMed Central

    Anderson, Derek; Luke, Robert H.; Keller, James M.; Skubic, Marjorie; Rantz, Marilyn; Aud, Myra

    2009-01-01

    In this paper, we present a method for recognizing human activity from linguistic summarizations of temporal fuzzy inference curves representing the states of a three-dimensional object called voxel person. A hierarchy of fuzzy logic is used, where the output from each level is summarized and fed into the next level. We present a two level model for fall detection. The first level infers the states of the person at each image. The second level operates on linguistic summarizations of voxel person’s states and inference regarding activity is performed. The rules used for fall detection were designed under the supervision of nurses to ensure that they reflect the manner in which elders perform these activities. The proposed framework is extremely flexible. Rules can be modified, added, or removed, allowing for per-resident customization based on knowledge about their cognitive and physical ability. PMID:20046216

  8. Collaborative human-machine analysis using a controlled natural language

    NASA Astrophysics Data System (ADS)

    Mott, David H.; Shemanski, Donald R.; Giammanco, Cheryl; Braines, Dave

    2015-05-01

    A key aspect of an analyst's task in providing relevant information from data is the reasoning about the implications of that data, in order to build a picture of the real world situation. This requires human cognition, based upon domain knowledge about individuals, events and environmental conditions. For a computer system to collaborate with an analyst, it must be capable of following a similar reasoning process to that of the analyst. We describe ITA Controlled English (CE), a subset of English to represent analyst's domain knowledge and reasoning, in a form that it is understandable by both analyst and machine. CE can be used to express domain rules, background data, assumptions and inferred conclusions, thus supporting human-machine interaction. A CE reasoning and modeling system can perform inferences from the data and provide the user with conclusions together with their rationale. We present a logical problem called the "Analysis Game", used for training analysts, which presents "analytic pitfalls" inherent in many problems. We explore an iterative approach to its representation in CE, where a person can develop an understanding of the problem solution by incremental construction of relevant concepts and rules. We discuss how such interactions might occur, and propose that such techniques could lead to better collaborative tools to assist the analyst and avoid the "pitfalls".

  9. Sex differences in the brain: implications for explaining autism.

    PubMed

    Baron-Cohen, Simon; Knickmeyer, Rebecca C; Belmonte, Matthew K

    2005-11-04

    Empathizing is the capacity to predict and to respond to the behavior of agents (usually people) by inferring their mental states and responding to these with an appropriate emotion. Systemizing is the capacity to predict and to respond to the behavior of nonagentive deterministic systems by analyzing input-operation-output relations and inferring the rules that govern such systems. At a population level, females are stronger empathizers and males are stronger systemizers. The "extreme male brain" theory posits that autism represents an extreme of the male pattern (impaired empathizing and enhanced systemizing). Here we suggest that specific aspects of autistic neuroanatomy may also be extremes of typical male neuroanatomy.

  10. Improving the anesthetic process by a fuzzy rule based medical decision system.

    PubMed

    Mendez, Juan Albino; Leon, Ana; Marrero, Ayoze; Gonzalez-Cava, Jose M; Reboso, Jose Antonio; Estevez, Jose Ignacio; Gomez-Gonzalez, José F

    2018-01-01

    The main objective of this research is the design and implementation of a new fuzzy logic tool for automatic drug delivery in patients undergoing general anesthesia. The aim is to adjust the drug dose to the real patient needs using heuristic knowledge provided by clinicians. A two-level computer decision system is proposed, the idea being to release the clinician from routine tasks so that he can focus on other variables of the patient. The controller uses the Bispectral Index (BIS) to assess the hypnotic state of the patient. The fuzzy controller was included in a closed-loop system to reach the BIS target and reject disturbances. BIS was measured using a BIS VISTA monitor, a device capable of calculating the hypnosis level of the patient through EEG information. An infusion pump with propofol 1% is used to supply the drug to the patient. The inputs to the fuzzy inference system are the BIS error and the BIS rate; the output is the infusion rate increment. The mapping from the input information to the appropriate output is given by a rule base built on the knowledge of clinicians. To evaluate the performance of the proposed fuzzy closed-loop system, an observational study was carried out. Eighty-one patients scheduled for ambulatory surgery were randomly distributed into two groups: one group using a fuzzy logic based closed-loop system (FCL) to automate the administration of propofol (42 cases), the second group using manual delivery of the drug (39 cases). In both groups, the BIS target was 50. The FCL, designed with intuitive logic rules based on clinician experience, performed satisfactorily and outperformed manual administration in terms of accuracy throughout the maintenance stage.
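
    The controller structure described, BIS error and BIS rate in and an infusion-rate increment out through a clinician-style rule base, can be sketched with triangular memberships and weighted-average defuzzification. All membership ranges and rule consequents below are illustrative stand-ins, not the clinical rule base; here the error is taken as measured BIS minus target, so a positive error (patient too light) increases the infusion.

      def tri(x, a, b, c):
          """Triangular membership function with support [a, c] and peak b."""
          return max(min((x - a) / (b - a), (c - x) / (c - b)), 0.0)

      # linguistic terms for BIS error (measured - target) and BIS rate
      ERROR_SETS = {"neg": (-60, -30, 0), "zero": (-10, 0, 10), "pos": (0, 30, 60)}
      RATE_SETS = {"fall": (-20, -10, 0), "steady": (-5, 0, 5), "rise": (0, 10, 20)}

      # illustrative rule table: (error term, rate term) -> infusion increment
      RULES = {("neg", "fall"): -2.0, ("neg", "steady"): -1.0, ("neg", "rise"): 0.0,
               ("zero", "fall"): -1.0, ("zero", "steady"): 0.0, ("zero", "rise"): 1.0,
               ("pos", "fall"): 0.0, ("pos", "steady"): 1.0, ("pos", "rise"): 2.0}

      def infusion_increment(error, rate):
          num = den = 0.0
          for (e_term, r_term), out in RULES.items():
              w = min(tri(error, *ERROR_SETS[e_term]), tri(rate, *RATE_SETS[r_term]))
              num += w * out                   # weighted-average defuzzification
              den += w
          return num / den if den else 0.0

      print(infusion_increment(error=8.0, rate=-3.0))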

  11. Bayesian networks improve causal environmental assessments for evidence-based policy

    EPA Science Inventory

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the p...

  12. Determining rules for closing customer service centers: A public utility company's fuzzy decision

    NASA Technical Reports Server (NTRS)

    Dekorvin, Andre; Shipley, Margaret F.

    1992-01-01

    In the present work, we consider the general problem of knowledge acquisition under uncertainty. A commonly used method is to learn by examples: we observe how the expert solves specific cases and from this infer some rules by which the decision was made. Unique to this work is the fuzzy set representation of the conditions or attributes upon which the decision maker may base his fuzzy decision. From our examples, we infer certain and possible rules containing fuzzy terms. It should be stressed that the procedure determines how closely the expert follows the conditions under consideration in making his decision. We offer two examples pertaining to the possible decision to close a customer service center of a public utility company. In the first example, the decision maker does not follow the conditions too closely; in the second, the conditions are much more relevant to the expert's decision.

  13. Object/rule integration in CLIPS. [C Language Integrated Production System

    NASA Technical Reports Server (NTRS)

    Donnell, Brian L.

    1993-01-01

    This paper gives a brief overview of the C Language Integrated Production System (CLIPS) with a focus on the object-oriented features. The advantages of an object data representation over the traditional working memory element (WME), i.e., facts, are discussed, and the implementation of the Rete inference algorithm in CLIPS is presented in detail. A few methods for achieving pattern-matching on objects with the current inference engine are given, and finally, the paper examines the modifications necessary to the Rete algorithm to allow direct object pattern-matching.
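
    The contrast the paper draws, matching patterns directly on objects rather than on WME facts, can be illustrated with a deliberately naive matcher that re-tests every rule against every object on each cycle; this repeated work is precisely what a Rete network avoids by caching partial matches. The class and rule below are hypothetical.

      class Valve:
          def __init__(self, name, state):
              self.name, self.state = name, state

      # naive object pattern matching: every rule condition is re-evaluated
      # against every object each cycle (the cost Rete amortizes away)
      def match(objects, rules):
          return [(action, obj) for condition, action in rules
                  for obj in objects if condition(obj)]

      rules = [(lambda v: isinstance(v, Valve) and v.state == "stuck",
                lambda v: print(f"alarm: {v.name} is stuck"))]
      for action, obj in match([Valve("V1", "open"), Valve("V2", "stuck")], rules):
          action(obj)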

  14. ARNetMiT R Package: association rules based gene co-expression networks of miRNA targets.

    PubMed

    Özgür Cingiz, M; Biricik, G; Diri, B

    2017-03-31

    miRNAs are key regulators that bind to target genes to suppress their expression. The relations between miRNAs and target genes enable users to derive co-expressed genes that may be involved in similar biological processes and functions in cells. We hypothesize that target genes of miRNAs are co-expressed when they are regulated by multiple miRNAs. With these co-expressed genes, we can theoretically construct gene co-expression networks (GCNs) related to 152 diseases. In this study, we introduce ARNetMiT, which utilizes a hash-based association rule algorithm in a novel way to infer GCNs from miRNA-target gene data. We also present the R package of ARNetMiT, which infers and visualizes GCNs of diseases selected by users. Our approach treats miRNAs as transactions and their target genes as items. Support and confidence values are used to prune association rules on the miRNA-target gene data in order to construct support-based GCNs (sGCNs) along with support- and confidence-based GCNs (scGCNs). We use overlap analysis and topological features for the performance analysis of the GCNs, and we also infer GCNs with popular GNI algorithms for comparison with those of ARNetMiT. Overlap analysis results show that ARNetMiT outperforms the compared GNI algorithms, and using high confidence values in scGCNs increases the ratio of overlapped gene-gene interactions between the compared methods. According to the evaluation of the topological features of the ARNetMiT-based GCNs, the node degrees follow a power-law distribution, and the hub genes discovered by the ARNetMiT-based GCNs are consistent with the literature.
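
    The transaction view described, each miRNA a transaction and its target genes the items, reduces edge inference to support and confidence thresholds. The sketch below uses hypothetical miRNA and gene names and omits the hash-based candidate generation of the actual algorithm.

      # transactions: each miRNA is a transaction whose items are its targets
      TRANSACTIONS = {
          "miR-1":   {"GENE_A", "GENE_B", "GENE_C"},
          "miR-21":  {"GENE_A", "GENE_B"},
          "miR-155": {"GENE_B", "GENE_D"},
      }

      def support(itemset):
          hits = sum(itemset <= targets for targets in TRANSACTIONS.values())
          return hits / len(TRANSACTIONS)

      def confidence(lhs, rhs):
          return support(lhs | rhs) / support(lhs)

      # candidate co-expression edge GENE_A -- GENE_B
      lhs, rhs = {"GENE_A"}, {"GENE_B"}
      s, c = support(lhs | rhs), confidence(lhs, rhs)
      print(f"support={s:.2f} confidence={c:.2f}")   # 0.67 and 1.00
      if s >= 0.5 and c >= 0.8:
          print("add edge GENE_A--GENE_B to the co-expression network")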

  15. Smart Aerospace eCommerce: Using Intelligent Agents in a NASA Mission Services Ordering Application

    NASA Technical Reports Server (NTRS)

    Moleski, Walt; Luczak, Ed; Morris, Kim; Clayton, Bill; Scherf, Patricia; Obenschain, Arthur F. (Technical Monitor)

    2002-01-01

    This paper describes how intelligent agent technology was successfully prototyped and then deployed in a smart eCommerce application for NASA. An intelligent software agent called the Intelligent Service Validation Agent (ISVA) was added to an existing web-based ordering application to validate complex orders for spacecraft mission services. This integration of intelligent agent technology with conventional web technology satisfies an immediate NASA need to reduce manual order processing costs. The ISVA agent checks orders for completeness, consistency, and correctness, and notifies users of detected problems. ISVA uses NASA business rules and a knowledge base of NASA services, and is implemented using the Java Expert System Shell (Jess), a fast rule-based inference engine. The paper discusses the design of the agent and knowledge base, the prototyping and deployment approach, and future directions and other applications, and presents lessons learned that may help other projects make their aerospace eCommerce applications smarter.

  16. Qualitative reasoning for biological network inference from systematic perturbation experiments.

    PubMed

    Badaloni, Silvana; Di Camillo, Barbara; Sambo, Francesco

    2012-01-01

    The systematic perturbation of the components of a biological system has been proven among the most informative experimental setups for the identification of causal relations between the components. In this paper, we present Systematic Perturbation-Qualitative Reasoning (SPQR), a novel Qualitative Reasoning approach to automate the interpretation of the results of systematic perturbation experiments. Our method is based on a qualitative abstraction of the experimental data: for each perturbation experiment, measured values of the observed variables are modeled as lower, equal or higher than the measurements in the wild type condition, when no perturbation is applied. The algorithm exploits a set of IF-THEN rules to infer causal relations between the variables, analyzing the patterns of propagation of the perturbation signals through the biological network, and is specifically designed to minimize the rate of false positives among the inferred relations. Tested on both simulated and real perturbation data, SPQR indeed exhibits a significantly higher precision than the state of the art.
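
    To make the qualitative abstraction concrete, here is a small Python sketch in the spirit of the description above (the tolerance, the data, and the single IF-THEN rule are invented, not the published SPQR rule set):

        def qualitize(value, wild_type, tol=0.05):
            """Map a measurement to 'lower', 'equal', or 'higher' vs. wild type."""
            if value < wild_type * (1 - tol):
                return "lower"
            if value > wild_type * (1 + tol):
                return "higher"
            return "equal"

        # perturbations[p][g] = expression of gene g when gene p is perturbed
        wild_type = {"A": 1.0, "B": 1.0, "C": 1.0}
        perturbations = {"A": {"A": 0.1, "B": 0.4, "C": 1.0}}

        edges = set()
        for perturbed, profile in perturbations.items():
            for gene, value in profile.items():
                state = qualitize(value, wild_type[gene])
                # IF perturbing p changes g (and g != p) THEN posit a causal edge p -> g
                if gene != perturbed and state != "equal":
                    edges.add((perturbed, gene, state))

        print(edges)  # {('A', 'B', 'lower')}: perturbing A lowers B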

  17. Making intelligent systems team players. A guide to developing intelligent monitoring systems

    NASA Technical Reports Server (NTRS)

    Land, Sherry A.; Malin, Jane T.; Thronesberry, Carroll; Schreckenghost, Debra L.

    1995-01-01

    This reference guide for developers of intelligent monitoring systems is based on lessons learned by developers of the DEcision Support SYstem (DESSY), an expert system that monitors Space Shuttle telemetry data in real time. DESSY makes inferences about commands, state transitions, and simple failures. It performs failure detection rather than in-depth failure diagnostics. A listing of rules from DESSY and cue cards from DESSY subsystems are included to give the development community a better understanding of the selected model system. The G-2 programming tool used in developing DESSY provides an object-oriented, rule-based environment, but many of the principles in use here can be applied to any type of intelligent monitoring system. The step-by-step instructions and examples given for each stage of development are in G-2, but can be used with other development tools. This guide first defines the authors' concept of real-time monitoring systems, then tells prospective developers how to determine system requirements, how to build the system through a combined design/development process, and how to solve problems involved in working with real-time data. It explains the relationships among operational prototyping, software evolution, and the user interface. It also explains methods of testing, verification, and validation, and includes suggestions for preparing reference documentation and training users.

  18. Combining Human and Machine Intelligence to Derive Agents' Behavioral Rules for Groundwater Irrigation

    NASA Astrophysics Data System (ADS)

    Hu, Y.; Quinn, C.; Cai, X.

    2015-12-01

    One major challenge of agent-based modeling is deriving agents' behavioral rules in the face of behavioral uncertainty and data scarcity. This study proposes a new approach that combines data-driven modeling based on directed information (i.e., machine intelligence) with expert domain knowledge (i.e., human intelligence) to derive the behavioral rules of agents under behavioral uncertainty. A directed information graph algorithm is applied to identify the causal relationships between agents' decisions (i.e., groundwater irrigation depth) and time series of environmental, socio-economic and institutional factors. A case study is conducted for the High Plains aquifer hydrological observatory (HO) area, U.S. Preliminary results show that four factors, corn price (CP), underlying groundwater level (GWL), monthly mean temperature (T) and precipitation (P), have causal influences on agents' decisions on groundwater irrigation depth (GWID) to various extents. Based on the similarity of the directed information graph for each agent, five clusters of graphs are further identified to represent all the agents' behaviors in the study area, as shown in Figure 1. Using these five representative graphs, agents' monthly optimal groundwater pumping rates are derived through probabilistic inference. Such data-driven relationships and probabilistic quantifications are then coupled with a physically based groundwater model to investigate the interactions between agents' pumping behaviors and the underlying groundwater system in the context of coupled human and natural systems.

  19. A Fuzzy Reasoning Design for Fault Detection and Diagnosis of a Computer-Controlled System

    PubMed Central

    Ting, Y.; Lu, W.B.; Chen, C.H.; Wang, G.K.

    2008-01-01

    A Fuzzy Reasoning and Verification Petri Nets (FRVPNs) model is established for an error detection and diagnosis mechanism (EDDM) applied to a complex fault-tolerant PC-controlled system. The inference accuracy is improved through the hierarchical design of a two-level fuzzy rule decision tree (FRDT) and a Petri nets (PNs) technique used to transform the fuzzy rules into the FRVPNs model. Several simulation examples of assumed failure events were carried out with the FRVPNs and with the Mamdani fuzzy method using MATLAB tools. The reasoning performance of the developed FRVPNs was verified by comparing the inference outcome to that of the Mamdani method. Both methods reach the same conclusions. Thus, the present study demonstrates that the proposed FRVPNs model is able to achieve the purpose of reasoning and, furthermore, of determining the failure event of the monitored application program. PMID:19255619
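
    For readers unfamiliar with the baseline, the Mamdani scheme compared against above reduces to min implication, max aggregation and centroid defuzzification; the one-input, two-rule Python sketch below uses invented membership functions, not the paper's FRDT:

        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function peaking at b."""
            return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

        y = np.linspace(0.0, 1.0, 201)     # output universe: failure severity
        error = 0.45                       # crisp input: observed error signal

        # Rule 1: IF error is LOW  THEN severity is MINOR
        # Rule 2: IF error is HIGH THEN severity is CRITICAL
        w1 = tri(error, -0.5, 0.0, 0.6)    # firing strength of rule 1
        w2 = tri(error, 0.4, 1.0, 1.6)     # firing strength of rule 2

        minor    = np.minimum(w1, tri(y, 0.0, 0.2, 0.5))   # min implication
        critical = np.minimum(w2, tri(y, 0.5, 0.8, 1.0))
        aggregate = np.maximum(minor, critical)            # max aggregation

        severity = float((aggregate * y).sum() / aggregate.sum())  # discrete centroid
        print(round(severity, 3))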

  20. Forward chaining method on diagnosis of diseases and pests corn crop

    NASA Astrophysics Data System (ADS)

    Nurlaeli, Subiyanto

    2017-03-01

    Integrated pest management should be applied to control the explosion of plant pests and diseases caused by uncertain climate change. This paper presents an implementation of the forward chaining method for diagnosing diseases and pests of the corn crop, to help farmers and agricultural facilitators gain knowledge about corn diseases and pests. The forward chaining method is used as the inference engine to identify a disease or pest attacking the corn crop based on symptoms. Forward chaining works from the available facts toward a conclusion: facts in this system are derived from the symptoms selected by the user and are matched against the premise of every rule in the knowledge base, and a rule that matches the facts is executed to produce a conclusion in the form of a diagnosis. Validation used 36 test cases, of which 32 showed the same diagnostic results between the system and an expert, giving a diagnostic accuracy of 88%. Finally, it can be concluded that the diagnosis system for diseases and pests of the corn crop can be used to help farmers and agricultural facilitators diagnose them.

  1. The expert surgical assistant. An intelligent virtual environment with multimodal input.

    PubMed

    Billinghurst, M; Savage, J; Oppenheimer, P; Edmond, C

    1996-01-01

    Virtual Reality has made computer interfaces more intuitive but not more intelligent. This paper shows how an expert system can be coupled with multimodal input in a virtual environment to provide an intelligent simulation tool or surgical assistant. This is accomplished in three steps. First, voice and gestural input is interpreted and represented in a common semantic form. Second, a rule-based expert system is used to infer context and user actions from this semantic representation. Finally, the inferred user actions are matched against steps in a surgical procedure to monitor the user's progress and provide automatic feedback. In addition, the system can respond immediately to multimodal commands for navigational assistance and/or identification of critical anatomical structures. To show how these methods are used we present a prototype sinus surgery interface. The approach described here may easily be extended to a wide variety of medical and non-medical training applications by making simple changes to the expert system database and virtual environment models. Successful implementation of an expert system in both simulated and real surgery has enormous potential for the surgeon both in training and clinical practice.

  2. Earthquake hazard assessment in the Zagros Orogenic Belt of Iran using a fuzzy rule-based model

    NASA Astrophysics Data System (ADS)

    Farahi Ghasre Aboonasr, Sedigheh; Zamani, Ahmad; Razavipour, Fatemeh; Boostani, Reza

    2017-08-01

    Producing an accurate seismic hazard map and predicting hazardous areas is necessary for risk mitigation strategies. In this paper, a fuzzy logic inference system is utilized to estimate the earthquake potential and seismic zoning of the Zagros Orogenic Belt. In addition to being interpretable, fuzzy predictors can capture both the nonlinearity and the chaotic behavior of data when the number of data points is limited. In this paper, the earthquake pattern in the Zagros has been assessed for intervals of 10 and 50 years using a fuzzy rule-based model. The Molchan statistical procedure has been used to show that our forecasting model is reliable. The earthquake hazard maps for this area reveal some remarkable features that cannot be observed on conventional maps. Among our findings, some areas in the southern (Bandar Abbas), southwestern (Bandar Kangan) and western (Kermanshah) parts of Iran display high earthquake severity even though they are geographically far apart.

  3. Representing Micro-Macro Linkages by Actor-Based Dynamic Network Models

    ERIC Educational Resources Information Center

    Snijders, Tom A. B.; Steglich, Christian E. G.

    2015-01-01

    Stochastic actor-based models for network dynamics have the primary aim of statistical inference about processes of network change, but may be regarded as a kind of agent-based models. Similar to many other agent-based models, they are based on local rules for actor behavior. Different from many other agent-based models, by including elements of…

  4. Rule-based reasoning is fast and belief-based reasoning can be slow: Challenging current explanations of belief-bias and base-rate neglect.

    PubMed

    Newman, Ian R; Gibb, Maia; Thompson, Valerie A

    2017-07-01

    It is commonly assumed that belief-based reasoning is fast and automatic, whereas rule-based reasoning is slower and more effortful. Dual-Process theories of reasoning rely on this speed-asymmetry explanation to account for a number of reasoning phenomena, such as base-rate neglect and belief-bias. The goal of the current study was to test this hypothesis about the relative speed of belief-based and rule-based processes. Participants solved base-rate problems (Experiment 1) and conditional inferences (Experiment 2) under a challenging deadline; they then gave a second response in free time. We found that fast responses were informed by rules of probability and logical validity, and that slow responses incorporated belief-based information. Implications for Dual-Process theories and future research options for dissociating Type I and Type II processes are discussed. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  5. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru

    2009-04-27

    Model checking approaches were applied to biological pathway validations around 2003. Recently, Fisher et al. demonstrated the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be indispensable parts of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. The key motivation of this paper is to establish a quantitative methodology to model and analyze in silico models using a model checking approach. A novel method of modeling and simulating biological systems with model checking is proposed based on hybrid functional Petri nets with extension (HFPNe), a framework that deals with both discrete and continuous events. First, we construct a quantitative VPC fate model with 1761 components using HFPNe. Second, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully performed by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. Hybrid lineages are hard to capture with a discrete model because they occur when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986. Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to the high coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). Further insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and a better understanding of biological systems and observation data that may be hard to capture with the qualitative approach.

  6. Tuberculosis-Diagnostic Expert System: an architecture for translating patients information from the web for use in tuberculosis diagnosis.

    PubMed

    Osamor, Victor C; Azeta, Ambrose A; Ajulo, Oluseyi O

    2014-12-01

    Over 1.5-2 million tuberculosis deaths occur annually. Medical professionals are faced with many challenges in delivering good health care with unassisted automation in hospitals where several patients need the doctor's attention. The objective is to automate the pre-laboratory screening process against tuberculosis infection to aid diagnosis and make it fast and accessible to the public via the Internet. The expert system we have built is also designed for people who do not have access to medical experts but want to check their medical status. A rule-based approach has been used, and the Unified Modeling Language and a client-server architecture were applied to model the system and to develop it as a web-based expert system for tuberculosis diagnosis. Algorithmic rules in the Tuberculosis-Diagnosis Expert System necessitate decision coverage where tuberculosis is either suspected or not suspected. The architecture consists of a rule base, knowledge base, and patient database. These units interact with the inference engine, which receives patients' data through the Internet via a user interface. We present the architecture of the Tuberculosis-Diagnosis Expert System and its implementation. We evaluated it for usability to determine the level of effectiveness, efficiency and user satisfaction. The usability evaluation reveals that the system scores 4.08 on a scale of 5, an indication of above-average system performance. Several existing expert systems have been developed to support different medical diagnoses, but none is designed to translate tuberculosis patients' symptomatic data for online pre-laboratory screening. Our Tuberculosis-Diagnosis Expert System is an effective solution for the implementation of the needed web-based expert system diagnosis. © The Author(s) 2013.

  7. Ensemble Classifiers for Predicting HIV-1 Resistance from Three Rule-Based Genotypic Resistance Interpretation Systems.

    PubMed

    Raposo, Letícia M; Nobre, Flavio F

    2017-08-30

    Resistance to antiretrovirals (ARVs) is a major problem faced by HIV-infected individuals. Different rule-based algorithms were developed to infer HIV-1 susceptibility to antiretrovirals from genotypic data. However, there is discordance between them, resulting in difficulties for clinical decisions about which treatment to use. Here, we developed ensemble classifiers integrating three interpretation algorithms: Agence Nationale de Recherche sur le SIDA (ANRS), Rega, and the genotypic resistance interpretation system from Stanford HIV Drug Resistance Database (HIVdb). Three approaches were applied to develop a classifier with a single resistance profile: stacked generalization, a simple plurality vote scheme and the selection of the interpretation system with the best performance. The strategies were compared with the Friedman's test and the performance of the classifiers was evaluated using the F-measure, sensitivity and specificity values. We found that the three strategies had similar performances for the selected antiretrovirals. For some cases, the stacking technique with naïve Bayes as the learning algorithm showed a statistically superior F-measure. This study demonstrates that ensemble classifiers can be an alternative tool for clinical decision-making since they provide a single resistance profile from the most commonly used resistance interpretation systems.
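
    Of the three strategies, the plurality vote is simplest to sketch. In the Python fragment below, the interpreter outputs and the tie-break policy (falling back to the single best-performing system, as in the third strategy) are illustrative assumptions:

        from collections import Counter

        def plurality_vote(calls):
            """Return the resistance call made by most interpretation systems."""
            tally = Counter(calls.values())
            label, votes = tally.most_common(1)[0]
            if votes == 1:                 # three-way tie: fall back to the
                return calls["HIVdb"]      # assumed best-performing system
            return label

        sample = {"ANRS": "resistant", "Rega": "resistant", "HIVdb": "susceptible"}
        print(plurality_vote(sample))      # -> 'resistant'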

  8. Moral empiricism and the bias for act-based rules.

    PubMed

    Ayars, Alisabeth; Nichols, Shaun

    2017-10-01

    Previous studies on rule learning show a bias in favor of act-based rules, which prohibit intentionally producing an outcome but not merely allowing the outcome. Nichols, Kumar, Lopez, Ayars, and Chan (2016) found that exposure to a single sample violation in which an agent intentionally causes the outcome was sufficient for participants to infer that the rule was act-based. One explanation is that people have an innate bias to think rules are act-based. We suggest an alternative empiricist account: since most rules that people learn are act-based, people form an overhypothesis (Goodman, 1955) that rules are typically act-based. We report three studies indicating that people can use information about violations to form overhypotheses about rules. In study 1, participants learned either three "consequence-based" rules that prohibited allowing an outcome or three "act-based" rules that prohibited producing the outcome; in a subsequent learning task, we found that participants who had learned three consequence-based rules were more likely to think that the new rule prohibited allowing an outcome. In study 2, we presented participants with either 1 consequence-based rule or 3 consequence-based rules, and we found that those exposed to 3 such rules were more likely to think that a new rule was also consequence-based. Thus, in both studies, learning 3 consequence-based rules generates an overhypothesis to expect new rules to be consequence-based. In a final study, we used a more subtle manipulation. We exposed participants to examples of act-based or accident-based (strict liability) laws and then had them learn a novel rule. We found that participants who were exposed to the accident-based laws were more likely to think a new rule was accident-based. The fact that participants' bias for act-based rules can be shaped by evidence from other rules supports the idea that the bias for act-based rules might be acquired as an overhypothesis from the preponderance of act-based rules. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. The knowledge model of MedFrame/CADIAG-IV.

    PubMed

    Sageder, B; Boegl, K; Adlassnig, K P; Kolousek, G; Trummer, B

    1997-01-01

    The medical consultation system MedFrame/CADIAG-IV is a successor of the prior CADIAG projects. It is the result of a complete redesign to account for today's demands on state-of-the-art software. Its knowledge representation and inference process are based on fuzzy set theory and fuzzy logic. Fuzzy sets are used for conversions from measured numeric values and observational data into symbolic ones. Medical relationships between findings, diseases, and therapies - the rules - are represented by fuzzy relations that express positive or negative associations. Findings, diseases, and therapies are organised in hierarchies.

  10. The Relative Success of Recognition-Based Inference in Multichoice Decisions

    ERIC Educational Resources Information Center

    McCloy, Rachel; Beaman, C. Philip; Smith, Philip T.

    2008-01-01

    The utility of an "ecologically rational" recognition-based decision rule in multichoice decision problems is analyzed, varying the type of judgment required (greater or lesser). The maximum size and range of a counterintuitive advantage associated with recognition-based judgment (the "less-is-more effect") is identified for a range of cue…

  11. Hybrid clustering based fuzzy structure for vibration control - Part 1: A novel algorithm for building neuro-fuzzy system

    NASA Astrophysics Data System (ADS)

    Nguyen, Sy Dzung; Nguyen, Quoc Hung; Choi, Seung-Bok

    2015-01-01

    This paper presents a new algorithm, called B-ANFIS, for building an adaptive neuro-fuzzy inference system (ANFIS) from a training data set. In order to increase the accuracy of the model, the following issues are addressed. First, a data merging rule is proposed to build and perform a data-clustering strategy. Subsequently, a combination of clustering processes in the input data space and in the joint input-output data space is presented. The crucial reason for this step is to overcome problems related to initialization and contradictory fuzzy rules, which usually arise when building an ANFIS. The clustering process in the input data space is accomplished with a proposed merging-possibilistic clustering (MPC) algorithm. The effectiveness of this process is evaluated before the clustering process is resumed in the joint input-output data space. The optimal parameters obtained after completion of the clustering process are used to build the ANFIS. Simulations based on a numerical data set, 'Daily Data of Stock A', and on measured data sets from a smart damper are performed to analyze and estimate accuracy. In addition, convergence and robustness of the proposed algorithm are investigated based on both theoretical and testing approaches.

  12. Fuzzy regression modeling for tool performance prediction and degradation detection.

    PubMed

    Li, X; Er, M J; Lim, B S; Zhou, J H; Gan, O P; Rutkowski, L

    2010-10-01

    In this paper, the viability of using Fuzzy-Rule-Based Regression Modeling (FRM) algorithm for tool performance and degradation detection is investigated. The FRM is developed based on a multi-layered fuzzy-rule-based hybrid system with Multiple Regression Models (MRM) embedded into a fuzzy logic inference engine that employs Self Organizing Maps (SOM) for clustering. The FRM converts a complex nonlinear problem to a simplified linear format in order to further increase the accuracy in prediction and rate of convergence. The efficacy of the proposed FRM is tested through a case study - namely to predict the remaining useful life of a ball nose milling cutter during a dry machining process of hardened tool steel with a hardness of 52-54 HRc. A comparative study is further made between four predictive models using the same set of experimental data. It is shown that the FRM is superior as compared with conventional MRM, Back Propagation Neural Networks (BPNN) and Radial Basis Function Networks (RBFN) in terms of prediction accuracy and learning speed.

  13. Lithofacies identification using multiple adaptive resonance theory neural networks and group decision expert system

    USGS Publications Warehouse

    Chang, H.-C.; Kopaska-Merkel, D. C.; Chen, H.-C.; Durrans, S. Rocky

    2000-01-01

    Lithofacies identification supplies qualitative information about rocks. Lithofacies represent rock textures and are important components of hydrocarbon reservoir description. Traditional techniques of lithofacies identification from core data are costly, and different geologists may provide different interpretations. In this paper, we present a low-cost intelligent system consisting of three adaptive resonance theory neural networks and a rule-based expert system to consistently and objectively identify lithofacies from well-log data. The input data are transformed into different forms representing different perspectives of observation of lithofacies. Each form of input is processed by a different adaptive resonance theory neural network. Among these three adaptive resonance theory neural networks, one processes the raw continuous data, another processes categorical data, and the third processes fuzzy-set data. Outputs from these three networks are then combined by the expert system using fuzzy inference to determine to which facies the input data should be assigned. Rules are prioritized to emphasize the importance of firing order. This new approach combines the learning ability of neural networks, the adaptability of fuzzy logic, and the expertise of geologists to infer the facies of rocks. The approach is applied to the Appleton Field, an oil field located in Escambia County, Alabama. The hybrid intelligence system predicts lithofacies identity from log data with 87.6% accuracy. This prediction is more accurate than those of the single adaptive resonance theory networks, 79.3%, 68.0% and 66.0%, using raw, fuzzy-set, and categorical data, respectively, and than that of an error-backpropagation neural network, 57.3%. (C) 2000 Published by Elsevier Science Ltd. All rights reserved.

  14. The new physician as unwitting quantum mechanic: is adapting Dirac's inference system best practice for personalized medicine, genomics, and proteomics?

    PubMed

    Robson, Barry

    2007-08-01

    What is the best practice for automated inference in medical decision support for personalized medicine? A known system already exists as Dirac's inference system from quantum mechanics (QM), using bra-kets <A|B> and bras <A|, where A and B are states, events, or measurements representing, say, clinical and biomedical rules. Dirac's system should theoretically be the universal best practice for all inference, though QM is notorious for sometimes leading to bizarre conclusions that appear not to be applicable to the macroscopic world of everyday human experience and medical practice. It is argued here that this apparent difficulty vanishes if QM is assigned one new multiplication function @, which conserves conditionality appropriately, making QM applicable to classical inference including a quantitative form of the predicate calculus. An alternative interpretation with the same consequences is that every i = √-1 in Dirac's QM is replaced by h, an entity distinct from 1 and i and arguably a hidden root of 1 such that h^2 = 1. With that exception, this paper is thus primarily a review of the application of Dirac's system, by application of linear algebra in the complex domain, to help manipulate information about associations and ontology in complicated data. Any combined bra-ket <A|B> can be shown to be composed only of the sum of QM-like bra and ket weights c(·), times an exponential function of Fano's mutual information measure I(A; B) about the association between A and B, that is, an association rule from data mining. With the weights and the Fano measure re-expressed as expectations on finite data using Riemann's incomplete (i.e., generalized) zeta functions, actual counts of observations for real-world sparse data can be readily utilized. Finally, the paper compares identical character, distinguishability of states, events, or measurements, correlation, mutual information, and orthogonal character - important issues in data mining and biomedical analytics, as in QM.

  15. An evidential reasoning extension to quantitative model-based failure diagnosis

    NASA Technical Reports Server (NTRS)

    Gertler, Janos J.; Anderson, Kenneth C.

    1992-01-01

    The detection and diagnosis of failures in physical systems characterized by continuous-time operation are studied. A quantitative diagnostic methodology has been developed that utilizes the mathematical model of the physical system. On the basis of the latter, diagnostic models are derived each of which comprises a set of orthogonal parity equations. To improve the robustness of the algorithm, several models may be used in parallel, providing potentially incomplete and/or conflicting inferences. Dempster's rule of combination is used to integrate evidence from the different models. The basic probability measures are assigned utilizing quantitative information extracted from the mathematical model and from online computation performed therewith.
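
    For reference, Dempster's rule of combination itself is compact enough to sketch in Python; the frame of discernment and the mass assignments below are invented, not the paper's diagnostic models:

        from itertools import product

        def dempster(m1, m2):
            """Combine two basic probability assignments over the same frame."""
            combined, conflict = {}, 0.0
            for (a, wa), (b, wb) in product(m1.items(), m2.items()):
                inter = a & b
                if inter:
                    combined[inter] = combined.get(inter, 0.0) + wa * wb
                else:
                    conflict += wa * wb              # mass lost to conflict K
            return {s: w / (1.0 - conflict) for s, w in combined.items()}

        theta = frozenset({"f1", "f2"})              # frame: two candidate faults
        model_a = {frozenset({"f1"}): 0.6, theta: 0.4}
        model_b = {frozenset({"f1"}): 0.3, frozenset({"f2"}): 0.5, theta: 0.2}

        print(dempster(model_a, model_b))            # fused belief masses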

  16. Artificial Intelligence Methods Applied to Parameter Detection of Atrial Fibrillation

    NASA Astrophysics Data System (ADS)

    Arotaritei, D.; Rotariu, C.

    2015-09-01

    In this paper we present a novel method to develop an atrial fibrillation (AF) detector based on statistical descriptors and a hybrid neuro-fuzzy and crisp system. The inference system produces rules of if-then-else type, which are extracted to construct a binary decision system: normal or atrial fibrillation. We use TPR (Turning Point Ratio), SE (Shannon Entropy) and RMSSD (Root Mean Square of Successive Differences), along with a new descriptor, Teager-Kaiser energy, in order to improve the accuracy of detection. The descriptors are calculated over a sliding window, which produces a very large number of vectors (a massive dataset) used by the classifier. The window length is a crisp descriptor, while the rest of the descriptors are interval-valued. The parameters of the hybrid system are adapted using a genetic algorithm (GA) with a single-objective fitness target: the highest values of sensitivity and specificity. The rules are extracted and form part of the decision system. The proposed method was tested using the PhysioNet MIT-BIH Atrial Fibrillation Database, and the experimental results revealed a good accuracy of AF detection in terms of sensitivity and specificity (above 90%).
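
    Three of the named descriptors are standard enough to sketch over a sliding window (the window length, bin count and synthetic RR series below are illustrative assumptions; the Teager-Kaiser energy and the neuro-fuzzy classifier are omitted):

        import numpy as np

        def rmssd(rr):
            """Root mean square of successive differences of RR intervals."""
            return float(np.sqrt(np.mean(np.diff(rr) ** 2)))

        def turning_point_ratio(rr):
            """Fraction of interior points that are local extrema."""
            interior = [(rr[i-1] < rr[i] > rr[i+1]) or (rr[i-1] > rr[i] < rr[i+1])
                        for i in range(1, len(rr) - 1)]
            return sum(interior) / len(interior)

        def shannon_entropy(rr, bins=8):
            """Shannon entropy of the histogram of RR intervals."""
            p, _ = np.histogram(rr, bins=bins)
            p = p[p > 0] / p.sum()
            return float(-(p * np.log2(p)).sum())

        rr = np.random.default_rng(0).normal(0.8, 0.15, 512)  # synthetic RR series
        window = 128                                          # crisp window length
        for start in range(0, len(rr) - window + 1, window):
            w = rr[start:start + window]
            print(rmssd(w), turning_point_ratio(w), shannon_entropy(w))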

  17. A recurrent self-organizing neural fuzzy inference network.

    PubMed

    Juang, C F; Lin, C T

    1999-01-01

    A recurrent self-organizing neural fuzzy inference network (RSONFIN) is proposed in this paper. The RSONFIN is inherently a recurrent multilayered connectionist network for realizing the basic elements and functions of dynamic fuzzy inference, and may be considered to be constructed from a series of dynamic fuzzy rules. The temporal relations embedded in the network are built by adding some feedback connections representing the memory elements to a feedforward neural fuzzy network. Each weight as well as node in the RSONFIN has its own meaning and represents a special element in a fuzzy rule. There are no hidden nodes (i.e., no membership functions and fuzzy rules) initially in the RSONFIN. They are created on-line via concurrent structure identification (the construction of dynamic fuzzy if-then rules) and parameter identification (the tuning of the free parameters of membership functions). The structure learning together with the parameter learning forms a fast learning algorithm for building a small, yet powerful, dynamic neural fuzzy network. Two major characteristics of the RSONFIN can thus be seen: 1) the recurrent property of the RSONFIN makes it suitable for dealing with temporal problems and 2) no predetermination, like the number of hidden nodes, must be given, since the RSONFIN can find its optimal structure and parameters automatically and quickly. Moreover, to reduce the number of fuzzy rules generated, a flexible input partition method, the aligned clustering-based algorithm, is proposed. Various simulations on temporal problems are done and performance comparisons with some existing recurrent networks are also made. Efficiency of the RSONFIN is verified from these results.

  18. An Artificial Intelligence Approach for Modeling and Prediction of Water Diffusion Inside a Carbon Nanotube

    PubMed Central

    2009-01-01

    Modeling of water flow in carbon nanotubes is still a challenge for the classic models of fluid dynamics. In this investigation, an adaptive-network-based fuzzy inference system (ANFIS) is presented to solve this problem. The proposed ANFIS approach can construct an input–output mapping based on both human knowledge in the form of fuzzy if-then rules and stipulated input–output data pairs. Good performance of the designed ANFIS ensures its capability as a promising tool for modeling and prediction of fluid flow at nanoscale where the continuum models of fluid dynamics tend to break down. PMID:20596382

  19. An Artificial Intelligence Approach for Modeling and Prediction of Water Diffusion Inside a Carbon Nanotube.

    PubMed

    Ahadian, Samad; Kawazoe, Yoshiyuki

    2009-06-04

    Modeling of water flow in carbon nanotubes is still a challenge for the classic models of fluid dynamics. In this investigation, an adaptive-network-based fuzzy inference system (ANFIS) is presented to solve this problem. The proposed ANFIS approach can construct an input-output mapping based on both human knowledge in the form of fuzzy if-then rules and stipulated input-output data pairs. Good performance of the designed ANFIS ensures its capability as a promising tool for modeling and prediction of fluid flow at nanoscale where the continuum models of fluid dynamics tend to break down.
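
    For orientation, the following Python sketch shows the forward pass of the kind of first-order Sugeno (TSK) system that ANFIS tunes; the two rules, Gaussian memberships and linear consequents are invented for illustration, not the trained nanotube model:

        import numpy as np

        def gauss(x, c, s):
            return np.exp(-0.5 * ((x - c) / s) ** 2)

        def tsk_predict(x):
            # Layers 1-2: rule firing strengths from membership grades
            w = np.array([gauss(x, 0.2, 0.15), gauss(x, 0.8, 0.15)])
            wn = w / w.sum()                     # Layer 3: normalization
            # Layer 4: linear consequents y_i = a_i * x + b_i
            y = np.array([2.0 * x + 0.1, -1.0 * x + 1.5])
            return float(np.dot(wn, y))          # Layer 5: weighted sum

        print(tsk_predict(0.3))   # smooth blend of the two local linear models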

  20. Inferring thermodynamic stability relationship of polymorphs from melting data.

    PubMed

    Yu, L

    1995-08-01

    This study investigates the possibility of inferring the thermodynamic stability relationship of polymorphs from their melting data. Thermodynamic formulas are derived for calculating the Gibbs free energy difference (delta G) between two polymorphs and its temperature slope from mainly the temperatures and heats of melting. This information is then used to estimate delta G, thus relative stability, at other temperatures by extrapolation. Both linear and nonlinear extrapolations are considered. Extrapolating delta G to zero gives an estimation of the transition (or virtual transition) temperature, from which the presence of monotropy or enantiotropy is inferred. This procedure is analogous to the use of solubility data measured near the ambient temperature to estimate a transition point at higher temperature. For several systems examined, the two methods are in good agreement. The qualitative rule introduced this way for inferring the presence of monotropy or enantiotropy is approximately the same as The Heat of Fusion Rule introduced previously on a statistical mechanical basis. This method is applied to 96 pairs of polymorphs from the literature. In most cases, the result agrees with the previous determination. The deviation of the calculated transition temperatures from their previous values (n = 18) is 2% on average and 7% at maximum.
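
    For orientation, one standard linear form consistent with this procedure is written out below; the sign convention and the linear extrapolation of each polymorph's free energy of fusion from its own melting point are our assumptions, not necessarily the exact formulas of the paper:

        \Delta G_{f,i}(T) \approx \Delta H_{m,i}\,\frac{T_{m,i}-T}{T_{m,i}},
        \qquad
        G_2(T) - G_1(T) = \Delta G_{f,1}(T) - \Delta G_{f,2}(T),

    where \Delta G_{f,i} = G_{liquid} - G_i is the free energy of fusion of polymorph i at temperature T. Setting the difference to zero gives the (possibly virtual) transition temperature

        T_t \approx \frac{\Delta H_{m,2}-\Delta H_{m,1}}{\Delta H_{m,2}/T_{m,2}-\Delta H_{m,1}/T_{m,1}},

    with enantiotropy indicated when T_t lies below both melting points and monotropy when the transition is only virtual.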

  1. Extended specificity studies of mRNA assays used to infer human organ tissues and body fluids.

    PubMed

    van den Berge, Margreet; Sijen, Titia

    2017-12-01

    Messenger RNA (mRNA) profiling is a technique increasingly applied for the forensic identification of body fluids and skin. More recently, an mRNA-based organ typing assay was developed which allows for the inference of brain, lung, liver, skeletal muscle, heart, kidney, and skin tissue. When applying this organ typing system in forensic casework, the presence of animal, rather than human, tissue is an alternative scenario that may be proposed, for instance that bullets carry cell material from a hunting event. Even though mRNA profiling systems are commonly designed in silico to be primate specific, physical testing against other animal species is generally limited. In this study, the human specificity of the organ tissue inferring system was assessed against organ tissue RNAs of various animals. Results confirm the human specificity of the system, especially when utilizing interpretation rules that consider multiple markers per cell type. In addition, we cross-tested our organ and body fluid mRNA assays against the target types covered by the other assay. Marker expression in the nontarget organ tissues and body fluids was observed to a limited extent, which emphasizes the importance of involving the case-specific context of the forensic samples in deciding which mRNA profiling assay to use and in interpreting results. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. A new type of simplified fuzzy rule-based system

    NASA Astrophysics Data System (ADS)

    Angelov, Plamen; Yager, Ronald

    2012-02-01

    Over the last quarter of a century, two types of fuzzy rule-based (FRB) systems dominated, namely Mamdani and Takagi-Sugeno type. They use the same type of scalar fuzzy sets defined per input variable in their antecedent part which are aggregated at the inference stage by t-norms or co-norms representing logical AND/OR operations. In this paper, we propose a significantly simplified alternative to define the antecedent part of FRB systems by data Clouds and density distribution. This new type of FRB systems goes further in the conceptual and computational simplification while preserving the best features (flexibility, modularity, and human intelligibility) of its predecessors. The proposed concept offers alternative non-parametric form of the rules antecedents, which fully reflects the real data distribution and does not require any explicit aggregation operations and scalar membership functions to be imposed. Instead, it derives the fuzzy membership of a particular data sample to a Cloud by the data density distribution of the data associated with that Cloud. Contrast this to the clustering which is parametric data space decomposition/partitioning where the fuzzy membership to a cluster is measured by the distance to the cluster centre/prototype ignoring all the data that form that cluster or approximating their distribution. The proposed new approach takes into account fully and exactly the spatial distribution and similarity of all the real data by proposing an innovative and much simplified form of the antecedent part. In this paper, we provide several numerical examples aiming to illustrate the concept.
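
    One way to read the density-based membership idea is sketched below in Python; the Cauchy-type kernel over the mean squared distance to the Cloud's samples is our assumption for illustration, not the authors' exact recursive formula:

        import numpy as np

        def cloud_membership(x, cloud):
            """Density-based membership: no center or shape parameters imposed."""
            sq_dists = ((cloud - x) ** 2).sum(axis=1)
            return 1.0 / (1.0 + sq_dists.mean())   # in (0, 1], higher = denser

        cloud_a = np.array([[0.1, 0.2], [0.2, 0.1], [0.15, 0.25]])  # raw samples
        cloud_b = np.array([[0.9, 0.8], [0.8, 0.9]])

        x = np.array([0.2, 0.2])
        print(cloud_membership(x, cloud_a), cloud_membership(x, cloud_b))
        # The antecedent "x IS LIKE cloud_a" wins because a's data surround x.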

  3. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816

  4. Inferring Metadata for a Semantic Web Peer-to-Peer Environment

    ERIC Educational Resources Information Center

    Brase, Jan; Painter, Mark

    2004-01-01

    Learning Objects Metadata (LOM) aims at describing educational resources in order to allow better reusability and retrieval. In this article we show how additional inference rules allows us to derive additional metadata from existing ones. Additionally, using these rules as integrity constraints helps us to define the constraints on LOM elements,…

  5. Controlling a robot with intention derived from motion.

    PubMed

    Crick, Christopher; Scassellati, Brian

    2010-01-01

    We present a novel, sophisticated intention-based control system for a mobile robot built from an extremely inexpensive webcam and radio-controlled toy vehicle. The system visually observes humans participating in various playground games and infers their goals and intentions through analyzing their spatiotemporal activity in relation to itself and each other, and then builds a coherent narrative out of the succession of these intentional states. Starting from zero information about the room, the rules of the games, or even which vehicle it controls, it learns rich relationships between players, their goals and intentions, probing uncertain situations with its own behavior. The robot is able to watch people playing various playground games, learn the roles and rules that apply to specific games, and participate in the play. The narratives it constructs capture essential information about the observed social roles and types of activity. After watching play for a short while, the system is able to participate appropriately in the games. We demonstrate how the system acts appropriately in scenarios such as chasing, follow-the-leader, and variants of tag. Copyright © 2009 Cognitive Science Society, Inc.

  6. Controlling false-negative errors in microarray differential expression analysis: a PRIM approach.

    PubMed

    Cole, Steve W; Galic, Zoran; Zack, Jerome A

    2003-09-22

    Theoretical considerations suggest that current microarray screening algorithms may fail to detect many true differences in gene expression (Type II analytic errors). We assessed 'false negative' error rates in differential expression analyses by conventional linear statistical models (e.g. t-test), microarray-adapted variants (e.g. SAM, Cyber-T), and a novel strategy based on hold-out cross-validation. The latter approach employs the machine-learning algorithm Patient Rule Induction Method (PRIM) to infer minimum thresholds for reliable change in gene expression from Boolean conjunctions of fold-induction and raw fluorescence measurements. Monte Carlo analyses based on four empirical data sets show that conventional statistical models and their microarray-adapted variants overlook more than 50% of genes showing significant up-regulation. Conjoint PRIM prediction rules recover approximately twice as many differentially expressed transcripts while maintaining strong control over false-positive (Type I) errors. As a result, experimental replication rates increase and total analytic error rates decline. RT-PCR studies confirm that gene inductions detected by PRIM but overlooked by other methods represent true changes in mRNA levels. PRIM-based conjoint inference rules thus represent an improved strategy for high-sensitivity screening of DNA microarrays. Freestanding JAVA application at http://microarray.crump.ucla.edu/focus
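
    The conjoint rule format is easy to illustrate. In the Python sketch below, the two thresholds stand in for values that PRIM's peeling would infer from hold-out data (the numbers are invented):

        import numpy as np

        FOLD_MIN = 1.8     # minimum fold-induction (hypothetical PRIM output)
        FLUOR_MIN = 150.0  # minimum raw fluorescence (hypothetical PRIM output)

        def differentially_expressed(fold, fluor):
            """Boolean conjunction of fold-change and fluorescence thresholds."""
            return (fold >= FOLD_MIN) & (fluor >= FLUOR_MIN)

        fold = np.array([2.5, 1.9, 3.0, 1.2])
        fluor = np.array([300.0, 90.0, 500.0, 400.0])
        print(differentially_expressed(fold, fluor))  # [ True False  True False]
        # Gene 2 passes fold-change alone and gene 4 passes fluorescence alone;
        # only the conjunction controls both error types at once.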

  7. Optics Toolbox: An Intelligent Relational Database System For Optical Designers

    NASA Astrophysics Data System (ADS)

    Weller, Scott W.; Hopkins, Robert E.

    1986-12-01

    Optical designers were among the first to use the computer as an engineering tool. Powerful programs have been written to do ray-trace analysis, third-order layout, and optimization. However, newer computing techniques such as database management and expert systems have not been adopted by the optical design community.

    For the purpose of this discussion we will define a relational database system as a database which allows the user to specify his requirements using logical relations. For example, to search for all lenses in a lens database with an F/number less than two and a half field of view near 28 degrees, you might enter the following:

        FNO < 2.0 and FOV of 28 degrees ± 5%

    Again for the purpose of this discussion, we will define an expert system as a program which contains expert knowledge, can ask intelligent questions, and can form conclusions based on the answers given and the knowledge which it contains. Most expert systems store this knowledge in the form of rules-of-thumb, which are written in an English-like language, and which are easily modified by the user. An example rule is:

        IF require microscope objective in air and require NA > 0.9
        THEN suggest the use of an oil immersion objective

    The heart of the expert system is the rule interpreter, sometimes called an inference engine, which reads the rules and forms conclusions based on them. The use of a relational database system containing lens prototypes seems to be a viable prospect. However, it is not clear that expert systems have a place in optical design. In domains such as medical diagnosis and petrology, expert systems are flourishing. These domains are quite different from optical design, however, because optical design is a creative process, and the rules are difficult to write down. We do think that an expert system is feasible in the area of first-order layout, which is sufficiently diagnostic in nature to permit useful rules to be written. This first-order expert would emulate an expert designer as he interacted with a customer for the first time: asking the right questions, forming conclusions, and making suggestions. With these objectives in mind, we have developed the Optics Toolbox. Optics Toolbox is actually two programs in one: it is a powerful relational database system with twenty-one search parameters, four search modes, and multi-database support, as well as a first-order optical design expert system with a rule interpreter which has full access to the relational database. The system schematic is shown in Figure 1.
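
    A toy version of such a logical-relation query is sketched below in Python (field names and lens records are invented):

        lenses = [
            {"name": "proto-A", "fno": 1.8, "fov": 27.5},
            {"name": "proto-B", "fno": 2.4, "fov": 28.0},
            {"name": "proto-C", "fno": 1.4, "fov": 35.0},
        ]

        def matches(lens, fno_max=2.0, fov_target=28.0, tol=0.05):
            """FNO < fno_max AND FOV within tol of fov_target."""
            near = abs(lens["fov"] - fov_target) <= fov_target * tol
            return lens["fno"] < fno_max and near

        print([l["name"] for l in lenses if matches(l)])   # ['proto-A']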

  8. Modelling dynamics with context-free grammars

    NASA Astrophysics Data System (ADS)

    García-Huerta, Juan-M.; Jiménez-Hernández, Hugo; Herrera-Navarro, Ana-M.; Hernández-Díaz, Teresa; Terol-Villalobos, Ivan

    2014-03-01

    This article presents a strategy to model the dynamics performed by vehicles on a freeway. The proposal consists in encoding the movement as a set of finite states. A watershed-based segmentation is used to localize regions with a high probability of motion. Each state represents a proportion of a camera projection in a two-dimensional space, and each state is associated with a symbol, such that any combination of symbols can be expressed as a language. Starting from a sequence of symbols, a context-free grammar is inferred through a linear algorithm. This grammar represents a hierarchical view of common sequences observed in the scene. The most probable grammar rules express patterns associated with normal movement behavior. Less probable rules offer a way to quantify uncommon behaviors that may need more attention. Finally, any sequence of symbols that is not matched by the grammar rules may itself express uncommon (abnormal) behavior. The grammar inference is built from several sequences of images taken from a freeway. The testing process uses the sequence of symbols emitted by the scenario, matching the grammar rules with common freeway behaviors. The task of detecting abnormal versus normal behaviors is managed as verifying whether a word generated by the scenario is recognized by the grammar.

  9. An expert system shell for inferring vegetation characteristics: Interface for the addition of techniques (Task H)

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann

    1993-01-01

    All the NASA VEGetation Workbench (VEG) goals except the Learning System provide the scientist with several different techniques. When VEG is run, rules assist the scientist in selecting the best of the available techniques to apply to the sample of cover type data being studied. The techniques are stored in the VEG knowledge base. A new interface that enables the scientist to add techniques to VEG without assistance from the developer was designed, implemented, and comprehensively tested (Task H, now complete). This interface does not require the scientist to have a thorough knowledge of the Knowledge Engineering Environment (KEE) by Intellicorp or a detailed knowledge of the structure of VEG. The interface prompts the scientist to enter the required information about the new technique: the Common Lisp functions for executing the technique and the left-hand side of the rule that causes the technique to be selected. A template for each function and rule is displayed, together with detailed instructions about the arguments of the functions, the values they should return, and the format of the rule. Checks are made to ensure that the required data were entered, the functions compiled correctly, and the rule parsed correctly before the new technique is stored. The additional techniques are stored separately from the VEG knowledge base and are not normally loaded when the VEG knowledge base is loaded; the interface gives the scientist the option of adding all the previously defined new techniques before running VEG. When the techniques are added, the units required to store them are created automatically in the correct places in the VEG knowledge base, the methods file containing the functions required by the additional techniques is loaded, and new rule units are created to store the new rules. The interface that allows the scientist to select which techniques to use is updated automatically to include the new techniques. The Common Lisp code for the Add Techniques system is listed in Appendix A.

  10. A fuzzy logic sliding mode controlled electronic differential for a direct wheel drive EV

    NASA Astrophysics Data System (ADS)

    Ozkop, Emre; Altas, Ismail H.; Okumus, H. Ibrahim; Sharaf, Adel M.

    2015-11-01

    In this study, a direct wheel drive electric vehicle based on an electronic differential system with a fuzzy logic sliding mode controller (FLSMC) is studied. The conventional sliding surface is modified using a fuzzy rule base to obtain fuzzy dynamic sliding surfaces by changing its slopes using the global error and its derivative in a fuzzy logic inference system. The controller is compared with proportional-integral-derivative (PID) and sliding mode controllers (SMCs), which are usually preferred to be used in industry. The proposed controller provides robustness and flexibility to direct wheel drive electric vehicles. The fuzzy logic sliding mode controller, electronic differential system and the overall electrical vehicle mechanism are modelled and digitally simulated by using the Matlab software. Simulation results show that the system with FLSMC has better efficiency and performance compared to those of PID and SMCs.
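
    As a hedged illustration of the control idea (the gains, the membership ramp, and the single rule are invented; the paper's FLSMC and its chattering mitigation are not reproduced), a tiny Python sketch of a fuzzy-adapted sliding surface might look like:

        import numpy as np

        def membership(x, lo, hi):
            """Degree to which |x| is 'large' on [lo, hi] (simple ramp)."""
            return float(np.clip((abs(x) - lo) / (hi - lo), 0.0, 1.0))

        def fuzzy_slope(e, de, lam_min=1.0, lam_max=6.0):
            # Rule: the larger the error or its derivative, the steeper the surface
            w = max(membership(e, 0.0, 1.0), membership(de, 0.0, 2.0))
            return lam_min + w * (lam_max - lam_min)

        def control(e, de, k=10.0):
            lam = fuzzy_slope(e, de)
            s = de + lam * e              # fuzzy dynamic sliding surface
            return -k * np.sign(s)        # reaching law (chattering ignored)

        print(control(0.5, -0.2))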

  11. Preschoolers Can Infer General Rules Governing Fantastical Events in Fiction

    ERIC Educational Resources Information Center

    Van de Vondervoort, Julia W.; Friedman, Ori

    2014-01-01

    Young children are frequently exposed to fantastic fiction. How do they make sense of the unrealistic and impossible events that occur in such fiction? Although children could view such events as isolated episodes, the present experiments suggest that children use such events to infer general fantasy rules. In 2 experiments, 2-to 4-year-olds were…

  12. Atlantis: An Open Architecture for Synergy of Process-Centered Environments and Computer-Supported Cooperative Work

    DTIC Science & Technology

    1998-04-01

    revision phase would be followed, after which a second review would be scheduled, and so forth, until the review succeeds. 2.3 Realization of the... normal rules, when Summit rules are inferred they are enqueued in a separate Summit queue and are scheduled for execution only after local forward... scheduling and activating activities according to the defined process; reactively triggering activities based on state changes; monitoring the process

  13. Self-Associations Influence Task-Performance through Bayesian Inference

    PubMed Central

    Bengtsson, Sara L.; Penny, Will D.

    2013-01-01

    The way we think about ourselves impacts greatly on our behavior. This paper describes a behavioral study and a computational model that shed new light on this important area. Participants were primed “clever” and “stupid” using a scrambled sentence task, and we measured the effect on response time and error-rate on a rule-association task. First, we observed a confirmation bias effect in that associations to being “stupid” led to a gradual decrease in performance, whereas associations to being “clever” did not. Second, we observed that the activated self-concepts selectively modified attention toward one’s performance. There was an early to late double dissociation in RTs in that primed “clever” resulted in RT increase following error responses, whereas primed “stupid” resulted in RT increase following correct responses. We propose a computational model of subjects’ behavior based on the logic of the experimental task that involves two processes; memory for rules and the integration of rules with subsequent visual cues. The model incorporates an adaptive decision threshold based on Bayes rule, whereby decision thresholds are increased if integration was inferred to be faulty. Fitting the computational model to experimental data confirmed our hypothesis that priming affects the memory process. This model explains both the confirmation bias and double dissociation effects and demonstrates that Bayesian inferential principles can be used to study the effect of self-concepts on behavior. PMID:23966937

  14. Restoring Consistency In Subjective Information For Groundwater Driven Health Risk Assessment

    NASA Astrophysics Data System (ADS)

    Ozbek, M. M.; Pinder, G. F.

    2004-12-01

    In an earlier work (Ozbek and Pinder, 2003), we constructed a fuzzy rule-based knowledge base that uses subjective expert opinion to calculate risk-based design constraints (i.e., dose and pattern of exposure) to sustain the groundwater-driven individual health risk at a desired level. Ideally, our system must be capable of producing, for any individual, a meaningful risk result or, for any given risk, a meaningful design constraint, in the sense that the result is neither the empty set nor the whole domain of the variable of interest. Otherwise, we consider our system inconsistent. We present a method based on fuzzy similarity relations to restore consistency in our implicative fuzzy rule-based system used for the risk-based groundwater remediation design problem. Both a global and a local approach are considered. Even though it is straightforward and computationally less demanding, the global approach can affect pieces of knowledge negatively by inducing unwarranted imprecision into the knowledge base. On the other hand, the local approach, given a family of parameterized similarity relations, determines a parameter for each inference such that consistent results are computed, which may not be feasible in real-time applications of our knowledge base. Several scenarios are considered for comparing the two approaches; they suggest that, for specific applications, one or several approaches ranging from completely global to completely local will be more suitable than others when calculating the design constraints.

  15. A fuzzy-theory-based behavioral model for studying pedestrian evacuation from a single-exit room

    NASA Astrophysics Data System (ADS)

    Fu, Libi; Song, Weiguo; Lo, Siuming

    2016-08-01

    Many mass events in recent years have highlighted the importance of research on pedestrian evacuation dynamics. A number of models have been developed to analyze crowd behavior in evacuation situations. However, few focus on pedestrians' decision-making with respect to uncertainty, vagueness and imprecision. In this paper, a discrete evacuation model defined on a cellular space is proposed according to fuzzy theory, which is able to describe imprecise and subjective information. Pedestrians' perceived information and various characteristics are treated as fuzzy inputs. Fuzzy inference systems with rule bases, which resemble human reasoning, are then established to obtain the fuzzy output that decides pedestrians' movement directions. This model is tested in two scenarios, namely a single-exit room with and without obstacles. Simulation results reproduce some classic dynamic phenomena observed in real building evacuations, and are consistent with those of other models and experiments. It is hoped that this study will enrich the movement rules and approaches used in traditional cellular automaton models of evacuation dynamics.
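
    The paper's actual rule bases and membership functions are not given in the abstract, but the general mechanism (fuzzify perceived inputs, fire if-then rules, defuzzify to a movement preference) can be sketched. Below is a toy fuzzy-inference sketch with invented triangular memberships and two illustrative rules; it scores neighboring cells by weighted-average defuzzification and picks a movement direction.

    ```python
    # Toy fuzzy-inference sketch (illustrative only; the paper's actual rule
    # base and membership functions are not reproduced here).

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def desirability(dist_to_exit, crowd_density):
        """Fuzzy preference for stepping onto a cell (higher = better)."""
        near   = tri(dist_to_exit, -1.0, 0.0, 10.0)
        far    = tri(dist_to_exit,  0.0, 10.0, 20.0)
        sparse = tri(crowd_density, -0.1, 0.0, 0.6)
        packed = tri(crowd_density,  0.3, 1.0, 1.1)
        # Rule 1: near exit AND sparse -> strongly preferred (consequent 1.0)
        # Rule 2: far from exit OR packed -> weakly preferred (consequent 0.2)
        r1 = min(near, sparse)
        r2 = max(far, packed)
        # Weighted-average defuzzification over the two rule consequents.
        return (1.0 * r1 + 0.2 * r2) / (r1 + r2) if (r1 + r2) > 0 else 0.0

    # Pick the neighboring cell with the highest fuzzy desirability.
    cells = {"N": (3.0, 0.2), "E": (5.0, 0.8), "S": (9.0, 0.1)}
    best = max(cells, key=lambda k: desirability(*cells[k]))
    print(best)
    ```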

  16. Expert systems for fault diagnosis in nuclear reactor control

    NASA Astrophysics Data System (ADS)

    Jalel, N. A.; Nicholson, H.

    1990-11-01

    An expert system for accident analysis and fault diagnosis for the Loss Of Fluid Test (LOFT) reactor, a small-scale pressurized water reactor, was developed for a personal computer. The knowledge of the system is represented using a production-rule approach with a backward-chaining inference engine. The database of the system includes simulated dependent state variables of the LOFT reactor model. Another system is designed to assist the operator in choosing the appropriate cooling mode and to diagnose faults in the selected cooling system. Its knowledge base is built from the response tree, which links a list of very specific accident sequences to a set of generic emergency procedures that help the operator monitor system status, and which serves to differentiate between accident sequences and select the correct procedures. Both systems are written in the TURBO PROLOG language and can be run on an IBM PC compatible with 640K RAM, a 40-Mbyte hard disk, and color graphics.
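
    The LOFT systems themselves are written in Turbo Prolog, but the backward-chaining idea they rely on is easy to sketch. The following minimal Python sketch (the rules and facts are invented, not the reactor knowledge base) tries to prove a goal by recursively proving the subgoals of any rule that concludes it.

    ```python
    # Minimal backward-chaining sketch (illustrative; not the LOFT rules).
    # Each rule maps a goal to lists of subgoals that must all hold.

    RULES = {
        "coolant_leak": [["low_pressure", "high_sump_level"]],
        "pump_failure": [["low_flow", "normal_pressure"]],
    }

    FACTS = {"low_pressure", "high_sump_level"}   # observed state variables

    def prove(goal):
        """Try to establish `goal` by chaining backward through the rules."""
        if goal in FACTS:
            return True
        for subgoals in RULES.get(goal, []):
            if subgoals and all(prove(g) for g in subgoals):
                return True
        return False

    print(prove("coolant_leak"))   # True: both antecedent facts are observed
    print(prove("pump_failure"))   # False: low_flow is not observed
    ```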

  17. Ares I-X Ground Diagnostic Prototype

    NASA Technical Reports Server (NTRS)

    Schwabacher, Mark A.; Martin, Rodney Alexander; Waterman, Robert D.; Oostdyk, Rebecca Lynn; Ossenfort, John P.; Matthews, Bryan

    2010-01-01

    The automation of pre-launch diagnostics for launch vehicles offers three potential benefits: improving safety, reducing cost, and reducing launch delays. The Ares I-X Ground Diagnostic Prototype demonstrated anomaly detection, fault detection, fault isolation, and diagnostics for the Ares I-X first-stage Thrust Vector Control and for the associated ground hydraulics while the vehicle was in the Vehicle Assembly Building at Kennedy Space Center (KSC) and while it was on the launch pad. The prototype combines three existing tools. The first tool, TEAMS (Testability Engineering and Maintenance System), is a model-based tool from Qualtech Systems Inc. for fault isolation and diagnostics. The second tool, SHINE (Spacecraft Health Inference Engine), is a rule-based expert system that was developed at the NASA Jet Propulsion Laboratory. We developed SHINE rules for fault detection and mode identification, and used the outputs of SHINE as inputs to TEAMS. The third tool, IMS (Inductive Monitoring System), is an anomaly detection tool that was developed at NASA Ames Research Center. The three tools were integrated and deployed to KSC, where they were interfaced with live data. This paper describes how the prototype performed during the period of time before the launch, including accuracy and computer resource usage. The paper concludes with some of the lessons that we learned from the experience of developing and deploying the prototype.

  18. A rule-based approach for the correlation of alarms to support Disaster and Emergency Management

    NASA Astrophysics Data System (ADS)

    Gloria, M.; Minei, G.; Lersi, V.; Pasquariello, D.; Monti, C.; Saitto, A.

    2009-04-01

    Key words: Simple Event Correlator, Agent Platform, Ontology, Semantic Web, Distributed Systems, Emergency Management. The importance of recognizing an emergency's typology in order to control critical situations for the security of citizens has long been recognized; this aspect is therefore very important for the proper management of a hazardous event. In this work we present a solution for the recognition of emergency typologies adopted by an Italian research project called CI6 (Centro Integrato per Servizi di Emergenza Innovativi). In our approach, CI6 receives alarms from citizens or from people involved in the response (for example, police or operators of the 112 emergency number). CI6 represents each alarm by a set of information, including a text that describes it, obtained when the user reports the danger, and a pair of coordinates for its location. The system analyzes the text and automatically infers the type of emergency by means of a set of parsing rules and inference rules applied by an independent module: a log-based event correlator called Simple Event Correlator (SEC). SEC, integrated into CI6's platform, is an open-source and platform-independent event correlation tool. SEC accepts as input both files and text from standard input, which makes it flexible: it can be paired with any application that is able to write its output to a file stream. The SEC configuration is stored in text files as rules, each rule specifying an event-matching condition, an action list, and optionally a Boolean expression whose truth value decides whether the rule can be applied at a given moment. SEC can produce output events by executing user-specified shell scripts or programs, by writing messages to files, and by various other means. SEC has been successfully applied in various domains such as network management, system monitoring, data security, intrusion detection, and log file monitoring and analysis; it has been used with or integrated into many applications such as CiscoWorks, HP OpenView NNM and Operations, and BMC Patrol. Analysis of an alarm's text can detect keywords that allow the system to classify the event. The inference rules were developed through an analysis of news reports about real emergencies found via web searches. We observed that a given kind of emergency is often characterized by several keywords. Keywords are not uniquely associated with a specific emergency; they can be shared by different types of emergencies (e.g., the keyword "landslide" can be associated with both the "landslide" and the "flood" emergency). However, the identification of two or more keywords associated with a particular type of emergency identifies, in most cases, the correct type. So, for example, if the text contains words such as "water", "flood", "overflowing", "landslide", or other words belonging to the set of defined keywords (or words sharing a keyword's root), the system decides that the alarm belongs to a specific typology, in this case the "flood" typology. The system keeps this information in memory, so if a new alarm is reported and belongs to one of the typologies already identified, it proceeds to compare coordinates. The comparison between the centers of the alarms determines whether they describe an area inscribed in an ideal circle centered on the first alarm, with a radius defined by the typology mentioned above. If so, the CI6 system creates an emergency centered on the centre of that area, with a typology equal to that of the alarms. It follows that an emergency is represented by at least two alarms. Thus, the system suggests to the manager (CI6's user) the possibility that several alarms may concern the same event, and classifies that event. It is important to stress that CI6 is a decision-support system; this service is therefore limited to providing advice to the user to facilitate his task, leaving him the decision to accept it or not. REFERENCES: SEC (Simple Event Correlator), http://kodu.neti.ee/~risto/sec/; M. Gloria, V. Lersi, G. Minei, D. Pasquariello, C. Monti, A. Saitto, "A Semantic WEB Services Platform to support Disaster and Emergency Management", 4th Biennial Meeting of the International Environmental Modelling and Software Society (iEMSs), 2008.
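
    The two mechanisms described above, keyword-based typology classification and grouping of alarms by a typology-specific circle, can be sketched as follows. The keyword sets, radii and coordinates are invented for the example; in CI6 the real rules are applied by SEC, not by code like this.

    ```python
    # Illustrative sketch of the two CI6 steps described above: keyword-based
    # typology classification, then grouping of alarms by a typology radius.
    import math

    KEYWORDS = {
        "flood": {"water", "flood", "overflowing", "landslide"},
        "fire":  {"fire", "smoke", "flames"},
    }
    RADIUS_KM = {"flood": 5.0, "fire": 2.0}   # invented radii per typology

    def classify(text):
        """Return the typology with the most keyword hits (>= 2 to accept)."""
        words = set(text.lower().split())
        best, hits = None, 0
        for typology, keys in KEYWORDS.items():
            n = len(words & keys)
            if n > hits:
                best, hits = typology, n
        return best if hits >= 2 else None

    def same_event(alarm_a, alarm_b, typology):
        """True if alarm_b falls inside the typology's circle around alarm_a."""
        (xa, ya), (xb, yb) = alarm_a, alarm_b
        dist_km = math.hypot(xa - xb, ya - yb)   # planar approximation
        return dist_km <= RADIUS_KM[typology]

    t = classify("the river water is overflowing near the bridge")
    print(t)                                      # flood
    print(same_event((0.0, 0.0), (3.0, 2.0), t))  # True: within 5 km
    ```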

  19. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS

    PubMed Central

    Kuai, Moshen; Cheng, Gang; Li, Yong

    2018-01-01

    Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed, high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. In this paper, a method is proposed for diagnosing faults in planetary gears based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). The original signal is decomposed into six intrinsic mode functions (IMFs) and a residual component by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of the IMFs is quantified by permutation entropies to characterize the fault features. The permutation entropies of each IMF component serve as the inputs of the ANFIS, whose parameters and membership functions are adaptively adjusted according to the training samples. Finally, the fuzzy inference rules are determined and the optimal ANFIS is obtained. The overall recognition rate on the test samples for the ANFIS is 90%, and the recognition rate for a gear with one missing tooth is relatively high. The recognition rates for different fault gears based on the method also achieve good results. Therefore, the proposed method can be applied effectively to planetary gear fault diagnosis. PMID:29510569
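
    Permutation entropy, the feature extracted from each IMF above, has a standard definition: count the relative frequencies of ordinal patterns in the signal and take the normalized Shannon entropy of that distribution. A minimal sketch, with illustrative order and delay parameters rather than those of the paper:

    ```python
    # Minimal permutation-entropy sketch (standard definition).
    import math
    from itertools import permutations

    def permutation_entropy(signal, order=3, delay=1):
        """Normalized permutation entropy of a 1-D signal, in [0, 1]."""
        counts = {p: 0 for p in permutations(range(order))}
        n = len(signal) - (order - 1) * delay
        for i in range(n):
            window = [signal[i + j * delay] for j in range(order)]
            # Ordinal pattern: ranks of the samples within the window.
            pattern = tuple(sorted(range(order), key=lambda k: window[k]))
            counts[pattern] += 1
        probs = [c / n for c in counts.values() if c > 0]
        h = -sum(p * math.log(p) for p in probs)
        return h / math.log(math.factorial(order))  # normalize

    print(permutation_entropy([4, 7, 9, 10, 6, 11, 3]))
    ```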

  20. Research of Planetary Gear Fault Diagnosis Based on Permutation Entropy of CEEMDAN and ANFIS.

    PubMed

    Kuai, Moshen; Cheng, Gang; Pang, Yusong; Li, Yong

    2018-03-05

    Because the planetary gear has the characteristics of small volume, light weight and large transmission ratio, it is widely used in high-speed, high-power mechanical systems. Poor working conditions result in frequent failures of planetary gears. In this paper, a method is proposed for diagnosing faults in planetary gears based on the permutation entropy of Complete Ensemble Empirical Mode Decomposition with Adaptive Noise (CEEMDAN) and an Adaptive Neuro-Fuzzy Inference System (ANFIS). The original signal is decomposed into six intrinsic mode functions (IMFs) and a residual component by CEEMDAN. Since the IMFs contain the main characteristic information of planetary gear faults, the time complexity of the IMFs is quantified by permutation entropies to characterize the fault features. The permutation entropies of each IMF component serve as the inputs of the ANFIS, whose parameters and membership functions are adaptively adjusted according to the training samples. Finally, the fuzzy inference rules are determined and the optimal ANFIS is obtained. The overall recognition rate on the test samples for the ANFIS is 90%, and the recognition rate for a gear with one missing tooth is relatively high. The recognition rates for different fault gears based on the method also achieve good results. Therefore, the proposed method can be applied effectively to planetary gear fault diagnosis.

  1. Learning abstract visual concepts via probabilistic program induction in a Language of Thought.

    PubMed

    Overlan, Matthew C; Jacobs, Robert A; Piantadosi, Steven T

    2017-11-01

    The ability to learn abstract concepts is a powerful component of human cognition. It has been argued that variable binding is the key element enabling this ability, but the computational aspects of variable binding remain poorly understood. Here, we address this shortcoming by formalizing the Hierarchical Language of Thought (HLOT) model of rule learning. Given a set of data items, the model uses Bayesian inference to infer a probability distribution over stochastic programs that implement variable binding. Because the model makes use of symbolic variables as well as Bayesian inference and programs with stochastic primitives, it combines many of the advantages of both symbolic and statistical approaches to cognitive modeling. To evaluate the model, we conducted an experiment in which human subjects viewed training items and then judged which test items belong to the same concept as the training items. We found that the HLOT model provides a close match to human generalization patterns, significantly outperforming two variants of the Generalized Context Model, one variant based on string similarity and the other based on visual similarity using features from a deep convolutional neural network. Additional results suggest that variable binding happens automatically, implying that binding operations do not add complexity to people's hypothesized rules. Overall, this work demonstrates that a cognitive model combining symbolic variables with Bayesian inference and stochastic program primitives provides a new perspective for understanding people's patterns of generalization. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Exploiting expert systems in cardiology: a comparative study.

    PubMed

    Economou, George-Peter K; Sourla, Efrosini; Stamatopoulou, Konstantina-Maria; Syrimpeis, Vasileios; Sioutas, Spyros; Tsakalidis, Athanasios; Tzimas, Giannis

    2015-01-01

    An improved Adaptive Neuro-Fuzzy Inference System (ANFIS) in the field of critical cardiovascular diseases is presented. The system stems from an earlier application based only on a Sugeno-type Fuzzy Expert System (FES), with the addition of an Artificial Neural Network (ANN) computational structure. Thus, the inherent characteristics of ANNs are integrated with the human-like knowledge representation of fuzzy systems. The ANFIS has been used to build five different sub-systems, distinctly covering Coronary Disease, Hypertension, Atrial Fibrillation, Heart Failure, and Diabetes, hence aiding doctors of medicine (MDs), guiding trainees, and supporting medical experts in their diagnoses across a wide range of cardiology. The fuzzy rules have been trimmed down and the ANNs optimized in order to focus on each particular disease and produce results ready to be applied to real-world patients.

  3. Online intelligent controllers for an enzyme recovery plant: design methodology and performance.

    PubMed

    Leite, M S; Fujiki, T L; Silva, F V; Fileti, A M F

    2010-12-27

    This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4) is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS), based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE), lower power consumption, and better recovery of enzyme activity.

  4. Online Intelligent Controllers for an Enzyme Recovery Plant: Design Methodology and Performance

    PubMed Central

    Leite, M. S.; Fujiki, T. L.; Silva, F. V.; Fileti, A. M. F.

    2010-01-01

    This paper focuses on the development of intelligent controllers for use in a process of enzyme recovery from pineapple rind. The proteolytic enzyme bromelain (EC 3.4.22.4) is precipitated with alcohol at low temperature in a fed-batch jacketed tank. Temperature control is crucial to avoid irreversible protein denaturation. Fuzzy or neural controllers offer a way of implementing solutions that cover dynamic and nonlinear processes. The design methodology and a comparative study on the performance of fuzzy-PI, neurofuzzy, and neural network intelligent controllers are presented. To tune the fuzzy PI Mamdani controller, various universes of discourse, rule bases, and membership function support sets were tested. A neurofuzzy inference system (ANFIS), based on Takagi-Sugeno rules, and a model predictive controller, based on neural modeling, were developed and tested as well. Using a Fieldbus network architecture, a coolant variable speed pump was driven by the controllers. The experimental results show the effectiveness of fuzzy controllers in comparison to the neural predictive control. The fuzzy PI controller exhibited a reduced error parameter (ITAE), lower power consumption, and better recovery of enzyme activity. PMID:21234106

  5. Real-time fuzzy inference based robot path planning

    NASA Technical Reports Server (NTRS)

    Pacini, Peter J.; Teichrow, Jon S.

    1990-01-01

    This project addresses the problem of adaptive trajectory generation for a robot arm. Conventional trajectory generation involves computing a path in real time to minimize a performance measure such as expended energy. This method can be computationally intensive, and it may yield poor results if the trajectory is weakly constrained. Typically some implicit constraints are known, but cannot be encoded analytically. The alternative approach used here is to formulate domain-specific knowledge, including implicit and ill-defined constraints, in terms of fuzzy rules. These rules utilize linguistic terms to relate input variables to output variables. Since the fuzzy rulebase is determined off-line, only high-level, computationally light processing is required in real time. Potential applications for adaptive trajectory generation include missile guidance and various sophisticated robot control tasks, such as automotive assembly, high speed electrical parts insertion, stepper alignment, and motion control for high speed parcel transfer systems.

  6. [Applications of mathematical statistics methods on compatibility researches of traditional Chinese medicines formulae].

    PubMed

    Mai, Lan-Yin; Li, Yi-Xuan; Chen, Yong; Xie, Zhen; Li, Jie; Zhong, Ming-Yu

    2014-05-01

    The compatibility of traditional Chinese medicine (TCM) formulae, which contain enormous amounts of information, is a complex component system. Applying mathematical statistics methods to compatibility research on TCM formulae has great significance for promoting the modernization of TCM and for improving clinical efficacy and formula optimization. As a tool for quantitative analysis, data inference and exploring the inherent rules of substances, mathematical statistics methods can reveal the working mechanisms of the compatibility of TCM formulae both qualitatively and quantitatively. By reviewing studies based on the application of mathematical statistics methods, this paper summarizes the field from the perspectives of dosage optimization, efficacies and changes of chemical components, as well as the rules of incompatibility and contraindication of formulae, and provides references for further studying and revealing the working mechanisms and connotations of traditional Chinese medicines.

  7. A Policy Language for Modelling Recommendations

    NASA Astrophysics Data System (ADS)

    Abou El Kalam, Anas; Balbiani, Philippe

    While current and emergent applications become more and more complex, most existing security policies and models only consider a yes/no response to access requests. Consequently, modelling, formalizing and implementing permissions, obligations and prohibitions do not cover the richness of all possible scenarios. In fact, several applications have access rules with the recommendation access modality. In this paper we focus on the problem of formalizing security policies with recommendation needs. The aim is to provide a generic, domain-independent formal system for modelling not only permissions, prohibitions and obligations, but also recommendations. In this respect, we present our logic-based language, its semantics, the truth conditions, and our axiomatics as well as our inference rules. We also give a representative use case with our specification of recommendation requirements. Finally, we explain how our logical framework could be used to query the security policy and to check its consistency.

  8. Improving Microbial Genome Annotations in an Integrated Database Context

    PubMed Central

    Chen, I-Min A.; Markowitz, Victor M.; Chu, Ken; Anderson, Iain; Mavromatis, Konstantinos; Kyrpides, Nikos C.; Ivanova, Natalia N.

    2013-01-01

    Effective comparative analysis of microbial genomes requires a consistent and complete view of biological data. Consistency regards the biological coherence of annotations, while completeness regards the extent and coverage of functional characterization for genomes. We have developed tools that allow scientists to assess and improve the consistency and completeness of microbial genome annotations in the context of the Integrated Microbial Genomes (IMG) family of systems. All publicly available microbial genomes are characterized in IMG using different functional annotation and pathway resources, thus providing a comprehensive framework for identifying and resolving annotation discrepancies. A rule-based system for predicting phenotypes in IMG provides a powerful mechanism for validating functional annotations, whereby the phenotypic traits of an organism are inferred based on the presence of certain metabolic reactions and pathways and compared to experimentally observed phenotypes. The IMG family of systems is available at http://img.jgi.doe.gov/. PMID:23424620

  9. Empirical OPC rule inference for rapid RET application

    NASA Astrophysics Data System (ADS)

    Kulkarni, Anand P.

    2006-10-01

    A given technological node (45 nm, 65 nm) can be expected to process thousands of individual designs. Iterative methods applied at the node consume valuable days in determining the proper placement of OPC features, and in manufacturing and testing mask correspondence to wafer patterns in a trial-and-error fashion for each design. Repeating this fabrication process for each individual design is time-consuming and expensive. We present a novel technique which sidesteps the requirement to iterate through the model-based OPC analysis and pattern verification cycle on subsequent designs at the same node. Our approach relies on the inference of rules from a correct pattern at the wafer surface as it relates to the OPC and pre-OPC pattern layout files. We begin with an offline phase where we obtain a "gold standard" design file that has been fab-tested at the node, together with a prepared, post-OPC layout file that corresponds to the intended on-wafer pattern. We then run an offline analysis to infer the rules used by this method. During the analysis, our method implicitly identifies contextual OPC strategies for optimal placement of RET features on any design at that node. Using these strategies, we can apply OPC to subsequent designs at the same node with accuracy comparable to the original design file but with significantly smaller expected runtimes. The technique promises to offer a rapid and accurate complement to existing RET application strategies.

  10. How people explain their own and others’ behavior: a theory of lay causal explanations

    PubMed Central

    Böhm, Gisela; Pfister, Hans-Rüdiger

    2015-01-01

    A theoretical model is proposed that specifies lay causal theories of behavior; and supporting experimental evidence is presented. The model’s basic assumption is that different types of behavior trigger different hypotheses concerning the types of causes that may have brought about the behavior. Seven categories are distinguished that are assumed to serve as both behavior types and explanation types: goals, dispositions, temporary states such as emotions, intentional actions, outcomes, events, and stimulus attributes. The model specifies inference rules that lay people use when explaining behavior (actions are caused by goals; goals are caused by higher order goals or temporary states; temporary states are caused by dispositions, stimulus attributes, or events; outcomes are caused by actions, temporary states, dispositions, stimulus attributes, or events; events are caused by dispositions or preceding events). Two experiments are reported. Experiment 1 showed that free-response explanations followed the assumed inference rules. Experiment 2 demonstrated that explanations which match the inference rules are generated faster and more frequently than non-matching explanations. Together, the findings support models that incorporate knowledge-based aspects into the process of causal explanation. The results are discussed with respect to their implications for different stages of this process, such as the activation of causal hypotheses and their subsequent selection, as well as with respect to social influences on this process. PMID:25741306
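
    The inference rules listed in the abstract translate directly into a lookup from behavior type to permitted cause types. The sketch below is such a transcription, for illustration only; the study tests these rules behaviorally, not computationally.

    ```python
    # The abstract's lay inference rules, encoded as a lookup from behavior
    # type to the cause types the model permits (a direct transcription).

    CAUSAL_RULES = {
        "action":          {"goal"},
        "goal":            {"higher_order_goal", "temporary_state"},
        "temporary_state": {"disposition", "stimulus_attribute", "event"},
        "outcome":         {"action", "temporary_state", "disposition",
                            "stimulus_attribute", "event"},
        "event":           {"disposition", "event"},
    }

    def matches_rules(behavior_type, explanation_type):
        """Does a candidate explanation fit the lay causal theory?"""
        return explanation_type in CAUSAL_RULES.get(behavior_type, set())

    print(matches_rules("action", "goal"))          # True
    print(matches_rules("action", "disposition"))   # False: not permitted
    ```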

  11. Planning bioinformatics workflows using an expert system.

    PubMed

    Chen, Xiaoling; Chang, Jeffrey T

    2017-04-15

    Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. https://github.com/jefftc/changlab. jeffrey.t.chang@uth.tmc.edu. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com

  12. Planning bioinformatics workflows using an expert system

    PubMed Central

    Chen, Xiaoling; Chang, Jeffrey T.

    2017-01-01

    Motivation: Bioinformatic analyses are becoming formidably more complex due to the increasing number of steps required to process the data, as well as the proliferation of methods that can be used in each step. To alleviate this difficulty, pipelines are commonly employed. However, pipelines are typically implemented to automate a specific analysis, and thus are difficult to use for exploratory analyses requiring systematic changes to the software or parameters used. Results: To automate the development of pipelines, we have investigated expert systems. We created the Bioinformatics ExperT SYstem (BETSY) that includes a knowledge base where the capabilities of bioinformatics software are explicitly and formally encoded. BETSY is a backwards-chaining rule-based expert system comprising a data model that can capture the richness of biological data, and an inference engine that reasons on the knowledge base to produce workflows. Currently, the knowledge base is populated with rules to analyze microarray and next generation sequencing data. We evaluated BETSY and found that it could generate workflows that reproduce and go beyond previously published bioinformatics results. Finally, a meta-investigation of the workflows generated from the knowledge base produced a quantitative measure of the technical burden imposed by each step of bioinformatics analyses, revealing the large number of steps devoted to the pre-processing of data. In sum, an expert system approach can facilitate exploratory bioinformatic analysis by automating the development of workflows, a task that requires significant domain expertise. Availability and Implementation: https://github.com/jefftc/changlab Contact: jeffrey.t.chang@uth.tmc.edu PMID:28052928

  13. Fuzzy rule-based image segmentation in dynamic MR images of the liver

    NASA Astrophysics Data System (ADS)

    Kobashi, Syoji; Hata, Yutaka; Tokimoto, Yasuhiro; Ishikawa, Makato

    2000-06-01

    This paper presents a fuzzy rule-based region growing method for segmenting two-dimensional (2-D) and three-dimensional (3-D) magnetic resonance (MR) images. The method is an extension of the conventional region growing method. The proposed method evaluates the growing criteria by using fuzzy inference techniques. The use of fuzzy if-then rules is appropriate for describing knowledge of the lesions on the MR images. To evaluate the performance of the proposed method, it was applied to artificially generated images. In comparison with the conventional method, the proposed method shows high robustness for noisy images. The method was then applied to segmenting dynamic MR images of the liver. Dynamic MR imaging has been used for the diagnosis of hepatocellular carcinoma (HCC), portal hypertension, and so on. Segmenting the liver, portal vein (PV), and inferior vena cava (IVC) can give a useful description for the diagnosis, and is basic work for a pre-surgery planning system and a virtual endoscope. To apply the proposed method, fuzzy if-then rules are derived from the time-density curves of ROIs. In the experimental results, the 2-D reconstructed and 3-D rendered images of the segmented liver, PV, and IVC are shown. The evaluation by a physician shows that the generated images are consistent with the hepatic anatomy and would be useful for understanding, diagnosis, and pre-surgery planning.

  14. Using Commercial Off-the-Shelf Software Tools for Space Shuttle Scientific Software

    NASA Technical Reports Server (NTRS)

    Groleau, Nicolas; Friedland, Peter (Technical Monitor)

    1994-01-01

    In October 1993, the Astronaut Science Advisor (ASA) was on board the STS-58 flight of the space shuttle. ASA is an interactive system providing data acquisition and analysis, experiment step re-scheduling, and various other forms of reasoning. As fielded, the system runs on a single Macintosh PowerBook 170, which hosts the six ASA modules. There is one other piece of hardware, an external (GW Instruments, Sommerville, Massachusetts) analog-to-digital converter connected to the PowerBook's SCSI port. Three main software tools were used: LabVIEW, CLIPS, and HyperCard: First, a module written in LabVIEW (National Instruments, Austin, Texas) controls the A/D conversion and stores the resulting data in appropriate arrays. This module also analyzes the numerical data to produce a small set of characteristic numbers or symbols describing the results of an experiment trial. Second, a forward-chaining inference system written in CLIPS (NASA) uses the symbolic information provided by the first stage with a static rule base to infer decisions about the experiment. This expert system shell is used by the system for diagnosis. The third component of the system is the user interface, written in HyperCard (Claris Inc. and Apple Inc., both in Cupertino, California).

  15. Domain repertoires as a tool to derive protein recognition rules.

    PubMed

    Zucconi, A; Panni, S; Paoluzi, S; Castagnoli, L; Dente, L; Cesareni, G

    2000-08-25

    Several approaches, some of which are described in this issue, have been proposed to assemble a complete protein interaction map. These are often based on high throughput methods that explore the ability of each gene product to bind any other element of the proteome of the organism. Here we propose that a large number of interactions can be inferred by revealing the rules underlying recognition specificity of a small number (a few hundreds) of families of protein recognition modules. This can be achieved through the construction and characterization of domain repertoires. A domain repertoire is assembled in a combinatorial fashion by allowing each amino acid position in the binding site of a given protein recognition domain to vary to include all the residues allowed at that position in the domain family. The repertoire is then searched by phage display techniques with any target of interest and from the primary structure of the binding site of the selected domains one derives rules that are used to infer the formation of complexes between natural proteins in the cell.

  16. FoodWiki: Ontology-Driven Mobile Safe Food Consumption System.

    PubMed

    Çelik, Duygu

    2015-01-01

    An ontology-driven safe food consumption mobile system is considered. Over 3,000 compounds are being added to processed food, with numerous effects on the food: to add color, stabilize, texturize, preserve, sweeten, thicken, add flavor, soften, emulsify, and so forth. According to the World Health Organization, governments have lately focused on legislation to reduce such ingredients or compounds in manufactured foods, as they may have side effects causing health risks such as heart disease, cancer, diabetes, allergens, and obesity. By supervising what and how much to eat, as well as what not to eat, we can maximize a patient's life quality through avoidance of unhealthy ingredients. Smart e-health systems with powerful knowledge bases can provide suggestions of appropriate foods to individuals. Next-generation smart knowledge-base systems will not only include traditional syntactic-based search, which limits the utility of the search results, but will also provide semantics for rich searching. In this paper, the concept matching of food ingredients is semantics-based, meaning that the system runs its own semantics-based rule set to infer meaningful results through the proposed Ontology-Driven Mobile Safe Food Consumption System (FoodWiki).

  17. Meta-learning framework applied in bioinformatics inference system design.

    PubMed

    Arredondo, Tomás; Ormazábal, Wladimir

    2015-01-01

    This paper describes a meta-learner inference system development framework which is applied and tested in the implementation of bioinformatic inference systems. These inference systems are used for the systematic classification of the best candidates for inclusion in bacterial metabolic pathway maps. This meta-learner-based approach utilises a workflow in which the user provides feedback on final classification decisions, which are stored in conjunction with the analysed genetic sequences for periodic inference system training. The inference systems were trained and tested with three different data sets related to the bacterial degradation of aromatic compounds. The analysis of the meta-learner-based framework involved contrasting several optimisation methods with various parameters. The obtained inference systems were also contrasted with other standard classification methods, and accurate prediction capabilities were observed.

  18. Knowledge acquisition in the fuzzy knowledge representation framework of a medical consultation system.

    PubMed

    Boegl, Karl; Adlassnig, Klaus-Peter; Hayashi, Yoichi; Rothenfluh, Thomas E; Leitich, Harald

    2004-01-01

    This paper describes the fuzzy knowledge representation framework of the medical computer consultation system MedFrame/CADIAG-IV as well as the specific knowledge acquisition techniques that have been developed to support the definition of knowledge concepts and inference rules. As in its predecessor system CADIAG-II, fuzzy medical knowledge bases are used to model the uncertainty and the vagueness of medical concepts and fuzzy logic reasoning mechanisms provide the basic inference processes. The elicitation and acquisition of medical knowledge from domain experts has often been described as the most difficult and time-consuming task in knowledge-based system development in medicine. It comes as no surprise that this is even more so when unfamiliar representations like fuzzy membership functions are to be acquired. From previous projects we have learned that a user-centered approach is mandatory in complex and ill-defined knowledge domains such as internal medicine. This paper describes the knowledge acquisition framework that has been developed in order to make easier and more accessible the three main tasks of: (a) defining medical concepts; (b) providing appropriate interpretations for patient data; and (c) constructing inferential knowledge in a fuzzy knowledge representation framework. Special emphasis is laid on the motivations for some system design and data modeling decisions. The theoretical framework has been implemented in a software package, the Knowledge Base Builder Toolkit. The conception and the design of this system reflect the need for a user-centered, intuitive, and easy-to-handle tool. First results gained from pilot studies have shown that our approach can be successfully implemented in the context of a complex fuzzy theoretical framework. As a result, this critical aspect of knowledge-based system development can be accomplished more easily.

  19. Genetic algorithm optimized rainfall-runoff fuzzy inference system for row crop watersheds with claypan soils

    USDA-ARS?s Scientific Manuscript database

    The fuzzy logic algorithm has the ability to describe knowledge in a descriptive human-like manner in the form of simple rules using linguistic variables, and provides a new way of modeling uncertain or naturally fuzzy hydrological processes like non-linear rainfall-runoff relationships. Fuzzy infe...

  20. Methodology for the inference of gene function from phenotype data.

    PubMed

    Ascensao, Joao A; Dolan, Mary E; Hill, David P; Blake, Judith A

    2014-12-12

    Biomedical ontologies are increasingly instrumental in the advancement of biological research primarily through their use to efficiently consolidate large amounts of data into structured, accessible sets. However, ontology development and usage can be hampered by the segregation of knowledge by domain that occurs due to independent development and use of the ontologies. The ability to infer data associated with one ontology to data associated with another ontology would prove useful in expanding information content and scope. We here focus on relating two ontologies: the Gene Ontology (GO), which encodes canonical gene function, and the Mammalian Phenotype Ontology (MP), which describes non-canonical phenotypes, using statistical methods to suggest GO functional annotations from existing MP phenotype annotations. This work is in contrast to previous studies that have focused on inferring gene function from phenotype primarily through lexical or semantic similarity measures. We have designed and tested a set of algorithms that represents a novel methodology to define rules for predicting gene function by examining the emergent structure and relationships between the gene functions and phenotypes rather than inspecting the terms semantically. The algorithms inspect relationships among multiple phenotype terms to deduce if there are cases where they all arise from a single gene function. We apply this methodology to data about genes in the laboratory mouse that are formally represented in the Mouse Genome Informatics (MGI) resource. From the data, 7444 rule instances were generated from five generalized rules, resulting in 4818 unique GO functional predictions for 1796 genes. We show that our method is capable of inferring high-quality functional annotations from curated phenotype data. As well as creating inferred annotations, our method has the potential to allow for the elucidation of unforeseen, biologically significant associations between gene function and phenotypes that would be overlooked by a semantics-based approach. Future work will include the implementation of the described algorithms for a variety of other model organism databases, taking full advantage of the abundance of available high quality curated data.

  1. Prediction of drug synergy in cancer using ensemble-based machine learning techniques

    NASA Astrophysics Data System (ADS)

    Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder

    2018-04-01

    Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents. It can be developed as a pre-processing tool for therapeutic successes. Examination of different drug-drug interactions can be done by the drug synergy score, which needs efficient regression-based machine learning approaches to minimize the prediction errors. Numerous machine learning techniques such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to realize the requirement mentioned above. However, these techniques individually do not provide sufficient accuracy in the drug synergy score. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques have been implemented on the drug synergy data. Based on the accuracy of each model, the four techniques with the highest accuracy are selected to develop the ensemble-based machine learning model. These models are Random Forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning (GFS.GCCL), the Adaptive-Network-Based Fuzzy Inference System (ANFIS) and the Dynamic Evolving Neural-Fuzzy Inference System (DENFIS). Ensembling is achieved by evaluating the biased weighted aggregation (i.e., adding more weight to the models with higher prediction scores) of the data predicted by the selected models. The proposed and existing machine learning techniques have been evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms the others in terms of accuracy, root mean square error and coefficient of correlation.
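
    The biased weighted aggregation described above can be sketched as an accuracy-weighted average of the per-model predictions. The weights and predictions below are invented for the example, not taken from the paper:

    ```python
    # Sketch of accuracy-weighted ensembling: each model's prediction is
    # weighted by that model's (e.g., cross-validated) accuracy.

    def ensemble(predictions, accuracies):
        """Biased weighted aggregation of per-model predictions."""
        total = sum(accuracies)
        return sum(p * w for p, w in zip(predictions, accuracies)) / total

    # Per-model synergy-score predictions for one drug pair, and invented
    # accuracies for Random Forest, GFS.GCCL, ANFIS and DENFIS.
    preds = [12.4, 10.9, 11.7, 13.1]
    accs  = [0.91, 0.84, 0.88, 0.86]
    print(round(ensemble(preds, accs), 2))
    ```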

  2. Automatic approach to deriving fuzzy slope positions

    NASA Astrophysics Data System (ADS)

    Zhu, Liang-Jun; Zhu, A.-Xing; Qin, Cheng-Zhi; Liu, Jun-Zhi

    2018-03-01

    Fuzzy characterization of slope positions is important for geographic modeling. Most of the existing fuzzy classification-based methods for fuzzy characterization require extensive user intervention in data preparation and parameter setting, which is tedious and time-consuming. This paper presents an automatic approach to overcoming these limitations in the prototype-based inference method for deriving fuzzy membership values (or similarities) to slope positions. The key contribution is a procedure for finding the typical locations and setting the fuzzy inference parameters for each slope position type. Instead of being determined entirely by users, as in the prototype-based inference method, in the proposed approach the typical locations and fuzzy inference parameters for each slope position type are automatically determined by a rule set based on prior domain knowledge and the frequency distributions of topographic attributes. Furthermore, the preparation of topographic attributes (e.g., slope gradient, curvature, and relative position index) is automated, so the proposed automatic approach has only one necessary input, i.e., the gridded digital elevation model of the study area. All compute-intensive algorithms in the proposed approach were sped up by parallel computing. Two study cases are provided to demonstrate that this approach can properly, conveniently and quickly derive fuzzy slope positions.

  3. Fuzzy Inference Based Obstacle Avoidance Control of Electric Powered Wheelchair Considering Driving Risk

    NASA Astrophysics Data System (ADS)

    Kiso, Atsushi; Murakami, Hiroki; Seki, Hirokazu

    This paper describes a novel obstacle avoidance control scheme for electric powered wheelchairs to realize safe driving in various environments. The "electric powered wheelchair", which generates its driving force with electric motors, is expected to be widely used as a mobility support system for elderly and disabled people; however, its driving performance must be further improved, because the number of driving accidents caused by elderly operators' narrow sight and joystick operation errors is increasing. This paper proposes a novel obstacle avoidance control scheme based on a fuzzy algorithm to prevent driving accidents. The proposed control system determines the driving direction by a fuzzy algorithm based on the joystick operation and the distances to obstacles measured by ultrasonic sensors. The fuzzy rules that determine the driving direction are designed to reliably avoid passers-by and walls, taking into account the human's intent and the driving environment. Driving experiments in practical situations show the effectiveness of the proposed control system.

  4. A model-driven privacy compliance decision support for medical data sharing in Europe.

    PubMed

    Boussi Rahmouni, H; Solomonides, T; Casassa Mont, M; Shiu, S; Rahmouni, M

    2011-01-01

    Clinical practitioners and medical researchers often have to share health data with colleagues across Europe. Privacy compliance in this context is very important but challenging. Automated privacy guidelines are a practical way of increasing users' awareness of privacy obligations and help eliminate unintentional breaches of privacy. In this paper we present an ontology-plus-rules approach to privacy decision support for the sharing of patient data across European platforms. We use ontologies to model the required domain and context information about data sharing and privacy requirements. In addition, we use a set of Semantic Web Rule Language rules to reason about the legal privacy requirements that are applicable to a specific context of data disclosure. We make the complete set invocable through a semantic web application acting as an interactive privacy guideline system, which can invoke the full model in order to provide decision support. When asked, the system will generate privacy reports applicable to a specific case of data disclosure described by the user. Reports showing guidelines per Member State may also be obtained. The advantage of this approach lies in the expressiveness and extensibility of the modelling and inference languages adopted, and in the ability they confer to reason with complex requirements interpreted from high-level regulations. However, the system cannot at this stage fully simulate the role of an ethics committee or review board.

  5. Inferring interplanetary magnetic field polarities from geomagnetic variations

    NASA Astrophysics Data System (ADS)

    Vokhmyanin, M. V.; Ponyavin, D. I.

    2012-06-01

    In this paper, we propose a modified procedure to infer interplanetary magnetic field (IMF) polarities from geomagnetic observations. It allows us to identify the polarity back to 1905. Like previous techniques, it is based on the well-known Svalgaard-Mansurov effect. We have improved the quality and accuracy of the polarity inference compared with the previous results of Svalgaard (1975) and Vennerstroem et al. (2001) by adding new geomagnetic stations and carefully extracting the diurnal curve. The data demonstrate an excess of one of the two IMF sectors around the equinoxes (the Rosenberg-Coleman rule), evidencing polar field reversals for at least the last eight solar cycles. We also found a predominance of the two-sector structure late in the descending phase of solar cycle 16.

  6. The development of causal reasoning.

    PubMed

    Kuhn, Deanna

    2012-05-01

    How do inference rules for causal learning themselves change developmentally? A model of the development of causal reasoning must address this question, as well as specify the inference rules. Here, the evidence for developmental changes in processes of causal reasoning is reviewed, with the distinction made between diagnostic causal inference and causal prediction. Also addressed is the paradox of a causal reasoning literature that highlights the competencies of young children and the proneness to error among adults. WIREs Cogn Sci 2012, 3:327-335. doi: 10.1002/wcs.1160 For further resources related to this article, please visit the WIREs website. Copyright © 2012 John Wiley & Sons, Ltd.

  7. Effective Bayesian Transfer Learning

    DTIC Science & Technology

    2010-03-01

    ...reasonable value of k, defined by the task B training set size. [Figure residue: transfer-regret plot comparing "No Transfer" and "With Transfer"; standard report-form fields omitted.] ...rule set given the prior and developed a staged approximate inference strategy, in which data from observed tasks 1 to k are used to infer general rule

  8. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.

  9. Reverse engineering gene regulatory networks from measurement with missing values.

    PubMed

    Ogundijo, Oyetunji E; Elmas, Abdulkadir; Wang, Xiaodong

    2016-12-01

    Gene expression time series data are usually in the form of high-dimensional arrays. Unfortunately, the data may sometimes contain missing values: either the expression values of some genes at some time points, or the entire expression values of a single time point or of a set of consecutive time points. This significantly affects the performance of many gene expression analysis algorithms that take as input the complete matrix of gene expression measurements. For instance, previous works have shown that gene regulatory interactions can be estimated from the complete matrix of gene expression measurements. Yet, to date, few algorithms have been proposed for the inference of gene regulatory networks from gene expression data with missing values. We describe a nonlinear dynamic stochastic model for the evolution of gene expression. The model captures the structural, dynamical, and nonlinear natures of the underlying biomolecular systems. We present point-based Gaussian approximation (PBGA) filters for joint state and parameter estimation of the system with one-step or two-step missing measurements. The PBGA filters use Gaussian approximation and various quadrature rules, such as the unscented transform (UT), the third-degree cubature rule and the central difference rule, for computing the related posteriors. The proposed algorithm is evaluated with satisfying results for synthetic networks, in silico networks released as part of the DREAM project, and a real biological network, the in vivo reverse engineering and modeling assessment (IRMA) network of the yeast Saccharomyces cerevisiae. PBGA filters are proposed to elucidate the underlying gene regulatory network (GRN) from time series gene expression data that contain missing values. In our state-space model, we propose a measurement model that incorporates the effect of the missing data points into the sequential algorithm. This approach produces a better inference of the model parameters and, hence, a more accurate prediction of the underlying GRN compared to conventional Gaussian approximation (GA) filters that ignore the missing data points.
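
    Of the quadrature rules named above, the unscented transform is the simplest to sketch: propagate a small set of sigma points through the nonlinearity and recombine them with fixed weights. The following minimal NumPy sketch shows the standard UT only (not the paper's full PBGA filter, which also handles missing measurements); the example nonlinearity is an invented Hill-type activation.

    ```python
    # Minimal unscented-transform sketch: approximate the mean/covariance of
    # f(x) for x ~ N(mean, cov) via sigma points (standard UT).
    import numpy as np

    def unscented_transform(mean, cov, f, kappa=0.0):
        n = mean.shape[0]
        L = np.linalg.cholesky((n + kappa) * cov)
        sigma = [mean] + [mean + L[:, i] for i in range(n)] \
                       + [mean - L[:, i] for i in range(n)]
        w = np.full(2 * n + 1, 1.0 / (2 * (n + kappa)))
        w[0] = kappa / (n + kappa)
        y = np.array([f(s) for s in sigma])
        y_mean = np.sum(w[:, None] * y, axis=0)
        d = y - y_mean
        y_cov = (w[:, None] * d).T @ d
        return y_mean, y_cov

    # Example: a Hill-type activation nonlinearity, common in GRN models.
    f = lambda x: x**2 / (1.0 + x**2)
    m, c = unscented_transform(np.array([1.0]), np.array([[0.04]]), f)
    print(m, c)
    ```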

  10. HERB: A production system for programming with hierarchical expert rule bases: User's manual, HERB Version 1. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummel, K.E.

    1987-12-01

    Expert systems are artificial intelligence programs that solve problems requiring large amounts of heuristic knowledge, based on years of experience and tradition. Production systems are domain-independent tools that support the development of rule-based expert systems. This document describes a general purpose production system known as HERB. This system was developed to support the programming of expert systems using hierarchically structured rule bases. HERB encourages the partitioning of rules into multiple rule bases and supports the use of multiple conflict resolution strategies. Multiple rule bases can also be placed on a system stack and simultaneously searched during each interpreter cycle. Both backward and forward chaining rules are supported by HERB. The condition portion of each rule can contain both patterns, which are matched with facts in a data base, and LISP expressions, which are explicitly evaluated in the LISP environment. Properties of objects can also be stored in the HERB data base and referenced within the scope of each rule. This document serves both as an introduction to the principles of LISP-based production systems and as a user's manual for the HERB system. 6 refs., 17 figs.
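
    The distinctive HERB feature described above, searching a stack of rule bases simultaneously during each interpreter cycle, can be sketched with a simple forward chainer. The rule content and the Python API below are invented; HERB itself is LISP-based and supports richer conflict resolution strategies than the simple refractoriness used here.

    ```python
    # Illustrative sketch: forward chaining over a stack of rule bases, all
    # of which are searched during each interpreter cycle.

    class Rule:
        def __init__(self, name, conditions, consequents):
            self.name = name
            self.conditions = conditions
            self.consequents = consequents

    def forward_chain(rule_base_stack, facts):
        """Fire rules from every rule base on the stack until quiescence."""
        facts = set(facts)
        fired = set()        # refractoriness: each rule fires at most once
        changed = True
        while changed:
            changed = False
            for rule_base in rule_base_stack:   # all bases searched each cycle
                for rule in rule_base:
                    if rule.name in fired:
                        continue
                    if all(c in facts for c in rule.conditions):
                        facts |= set(rule.consequents)
                        fired.add(rule.name)
                        changed = True
        return facts

    diagnosis_rules = [Rule("r1", ["hot", "noisy"], ["bearing_worn"])]
    general_rules   = [Rule("r2", ["bearing_worn"], ["schedule_maintenance"])]
    print(forward_chain([diagnosis_rules, general_rules], ["hot", "noisy"]))
    ```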

  11. Generic comparison of protein inference engines.

    PubMed

    Claassen, Manfred; Reiter, Lukas; Hengartner, Michael O; Buhmann, Joachim M; Aebersold, Ruedi

    2012-04-01

    Protein identifications, instead of peptide-spectrum matches, constitute the biologically relevant result of shotgun proteomics studies. How to appropriately infer and report protein identifications has triggered a still ongoing debate. This debate has so far suffered from the lack of appropriate performance measures that allow us to objectively assess protein inference approaches. This study describes an intuitive, generic and yet formal performance measure and demonstrates how it enables experimentalists to select an optimal protein inference strategy for a given collection of fragment ion spectra. We applied the performance measure to systematically explore the benefit of excluding possibly unreliable protein identifications, such as single-hit wonders. Therefore, we defined a family of protein inference engines by extending a simple inference engine by thousands of pruning variants, each excluding a different specified set of possibly unreliable identifications. We benchmarked these protein inference engines on several data sets representing different proteomes and mass spectrometry platforms. Optimally performing inference engines retained all high confidence spectral evidence, without posterior exclusion of any type of protein identifications. Despite the diversity of studied data sets consistently supporting this rule, other data sets might behave differently. In order to ensure maximal reliable proteome coverage for data sets arising in other studies we advocate abstaining from rigid protein inference rules, such as exclusion of single-hit wonders, and instead consider several protein inference approaches and assess these with respect to the presented performance measure in the specific application context.

  12. Daily life activity routine discovery in hemiparetic rehabilitation patients using topic models.

    PubMed

    Seiter, J; Derungs, A; Schuster-Amft, C; Amft, O; Tröster, G

    2015-01-01

    Monitoring natural behavior and activity routines of hemiparetic rehabilitation patients across the day can provide valuable progress information for therapists and patients and contribute to an optimized rehabilitation process. In particular, continuous patient monitoring could add type, frequency and duration of daily life activity routines and hence complement standard clinical scores that are assessed for particular tasks only. Machine learning methods have been applied to infer activity routines from sensor data. However, supervised methods require activity annotations to build recognition models and thus require extensive patient supervision. Discovery methods, including topic models, could provide patient routine information and deal with variability in activity and movement performance across patients. Topic models have been used to discover characteristic activity routine patterns of healthy individuals using activity primitives recognized from supervised sensor data. Yet, the applicability of topic models to hemiparetic rehabilitation patients and techniques to derive activity primitives without supervision need to be addressed. We investigate (1) whether a topic model-based activity routine discovery framework can infer activity routines of rehabilitation patients from wearable motion sensor data, and (2) how our topic model-based activity routine discovery performs with a rule-based versus a clustering-based activity vocabulary. We analyze the activity routine discovery in a dataset recorded with 11 hemiparetic rehabilitation patients during up to ten full recording days per individual in an ambulatory daycare rehabilitation center, using wearable motion sensors attached to both wrists and the non-affected thigh. We introduce and compare rule-based and clustering-based activity vocabularies that process statistical and frequency acceleration features into activity words. Activity words were used for activity routine pattern discovery using topic models based on Latent Dirichlet Allocation. Discovered activity routine patterns were then mapped to six categorized activity routines. Using the rule-based approach, activity routines could be discovered with an average accuracy of 76% across all patients. The rule-based approach outperformed clustering by 10% and showed fewer confusions for predicted activity routines. Topic models are suitable to discover daily life activity routines in hemiparetic rehabilitation patients without trained classifiers and activity annotations. Activity routines show characteristic patterns regarding activity primitives, including body and extremity postures and movement. A patient-independent rule set can be derived. Including expert knowledge supports successful activity routine discovery over completely data-driven clustering.
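
    A minimal sketch of the discovery step, assuming the activity words have already been quantized into per-day count vectors; scikit-learn's LatentDirichletAllocation stands in for the paper's own LDA pipeline, and all sizes are illustrative.

    ```python
    import numpy as np
    from sklearn.decomposition import LatentDirichletAllocation

    # Toy corpus: each row is one recording day, each column an activity word
    # (e.g. a cluster index or rule label); entries are occurrence counts.
    rng = np.random.default_rng(0)
    day_word_counts = rng.integers(0, 20, size=(11, 30))

    lda = LatentDirichletAllocation(n_components=6, random_state=0)
    day_topics = lda.fit_transform(day_word_counts)  # per-day topic mixture

    # Each topic is a distribution over activity words; the dominant topic of
    # a time window would then be mapped to one of the categorized routines.
    print(day_topics.argmax(axis=1))
    ```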

  13. Reverse engineering the gap gene network of Drosophila melanogaster.

    PubMed

    Perkins, Theodore J; Jaeger, Johannes; Reinitz, John; Glass, Leon

    2006-05-01

    A fundamental problem in functional genomics is to determine the structure and dynamics of genetic networks based on expression data. We describe a new strategy for solving this problem and apply it to recently published data on early Drosophila melanogaster development. Our method is orders of magnitude faster than current fitting methods and allows us to fit different types of rules for expressing regulatory relationships. Specifically, we use our approach to fit models using a smooth nonlinear formalism for modeling gene regulation (gene circuits) as well as models using logical rules based on activation and repression thresholds for transcription factors. Our technique also allows us to infer regulatory relationships de novo or to test network structures suggested by the literature. We fit a series of models to test several outstanding questions about gap gene regulation, including regulation of and by hunchback and the role of autoactivation. Based on our modeling results and validation against the experimental literature, we propose a revised network structure for the gap gene system. Interestingly, some relationships in standard textbook models of gap gene regulation appear to be unnecessary for or even inconsistent with the details of gap gene expression during wild-type development.

  14. Enhancing dissolved oxygen control using an on-line hybrid fuzzy-neural soft-sensing model-based control system in an anaerobic/anoxic/oxic process.

    PubMed

    Huang, Mingzhi; Wan, Jinquan; Hu, Kang; Ma, Yongwen; Wang, Yan

    2013-12-01

    An on-line hybrid fuzzy-neural soft-sensing model-based control system was developed to optimize dissolved oxygen concentration in a bench-scale anaerobic/anoxic/oxic (A(2)/O) process. In order to improve the performance of the control system, a self-adapted fuzzy c-means clustering algorithm and adaptive network-based fuzzy inference system (ANFIS) models were employed. The proposed control system permits the on-line implementation of every operating strategy of the experimental system. A set of experiments involving variable hydraulic retention time (HRT), influent pH (pH), dissolved oxygen in the aerobic reactor (DO), and mixed-liquor return ratio (r) was carried out. Using the proposed system, the amount of COD in the effluent stabilized at or below the set-point. The improvement was achieved with optimum dissolved oxygen concentration because the performance of the treatment process was optimized using operating rules implemented in real time. The system allows various expert operational approaches to be deployed with the goal of minimizing organic substances in the outlet while using the minimum amount of energy.
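
    For orientation, here is a plain (not self-adapted) fuzzy c-means in NumPy; the paper's self-adapted variant and the ANFIS stage are not reproduced, and the toy data are ours.

    ```python
    import numpy as np

    def fuzzy_cmeans(X, c, m=2.0, n_iter=100, tol=1e-5, seed=0):
        """Plain fuzzy c-means; returns cluster centers and memberships U."""
        rng = np.random.default_rng(seed)
        U = rng.random((len(X), c))
        U /= U.sum(axis=1, keepdims=True)          # memberships sum to 1
        for _ in range(n_iter):
            W = U ** m
            centers = (W.T @ X) / W.sum(axis=0)[:, None]
            d = np.linalg.norm(X[:, None, :] - centers[None], axis=2) + 1e-12
            inv = d ** (-2.0 / (m - 1.0))          # standard FCM update
            U_new = inv / inv.sum(axis=1, keepdims=True)
            if np.abs(U_new - U).max() < tol:
                return centers, U_new
            U = U_new
        return centers, U

    # Toy usage: two noisy operating regimes of a pair of process variables.
    rng = np.random.default_rng(1)
    X = np.vstack([rng.normal(0, 1, (50, 2)), rng.normal(5, 1, (50, 2))])
    centers, U = fuzzy_cmeans(X, c=2)
    print(centers)
    ```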

  15. The inference from a single case: moral versus scientific inferences in implementing new biotechnologies.

    PubMed

    Hofmann, B

    2008-06-01

    Are there similarities between scientific and moral inference? This is the key question in this article. It takes as its point of departure an instance of one person's story in the media changing both Norwegian public opinion and a brand-new Norwegian law prohibiting the use of saviour siblings. The case appears to falsify existing norms and to establish new ones. The analysis of this case reveals similarities in the modes of inference in science and morals, inasmuch as (a) a single case functions as a counter-example to an existing rule; (b) there is a common presupposition of stability, similarity and order, which makes it possible to reason from a few cases to a general rule; and (c) this makes it possible to hold things together and retain order. In science, these modes of inference are referred to as falsification, induction and consistency. In morals, they have a variety of other names. Hence, even without abandoning the fact-value divide, there appear to be similarities between inference in science and inference in morals, which may encourage communication across the boundaries between "the two cultures" and which are relevant to medical humanities.

  16. Learning and inference in a nonequilibrium Ising model with hidden nodes.

    PubMed

    Dunn, Benjamin; Roudi, Yasser

    2013-02-01

    We study inference and reconstruction of couplings in a partially observed kinetic Ising model. With hidden spins, calculating the likelihood of a sequence of observed spin configurations requires performing a trace over the configurations of the hidden ones. This, as we show, can be represented as a path integral. Using this representation, we demonstrate that systematic approximate inference and learning rules can be derived using dynamical mean-field theory. Although naive mean-field theory leads to an unstable learning rule, taking into account Gaussian corrections allows learning the couplings involving hidden nodes. It also improves learning of the couplings between the observed nodes compared to when hidden nodes are ignored.
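
    The hidden-spin path integral is beyond a short sketch, but the fully observed limiting case below shows the shape of the maximum-likelihood learning rule the paper builds on; sizes, learning rate, and the toy data are illustrative assumptions.

    ```python
    import numpy as np

    def fit_kinetic_ising(S, lr=0.1, n_epochs=300):
        """Maximum-likelihood couplings for a fully observed kinetic Ising
        model, where P(s_i(t+1)) ~ exp(s_i(t+1) * sum_j J[i, j] * s_j(t))."""
        T, N = S.shape
        J = np.zeros((N, N))
        for _ in range(n_epochs):
            H = S[:-1] @ J.T                           # fields h_i(t)
            # gradient of the log-likelihood: <(s(t+1) - tanh(h(t))) s(t)>
            grad = (S[1:] - np.tanh(H)).T @ S[:-1] / (T - 1)
            J += lr * grad
        return J

    # Toy check: recover couplings from data simulated with a known J.
    rng = np.random.default_rng(0)
    N, T = 5, 5000
    J_true = rng.normal(0, 0.3, (N, N))
    S = np.ones((T, N))
    for t in range(T - 1):
        p = 0.5 * (1 + np.tanh(S[t] @ J_true.T))       # P(s_i(t+1) = +1)
        S[t + 1] = np.where(rng.random(N) < p, 1.0, -1.0)
    print(np.abs(fit_kinetic_ising(S) - J_true).max())
    ```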

  17. iRENEX: a clinically informed decision support system for the interpretation of ⁹⁹mTc-MAG3 scans to detect renal obstruction.

    PubMed

    Garcia, Ernest V; Taylor, Andrew; Folks, Russell; Manatunga, Daya; Halkar, Raghuveer; Savir-Baruch, Bital; Dubovsky, Eva

    2012-09-01

    Decision support systems for imaging analysis and interpretation are rapidly being developed and will have an increasing impact on the practice of medicine. RENEX is a renal expert system to assist physicians in evaluating suspected obstruction in patients undergoing mercaptoacetyltriglycine (MAG3) renography. RENEX uses quantitative parameters extracted from the dynamic renal scan data using QuantEM™II and heuristic rules in the form of a knowledge base gleaned from experts to determine if a kidney is obstructed; however, RENEX did not have access to, and could not consider, the clinical information available to diagnosticians interpreting these studies. We designed and implemented a methodology to incorporate clinical information into RENEX, implemented motion detection, and evaluated this new comprehensive system (iRENEX) in a pilot group of 51 renal patients. To reach a conclusion as to whether a kidney is obstructed, 56 new clinical rules were added to the previously reported 60 rules used to interpret quantitative MAG3 parameters. All the clinical rules were applied after iRENEX reached a conclusion on obstruction based on the quantitative MAG3 parameters; the evidence of obstruction was then modified by the new clinical rules. iRENEX consists of a library to translate parameter values to certainty factors, a knowledge base with 116 heuristic interpretation rules, a forward chaining inference engine to determine obstruction, and a justification engine. A clinical database was developed containing patient histories and imaging report data obtained from the hospital information system associated with the pertinent MAG3 studies. The system was fine-tuned and tested using a pilot group of 51 patients (21 men, mean age 58.2 ± 17.1 years, 100 kidneys) deemed by an expert panel to have 61 unobstructed and 39 obstructed kidneys. iRENEX, using only quantitative MAG3 data, agreed with the expert panel in 87% (34/39) of obstructed and 90% (55/61) of unobstructed kidneys. iRENEX, using both quantitative and clinical data, agreed with the expert panel in 95% (37/39) of obstructed and 92% (56/61) of unobstructed kidneys. The clinical information significantly (p < 0.001) increased iRENEX certainty in detecting obstruction over using the quantitative data alone. Our renal expert system for detecting renal obstruction has been substantially expanded to incorporate the clinical information available to physicians as well as advanced quality control features, and was shown to interpret renal studies in a pilot group at a standardized expert level. These encouraging results warrant a prospective study in a large population of patients with and without renal obstruction to establish the diagnostic performance of iRENEX.
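
    iRENEX translates parameter values to certainty factors and lets rules revise the evidence; a minimal sketch assuming MYCIN-style combination (the actual iRENEX combination scheme is not specified in the record), with illustrative numbers.

    ```python
    def combine_cf(cf1, cf2):
        """MYCIN-style combination of two certainty factors in [-1, 1]."""
        if cf1 >= 0 and cf2 >= 0:
            return cf1 + cf2 * (1 - cf1)
        if cf1 < 0 and cf2 < 0:
            return cf1 + cf2 * (1 + cf1)
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

    # Quantitative evidence suggests obstruction (0.6); a hypothetical
    # clinical rule contributes disconfirming evidence (-0.3).
    print(combine_cf(0.6, -0.3))   # combined certainty, revised downward
    ```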

  18. InvestigationOrganizer: The Development and Testing of a Web-based Tool to Support Mishap Investigations

    NASA Technical Reports Server (NTRS)

    Carvalho, Robert F.; Williams, James; Keller, Richard; Sturken, Ian; Panontin, Tina

    2004-01-01

    InvestigationOrganizer (IO) is a collaborative web-based system designed to support the conduct of mishap investigations. IO provides a common repository for a wide range of mishap related information, and allows investigators to make explicit, shared, and meaningful links between evidence, causal models, findings and recommendations. It integrates the functionality of a database, a common document repository, a semantic knowledge network, a rule-based inference engine, and causal modeling and visualization. Thus far, IO has been used to support four mishap investigations within NASA, ranging from a small property damage case to the loss of the Space Shuttle Columbia. This paper describes how the functionality of IO supports mishap investigations and the lessons learned from the experience of supporting two of the NASA mishap investigations: the Columbia Accident Investigation and the CONTOUR Loss Investigation.

  19. Fuzzy logic

    NASA Technical Reports Server (NTRS)

    Zadeh, Lotfi A.

    1988-01-01

    The author presents a condensed exposition of some basic ideas underlying fuzzy logic and describes some representative applications. The discussion covers basic principles; meaning representation and inference; basic rules of inference; and the linguistic variable and its application to fuzzy control.
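
    One of the basic rules of inference surveyed, the compositional rule of inference with max-min composition, fits in a few lines; the fuzzy set and relation values below are purely illustrative.

    ```python
    import numpy as np

    # Compositional rule of inference, B' = A' o R, max-min composition.
    A_prime = np.array([0.2, 1.0, 0.5])             # fuzzy input on universe X
    R = np.array([[0.1, 0.8, 0.4],                  # fuzzy relation on X x Y
                  [0.9, 0.3, 0.7],
                  [0.5, 0.6, 1.0]])
    B_prime = np.max(np.minimum(A_prime[:, None], R), axis=0)
    print(B_prime)                                  # inferred fuzzy set on Y
    ```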

  20. FoodWiki: Ontology-Driven Mobile Safe Food Consumption System

    PubMed Central

    Çelik, Duygu

    2015-01-01

    An ontology-driven safe food consumption mobile system is considered. Over 3,000 compounds are being added to processed food, with numerous effects on the food: to add color, stabilize, texturize, preserve, sweeten, thicken, add flavor, soften, emulsify, and so forth. According to the World Health Organization, governments have lately focused on legislation to reduce such ingredients or compounds in manufactured foods, as they may have side effects causing health risks such as heart disease, cancer, diabetes, allergies, and obesity. By supervising what and how much to eat, as well as what not to eat, we can maximize a patient's quality of life through avoidance of unhealthy ingredients. Smart e-health systems with powerful knowledge bases can provide suggestions of appropriate foods to individuals. Next-generation smart knowledge base systems will not only include traditional syntax-based search, which limits the utility of the search results, but will also provide semantics for rich searching. In this paper, concept matching of food ingredients is semantic-based, meaning that the proposed Ontology-Driven Mobile Safe Food Consumption System (FoodWiki) runs its own semantic rule set to infer meaningful results. PMID:26221624

  1. Formation of an internal model of environment dynamics during upper limb reaching movements: a fuzzy approach.

    PubMed

    MacDonald, Chad; Moussavi, Zahra; Sarkodie-Gyan, Thompson

    2007-01-01

    This paper presents the development and simulation of a fuzzy logic based learning mechanism to emulate human motor learning. In particular, fuzzy inference was used to develop an internal model of a novel dynamic environment experienced during planar reaching movements with the upper limb. A dynamic model of the human arm was developed and a fuzzy if-then rule base was created to relate trajectory movement and velocity errors to internal model update parameters. An experimental simulation was performed to compare the fuzzy system's performance with that of human subjects. It was found that the dynamic model behaved as expected, and the fuzzy learning mechanism created an internal model that was capable of opposing the environmental force field to regain a trajectory closely resembling the desired ideal.

  2. Classification of toddler nutritional status using fuzzy inference system (FIS)

    NASA Astrophysics Data System (ADS)

    Permatasari, Dian; Azizah, Isnaini Nur; Hadiat, Hanifah Latifah; Abadi, Agus Maman

    2017-08-01

    Nutrition is a major health problem and a concern for parents of toddlers. Nutritional status expresses the state of balance between the intake of nutrients and the amount needed by the body for its various biological functions. The indicator most often used to determine nutritional status is the combination of Weight (W) and Height (H), symbolized by W/H, because it describes nutritional status sensitively and specifically. This study applies the Mamdani fuzzy inference system to classify the nutritional status of toddlers. The inputs are the weight and height of the toddler. Nine rules are used, and the output is a nutritional status classification with four criteria: stunting, wasting, normal, and overweight. The inference uses the Mamdani method, and defuzzification uses the centroid method. The results of this study are compared with the Anthropometric Assessment Standard of Toddler Nutritional Status issued by the Ministry of Health. The accuracy of this fuzzy model is about 84%.
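
    A minimal sketch of Mamdani inference with centroid defuzzification; the output sets, universe, and firing strengths below are illustrative stand-ins, not the paper's nine actual rules or fitted parameters.

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function on universe x."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    z = np.linspace(0.0, 1.0, 201)                 # output universe (status score)
    out_sets = {"wasting": tri(z, 0.0, 0.15, 0.4),
                "normal": tri(z, 0.2, 0.5, 0.8),
                "overweight": tri(z, 0.6, 0.85, 1.0)}

    def mamdani(rule_strengths):
        """Min implication, max aggregation, centroid defuzzification."""
        agg = np.zeros_like(z)
        for label, strength in rule_strengths.items():
            agg = np.maximum(agg, np.minimum(strength, out_sets[label]))
        return (z * agg).sum() / agg.sum()         # centroid of aggregated set

    # Firing strengths computed from the weight/height antecedent memberships.
    print(mamdani({"wasting": 0.1, "normal": 0.7, "overweight": 0.2}))
    ```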

  3. PlanWorks: A Debugging Environment for Constraint Based Planning Systems

    NASA Technical Reports Server (NTRS)

    Daley, Patrick; Frank, Jeremy; Iatauro, Michael; McGann, Conor; Taylor, Will

    2005-01-01

    Numerous planning and scheduling systems employ underlying constraint reasoning systems. Debugging such systems involves the search for errors in model rules, constraint reasoning algorithms, search heuristics, and the problem instance (initial state and goals). In order to effectively find such problems, users must see why each state or action is in a plan by tracking causal chains back to part of the initial problem instance. They must be able to visualize complex relationships among many different entities and distinguish between those entities easily. For example, a variable can be in the scope of several constraints, as well as part of a state or activity in a plan; the activity can arise as a consequence of another activity and a model rule. Finally, they must be able to track each logical inference made during planning. We have developed PlanWorks, a comprehensive system for debugging constraint-based planning and scheduling systems. PlanWorks assumes a strong transaction model of the entire planning process, including adding and removing parts of the constraint network, variable assignment, and constraint propagation. A planner logs all transactions to a relational database that is tailored to support the queries of specialized views displaying different forms of data (e.g. constraints, activities, resources, and causal links). PlanWorks was specifically developed for the Extensible Universal Remote Operations Planning Architecture (EUROPA(sub 2)) developed at NASA, but the underlying principles behind PlanWorks make it useful for many constraint-based planning systems. The paper is organized as follows. We first describe some fundamentals of EUROPA(sub 2), then present PlanWorks' principal components, discuss each component in detail, and describe inter-component navigation features. We close with a discussion of how PlanWorks is used to find model flaws.

  4. Composite Faces Are Not (Necessarily) Processed Coactively: A Test Using Systems Factorial Technology and Logical-Rule Models

    ERIC Educational Resources Information Center

    Cheng, Xue Jun; McCarthy, Callum J.; Wang, Tony S. L.; Palmeri, Thomas J.; Little, Daniel R.

    2018-01-01

    Upright faces are thought to be processed more holistically than inverted faces. In the widely used composite face paradigm, holistic processing is inferred from interference in recognition performance from a to-be-ignored face half for upright and aligned faces compared with inverted or misaligned faces. We sought to characterize the nature of…

  5. Nondeterministic data base for computerized visual perception

    NASA Technical Reports Server (NTRS)

    Yakimovsky, Y.

    1976-01-01

    A description is given of the knowledge representation data base in the perception subsystem of the Mars robot vehicle prototype. Two types of information are stored. The first is generic information that represents general rules that are conformed to by structures in the expected environments. The second kind of information is a specific description of a structure, i.e., the properties and relations of objects in the specific case being analyzed. The generic knowledge is represented so that it can be applied to extract and infer the description of specific structures. The generic model of the rules is substantially a Bayesian representation of the statistics of the environment, which means it is geared to representation of nondeterministic rules relating properties of, and relations between, objects. The description of a specific structure is also nondeterministic in the sense that all properties and relations may take a range of values with an associated probability distribution.

  6. Clinical Outcome Prediction in Aneurysmal Subarachnoid Hemorrhage Using Bayesian Neural Networks with Fuzzy Logic Inferences

    PubMed Central

    Lo, Benjamin W. Y.; Macdonald, R. Loch; Baker, Andrew; Levine, Mitchell A. H.

    2013-01-01

    Objective. The novel clinical prediction approach of Bayesian neural networks with fuzzy logic inferences is created and applied to derive prognostic decision rules in cerebral aneurysmal subarachnoid hemorrhage (aSAH). Methods. The approach of Bayesian neural networks with fuzzy logic inferences was applied to data from five trials of Tirilazad for aneurysmal subarachnoid hemorrhage (3551 patients). Results. Bayesian meta-analyses of observational studies on aSAH prognostic factors gave generalizable posterior distributions of population mean log odds ratios (ORs). Similar trends were noted in Bayesian and linear regression ORs. Significant outcome predictors include normal motor response, cerebral infarction, history of myocardial infarction, cerebral edema, history of diabetes mellitus, fever on day 8, prior subarachnoid hemorrhage, admission angiographic vasospasm, neurological grade, intraventricular hemorrhage, ruptured aneurysm size, history of hypertension, vasospasm day, age and mean arterial pressure. Heteroscedasticity was present in the nontransformed dataset. Artificial neural networks found nonlinear relationships with 11 hidden variables in 1 layer, using the multilayer perceptron model. Fuzzy logic decision rules (centroid defuzzification technique) denoted cut-off points for poor prognosis at greater than 2.5 clusters. Discussion. This aSAH prognostic system makes use of existing knowledge, recognizes unknown areas, incorporates one's clinical reasoning, and compensates for uncertainty in prognostication. PMID:23690884

  7. Refining the Relationships among Historical Figures by Implementing Inference Rules in SWRL

    NASA Astrophysics Data System (ADS)

    Fajrin Ariyani, Nurul; Saralita, Madis; Sarwosri; Sarno, Riyanarto

    2018-03-01

    The biographies of historical figures are often fascinating. Their character, work, inventions, and personal life are sometimes all presented in a biography. The social and family relationships among historical figures are also of particular interest, especially for political figures, heroes, kings, or persons who have ruled a monarchy in the past. Some biographies can be found in Wikipedia as articles. Most of the social and family relationship content for these figures is incompletely depicted, because articles have various contributors and sources. Fortunately, the missing relatives of a person might reside in other figures' biographies on different pages. Each Wikipedia article has metadata representing the essential information of its content. By processing the metadata obtained from DBpedia and composing inference rules (in the form of an ontology) to identify relationship content, several new inferred facts can be generated that complement the existing relationships. This work proposes a methodology for finding missing relationships among historical figures using inference rules in an ontology. As a result, our method can present new facts about relationships that are absent from the existing Wikipedia articles.
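
    A toy version of such a rule, e.g. hasParent(?a,?b) ^ hasBrother(?b,?c) -> hasUncle(?a,?c) in SWRL-like syntax, applied to an in-memory triple set; the entities and predicate names below are hypothetical, not taken from DBpedia.

    ```python
    # Toy triple store; in practice these would come from DBpedia metadata.
    triples = {("personA", "hasParent", "personB"),
               ("personB", "hasBrother", "personC")}

    def apply_uncle_rule(kb):
        """hasParent(a, b) and hasBrother(b, c) imply hasUncle(a, c)."""
        inferred = set()
        for a, p, b in kb:
            if p != "hasParent":
                continue
            for b2, q, c in kb:
                if q == "hasBrother" and b2 == b:
                    inferred.add((a, "hasUncle", c))
        return inferred

    print(apply_uncle_rule(triples))   # a new fact absent from the articles
    ```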

  8. Comparative Analysis of Membership Function on Mamdani Fuzzy Inference System for Decision Making

    NASA Astrophysics Data System (ADS)

    Harliana, Putri; Rahim, Robbi

    2017-12-01

    A membership function is a curve that maps input data points to a membership value, or degree of membership, in the interval between 0 and 1. One way to get membership values is through a function approach. Several membership functions can be used in a Mamdani fuzzy inference system, including triangular, trapezoidal, singleton, sigmoid, and Gaussian functions. This paper discusses only three membership functions: triangular, trapezoidal and Gaussian. These three membership functions are compared to see the difference in parameter values and results obtained. The case study in this paper is the admission of students at a popular school. Three variables are used: the student's report score, IQ score and parents' income, from which the if-then rules are then created.
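
    The three membership functions under comparison are easy to state directly; the parameters below are illustrative, not the paper's fitted values.

    ```python
    import numpy as np

    def triangular(x, a, b, c):
        return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0, 1)

    def trapezoidal(x, a, b, c, d):
        return np.clip(np.minimum.reduce([(x - a) / (b - a),
                                          np.ones_like(x),
                                          (d - x) / (d - c)]), 0, 1)

    def gaussian(x, mean, sigma):
        return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

    x = np.linspace(0, 100, 5)          # e.g. a slice of an IQ-score universe
    print(triangular(x, 40, 70, 100))
    print(trapezoidal(x, 30, 50, 80, 100))
    print(gaussian(x, 70.0, 15.0))
    ```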

  9. Rule groupings: A software engineering approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Currently, most expert system shells do not address software engineering issues for developing or maintaining expert systems. As a result, large expert systems tend to be incomprehensible, difficult to debug or modify and almost impossible to verify or validate. Partitioning rule based systems into rule groups which reflect the underlying subdomains of the problem should enhance the comprehensibility, maintainability, and reliability of expert system software. Attempts were made to semiautomatically structure a CLIPS rule base into groups of related rules that carry the same type of information. Different distance metrics that capture relevant information from the rules for grouping are discussed. Two clustering algorithms that partition the rule base into groups of related rules are given. Two independent evaluation criteria are developed to measure the effectiveness of the grouping strategies. Results of the experiment with three sample rule bases are presented.
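
    One plausible distance metric and grouping step, assumed here for illustration (the paper's own metrics differ in detail): Jaccard distance over the fact templates each rule references, with a greedy single-link grouping pass.

    ```python
    # Jaccard distance between the sets of fact templates two rules reference.
    def jaccard(r1, r2):
        return 1 - len(r1 & r2) / len(r1 | r2)

    rules = {"r1": {"valve", "pressure"},      # hypothetical CLIPS rules,
             "r2": {"pressure", "alarm"},      # named by the facts they touch
             "r3": {"telemetry", "downlink"}}

    # Greedy single-link grouping under a distance threshold.
    groups = []
    for name, templ in rules.items():
        for g in groups:
            if any(jaccard(templ, rules[m]) < 0.8 for m in g):
                g.append(name)
                break
        else:
            groups.append([name])
    print(groups)   # r1 and r2 share 'pressure'; r3 stands alone
    ```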

  10. Research in Natural Language Understanding

    DTIC Science & Technology

    1979-08-31

    Table-of-contents fragments: plan-related inference procedures; speech acts and attached procedures; the system's model of its display actions; a taxonomy of display actions; the display scope after sentence 3. At various points the system is in a state where it needs to determine which of a large number of possible rules of action are applicable to its

  11. Automated revision of CLIPS rule-bases

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick M.; Pazzani, Michael J.

    1994-01-01

    This paper describes CLIPS-R, a theory revision system for the revision of CLIPS rule-bases. CLIPS-R may be used for a variety of knowledge-base revision tasks, such as refining a prototype system, adapting an existing system to slightly different operating conditions, or improving an operational system that makes occasional errors. We present a description of how CLIPS-R revises rule-bases, and an evaluation of the system on three rule-bases.

  12. Design and implementation of fuzzy logic controllers. Thesis Final Report, 27 Jul. 1992 - 1 Jan. 1993

    NASA Technical Reports Server (NTRS)

    Abihana, Osama A.; Gonzalez, Oscar R.

    1993-01-01

    The main objectives of our research are to present a self-contained overview of fuzzy sets and fuzzy logic, develop a methodology for control system design using fuzzy logic controllers, and to design and implement a fuzzy logic controller for a real system. We first present the fundamental concepts of fuzzy sets and fuzzy logic. Fuzzy sets and basic fuzzy operations are defined. In addition, for control systems, it is important to understand the concepts of linguistic values, term sets, fuzzy rule base, inference methods, and defuzzification methods. Second, we introduce a four-step fuzzy logic control system design procedure. The design procedure is illustrated via four examples, showing the capabilities and robustness of fuzzy logic control systems. This is followed by a tuning procedure that we developed from our design experience. Third, we present two Lyapunov based techniques for stability analysis. Finally, we present our design and implementation of a fuzzy logic controller for a linear actuator to be used to control the direction of the Free Flight Rotorcraft Research Vehicle at LaRC.

  13. Inference for multivariate regression model based on multiply imputed synthetic data generated via posterior predictive sampling

    NASA Astrophysics Data System (ADS)

    Moura, Ricardo; Sinha, Bimal; Coelho, Carlos A.

    2017-06-01

    The recent popularity of the use of synthetic data as a Statistical Disclosure Control technique has enabled the development of several methods for generating and analyzing such data, but these almost always rely on asymptotic distributions and are consequently not adequate for small-sample datasets. Thus, a likelihood-based exact inference procedure is derived for the matrix of regression coefficients of the multivariate regression model, for multiply imputed synthetic data generated via Posterior Predictive Sampling. Since it is based on exact distributions, this procedure may be used even with small-sample datasets. Simulation studies compare the results obtained from the proposed exact inferential procedure with the results obtained from an adaptation of Reiter's combination rule to multiply imputed synthetic datasets, and an application to the 2000 Current Population Survey is discussed.

  14. An evaluation and implementation of rule-based Home Energy Management System using the Rete algorithm.

    PubMed

    Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko

    2014-01-01

    In recent years, sensors have become popular, and the Home Energy Management System (HEMS) takes an important role in saving energy without a decrease in QoL (quality of life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern matching algorithm for IF-THEN rules. We have proposed a rule-based HEMS using the Rete algorithm. In the proposed system, rules for managing energy are processed by networked smart taps, so that the loads for processing rules and collecting data are distributed across the smart taps. In addition, the number of processing steps and the amount of collected data are reduced by processing rules based on the Rete algorithm. In this paper, we evaluated the proposed system by simulation. In the simulation environment, each rule is processed by the smart tap that relates to the action part of that rule. We also implemented the proposed system as a HEMS using smart taps.

  15. New developments of a knowledge based system (VEG) for inferring vegetation characteristics

    NASA Technical Reports Server (NTRS)

    Kimes, D. S.; Harrison, P. A.; Harrison, P. R.

    1992-01-01

    An extraction technique for inferring physical and biological surface properties of vegetation using nadir and/or directional reflectance data as input has been developed. A knowledge-based system (VEG) accepts spectral data of an unknown target as input, determines the best strategy for inferring the desired vegetation characteristic, applies the strategy to the target data, and provides a rigorous estimate of the accuracy of the inference. Progress in developing the system is presented. VEG combines methods from remote sensing and artificial intelligence, and integrates input spectral measurements with diverse knowledge bases. VEG has been developed to (1) infer spectral hemispherical reflectance from any combination of nadir and/or off-nadir view angles; (2) test and develop new extraction techniques on an internal spectral database; (3) browse, plot, or analyze directional reflectance data in the system's spectral database; (4) discriminate between user-defined vegetation classes using spectral and directional reflectance relationships; and (5) infer unknown view angles from known view angles (known as view angle extension).

  16. SCL: An off-the-shelf system for spacecraft control

    NASA Technical Reports Server (NTRS)

    Buckley, Brian; Vangaasbeck, James

    1994-01-01

    In this age of shrinking military, civil, and commercial space budgets, an off-the-shelf solution is needed to provide a multimission approach to spacecraft control. A standard operational interface which can be applied to multiple spacecraft allows a common approach to ground and space operations. A trend for many space programs has been to reduce operational staff by applying autonomy to the spacecraft and to the ground stations. The Spacecraft Command Language (SCL) system developed by Interface and Control Systems, Inc. (ICS) provides an off-the-shelf solution for spacecraft operations. The SCL system is designed to provide a hyper-scripting interface which remains standard from program to program. The spacecraft and ground station hardware specifics are isolated to provide the maximum amount of portability from system to system. Uplink and downlink interfaces are also isolated to allow the system to perform independent of the communications protocols chosen. The SCL system can be used for both the ground stations and the spacecraft, or as a value added package for existing ground station environments. The SCL system provides an expanded stored commanding capability as well as a rule-based expert system on-board. The expert system allows reactive control on-board the spacecraft for functions such as electrical power systems (EPS), thermal control, etc. which have traditionally been performed on the ground. The SCL rule and scripting capability share a common syntax allowing control of scripts from rules and rules from scripts. Rather than telemeter over sampled data to the ground, the SCL system maintains a database on-board which is available for interrogation by the scripts and rules. The SCL knowledge base is constructed on the ground and uploaded to the spacecraft. The SCL system follows an open-systems approach allowing other tasks to communicate with SCL on the ground and in space. The SCL system was used on the Clementine program (launched January 25, 1994) and is required to have bidirectional communications with the guidance, navigation, and control (GNC) algorithms which were written as another task. Sequencing of the spacecraft maneuvers are handled by SCL, but the low-level thruster pulse commands are handled by the GNC software. Attitude information is reported back as telemetry, allowing the SCL expert system to inference on the changing data. The Clementine SCL flight software was largely reused from another Naval Center for Space Technology (NCST) satellite program. This paper details the SCL architecture and how an off-the-shelf solution makes sense for multimission spacecraft programs. The Clementine mission will be used as a case study in the application of the SCL to a 'fast track' program. The benefits of such a system in a 'better, cheaper, faster' climate will be discussed.

  17. Simple explanations and reasoning: From philosophy of science to expert systems

    NASA Technical Reports Server (NTRS)

    Rochowiak, Daniel

    1988-01-01

    A preliminary prototype of a simple explanation system was constructed. Although the system, based on the idea of storytelling, did not incorporate all of the principles of simple explanation, it did demonstrate the potential of the approach. The system incorporated a hypertext system, an inference engine, and facilities for constructing contrast type explanations. The continued development of such a system should prove to be valuable. By extending the resources of the expert system paradigm, the knowledge engineer is not forced to learn a new set of skills, and the domain knowledge already acquired by him is not lost. Further, both the beginning user and the more advanced user can be accommodated. For the beginning user, corrective explanations and ES explanations provide facilities for more clearly understanding the way in which the system is functioning. For the more advanced user, the instance and state explanations allow him to focus on the issues at hand. The simple model of explanation attempts to exploit and show how the why and how facilities of the expert system paradigm can be extended by attending to the pragmatics of explanation and adding texture to the ordinary pattern of reasoning in a rule based system.

  18. An object oriented generic controller using CLIPS

    NASA Technical Reports Server (NTRS)

    Nivens, Cody R.

    1990-01-01

    In today's applications, the need for the division of code and data has focused attention on the growth of object oriented programming. This philosophy gives software engineers greater control over the environment of an application. Yet the use of object oriented design does not remove the need for the application to understand what the controller is doing; such understanding is only possible by using expert systems. Providing a controller that is capable of controlling an object by using rule-based expertise expedites the use of both object oriented design and expert knowledge of the dynamics of an environment in modern controllers. This project presents a model of a controller that uses the CLIPS expert system and objects in C++ to create a generic controller. The polymorphic abilities of C++ allow for the design of a generic component stored in individual data files. Accompanying the component is a set of rules written in CLIPS which provide the following: the control of individual components, the input of sensory data from components, and the ability to find the status of a given component. Along with the data describing the application, a set of inference rules written in CLIPS allows the application to make use of sensory facts and of status and control abilities. As a demonstration of this ability, control of the environment of a house is provided. This demonstration includes the data files describing the rooms and their contents, such as devices, windows and doors. The rules for the home address the flow of people in the house and the control of devices by the homeowner.

  19. Representing and computing regular languages on massively parallel networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miller, M.I.; O'Sullivan, J.A.; Boysam, B.

    1991-01-01

    This paper proposes a general method for incorporating rule-based constraints corresponding to regular languages into stochastic inference problems, thereby allowing for a unified representation of stochastic and syntactic pattern constraints. The authors' approach first establishes the formal connection of rules to Chomsky grammars, and generalizes the original work of Shannon on the encoding of rule-based channel sequences to Markov chains of maximum entropy. This maximum entropy probabilistic view leads to Gibbs representations with potentials whose number of minima grows at precisely the exponential rate at which the language of deterministically constrained sequences grows. These representations are coupled to stochastic diffusion algorithms, which sample the language-constrained sequences by visiting the energy minima according to the underlying Gibbs probability law. The coupling to stochastic search methods yields the all-important practical result that fully parallel stochastic cellular automata may be derived to generate samples from the rule-based constraint sets. The production rules and neighborhood state structure of the language of sequences directly determine the necessary connection structures of the required parallel computing surface. Representations of this type have been mapped to the DAP-510 massively parallel processor, consisting of 1024 mesh-connected bit-serial processing elements, for performing automated segmentation of electron-micrograph images.

  1. Behavioral personal digital assistants: The seventh generation of computing

    PubMed Central

    Stephens, Kenneth R.; Hutchison, William R.

    1992-01-01

    Skinner (1985) described two divergent approaches to developing computer systems that would behave with some approximation to intelligence. The first approach, which corresponds to the mainstream of artificial intelligence and expert systems, models intelligence as a set of production rules that incorporate knowledge and a set of heuristics for inference and symbol manipulation. The alternative is a system that models the behavioral repertoire as a network of associations between antecedent stimuli and operants, and adapts when supplied with reinforcement. The latter approach is consistent with developments in the field of “neural networks.” The authors describe how an existing adaptive network software system, based on behavior analysis and developed since 1983, can be extended to provide a new generation of software systems capable of acquiring verbal behavior. This effort will require the collaboration of the academic and commercial sectors of the behavioral community, but the end result will enable a generational change in computer systems and support for behavior analytic concepts. PMID:22477053

  2. Bridging the gap between formal and experience-based knowledge for context-aware laparoscopy.

    PubMed

    Katić, Darko; Schuck, Jürgen; Wekerle, Anna-Laura; Kenngott, Hannes; Müller-Stich, Beat Peter; Dillmann, Rüdiger; Speidel, Stefanie

    2016-06-01

    Computer assistance is increasingly common in surgery. However, the amount of information is bound to overload the processing abilities of surgeons. We propose methods to recognize the current phase of a surgery for context-aware information filtering. The purpose is to select the most suitable subset of information for surgical situations which require special assistance. We combine formal knowledge, represented by an ontology, and experience-based knowledge, represented by training samples, to recognize phases. For this purpose, we have developed two different methods. Firstly, we use formal knowledge about possible phase transitions to create a composition of random forests. Secondly, we propose a method based on cultural optimization to infer formal rules from experience to recognize phases. The proposed methods are compared with a purely formal knowledge-based approach using rules and a purely experience-based one using regular random forests. The comparative evaluation on laparoscopic pancreas resections and adrenalectomies employs a consistent set of quality criteria on clean and noisy input. The rule-based approaches proved best with noise-free data. The random forest-based ones were more robust in the presence of noise. Formal and experience-based knowledge can be successfully combined for robust phase recognition.

  3. Heart health risk assessment system: a nonintrusive proposal using ontologies and expert rules.

    PubMed

    Garcia-Valverde, Teresa; Muñoz, Andrés; Arcas, Francisco; Bueno-Crespo, Andrés; Caballero, Alberto

    2014-01-01

    According to the World Health Organization, the world's leading cause of death is heart disease, with nearly two million deaths per year. Although some factors are not possible to change, there are some keys that help to prevent heart diseases. One of the most important keys is to keep an active daily life, with moderate exercise. However, deciding what a moderate exercise is, or when a slightly abnormal heart rate value is a risk, depends on the person and the activity. In this paper we propose a context-aware system that is able to determine the activity the person is performing in an unobtrusive way. Then, we have defined an ontology to represent the available knowledge about the person (biometric data, fitness status, medical information, etc.) and her current activity (level of intensity, heart rate recommended for that activity, etc.). With such knowledge, a set of expert rules based on this ontology is involved in a reasoning process to infer levels of alerts or suggestions for the users when the intensity of the activity is detected as dangerous for her health. We show how this approach can be accomplished by using only everyday devices such as a smartphone and a smartwatch.

  4. Abrasive slurry jet cutting model based on fuzzy relations

    NASA Astrophysics Data System (ADS)

    Qiang, C. H.; Guo, C. W.

    2017-12-01

    The cutting process of pre-mixed abrasive slurry or suspension jet (ASJ) is a complex process affected by many factors, and there is a highly nonlinear relationship between the cutting parameters and cutting quality. In this paper, guided by fuzzy theory, the fuzzy cutting model of ASJ was developed. In the modeling of surface roughness, the upper surface roughness prediction model and the lower surface roughness prediction model were established respectively. The adaptive fuzzy inference system combines the learning mechanism of neural networks and the linguistic reasoning ability of the fuzzy system, membership functions, and fuzzy rules are obtained by adaptive adjustment. Therefore, the modeling process is fast and effective. In this paper, the ANFIS module of MATLAB fuzzy logic toolbox was used to establish the fuzzy cutting model of ASJ, which is found to be quite instrumental to ASJ cutting applications.

  5. Spiking neuron network Helmholtz machine.

    PubMed

    Sountsov, Pavel; Miller, Paul

    2015-01-01

    An increasing amount of behavioral and neurophysiological data suggests that the brain performs optimal (or near-optimal) probabilistic inference and learning during perception and other tasks. Although many machine learning algorithms exist that perform inference and learning in an optimal way, the complete description of how one of those algorithms (or a novel algorithm) can be implemented in the brain is currently incomplete. There have been many proposed solutions that address how neurons can perform optimal inference but the question of how synaptic plasticity can implement optimal learning is rarely addressed. This paper aims to unify the two fields of probabilistic inference and synaptic plasticity by using a neuronal network of realistic model spiking neurons to implement a well-studied computational model called the Helmholtz Machine. The Helmholtz Machine is amenable to neural implementation as the algorithm it uses to learn its parameters, called the wake-sleep algorithm, uses a local delta learning rule. Our spiking-neuron network implements both the delta rule and a small example of a Helmholtz machine. This neuronal network can learn an internal model of continuous-valued training data sets without supervision. The network can also perform inference on the learned internal models. We show how various biophysical features of the neural implementation constrain the parameters of the wake-sleep algorithm, such as the duration of the wake and sleep phases of learning and the minimal sample duration. We examine the deviations from optimal performance and tie them to the properties of the synaptic plasticity rule.
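
    A minimal one-hidden-layer Helmholtz machine showing the local delta rules of the wake-sleep algorithm; this is the abstract computational model, not the paper's spiking-neuron implementation, and the uniform sleep-phase prior, sizes, learning rate and random training data are simplifying assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
    sample = lambda p: (rng.random(p.shape) < p).astype(float)

    n_v, n_h, lr = 8, 4, 0.05
    R = rng.normal(0, 0.1, (n_h, n_v))     # recognition weights, v -> h
    G = rng.normal(0, 0.1, (n_v, n_h))     # generative weights,  h -> v

    def wake_sleep_step(v):
        global R, G
        # Wake: recognize h from data, improve the generative model with the
        # local delta rule  dG ~ (v - p(v | h)) h^T.
        h = sample(sigmoid(R @ v))
        G += lr * np.outer(v - sigmoid(G @ h), h)
        # Sleep: dream (h', v') from the generative model, improve recognition
        # with the mirror-image delta rule  dR ~ (h' - p(h | v')) v'^T.
        h_d = sample(np.full(n_h, 0.5))    # simplified prior over hidden causes
        v_d = sample(sigmoid(G @ h_d))
        R += lr * np.outer(h_d - sigmoid(R @ v_d), v_d)

    for _ in range(2000):
        wake_sleep_step(sample(np.full(n_v, 0.3)))
    ```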

  6. A fuzzy classifier system for process control

    NASA Technical Reports Server (NTRS)

    Karr, C. L.; Phillips, J. C.

    1994-01-01

    A fuzzy classifier system that discovers rules for controlling a mathematical model of a pH titration system was developed by researchers at the U.S. Bureau of Mines (USBM). Fuzzy classifier systems successfully combine the strengths of learning classifier systems and fuzzy logic controllers. Learning classifier systems resemble familiar production rule-based systems, but they represent their IF-THEN rules by strings of characters rather than in the traditional linguistic terms. Fuzzy logic is a tool that allows for the incorporation of abstract concepts into rule based-systems, thereby allowing the rules to resemble the familiar 'rules-of-thumb' commonly used by humans when solving difficult process control and reasoning problems. Like learning classifier systems, fuzzy classifier systems employ a genetic algorithm to explore and sample new rules for manipulating the problem environment. Like fuzzy logic controllers, fuzzy classifier systems encapsulate knowledge in the form of production rules. The results presented in this paper demonstrate the ability of fuzzy classifier systems to generate a fuzzy logic-based process control system.

  7. Reducing the Conflict Factors Strategies in Question Answering System

    NASA Astrophysics Data System (ADS)

    Suwarningsih, W.; Purwarianti, A.; Supriana, I.

    2017-03-01

    A rule-based system is prone to conflict, as new knowledge continually emerges and must be added to the knowledge base used by the system. A conflict between rules in the knowledge base can lead to reasoning errors or circular reasoning. Newly added rules can therefore conflict with existing rules, and only rules that are genuinely conflict-free can be added to the knowledge base. From these conditions, this paper proposes a conflict resolution strategy for a medical debriefing system that analyzes scenarios at runtime to improve the efficiency and reliability of the system.

  8. Data-driven backward chaining

    NASA Technical Reports Server (NTRS)

    Haley, Paul

    1991-01-01

    The C Language Integrated Production System (CLIPS) cannot effectively perform sound and complete logical inference in most real-world contexts. The problem facing CLIPS is its lack of goal generation. Without automatic goal generation and maintenance, forward chaining can only deduce all instances of a relationship. Backward chaining, which requires goal generation, allows deduction of only that subset of what is logically true which is also relevant to ongoing problem solving. Goal generation can be mimicked in simple cases using forward chaining. However, such mimicry requires manual coding of additional rules which assert an inadequate goal representation for every condition in every rule that can have corresponding facts derived by backward chaining. In general, for N rules with an average of M conditions per rule, the number of goal generation rules required is on the order of N*M. This is clearly intractable from a program maintenance perspective. We describe the support in Eclipse for backward chaining on goals which it automatically asserts as it checks rule conditions. Important characteristics of this extension are that it does not assert goals which cannot match any rule conditions, that two equivalent goals are never asserted, and that goals persist as long as, but no longer than, they remain relevant.
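
    A compact sketch of goal-directed backward chaining, in which subgoals are generated only when a rule needs them, illustrating why no separate N*M goal-generation rules are required; this is generic textbook machinery, not the Eclipse implementation.

    ```python
    def unify(pat, fact, env):
        """Bind ?-variables in pat so that it matches fact; None on failure."""
        env = dict(env)
        for p, f in zip(pat, fact):
            if p.startswith("?"):
                if env.get(p, f) != f:
                    return None
                env[p] = f
            elif p != f:
                return None
        return env

    def subst(pat, env):
        return tuple(env.get(t, t) for t in pat)

    def prove(goal, facts, rules, env=None):
        """Backward chaining: subgoals arise only while checking rule bodies."""
        env = env or {}
        g = subst(goal, env)
        for fact in facts:                 # the goal may already be a known fact
            e = unify(g, fact, env)
            if e is not None:
                yield e
        for head, body in rules:           # or it may be reducible via a rule
            e = unify(head, g, {})
            if e is None:
                continue
            def prove_all(conds, e):
                if not conds:
                    yield e
                else:
                    for e2 in prove(conds[0], facts, rules, e):
                        yield from prove_all(conds[1:], e2)
            yield from prove_all(body, e)

    facts = {("parent", "ann", "bob"), ("parent", "bob", "cy")}
    rules = [(("grandparent", "?x", "?z"),
              [("parent", "?x", "?y"), ("parent", "?y", "?z")])]
    print(next(prove(("grandparent", "ann", "?who"), facts, rules)))
    ```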

  9. Rule-based simulation models

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Seraphine, Kathleen M.

    1991-01-01

    Procedural modeling systems, rule based modeling systems, and a method for converting a procedural model to a rule based model are described. Simulation models are used to represent real time engineering systems. A real time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling system languages are based on FORTRAN or some other procedural language. Therefore, they must be enhanced with a reaction capability. Rule based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule based system can be generated by a knowledge acquisition tool or a source level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table, and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule based system. Neural models can provide the high capacity data manipulation required by the most complex real time models.

  12. On Decision-Making Among Multiple Rule-Bases in Fuzzy Control Systems

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward; Jamshidi, Mo

    1997-01-01

    Intelligent control of complex multi-variable systems can be a challenge for single fuzzy rule-based controllers. This class of problems can often be managed with less difficulty by distributing intelligent decision-making amongst a collection of rule-bases. Such an approach requires that a mechanism be chosen to ensure goal-oriented interaction between the multiple rule-bases. In this paper, a hierarchical rule-based approach is described. Decision-making mechanisms based on generalized concepts from single-rule-based fuzzy control are presented. Finally, the effects of different aggregation operators on multi-rule-base decision-making are examined in a navigation control problem for mobile robots.
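
    A hedged illustration of the aggregation question examined above: two rule-bases (say, goal seeking and obstacle avoidance) each emit an activation for the same action, and the choice of aggregation operator changes the combined decision. The activation values and operator set below are invented.

      # Sketch: aggregating activations from two fuzzy rule-bases with
      # different t-norms/s-norms; values are illustrative, not from the paper.

      goal_seek = 0.8   # "turn left" activation from the goal-seeking rule-base
      avoidance = 0.3   # "turn left" activation from the avoidance rule-base

      aggregators = {
          "min (t-norm)":     min(goal_seek, avoidance),
          "product (t-norm)": goal_seek * avoidance,
          "max (s-norm)":     max(goal_seek, avoidance),
      }
      for name, value in aggregators.items():
          print(f"{name:17s} -> aggregated activation {value:.2f}")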

  13. Excellent approach to modeling urban expansion by fuzzy cellular automata: agent base model

    NASA Astrophysics Data System (ADS)

    Khajavigodellou, Yousef; Alesheikh, Ali A.; Mohammed, Abdulrazak A. S.; Chapi, Kamran

    2014-09-01

    Recently, the interaction between humans and their environment has become one of the important challenges in the world. Land use/cover change (LUCC) is a complex process that includes actors and factors at different social and spatial levels. The complexity and dynamics of urban systems make the practice of urban modeling very difficult. With increased computational power and the greater availability of spatial data, micro-simulation approaches such as agent-based and cellular automata methods have been developed by geographers, planners, and scholars, and have shown great potential for representing and simulating the complexity of the dynamic processes involved in urban growth and land use change. This paper presents fuzzy cellular automata, combined with geospatial information systems and remote sensing, to simulate and predict urban expansion patterns. These FCA-based dynamic spatial urban models provide an improved ability to forecast and assess future urban growth and to create planning scenarios, allowing us to explore the potential impacts of simulations that correspond to urban planning and management policies. In this fuzzy-inference-guided cellular automata approach, semantic or linguistic knowledge about land use change is expressed as fuzzy rules, based on which fuzzy inference is applied to determine the urban development potential for each pixel (see the sketch below). The model integrates an ABM (agent-based model) and FCA (fuzzy cellular automata) to investigate a complex decision-making process and future urban dynamics. Based on this model, rapid development and green-land protection under the influence of the behaviors and decision modes of regional authority agents, real estate developer agents, resident agents, and non-resident agents and their interactions have been applied to predict the future development patterns of the Erbil metropolitan region.
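
    As a hedged sketch of the fuzzy-rule-guided CA step (not the paper's calibrated model), the fragment below scores each pixel's development potential from the fraction of developed neighbors, with a simple piecewise-linear membership standing in for full fuzzy inference; the grid, membership breakpoint, and threshold are invented.

      import numpy as np

      # Toy FCA step: a pixel develops when its "high neighbor density"
      # membership exceeds a threshold. All parameters are made up.
      rng = np.random.default_rng(0)
      urban = rng.random((50, 50)) < 0.1          # initial urban pixels

      def step(urban, threshold=0.6):
          # developed-neighbor fraction in the 3x3 Moore neighborhood
          padded = np.pad(urban.astype(float), 1)
          neigh = sum(np.roll(np.roll(padded, i, 0), j, 1)
                      for i in (-1, 0, 1) for j in (-1, 0, 1)) - padded
          frac = neigh[1:-1, 1:-1] / 8.0
          potential = np.clip(frac / 0.5, 0, 1)   # membership ramps up to 1
          return urban | (potential > threshold)

      urban = step(urban)
      print("urban pixels after one step:", int(urban.sum()))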

  14. Challenges for Rule Systems on the Web

    NASA Astrophysics Data System (ADS)

    Hu, Yuh-Jong; Yeh, Ching-Long; Laun, Wolfgang

    The RuleML Challenge started in 2007 with the objective of highlighting implementation issues in the management, integration, interoperation and interchange of rules in an open distributed environment, such as the Web. Rules are usually classified into three types: deductive rules, normative rules, and reactive rules. Reactive rules are further classified into ECA rules and production rules. The study of combining rules and ontologies traces back to earlier active rule systems for relational and object-oriented (OO) databases. Recently, this issue has become one of the most important research problems in the Semantic Web. Once we consider a computer-executable policy as a declarative set of rules and ontologies that guides the behavior of entities within a system, we have a flexible way to implement real-world policies without rewriting the computer code, as we did before. Fortunately, we have de facto rule markup languages, such as RuleML or RIF, to achieve the portability and interchange of rules between different rule systems. Otherwise, executing real-life rule-based applications on the Web would be almost impossible. Several commercial or open source rule engines are available for rule-based applications. However, we still need a standard rule language and benchmark, not only to compare rule systems but also to measure progress in the field. Finally, a number of real-life rule-based use cases will be investigated to demonstrate the applicability of current rule systems on the Web.

  15. Semantics-based plausible reasoning to extend the knowledge coverage of medical knowledge bases for improved clinical decision support.

    PubMed

    Mohammadhassanzadeh, Hossein; Van Woensel, William; Abidi, Samina Raza; Abidi, Syed Sibte Raza

    2017-01-01

    Capturing complete medical knowledge is challenging, often due to incomplete patient Electronic Health Records (EHR), but also because of valuable, tacit medical knowledge hidden away in physicians' experiences. To extend the coverage of incomplete medical knowledge-based systems beyond their deductive closure, and thus enhance their decision-support capabilities, we argue that innovative, multi-strategy reasoning approaches should be applied. In particular, plausible reasoning mechanisms apply patterns from human thought processes, such as generalization, similarity and interpolation, based on attributional, hierarchical, and relational knowledge. Plausible reasoning mechanisms include inductive reasoning, which generalizes the commonalities among the data to induce new rules, and analogical reasoning, which is guided by data similarities to infer new facts. By further leveraging rich, biomedical Semantic Web ontologies to represent medical knowledge, both known and tentative, we increase the accuracy and expressivity of plausible reasoning, and cope with issues such as data heterogeneity, inconsistency and interoperability. In this paper, we present a Semantic Web-based, multi-strategy reasoning approach, which integrates deductive and plausible reasoning and exploits Semantic Web technology to solve complex clinical decision support queries. We evaluated our system using a real-world medical dataset of patients with hepatitis, from which we randomly removed different percentages of data (5%, 10%, 15%, and 20%) to reflect scenarios with increasing amounts of incomplete medical knowledge. To increase the reliability of the results, we generated 5 independent datasets for each percentage of missing values, which resulted in 20 experimental datasets (in addition to the original dataset). The results show that plausibly inferred knowledge extends the coverage of the knowledge base by, on average, 2%, 7%, 12%, and 16% for datasets with, respectively, 5%, 10%, 15%, and 20% of missing values. This expansion in KB coverage allowed solving complex disease diagnostic queries that were previously unresolvable, without losing the correctness of the answers. However, compared to deductive reasoning, data-intensive plausible reasoning mechanisms incur a significant performance overhead. First, we observed that plausible reasoning approaches, by generating tentative inferences and leveraging domain knowledge of experts, allow us to extend the coverage of medical knowledge bases, resulting in improved clinical decision support. Second, by leveraging OWL ontological knowledge, we are able to increase the expressivity and accuracy of plausible reasoning methods. Third, our approach is applicable to clinical decision support systems for a range of chronic diseases.

  16. Modularising ontology and designing inference patterns to personalise health condition assessment: the case of obesity.

    PubMed

    Sojic, Aleksandra; Terkaj, Walter; Contini, Giorgia; Sacco, Marco

    2016-05-04

    The public health initiatives for obesity prevention are increasingly exploiting the advantages of smart technologies that can register various kinds of data related to physical, physiological, and behavioural conditions. Since individual features and habits vary among people, the design of appropriate intervention strategies for motivating changes in behavioural patterns towards a healthy lifestyle requires the interpretation and integration of collected information, while considering individual profiles in a personalised manner. The ontology-based modelling is recognised as a promising approach in facing the interoperability and integration of heterogeneous information related to characterisation of personal profiles. The presented ontology captures individual profiles across several obesity-related knowledge-domains structured into dedicated modules in order to support inference about health condition, physical features, behavioural habits associated with a person, and relevant changes over time. The modularisation strategy is designed to facilitate ontology development, maintenance, and reuse. The domain-specific modules formalised in the Web Ontology Language (OWL) integrate the domain-specific sets of rules formalised in the Semantic Web Rule Language (SWRL). The inference rules follow a modelling pattern designed to support personalised assessment of health condition as age- and gender-specific. The test cases exemplify a personalised assessment of the obesity-related health conditions for the population of teenagers. The paper addresses several issues concerning the modelling of normative concepts related to obesity and depicts how the public health concern impacts classification of teenagers according to their phenotypes. The modelling choices regarding the ontology-structure are explained in the context of the modelling goal to integrate multiple knowledge-domains and support reasoning about the individual changes over time. The presented modularisation pattern enhances reusability of the domain-specific modules across various health care domains.

  17. Age-Related Brain Activation Changes during Rule Repetition in Word-Matching.

    PubMed

    Methqal, Ikram; Pinsard, Basile; Amiri, Mahnoush; Wilson, Maximiliano A; Monchi, Oury; Provost, Jean-Sebastien; Joanette, Yves

    2017-01-01

    Objective: The purpose of this study was to explore the age-related brain activation changes during a word-matching semantic-category-based task, which required either repeating or changing a semantic rule to be applied. In order to do so, a word-semantic rule-based task was adapted from the Wisconsin Sorting Card Test, involving the repeated feedback-driven selection of given pairs of words based on semantic category-based criteria. Method: Forty healthy adults (20 younger and 20 older) performed a word-matching task while undergoing a fMRI scan in which they were required to pair a target word with another word from a group of three words. The required pairing is based on three word-pair semantic rules which correspond to different levels of semantic control demands: functional relatedness, moderately typical-relatedness (which were considered as low control demands), and atypical-relatedness (high control demands). The sorting period consisted of a continuous execution of the same sorting rule and an inferred trial-by-trial feedback was given. Results: Behavioral performance revealed increases in response times and decreases of correct responses according to the level of semantic control demands (functional vs. typical vs. atypical) for both age groups (younger and older) reflecting graded differences in the repetition of the application of a given semantic rule. Neuroimaging findings of significant brain activation showed two main results: (1) Greater task-related activation changes for the repetition of the application of atypical rules relative to typical and functional rules, and (2) Changes (older > younger) in the inferior prefrontal regions for functional rules and more extensive and bilateral activations for typical and atypical rules. Regarding the inter-semantic rules comparison, only task-related activation differences were observed for functional > typical (e.g., inferior parietal and temporal regions bilaterally) and atypical > typical (e.g., prefrontal, inferior parietal, posterior temporal, and subcortical regions). Conclusion: These results suggest that healthy cognitive aging relies on the adaptive changes of inferior prefrontal resources involved in the repetitive execution of semantic rules, thus reflecting graded differences in support of task demands.

  18. Dopamine reward prediction errors reflect hidden state inference across time

    PubMed Central

    Starkweather, Clara Kwon; Babayan, Benedicte M.; Uchida, Naoshige; Gershman, Samuel J.

    2017-01-01

    Midbrain dopamine neurons signal reward prediction error (RPE), or actual minus expected reward. The temporal difference (TD) learning model has been a cornerstone in understanding how dopamine RPEs could drive associative learning. Classically, TD learning imparts value to features that serially track elapsed time relative to observable stimuli. In the real world, however, sensory stimuli provide ambiguous information about the hidden state of the environment, leading to the proposal that TD learning might instead compute a value signal based on an inferred distribution of hidden states (a ‘belief state’). In this work, we asked whether dopaminergic signaling supports a TD learning framework that operates over hidden states. We found that dopamine signaling exhibited a striking difference between two tasks that differed only with respect to whether reward was delivered deterministically. Our results favor an associative learning rule that combines cached values with hidden state inference. PMID:28263301

  19. Mapping the ecological networks of microbial communities.

    PubMed

    Xiao, Yandong; Angulo, Marco Tulio; Friedman, Jonathan; Waldor, Matthew K; Weiss, Scott T; Liu, Yang-Yu

    2017-12-11

    Mapping the ecological networks of microbial communities is a necessary step toward understanding their assembly rules and predicting their temporal behavior. However, existing methods require assuming a particular population dynamics model, which is not known a priori. Moreover, those methods require fitting longitudinal abundance data, which are often not informative enough for reliable inference. To overcome these limitations, here we develop a new method based on steady-state abundance data. Our method can infer the network topology and inter-taxa interaction types without assuming any particular population dynamics model. Additionally, when the population dynamics is assumed to follow the classic Generalized Lotka-Volterra model, our method can infer the inter-taxa interaction strengths and intrinsic growth rates. We systematically validate our method using simulated data, and then apply it to four experimental data sets. Our method represents a key step towards reliable modeling of complex, real-world microbial communities, such as the human gut microbiota.
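
    A toy sketch of the Generalized Lotka-Volterra variant mentioned above (not the authors' algorithm): at a steady state with taxon i present, 0 = r_i + sum_j a_ij x_j, so under the normalization a_ii = -1 each taxon's interaction row follows from a linear regression over steady-state samples. All parameters below are synthetic.

      import numpy as np

      # Synthetic GLV community with self-limitation a_ii = -1 (assumed).
      rng = np.random.default_rng(1)
      n = 4
      A_true = rng.normal(0.0, 0.3, (n, n))
      np.fill_diagonal(A_true, -1.0)
      r_true = rng.uniform(0.5, 1.5, n)

      # Steady states of random sub-communities: A_SS x_S = -r_S.
      states = []
      while len(states) < 12:
          S = np.flatnonzero(rng.random(n) < 0.8)
          if S.size == 0:
              continue
          xs = np.linalg.solve(A_true[np.ix_(S, S)], -r_true[S])
          if np.all(xs > 0):                  # keep feasible (positive) states
              x = np.zeros(n)
              x[S] = xs
              states.append(x)
      X = np.array(states)

      # Infer row i: with a_ii = -1, x_i = sum_{j != i} a_ij x_j + r_i.
      i = 0
      Xp = X[X[:, i] > 0]                     # states where taxon i is present
      cols = [j for j in range(n) if j != i]
      M = np.hstack([Xp[:, cols], np.ones((len(Xp), 1))])
      theta, *_ = np.linalg.lstsq(M, Xp[:, i], rcond=None)
      print("inferred interaction signs:", np.sign(np.round(theta[:-1], 6)))
      print("true interaction signs:    ", np.sign(A_true[i, cols]))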

  20. Dopamine reward prediction errors reflect hidden-state inference across time.

    PubMed

    Starkweather, Clara Kwon; Babayan, Benedicte M; Uchida, Naoshige; Gershman, Samuel J

    2017-04-01

    Midbrain dopamine neurons signal reward prediction error (RPE), or actual minus expected reward. The temporal difference (TD) learning model has been a cornerstone in understanding how dopamine RPEs could drive associative learning. Classically, TD learning imparts value to features that serially track elapsed time relative to observable stimuli. In the real world, however, sensory stimuli provide ambiguous information about the hidden state of the environment, leading to the proposal that TD learning might instead compute a value signal based on an inferred distribution of hidden states (a 'belief state'). Here we asked whether dopaminergic signaling supports a TD learning framework that operates over hidden states. We found that dopamine signaling showed a notable difference between two tasks that differed only with respect to whether reward was delivered in a deterministic manner. Our results favor an associative learning rule that combines cached values with hidden-state inference.
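
    A minimal sketch in the spirit of the framework described above (not the paper's task design): a two-state hidden Markov world, a Bayes-rule belief update from ambiguous cues, and TD(0) applied to weights over the belief vector, with the TD error playing the role of the dopamine RPE. All probabilities are invented.

      import numpy as np

      T = np.array([[0.9, 0.1],      # hidden-state transition matrix (assumed)
                    [0.2, 0.8]])
      O = np.array([[0.7, 0.2],      # P(cue | hidden state); rows: cues
                    [0.3, 0.8]])

      w = np.zeros(2)                # value weights over belief features
      alpha, gamma = 0.1, 0.95
      rng = np.random.default_rng(0)

      b, s = np.array([0.5, 0.5]), 0
      for t in range(5000):
          s_next = rng.choice(2, p=T[s])
          cue = rng.choice(2, p=O[:, s_next])
          reward = 1.0 if s_next == 0 and rng.random() < 0.2 else 0.0
          b_next = O[cue] * (T.T @ b)             # Bayesian belief update
          b_next /= b_next.sum()
          delta = reward + gamma * w @ b_next - w @ b   # TD error ~ RPE
          w += alpha * delta * b
          b, s = b_next, s_next
      print("learned values over hidden states:", w)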

  1. KOJAK Group Finder: Scalable Group Detection via Integrated Knowledge-Based and Statistical Reasoning

    DTIC Science & Technology

    2006-09-01

    STELLA and PowerLoom. These modules communicate with a knowledge base using KIF and standard relational database systems using either standard...groups ontology as well as a rule that infers additional seed members based on joint participation in a terrorism event. EDB schema files are a special... terrorism links from the Ali Baba EDB. Our interpretation of such links is that they encode that two people committed an act of

  2. A logical model of cooperating rule-based systems

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.

    1989-01-01

    A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.

  3. A neural network architecture for implementation of expert systems for real time monitoring

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.

    1991-01-01

    Since neural networks have the advantages of massive parallelism and simple architecture, they are good tools for implementing real time expert systems. In a rule based expert system, the antecedents of rules are in the conjunctive or disjunctive form. We constructed a multilayer feedforward type network in which neurons represent AND or OR operations of rules. Further, we developed a translator which can automatically map a given rule base into the network. Also, we proposed a new and powerful yet flexible architecture that combines the advantages of both fuzzy expert systems and neural networks. This architecture uses the fuzzy logic concepts to separate input data domains into several smaller and overlapped regions. Rule-based expert systems for time critical applications using neural networks, the automated implementation of rule-based expert systems with neural nets, and fuzzy expert systems vs. neural nets are covered.

  4. Genetic network inference as a series of discrimination tasks.

    PubMed

    Kimura, Shuhei; Nakayama, Satoshi; Hatakeyama, Mariko

    2009-04-01

    Genetic network inference methods based on sets of differential equations generally require a great deal of time, as the equations must be solved many times. To reduce the computational cost, researchers have proposed other methods for inferring genetic networks by solving sets of differential equations only a few times, or even without solving them at all. When we try to obtain reasonable network models using these methods, however, we must estimate the time derivatives of the gene expression levels with great precision. In this study, we propose a new method to overcome the drawbacks of inference methods based on sets of differential equations. Our method infers genetic networks by obtaining classifiers capable of predicting the signs of the derivatives of the gene expression levels. For this purpose, we defined a genetic network inference problem as a series of discrimination tasks, then solved the defined series of discrimination tasks with a linear programming machine. Our experimental results demonstrated that the proposed method is capable of correctly inferring genetic networks, and doing so more than 500 times faster than the other inference methods based on sets of differential equations. Next, we applied our method to actual expression data of the bacterial SOS DNA repair system. Finally, we demonstrated that our approach relates to the inference method based on the S-system model. Though our method provides no estimation of the kinetic parameters, it should be useful for researchers interested only in the network structure of a target system. Supplementary data are available at Bioinformatics online.

  5. POPPER, a simple programming language for probabilistic semantic inference in medicine.

    PubMed

    Robson, Barry

    2015-01-01

    Our previous reports described the use of the Hyperbolic Dirac Net (HDN) as a method for probabilistic inference from medical data, and a proposed probabilistic medical Semantic Web (SW) language Q-UEL to provide that data. Rather like a traditional Bayes Net, the HDN provided estimates of joint and conditional probabilities, and was static, with no need for evolution due to "reasoning". Use of the SW will require, however, (a) at least the semantic triple with more elaborate relations than conditional ones, as seen in use of most verbs and prepositions, and (b) rules for logical, grammatical, and definitional manipulation that can generate changes in the inference net. Here is described the simple POPPER language for medical inference. It can be automatically written by Q-UEL, or by hand. Based on studies with our medical students, it is believed that a tool like this may help in medical education and that a physician unfamiliar with SW science can understand it. It is here used to explore the considerable challenges of assigning probabilities, and not least what the meaning and utility of inference net evolution would be for a physician. Copyright © 2014 Elsevier Ltd. All rights reserved.

  6. A Bayesian Scoring Technique for Mining Predictive and Non-Spurious Rules

    PubMed Central

    Batal, Iyad; Cooper, Gregory; Hauskrecht, Milos

    2015-01-01

    Rule mining is an important class of data mining methods for discovering interesting patterns in data. The success of a rule mining method heavily depends on the evaluation function that is used to assess the quality of the rules. In this work, we propose a new rule evaluation score - the Predictive and Non-Spurious Rules (PNSR) score. This score relies on Bayesian inference to evaluate the quality of the rules and considers the structure of the rules to filter out spurious rules. We present an efficient algorithm for finding rules with high PNSR scores. The experiments demonstrate that our method is able to cover and explain the data with a much smaller rule set than existing methods. PMID:25938136

  7. A Bayesian Scoring Technique for Mining Predictive and Non-Spurious Rules.

    PubMed

    Batal, Iyad; Cooper, Gregory; Hauskrecht, Milos

    Rule mining is an important class of data mining methods for discovering interesting patterns in data. The success of a rule mining method heavily depends on the evaluation function that is used to assess the quality of the rules. In this work, we propose a new rule evaluation score - the Predictive and Non-Spurious Rules (PNSR) score. This score relies on Bayesian inference to evaluate the quality of the rules and considers the structure of the rules to filter out spurious rules. We present an efficient algorithm for finding rules with high PNSR scores. The experiments demonstrate that our method is able to cover and explain the data with a much smaller rule set than existing methods.

  8. Entropic Inference

    NASA Astrophysics Data System (ADS)

    Caticha, Ariel

    2011-03-01

    In this tutorial we review the essential arguments behind entropic inference. We focus on the epistemological notion of information and its relation to the Bayesian beliefs of rational agents. The problem of updating from a prior to a posterior probability distribution is tackled through an eliminative induction process that singles out the logarithmic relative entropy as the unique tool for inference. The resulting method of Maximum relative Entropy (ME) includes as special cases both MaxEnt and Bayes' rule, and therefore unifies the two themes of these workshops—the Maximum Entropy and the Bayesian methods—into a single general inference scheme.
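
    As a hedged illustration of the updating rule reviewed above (notation assumed rather than quoted from the tutorial), ME selects, among the distributions p allowed by the constraints C, the one maximizing the relative entropy to the prior q:

      % ME update: the posterior maximizes the logarithmic relative entropy
      % to the prior q subject to the constraints C (a paraphrase, not a
      % quotation from the tutorial).
      \[
        S[p \mid q] = -\int \mathrm{d}x \, p(x) \, \log \frac{p(x)}{q(x)},
        \qquad
        p^{*} = \arg\max_{p \in C} S[p \mid q].
      \]

    Imposing expected-value constraints recovers MaxEnt, while processing observed data through the joint prior recovers Bayes' rule, which is the unification the tutorial refers to.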

  9. A fully-online Neuro-Fuzzy model for flow forecasting in basins with limited data

    NASA Astrophysics Data System (ADS)

    Ashrafi, Mohammad; Chua, Lloyd Hock Chye; Quek, Chai; Qin, Xiaosheng

    2017-02-01

    Current state-of-the-art online neuro fuzzy models (NFMs) such as DENFIS (Dynamic Evolving Neural-Fuzzy Inference System) have been used for runoff forecasting. Online NFMs adopt a local learning approach and are able to adapt to changes continuously. The DENFIS model, however, requires upper and lower bounds for normalization, and its number of rules increases monotonically. These requirements make the model unsuitable for use in basins with limited data, since a priori data are required. In order to address this and other drawbacks of current online models, the Generic Self-Evolving Takagi-Sugeno-Kang (GSETSK) model is adopted in this study for forecast applications in basins with limited data. GSETSK is a fully-online NFM which updates its structure and parameters based on the most recent data. The model does not require historical data and adopts clustering and rule pruning techniques to generate a compact and up-to-date rule-base. GSETSK was used in two forecast applications: rainfall-runoff (a catchment in Sweden) and river routing (Lower Mekong River). Each of these two applications was studied under two scenarios: (i) there is no prior data, and (ii) only limited data are available (1 year for the Swedish catchment and 1 season for the Mekong River). For the Swedish basin, GSETSK model results were compared to available results from a calibrated HBV (Hydrologiska Byråns Vattenbalansavdelning) model. For the Mekong River, GSETSK results were compared against the URBS (Unified River Basin Simulator) model. Both comparisons showed that results from GSETSK are comparable with those of the physically based models, which were calibrated with historical data. Thus, even though GSETSK was trained with a very limited dataset in comparison with HBV or URBS, similar results were achieved. Further comparisons of GSETSK with the DENFIS and RBF (Radial Basis Function) models highlighted additional advantages of GSETSK, namely a rule-base (compared to the opaque RBF) that is more compact, up-to-date and more easily interpretable.

  10. FAIL-SAFE: Fault Aware IntelLigent Software for Exascale

    DTIC Science & Technology

    2016-06-13

    and that these programs can continue to correct solutions. To broaden the impact of this research, we also needed to be able to ameliorate errors...designing an interface between the application and an introspection framework for resilience (IFR) based on the inference engine SHINE; (4) using...the ROSE compiler to translate annotations into reasoning rules for the IFR; and (5) designing a Knowledge/Experience Database, which will store

  11. Design of an expert system for the development and formulation of push-pull osmotic pump tablets containing poorly water-soluble drugs.

    PubMed

    Zhang, Zhi-hong; Dong, Hong-ye; Peng, Bo; Liu, Hong-fei; Li, Chun-lei; Liang, Min; Pan, Wei-san

    2011-05-30

    The purpose of this article was to build an expert system for the development and formulation of push-pull osmotic pump tablets (PPOP). Hundreds of PPOP formulations were studied, covering different poorly water-soluble drugs and pharmaceutically acceptable excipients. The knowledge base, comprising a database and a rule base, was built from the reported results of hundreds of PPOP formulations containing different poorly water-soluble drugs and pharmaceutical excipients, together with the experience available from other researchers. The prediction model of release behavior was built using a back propagation (BP) neural network, which is well suited to nonlinear mapping and learning. The formulation design model, the nucleus of the inference engine, was established based on the prediction model of release behavior. Finally, the expert system program was constructed in VB.NET together with SQL Server. Expert systems are one of the most popular topics in artificial intelligence, yet to date no expert system is available for the formulation of controlled release dosage forms. Moreover, osmotic pump technology (OPT) is gradually maturing worldwide, so applying an expert system to OPT is worthwhile. Famotidine, a water-insoluble drug, was chosen as the model drug to validate the applicability of the developed expert system. Copyright © 2011 Elsevier B.V. All rights reserved.

  12. N Reasons Why Production-Rules are Insufficient Models for Expert System Knowledge Representation Schemes

    DTIC Science & Technology

    1991-02-01

    Hybrid Rule/Fact Schemas... The Limitations of Rule-Based Knowledge...or hybrid rule/fact schemas. Propositional Logic: the simplest form of production-rules are based upon...requirements which may lead to poor system performance. Hybrid Rule/Fact Schemas: hybrid rule/fact relationships (also known as Predicate Calculus) have

  13. Impact of flow routing on catchment area calculations, slope estimates, and numerical simulations of landscape development

    NASA Astrophysics Data System (ADS)

    Shelef, Eitan; Hilley, George E.

    2013-12-01

    Flow routing across real or modeled topography determines the modeled discharge and wetness index and thus plays a central role in predicting surface lowering rate, runoff generation, likelihood of slope failure, and the transition from hillslope to channel forming processes. In this contribution, we compare commonly used flow-routing rules, as well as a new routing rule, against commonly used benchmarks. We also compare results for different routing rules using Airborne Laser Swath Mapping (ALSM) topography to explore the impact of different flow-routing schemes on the inferred generation and location of saturation overland flow and on the transition from hillslope to channel forming processes. Finally, we examined the impact of flow-routing and slope-calculation rules on modeled topography produced by Geomorphic Transport Law (GTL)-based simulations. We found that different rules produce substantive differences in the structure of the modeled topography and in flow patterns over the ALSM data. Our results highlight the impact of flow-routing and slope-calculation rules on modeled topography, as well as on geomorphic metrics calculated across real landscapes. As such, studies that use a variety of routing rules to analyze and simulate topography are necessary to determine those aspects that most strongly depend on the chosen routing rule.
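
    The sensitivity to the routing rule can be illustrated with the single-direction (D8) scheme, one of the commonly used rules: each cell passes its entire contributing area to its steepest downslope neighbor. The tiny DEM below is invented; the study itself compares full routing schemes on real topography.

      import numpy as np

      # Toy D8 contributing-area calculation on a 3x3 DEM (unit cell size).
      dem = np.array([[9., 8., 7.],
                      [8., 5., 4.],
                      [7., 4., 1.]])
      rows, cols = dem.shape
      area = np.ones_like(dem)                 # each cell contributes itself

      # Process cells from highest to lowest so donors drain before receivers.
      for idx in np.argsort(-dem, axis=None):
          r, c = divmod(int(idx), cols)
          best_drop, target = 0.0, None
          for dr in (-1, 0, 1):
              for dc in (-1, 0, 1):
                  rr, cc = r + dr, c + dc
                  if (dr, dc) != (0, 0) and 0 <= rr < rows and 0 <= cc < cols:
                      drop = (dem[r, c] - dem[rr, cc]) / np.hypot(dr, dc)
                      if drop > best_drop:
                          best_drop, target = drop, (rr, cc)
          if target is not None:               # all area goes to one neighbor
              area[target] += area[r, c]
      print(area)                              # outlet (2,2) collects all 9 cells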

  14. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, yields low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and removes contamination from the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly indicate that these methods not only meet the precision required by the verification regulation, but also improve the reliability and efficiency of the verification system.

  15. Analysis prediction of Indonesian banks (BCA, BNI, MANDIRI) using adaptive neuro-fuzzy inference system (ANFIS) and investment strategies

    NASA Astrophysics Data System (ADS)

    Trianto, Andriantama Budi; Hadi, I. M.; Liong, The Houw; Purqon, Acep

    2015-09-01

    Indonesia's economy is developing well, which affects investment in banks and the stock market. In this study, we perform prediction for three Indonesian blue-chip banks, i.e. BCA, BNI, and MANDIRI, using the Adaptive Neuro-Fuzzy Inference System (ANFIS) with Takagi-Sugeno rules and the generalized bell (gbell) membership function. Our results show that ANFIS performs good prediction, with RMSE values of 27 for BCA, 5.29 for BNI, and 13.41 for MANDIRI. Furthermore, we develop an active strategy to gain more benefit and compare it against a passive strategy. Our results show that the passive strategy gains 13 million rupiah, while the active strategy gains 47 million rupiah in one year. The active investment strategy thus yields a significantly larger benefit than the passive one.
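
    For reference, a short sketch of the generalized bell membership function named above, combined with a first-order Takagi-Sugeno rule pair; the parameter values and linear consequents are invented, not the fitted ANFIS model.

      import numpy as np

      # Generalized bell membership: mu(x) = 1 / (1 + |(x - c) / a|^(2b)).
      def gbell(x, a, b, c):
          return 1.0 / (1.0 + np.abs((x - c) / a) ** (2 * b))

      x = 0.42                                  # e.g., a normalized price input
      w1 = gbell(x, a=0.25, b=2.0, c=0.2)       # firing of rule "input is LOW"
      w2 = gbell(x, a=0.25, b=2.0, c=0.8)       # firing of rule "input is HIGH"
      f1 = 0.9 * x + 0.05                       # linear TS consequents (assumed)
      f2 = 1.1 * x - 0.02
      y = (w1 * f1 + w2 * f2) / (w1 + w2)       # weighted-average defuzzification
      print(f"predicted next value: {y:.3f}")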

  16. Fuzzy logic and image processing techniques for the interpretation of seismic data

    NASA Astrophysics Data System (ADS)

    Orozco-del-Castillo, M. G.; Ortiz-Alemán, C.; Urrutia-Fucugauchi, J.; Rodríguez-Castellanos, A.

    2011-06-01

    Since interpretation of seismic data is usually a tedious and repetitive task, the ability to do so automatically or semi-automatically has become an important objective of recent research. We believe that the vagueness and uncertainty in the interpretation process make fuzzy logic an appropriate tool to deal with seismic data. In this work we developed a semi-automated fuzzy inference system to detect the internal architecture of a mass transport complex (MTC) in seismic images. We propose that the observed characteristics of an MTC can be expressed as fuzzy if-then rules consisting of linguistic values associated with fuzzy membership functions. The construction of the fuzzy inference system and various image processing techniques are presented. We conclude that this is a well-suited problem for fuzzy logic, since the application of the proposed methodology yields a semi-automatically interpreted MTC which closely resembles the MTC obtained from expert manual interpretation.

  17. 78 FR 59994 - Self-Regulatory Organizations; The Options Clearing Corporation; Order Approving Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-30

    ... to avoid any erroneous inference that those are the only provisions of OCC's By-Laws and Rules that... its By-Laws and Rules as well. III. Discussion Section 19(b)(2)(C) of the Act \\5\\ directs the... of participants or among participants in the use of the clearing agency. \\5\\ 15 U.S.C. 78s(b)(2)(C...

  18. Automated implementation of rule-based expert systems with neural networks for time-critical applications

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.; Huang, Song; Govind, Girish

    1991-01-01

    In fault diagnosis, control and real-time monitoring, both timing and accuracy are critical for operators or machines to reach proper solutions or appropriate actions. Expert systems are becoming more popular in the manufacturing community for dealing with such problems. In recent years, interest in neural networks has revived and their applications have spread to many areas of science and engineering. A method of using neural networks to implement rule-based expert systems for time-critical applications is discussed here. This method can convert a given rule-based system into a neural network with fixed weights and thresholds. The rules governing the translation are presented along with some examples. We also present the results of automated machine implementation of such networks from the given rule-base. This significantly simplifies the translation process to neural network expert systems from conventional rule-based systems. Results comparing the performance of the proposed approach based on neural networks vs. the classical approach are given. The possibility of very large scale integration (VLSI) realization of such neural network expert systems is also discussed.
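
    A hedged sketch of the translation idea (illustrative rules, not the authors' translator): each AND rule becomes a threshold neuron whose fixed weights and bias fire only when all antecedents hold, and an OR neuron collects the rules that share a consequent.

      import numpy as np

      # Rules: fault <- (hot AND vibrating) OR (hot AND noisy)
      # Inputs: [hot, vibrating, noisy]; names are hypothetical.
      step = lambda z: (z > 0).astype(float)    # hard threshold activation

      W_and = np.array([[1., 1., 0.],           # rule 1: hot AND vibrating
                        [1., 0., 1.]])          # rule 2: hot AND noisy
      b_and = np.array([-1.5, -1.5])            # fires only if both inputs on
      W_or = np.array([[1., 1.]])               # consequent: any rule suffices
      b_or = np.array([-0.5])

      x = np.array([1., 0., 1.])                # facts: hot, not vibrating, noisy
      hidden = step(W_and @ x + b_and)          # which rules fire
      print("fault diagnosed:", bool(step(W_or @ hidden + b_or)[0]))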

  19. Probabilistic Algorithmic Knowledge

    DTIC Science & Technology

    2005-12-20

    standard possible-worlds sense. Although soundness is not required in the basic definition, it does seem to be useful in many applications. Our interest...think of as describing basic facts about the system, such as "the door is closed" or "agent A sent the message m to B", more complicated formulas are...messages as long as the adversary knows the decryption key. (The function submsg basically implements the inference rules for ⊢DY.) A DY i (hasi(m

  20. Evaluating data-driven causal inference techniques in noisy physical and ecological systems

    NASA Astrophysics Data System (ADS)

    Tennant, C.; Larsen, L.

    2016-12-01

    Causal inference from observational time series challenges traditional approaches for understanding processes and offers exciting opportunities to gain new understanding of complex systems where nonlinearity, delayed forcing, and emergent behavior are common. We present a formal evaluation of the performance of convergent cross-mapping (CCM) and transfer entropy (TE) for data-driven causal inference under real-world conditions. CCM is based on nonlinear state-space reconstruction, and causality is determined by the convergence of prediction skill with an increasing number of observations of the system. TE is the uncertainty reduction based on transition probabilities of a pair of time-lagged variables; with TE, causal inference is based on asymmetry in information flow between the variables. Observational data and numerical simulations from a number of classical physical and ecological systems (atmospheric convection in the Lorenz system, species competition in patch tournaments, and long-term climate change in the Vostok ice core) were used to evaluate the ability of CCM and TE to infer causal relationships as data series become increasingly corrupted by observational (instrument-driven) or process (model- or stochastic-driven) noise. While both techniques show promise for causal inference, TE appears to be applicable to a wider range of systems, especially when the data series are of sufficient length to reliably estimate the transition probabilities of system components. Both techniques also show a clear effect of observational noise on causal inference. For example, CCM exhibits a negative logarithmic decline in prediction skill as the noise level of the system increases. Changes in TE strongly depend on the noise type and on which variable the noise was added to. The ability of CCM and TE to detect driving influences suggests that their application to physical and ecological systems could be transformative for understanding driving mechanisms as Earth systems undergo change.
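
    A minimal sketch of binned transfer entropy for a driving relationship X -> Y, the quantity used above; the coupled binary series is a toy stand-in for the physical and ecological data sets.

      import numpy as np
      from collections import Counter

      # TE(X->Y) = sum p(y1, y0, x0) * log2[ p(y1 | y0, x0) / p(y1 | y0) ].
      # Toy coupled series: Y copies X one step later 70% of the time.
      rng = np.random.default_rng(0)
      n = 20000
      x = rng.integers(0, 2, n)
      y = np.empty(n, dtype=int)
      y[0] = 0
      for t in range(n - 1):
          y[t + 1] = x[t] if rng.random() < 0.7 else rng.integers(0, 2)

      triples = Counter(zip(y[1:], y[:-1], x[:-1]))
      pairs_yx = Counter(zip(y[:-1], x[:-1]))
      pairs_yy = Counter(zip(y[1:], y[:-1]))
      singles = Counter(y[:-1])

      te, N = 0.0, n - 1
      for (y1, y0, x0), c in triples.items():
          p_joint = c / N
          p_cond_full = c / pairs_yx[(y0, x0)]
          p_cond_self = pairs_yy[(y1, y0)] / singles[y0]
          te += p_joint * np.log2(p_cond_full / p_cond_self)
      print(f"TE(X->Y) = {te:.3f} bits")   # positive; TE(Y->X) would be near 0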

  1. Computational approaches to protein inference in shotgun proteomics

    PubMed Central

    2012-01-01

    Shotgun proteomics has recently emerged as a powerful approach to characterizing proteomes in biological samples. Its overall objective is to identify the form and quantity of each protein in a high-throughput manner by coupling liquid chromatography with tandem mass spectrometry. As a consequence of its high-throughput nature, shotgun proteomics faces challenges with respect to the analysis and interpretation of experimental data. Among such challenges, the identification of proteins present in a sample has been recognized as an important computational task. This task generally consists of (1) assigning experimental tandem mass spectra to peptides derived from a protein database, and (2) mapping assigned peptides to proteins and quantifying the confidence of identified proteins. Protein identification is fundamentally a statistical inference problem with a number of methods proposed to address its challenges. In this review we categorize current approaches into rule-based, combinatorial optimization and probabilistic inference techniques, and present them using integer programming and Bayesian inference frameworks. We also discuss the main challenges of protein identification and propose potential solutions with the goal of spurring innovative research in this area. PMID:23176300
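
    The combinatorial-optimization category can be illustrated by the parsimony heuristic: choose the smallest protein set that covers all identified peptides, here via the standard greedy approximation of set cover (peptide and protein identifiers are invented).

      # Greedy parsimony sketch: pick proteins covering the most still-unexplained
      # peptides first. This approximates the set-cover ILP mentioned above.
      peptides = {"p1", "p2", "p3", "p4"}
      proteins = {"ProtA": {"p1", "p2"},
                  "ProtB": {"p2", "p3"},
                  "ProtC": {"p3", "p4"},
                  "ProtD": {"p1", "p2", "p3"}}

      selected, uncovered = [], set(peptides)
      while uncovered:
          best = max(proteins, key=lambda pr: len(proteins[pr] & uncovered))
          selected.append(best)
          uncovered -= proteins[best]
      print("greedy explaining set:", selected)   # e.g., ['ProtD', 'ProtC']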

  2. How the brain predicts people's behavior in relation to rules and desires. Evidence of a medio-prefrontal dissociation.

    PubMed

    Corradi-Dell'Acqua, Corrado; Turri, Francesco; Kaufmann, Laurence; Clément, Fabrice; Schwartz, Sophie

    2015-09-01

    Forming and updating impressions about others is critical in everyday life and engages portions of the dorsomedial prefrontal cortex (dMPFC), the posterior cingulate cortex (PCC) and the amygdala. Some of these activations are attributed to "mentalizing" functions necessary to represent people's mental states, such as beliefs or desires. Evolutionary psychology and developmental studies, however, suggest that interpersonal inferences can also be obtained through the aid of deontic heuristics, which dictate what must (or must not) be done in given circumstances. We used fMRI and asked 18 participants to predict whether unknown characters would follow their desires or obey external rules. Participants had no means, at the beginning, to make accurate predictions, but slowly learned (throughout the experiment) each character's behavioral profile. We isolated brain regions whose activity changed during the experiment, as a neural signature of impression updating: whereas dMPFC was progressively more involved in predicting characters' behavior in relation to their desires, the medial orbitofrontal cortex and the amygdala were progressively more recruited in predicting rule-based behavior. Our data provide evidence of a neural dissociation between deontic inference and theory-of-mind (ToM), and support a differentiation of orbital and dorsal prefrontal cortex in terms of low- and high-level social cognition. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Strategies for adding adaptive learning mechanisms to rule-based diagnostic expert systems

    NASA Technical Reports Server (NTRS)

    Stclair, D. C.; Sabharwal, C. L.; Bond, W. E.; Hacke, Keith

    1988-01-01

    Rule-based diagnostic expert systems can be used to perform many of the diagnostic chores necessary in today's complex space systems. These expert systems typically take a set of symptoms as input and produce diagnostic advice as output. The primary objective of such expert systems is to provide accurate and comprehensive advice which can be used to help return the space system in question to nominal operation. The development and maintenance of diagnostic expert systems is time and labor intensive, since the services of both knowledge engineer(s) and domain expert(s) are required. The use of adaptive learning mechanisms to incrementally evaluate and refine rules promises to reduce both the time and labor costs associated with such systems. This paper describes the basic adaptive learning mechanisms of strengthening, weakening, generalization, discrimination, and discovery. Next, basic strategies are discussed for adding these learning mechanisms to rule-based diagnostic expert systems. These strategies support the incremental evaluation and refinement of rules in the knowledge base by comparing the set of advice given by the expert system (A) with the correct diagnosis (C). Techniques are described for selecting those rules in the knowledge base which should participate in adaptive learning. The strategies presented may be used with a wide variety of learning algorithms. Further, these strategies are applicable to a large number of rule-based diagnostic expert systems. They may be used to provide either immediate or deferred updating of the knowledge base.
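
    A hedged sketch of the strengthening/weakening mechanisms named above (rule names, advice strings, and the update increment are invented): rules whose advice appears in the correct diagnosis C gain strength, while rules that contributed advice outside C lose it.

      # Toy strengthen/weaken update comparing the advice set A with the
      # correct diagnosis C, as described in the abstract.
      rules = {"R1": {"advice": "replace_sensor", "strength": 0.50},
               "R2": {"advice": "reset_bus",      "strength": 0.50}}

      def adapt(fired, correct_diagnosis, delta=0.1):
          for name in fired:
              rule = rules[name]
              if rule["advice"] in correct_diagnosis:
                  rule["strength"] = min(1.0, rule["strength"] + delta)  # strengthen
              else:
                  rule["strength"] = max(0.0, rule["strength"] - delta)  # weaken

      # A = {replace_sensor, reset_bus}; C = {replace_sensor}
      adapt(["R1", "R2"], {"replace_sensor"})
      print(rules)   # R1 strengthened to 0.6, R2 weakened to 0.4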

  4. Development and Testing of a Decision Making Based Method to Adjust Automatically the Harrowing Intensity

    PubMed Central

    Rueda-Ayala, Victor; Weis, Martin; Keller, Martina; Andújar, Dionisio; Gerhards, Roland

    2013-01-01

    Harrowing is often used to reduce weed competition, generally using a constant intensity across a whole field. The efficacy of weed harrowing in wheat and barley can be optimized if site-specific conditions of soil, weed infestation and crop growth stage are taken into account. This study aimed to develop and test an algorithm to automatically adjust the harrowing intensity by varying the tine angle and number of passes. The field variability of crop leaf cover, weed density and soil density was acquired with geo-referenced sensors to investigate the harrowing selectivity and crop recovery. Crop leaf cover and weed density were assessed using bispectral cameras through differential image analysis. The draught force of the soil opposite to the direction of travel was measured with an electronic load cell connected to a rigid tine mounted in front of the harrow. Optimal harrowing intensity levels were derived in previously implemented experiments, based on the weed control efficacy and yield gain. The assessments of crop leaf cover, weed density and soil density were combined via rules with the aforementioned optimal intensities in a linguistic fuzzy inference system (LFIS). The system was evaluated in two field experiments that compared constant intensities with variable intensities inferred by the system. A higher weed density reduction could be achieved when the harrowing intensity was not kept constant along the cultivated plot. Varying the intensity tended to reduce the crop leaf cover, though slightly improving crop yield. A real-time intensity adjustment with this system is achievable if the cameras are attached at the front and at the rear or sides of the harrow. PMID:23669712

  5. Automated software system for checking the structure and format of ACM SIG documents

    NASA Astrophysics Data System (ADS)

    Mirza, Arsalan Rahman; Sah, Melike

    2017-04-01

    Microsoft (MS) Office Word is one of the most commonly used software tools for creating documents. MS Word 2007 and above uses XML to represent the structure of MS Word documents. Metadata about the documents are automatically created using Office Open XML (OOXML) syntax. We develop a new framework, called ADFCS (Automated Document Format Checking System), that takes advantage of the OOXML metadata in order to extract semantic information from MS Office Word documents. In particular, we develop a new ontology for Association for Computing Machinery (ACM) Special Interest Group (SIG) documents, representing the structure and format of these documents in OWL (Web Ontology Language). The metadata are then extracted automatically in RDF (Resource Description Framework) according to this ontology using the developed software. Finally, we generate extensive rules in order to infer whether documents are formatted according to ACM SIG standards. This paper introduces the ACM SIG ontology, the metadata extraction process, the inference engine, the ADFCS online user interface, the system evaluation and user study evaluations.
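
    The first step, reading OOXML structure out of a .docx package, can be sketched with the Python standard library alone; the file name and the checked style are assumptions for illustration, not the ADFCS rules themselves.

      import zipfile
      import xml.etree.ElementTree as ET

      # A .docx file is a zip archive; the main body lives in word/document.xml.
      # "paper.docx" and the "Title" style check are hypothetical examples.
      NS = {"w": "http://schemas.openxmlformats.org/wordprocessingml/2006/main"}

      with zipfile.ZipFile("paper.docx") as docx:
          root = ET.fromstring(docx.read("word/document.xml"))

      # Collect the paragraph style names declared in the document body.
      styles = [ps.get(f"{{{NS['w']}}}val")
                for ps in root.iter(f"{{{NS['w']}}}pStyle")]

      # One toy format rule: the document should declare a Title-styled paragraph.
      print("Title style present:", "Title" in styles)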

  6. An expert system for natural language processing

    NASA Technical Reports Server (NTRS)

    Hennessy, John F.

    1988-01-01

    A solution to the natural language processing problem is proposed that uses a rule based system, written in OPS5, to replace the traditional parsing method. The advantages of using a rule based system are explored. Specifically, the extensibility of a rule based solution is discussed, as well as the value of maintaining rules that function independently. Finally, the power of using semantics to supplement the syntactic analysis of a sentence is considered.

  7. Figure Control of Lightweight Optical Structures

    NASA Technical Reports Server (NTRS)

    Main, John A.; Song, Haiping

    2005-01-01

    The goal of this paper is to demonstrate the use of fuzzy logic controllers in modifying the figure of a piezoceramic bimorph mirror. Non-contact electron actuation technology is used to actively control a bimorph mirror comprising two PZT-5H wafers by varying the electron flux and electron voltages. Due to electron blooming generated by the electron flux, it is difficult to develop an accurate control model for the bimorph mirror through theoretical analysis alone. The non-contact shape control system with electron flux blooming can be approximately described with a heuristic model based on experimental data. Two fuzzy logic feedback controllers are developed to control the shape of the bimorph mirror according to heuristic fuzzy inference rules generated from previous experimental results. Validation of the proposed fuzzy logic controllers is also discussed.

  8. Agile Text Mining for the 2014 i2b2/UTHealth Cardiac Risk Factors Challenge

    PubMed Central

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2016-01-01

    This paper describes the use of an agile text mining platform (Linguamatics’ Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 Challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system. PMID:26209007

  9. Improved Noninterferometric Test of Collapse Models Using Ultracold Cantilevers

    NASA Astrophysics Data System (ADS)

    Vinante, A.; Mezzena, R.; Falferi, P.; Carlesso, M.; Bassi, A.

    2017-09-01

    Spontaneous collapse models predict that a weak force noise acts on any mechanical system, as a consequence of the collapse of the wave function. Significant upper limits on the collapse rate have been recently inferred from precision mechanical experiments, such as ultracold cantilevers and the space mission LISA Pathfinder. Here, we report new results from an experiment based on a high-Q cantilever cooled to millikelvin temperatures, which is potentially able to improve the current bounds on the continuous spontaneous localization (CSL) model by 1 order of magnitude. High accuracy measurements of the cantilever thermal fluctuations reveal a nonthermal force noise of unknown origin. This excess noise is compatible with the CSL heating predicted by Adler. Several physical mechanisms able to explain the observed noise have been ruled out.

  10. The Role of Probability-Based Inference in an Intelligent Tutoring System.

    ERIC Educational Resources Information Center

    Mislevy, Robert J.; Gitomer, Drew H.

    Probability-based inference in complex networks of interdependent variables is an active topic in statistical research, spurred by such diverse applications as forecasting, pedigree analysis, troubleshooting, and medical diagnosis. This paper concerns the role of Bayesian inference networks for updating student models in intelligent tutoring…

  11. 48 CFR 6101.21 - Hearing procedures [Rule 21].

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... determination of the amount of recovery, if any, for other proceedings. (5) Before the hearing begins, the Board... the record the inferences it draws from the witness's refusal to testify under oath or affirmation... and, in the event of continued refusal, the Board may state for the record the inferences it draws...

  12. Bayesian or Laplacien inference, entropy and information theory and information geometry in data and signal processing

    NASA Astrophysics Data System (ADS)

    Mohammad-Djafari, Ali

    2015-01-01

    The main object of this tutorial article is to review the main inference tools based on the Bayesian approach, entropy, information theory and their corresponding geometries. The review focuses on the ways these tools have been used in data, signal and image processing. After a short introduction to the different quantities related to Bayes' rule, the entropy and the Maximum Entropy Principle (MEP), relative entropy and the Kullback-Leibler divergence, and Fisher information, we study their use in different fields of data and signal processing, such as entropy in source separation, Fisher information in model order selection, different maximum-entropy-based methods in time series spectral estimation and, finally, general linear inverse problems.

  13. Hierarchical modeling and inference in ecology: The analysis of data from populations, metapopulations and communities

    USGS Publications Warehouse

    Royle, J. Andrew; Dorazio, Robert M.

    2008-01-01

    A guide to data collection, modeling and inference strategies for biological survey data using Bayesian and classical statistical methods. This book describes a general and flexible framework for modeling and inference in ecological systems based on hierarchical models, with a strict focus on the use of probability models and parametric inference. Hierarchical models represent a paradigm shift in the application of statistics to ecological inference problems because they combine explicit models of ecological system structure or dynamics with models of how ecological systems are observed. The principles of hierarchical modeling are developed and applied to problems in population, metapopulation, community, and metacommunity systems. The book provides the first synthetic treatment of many recent methodological advances in ecological modeling and unifies disparate methods and procedures. The authors apply principles of hierarchical modeling to ecological problems, including:
    * occurrence or occupancy models for estimating species distribution
    * abundance models based on many sampling protocols, including distance sampling
    * capture-recapture models with individual effects
    * spatial capture-recapture models based on camera trapping and related methods
    * population and metapopulation dynamic models
    * models of biodiversity, community structure and dynamics.

  14. A Kinect based intelligent e-rehabilitation system in physical therapy.

    PubMed

    Gal, Norbert; Andrei, Diana; Nemeş, Dan Ion; Nădăşan, Emanuela; Stoicu-Tivadar, Vasile

    2015-01-01

    This paper presents an intelligent e-rehabilitation system based on the Kinect and a fuzzy inference system. The Kinect detects the posture and motion of the patients, while the fuzzy inference system interprets the acquired data at the cognitive level. The system is capable of assessing the initial posture and motion ranges of 20 joints. Using angles to describe the motion of the joints, exercise patterns can be developed for each patient. Using the exercise descriptors, the fuzzy inference system can track the patient and deliver real-time feedback to maximize the efficiency of the rehabilitation. The first laboratory tests confirm the utility of this system for initial posture detection, motion range assessment and exercise tracking.

  15. 32 CFR 776.29 - Imputed disqualification: General rule.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 32 National Defense 5 2011-07-01 2011-07-01 false Imputed disqualification: General rule. 776.29... inferences, deductions, or working presumptions that reasonably may be made about the way in which covered... interests of another. When such independence is lacking or unlikely, representation cannot be zealous. (5...

  16. 26 CFR 1.279-1 - General rule; purpose.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... (5), 279(f), or 279(i) are present. However, no inference should be drawn from the rules of section... respect to its corporate acquisition indebtedness to the extent such interest exceeds $5 million. However, the $5 million limitation is reduced by the amount of interest paid or incurred on obligations issued...

  17. 32 CFR 776.29 - Imputed disqualification: General rule.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 32 National Defense 5 2010-07-01 2010-07-01 false Imputed disqualification: General rule. 776.29... inferences, deductions, or working presumptions that reasonably may be made about the way in which covered... interests of another. When such independence is lacking or unlikely, representation cannot be zealous. (5...

  18. Automatic Diagnosis of Fetal Heart Rate: Comparison of Different Methodological Approaches

    DTIC Science & Technology

    2001-10-25

    Apgar score). Each recording lasted at least 30 minutes and it contained both the cardiographic series and the toco trace. We focused on four...inference rules automatically generated by the learning procedure showed that n° Rules can be manually reduced to 37 without deteriorating so much the

  19. Research and development for Onboard Navigation (ONAV) ground based expert/trainer system: ONAV entry expert system code

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.

    1988-01-01

    A complete listing is given of the expert system rules for the Entry phase of the Onboard Navigation (ONAV) Ground Based Expert Trainer System for aircraft/space shuttle navigation. These source listings appear in the same format as utilized and required by the C Language Integrated Production System (CLIPS) expert system shell which is the basis for the ONAV entry system. A schematic overview is given of how the rules are organized. These groups result from a partitioning of the rules according to the overall function which a given set of rules performs. This partitioning was established and maintained according to that established in the knowledge specification document. In addition, four other groups of rules are specified. The four groups (control flow, operator inputs, output management, and data tables) perform functions that affect all the other functional rule groups. As the name implies, control flow ensures that the rule groups are executed in the order required for proper operation; operator input rules control the introduction into the CLIPS fact base of various kinds of data required by the expert system; output management rules control the updating of the ONAV expert system user display screen during execution of the system; and data tables are static information utilized by many different rule sets gathered in one convenient place.

  20. Are Scientific Analogies Metaphors?

    DTIC Science & Technology

    1981-02-01

    attraction is certainly familiar, but its rules are unfortunately unclear; so this analogy does not tell the student precisely what to map from the...seriously. The rules of analogical mappings are such that, unless there is a principled reason to exempt a given predicate, it must be mapped if it belongs...contributes to their richness, for no possible mapping need be ruled out. Mutually contradictory inferences can co-exist. These analogies derive much

  1. Assessing an ensemble Kalman filter inference of Manning's n coefficient of an idealized tidal inlet against a polynomial chaos-based MCMC

    NASA Astrophysics Data System (ADS)

    Siripatana, Adil; Mayo, Talea; Sraj, Ihab; Knio, Omar; Dawson, Clint; Le Maitre, Olivier; Hoteit, Ibrahim

    2017-08-01

    Bayesian estimation/inversion is commonly used to quantify and reduce modeling uncertainties in coastal ocean models, especially in the framework of parameter estimation. Based on Bayes' rule, the posterior probability distribution function (pdf) of the estimated quantities is obtained conditioned on available data. It can be computed either directly, using a Markov chain Monte Carlo (MCMC) approach, or by sequentially processing the data following a data assimilation approach, which is heavily exploited in large-dimensional state estimation problems. The advantage of data assimilation schemes over MCMC-type methods arises from the ability to algorithmically accommodate a large number of uncertain quantities without a significant increase in computational requirements. However, only approximate estimates are generally obtained by this approach, due to the restrictive Gaussian prior and noise assumptions generally imposed in these methods. This contribution aims at evaluating the effectiveness of an ensemble Kalman-based data assimilation method for parameter estimation of a coastal ocean model against an MCMC polynomial chaos (PC)-based scheme. We focus on quantifying the uncertainties of a coastal ocean ADvanced CIRCulation (ADCIRC) model with respect to the Manning's n coefficients. Based on a realistic framework of observing system simulation experiments (OSSEs), we apply an ensemble Kalman filter and the MCMC method employing a surrogate of ADCIRC constructed by a non-intrusive PC expansion for evaluating the likelihood, and test both approaches under identical scenarios. We study the sensitivity of the estimated posteriors with respect to the parameters of the inference methods, including ensemble size, inflation factor, and PC order. A full analysis of both methods, in the context of coastal ocean modeling, suggests that an ensemble Kalman filter with an appropriate ensemble size and well-tuned inflation provides reliable mean estimates and uncertainties of the Manning's n coefficients compared to the full posterior distributions inferred by MCMC.
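
    A minimal sketch of the stochastic ensemble Kalman update used for this kind of parameter estimation, with a toy forward model standing in for ADCIRC; the ensemble size, prior, and observation values are invented.

```python
# Sketch of a stochastic ensemble Kalman filter update for one uncertain
# parameter (a stand-in for Manning's n); the forward model and all numbers
# are toy assumptions, not ADCIRC.
import numpy as np

rng = np.random.default_rng(0)

def forward(n):            # toy observation operator: water level vs. friction
    return 2.0 / np.sqrt(n)

N = 200                                  # ensemble size
n_ens = rng.normal(0.03, 0.01, N)        # prior ensemble of Manning's n
obs, obs_std = forward(0.025), 0.05      # synthetic observation and its noise

y_ens = forward(n_ens)                   # predicted observations
perturbed = obs + rng.normal(0.0, obs_std, N)

# Kalman gain from ensemble covariances: K = C_ny / (C_yy + R)
C_ny = np.cov(n_ens, y_ens)[0, 1]
C_yy = np.var(y_ens, ddof=1)
K = C_ny / (C_yy + obs_std**2)

n_post = n_ens + K * (perturbed - y_ens)  # analysis ensemble
print(f"posterior mean {n_post.mean():.4f} +/- {n_post.std(ddof=1):.4f}")
```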

  2. Metagenomic systems biology and metabolic modeling of the human microbiome: from species composition to community assembly rules.

    PubMed

    Levy, Roie; Borenstein, Elhanan

    2014-01-01

    The human microbiome is a key contributor to health and development. Yet little is known about the ecological forces that are at play in defining the composition of such host-associated communities. Metagenomics-based studies have uncovered clear patterns of community structure but are often incapable of distinguishing alternative structuring paradigms. In a recent study, we integrated metagenomic analysis with a systems biology approach, using a reverse ecology framework to model numerous human microbiota species and to infer metabolic interactions between species. Comparing predicted interactions with species composition data revealed that the assembly of the human microbiome is dominated at the community level by habitat filtering. Furthermore, we demonstrated that this habitat filtering cannot be accounted for by known host phenotypes or by the metabolic versatility of the various species. Here we provide a summary of our findings and offer a brief perspective on related studies and on future approaches utilizing this metagenomic systems biology framework.

  3. Process Algebra Approach for Action Recognition in the Maritime Domain

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry

    2011-01-01

    The maritime environment poses a number of challenges for autonomous operation of surface boats. Among these challenges are the highly dynamic nature of the environment, the onboard sensing and reasoning requirements for obeying the navigational rules of the road, and the need for robust day/night hazard detection and avoidance. Development of full mission level autonomy entails addressing these challenges, coupled with inference of the tactical and strategic intent of possibly adversarial vehicles in the surrounding environment. This paper introduces PACIFIC (Process Algebra Capture of Intent From Information Content), an onboard system based on formal process algebras that is capable of extracting actions/activities from sensory inputs and reasoning within a mission context to ensure proper responses. PACIFIC is part of the Behavior Engine in CARACaS (Cognitive Architecture for Robotic Agent Command and Sensing), a system that is currently running on a number of U.S. Navy unmanned surface and underwater vehicles. Results from a series of experimental studies that demonstrate the effectiveness of the system are also presented.

  4. Implementation of artificial intelligence rules in a data base management system

    NASA Technical Reports Server (NTRS)

    Feyock, S.

    1986-01-01

    The intelligent front end prototype was transformed into a RIM-integrated system. A RIM-based expert system was written to demonstrate the developed capability. The use of rules to make the intelligent front end extensible, including the concept of demons and rule-manipulation rules, was investigated. Innovative approaches such as syntax programming were also considered.

  5. Techniques and implementation of the embedded rule-based expert system using Ada

    NASA Technical Reports Server (NTRS)

    Liberman, Eugene M.; Jones, Robert E.

    1991-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language, specifically a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell, is discussed. The NASA Lewis Research Center was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert system, a graphics user interface, and communications software, make up SMART-Ada (Systems fault Management with ART-Ada). The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  6. A Rule-Based System Implementing a Method for Translating FOL Formulas into NL Sentences

    NASA Astrophysics Data System (ADS)

    Mpagouli, Aikaterini; Hatzilygeroudis, Ioannis

    In this paper, we mainly present the implementation of a system that translates first order logic (FOL) formulas into natural language (NL) sentences. The motivation comes from an intelligent tutoring system teaching logic as a knowledge representation language, where the translator is used as a means of feedback to the student users. FOL to NL conversion is achieved by using a rule-based approach, where we exploit the pattern-matching capabilities of rules. So, the system consists of rule-based modules corresponding to the phases of our translation methodology. Facts are used in a lexicon providing lexical and grammatical information that helps in producing the NL sentences. The whole system is implemented in Jess, a Java-implemented rule-based programming tool. Experimental results confirm the success of our choices.
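
    The following toy sketch conveys the pattern-matching idea in Python rather than Jess; the formula syntax, the patterns, and the lexicon entries are invented illustrations, not the paper's rule modules.

```python
# Rough sketch of rule-based FOL-to-NL translation: each rule pairs a
# syntactic pattern with an NL template, and a lexicon supplies word forms.
import re

LEXICON = {"Human": "a human", "Mortal": "mortal"}  # toy lexicon facts

RULES = [
    # forall x (P(x) -> Q(x))  =>  "Every P is Q."
    (re.compile(r"forall (\w+) \((\w+)\(\1\) -> (\w+)\(\1\)\)"),
     lambda m: f"Every {LEXICON[m.group(2)].split()[-1]} is {LEXICON[m.group(3)]}."),
    # P(c)  =>  "C is P."
    (re.compile(r"^(\w+)\((\w+)\)$"),
     lambda m: f"{m.group(2).capitalize()} is {LEXICON[m.group(1)]}."),
]

def fol_to_nl(formula):
    for pattern, template in RULES:          # first matching rule wins
        m = pattern.match(formula)
        if m:
            return template(m)
    return f"(no rule matches: {formula})"

print(fol_to_nl("forall x (Human(x) -> Mortal(x))"))  # Every human is mortal.
print(fol_to_nl("Human(socrates)"))                   # Socrates is a human.
```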

  7. Panacea, a semantic-enabled drug recommendations discovery framework.

    PubMed

    Doulaverakis, Charalampos; Nikolaidis, George; Kleontas, Athanasios; Kompatsiaris, Ioannis

    2014-03-06

    Personalized drug prescription can benefit from intelligent information management and sharing. International standard classifications and terminologies have been developed to provide unique and unambiguous information representation. Such standards can be used as the basis of automated decision support systems providing drug-drug and drug-disease interaction discovery. Additionally, Semantic Web technologies have been proposed in earlier works to support such systems. The paper presents Panacea, a semantic framework capable of offering drug-drug and drug-disease interaction discovery. For enabling this kind of service, medical information and terminology had to be translated to ontological terms and appropriately coupled with medical knowledge of the field. International standard classifications and terminologies provide the backbone of the common representation of medical data, while the medical knowledge of drug interactions is represented by a rule base that makes use of the aforementioned standards. Representation is based on a lightweight ontology. A layered reasoning approach is implemented: at the first layer, ontological inference is used to discover underlying knowledge, while at the second layer a two-step rule selection strategy is followed, resulting in a computationally efficient reasoning approach. Details of the system architecture are presented, along with an outline of the difficulties that had to be overcome. Panacea is evaluated both in terms of the quality of its recommendations against real clinical data and in terms of performance. The recommendation-quality evaluation gave useful insights regarding requirements for real-world deployment and revealed several parameters that affected the recommendation results. Performance-wise, Panacea is compared to previously published work by the authors, a drug recommendation service named GalenOWL; the paper presents their differences in modeling and approach to the problem, while also pinpointing the advantages of Panacea. Overall, the paper presents a framework for providing an efficient drug recommendation service in which Semantic Web technologies are coupled with traditional business rule engines.

  8. Towards a Framework for Evaluating and Comparing Diagnosis Algorithms

    NASA Technical Reports Server (NTRS)

    Kurtoglu, Tolga; Narasimhan, Sriram; Poll, Scott; Garcia,David; Kuhn, Lukas; deKleer, Johan; vanGemund, Arjan; Feldman, Alexander

    2009-01-01

    Diagnostic inference involves the detection of anomalous system behavior and the identification of its cause, possibly down to a failed unit or to a parameter of a failed unit. Traditional approaches to solving this problem include expert/rule-based, model-based, and data-driven methods. Each approach (and the various techniques within each approach) uses a different representation of the knowledge required to perform the diagnosis. The sensor data is expected to be combined with these internal representations to produce the diagnosis result. In spite of the availability of various diagnosis technologies, there have been only minimal efforts to develop a standardized software framework to run, evaluate, and compare different diagnosis technologies on the same system. This paper presents a framework that defines a standardized representation of the system knowledge, the sensor data, and the form of the diagnosis results, and provides a run-time architecture that can execute diagnosis algorithms, send sensor data to the algorithms at appropriate time steps from a variety of sources (including the actual physical system), and collect the resulting diagnoses. We also define a set of metrics that can be used to evaluate and compare the performance of the algorithms, and provide software to calculate the metrics.
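
    A sketch of the kind of standardized algorithm interface such a framework implies; the class and method names below are assumptions for illustration, not the paper's API.

```python
# Sketch of a standardized diagnosis-algorithm interface: a runner can feed
# identical sensor streams to competing algorithms and score the returned
# Diagnosis objects with common metrics.
from abc import ABC, abstractmethod
from dataclasses import dataclass, field

@dataclass
class Diagnosis:
    faulty_components: set = field(default_factory=set)
    detection_time: float | None = None   # None while no anomaly is detected

class DiagnosisAlgorithm(ABC):
    @abstractmethod
    def step(self, t: float, sensors: dict[str, float]) -> Diagnosis:
        """Consume one time step of sensor data, return the current diagnosis."""

class ThresholdDiagnoser(DiagnosisAlgorithm):
    """Trivial rule-based baseline: flag any sensor outside fixed limits."""
    def __init__(self, limits: dict[str, tuple[float, float]]):
        self.limits = limits

    def step(self, t, sensors):
        bad = {s for s, v in sensors.items()
               if s in self.limits
               and not (self.limits[s][0] <= v <= self.limits[s][1])}
        return Diagnosis(bad, t if bad else None)

algo = ThresholdDiagnoser({"pressure": (0.0, 100.0)})
print(algo.step(1.0, {"pressure": 130.0}).faulty_components)  # {'pressure'}
```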

  9. Verification and Validation of KBS with Neural Network Components

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Callahan, John

    1996-01-01

    Artificial neural networks (ANNs) play an important role in developing robust knowledge-based systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS that depends on a rule database and a production engine, an ANN-based system mimics the decisions of an expert without explicitly formulating if-then rules. In fact, ANNs demonstrate their superiority precisely when such if-then rules are hard for human experts to formulate. Verification of a traditional knowledge-based system rests on proofs of the consistency and completeness of the rule knowledge base and the correctness of the production engine. These techniques, however, cannot be directly applied to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specification using an ANN rule extraction algorithm.

  10. Evaluation of an artificial intelligence guided inverse planning system: clinical case study.

    PubMed

    Yan, Hui; Yin, Fang-Fang; Willett, Christopher

    2007-04-01

    An artificial intelligence (AI) guided method for parameter adjustment of inverse planning was implemented on a commercial inverse treatment planning system. For evaluation purposes, four typical clinical cases were tested, and the results of plans achieved by the automated and manual methods were compared. The parameter adjustment procedure consists of three major loops. Each loop is in charge of modifying parameters of one category, which is carried out by a specially customized fuzzy inference system. Multiple physician-prescribed constraints for a selected volume were adopted to account for the tradeoff between the prescription dose to the planning target volume (PTV) and dose-volume constraints for critical organs. The search for an optimal parameter combination began with the first constraint and proceeded to the next until a plan with an acceptable dose was achieved. The initial setup of the plan parameters was the same for each case and was adjusted independently by both the manual and automated methods. After the parameters of one category were updated, the intensity maps of all fields were re-optimized and the plan dose was subsequently re-calculated. When the final plan was reached, dose statistics were calculated from both plans and compared. For the PTV, the dose to 95% of the volume was up to 10% higher in plans using the automated method than in those using the manual method. For critical organs, an average decrease of the plan dose was achieved. However, the automated method could not improve the plan dose for some critical organs due to limitations of the inference rules currently employed. For normal tissue, there was no significant difference between plan doses achieved by either method. With the AI-guided method, the basic parameter adjustment task can be accomplished automatically, and a plan dose comparable to that achieved by the manual method was obtained. Future improvements to incorporate case-specific inference rules are essential to fully automate the inverse planning process.
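
    A highly simplified sketch of the adjust/re-optimize loop described above; the fuzzy step-size rule and the toy plan surrogate are assumptions, not the paper's inference rules or a real planning system.

```python
# Sketch of a fuzzy-guided parameter adjustment loop: a constraint violation
# is fuzzified, two rules propose step sizes, and the blended step scales an
# organ-sparing weight until the constraint is met. All numbers are invented.

def fuzzy_step(violation_pct):
    """Two-rule Sugeno step: 'violation small -> gentle change (5%)',
    'violation large -> strong change (20%)'."""
    mu_large = min(max((violation_pct - 2.0) / 18.0, 0.0), 1.0)  # ramp 2-20%
    mu_small = 1.0 - mu_large
    return (mu_small * 0.05 + mu_large * 0.20) / (mu_small + mu_large)

def tune(weight, evaluate, max_iters=20, tol_pct=1.0):
    """Raise the weight until the reported constraint violation is small."""
    for _ in range(max_iters):
        violation = evaluate(weight)   # stands in for re-optimizing the plan
        if violation <= tol_pct:
            break
        weight *= 1.0 + fuzzy_step(violation)
    return weight

# toy surrogate: the violation shrinks as the weight grows
print(tune(1.0, lambda w: max(0.0, 30.0 - 8.0 * w)))
```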

  11. A logic-based method to build signaling networks and propose experimental plans.

    PubMed

    Rougny, Adrien; Gloaguen, Pauline; Langonné, Nathalie; Reiter, Eric; Crépieux, Pascale; Poupon, Anne; Froidevaux, Christine

    2018-05-18

    With the dramatic increase in the diversity and sheer quantity of biological data generated, the construction of comprehensive signaling networks that include precise mechanisms can no longer be carried out manually. In this context, we propose a logic-based method for building large signaling networks automatically. Our method is based on a set of expert rules that make explicit the reasoning biologists perform when interpreting experimental results coming from a wide variety of experiment types. These rules allow formulating all the conclusions that can be inferred from a set of experimental results, and thus building all the possible networks that explain these results. Moreover, given a hypothesis, our system proposes experimental plans to carry out in order to validate or invalidate it. To evaluate the performance of our method, we applied our framework to the reconstruction of the FSHR-induced and the EGFR-induced signaling networks. The FSHR is known to induce the transactivation of the EGFR, but very little is known about the resulting FSH- and EGF-dependent network. We built a single network using data underlying both networks. This led to a new hypothesis on the activation of MEK by p38MAPK, which we validated experimentally. These preliminary results represent a first step in the demonstration of a cross-talk between these two major MAP kinase pathways.
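
    The following toy sketch shows the flavor of such expert rules as forward chaining over typed experimental results; the rule and facts are invented, chosen only to echo the p38MAPK/MEK example, and are not the authors' rule set.

```python
# Toy sketch: derive network edges from typed experimental results by
# forward chaining with one invented expert rule.

facts = {
    ("phosphorylation_increases", "FSH", "p38MAPK"),
    ("inhibitor_blocks_effect", "p38MAPK_inhibitor", "MEK_phosphorylation"),
    ("inhibitor_targets", "p38MAPK_inhibitor", "p38MAPK"),
}

def apply_rules(facts):
    """Expert rule: if inhibiting X blocks an effect on Y's phosphorylation,
    conclude that X activates Y."""
    derived = set()
    for (kind, inhibitor, effect) in facts:
        if kind != "inhibitor_blocks_effect":
            continue
        if not effect.endswith("_phosphorylation"):
            continue
        target = effect.removesuffix("_phosphorylation")
        for (kind2, inhibitor2, x) in facts:
            if kind2 == "inhibitor_targets" and inhibitor2 == inhibitor:
                derived.add(("activates", x, target))
    return derived

print(apply_rules(facts))  # {('activates', 'p38MAPK', 'MEK')}
```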

  12. 46 CFR 355.3 - Criteria to be applied in support of stock data in affidavit.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Affidavit as those observed for the primary corporation. If, on the other hand, the “fair inference rule” is... the veracity of the statutory statements made in the Affidavit (paragraph 5) may be relied upon by the Maritime Administration. (b) When applying the fair inference rule (where there are more than 30...

  13. 46 CFR 355.3 - Criteria to be applied in support of stock data in affidavit.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Affidavit as those observed for the primary corporation. If, on the other hand, the “fair inference rule” is... the veracity of the statutory statements made in the Affidavit (paragraph 5) may be relied upon by the Maritime Administration. (b) When applying the fair inference rule (where there are more than 30...

  14. Real Rules of Inference

    DTIC Science & Technology

    1986-01-01

    the AAAI Workshop on Uncertainty and Probability in Artificial Intelligence, 1985. [McC77] McCarthy, J. "Epistemological Problems of Artificial ...KEYWORDS: Artificial Intelligence, Data Fusion, Inference, Probability, Philosophy, Inheritance Hierarchies, Default Reasoning ...prominent philosophers Glymour and Thomason even applaud the uninhibited steps: Artificial Intelligence has done us the service not only of reminding us

  15. An Embedded Rule-Based Diagnostic Expert System in Ada

    NASA Technical Reports Server (NTRS)

    Jones, Robert E.; Liberman, Eugene M.

    1992-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language is discussed, specifically a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert system, a graphics user interface, and communications software, make up SMART-Ada (Systems fault Management with ART-Ada). The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and the communications code was incorporated into an Ada expert system that reads data from a power distribution test bed, applies the rules to determine a fault, if one exists, and graphically displays it on the screen. The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  16. A Software Engine to Justify the Conclusions of an Expert System for Detecting Renal Obstruction on 99mTc-MAG3 Scans

    PubMed Central

    Garcia, Ernest V.; Taylor, Andrew; Manatunga, Daya; Folks, Russell

    2013-01-01

    The purposes of this study were to describe and evaluate a software engine to justify the conclusions reached by a renal expert system (RENEX) for assessing patients with suspected renal obstruction and to obtain from this evaluation new knowledge that can be incorporated into RENEX to attempt to improve diagnostic performance. Methods: RENEX consists of 60 heuristic rules extracted from the rules used by a domain expert to generate the knowledge base and a forward-chaining inference engine to determine obstruction. The justification engine keeps track of the sequence of the rules that are instantiated to reach a conclusion. The interpreter can then request justification by clicking on the specific conclusion. The justification process then reports the English translation of all concatenated rules instantiated to reach that conclusion. The justification engine was evaluated with a prospective group of 60 patients (117 kidneys). After reviewing the standard renal mercaptoacetyltriglycine (MAG3) scans obtained before and after the administration of furosemide, a masked expert determined whether each kidney was obstructed, whether the results were equivocal, or whether the kidney was not obstructed and identified and ranked the main variables associated with each interpretation. Two parameters were then tabulated: the frequency with which the main variables associated with obstruction by the expert were also justified by RENEX and the frequency with which the justification rules provided by RENEX were deemed to be correct by the expert. Only when RENEX and the domain expert agreed on the diagnosis (87 kidneys) were the results used to test the justification. Results: RENEX agreed with 91% (184/203) of the rules supplied by the expert for justifying the diagnosis. RENEX provided 103 additional rules justifying the diagnosis; the expert agreed that 102 (99%) were correct, although the rules were considered to be of secondary importance. Conclusion: We have described and evaluated a software engine to justify the conclusions of RENEX for detecting renal obstruction with MAG3 renal scans obtained before and after the administration of furosemide. This tool is expected to increase physician confidence in the interpretations provided by RENEX and to assist physicians and trainees in gaining a higher level of expertise. PMID:17332625

  17. A software engine to justify the conclusions of an expert system for detecting renal obstruction on 99mTc-MAG3 scans.

    PubMed

    Garcia, Ernest V; Taylor, Andrew; Manatunga, Daya; Folks, Russell

    2007-03-01

    The purposes of this study were to describe and evaluate a software engine to justify the conclusions reached by a renal expert system (RENEX) for assessing patients with suspected renal obstruction and to obtain from this evaluation new knowledge that can be incorporated into RENEX to attempt to improve diagnostic performance. RENEX consists of 60 heuristic rules extracted from the rules used by a domain expert to generate the knowledge base and a forward-chaining inference engine to determine obstruction. The justification engine keeps track of the sequence of the rules that are instantiated to reach a conclusion. The interpreter can then request justification by clicking on the specific conclusion. The justification process then reports the English translation of all concatenated rules instantiated to reach that conclusion. The justification engine was evaluated with a prospective group of 60 patients (117 kidneys). After reviewing the standard renal mercaptoacetyltriglycine (MAG3) scans obtained before and after the administration of furosemide, a masked expert determined whether each kidney was obstructed, whether the results were equivocal, or whether the kidney was not obstructed and identified and ranked the main variables associated with each interpretation. Two parameters were then tabulated: the frequency with which the main variables associated with obstruction by the expert were also justified by RENEX and the frequency with which the justification rules provided by RENEX were deemed to be correct by the expert. Only when RENEX and the domain expert agreed on the diagnosis (87 kidneys) were the results used to test the justification. RENEX agreed with 91% (184/203) of the rules supplied by the expert for justifying the diagnosis. RENEX provided 103 additional rules justifying the diagnosis; the expert agreed that 102 (99%) were correct, although the rules were considered to be of secondary importance. We have described and evaluated a software engine to justify the conclusions of RENEX for detecting renal obstruction with MAG3 renal scans obtained before and after the administration of furosemide. This tool is expected to increase physician confidence in the interpretations provided by RENEX and to assist physicians and trainees in gaining a higher level of expertise.

  18. Rule based artificial intelligence expert system for determination of upper extremity impairment rating.

    PubMed

    Lim, I; Walkup, R K; Vannier, M W

    1993-04-01

    Quantitative evaluation of upper extremity impairment, a percentage rating most often determined using a rule-based procedure, has been implemented on a personal computer using an artificial intelligence, rule-based expert system (AI system). In this study, the rules given in Chapter 3 of the AMA Guides to the Evaluation of Permanent Impairment (Third Edition) were used to develop such an AI system for the Apple Macintosh. The program applies the rules from the Guides in a consistent and systematic fashion. It is faster and less error-prone than the manual method, and the results have a higher degree of precision, since intermediate values are not truncated.
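
    The core arithmetic such a system automates is the AMA combined-values calculation, which combines ratings as A + B(1 - A) rather than by simple addition; the sketch below applies that formula, with the rounding convention and function name as assumptions.

```python
# Sketch of combined-values arithmetic for impairment ratings: combine
# ratings largest-first with A + B*(1 - A) instead of adding them.

def combine_impairments(percents):
    """Combine impairment ratings given as whole percentages."""
    combined = 0.0
    for p in sorted(percents, reverse=True):
        combined = combined + (p / 100.0) * (1.0 - combined)
    return round(combined * 100)

# e.g. 30% and 20% combine to 44%, not 50%
print(combine_impairments([30, 20]))
```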

  19. Semantic Web Research Trends and Directions

    DTIC Science & Technology

    2006-01-01

    workflow templates. Workflow templates are used for various different tasks such as encoding business rules in a B2B application, specifying domain...recently suggest that rules are desirable in this space, both in terms of their expressivity, and in some cases, due to their attractive computational...of OWL documents. However, in most cases, a more attractive solution is to simply write a rule that captures the inference needed, as it is reusable

  20. Comparison of conventional rule based flow control with control processes based on fuzzy logic in a combined sewer system.

    PubMed

    Klepiszewski, K; Schmitt, T G

    2002-01-01

    While conventional rule-based, real-time flow control of sewer systems is in common use, control systems based on fuzzy logic have been used only rarely, but successfully. The intention of this study is to compare a conventional rule-based control of a combined sewer system with a fuzzy logic control by using hydrodynamic simulation. The objective of both control strategies is to reduce the combined sewer overflow volume by optimizing the utilized storage capacities of four combined sewer overflow tanks. The control systems adjust the outflow of the four combined sewer overflow tanks depending on the water levels inside the structures. Both systems use an identical rule base. The developed control systems are tested and optimized for a single storm event that produces heterogeneous hydraulic load conditions and local discharge. Finally, the efficiencies of the two control systems are compared for two more storm events. The results indicate that the conventional rule-based control and the fuzzy control reach the objective of the control strategy similarly well. In spite of the higher expense of designing the fuzzy control system, its use provides no advantages in this case.
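
    The contrast between the two control styles can be sketched for a single tank as follows; the thresholds, fill levels, and outflow limits are invented numbers, not the study's rule base.

```python
# Sketch contrasting a crisp rule with its fuzzy counterpart for one
# overflow tank: the crisp rule switches hard at a threshold, the fuzzy
# rule pair blends two outflow settings by degree of membership.

def crisp_outflow(level, q_min=0.2, q_max=1.0, threshold=0.7):
    """Conventional rule: open the outflow once the fill level exceeds 70%."""
    return q_max if level > threshold else q_min

def fuzzy_outflow(level, q_min=0.2, q_max=1.0):
    """Fuzzy rules 'level low -> small outflow' and 'level high -> large
    outflow', blended instead of switched."""
    mu_high = min(max((level - 0.4) / 0.5, 0.0), 1.0)   # ramps over 40-90%
    mu_low = 1.0 - mu_high
    return (mu_low * q_min + mu_high * q_max) / (mu_low + mu_high)

for lvl in (0.3, 0.6, 0.9):
    print(f"level {lvl:.0%}: crisp {crisp_outflow(lvl):.2f}, "
          f"fuzzy {fuzzy_outflow(lvl):.2f}")
```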

  1. Rule-based topology system for spatial databases to validate complex geographic datasets

    NASA Astrophysics Data System (ADS)

    Martinez-Llario, J.; Coll, E.; Núñez-Andrés, M.; Femenia-Ribera, C.

    2017-06-01

    A rule-based topology software system providing a highly flexible and fast procedure to enforce integrity in spatial relationships among datasets is presented. This improved topology rule system is built over the spatial extension Jaspa. Both projects are open-source, freely available software developed by the corresponding author of this paper. Currently, there is no spatial DBMS that implements a rule-based topology engine (where the topology rules are designed and enforced in the spatial backend). If the topology rules are applied in the frontend (as in many GIS desktop programs), ArcGIS is the most advanced solution. The system presented in this paper has several major advantages over the ArcGIS approach: it can be extended with new topology rules, it has a much wider set of rules, and it can mix feature attributes with topology rules as filters. In addition, the topology rule system can work with various DBMSs, including PostgreSQL, H2 or Oracle, and the logic is performed in the spatial backend. The proposed topology system allows users to check the complex spatial relationships among features (from one or several spatial layers) that complex cartographic datasets require, such as the data specifications proposed by INSPIRE in Europe and the Land Administration Domain Model (LADM) for cadastral data.

  2. Expert system validation in prolog

    NASA Technical Reports Server (NTRS)

    Stock, Todd; Stachowitz, Rolf; Chang, Chin-Liang; Combs, Jacqueline

    1988-01-01

    An overview is given of the Expert System Validation Assistant (EVA), which is being implemented in Prolog at the Lockheed AI Center. Prolog was chosen to facilitate rapid prototyping of the structure and logic checkers, and since February 1987 we have implemented code to check for irrelevance, subsumption, duplication, dead ends, unreachability, and cycles. The architecture chosen is extremely flexible and extensible, yet concise and complementary with the normal interactive style of Prolog. The foundation of the system is the connection graph representation. Rules and facts are modeled as nodes in the graph, and arcs indicate common patterns between rules. The basic activity of the validation system is then a traversal of the connection graph, searching for the various patterns the system recognizes as erroneous. To aid in specifying these patterns, a metalanguage is developed, providing the user with the basic facilities required to reason about the expert system. Using the metalanguage, the user can, for example, give the Prolog inference engine the goal of finding inconsistent conclusions among the rules, and Prolog will search the graph for instantiations that match the definition of inconsistency. Examples of code for some of the checkers are provided and the algorithms explained. Technical highlights include automatic construction of a connection graph, demonstration of the use of the metalanguage, the A* algorithm modified to detect all unique cycles, general-purpose stacks in Prolog, and a general-purpose database browser with pattern completion.

  3. Evaluation of Anomaly Detection Capability for Ground-Based Pre-Launch Shuttle Operations. Chapter 8

    NASA Technical Reports Server (NTRS)

    Martin, Rodney Alexander

    2010-01-01

    This chapter provides a thorough end-to-end description of the process for evaluating three different data-driven algorithms for anomaly detection, in order to select the best candidate for deployment as part of a suite of IVHM (Integrated Vehicle Health Management) technologies. These algorithms were deemed sufficiently mature to be considered viable candidates for deployment in support of the maiden launch of Ares I-X, the successor to the Space Shuttle for NASA's Constellation program. Data-driven algorithms are just one of three different types being deployed; the other two are a "rule-based" expert system and a "model-based" system. Within these two categories, the deployable candidates have already been selected based upon qualitative factors such as flight heritage. For the rule-based system, SHINE (Spacecraft High-speed Inference Engine) has been selected for deployment; it is a component of BEAM (Beacon-based Exception Analysis for Multimissions), a patented technology developed at NASA's JPL (Jet Propulsion Laboratory), and serves to aid in the management and identification of operational modes. For the "model-based" system, a commercially available package developed by QSI (Qualtech Systems, Inc.), TEAMS (Testability Engineering and Maintenance System), has been selected for deployment to aid in diagnosis. In the context of this particular deployment, distinctions among the use of the terms "data-driven," "rule-based," and "model-based" can be found in. Although three different categories of algorithms have been selected for deployment, our main focus in this chapter is on the evaluation of the three candidates for data-driven anomaly detection. These algorithms are evaluated on their capability for robustly detecting incipient faults or failures in the ground-based phase of pre-launch Space Shuttle operations, rather than on heritage as in previous studies. Robust detection allows pre-specified minimum false alarm and/or missed detection rates to be achieved in the selection of alert thresholds. All algorithms are also optimized with respect to an aggregation of these same criteria. Our study relies upon the use of Shuttle data as a proxy for, and in preparation for application to, Ares I-X data, which uses a very similar hardware platform for the targeted subsystems (the TVC, Thrust Vector Control, subsystem of the SRB, Solid Rocket Booster).
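
    The "robust detection" criterion can be sketched as choosing an alert threshold from anomaly scores on nominal data so that a specified false-alarm rate is not exceeded; the score distribution below is synthetic and the variable names are illustrative.

```python
# Sketch of threshold selection for a target false-alarm rate: take the
# appropriate quantile of anomaly scores computed on healthy (nominal) data.
import numpy as np

rng = np.random.default_rng(1)
nominal_scores = rng.normal(0.0, 1.0, 10_000)   # scores on healthy data

max_false_alarm = 0.01                           # allow 1% false alarms
threshold = np.quantile(nominal_scores, 1.0 - max_false_alarm)

test_scores = np.array([0.2, 1.5, 3.9])
print(f"threshold {threshold:.2f}; alerts: {test_scores > threshold}")
```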

  4. Efficient Web Services Policy Combination

    NASA Technical Reports Server (NTRS)

    Vatan, Farrokh; Harman, Joseph G.

    2010-01-01

    Large-scale Web security systems usually involve cooperation between domains with non-identical policies. The network management and Web communication software used by the different organizations presents a stumbling block. Many of the tools used by the various divisions cannot communicate network management data with each other. At best, this means that manual human intervention into the communication protocols used at various network routers and endpoints is required. Developing practical, sound, and automated ways to compose policies that bridge these differences is a long-standing problem. One of the key subtleties is the need to deal with inconsistencies and defaults, where one organization proposes a rule on a particular feature and another has a different rule or expresses no rule. A general approach is to assign priorities to rules and observe the rules with the highest priorities when there are conflicts. The present methods have an inherent inefficiency that heavily restricts their practical application. A new, efficient algorithm combines policies utilized for Web services. The method is based on an algorithm that allows an automatic and scalable composition of security policies between multiple organizations. It is based on defeasible policy composition, a promising approach for finding conflicts and resolving priorities between rules. In the general case, policy negotiation is an intractable problem. A promising method, suggested in the literature, is to represent policies in defeasible logic and base composition on rules for non-monotonic inference. In this system, policy writers construct metapolicies describing both the policy that they wish to enforce and annotations describing their composition preferences. These annotations can indicate whether certain policy assertions are required by the policy writer or, if not, under what circumstances the policy writer is willing to compromise and allow other assertions to take precedence. Metapolicies are specified in defeasible logic, a computationally efficient non-monotonic logic developed to model human reasoning. One drawback of this method is that at one point the algorithm starts an exhaustive search of all subsets of the set of conclusions of a defeasible theory. Although propositional defeasible logic has linear complexity, the set of conclusions here may be large, especially in real-life practical cases. This phenomenon leads to an inefficient exponential explosion of complexity. The current process for obtaining a Web security policy from the combination of two metapolicies consists of two steps: the first generates a new metapolicy that is a composition of the input metapolicies, and the second maps the metapolicy onto a security policy. The new algorithm avoids the exhaustive search of the current algorithm and provides a security policy that matches all requirements of the involved metapolicies.
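
    A toy sketch of priority-based conflict resolution between two organizations' rules, loosely in the spirit of defeasible composition; the rule encoding, the priorities, and the "required" flag are assumptions, not the article's algorithm.

```python
# Sketch of priority-based policy composition: higher-priority rules win on
# conflicts, and a 'required' rule may never be silently overridden.
from dataclasses import dataclass

@dataclass
class PolicyRule:
    feature: str      # e.g. "encryption"
    value: str        # e.g. "required"
    priority: int     # higher wins on conflict
    required: bool    # True = the writer will not compromise on this rule

def compose(*rule_sets):
    decision: dict[str, PolicyRule] = {}
    for rule in (r for rs in rule_sets for r in rs):
        current = decision.get(rule.feature)
        if current is None or rule.priority > current.priority:
            if (current is not None and current.required
                    and current.value != rule.value):
                raise ValueError(f"irreconcilable conflict on {rule.feature}")
            decision[rule.feature] = rule
    return {feature: r.value for feature, r in decision.items()}

org_a = [PolicyRule("encryption", "required", 2, True)]
org_b = [PolicyRule("encryption", "optional", 1, False),
         PolicyRule("logging", "on", 1, False)]
print(compose(org_a, org_b))  # {'encryption': 'required', 'logging': 'on'}
```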

  5. A new hybrid case-based reasoning approach for medical diagnosis systems.

    PubMed

    Sharaf-El-Deen, Dina A; Moawad, Ibrahim F; Khalifa, M E

    2014-02-01

    Case-Based Reasoning (CBR) has been applied in many different medical applications. Due to the complexity and diversity of this domain, most medical CBR systems become hybrid. Besides, the case adaptation process in CBR is often a challenging issue, as it is traditionally carried out manually by domain experts. In this paper, a new hybrid case-based reasoning approach for medical diagnosis systems is proposed to improve the accuracy of retrieval-only CBR systems. The approach integrates case-based reasoning and rule-based reasoning, and applies the adaptation process automatically by exploiting adaptation rules. Both adaptation rules and reasoning rules are generated from the case base. After solving a new case, the case base is expanded and both the adaptation and reasoning rules are updated. To evaluate the proposed approach, a prototype was implemented and tested on diagnosing breast cancer and thyroid diseases. The final results show that the proposed approach increases the diagnostic accuracy of retrieval-only CBR systems and provides reliable accuracy compared to current breast cancer and thyroid diagnosis systems.
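
    A bare-bones sketch of the retrieve-adapt-retain pattern the paper hybridizes with rules; the cases, similarity measure, and adaptation rule are invented examples, not the paper's method or medical guidance.

```python
# Sketch of hybrid CBR: retrieve the nearest case, adapt its solution with a
# rule derived from the case base, then retain the solved case.

cases = [  # (feature dict, diagnosis)
    ({"age": 45, "tumor_size_mm": 12}, "benign"),
    ({"age": 60, "tumor_size_mm": 30}, "malignant"),
]

def similarity(a, b):
    return -sum(abs(a[k] - b[k]) for k in a)   # negative distance

def diagnose(query):
    # 1. Retrieve the nearest stored case.
    features, diagnosis = max(cases, key=lambda c: similarity(query, c[0]))
    # 2. Adapt with an (illustrative) rule: a much larger tumor size than the
    #    retrieved case overrides the retrieved solution.
    if query["tumor_size_mm"] - features["tumor_size_mm"] > 15:
        diagnosis = "malignant"
    # 3. Retain: expand the case base with the solved case.
    cases.append((query, diagnosis))
    return diagnosis

print(diagnose({"age": 47, "tumor_size_mm": 29}))
```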

  6. A self-learning rule base for command following in dynamical systems

    NASA Technical Reports Server (NTRS)

    Tsai, Wei K.; Lee, Hon-Mun; Parlos, Alexander

    1992-01-01

    In this paper, a self-learning rule base for command following in dynamical systems is presented. The learning is accomplished through reinforcement learning using an associative memory called SAM. The main advantage of SAM is that it is a function approximator with explicit storage of training samples. A learning algorithm patterned after dynamic programming is proposed. Two artificially created, unstable dynamical systems are used for testing, and the rule base was used to generate feedback control to improve the command-following ability of the otherwise uncontrolled systems. The numerical results are very encouraging. The controlled systems exhibit more stable behavior and a better capability to follow reference commands. The rules resulting from the reinforcement learning are explicitly stored, and they can be modified or augmented by human experts. Due to the overlapping storage scheme of SAM, the stored rules are similar to fuzzy rules.

  7. Empirical Analysis and Refinement of Expert System Knowledge Bases

    DTIC Science & Technology

    1988-08-31

    refinement. Both a simulated case generation program and a random rule basher were developed to enhance rule refinement experimentation. *Substantial...the second fiscal year 88 objective was fully met. [Diagram labels: Rule Refinement System, Simulated Rule Basher, Case Generator, Stored Cases, Expert System Knowledge Base] ...generated until the rule is satisfied. Cases may be randomly generated for a given rule or hypothesis. Rule Basher: Given that one has a correct

  8. The impact of egocentric vs. allocentric agency attributions on the neural bases of reasoning about social rules.

    PubMed

    Canessa, Nicola; Pantaleo, Giuseppe; Crespi, Chiara; Gorini, Alessandra; Cappa, Stefano F

    2014-09-18

    We used the "standard" and "switched" social contract versions of the Wason Selection-task to investigate the neural bases of human reasoning about social rules. Both these versions typically elicit the deontically correct answer, i.e. the proper identification of the violations of a conditional obligation. Only in the standard version of the task, however, this response corresponds to the logically correct one. We took advantage of this differential adherence to logical vs. deontical accuracy to test the different predictions of logic rule-based vs. visuospatial accounts of inferential abilities in 14 participants who solved the standard and switched versions of the Selection-task during functional-Magnetic-Resonance-Imaging. Both versions activated the well known left fronto-parietal network of deductive reasoning. The standard version additionally recruited the medial parietal and right inferior parietal cortex, previously associated with mental imagery and with the adoption of egocentric vs. allocentric spatial reference frames. These results suggest that visuospatial processes encoding one's own subjective experience in social interactions may support and shape the interpretation of deductive arguments and/or the resulting inferences, thus contributing to elicit content effects in human reasoning. Copyright © 2014 Elsevier B.V. All rights reserved.

  9. Expert networks in CLIPS

    NASA Technical Reports Server (NTRS)

    Hruska, S. I.; Dalke, A.; Ferguson, J. J.; Lacher, R. C.

    1991-01-01

    Rule-based expert systems may be structurally and functionally mapped onto a special class of neural networks called expert networks. This mapping lends itself to the adaptation of connectionist learning strategies for expert networks. A parsing algorithm to translate C Language Integrated Production System (CLIPS) rules into a network of interconnected assertion and operation nodes has been developed. The translation of CLIPS rules to an expert network and back again is illustrated. Measures of uncertainty similar to those used in MYCIN-like systems are introduced into the CLIPS system, and techniques are presented for combining and firing nodes in the network based on rule firing with these certainty factors in the expert system. Several learning algorithms that automate the process of attaching certainty factors to rules are under study.
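
    For concreteness, here is a sketch of the standard MYCIN-style certainty-factor arithmetic the passage refers to; attaching such values to CLIPS rules is the paper's subject, but the code below is only the textbook combination scheme, with the rule data invented.

```python
# Sketch of MYCIN-style certainty factors: a rule fires at the strength of
# its weakest premise scaled by the rule's own strength, and factors for the
# same hypothesis are combined incrementally.

def combine_cf(cf1, cf2):
    """Combine two certainty factors for the same hypothesis (in [-1, 1])."""
    if cf1 >= 0 and cf2 >= 0:
        return cf1 + cf2 * (1 - cf1)
    if cf1 < 0 and cf2 < 0:
        return cf1 + cf2 * (1 + cf1)
    return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

def rule_cf(premise_cfs, rule_strength):
    """Fuzzy AND of the premises (minimum), scaled by the rule's strength."""
    return min(premise_cfs) * rule_strength

cf_a = rule_cf([0.8, 0.9], rule_strength=0.7)   # rule 1 supports hypothesis
cf_b = rule_cf([0.6], rule_strength=0.5)        # rule 2 supports it too
print(round(combine_cf(cf_a, cf_b), 3))         # evidence accumulates: 0.692
```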

  10. Development of a Spacecraft Materials Selector Expert System

    NASA Technical Reports Server (NTRS)

    Pippin, G.; Kauffman, W. (Technical Monitor)

    2002-01-01

    This report contains a description of the knowledge base tool and examples of its use. A downloadable version of the Spacecraft Materials Selector (SMS) knowledge base is available through the NASA Space Environments and Effects Program. The "Spacecraft Materials Selector" knowledge base is part of an electronic expert system. The expert system consists of an inference engine that contains the "decision-making" code and the knowledge base that contains the selected body of information. The inference engine is a software package previously developed at Boeing, called the Boeing Expert System Tool (BEST) kit.

  11. Synaptic and nonsynaptic plasticity approximating probabilistic inference

    PubMed Central

    Tully, Philip J.; Hennig, Matthias H.; Lansner, Anders

    2014-01-01

    Learning and memory operations in neural circuits are believed to involve molecular cascades of synaptic and nonsynaptic changes that lead to a diverse repertoire of dynamical phenomena at higher levels of processing. Hebbian and homeostatic plasticity, neuromodulation, and intrinsic excitability all conspire to form and maintain memories. But it is still unclear how these seemingly redundant mechanisms could jointly orchestrate learning in a more unified system. To this end, a Hebbian learning rule for spiking neurons inspired by Bayesian statistics is proposed. In this model, synaptic weights and intrinsic currents are adapted on-line upon arrival of single spikes, which initiate a cascade of temporally interacting memory traces that locally estimate probabilities associated with relative neuronal activation levels. Trace dynamics enable synaptic learning to readily demonstrate a spike-timing dependence, stably return to a set-point over long time scales, and remain competitive despite this stability. Beyond unsupervised learning, linking the traces with an external plasticity-modulating signal enables spike-based reinforcement learning. At the postsynaptic neuron, the traces are represented by an activity-dependent ion channel that is shown to regulate the input received by a postsynaptic cell and generate intrinsic graded persistent firing levels. We show how spike-based Hebbian-Bayesian learning can be performed in a simulated inference task using integrate-and-fire (IAF) neurons that are Poisson-firing and background-driven, similar to the preferred regime of cortical neurons. Our results support the view that neurons can represent information in the form of probability distributions, and that probabilistic inference could be a functional by-product of coupled synaptic and nonsynaptic mechanisms operating over several timescales. The model provides a biophysical realization of Bayesian computation by reconciling several observed neural phenomena whose functional effects are only partially understood in concert. PMID:24782758
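
    A loose sketch of the trace idea: exponentially filtered spike trains act as on-line probability estimates, from which a Bayesian (log-odds) weight follows, in the spirit of BCPNN-style learning; the time constants, input pattern, and regularization below are assumptions, not the paper's model.

```python
# Sketch of probability traces and a Bayesian weight: low-pass filtered
# spike trains estimate P(pre), P(post) and P(pre, post) on-line, and the
# weight is their log-odds ratio (positive for correlated activity).
import math

def update_trace(trace, spike, dt=1.0, tau=100.0):
    """Exponential filter; the trace approximates a firing probability."""
    return trace + (float(spike) - trace) * (dt / tau)

eps = 1e-4                       # keeps probabilities away from zero
p_i = p_j = p_ij = eps
for t in range(2000):
    pre = (t % 10 == 0)          # correlated pre/post spiking
    post = pre
    p_i = update_trace(p_i, pre)
    p_j = update_trace(p_j, post)
    p_ij = update_trace(p_ij, pre and post)

w_ij = math.log((p_ij + eps) / ((p_i + eps) * (p_j + eps)))
print(f"p_i~{p_i:.3f}, p_ij~{p_ij:.3f}, weight {w_ij:.2f} (positive)")
```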

  12. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, The CLIPS Intelligent Tutoring System for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh and DEC VAX series computers operating under VMS or ULTRIX. 
The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.
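
    The match/fire cycle that CLIPS implements (far more efficiently, via the Rete network) can be sketched in a few lines of Python; the rules and facts below are invented examples and do not use CLIPS syntax.

```python
# Sketch of a forward-chaining match/fire cycle: repeatedly fire any rule
# whose conditions are satisfied by the fact base until nothing new is added.

facts = {("duck", "waddles"), ("duck", "quacks")}

# each rule: (set of required (subject, attribute) facts, fact to assert)
rules = [
    ({("duck", "waddles"), ("duck", "quacks")}, ("duck", "is-a-duck")),
    ({("duck", "is-a-duck")}, ("duck", "can-swim")),
]

changed = True
while changed:                       # repeat until no rule adds a new fact
    changed = False
    for conditions, conclusion in rules:
        if conditions <= facts and conclusion not in facts:
            facts.add(conclusion)    # 'fire' the rule
            changed = True

print(sorted(facts))
```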

  13. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (MACINTOSH VERSION)

    NASA Technical Reports Server (NTRS)

    Culbert, C.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or outperforms most micro- and minicomputer based artificial intelligence tools. CLIPS is a forward chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches the number of fields. Wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (outside functions which are written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line oriented version. The mouse/window interface version for the PC works with a Microsoft compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor. This version allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, The CLIPS Intelligent Tutoring System for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on an IBM PC computer operating under DOS, a Macintosh and DEC VAX series computers operating under VMS or ULTRIX. 
The line oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.

  14. CLIPS - C LANGUAGE INTEGRATED PRODUCTION SYSTEM (IBM PC VERSION WITH CLIPSITS)

    NASA Technical Reports Server (NTRS)

    Riley, G.

    1994-01-01

    The C Language Integrated Production System, CLIPS, is a shell for developing expert systems. It is designed to allow artificial intelligence research, development, and delivery on conventional computers. The primary design goals for CLIPS are portability, efficiency, and functionality. For these reasons, the program is written in C. CLIPS meets or exceeds the performance of most micro- and minicomputer-based artificial intelligence tools. CLIPS is a forward-chaining rule-based language. The program contains an inference engine and a language syntax that provide a framework for the construction of an expert system. It also includes tools for debugging an application. CLIPS is based on the Rete algorithm, which enables very efficient pattern matching. The collection of conditions and actions to be taken if the conditions are met is constructed into a rule network. As facts are asserted either prior to or during a session, CLIPS pattern-matches them against the conditions in the rule network; wildcards and variables are supported for both single and multiple fields. CLIPS syntax allows the inclusion of externally defined functions (functions written in a language other than CLIPS). CLIPS itself can be embedded in a program such that the expert system is available as a simple subroutine call. Advanced features found in CLIPS version 4.3 include an integrated microEMACS editor, the ability to generate C source code from a CLIPS rule base to produce a dedicated executable, binary load and save capabilities for CLIPS rule bases, and the utility program CRSV (Cross-Reference, Style, and Verification) designed to facilitate the development and maintenance of large rule bases. Five machine versions are available. Each machine version includes the source and the executable for that machine. The UNIX version includes the source and binaries for IBM RS/6000, Sun3 series, and Sun4 series computers. The UNIX, DEC VAX, and DEC RISC Workstation versions are line-oriented. The PC version and the Macintosh version each contain a windowing variant of CLIPS as well as the standard line-oriented version. The mouse/window interface version for the PC works with a Microsoft-compatible mouse or without a mouse. This window version uses the proprietary CURSES library for the PC, but a working executable of the window version is provided. The window-oriented version for the Macintosh includes a version which uses a full Macintosh-style interface, including an integrated editor; it allows the user to observe the changing fact base and rule activations in separate windows while a CLIPS program is executing. The IBM PC version is available bundled with CLIPSITS, the CLIPS Intelligent Tutoring System, for a special combined price (COS-10025). The goal of CLIPSITS is to provide the student with a tool to practice the syntax and concepts covered in the CLIPS User's Guide. It attempts to provide expert diagnosis and advice during problem solving, which is typically not available without an instructor. CLIPSITS is divided into 10 lessons which mirror the first 10 chapters of the CLIPS User's Guide. The program was developed for the IBM PC series with a hard disk. CLIPSITS is also available separately as MSC-21679. The CLIPS program is written in C for interactive execution and has been implemented on IBM PC computers operating under DOS, on the Macintosh, and on DEC VAX series computers operating under VMS or ULTRIX. The line-oriented version should run on any computer system which supports a full (Kernighan and Ritchie) C compiler or the ANSI standard C language. CLIPS was developed in 1986 and Version 4.2 was released in July of 1988. Version 4.3 was released in June of 1989.
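
    The CLIPS records above describe the classic forward-chaining cycle: facts are asserted into a working memory, and any rule whose conditions match fires, possibly asserting new facts that enable further rules. The following is a minimal, deliberately naive Python sketch of that cycle, not CLIPS itself (CLIPS's Rete network avoids re-scanning every rule on every change); all fact and rule names here are hypothetical.

    def forward_chain(facts, rules):
        """Fire rules until no rule can add a new fact (a naive fixed point)."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for conditions, consequent in rules:
                if set(conditions) <= facts and consequent not in facts:
                    facts.add(consequent)   # assert the derived fact
                    changed = True          # a new fact may enable other rules
        return facts

    rules = [
        (("voltage-low", "pump-on"), "suspect-power-fault"),
        (("suspect-power-fault",), "switch-to-backup-bus"),
    ]
    print(forward_chain({"voltage-low", "pump-on"}, rules))
    # the returned fact set includes both derived consequents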

  15. GenSo-EWS: a novel neural-fuzzy based early warning system for predicting bank failures.

    PubMed

    Tung, W L; Quek, C; Cheng, P

    2004-05-01

    Bank failure prediction is an important issue for the regulators of the banking industries. The collapse and failure of a bank can trigger adverse financial repercussions, such as a massive bailout cost for the failing bank and a loss of confidence among investors and depositors. Very often, bank failures are due to financial distress. Hence, it is desirable to have an early warning system (EWS) that identifies potential bank failures or high-risk banks through the traits of financial distress. Various traditional statistical models have been employed to study bank failures [J Finance 1 (1975) 21; J Banking Finance 1 (1977) 249; J Banking Finance 10 (1986) 511; J Banking Finance 19 (1995) 1073]. However, these models cannot identify the characteristics of financial distress and thus function as black boxes. This paper proposes the use of a new neural fuzzy system [Foundations of neuro-fuzzy systems, 1997], namely the Generic Self-organising Fuzzy Neural Network (GenSoFNN) [IEEE Trans Neural Networks 13 (2002c) 1075] based on the compositional rule of inference (CRI) [Commun ACM 37 (1975) 77], as an alternative for predicting bank failure. The CRI-based GenSoFNN neural fuzzy network, henceforth denoted GenSoFNN-CRI(S), functions as an EWS and is able to identify the inherent traits of financial distress from financial covariates (features) derived from publicly available financial statements. The interactions between the selected features are captured in the form of highly intuitive IF-THEN fuzzy rules. Such easily comprehensible rules provide insights into the possible characteristics of financial distress and form the knowledge base for a highly desired EWS that aids bank regulation. The performance of the GenSoFNN-CRI(S) network is subsequently benchmarked against that of Cox's proportional hazards model [J Banking Finance 10 (1986) 511; J Banking Finance 19 (1995) 1073], the multi-layered perceptron (MLP) and the modified cerebellar model articulation controller (MCMAC) [IEEE Trans Syst Man Cybern: Part B 30 (2000) 491] in predicting bank failures based on a population of 3635 US banks observed over a 21-year period. Three sets of experiments are performed: bank failure classification based on the last available financial record, and prediction using financial records one and two years prior to the last available financial statements. The performance of the GenSoFNN-CRI(S) network as a bank failure classifier and EWS is encouraging.

  16. Research on complex 3D tree modeling based on L-system

    NASA Astrophysics Data System (ADS)

    Gang, Chen; Bin, Chen; Yuming, Liu; Hui, Li

    2018-03-01

    The L-system, as a fractal iterative system, can simulate complex geometric patterns. Based on field observation data of trees and the knowledge of forestry experts, this paper extracted modeling constraint rules and obtained an L-system rule set. Using self-developed L-system modeling software, the rule set was parsed to generate complex 3D tree models. The results showed that the geometric modeling method based on L-systems can describe the morphological structure of complex trees and generate 3D tree models.
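
    The rewriting mechanism behind the paper is easy to state: an L-system repeatedly replaces every symbol of a string in parallel according to production rules, and the resulting string is interpreted as drawing or branching commands. Below is a minimal sketch using a textbook branching rule, not the authors' forestry-derived rule set.

    def lsystem(axiom, rules, iterations):
        """Parallel string rewriting: every symbol is rewritten at once."""
        s = axiom
        for _ in range(iterations):
            s = "".join(rules.get(ch, ch) for ch in s)
        return s

    # F = grow a segment, + / - = turn, [ ] = push/pop a branch point
    rules = {"F": "F[+F]F[-F]F"}
    print(lsystem("F", rules, 2))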

  17. ShinyKGode: an interactive application for ODE parameter inference using gradient matching.

    PubMed

    Wandy, Joe; Niu, Mu; Giurghita, Diana; Daly, Rónán; Rogers, Simon; Husmeier, Dirk

    2018-07-01

    Mathematical modelling based on ordinary differential equations (ODEs) is widely used to describe the dynamics of biological systems, particularly in systems and pathway biology. Often the kinetic parameters of these ODE systems are unknown and have to be inferred from the data. Approximate parameter inference methods based on gradient matching (which do not require performing computationally expensive numerical integration of the ODEs) have gained popularity in recent years, but many implementations are difficult to run without expert knowledge. Here, we introduce ShinyKGode, an interactive web application to perform fast parameter inference on ODEs using gradient matching. ShinyKGode can be used to infer ODE parameters on simulated and observed data using gradient matching. Users can easily load their own models in Systems Biology Markup Language format, and a set of pre-defined ODE benchmark models are provided in the application. Inferred parameters are visualized alongside diagnostic plots to assess convergence. The R package for ShinyKGode can be installed through the Comprehensive R Archive Network (CRAN). Installation instructions, as well as tutorial videos and source code, are available at https://joewandy.github.io/shinyKGode. Supplementary data are available at Bioinformatics online.
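
    Gradient matching replaces expensive ODE integration with a two-step fit: smooth the observed trajectory, then choose parameters that make the model's right-hand side agree with the smoothed derivative. The sketch below uses simple polynomial smoothing to illustrate the principle only; ShinyKGode itself relies on Gaussian-process-based machinery, and the one-parameter decay model and all values here are invented.

    import numpy as np
    from scipy.optimize import minimize

    def rhs(x, theta):                 # hypothetical model: dx/dt = -theta * x
        return -theta[0] * x

    rng = np.random.default_rng(0)
    t = np.linspace(0, 5, 50)
    x_obs = 2.0 * np.exp(-0.7 * t) + 0.05 * rng.standard_normal(t.size)

    poly = np.polynomial.Polynomial.fit(t, x_obs, deg=8)
    x_s, dx_s = poly(t), poly.deriv()(t)   # smoothed states and derivatives

    obj = lambda th: np.sum((dx_s - rhs(x_s, th)) ** 2)
    print(minimize(obj, x0=[1.0]).x)       # recovers a value near 0.7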

  18. RuleMonkey: software for stochastic simulation of rule-based models

    PubMed Central

    2010-01-01

    Background: The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results: Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions: RuleMonkey enables the simulation of rule-based models for which the underlying reaction networks are large. It is typically faster than DYNSTOC for benchmark problems that we have examined. RuleMonkey is freely available as a stand-alone application http://public.tgen.org/rulemonkey. It is also available as a simulation engine within GetBonNie, a web-based environment for building, analyzing and sharing rule-based models. PMID:20673321
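
    The heart of a network-free simulator is an ordinary stochastic simulation step applied to reactions generated on the fly from rules, instead of to a pre-enumerated network. As a point of reference, here is a sketch of Gillespie's direct method for a single reversible binding reaction A + B <-> AB with invented rate constants; RuleMonkey applies the same waiting-time logic while deriving the available reactions from BNGL rules at each step.

    import math, random

    def gillespie(state, t_end, k_bind=0.01, k_unbind=0.1):
        """Gillespie's direct method for A + B <-> AB (toy rate constants)."""
        t = 0.0
        while t < t_end:
            a_bind = k_bind * state["A"] * state["B"]     # reaction propensities
            a_unbind = k_unbind * state["AB"]
            a_total = a_bind + a_unbind
            if a_total == 0:
                break
            t += -math.log(random.random()) / a_total     # exponential waiting time
            if random.random() * a_total < a_bind:
                state["A"] -= 1; state["B"] -= 1; state["AB"] += 1
            else:
                state["A"] += 1; state["B"] += 1; state["AB"] -= 1
        return state

    print(gillespie({"A": 100, "B": 80, "AB": 0}, t_end=10.0))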

  19. TARGETED SEQUENTIAL DESIGN FOR TARGETED LEARNING INFERENCE OF THE OPTIMAL TREATMENT RULE AND ITS MEAN REWARD.

    PubMed

    Chambaz, Antoine; Zheng, Wenjing; van der Laan, Mark J

    2017-01-01

    This article studies the targeted sequential inference of an optimal treatment rule (TR) and its mean reward in the non-exceptional case, i.e., assuming that there is no stratum of the baseline covariates where treatment is neither beneficial nor harmful, and under a companion margin assumption. Our pivotal estimator, whose definition hinges on the targeted minimum loss estimation (TMLE) principle, actually infers the mean reward under the current estimate of the optimal TR. This data-adaptive statistical parameter is worthy of interest on its own. Our main result is a central limit theorem which enables the construction of confidence intervals on both mean rewards, under the current estimate of the optimal TR and under the optimal TR itself. The asymptotic variance of the estimator takes the form of the variance of an efficient influence curve at a limiting distribution, allowing us to discuss the efficiency of inference. As a by-product, we also derive confidence intervals on two cumulated pseudo-regrets, a key notion in the study of bandit problems. A simulation study illustrates the procedure. One of the cornerstones of the theoretical study is a new maximal inequality for martingales with respect to the uniform entropy integral.

  20. No interpretation without representation: the role of domain-specific representations and inferences in the Wason selection task.

    PubMed

    Fiddick, L; Cosmides, L; Tooby, J

    2000-10-16

    The Wason selection task is a tool used to study reasoning about conditional rules. Performance on this task changes systematically when one varies its content, and these content effects have been used to argue that the human cognitive architecture contains a number of domain-specific representation and inference systems, such as social contract algorithms and hazard management systems. Recently, however, Sperber, Cara & Girotto (Sperber, D., Cara, F., & Girotto, V. (1995). Relevance theory explains the selection task. Cognition, 57, 31-95) have proposed that relevance theory can explain performance on the selection task - including all content effects - without invoking inference systems that are content-specialized. Herein, we show that relevance theory alone cannot explain a variety of content effects - effects that were predicted in advance and are parsimoniously explained by theories that invoke domain-specific algorithms for representing and making inferences about (i) social contracts and (ii) reducing risk in hazardous situations. Moreover, although Sperber et al. (1995) were able to use relevance theory to produce some new content effects in other domains, they conducted no experiments involving social exchanges or precautions, and so were unable to determine which - content-specialized algorithms or relevance effects - dominate reasoning when the two conflict. When experiments, reported herein, are constructed so that the different theories predict divergent outcomes, the results support the predictions of social contract theory and hazard management theory, indicating that these inference systems override content-general relevance factors. The fact that social contract and hazard management algorithms provide better explanations for performance in their respective domains does not mean that the content-general logical procedures posited by relevance theory do not exist, or that relevance effects never occur. It does mean, however, that one needs a principled way of explaining which effects will dominate when a set of inputs activate more than one reasoning system. We propose the principle of pre-emptive specificity - that the human cognitive architecture should be designed so that more specialized inference systems pre-empt more general ones whenever the stimuli centrally fit the input conditions of the more specialized system. This principle follows from evolutionary and computational considerations that are common to both relevance theory and the ecological rationality approach.

  1. Agile text mining for the 2014 i2b2/UTHealth Cardiac risk factors challenge.

    PubMed

    Cormack, James; Nath, Chinmoy; Milward, David; Raja, Kalpana; Jonnalagadda, Siddhartha R

    2015-12-01

    This paper describes the use of an agile text mining platform (Linguamatics' Interactive Information Extraction Platform, I2E) to extract document-level cardiac risk factors in patient records as defined in the i2b2/UTHealth 2014 challenge. The approach uses a data-driven rule-based methodology with the addition of a simple supervised classifier. We demonstrate that agile text mining allows for rapid optimization of extraction strategies, while post-processing can leverage annotation guidelines, corpus statistics and logic inferred from the gold standard data. We also show how data imbalance in a training set affects performance. Evaluation of this approach on the test data gave an F-Score of 91.7%, one percent behind the top performing system. Copyright © 2015 Elsevier Inc. All rights reserved.

  2. Language Acquisition in a Unification-Based Grammar Processing System Using a Real-World Knowledge Base.

    ERIC Educational Resources Information Center

    Russell, Dale W.

    An obstacle in Natural Language understanding is the existence of lexical gaps, i.e. words or word senses that are not in the lexicon of the system. This thesis describes the implementation of MURRAY, a learning mechanism which infers the properties of a new lexical item from its syntactical environment and infers its meaning based on context and…

  3. A Swarm Optimization approach for clinical knowledge mining.

    PubMed

    Christopher, J Jabez; Nehemiah, H Khanna; Kannan, A

    2015-10-01

    Rule-based classification is a typical data mining task that is used in several medical diagnosis and decision support systems. The rules stored in the rule base have an impact on classification efficiency. Rule sets that are extracted with data mining tools and techniques are optimized using heuristic or meta-heuristic approaches in order to improve the quality of the rule base. In this work, a meta-heuristic approach called Wind-driven Swarm Optimization (WSO) is used. The uniqueness of this work lies in the biological inspiration that underlies the algorithm. WSO uses Jval, a new metric, to evaluate the efficiency of a rule-based classifier. Rules are extracted from decision trees. WSO is used to obtain different permutations and combinations of rules, whereby the optimal ruleset that satisfies the requirements of the developer is used for predicting the test data. The performance of various extensions of decision trees, namely RIPPER, PART, FURIA and Decision Tables, is analyzed. The efficiency of WSO is also compared with that of traditional Particle Swarm Optimization (PSO). Experiments were carried out with six benchmark medical datasets. The traditional C4.5 algorithm yields 62.89% accuracy with 43 rules on the liver-disorders dataset, whereas WSO yields 64.60% with 19 rules. For the heart-disease dataset, C4.5 is 68.64% accurate with 98 rules, whereas WSO is 77.8% accurate with 34 rules. The normalized standard deviations for the accuracy of PSO and WSO are 0.5921 and 0.5846, respectively. WSO provides accurate and concise rulesets. PSO yields results similar to those of WSO, but the novelty of WSO lies in its biological motivation and its customization for rule-base optimization. The trade-off between prediction accuracy and the size of the rule base is optimized during the design and development of a rule-based clinical decision support system. The efficiency of a decision support system relies on the content of the rule base and its classification accuracy. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
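
    The optimization loop the abstract describes can be pictured as scoring candidate rulesets and keeping the best one found. The sketch below substitutes plain random search for WSO, whose wind-driven update equations are not reproduced here, and uses a hypothetical Jval-like score that trades accuracy against ruleset size; the rule names and toy accuracy function are invented.

    import random

    def jval(ruleset, accuracy_of):          # hypothetical Jval-style metric
        return accuracy_of(ruleset) - 0.01 * len(ruleset)

    def search_rulesets(rules, accuracy_of, iters=200, seed=1):
        """Random-search stand-in for a swarm over rule subsets."""
        random.seed(seed)
        best, best_score = list(rules), jval(rules, accuracy_of)
        for _ in range(iters):
            cand = random.sample(rules, k=random.randint(1, len(rules)))
            s = jval(cand, accuracy_of)
            if s > best_score:
                best, best_score = cand, s
        return best

    rules = [f"rule{i}" for i in range(20)]
    toy_acc = lambda rs: 0.5 + 0.02 * sum(int(r[4:]) % 3 == 0 for r in rs)
    print(search_rulesets(rules, toy_acc))   # keeps mostly the useful rules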

  4. Avoiding criminal liabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Blattner, J.W.; Bramble, G.M.

    1994-06-01

    Armed with more than 120 investigative agents, the US Environmental Protection Agency, through its attorneys at the Dept. of Justice, charges 5 to 10 engineers and business people with criminal violations of the nation's environmental regulations in any given week. There are some 10,000 pages of federal (let alone state) environmental regulations. The rules apply to large and small companies alike. As a practical matter, the sheer scope and complexity of environmental regulatory programs make 100% compliance virtually unattainable for most industrial enterprises. Where it is no longer a defense to claim lack of knowledge of one's regulatory obligations, and where courts allow the inference of criminal knowledge based on what the defendant should have known, what is a company to do? The environmental audit provides a solution to this problem. Progressive audit programs are established with three goals in mind: to ensure that programs and practices at facilities are in compliance with applicable rules and regulations; to affirm that management systems are in place at the facilities to support ongoing compliance; and to identify needs or opportunities where it may be desirable to go beyond compliance to protect human health and the environment. This paper discusses the implementation of an audit program.

  5. A fuzzy decision tree for fault classification.

    PubMed

    Zio, Enrico; Baraldi, Piero; Popescu, Irina C

    2008-02-01

    In plant accident management, the control room operators are required to identify the causes of the accident, based on the different patterns of evolution that the monitored process variables develop. This task is often quite challenging, given the large number of process parameters monitored and the intense emotional states under which it is performed. To aid the operators, various techniques of fault classification have been engineered. An important requirement for their practical application is the physical interpretability of the relationships among the process variables underpinning the fault classification. In this view, the present work propounds a fuzzy approach to fault classification, which relies on fuzzy if-then rules inferred from the clustering of available preclassified signal data, which are then organized in a logical and transparent decision tree structure. The advantages offered by the proposed approach are precisely that a transparent fault classification model is mined out of the signal data and that the underlying physical relationships among the process variables are easily interpretable as linguistic if-then rules that can be explicitly visualized in the decision tree structure. The approach is applied to a case study regarding the classification of simulated faults in the feedwater system of a boiling water reactor.

  6. The architecture of adaptive neural network based on a fuzzy inference system for implementing intelligent control in photovoltaic systems

    NASA Astrophysics Data System (ADS)

    Gimazov, R.; Shidlovskiy, S.

    2018-05-01

    In this paper, we consider the architecture of an algorithm for extreme power regulation in a photovoltaic system. An algorithm based on an adaptive neural network with fuzzy inference is proposed. Implementing such an algorithm not only solves a number of problems found in existing algorithms for extreme power regulation of photovoltaic systems, but also lays the groundwork for a universal control system for photovoltaic systems.

  7. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    NASA Astrophysics Data System (ADS)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state- of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
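
    The SDB pipeline (capture data from the subject system, have an expert label it, induce rules) can be mimicked with any off-the-shelf inductive learner. In the sketch below, a scikit-learn decision tree stands in for the SDB's induction component, and the sensor readings and labels are invented.

    from sklearn.tree import DecisionTreeClassifier, export_text

    # hypothetical telemetry labeled "nominal"/"fault" by a subject matter expert
    X = [[2.1, 30], [2.0, 31], [0.4, 55], [0.5, 58], [2.2, 29], [0.3, 60]]
    y = ["nominal", "nominal", "fault", "fault", "nominal", "fault"]

    tree = DecisionTreeClassifier(max_depth=2).fit(X, y)
    # read the induced model back as human-checkable rules
    print(export_text(tree, feature_names=["flow_rate", "temperature"]))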

  8. Bayesian Networks Improve Causal Environmental Assessments for Evidence-Based Policy.

    PubMed

    Carriger, John F; Barron, Mace G; Newman, Michael C

    2016-12-20

    Rule-based weight of evidence approaches to ecological risk assessment may not account for uncertainties and generally lack probabilistic integration of lines of evidence. Bayesian networks allow causal inferences to be made from evidence by including causal knowledge about the problem, using this knowledge with probabilistic calculus to combine multiple lines of evidence, and minimizing biases in predicting or diagnosing causal relationships. Too often, sources of uncertainty in conventional weight of evidence approaches are ignored that can be accounted for with Bayesian networks. Specifying and propagating uncertainties improve the ability of models to incorporate strength of the evidence in the risk management phase of an assessment. Probabilistic inference from a Bayesian network allows evaluation of changes in uncertainty for variables from the evidence. The network structure and probabilistic framework of a Bayesian approach provide advantages over qualitative approaches in weight of evidence for capturing the impacts of multiple sources of quantifiable uncertainty on predictions of ecological risk. Bayesian networks can facilitate the development of evidence-based policy under conditions of uncertainty by incorporating analytical inaccuracies or the implications of imperfect information, structuring and communicating causal issues through qualitative directed graph formulations, and quantitatively comparing the causal power of multiple stressors on valued ecological resources. These aspects are demonstrated through hypothetical problem scenarios that explore some major benefits of using Bayesian networks for reasoning and making inferences in evidence-based policy.
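
    A toy version of the idea: one cause node ("stressor present") with two conditionally independent lines of evidence, combined by Bayes' theorem rather than by a qualitative weighing rule. All probabilities below are invented for illustration; a real assessment would elicit them from data or expert knowledge.

    p_cause = 0.2                           # prior P(stressor present)
    p_e1 = {True: 0.8, False: 0.1}          # P(evidence 1 | stressor)
    p_e2 = {True: 0.7, False: 0.3}          # P(evidence 2 | stressor)

    def posterior(e1, e2):
        """P(stressor | e1, e2) by enumeration over the single cause node."""
        def joint(cause):
            prior = p_cause if cause else 1 - p_cause
            l1 = p_e1[cause] if e1 else 1 - p_e1[cause]
            l2 = p_e2[cause] if e2 else 1 - p_e2[cause]
            return prior * l1 * l2
        return joint(True) / (joint(True) + joint(False))

    print(round(posterior(e1=True, e2=True), 3))   # both lines observed: 0.824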

  9. An expert system to manage the operation of the Space Shuttle's fuel cell cryogenic reactant tanks

    NASA Technical Reports Server (NTRS)

    Murphey, Amy Y.

    1990-01-01

    This paper describes a rule-based expert system to manage the operation of the Space Shuttle's cryogenic fuel system. Rules are based on standard fuel tank operating procedures described in the EECOM Console Handbook. The problem of configuring the operation of the Space Shuttle's fuel tanks is well-bounded and well defined. Moreover, the solution of this problem can be encoded in a knowledge-based system. Therefore, a rule-based expert system is the appropriate paradigm. Furthermore, the expert system could be used in coordination with power system simulation software to design operating procedures for specific missions.

  10. Research of Litchi Diseases Diagnosis Expertsystem Based on Rbr and Cbr

    NASA Astrophysics Data System (ADS)

    Xu, Bing; Liu, Liqun

    To overcome the bottleneck problems of traditional rule-based reasoning disease-diagnosis systems, such as low reasoning efficiency and lack of flexibility, this work investigated integrating case-based reasoning (CBR) and rule-based reasoning (RBR) technology, and put forward a litchi disease diagnosis expert system (LDDES) with an integrated reasoning method. The method uses data mining and knowledge-acquisition technology to establish the knowledge base and case library. It adopts rules to guide retrieval and matching for CBR, and uses association-rule and decision-tree algorithms to calculate case similarity. The experiment shows that the method can increase the system's flexibility and reasoning ability, and improve the accuracy of litchi disease diagnosis.

  11. Fuzzy support vector machine: an efficient rule-based classification technique for microarrays.

    PubMed

    Hajiloo, Mohsen; Rabiee, Hamid R; Anooshahpour, Mahdi

    2013-01-01

    The abundance of gene expression microarray data has led to the development of machine learning algorithms applicable for tackling disease diagnosis, disease prognosis, and treatment selection problems. However, these algorithms often produce classifiers with weaknesses in terms of accuracy, robustness, and interpretability. This paper introduces fuzzy support vector machine which is a learning algorithm based on combination of fuzzy classifiers and kernel machines for microarray classification. Experimental results on public leukemia, prostate, and colon cancer datasets show that fuzzy support vector machine applied in combination with filter or wrapper feature selection methods develops a robust model with higher accuracy than the conventional microarray classification models such as support vector machine, artificial neural network, decision trees, k nearest neighbors, and diagonal linear discriminant analysis. Furthermore, the interpretable rule-base inferred from fuzzy support vector machine helps extracting biological knowledge from microarray data. Fuzzy support vector machine as a new classification model with high generalization power, robustness, and good interpretability seems to be a promising tool for gene expression microarray classification.

  12. F-OWL: An Inference Engine for Semantic Web

    NASA Technical Reports Server (NTRS)

    Zou, Youyong; Finin, Tim; Chen, Harry

    2004-01-01

    Understanding and using the data and knowledge encoded in semantic web documents requires an inference engine. F-OWL is an inference engine for the semantic web language OWL, based on F-logic, an approach to defining frame-based systems in logic. F-OWL is implemented using XSB and Flora-2 and takes full advantage of their features. We describe how F-OWL computes ontology entailment and compare it with other description-logic-based approaches. We also describe TAGA, a trading agent environment that we have used as a test bed for F-OWL and to explore how multiagent systems can use semantic web concepts and technology.

  13. Comparative analysis of expert and machine-learning methods for classification of body cavity effusions in companion animals.

    PubMed

    Hotz, Christine S; Templeton, Steven J; Christopher, Mary M

    2005-03-01

    A rule-based expert system using the CLIPS programming language was created to classify body cavity effusions as transudates, modified transudates, exudates, chylous, and hemorrhagic effusions. The diagnostic accuracy of the rule-based system was compared with that produced by 2 machine-learning methods: Rosetta, a rough sets algorithm, and RIPPER, a rule-induction method. Results of 508 body cavity fluid analyses (canine, feline, equine) obtained from the University of California-Davis Veterinary Medical Teaching Hospital computerized patient database were used to test CLIPS and to test and train RIPPER and Rosetta. The CLIPS system, using 17 rules, achieved an accuracy of 93.5% compared with pathologist consensus diagnoses. Rosetta accurately classified 91% of effusions by using 5,479 rules. RIPPER achieved the greatest accuracy (95.5%) using only 10 rules. When the original rules of the CLIPS application were replaced with those of RIPPER, the accuracy rates were identical. These results suggest that both rule-based expert systems and machine-learning methods hold promise for the preliminary classification of body fluids in the clinical laboratory.

  14. Double jeopardy in inferring cognitive processes

    PubMed Central

    Fific, Mario

    2014-01-01

    Inferences we make about underlying cognitive processes can be jeopardized in two ways due to problematic forms of aggregation. First, averaging across individuals is typically considered a very useful tool for removing random variability. The threat is that averaging across subjects leads to averaging across different cognitive strategies, thus harming our inferences. The second threat comes from the construction of inadequate research designs possessing a low diagnostic accuracy of cognitive processes. For that reason we introduced the systems factorial technology (SFT), which has primarily been designed to make inferences about underlying processing order (serial, parallel, coactive), stopping rule (terminating, exhaustive), and process dependency. SFT proposes that the minimal research design complexity to learn about n number of cognitive processes should be equal to 2n. In addition, SFT proposes that (a) each cognitive process should be controlled by a separate experimental factor, and (b) The saliency levels of all factors should be combined in a full factorial design. In the current study, the author cross combined the levels of jeopardies in a 2 × 2 analysis, leading to four different analysis conditions. The results indicate a decline in the diagnostic accuracy of inferences made about cognitive processes due to the presence of each jeopardy in isolation and when combined. The results warrant the development of more individual subject analyses and the utilization of full-factorial (SFT) experimental designs. PMID:25374545

  15. Implementing a Commercial Rule Base as a Medication Order Safety Net

    PubMed Central

    Reichley, Richard M.; Seaton, Terry L.; Resetar, Ervina; Micek, Scott T.; Scott, Karen L.; Fraser, Victoria J.; Dunagan, W. Claiborne; Bailey, Thomas C.

    2005-01-01

    A commercial rule base (Cerner Multum) was used to identify medication orders exceeding recommended dosage limits at five hospitals within BJC HealthCare, an integrated health care system. During initial testing, clinical pharmacists determined that there was an excessive number of nuisance and clinically insignificant alerts, with an overall alert rate of 9.2%. A method for customizing the commercial rule base was implemented to increase rule specificity for problematic rules. The system was subsequently deployed at two facilities and achieved alert rates of less than 1%. Pharmacists screened these alerts and contacted ordering physicians in 21% of cases. Physicians made therapeutic changes in response to 38% of alerts presented to them. By applying simple techniques to customize rules, commercial rule bases can be used to rapidly deploy a safety net to screen drug orders for excessive dosages, while preserving the rule architecture for later implementations of more finely tuned clinical decision support. PMID:15802481
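
    The screening logic itself is simple; the paper's contribution is the customization layer that suppresses nuisance alerts. Below is a sketch of both, with hypothetical drug names and limits (not Multum content).

    DEFAULT_MAX_DAILY_MG = {"drugA": 4000, "drugB": 100}
    SITE_OVERRIDES = {"drugB": 150}      # limit raised after pharmacist review

    def check_order(drug, dose_mg, doses_per_day):
        """Screen a medication order against site-customized daily dose limits."""
        limit = SITE_OVERRIDES.get(drug, DEFAULT_MAX_DAILY_MG.get(drug))
        if limit is None:
            return "no dosage rule on file"
        daily = dose_mg * doses_per_day
        return f"ALERT: {daily} mg/day exceeds {limit}" if daily > limit else "ok"

    print(check_order("drugB", 50, 3))   # 150 mg/day: ok under the site override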

  16. A fuzzy-logic-based model to predict biogas and methane production rates in a pilot-scale mesophilic UASB reactor treating molasses wastewater.

    PubMed

    Turkdogan-Aydinol, F Ilter; Yetilmezsoy, Kaan

    2010-10-15

    A MIMO (multiple inputs and multiple outputs) fuzzy-logic-based model was developed to predict biogas and methane production rates in a pilot-scale 90-L mesophilic up-flow anaerobic sludge blanket (UASB) reactor treating molasses wastewater. Five input variables, namely volumetric organic loading rate (OLR), volumetric total chemical oxygen demand (TCOD) removal rate (R(V)), influent alkalinity, influent pH and effluent pH, were fuzzified by the use of an artificial intelligence-based approach. Trapezoidal membership functions with eight levels were constructed for the fuzzy subsets, and a Mamdani-type fuzzy inference system was used to implement a total of 134 rules in the IF-THEN format. The product (prod) and the centre of gravity (COG, centroid) methods were employed as the inference operator and the defuzzification method, respectively. Fuzzy-logic predicted results were compared with the outputs of two exponential non-linear regression models derived in this study. The UASB reactor showed a remarkable performance on the treatment of molasses wastewater, with an average TCOD removal efficiency of 93 (+/-3)% and an average volumetric TCOD removal rate of 6.87 (+/-3.93) kg TCOD(removed)/m(3)-day. Findings of this study clearly indicated that, compared to the non-linear regression models, the proposed MIMO fuzzy-logic-based model produced smaller deviations and exhibited a superior predictive performance in forecasting both biogas and methane production rates, with satisfactory determination coefficients over 0.98. Copyright © 2010 Elsevier B.V. All rights reserved.
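
    The ingredients named above (trapezoidal membership functions, a Mamdani-type rule base, product inference and centroid defuzzification) compose as in the following single-input, single-rule sketch; the membership parameters and the rule are invented, not the paper's 134-rule base.

    import numpy as np

    def trap(x, a, b, c, d):
        """Trapezoidal membership: rises over a..b, flat b..c, falls c..d."""
        return np.clip(np.minimum((x - a) / (b - a), (d - x) / (d - c)), 0, 1)

    olr = 6.0                              # crisp input, e.g. OLR in kg/m3-day
    fire = trap(olr, 4, 5, 7, 8)           # degree to which "OLR is high" fires

    y = np.linspace(0, 10, 501)            # output universe: biogas rate
    high_output = trap(y, 5, 6, 8, 9)      # consequent fuzzy set
    implied = fire * high_output           # product (prod) implication

    centroid = np.sum(y * implied) / np.sum(implied)   # COG defuzzification
    print(round(float(centroid), 2))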

  17. Expert Witness: A system for developing expert medical testimony

    NASA Technical Reports Server (NTRS)

    Lewandowski, Raymond; Perkins, David; Leasure, David

    1994-01-01

    Expert Witness is an expert system designed to assist attorneys and medical experts in determining the merit of medical malpractice claims in the area of obstetrics. It substitutes the time of the medical expert with the time of a paralegal assistant guided by the expert system during the initial investigation of the medical records and patient interviews. The product of the system is a narrative transcript containing important data, immediate conclusions from the data, and overall conclusions of the case that the attorney and medical expert use to make decisions about whether and how to proceed with the case. The transcript may also contain directives for gathering additional information needed for the case. The system is a modified heuristic classifier and is implemented using over 600 CLIPS rules together with a C-based user interface. The data abstraction and solution refinement are implemented directly using forward-chaining production and matching. The use of CLIPS and C is essential to delivering a system that runs on a generic PC platform. The direct implementation in CLIPS, together with locality of inference, ensures that the system will scale gracefully. Two years of use have revealed no errors in the reasoning.

  18. Evolving rule-based systems in two medical domains using genetic programming.

    PubMed

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan; Axer, Hubertus; Bjerregaard, Beth; von Keyserlingk, Diedrich Graf

    2004-11-01

    To demonstrate and compare the application of different genetic programming (GP) based intelligent methodologies for the construction of rule-based systems in two medical domains: the diagnosis of aphasia subtypes and the classification of pap-smear examinations. Past data represented (a) successful diagnoses of aphasia subtypes, obtained from collaborating medical experts through a free interview per patient, and (b) smears (images of cells) correctly classified by cyto-technologists after staining with the Papanicolaou method. Initially, a hybrid approach is proposed which combines standard genetic programming and heuristic hierarchical crisp rule-base construction. Then, genetic programming for the production of crisp rule-based systems is attempted. Finally, another hybrid intelligent model is composed of a grammar-driven genetic programming system for the generation of fuzzy rule-based systems. Results denote the effectiveness of the proposed systems, which are also compared, in terms of efficiency, accuracy and comprehensibility, to an inductive machine learning approach and to a standard genetic programming symbolic expression approach. The proposed GP-based intelligent methodologies are able to produce accurate and comprehensible results for medical experts, performing competitively with other intelligent approaches. The aim of the authors was the production of accurate but also sensible decision rules that could potentially help medical doctors to extract conclusions, even at the expense of a higher classification score.

  19. Radiation of Pollination Systems in the Iridaceae of sub-Saharan Africa

    PubMed Central

    GOLDBLATT, PETER; MANNING, JOHN C.

    2006-01-01

    Background: Seventeen distinct pollination systems are known for genera of sub-Saharan African Iridaceae and recurrent shifts in pollination system have evolved in those with ten or more species. Pollination by long-tongued anthophorine bees foraging for nectar and coincidentally acquiring pollen on some part of their bodies is the inferred ancestral pollination strategy for most genera of the large subfamilies Iridoideae and Crocoideae and may be ancestral for the latter. Derived strategies include pollination by long-proboscid flies, large butterflies, night-flying hovering and settling moths, hopliine beetles and sunbirds. Bee pollination is diverse, with active pollen collection by female bees occurring in several genera, vibratile systems in a few and non-volatile oil as a reward in one species. Long-proboscid fly pollination, which is apparently restricted to southern Africa, includes four separate syndromes using different sets of flies and plant species in different parts of the subcontinent. Small numbers of species use bibionid flies, short-proboscid flies or wasps for their pollination; only about 2 % of species use multiple pollinators and can be described as generalists. Scope: Using pollination observations for 375 species and based on repeated patterns of floral attractants and rewards, we infer pollination mechanisms for an additional 610 species. Matching pollination system to phylogeny or what is known about species relationships based on shared derived features, we infer repeated shifts in pollination system in some genera, as frequently as one shift for every five or six species of southern African Babiana or Gladiolus. Specialized systems using pollinators of one pollination group, or even a single pollinator species are the rule in the family. Shifts in pollination system are more frequent in genera of Crocoideae that have bilaterally symmetric flowers and a perianth tube, features that promote adaptive radiation by facilitating precise shifts in pollen placement, in conjunction with changes in flower colour, scent and tube length. Conclusions: Diversity of pollination systems explains in part the huge species diversity of Iridaceae in sub-Saharan Africa, and permits species packing locally. Pollination shifts are, however, seen as playing a secondary role in speciation by promoting reproductive isolation in peripheral, ecologically distinct populations in areas of diverse topography, climate and soils. Pollination of Iridaceae in Eurasia and the New World, where the family is also well represented, is poorly studied but appears less diverse, although pollination by both pollen- and oil-collecting bees is frequent and bird pollination rare. PMID:16377653

  20. Radiation of pollination systems in the Iridaceae of sub-Saharan Africa.

    PubMed

    Goldblatt, Peter; Manning, John C

    2006-03-01

    Seventeen distinct pollination systems are known for genera of sub-Saharan African Iridaceae and recurrent shifts in pollination system have evolved in those with ten or more species. Pollination by long-tongued anthophorine bees foraging for nectar and coincidentally acquiring pollen on some part of their bodies is the inferred ancestral pollination strategy for most genera of the large subfamilies Iridoideae and Crocoideae and may be ancestral for the latter. Derived strategies include pollination by long-proboscid flies, large butterflies, night-flying hovering and settling moths, hopliine beetles and sunbirds. Bee pollination is diverse, with active pollen collection by female bees occurring in several genera, vibratile systems in a few and non-volatile oil as a reward in one species. Long-proboscid fly pollination, which is apparently restricted to southern Africa, includes four separate syndromes using different sets of flies and plant species in different parts of the subcontinent. Small numbers of species use bibionid flies, short-proboscid flies or wasps for their pollination; only about 2 % of species use multiple pollinators and can be described as generalists. Using pollination observations for 375 species and based on repeated patterns of floral attractants and rewards, we infer pollination mechanisms for an additional 610 species. Matching pollination system to phylogeny or what is known about species relationships based on shared derived features, we infer repeated shifts in pollination system in some genera, as frequently as one shift for every five or six species of southern African Babiana or Gladiolus. Specialized systems using pollinators of one pollination group, or even a single pollinator species are the rule in the family. Shifts in pollination system are more frequent in genera of Crocoideae that have bilaterally symmetric flowers and a perianth tube, features that promote adaptive radiation by facilitating precise shifts in pollen placement, in conjunction with changes in flower colour, scent and tube length. Diversity of pollination systems explains in part the huge species diversity of Iridaceae in sub-Saharan Africa, and permits species packing locally. Pollination shifts are, however, seen as playing a secondary role in speciation by promoting reproductive isolation in peripheral, ecologically distinct populations in areas of diverse topography, climate and soils. Pollination of Iridaceae in Eurasia and the New World, where the family is also well represented, is poorly studied but appears less diverse, although pollination by both pollen- and oil-collecting bees is frequent and bird pollination rare.

  1. Rule-based mechanisms of learning for intelligent adaptive flight control

    NASA Technical Reports Server (NTRS)

    Handelman, David A.; Stengel, Robert F.

    1990-01-01

    How certain aspects of human learning can be used to characterize learning in intelligent adaptive control systems is investigated. Reflexive and declarative memory and learning are described. It is shown that model-based systems-theoretic adaptive control methods exhibit attributes of reflexive learning, whereas the problem-solving capabilities of knowledge-based systems of artificial intelligence are naturally suited for implementing declarative learning. Issues related to learning in knowledge-based control systems are addressed, with particular attention given to rule-based systems. A mechanism for real-time rule-based knowledge acquisition is suggested, and utilization of this mechanism within the context of failure diagnosis for fault-tolerant flight control is demonstrated.

  2. MedXN: an open source medication extraction and normalization tool for clinical text

    PubMed Central

    Sohn, Sunghwan; Clark, Cheryl; Halgrim, Scott R; Murphy, Sean P; Chute, Christopher G; Liu, Hongfang

    2014-01-01

    Objective: We developed the Medication Extraction and Normalization (MedXN) system to extract comprehensive medication information and normalize it to the most appropriate RxNorm concept unique identifier (RxCUI) as specifically as possible. Methods: Medication descriptions in clinical notes were decomposed into medication name and attributes, which were separately extracted using RxNorm dictionary lookup and regular expression. Then, each medication name and its attributes were combined together according to RxNorm convention to find the most appropriate RxNorm representation. To do this, we employed serialized hierarchical steps implemented in Apache's Unstructured Information Management Architecture. We also performed synonym expansion, removed false medications, and employed inference rules to improve the medication extraction and normalization performance. Results: An evaluation on test data of 397 medication mentions showed F-measures of 0.975 for medication name and over 0.90 for most attributes. The RxCUI assignment produced F-measures of 0.932 for medication name and 0.864 for full medication information. Most false negative RxCUI assignments in full medication information are due to human assumption of missing attributes and medication names in the gold standard. Conclusions: The MedXN system (http://sourceforge.net/projects/ohnlp/files/MedXN/) was able to extract comprehensive medication information with high accuracy and demonstrated good normalization capability to RxCUI as long as explicit evidence existed. More sophisticated inference rules might result in further improvements to specific RxCUI assignments for incomplete medication descriptions. PMID:24637954
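
    The decompose-then-normalize strategy can be illustrated with a dictionary lookup plus a couple of regular expressions; MedXN does this at scale with the full RxNorm vocabulary inside Apache UIMA, so the tiny dictionary and patterns below are purely illustrative.

    import re

    MED_NAMES = {"lisinopril", "metformin"}   # stand-in for an RxNorm lookup

    def decompose(mention):
        """Split a medication mention into name, strength and frequency."""
        name = next((w for w in mention.lower().split() if w in MED_NAMES), None)
        strength = re.search(r"(\d+(?:\.\d+)?)\s*mg", mention, re.I)
        freq = re.search(r"\b(daily|bid|tid)\b", mention, re.I)
        return {"name": name,
                "strength": strength.group(1) + " mg" if strength else None,
                "frequency": freq.group(1).lower() if freq else None}

    print(decompose("Lisinopril 10 mg tablet daily"))
    # {'name': 'lisinopril', 'strength': '10 mg', 'frequency': 'daily'}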

  3. Bayesian probability estimates are not necessary to make choices satisfying Bayes' rule in elementary situations.

    PubMed

    Domurat, Artur; Kowalczuk, Olga; Idzikowska, Katarzyna; Borzymowska, Zuzanna; Nowak-Przygodzka, Marta

    2015-01-01

    This paper has two aims. First, we investigate how often people make choices conforming to Bayes' rule when natural sampling is applied. Second, we show that using Bayes' rule is not necessary to make choices satisfying Bayes' rule. Simpler methods, even fallacious heuristics, might prescribe correct choices reasonably often under specific circumstances. We considered elementary situations with binary sets of hypotheses and data. We adopted an ecological approach and prepared two-stage computer tasks resembling natural sampling. Probabilistic relations were inferred from a set of pictures, followed by a choice which was made to maximize the chance of a preferred outcome. Use of Bayes' rule was deduced indirectly from choices. Study 1 used a stratified sample of N = 60 participants equally distributed with regard to gender and type of education (humanities vs. pure sciences). Choices satisfying Bayes' rule were dominant. To investigate ways of making choices more directly, we replicated Study 1, adding a task with a verbal report. In Study 2 (N = 76) choices conforming to Bayes' rule dominated again. However, the verbal reports revealed use of a new, non-inverse rule, which always renders correct choices, but is easier than Bayes' rule to apply. It does not require inversion of conditions [transforming P(H) and P(D|H) into P(H|D)] when computing chances. Study 3 examined the efficiency of three fallacious heuristics (pre-Bayesian, representativeness, and evidence-only) in producing choices concordant with Bayes' rule. Computer-simulated scenarios revealed that the heuristics produced correct choices reasonably often under specific base rates and likelihood ratios. Summing up we conclude that natural sampling results in most choices conforming to Bayes' rule. However, people tend to replace Bayes' rule with simpler methods, and even use of fallacious heuristics may be satisfactorily efficient.

  4. Bayesian probability estimates are not necessary to make choices satisfying Bayes’ rule in elementary situations

    PubMed Central

    Domurat, Artur; Kowalczuk, Olga; Idzikowska, Katarzyna; Borzymowska, Zuzanna; Nowak-Przygodzka, Marta

    2015-01-01

    This paper has two aims. First, we investigate how often people make choices conforming to Bayes’ rule when natural sampling is applied. Second, we show that using Bayes’ rule is not necessary to make choices satisfying Bayes’ rule. Simpler methods, even fallacious heuristics, might prescribe correct choices reasonably often under specific circumstances. We considered elementary situations with binary sets of hypotheses and data. We adopted an ecological approach and prepared two-stage computer tasks resembling natural sampling. Probabilistic relations were inferred from a set of pictures, followed by a choice which was made to maximize the chance of a preferred outcome. Use of Bayes’ rule was deduced indirectly from choices. Study 1 used a stratified sample of N = 60 participants equally distributed with regard to gender and type of education (humanities vs. pure sciences). Choices satisfying Bayes’ rule were dominant. To investigate ways of making choices more directly, we replicated Study 1, adding a task with a verbal report. In Study 2 (N = 76) choices conforming to Bayes’ rule dominated again. However, the verbal reports revealed use of a new, non-inverse rule, which always renders correct choices, but is easier than Bayes’ rule to apply. It does not require inversion of conditions [transforming P(H) and P(D|H) into P(H|D)] when computing chances. Study 3 examined the efficiency of three fallacious heuristics (pre-Bayesian, representativeness, and evidence-only) in producing choices concordant with Bayes’ rule. Computer-simulated scenarios revealed that the heuristics produced correct choices reasonably often under specific base rates and likelihood ratios. Summing up we conclude that natural sampling results in most choices conforming to Bayes’ rule. However, people tend to replace Bayes’ rule with simpler methods, and even use of fallacious heuristics may be satisfactorily efficient. PMID:26347676
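
    The contrast between Bayes' rule and the reported non-inverse rule can be made concrete with counts from a natural-sampling display. With the invented counts below, Bayes' rule inverts P(D|H) into P(H|D), while the non-inverse rule simply compares the two joint counts among the D cases; both prescribe the same choice.

    n = 100            # total cases shown
    n_H = 40           # cases where hypothesis H holds
    n_H_and_D = 24     # H holds and datum D observed, so P(D|H) = 24/40
    n_notH_and_D = 16  # H fails but D is observed

    # Bayes' rule: P(H|D) = P(D|H) P(H) / P(D)
    p_H_given_D = ((n_H_and_D / n_H) * (n_H / n)
                   / ((n_H_and_D + n_notH_and_D) / n))

    # non-inverse rule: compare joint frequencies directly, no inversion needed
    choose_H = n_H_and_D > n_notH_and_D

    print(round(p_H_given_D, 2), choose_H)   # 0.6 True: the same choice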

  5. Automatic de-identification of French clinical records: comparison of rule-based and machine-learning approaches.

    PubMed

    Grouin, Cyril; Zweigenbaum, Pierre

    2013-01-01

    In this paper, we present a comparison of two approaches to automatically de-identify medical records written in French: a rule-based system and a machine-learning based system using a conditional random fields (CRF) formalism. Both systems have been designed to process nine identifiers in a corpus of medical records in cardiology. We performed two evaluations: first, on 62 documents in cardiology, and on 10 documents in foetopathology - produced by optical character recognition (OCR) - to evaluate the robustness of our systems. We achieved a 0.843 (rule-based) and 0.883 (machine-learning) exact match overall F-measure in cardiology. While the rule-based system allowed us to achieve good results on nominative (first and last names) and numerical data (dates, phone numbers, and zip codes), the machine-learning approach performed best on more complex categories (postal addresses, hospital names, medical devices, and towns). On the foetopathology corpus, although our systems have not been designed for this corpus and despite OCR character recognition errors, we obtained promising results: a 0.681 (rule-based) and 0.638 (machine-learning) exact-match overall F-measure. This demonstrates that existing tools can be applied to process new documents of lower quality.

  6. A Bayesian Framework for Analysis of Pseudo-Spatial Models of Comparable Engineered Systems with Application to Spacecraft Anomaly Prediction Based on Precedent Data

    NASA Astrophysics Data System (ADS)

    Ndu, Obibobi Kamtochukwu

    To ensure that estimates of risk and reliability inform design and resource allocation decisions in the development of complex engineering systems, early engagement in the design life cycle is necessary. An unfortunate constraint on the accuracy of such estimates at this stage of concept development is the limited amount of high fidelity design and failure information available on the actual system under development. Applying the human ability to learn from experience and augment our state of knowledge to evolve better solutions mitigates this limitation. However, the challenge lies in formalizing a methodology that takes this highly abstract, but fundamentally human cognitive, ability and extending it to the field of risk analysis while maintaining the tenets of generalization, Bayesian inference, and probabilistic risk analysis. We introduce an integrated framework for inferring the reliability, or other probabilistic measures of interest, of a new system or a conceptual variant of an existing system. Abstractly, our framework is based on learning from the performance of precedent designs and then applying the acquired knowledge, appropriately adjusted based on degree of relevance, to the inference process. This dissertation presents a method for inferring properties of the conceptual variant using a pseudo-spatial model that describes the spatial configuration of the family of systems to which the concept belongs. Through non-metric multidimensional scaling, we formulate the pseudo-spatial model based on rank-ordered subjective expert perception of design similarity between systems that elucidate the psychological space of the family. By a novel extension of Kriging methods for analysis of geospatial data to our "pseudo-space of comparable engineered systems", we develop a Bayesian inference model that allows prediction of the probabilistic measure of interest.

  7. An approach for environmental risk assessment of engineered nanomaterials using Analytical Hierarchy Process (AHP) and fuzzy inference rules.

    PubMed

    Topuz, Emel; van Gestel, Cornelis A M

    2016-01-01

    The usage of Engineered Nanoparticles (ENPs) in consumer products is relatively new, and there is a need to conduct environmental risk assessment (ERA) to evaluate their impacts on the environment. However, alternative approaches are required for the ERA of ENPs because of the huge gap in data and knowledge compared to conventional pollutants, and because of their unique properties that make it difficult to apply existing approaches. This study proposes an ERA approach for ENPs that integrates the Analytical Hierarchy Process (AHP) and fuzzy inference models, which provide a systematic evaluation of risk factors and reduce uncertainty about the data and information, respectively. Risk is assumed to be the combination of occurrence likelihood, exposure potential and toxic effects in the environment. A hierarchy was established to evaluate the sub-factors of these components. The evaluation was made with fuzzy numbers to reduce uncertainty and to incorporate expert judgements. The overall score of each component was combined with fuzzy inference rules derived from expert judgements. The proposed approach reports the risk class and its membership degree, such as Minor (0.7); the results are therefore precise and helpful for determining risk management strategies. Moreover, priority weights, calculated by comparing the risk factors based on their importance for the risk, enable users to understand which factors drive the risk. The proposed approach was applied to Ag (two nanoparticles with different coatings) and TiO2 nanoparticles in different case studies. The results verified the proposed benefits of the approach. Copyright © 2016 Elsevier Ltd. All rights reserved.
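
    The AHP step of such an approach derives priority weights from a pairwise comparison matrix, conventionally via its principal eigenvector. A minimal sketch with an invented 3x3 judgment matrix over the three risk components named above (occurrence, exposure, toxicity):

    import numpy as np

    A = np.array([[1.0, 3.0, 0.5],       # reciprocal pairwise judgments
                  [1 / 3, 1.0, 0.25],
                  [2.0, 4.0, 1.0]])

    eigvals, eigvecs = np.linalg.eig(A)
    v = np.real(eigvecs[:, np.argmax(np.real(eigvals))])
    weights = v / v.sum()                # normalized priority weights
    print(weights.round(3))              # e.g. toxicity carries the most weight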

  8. Statistical inference of static analysis rules

    NASA Technical Reports Server (NTRS)

    Engler, Dawson Richards (Inventor)

    2009-01-01

    Various apparatus and methods are disclosed for identifying errors in program code. Respective numbers of observances of at least one correctness rule by different code instances that relate to the at least one correctness rule are counted in the program code. Each code instance has an associated counted number of observances of the correctness rule by the code instance. Also counted are respective numbers of violations of the correctness rule by different code instances that relate to the correctness rule. Each code instance has an associated counted number of violations of the correctness rule by the code instance. A respective likelihood of validity is determined for each code instance as a function of the counted number of observances and counted number of violations. The likelihood of validity indicates a relative likelihood that a related code instance is required to observe the correctness rule. The violations may be output in order of the likelihood of validity of a violated correctness rule.
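
    One plausible scoring scheme in this spirit (not necessarily the patented one) ranks each instance by a one-sided z-statistic asking how surprising the observance count would be if the rule were not real; the counts and base rate p0 below are illustrative:

    ```python
    import math

    # (code_instance, observances, violations) -- illustrative counts.
    counts = [("lock_before_unlock", 97, 3),
              ("check_null_after_alloc", 12, 9),
              ("free_then_null", 40, 1)]

    def validity_z(obs, viol, p0=0.1):
        """One-sided z-statistic: how surprising are `viol` violations if the
        rule were not real and violations occurred at base rate p0?"""
        n = obs + viol
        return (obs - n * (1 - p0)) / math.sqrt(n * p0 * (1 - p0))

    ranked = sorted(counts, key=lambda t: validity_z(t[1], t[2]), reverse=True)
    for name, obs, viol in ranked:
        print(f"{name}: z={validity_z(obs, viol):+.2f} "
              f"({viol} violations / {obs} observances)")
    ```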

  9. Learning the Rules of the Game

    ERIC Educational Resources Information Center

    Smith, Donald A.

    2018-01-01

    Games have often been used in the classroom to teach physics ideas and concepts, but there has been less published on games that can be used to teach scientific thinking. D. Maloney and M. Masters describe an activity in which students attempt to infer rules to a game from a history of moves, but the students do not actually play the game. Giving…

  10. Identifying desertification risk areas using fuzzy membership and geospatial technique - A case study, Kota District, Rajasthan

    NASA Astrophysics Data System (ADS)

    Dasgupta, Arunima; Sastry, K. L. N.; Dhinwa, P. S.; Rathore, V. S.; Nathawat, M. S.

    2013-08-01

    Desertification risk assessment is important in order to take proper measures for its prevention. The present research intends to identify the areas under risk of desertification along with their severity in terms of degradation in natural parameters. An integrated model with fuzzy membership analysis, a fuzzy rule-based inference system and geospatial techniques was adopted, including five specific natural parameters, namely slope, soil pH, soil depth, soil texture and NDVI. Individual parameters were classified according to their deviation from the mean. The membership of each individual value in a certain class was derived using the normal probability density function of that class. Thus, if a single class of a single parameter has mean μ and standard deviation σ, the values falling beyond μ + 2σ and μ - 2σ do not represent that class, but a transitional zone between two subsequent classes. These are the most important areas in terms of degradation, as they have the lowest probability of belonging to a certain class, and hence the highest probability of being extended or narrowed down into the next or previous class, respectively. Eventually, these are the values which can be easily altered under exogenic influences, and hence they are identified as risk areas. The overall desertification risk is derived by incorporating the different risk severities of each parameter using a fuzzy rule-based inference system in a GIS environment. Multicriteria-based geostatistics are applied to locate the areas under different severities of desertification risk. The study revealed that in Kota, various anthropogenic pressures are accelerating land deterioration, coupled with natural erosive forces. The four major sources of desertification in Kota are Gully and Ravine erosion, inappropriate mining practices, growing urbanization and random deforestation.
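
    A minimal sketch of the membership computation described above, using a normalized Gaussian membership so that the class mean has membership 1; class statistics and pixel values are illustrative:

    ```python
    import numpy as np

    def class_membership(x, mu, sigma):
        """Membership of value x in a class, from the class's normal pdf,
        normalized so the class mean has membership 1."""
        return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

    # Illustrative NDVI class statistics and observed pixel values.
    mu, sigma = 0.45, 0.10
    pixels = np.array([0.44, 0.52, 0.70, 0.18])

    m = class_membership(pixels, mu, sigma)
    transitional = np.abs(pixels - mu) > 2 * sigma   # beyond mu +/- 2*sigma
    for x, mi, risk in zip(pixels, m, transitional):
        print(f"value {x:.2f}: membership {mi:.3f}, risk zone: {bool(risk)}")
    ```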

  11. Using an improved association rules mining optimization algorithm in web-based mobile-learning system

    NASA Astrophysics Data System (ADS)

    Huang, Yin; Chen, Jianhua; Xiong, Shaojun

    2009-07-01

    Mobile-Learning (M-learning) gives many learners the advantages of both traditional learning and E-learning. Currently, Web-based Mobile-Learning Systems have created many new ways and defined new relationships between educators and learners. Association rule mining is one of the most important fields in data mining and knowledge discovery in databases. Rule explosion is a serious problem which causes great concern, as conventional mining algorithms often produce too many rules for decision makers to digest. Since a Web-based Mobile-Learning System collects vast amounts of student profile data, data mining and knowledge discovery techniques can be applied to find interesting relationships between attributes of learners, assessments, the solution strategies adopted by learners and so on. Therefore, this paper focuses on a new data-mining algorithm, combining the advantages of the genetic algorithm and the simulated annealing algorithm, called ARGSA (Association Rules based on an improved Genetic Simulated Annealing Algorithm), to mine association rules. The paper first takes advantage of a Parallel Genetic Algorithm and a Simulated Annealing Algorithm designed specifically for discovering association rules. Moreover, analysis and experiments show that the proposed method is superior to the Apriori algorithm in this Mobile-Learning system.
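
    A toy sketch of the simulated-annealing half of such a hybrid (the paper's ARGSA also includes a parallel genetic component not shown here); the transactions and the support-times-confidence fitness are illustrative choices:

    ```python
    import math
    import random

    random.seed(0)

    # Illustrative transactions (sets of learner attributes / outcomes).
    transactions = [{"video", "quiz", "pass"}, {"video", "pass"},
                    {"forum", "quiz", "fail"}, {"video", "quiz", "pass"},
                    {"forum", "fail"}, {"video", "forum", "quiz", "pass"}]
    items = sorted({i for t in transactions for i in t})

    def fitness(antecedent, consequent):
        """Support x confidence of the rule antecedent -> consequent."""
        if not antecedent or consequent in antecedent:
            return 0.0
        n_ant = sum(antecedent <= t for t in transactions)
        n_both = sum(antecedent <= t and consequent in t for t in transactions)
        if n_ant == 0:
            return 0.0
        return (n_both / len(transactions)) * (n_both / n_ant)

    ant, cons = {random.choice(items)}, random.choice(items)
    best, best_f, temp = (set(ant), cons), fitness(ant, cons), 1.0
    for step in range(500):
        new_ant, new_cons = set(ant), cons
        if random.random() < 0.5:
            new_ant ^= {random.choice(items)}          # toggle one antecedent item
        else:
            new_cons = random.choice(items)            # mutate the consequent
        df = fitness(new_ant, new_cons) - fitness(ant, cons)
        if df > 0 or random.random() < math.exp(df / max(temp, 1e-9)):
            ant, cons = new_ant, new_cons              # accept the move
            if fitness(ant, cons) > best_f:
                best, best_f = (set(ant), cons), fitness(ant, cons)
        temp *= 0.99                                   # cooling schedule

    print(f"best rule: {sorted(best[0])} -> {best[1]} (fitness {best_f:.3f})")
    ```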

  12. A fast combination method in DSmT and its application to recommender system

    PubMed Central

    Liu, Yihai

    2018-01-01

    In many applications involving epistemic uncertainties usually modeled by belief functions, it is often necessary to approximate general (non-Bayesian) basic belief assignments (BBAs) by subjective probabilities (called Bayesian BBAs). This necessity occurs if one needs to embed the fusion result in a system based on the probabilistic framework and Bayesian inference (e.g. tracking systems), or if one needs to make a decision in decision-making problems. In this paper, we present a new fast combination method, called modified rigid coarsening (MRC), to obtain the final Bayesian BBAs based on hierarchical decomposition (coarsening) of the frame of discernment. In this method, focal elements with probabilities are coarsened efficiently to reduce the computational complexity of combination, by using a disagreement vector and a simple dichotomous approach. To demonstrate the practicality of our approach, it is applied to combining users' soft preferences in recommender systems (RSs). Additionally, to make a comprehensive performance comparison, the proportional conflict redistribution rule #6 (PCR6) is regarded as a baseline in a range of experiments. According to the results of the experiments, MRC is more accurate in its recommendations than the original Rigid Coarsening (RC) method and comparable in computational time. PMID:29351297
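
    The MRC coarsening itself is more involved; as a point of reference, the classical pignistic transformation — one standard way to approximate a general BBA by a Bayesian one — can be sketched as follows (masses are illustrative):

    ```python
    # A basic belief assignment (BBA) over the frame {A, B, C}; masses may
    # sit on any non-empty subset (illustrative values summing to 1).
    bba = {frozenset("A"): 0.35,
           frozenset("AB"): 0.25,
           frozenset("ABC"): 0.30,
           frozenset("C"): 0.10}

    def pignistic(bba):
        """BetP(x) = sum over focal sets S containing x of m(S) / |S|."""
        singletons = set().union(*bba)
        return {x: sum(m / len(S) for S, m in bba.items() if x in S)
                for x in singletons}

    print(pignistic(bba))   # a Bayesian BBA usable for probabilistic decisions
    ```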

  13. Children's schemes for anticipating the validity of nets for solids

    NASA Astrophysics Data System (ADS)

    Wright, Vince; Smith, Ken

    2017-09-01

    There is growing acknowledgement of the importance of spatial abilities to student achievement across a broad range of domains and disciplines. Nets are one way to connect three-dimensional shapes and their two-dimensional representations and are a common focus of geometry curricula. Thirty-four students in year 6 (upper primary school) were interviewed on two occasions about their anticipation of whether or not given nets for the cube and the square-based pyramid would fold to form the target solid. Vergnaud's (Journal of Mathematical Behavior, 17(2), 167-181, 1998; Human Development, 52, 83-94, 2009) four characteristics of schemes were used as a theoretical lens to analyse the data. Successful schemes depended on the interaction of operational invariants, such as strategic choice of the base, rules for action, particularly rotation of shapes, and anticipations of composites of polygons in the net forming arrangements of faces in the solid. Inferences were rare. These data suggest that students need teacher support to make inferences in order to create transferable schemes.

  14. Anticipatory systems using a probabilistic-possibilistic formalism

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsoukalas, L.H.

    1989-01-01

    A methodology for the realization of the Anticipatory Paradigm in the diagnosis and control of complex systems, such as power plants, is developed. The objective is to synthesize engineering systems as analogs of certain biological systems which are capable of modifying their present states on the basis of anticipated future states. These future states are construed to be the output of predictive, numerical, stochastic or symbolic models. The mathematical basis of the implementation is developed from a formulation coupling probabilistic (random) and possibilistic (fuzzy) data in the form of an Information Granule. Random data are generated from observations and sensor inputs from the environment. Fuzzy data consist of epistemic information, such as criteria or constraints qualifying the environmental inputs. The approach generates mathematical performance measures upon which diagnostic inferences and control functions are based. Anticipated performance is generated using a fuzzified Bayes formula. Triplex arithmetic is used in the numerical estimation of the performance measures. Representation of the system is based upon a goal tree within the rule-based paradigm from the field of Applied Artificial Intelligence. The ensuing construction incorporates a coupling of symbolic and procedural programming methods. As a demonstration of the possibility of constructing such systems, a model-based system of a nuclear reactor is constructed. A numerical model of the reactor as a damped simple harmonic oscillator is used. The neutronic behavior is described by a point kinetics model with temperature feedback. The resulting system is programmed in OPS5 for the symbolic component and in FORTRAN for the procedural part.

  15. Integrating policy-based management and SLA performance monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tzong-Jye; Lin, Chin-Yi; Chang, Shu-Hsin; Yen, Meng-Tzu

    2001-10-01

    A policy-based management system provides configuration capability so that system administrators can focus on the requirements of customers. The service level agreement performance monitoring mechanism helps system administrators verify the correctness of policies. However, it is difficult for a device to process policies directly because policies are a management-level concept. This paper proposes a mechanism to decompose a policy into rules that can be efficiently processed by a device. Thus, the device may process the rules and collect performance statistics efficiently, and the policy-based management system may collect this performance statistics information and report service-level agreement performance monitoring information to the system administrator. The proposed policy-based management system achieves both the policy configuration and the service-level agreement performance monitoring requirements. A policy consists of a condition part and an action part. The condition part is a Boolean expression of a source host IP group, a destination host IP group, etc. The action part contains the parameters of services. We say that an address group is compact if it consists only of a range of IP addresses that can be denoted by a pair of an IP address and a corresponding IP mask. If the condition part of a policy consists only of compact address groups, we say that the policy is a rule. Since a device can efficiently process a compact address and a system administrator prefers to define a range of IP addresses, the policy-based management system has to translate policies into rules and bridge the gaps between policies and rules. The proposed policy-based management system builds the relationships between VPNs and policies, and between policies and rules. Since the system administrator wants to monitor the system performance information of VPNs and policies, the proposed policy-based management system downloads the relationships among VPNs, policies and rules to the SNMP agents. The SNMP agents build the management information base (MIB) of all VPNs, policies and rules according to the relationships obtained from the management server. Thus, the proposed policy-based management system may obtain all performance monitoring information of VPNs and policies from the agents. The proposed policy-based manager achieves two goals: a) provide a management environment in which the system administrator configures the network considering only policy requirement issues, and b) let the device only process packets and collect the required performance information. These two features make the proposed management system satisfy both the user and device requirements.
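
    The policy-to-rule translation described above is essentially range-to-CIDR decomposition, which Python's standard ipaddress module can illustrate directly (the range endpoints are illustrative):

    ```python
    import ipaddress

    # A policy condition given as an arbitrary IP range (not CIDR-aligned).
    first = ipaddress.IPv4Address("10.0.0.5")
    last  = ipaddress.IPv4Address("10.0.1.130")

    # Decompose the policy's range into "compact" address groups, each
    # expressible as one (address, mask) pair -- i.e., one device rule.
    for net in ipaddress.summarize_address_range(first, last):
        print(f"rule condition: {net.network_address}/{net.netmask}")
    ```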

  16. Activity inference for Ambient Intelligence through handling artifacts in a healthcare environment.

    PubMed

    Martínez-Pérez, Francisco E; González-Fraga, Jose Ángel; Cuevas-Tello, Juan C; Rodríguez, Marcela D

    2012-01-01

    Human activity inference is not a simple process because of the distinct ways in which activities can be performed. Our proposal presents the SCAN framework for activity inference. SCAN is divided into three modules: (1) artifact recognition, (2) activity inference, and (3) activity representation, integrating three important elements of Ambient Intelligence (AmI): artifact-behavior modeling, event interpretation and context extraction. The framework extends the roaming beat (RB) concept by obtaining the representation using three kinds of technologies for activity inference. The RB is based on both analysis and recognition of artifact behavior for activity inference. A practical case is shown in a nursing home where a system achieving 91.35% effectiveness was implemented in situ. Three examples are shown using RB representation for activity representation. The framework description, RB description and CALog system address distinct problems, such as the feasibility of implementing AmI systems, and show the feasibility of meeting the challenges related to activity recognition based on artifact recognition. We discuss how the use of RBs might positively impact the problems faced by designers and developers in recovering information more easily, so that they can develop tools focused on the user.

  17. Activity Inference for Ambient Intelligence Through Handling Artifacts in a Healthcare Environment

    PubMed Central

    Martínez-Pérez, Francisco E.; González-Fraga, Jose Ángel; Cuevas-Tello, Juan C.; Rodríguez, Marcela D.

    2012-01-01

    Human activity inference is not a simple process because of the distinct ways in which activities can be performed. Our proposal presents the SCAN framework for activity inference. SCAN is divided into three modules: (1) artifact recognition, (2) activity inference, and (3) activity representation, integrating three important elements of Ambient Intelligence (AmI): artifact-behavior modeling, event interpretation and context extraction. The framework extends the roaming beat (RB) concept by obtaining the representation using three kinds of technologies for activity inference. The RB is based on both analysis and recognition of artifact behavior for activity inference. A practical case is shown in a nursing home where a system achieving 91.35% effectiveness was implemented in situ. Three examples are shown using RB representation for activity representation. The framework description, RB description and CALog system address distinct problems, such as the feasibility of implementing AmI systems, and show the feasibility of meeting the challenges related to activity recognition based on artifact recognition. We discuss how the use of RBs might positively impact the problems faced by designers and developers in recovering information more easily, so that they can develop tools focused on the user. PMID:22368512

  18. eFSM--a novel online neural-fuzzy semantic memory model.

    PubMed

    Tung, Whye Loon; Quek, Chai

    2010-01-01

    Fuzzy rule-based systems (FRBSs) have been successfully applied to many areas. However, traditional fuzzy systems are often manually crafted, and their rule bases that represent the acquired knowledge are static and cannot be trained to improve the modeling performance. This subsequently leads to intensive research on the autonomous construction and tuning of a fuzzy system directly from the observed training data to address the knowledge acquisition bottleneck, resulting in well-established hybrids such as neural-fuzzy systems (NFSs) and genetic fuzzy systems (GFSs). However, the complex and dynamic nature of real-world problems demands that fuzzy rule-based systems and models be able to adapt their parameters and ultimately evolve their rule bases to address the nonstationary (time-varying) characteristics of their operating environments. Recently, considerable research efforts have been directed to the study of evolving Takagi-Sugeno (T-S)-type NFSs based on the concept of incremental learning. In contrast, there are very few incremental learning Mamdani-type NFSs reported in the literature. Hence, this paper presents the evolving neural-fuzzy semantic memory (eFSM) model, a neural-fuzzy Mamdani architecture with a data-driven progressively adaptive structure (i.e., rule base) based on incremental learning. Issues related to the incremental learning of the eFSM rule base are carefully investigated, and a novel parameter learning approach is proposed for the tuning of the fuzzy set parameters in eFSM. The proposed eFSM model elicits highly interpretable semantic knowledge in the form of Mamdani-type if-then fuzzy rules from low-level numeric training data. These Mamdani fuzzy rules define the computing structure of eFSM and are incrementally learned with the arrival of each training data sample. New rules are constructed from the emergence of novel training data, and obsolete fuzzy rules that no longer describe the recently observed data trends are pruned. This enables eFSM to maintain a current and compact set of Mamdani-type if-then fuzzy rules that collectively generalizes and describes the salient associative mappings between the inputs and outputs of the underlying process being modeled. The learning and modeling performances of the proposed eFSM are evaluated using several benchmark applications and the results are encouraging.
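
    eFSM's contribution is the online evolution of the rule base; the underlying Mamdani if-then machinery can be sketched minimally as follows (a static two-rule system with min implication, max aggregation and centroid defuzzification — no incremental learning shown):

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership function, vectorized over x."""
        return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

    # Output universe and two Mamdani output sets (shoulders extend off-range).
    y = np.linspace(0.0, 1.0, 201)
    out_low, out_high = tri(y, -0.5, 0.0, 0.5), tri(y, 0.5, 1.0, 1.5)

    def infer(x):
        """Two illustrative rules: IF x is small THEN y is low;
        IF x is large THEN y is high. Min for implication, max to
        aggregate, centroid to defuzzify."""
        fire_small = tri(np.array([x]), -0.6, 0.0, 0.6)[0]
        fire_large = tri(np.array([x]),  0.4, 1.0, 1.6)[0]
        agg = np.maximum(np.minimum(fire_small, out_low),
                         np.minimum(fire_large, out_high))
        return (y * agg).sum() / (agg.sum() + 1e-12)

    for x in (0.1, 0.5, 0.9):
        print(f"x={x:.1f} -> y={infer(x):.3f}")
    ```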

  19. Organizational Knowledge Transfer Using Ontologies and a Rule-Based System

    NASA Astrophysics Data System (ADS)

    Okabe, Masao; Yoshioka, Akiko; Kobayashi, Keido; Yamaguchi, Takahira

    In recent automated and integrated manufacturing, so-called intelligence skill is becoming more and more important, and its efficient transfer to next-generation engineers is one of the urgent issues. In this paper, we propose a new approach without costly OJT (on-the-job training), namely the combined usage of a domain ontology, a rule ontology and a rule-based system. Intelligence skill can be decomposed into pieces of simple engineering rules. A rule ontology consists of these engineering rules as primitives and the semantic relations among them. A domain ontology consists of the technical terms in the engineering rules and the semantic relations among them. A rule ontology helps novices get the total picture of the intelligence skill, and a domain ontology helps them understand the exact meanings of the engineering rules. A rule-based system helps domain experts externalize their tacit intelligence skill to ontologies and also helps novices internalize it. As a case study, we applied our proposal to an actual job at a remote control and maintenance office of hydroelectric power stations of Tokyo Electric Power Co., Inc. We also conducted an evaluation experiment for this case study, and the results support our proposal.

  20. Coupling functions: Universal insights into dynamical interaction mechanisms

    NASA Astrophysics Data System (ADS)

    Stankovski, Tomislav; Pereira, Tiago; McClintock, Peter V. E.; Stefanovska, Aneta

    2017-10-01

    The dynamical systems found in nature are rarely isolated. Instead they interact and influence each other. The coupling functions that connect them contain detailed information about the functional mechanisms underlying the interactions and prescribe the physical rule specifying how an interaction occurs. A coherent and comprehensive review is presented encompassing the rapid progress made recently in the analysis, understanding, and applications of coupling functions. The basic concepts and characteristics of coupling functions are presented through demonstrative examples of different domains, revealing the mechanisms and emphasizing their multivariate nature. The theory of coupling functions is discussed through gradually increasing complexity from strong and weak interactions to globally coupled systems and networks. A variety of methods that have been developed for the detection and reconstruction of coupling functions from measured data is described. These methods are based on different statistical techniques for dynamical inference. Stemming from physics, such methods are being applied in diverse areas of science and technology, including chemistry, biology, physiology, neuroscience, social sciences, mechanics, and secure communications. This breadth of application illustrates the universality of coupling functions for studying the interaction mechanisms of coupled dynamical systems.

  1. Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems.

    PubMed

    Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika

    2017-06-01

    This work proposes principled strategies for self-adaptations in EEG-based Brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and a way to enable fluent and intuitive interaction in embodiment systems. The main focus is laid upon inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidence, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated or physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown to be able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for individual users. The proposed methods can be easily integrated in devising more advanced SC schemes and/or strategies for automatic BCI self-adaptations.

  2. Goal-recognition-based adaptive brain-computer interface for navigating immersive robotic systems

    NASA Astrophysics Data System (ADS)

    Abu-Alqumsan, Mohammad; Ebert, Felix; Peer, Angelika

    2017-06-01

    Objective. This work proposes principled strategies for self-adaptations in EEG-based Brain-computer interfaces (BCIs) as a way out of the bandwidth bottleneck resulting from the considerable mismatch between the low-bandwidth interface and the bandwidth-hungry application, and a way to enable fluent and intuitive interaction in embodiment systems. The main focus is laid upon inferring the hidden target goals of users while navigating in a remote environment as a basis for possible adaptations. Approach. To reason about possible user goals, a general user-agnostic Bayesian update rule is devised to be recursively applied upon the arrival of evidence, i.e. user input and user gaze. Experiments were conducted with healthy subjects within robotic embodiment settings to evaluate the proposed method. These experiments varied along three factors: the type of the robot/environment (simulated or physical), the type of the interface (keyboard or BCI), and the way goal recognition (GR) is used to guide a simple shared control (SC) driving scheme. Main results. Our results show that the proposed GR algorithm is able to track and infer the hidden user goals with relatively high precision and recall. Further, the realized SC driving scheme benefits from the output of the GR system and is able to reduce the user effort needed to accomplish the assigned tasks. Despite the fact that the BCI requires higher effort compared to the keyboard conditions, most subjects were able to complete the assigned tasks, and the proposed GR system is additionally shown to be able to handle the uncertainty in user input during SSVEP-based interaction. The SC application of the belief vector indicates that the benefits of the GR module are more pronounced for BCIs, compared to the keyboard interface. Significance. Being based on intuitive heuristics that model the behavior of the general population during the execution of navigation tasks, the proposed GR method can be used without prior tuning for individual users. The proposed methods can be easily integrated in devising more advanced SC schemes and/or strategies for automatic BCI self-adaptations.
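
    A minimal sketch of the recursive Bayesian update over candidate goals that both records describe; the goals, prior and likelihoods are illustrative:

    ```python
    import numpy as np

    # Three candidate navigation goals with a uniform prior.
    goals = ["kitchen", "desk", "door"]
    belief = np.ones(len(goals)) / len(goals)

    def update(belief, likelihood):
        """One recursive Bayesian step: posterior ~ likelihood * prior."""
        posterior = belief * likelihood
        return posterior / posterior.sum()

    # Illustrative evidence stream: P(evidence | goal) for each goal, from
    # user input (e.g., a steering command) and gaze direction.
    evidence = [np.array([0.7, 0.2, 0.1]),   # user steers left
                np.array([0.6, 0.3, 0.1]),   # gaze dwells on the left region
                np.array([0.8, 0.1, 0.1])]   # steers left again

    for lik in evidence:
        belief = update(belief, lik)
    print(dict(zip(goals, belief.round(3))))
    ```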

  3. A multilayer perceptron solution to the match phase problem in rule-based artificial intelligence systems

    NASA Technical Reports Server (NTRS)

    Sartori, Michael A.; Passino, Kevin M.; Antsaklis, Panos J.

    1992-01-01

    In rule-based AI planning, expert, and learning systems, it is often the case that the left-hand-sides of the rules must be repeatedly compared to the contents of some 'working memory'. The traditional approach to solve such a 'match phase problem' for production systems is to use the Rete Match Algorithm. Here, a new technique using a multilayer perceptron, a particular artificial neural network model, is presented to solve the match phase problem for rule-based AI systems. A syntax for premise formulas (i.e., the left-hand-sides of the rules) is defined, and working memory is specified. From this, it is shown how to construct a multilayer perceptron that finds all of the rules which can be executed for the current situation in working memory. The complexity of the constructed multilayer perceptron is derived in terms of the maximum number of nodes and the required number of layers. A method for reducing the number of layers to at most three is also presented.
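
    A minimal sketch of the construction idea, assuming hard-threshold units: each rule's left-hand side becomes one unit whose weights encode its literals, so the layer outputs the conflict set in a single pass (the paper's general construction and layer-count bounds are not reproduced here):

    ```python
    import numpy as np

    # Working memory over 4 boolean facts (illustrative).
    facts = ["hot", "dry", "windy", "night"]
    wm = np.array([1, 1, 0, 0])          # hot=T, dry=T, windy=F, night=F

    # One "match" unit per rule LHS; weights encode literals:
    # +1 = fact required true, -1 = required false, 0 = don't care.
    # R1: hot AND dry;  R2: windy AND NOT night;  R3: dry AND NOT windy.
    W = np.array([[1, 1, 0, 0],
                  [0, 0, 1, -1],
                  [0, 1, -1, 0]])
    n_pos = (W == 1).sum(axis=1)

    # A unit fires iff every positive literal is satisfied and no negative
    # literal is violated, i.e. iff W @ wm >= n_pos (hard-threshold units).
    fires = (W @ wm >= n_pos).astype(int)
    print({f"R{i+1}": bool(f) for i, f in enumerate(fires)})  # the conflict set
    ```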

  4. ARROWSMITH-P: A prototype expert system for software engineering management

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Ramsey, Connie Loggia

    1985-01-01

    Although the field of software engineering is relatively new, it can benefit from the use of expert systems. Two prototype expert systems were developed to aid in software engineering management. Given the values for certain metrics, these systems will provide interpretations which explain any abnormal patterns of these values during the development of a software project. The two systems, which solve the same problem, were built using different methods, rule-based deduction and frame-based abduction. A comparison was done to see which method was better suited to the needs of this field. It was found that both systems performed moderately well, but the rule-based deduction system using simple rules provided more complete solutions than did the frame-based abduction system.

  5. The logical primitives of thought: Empirical foundations for compositional cognitive models.

    PubMed

    Piantadosi, Steven T; Tenenbaum, Joshua B; Goodman, Noah D

    2016-07-01

    The notion of a compositional language of thought (LOT) has been central in computational accounts of cognition from earliest attempts (Boole, 1854; Fodor, 1975) to the present day (Feldman, 2000; Penn, Holyoak, & Povinelli, 2008; Fodor, 2008; Kemp, 2012; Goodman, Tenenbaum, & Gerstenberg, 2015). Recent modeling work shows how statistical inferences over compositionally structured hypothesis spaces might explain learning and development across a variety of domains. However, the primitive components of such representations are typically assumed a priori by modelers and theoreticians rather than determined empirically. We show how different sets of LOT primitives, embedded in a psychologically realistic approximate Bayesian inference framework, systematically predict distinct learning curves in rule-based concept learning experiments. We use this feature of LOT models to design a set of large-scale concept learning experiments that can determine the most likely primitives for psychological concepts involving Boolean connectives and quantification. Subjects' inferences are most consistent with a rich (nonminimal) set of Boolean operations, including first-order, but not second-order, quantification. Our results more generally show how specific LOT theories can be distinguished empirically. (PsycINFO Database Record (c) 2016 APA, all rights reserved).

  6. CRISPR-based herd immunity can limit phage epidemics in bacterial populations

    PubMed Central

    Geyrhofer, Lukas; Barton, Nicholas H

    2018-01-01

    Herd immunity, a process in which resistant individuals limit the spread of a pathogen among susceptible hosts, has been extensively studied in eukaryotes. Even though bacteria have evolved multiple immune systems against their phage pathogens, herd immunity in bacteria remains unexplored. Here we experimentally demonstrate that herd immunity arises during phage epidemics in structured and unstructured Escherichia coli populations consisting of differing frequencies of susceptible and resistant cells harboring CRISPR immunity. In addition, we develop a mathematical model that quantifies how herd immunity is affected by spatial population structure, bacterial growth rate, and phage replication rate. Using our model we infer a general epidemiological rule describing the relative speed of an epidemic in partially resistant spatially structured populations. Our experimental and theoretical findings indicate that herd immunity may be important in bacterial communities, allowing for stable coexistence of bacteria and their phages and the maintenance of polymorphism in bacterial immunity. PMID:29521625

  7. Simulation of operating rules and discretional decisions using a fuzzy rule-based system integrated into a water resources management model

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2013-04-01

    Water resources systems are operated, mostly, using a set of pre-defined rules that usually respond not to an optimal allocation in terms of water use or economic benefit, but to historical and institutional reasons. These operating policies are commonly expressed as hedging rules, pack rules or zone-based operations, and simulation models can be used to test their performance under a wide range of hydrological and/or socio-economic hypotheses. Despite the high degree of acceptance and testing that these models have achieved, the actual operation of water resources systems hardly ever follows the pre-defined rules all the time, with consequent uncertainty in the system performance. Real-world reservoir operation is very complex, affected by input uncertainty (imprecision in forecast inflow, seepage and evaporation losses, etc.), filtered by the reservoir operator's experience and natural risk-aversion, while considering the different physical and legal/institutional constraints in order to meet the different demands and system requirements. The aim of this work is to present a fuzzy logic approach to derive and assess the historical operation of a system. This framework uses a fuzzy rule-based system to reproduce pre-defined rules and also to match as closely as possible the actual decisions made by managers. Once built, the fuzzy rule-based system can be integrated into a water resources management model, making it possible to assess the system performance at the basin scale. The case study of the Mijares basin (eastern Spain) is used to illustrate the method. A reservoir operating curve regulates the two main reservoir releases (operated in a conjunctive way) with the purpose of guaranteeing a high reliability of supply to the traditional irrigation districts with higher priority (more senior demands that funded the reservoir construction). A fuzzy rule-based system has been created to reproduce the operating curve's performance, defining the system state (total water stored in the reservoirs) and the month of the year as inputs, and the demand deliveries as outputs. The developed simulation management model integrates the fuzzy-rule system for the operation of the two main reservoirs of the basin with the corresponding mass balance equations, the physical or boundary conditions and the water allocation rules among the competing demands. Historical inflow time series are used as inputs to the model simulation, which is trained and validated using historical information on reservoir storage levels and flows in several streams of the Mijares river. This methodology provides a more flexible approach that is closer to real policies. The model is easy to develop and to understand due to its rule-based structure, which mimics the human way of thinking. This can improve cooperation and negotiation between managers, decision-makers and stakeholders. The approach can also be applied to analyze the historical operation of the reservoir (what we have called a reservoir operation "audit").
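
    A minimal sketch of a fuzzy operating rule of this kind, using zero-order Sugeno rules (a weighted average of fixed consequents) with storage and month as inputs; the rule base and numbers are illustrative, not the Mijares system's:

    ```python
    import numpy as np

    def tri(x, a, b, c):
        """Triangular membership value of scalar x."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def release_fraction(storage, month):
        """Zero-order Sugeno rules: IF storage is S AND season is M THEN
        deliver a fixed fraction of demand (illustrative rule base)."""
        irrigation = tri(month, 3, 7, 10)          # peak demand in summer
        rules = [
            (tri(storage, -0.1, 0.0, 0.4) * irrigation, 0.5),  # low, summer
            (tri(storage,  0.2, 0.5, 0.8) * irrigation, 0.8),  # mid, summer
            (tri(storage,  0.6, 1.0, 1.4) * irrigation, 1.0),  # high, summer
            (1.0 - irrigation,                          0.6),  # off season
        ]
        w = np.array([r[0] for r in rules])
        z = np.array([r[1] for r in rules])
        return float((w * z).sum() / (w.sum() + 1e-12))

    print(release_fraction(storage=0.35, month=6))  # fraction of demand served
    ```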

  8. Designing boosting ensemble of relational fuzzy systems.

    PubMed

    Scherer, Rafał

    2010-10-01

    A method frequently used in classification systems for improving classification accuracy is to combine the outputs of several classifiers. Among the various types of classifiers, fuzzy ones are tempting because they use intelligible fuzzy if-then rules. In this paper we build an AdaBoost ensemble of relational neuro-fuzzy classifiers. Relational fuzzy systems bond input and output fuzzy linguistic values by a binary relation; thus, compared to traditional fuzzy systems, fuzzy rules have additional weights: the elements of a fuzzy relation matrix. Thanks to this, the system adjusts better to the data during learning. In this paper an ensemble of relational fuzzy systems is proposed. The problem is that such an ensemble contains separate rule bases which cannot be directly merged. As the systems are separate, we cannot treat fuzzy rules coming from different systems as rules from the same (single) system. In the paper, the problem is addressed by a novel design of the fuzzy systems constituting the ensemble, resulting in normalization of the individual rule bases during learning. The method described in the paper is tested on several known benchmarks and compared with other machine learning solutions from the literature.
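
    A minimal sketch of the AdaBoost wrapper, with decision stumps standing in for the relational neuro-fuzzy base classifiers (the data and round count are illustrative):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    # Toy 1-D two-class data (labels in {-1, +1}).
    X = rng.uniform(0, 1, 200)
    y = np.where(X + 0.1 * rng.normal(size=200) > 0.5, 1, -1)

    def fit_stump(X, y, w):
        """Best decision stump under sample weights w."""
        best = (None, None, np.inf)            # (threshold, polarity, error)
        for thr in np.unique(X):
            for pol in (1, -1):
                pred = np.where(X > thr, pol, -pol)
                err = w[pred != y].sum()
                if err < best[2]:
                    best = (thr, pol, err)
        return best

    w = np.ones(len(X)) / len(X)               # uniform initial weights
    stumps = []
    for _ in range(10):                        # boosting rounds
        thr, pol, err = fit_stump(X, y, w)
        alpha = 0.5 * np.log((1 - err) / max(err, 1e-12))
        pred = np.where(X > thr, pol, -pol)
        w *= np.exp(-alpha * y * pred)         # re-weight hard examples
        w /= w.sum()
        stumps.append((thr, pol, alpha))

    agg = sum(a * np.where(X > t, p, -p) for t, p, a in stumps)
    print("training accuracy:", (np.sign(agg) == y).mean())
    ```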

  9. Compartmental and Spatial Rule-Based Modeling with Virtual Cell.

    PubMed

    Blinov, Michael L; Schaff, James C; Vasilescu, Dan; Moraru, Ion I; Bloom, Judy E; Loew, Leslie M

    2017-10-03

    In rule-based modeling, molecular interactions are systematically specified in the form of reaction rules that serve as generators of reactions. This provides a way to account for all the potential molecular complexes and interactions among multivalent or multistate molecules. Recently, we introduced rule-based modeling into the Virtual Cell (VCell) modeling framework, permitting graphical specification of rules and merger of networks generated automatically (using the BioNetGen modeling engine) with hand-specified reaction networks. VCell provides a number of ordinary differential equation and stochastic numerical solvers for single-compartment simulations of the kinetic systems derived from these networks, and agent-based network-free simulation of the rules. In this work, compartmental and spatial modeling of rule-based models has been implemented within VCell. To enable rule-based deterministic and stochastic spatial simulations and network-free agent-based compartmental simulations, the BioNetGen and NFSim engines were each modified to support compartments. In the new rule-based formalism, every reactant and product pattern and every reaction rule are assigned locations. We also introduce the rule-based concept of molecular anchors. This assures that any species that has a molecule anchored to a predefined compartment will remain in this compartment. Importantly, in addition to formulation of compartmental models, this now permits VCell users to seamlessly connect reaction networks derived from rules to explicit geometries to automatically generate a system of reaction-diffusion equations. These may then be simulated using either the VCell partial differential equations deterministic solvers or the Smoldyn stochastic simulator. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  10. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.

  11. A Theory of Conditional Information for Probabilistic Inference in Intelligent Systems: 1. Interval of Events Approach

    DTIC Science & Technology

    1994-06-01

    "Conditional events" as well-defined objects as in De Finetti [14], Gilio [15]. When the strength of the rule b → a is computed in the context of ... uncertain outcome (see, e.g., McGee [5]) or a coherency argument in the sense of De Finetti as employed by Gilio et al. [15], [17], or Coletti et al. [18] ... probability through a scoring characterization, extending De Finetti's coherency principle. (See also Gilio et al. [17] for additional results.)

  12. Creating an ontology driven rules base for an expert system for medical diagnosis.

    PubMed

    Bertaud Gounot, Valérie; Donfack, Valéry; Lasbleiz, Jérémy; Bourde, Annabel; Duvauferrier, Régis

    2011-01-01

    Expert systems of the 1980s failed because of the difficulty of maintaining large rule bases. The current work proposes a method to build and maintain rule bases grounded in ontologies (like the NCIT). The process described here for an expert system on plasma cell disorders encompasses the extraction of a sub-ontology and the automatic and comprehensive generation of production rules. The creation of rules is not based directly on classes, but on individuals (instances). Instances can be considered as prototypes of diseases formally defined by restrictions in the ontology. Thus, it is possible to use this process to make diagnoses of diseases. The perspectives of this work are considered: the process described with an ontology formalized in OWL 1 can be extended by using an ontology in OWL 2, allowing reasoning about numerical data in addition to symbolic data.

  13. An expert system shell for inferring vegetation characteristics

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann; Harrison, Patrick R.

    1992-01-01

    The NASA VEGetation Workbench (VEG) is a knowledge based system that infers vegetation characteristics from reflectance data. The report describes the extensions that have been made to the first generation version of VEG. An interface to a file of unknown cover type data has been constructed. An interface that allows the results of VEG to be written to a file has been implemented. A learning system that learns class descriptions from a database of historical cover type data and then uses the learned class descriptions to classify an unknown sample has been built. This system has an interface that integrates it into the rest of VEG. The VEG subgoal PROPORTION.GROUND.COVER has been completed and a number of additional techniques that infer the proportion of ground cover of a sample have been implemented.

  14. Intrusion Detection Systems with Live Knowledge System

    DTIC Science & Technology

    2016-05-31

    This work proposes a novel approach that uses Ripple-Down Rules (RDR) to maintain the knowledge from human experts together with a knowledge base generated by Induct RDR, a machine-learning-based RDR method. ... The proposed Induct RDR (Ripple-Down Rules) approach allows acquiring the phishing detection model.

  15. A simple signaling rule for variable life-adjusted display derived from an equivalent risk-adjusted CUSUM chart.

    PubMed

    Wittenberg, Philipp; Gan, Fah Fatt; Knoth, Sven

    2018-04-17

    The variable life-adjusted display (VLAD) is the first risk-adjusted graphical procedure proposed in the literature for monitoring the performance of a surgeon. It displays the cumulative sum of expected minus observed deaths. It has since become highly popular because the statistic plotted is easy to understand. But it is also easy to misinterpret a surgeon's performance from the VLAD, potentially leading to grave consequences. The problem of misinterpretation is essentially caused by the variance of the VLAD's statistic, which increases with sample size. In order for the VLAD to be truly useful, a simple signaling rule is desperately needed. Various forms of signaling rules have been developed, but they are usually quite complicated. Without signaling rules, making inferences using the VLAD alone is difficult if not misleading. In this paper, we establish an equivalence between a VLAD with V-mask and a risk-adjusted cumulative sum (RA-CUSUM) chart based on the difference between the estimated probability of death and the surgical outcome. Average run length analysis based on simulation shows that this particular RA-CUSUM chart performs similarly to the established RA-CUSUM chart based on the log-likelihood ratio statistic obtained by testing the odds ratio of death. We provide a simple design procedure for determining the V-mask parameters based on a resampling approach. Resampling from a real data set ensures that these parameters can be estimated appropriately. Finally, we illustrate the monitoring of a real surgeon's performance using the VLAD with V-mask. Copyright © 2018 John Wiley & Sons, Ltd.
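
    A minimal sketch of the two statistics discussed above — the VLAD as the cumulative sum of expected minus observed deaths, and a CUSUM on the same differences; the allowance k and limit h below are illustrative, not the paper's resampling-designed values:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200
    p = rng.uniform(0.02, 0.2, n)               # risk-adjusted death probabilities
    y = (rng.uniform(size=n) < p).astype(int)   # surgical outcomes (1 = death)

    # VLAD: cumulative expected-minus-observed deaths.
    vlad = np.cumsum(p - y)

    # A CUSUM on the same differences (illustrative design, not the
    # paper's): accumulate excess mortality and signal past a limit.
    k, h = 0.01, 2.0
    s, signals = 0.0, []
    for i in range(n):
        s = max(0.0, s + (y[i] - p[i]) - k)
        if s > h:
            signals.append(i)
            s = 0.0                              # restart after a signal
    print("final VLAD:", round(vlad[-1], 2), "| signals at cases:", signals)
    ```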

  16. XSemantic: An Extension of LCA Based XML Semantic Search

    NASA Astrophysics Data System (ADS)

    Supasitthimethee, Umaporn; Shimizu, Toshiyuki; Yoshikawa, Masatoshi; Porkaew, Kriengkrai

    One of the most convenient ways to query XML data is a keyword search, because it does not require any knowledge of the XML structure or learning a new user interface. However, keyword search is ambiguous. Users may use different terms to search for the same information. Furthermore, it is difficult for a system to decide which node is likely to be chosen as a return node and how much information should be included in the result. To address these challenges, we propose an XML semantic search based on keywords called XSemantic. On the one hand, we give three definitions to make the search complete in terms of semantics. First, through semantic term expansion, our system is robust to ambiguous keywords by using the domain ontology. Second, to return semantically meaningful answers, we automatically infer the return information from the user queries and take advantage of the shortest path to return meaningful connections between keywords. Third, we present a semantic ranking that reflects the degree of similarity as well as the semantic relationship, so that the search results with the higher relevance are presented to the users first. On the other hand, as in the LCA and the proximity search approaches, we investigated the problem of the information included in the search results. Therefore, we introduce the notion of the Lowest Common Element Ancestor (LCEA) and define our simple rule without any requirement on schema information such as the DTD or XML Schema. The first experiment indicated that XSemantic not only properly infers the return information but also generates compact meaningful results. Additionally, the benefits of our proposed semantics are demonstrated by the second experiment.

  17. Method of fuzzy inference for one class of MISO-structure systems with non-singleton inputs

    NASA Astrophysics Data System (ADS)

    Sinuk, V. G.; Panchenko, M. V.

    2018-03-01

    In fuzzy modeling, the inputs of the simulated systems can receive both crisp (singleton) values and non-singleton fuzzy values. The computational complexity of fuzzy inference with fuzzy non-singleton inputs grows exponentially with the number of inputs. This paper describes a new method of inference, based on a theorem on the decomposition of a multidimensional fuzzy implication and the fuzzy truth value. This method is considered for fuzzy inputs and has polynomial complexity, which makes it possible to use it for modeling large-dimensional MISO-structure systems.

  18. Incorporation of expert variability into breast cancer treatment recommendation in designing clinical protocol guided fuzzy rule system models.

    PubMed

    Garibaldi, Jonathan M; Zhou, Shang-Ming; Wang, Xiao-Ying; John, Robert I; Ellis, Ian O

    2012-06-01

    It has been often demonstrated that clinicians exhibit both inter-expert and intra-expert variability when making difficult decisions. In contrast, the vast majority of computerized models that aim to provide automated support for such decisions do not explicitly recognize or replicate this variability. Furthermore, the perfect consistency of computerized models is often presented as a de facto benefit. In this paper, we describe a novel approach to incorporate variability within a fuzzy inference system using non-stationary fuzzy sets in order to replicate human variability. We apply our approach to a decision problem concerning the recommendation of post-operative breast cancer treatment; specifically, whether or not to administer chemotherapy based on assessment of five clinical variables: NPI (the Nottingham Prognostic Index), estrogen receptor status, vascular invasion, age and lymph node status. In doing so, we explore whether such explicit modeling of variability provides any performance advantage over a more conventional fuzzy approach, when tested on a set of 1310 unselected cases collected over a fourteen year period at the Nottingham University Hospitals NHS Trust, UK. The experimental results show that the standard fuzzy inference system (that does not model variability) achieves overall agreement to clinical practice around 84.6% (95% CI: 84.1-84.9%), while the non-stationary fuzzy model can significantly increase performance to around 88.1% (95% CI: 88.0-88.2%), p<0.001. We conclude that non-stationary fuzzy models provide a valuable new approach that may be applied to clinical decision support systems in any application domain. Copyright © 2012 Elsevier Inc. All rights reserved.

  19. Network-Based Method for Identifying Co-Regeneration Genes in Bone, Dentin, Nerve and Vessel Tissues

    PubMed Central

    Pan, Hongying; Zhang, Yu-Hang; Feng, Kaiyan; Kong, XiangYin; Cai, Yu-Dong

    2017-01-01

    Bone and dental diseases are serious public health problems. Most current clinical treatments for these diseases can produce side effects. Regeneration is a promising therapy for bone and dental diseases, yielding natural tissue recovery with few side effects. Because soft tissues inside the bone and dentin are densely populated with nerves and vessels, the study of bone and dentin regeneration should also consider the co-regeneration of nerves and vessels. In this study, a network-based method to identify co-regeneration genes for bone, dentin, nerve and vessel was constructed based on an extensive network of protein–protein interactions. Three procedures were applied in the network-based method. The first procedure, searching, sought the shortest paths connecting regeneration genes of one tissue type with regeneration genes of other tissues, thereby extracting possible co-regeneration genes. The second procedure, testing, employed a permutation test to evaluate whether possible genes were false discoveries; these genes were excluded by the testing procedure. The last procedure, screening, employed two rules, the betweenness ratio rule and interaction score rule, to select the most essential genes. A total of seventeen genes were inferred by the method, which were deemed to contribute to co-regeneration of at least two tissues. All these seventeen genes were extensively discussed to validate the utility of the method. PMID:28974058

  20. Network-Based Method for Identifying Co-Regeneration Genes in Bone, Dentin, Nerve and Vessel Tissues.

    PubMed

    Chen, Lei; Pan, Hongying; Zhang, Yu-Hang; Feng, Kaiyan; Kong, XiangYin; Huang, Tao; Cai, Yu-Dong

    2017-10-02

    Bone and dental diseases are serious public health problems. Most current clinical treatments for these diseases can produce side effects. Regeneration is a promising therapy for bone and dental diseases, yielding natural tissue recovery with few side effects. Because soft tissues inside the bone and dentin are densely populated with nerves and vessels, the study of bone and dentin regeneration should also consider the co-regeneration of nerves and vessels. In this study, a network-based method to identify co-regeneration genes for bone, dentin, nerve and vessel was constructed based on an extensive network of protein-protein interactions. Three procedures were applied in the network-based method. The first procedure, searching, sought the shortest paths connecting regeneration genes of one tissue type with regeneration genes of other tissues, thereby extracting possible co-regeneration genes. The second procedure, testing, employed a permutation test to evaluate whether possible genes were false discoveries; these genes were excluded by the testing procedure. The last procedure, screening, employed two rules, the betweenness ratio rule and interaction score rule, to select the most essential genes. A total of seventeen genes were inferred by the method, which were deemed to contribute to co-regeneration of at least two tissues. All these seventeen genes were extensively discussed to validate the utility of the method.
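
    A minimal sketch of the searching step on a toy protein-protein interaction network (gene symbols and edge weights are illustrative; the permutation test and the two screening rules are not shown):

    ```python
    import networkx as nx

    # Toy weighted protein-protein interaction network (illustrative edges;
    # lower weight = stronger interaction evidence).
    edges = [("BMP2", "SMAD1", 1), ("SMAD1", "RUNX2", 1), ("RUNX2", "VEGFA", 2),
             ("NGF", "TRKA", 1), ("TRKA", "RUNX2", 2), ("VEGFA", "KDR", 1)]
    G = nx.Graph()
    G.add_weighted_edges_from(edges)

    bone_genes, nerve_genes = {"BMP2", "RUNX2"}, {"NGF"}

    # Searching step: shortest paths between regeneration genes of different
    # tissues; interior nodes are candidate co-regeneration genes.
    candidates = set()
    for s in bone_genes:
        for t in nerve_genes:
            path = nx.shortest_path(G, s, t, weight="weight")
            candidates.update(path[1:-1])      # exclude the endpoints
    print("candidate co-regeneration genes:",
          candidates - bone_genes - nerve_genes)
    ```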

  1. Foundational model of structural connectivity in the nervous system with a schema for wiring diagrams, connectome, and basic plan architecture

    PubMed Central

    Swanson, Larry W.; Bota, Mihail

    2010-01-01

    The nervous system is a biological computer integrating the body's reflex and voluntary environmental interactions (behavior) with a relatively constant internal state (homeostasis)—promoting survival of the individual and species. The wiring diagram of the nervous system's structural connectivity provides an obligatory foundational model for understanding functional localization at molecular, cellular, systems, and behavioral organization levels. This paper provides a high-level, downwardly extendible, conceptual framework—like a compass and map—for describing and exploring in neuroinformatics systems (such as our Brain Architecture Knowledge Management System) the structural architecture of the nervous system's basic wiring diagram. For this, the Foundational Model of Connectivity's universe of discourse is the structural architecture of nervous system connectivity in all animals at all resolutions, and the model includes two key elements—a set of basic principles and an internally consistent set of concepts (defined vocabulary of standard terms)—arranged in an explicitly defined schema (set of relationships between concepts) allowing automatic inferences. In addition, rules and procedures for creating and modifying the foundational model are considered. Controlled vocabularies with broad community support typically are managed by standing committees of experts that create and refine boundary conditions, and a set of rules that are available on the Web. PMID:21078980

  2. Parsing the roles of the frontal lobes and basal ganglia in task control using multivoxel pattern analysis

    PubMed Central

    Kehagia, Angie A.; Ye, Rong; Joyce, Dan W.; Doyle, Orla M.; Rowe, James B.; Robbins, Trevor W.

    2017-01-01

    Cognitive control has traditionally been associated with the prefrontal cortex, based on observations of deficits in patients with frontal lesions. However, evidence from patients with Parkinson’s disease (PD) indicates that subcortical regions also contribute to control under certain conditions. We scanned 17 healthy volunteers while they performed a task switching paradigm that previously dissociated performance deficits arising from frontal lesions in comparison with PD, as a function of the abstraction of the rules that are switched. From a multivoxel pattern analysis by Gaussian Process Classification (GPC), we then estimated the forward (generative) model to infer regional patterns of activity that predict Switch / Repeat behaviour between rule conditions. At 1000 permutations, Switch / Repeat classification accuracy for concrete rules was significant in the basal ganglia, but at chance in the frontal lobe. The inverse pattern was obtained for abstract rules, whereby the conditions were successfully discriminated in the frontal lobe but not in the basal ganglia. This double dissociation highlights the difference between cortical and subcortical contributions to cognitive control and demonstrates the utility of multivariate approaches in investigations of functions that rely on distributed and overlapping neural substrates. PMID:28387585

  3. Expert system shell to reason on large amounts of data

    NASA Technical Reports Server (NTRS)

    Giuffrida, Gionanni

    1994-01-01

    Current data base management systems (DBMSs) do not provide a sophisticated environment for developing rule-based expert system applications. Some of the new DBMSs come with some sort of rule mechanism; these are active and deductive database systems. However, neither of these is featured enough to support a full implementation based on rules. On the other hand, current expert system shells do not provide any link with external databases. That is, all the data are kept in the system working memory, and such working memory is maintained in main memory. For some applications the reduced size of the available working memory can be a constraint on development. Typically these are applications which require reasoning over huge amounts of data that do not fit into the computer's main memory. Moreover, in some cases these data may already be available in database systems and continuously updated while the expert system is running. This paper proposes an architecture which employs knowledge discovery techniques to reduce the amount of data to be stored in main memory; in this architecture a standard DBMS is coupled with a rule-based language. The data are stored in the DBMS. An interface between the two systems is responsible for inducing knowledge from the set of relations. Such induced knowledge is then transferred to the rule-based language's working memory.

  4. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDG based on Bayesian inference and related to adverse (hostile) networks, including such important applications as terrorism-related and organized-crime networks.

  5. Automated rule-base creation via CLIPS-Induce

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick M.

    1994-01-01

    Many CLIPS rule-bases contain one or more rule groups that perform classification. In this paper we describe CLIPS-Induce, an automated system for the creation of a CLIPS classification rule-base from a set of test cases. CLIPS-Induce consists of two components: a decision tree induction component and a CLIPS production extraction component. ID3, a popular decision tree induction algorithm, is used to induce a decision tree from the test cases. CLIPS production extraction is accomplished through a top-down traversal of the decision tree. Nodes of the tree are used to construct query rules, and branches of the tree are used to construct classification rules. The learned CLIPS productions may easily be incorporated into a large CLIPS system that performs tasks such as accessing a database or displaying information.
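
    A minimal sketch of the extraction step may help, assuming a toy decision tree and emitting only classification rules (the paper additionally derives query rules from internal nodes); the tree shape and attribute names are hypothetical, and the emitted text is CLIPS-flavoured rule syntax.

    ```python
    # A toy decision tree: internal nodes test an attribute; leaves give a class.
    tree = {"attr": "outlook",
            "branches": {"sunny": {"attr": "humidity",
                                   "branches": {"high": "stay-in",
                                                "normal": "play"}},
                         "rain": "stay-in",
                         "overcast": "play"}}

    def extract_rules(node, conditions=()):
        """Top-down traversal: each root-to-leaf path becomes one rule."""
        if isinstance(node, str):                      # leaf -> classification rule
            lhs = " ".join(f"({a} {v})" for a, v in conditions)
            name = "-".join(v for _, v in conditions) or "default"
            return [f"(defrule classify-{name} {lhs} => (assert (class {node})))"]
        rules = []
        for value, child in node["branches"].items():  # branch -> added condition
            rules += extract_rules(child, conditions + ((node["attr"], value),))
        return rules

    for rule in extract_rules(tree):
        print(rule)
    ```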

  6. Inferring soil salinity in a drip irrigation system from multi-configuration EMI measurements using adaptive Markov chain Monte Carlo

    NASA Astrophysics Data System (ADS)

    Zaib Jadoon, Khan; Umer Altaf, Muhammad; McCabe, Matthew Francis; Hoteit, Ibrahim; Muhammad, Nisar; Moghadas, Davood; Weihermüller, Lutz

    2017-10-01

    A substantial interpretation of electromagnetic induction (EMI) measurements requires quantifying the optimal model parameters and the uncertainty of a nonlinear inverse problem. For this purpose, an adaptive Bayesian Markov chain Monte Carlo (MCMC) algorithm is used to assess multi-orientation and multi-offset EMI measurements in an agricultural field with non-saline and saline soil. In MCMC the posterior distribution is computed using Bayes' rule. An electromagnetic forward model based on the full solution of Maxwell's equations was used to simulate the apparent electrical conductivity measured with the configurations of the EMI instrument, the CMD Mini-Explorer. Uncertainty in the parameters of the three-layered earth model is investigated using synthetic data. Our results show that in the non-saline soil scenario, the layer-thickness parameters, as compared to the layer electrical conductivities, are not very informative and are therefore difficult to resolve. Application of the proposed MCMC-based inversion to field measurements in a drip irrigation system demonstrates that the model parameters can be well estimated for the saline soil as compared to the non-saline soil, and provides useful insight about parameter uncertainty for the assessment of the model outputs.
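
    As an illustration of the general idea (not the paper's adaptive algorithm or its Maxwell-based forward model), a random-walk Metropolis sampler over a toy two-parameter forward model looks like this; forward, sigma, and the observed values are stand-ins.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def forward(theta):
        # Stand-in for the electromagnetic forward model: maps layer
        # parameters to synthetic apparent-conductivity readings.
        return np.array([theta[0] + theta[1], theta[0] * theta[1]])

    observed = np.array([1.5, 0.5])
    sigma = 0.05                      # assumed measurement noise std

    def log_posterior(theta):
        if np.any(theta <= 0):        # flat prior on positive parameters
            return -np.inf
        resid = forward(theta) - observed
        return -0.5 * np.sum(resid**2) / sigma**2

    theta = np.array([1.0, 1.0])
    samples = []
    for _ in range(20000):            # random-walk Metropolis
        prop = theta + rng.normal(scale=0.05, size=2)
        if np.log(rng.random()) < log_posterior(prop) - log_posterior(theta):
            theta = prop
        samples.append(theta)

    # Note: this toy posterior is bimodal (the two parameters can swap
    # roles), which the spread of the samples makes visible.
    post = np.array(samples[5000:])   # discard burn-in
    print(post.mean(axis=0), post.std(axis=0))
    ```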

  7. Rule Based Expert System for Monitoring Real Time Drug Supply in Hospital Using Radio Frequency Identification Technology

    NASA Astrophysics Data System (ADS)

    Driandanu, Galih; Surarso, Bayu; Suryono

    2018-02-01

    Radio frequency identification (RFID) has attracted increasing attention with the emergence of various applications. This study examines the implementation of a rule-based expert system, supported by RFID technology, in a monitoring information system for drug supply in a hospital. The research facilitates monitoring of the real-time drug supply using data samples from the hospital pharmacy. The system is able to identify and count drugs and to provide warnings and reports in real time. The conclusion is that the rule-based expert system and RFID technology can facilitate quick and precise monitoring of the drug supply.

  8. Fuzzylot: a novel self-organising fuzzy-neural rule-based pilot system for automated vehicles.

    PubMed

    Pasquier, M; Quek, C; Toh, M

    2001-10-01

    This paper presents part of our research work concerned with the realisation of an Intelligent Vehicle and the technologies required for its routing, navigation, and control. An automated driver prototype has been developed using a self-organising fuzzy rule-based system (POPFNN-CRI(S)) to model and subsequently emulate human driving expertise. The ability of fuzzy logic to represent vague information using linguistic variables makes it a powerful tool to develop rule-based control systems when an exact working model is not available, as is the case for any vehicle-driving task. Designing a fuzzy system, however, is a complex endeavour, due to the need to define the variables and their associated fuzzy sets, and determine a suitable rule base. Many efforts have thus been devoted to automating this process, yielding the development of learning and optimisation techniques. One of them is the family of POP-FNNs, or Pseudo-Outer Product Fuzzy Neural Networks (TVR, AARS(S), AARS(NS), CRI, Yager). These generic self-organising neural networks developed at the Intelligent Systems Laboratory (ISL/NTU) are based on formal fuzzy mathematical theory and are able to objectively extract a fuzzy rule base from training data. In this application, a driving simulator has been developed that integrates a detailed model of the car dynamics, complete with engine characteristics and environmental parameters, and an OpenGL-based 3D-simulation interface coupled with driving wheel and accelerator/brake pedals. The simulator has been used on various road scenarios to record, from a human pilot, driving data consisting of steering and speed control actions associated with road features. Specifically, the POPFNN-CRI(S) system is used to cluster the data and extract a fuzzy rule base modelling the human driving behaviour. Finally, the effectiveness of the generated rule base has been validated using the simulator in autopilot mode.

  9. Color identification and fuzzy reasoning based monitoring and controlling of fermentation process of branched chain amino acid

    NASA Astrophysics Data System (ADS)

    Ma, Lei; Wang, Yizhong; Xu, Qingyang; Huang, Huafang; Zhang, Rui; Chen, Ning

    2009-11-01

    The main production method of branched-chain amino acids (BCAA) is microbial fermentation. In this paper, to monitor and control the BCAA fermentation process, especially its logarithmic phase, parameters such as the color of the fermentation broth, culture temperature, pH, rotation speed, dissolved oxygen, airflow rate, pressure, optical density, and residual glucose are measured and/or controlled and/or adjusted. The color of the fermentation broth is measured using the HSI color model and a BP neural network. The network's inputs are the histograms of hue (H) and saturation (S), and its output is the color description. Fermentation process parameters are adjusted using fuzzy reasoning, which is performed by inference rules. According to the practical situation of the BCAA fermentation process, all parameters are divided into four grades, and different fuzzy rules are established.

  10. Evaluating the Effectiveness of Auditing Rules for Electronic Health Record Systems

    PubMed Central

    Hedda, Monica; Malin, Bradley A.; Yan, Chao; Fabbri, Daniel

    2017-01-01

    Healthcare organizations (HCOs) often deploy rule-based auditing systems to detect insider threats to sensitive patient health information in electronic health record (EHR) systems. These rule-based systems define behavior deemed to be high-risk a priori (e.g., family member, co-worker access). While such rules seem logical, there has been little scientific investigation into the effectiveness of these auditing rules in identifying inappropriate behavior. Thus, in this paper, we introduce an approach to evaluate the effectiveness of individual high-risk rules and rank them according to their potential risk. We investigate the rate of high-risk access patterns and minimum rate of high-risk accesses that can be explained with appropriate clinical reasons in a large EHR system. An analysis of 8M accesses from one-week of data shows that specific high-risk flags occur more frequently than theoretically expected and the rate at which accesses can be explained away with five simple reasons is 16 - 43%. PMID:29854153
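
    A sketch of the evaluation idea, with hypothetical rule names and record fields: count how often each a-priori rule fires, then discount the flagged accesses that an appropriate clinical reason can explain away, and rank rules by the residual unexplained rate.

    ```python
    from collections import Counter

    # Each access record: (set of a-priori rule flags, has_clinical_explanation)
    accesses = [
        ({"coworker"}, True), ({"coworker"}, False),
        ({"family"}, True), ({"family", "coworker"}, False),
        (set(), False),
    ]

    flagged, explained = Counter(), Counter()
    for flags, has_reason in accesses:
        for rule in flags:
            flagged[rule] += 1
            if has_reason:
                explained[rule] += 1

    # Rank rules by the rate of flagged accesses that cannot be explained away:
    for rule in flagged:
        residual = 1 - explained[rule] / flagged[rule]
        print(rule, f"flag rate={flagged[rule]/len(accesses):.0%}",
              f"unexplained={residual:.0%}")
    ```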

  11. Evaluating the Effectiveness of Auditing Rules for Electronic Health Record Systems.

    PubMed

    Hedda, Monica; Malin, Bradley A; Yan, Chao; Fabbri, Daniel

    2017-01-01

    Healthcare organizations (HCOs) often deploy rule-based auditing systems to detect insider threats to sensitive patient health information in electronic health record (EHR) systems. These rule-based systems define behavior deemed to be high-risk a priori (e.g., family member, co-worker access). While such rules seem logical, there has been little scientific investigation into the effectiveness of these auditing rules in identifying inappropriate behavior. Thus, in this paper, we introduce an approach to evaluate the effectiveness of individual high-risk rules and rank them according to their potential risk. We investigate the rate of high-risk access patterns and minimum rate of high-risk accesses that can be explained with appropriate clinical reasons in a large EHR system. An analysis of 8M accesses from one-week of data shows that specific high-risk flags occur more frequently than theoretically expected and the rate at which accesses can be explained away with five simple reasons is 16 - 43%.

  12. Optimized face recognition algorithm using radial basis function neural networks and its practical applications.

    PubMed

    Yoo, Sung-Hoon; Oh, Sung-Kwun; Pedrycz, Witold

    2015-09-01

    In this study, we propose a hybrid method of face recognition by using face region information extracted from the detected face region. In the preprocessing part, we develop a hybrid approach based on the Active Shape Model (ASM) and the Principal Component Analysis (PCA) algorithm. At this step, we use a CCD (Charge Coupled Device) camera to acquire a facial image by using AdaBoost and then Histogram Equalization (HE) is employed to improve the quality of the image. ASM extracts the face contour and image shape to produce a personal profile. Then we use a PCA method to reduce the dimensionality of face images. In the recognition part, we consider the improved Radial Basis Function Neural Networks (RBF NNs) to identify a unique pattern associated with each person. The proposed RBF NN architecture consists of three functional modules realizing the condition phase, the conclusion phase, and the inference phase completed with the help of fuzzy rules coming in the standard 'if-then' format. In the formation of the condition part of the fuzzy rules, the input space is partitioned with the use of Fuzzy C-Means (FCM) clustering. In the conclusion part of the fuzzy rules, the connections (weights) of the RBF NNs are represented by four kinds of polynomials such as constant, linear, quadratic, and reduced quadratic. The values of the coefficients are determined by running a gradient descent method. The output of the RBF NNs model is obtained by running a fuzzy inference method. The essential design parameters of the network (including learning rate, momentum coefficient and fuzzification coefficient used by the FCM) are optimized by means of Differential Evolution (DE). The proposed P-RBF NNs (Polynomial based RBF NNs) are applied to facial recognition and their performance is quantified from the viewpoint of the output performance and recognition rate. Copyright © 2015 Elsevier Ltd. All rights reserved.

  13. RB-ARD: A proof of concept rule-based abort

    NASA Technical Reports Server (NTRS)

    Smith, Richard; Marinuzzi, John

    1987-01-01

    The Abort Region Determinator (ARD) is a console program in the space shuttle mission control center. During shuttle ascent, the Flight Dynamics Officer (FDO) uses the ARD to determine the possible abort modes and make abort calls for the crew. The goal of the Rule-based Abort Region Determinator (RB-ARD) project was to test the concept of providing an onboard ARD for the shuttle or an automated ARD for the mission control center (MCC). A proof of concept rule-based system was developed on an LMI Lambda computer using PICON, a knowledge-based system shell. Knowledge derived from documented flight rules and ARD operation procedures was coded in PICON rules. These rules, in conjunction with modules of conventional code, enable the RB-ARD to carry out key parts of the ARD task. Current capabilities of the RB-ARD include: continuous updating of the available abort mode, recognition of a limited number of main engine faults, and recommendation of safing actions. Safing actions recommended by the RB-ARD concern the Space Shuttle Main Engine (SSME) limit shutdown system and powerdown of the SSME AC buses.

  14. Community disassembly under global change: evidence in favor of the stress dominance hypothesis.

    PubMed

    Kuczynski, Lucie; Grenouillet, Gaël

    2018-05-22

    Ecological theory suggests that communities are not random combinations of species but rather the results of community assembly processes filtering and sorting species that are able to coexist together. To date, such processes (i.e. assembly rules) have been inferred from observed spatial patterns of biodiversity combined with null model approaches, but relatively few attempts have been made to assess how these processes may be changing through time. Specifically in the context of the on-going biodiversity crisis and global change, understanding how processes shaping communities may be changing and identifying the potential drivers underlying these changes become increasingly critical. Here, we used time series of 460 French freshwater fish communities and assessed both functional and phylogenetic diversity patterns to determine the relative importance of two key assembly rules (i.e. habitat filtering and limiting similarity) in shaping these communities over the last two decades. We aimed to (i) describe the temporal changes in both functional and phylogenetic diversity patterns, (ii) determine to what extent temporal changes in processes inferred through the use of standardized diversity indices were congruent, and (iii) test the relationships between the dynamics of assembly rules and both climatic and biotic drivers. Our results revealed that habitat filtering, though already largely predominant over limiting similarity, became more widespread over time. We also highlighted that phylogenetic and trait-based approaches offered complementary information about temporal changes in assembly rules. Finally, we found that increased environmental harshness over the study period (especially higher seasonality of temperature) led to an increase in habitat filtering and that biological invasions increased functional redundancy within communities. Overall, these findings underlie the need to develop temporal perspectives in community assembly studies, as understanding on-going temporal changes could provide a better vision about the way communities could respond to future global changes. This article is protected by copyright. All rights reserved.

  15. Towards Smart Homes Using Low Level Sensory Data

    PubMed Central

    Khattak, Asad Masood; Truc, Phan Tran Ho; Hung, Le Xuan; Vinh, La The; Dang, Viet-Hung; Guan, Donghai; Pervez, Zeeshan; Han, Manhyung; Lee, Sungyoung; Lee, Young-Koo

    2011-01-01

    Ubiquitous Life Care (u-Life care) is receiving attention because it provides high quality and low cost care services. To provide spontaneous and robust healthcare services, knowledge of a patient’s real-time daily life activities is required. Context information with real-time daily life activities can help to provide better services and to improve healthcare delivery. The performance and accuracy of existing life care systems is not reliable, even with a limited number of services. This paper presents a Human Activity Recognition Engine (HARE) that monitors human health as well as activities using heterogeneous sensor technology and processes these activities intelligently on a Cloud platform for providing improved care at low cost. We focus on activity recognition using video-based, wearable sensor-based, and location-based activity recognition engines and then use intelligent processing to analyze the context of the activities performed. The experimental results of all the components showed good accuracy against existing techniques. The system is deployed on Cloud for Alzheimer’s disease patients (as a case study) with four activity recognition engines to identify low level activity from the raw data captured by sensors. These are then manipulated using ontology to infer higher level activities and make decisions about a patient’s activity using patient profile information and customized rules. PMID:22247682

  16. On the multiple imputation variance estimator for control-based and delta-adjusted pattern mixture models.

    PubMed

    Tang, Yongqiang

    2017-12-01

    Control-based pattern mixture models (PMMs) and delta-adjusted PMMs are commonly used as sensitivity analyses in clinical trials with non-ignorable dropout. These PMMs assume that the statistical behavior of outcomes varies by pattern in the experimental arm in the imputation procedure, but the imputed data are typically analyzed by a standard method such as the primary analysis model. In the multiple imputation (MI) inference, Rubin's variance estimator is generally biased when the imputation and analysis models are uncongenial. One objective of the article is to quantify the bias of Rubin's variance estimator in the control-based and delta-adjusted PMMs for longitudinal continuous outcomes. These PMMs assume the same observed data distribution as the mixed effects model for repeated measures (MMRM). We derive analytic expressions for the MI treatment effect estimator and the associated Rubin's variance in these PMMs and MMRM as functions of the maximum likelihood estimator from the MMRM analysis and the observed proportion of subjects in each dropout pattern when the number of imputations is infinite. The asymptotic bias is generally small or negligible in the delta-adjusted PMM, but can be sizable in the control-based PMM. This indicates that the inference based on Rubin's rule is approximately valid in the delta-adjusted PMM. A simple variance estimator is proposed to ensure asymptotically valid MI inferences in these PMMs, and compared with the bootstrap variance. The proposed method is illustrated by the analysis of an antidepressant trial, and its performance is further evaluated via a simulation study. © 2017, The International Biometric Society.
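
    For orientation, the standard MI point estimate and Rubin's variance estimator referenced above combine the m per-imputation estimates and their estimated variances as follows (this is the textbook rule, not a result specific to the article):

    ```latex
    \bar{Q} = \frac{1}{m}\sum_{j=1}^{m}\hat{Q}_j,
    \qquad
    T = \underbrace{\frac{1}{m}\sum_{j=1}^{m}\widehat{W}_j}_{\bar{W}\ \text{(within)}}
      + \Bigl(1+\frac{1}{m}\Bigr)
        \underbrace{\frac{1}{m-1}\sum_{j=1}^{m}\bigl(\hat{Q}_j-\bar{Q}\bigr)^{2}}_{B\ \text{(between)}}
    ```

    The article's point is that T can be biased for the treatment-effect estimator when the imputation model (the PMM) and the analysis model are uncongenial, which is why an alternative variance estimator is proposed.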

  17. Simple effective rule to estimate the jamming packing fraction of polydisperse hard spheres.

    PubMed

    Santos, Andrés; Yuste, Santos B; López de Haro, Mariano; Odriozola, Gerardo; Ogarko, Vitaliy

    2014-04-01

    A recent proposal in which the equation of state of a polydisperse hard-sphere mixture is mapped onto that of the one-component fluid is extrapolated beyond the freezing point to estimate the jamming packing fraction ϕ_J of the polydisperse system as a simple function of M_1M_3/M_2^2, where M_k is the kth moment of the size distribution. An analysis of experimental and simulation data of ϕ_J for a large number of different mixtures shows a remarkable general agreement with the theoretical estimate. To give extra support to the procedure, simulation data for seventeen mixtures in the high-density region are used to infer the equation of state of the pure hard-sphere system in the metastable region. An excellent collapse of the inferred curves up to the glass transition and a significant narrowing of the different out-of-equilibrium glass branches all the way to jamming are observed. Thus, the present approach provides an extremely simple criterion to unify in a common framework and to give coherence to data coming from very different polydisperse hard-sphere mixtures.

  18. A Boolean Consistent Fuzzy Inference System for Diagnosing Diseases and Its Application for Determining Peritonitis Likelihood

    PubMed Central

    Dragović, Ivana; Turajlić, Nina; Pilčević, Dejan; Petrović, Bratislav; Radojević, Dragan

    2015-01-01

    Fuzzy inference systems (FIS) enable automated assessment and reasoning in a logically consistent manner akin to the way in which humans reason. However, since no conventional fuzzy set theory is in the Boolean frame, it is proposed that Boolean consistent fuzzy logic should be used in the evaluation of rules. The main distinction of this approach is that it requires the execution of a set of structural transformations before the actual values can be introduced, which can, in certain cases, lead to different results. While a Boolean consistent FIS could be used for establishing the diagnostic criteria for any given disease, in this paper it is applied for determining the likelihood of peritonitis, as the leading complication of peritoneal dialysis (PD). Given that patients could be located far away from healthcare institutions (as peritoneal dialysis is a form of home dialysis) the proposed Boolean consistent FIS would enable patients to easily estimate the likelihood of them having peritonitis (where a high likelihood would suggest that prompt treatment is indicated), when medical experts are not close at hand. PMID:27069500

  19. Bayesian inference of interaction properties of noisy dynamical systems with time-varying coupling: capabilities and limitations

    NASA Astrophysics Data System (ADS)

    Wilting, Jens; Lehnertz, Klaus

    2015-08-01

    We investigate a recently published analysis framework based on Bayesian inference for the time-resolved characterization of interaction properties of noisy, coupled dynamical systems. It promises wide applicability and a better time resolution than well-established methods. Using representative model systems as examples, we show that the analysis framework has the same weaknesses as previous methods, particularly when investigating interacting, structurally different non-linear oscillators. We also inspect the tracking of time-varying interaction properties and propose a further modification of the algorithm, which improves the reliability of the obtained results. As an example, we investigate the suitability of this algorithm for inferring the strength and direction of interactions between various regions of the human brain during an epileptic seizure. Within the limitations of the applicability of this analysis tool, we show that the modified algorithm indeed allows a better time resolution through Bayesian inference when compared to previous methods based on least-squares fits.

  20. Data-driven sensitivity inference for Thomson scattering electron density measurement systems.

    PubMed

    Fujii, Keisuke; Yamada, Ichihiro; Hasuo, Masahiro

    2017-01-01

    We developed a method to infer the calibration parameters of multichannel measurement systems, such as channel variations of sensitivity and noise amplitude, from experimental data. We regard such uncertainties of the calibration parameters as dependent noise. The statistical properties of the dependent noise and those of the latent functions were modeled and implemented in the Gaussian process kernel. Based on their statistical difference, both parameters were inferred from the data. We applied this method to the Thomson scattering electron density measurement system of the Large Helical Device plasma, which is equipped with 141 spatial channels. Based on 210 sets of experimental data, we evaluated the correction factor of the sensitivity and the noise amplitude for each channel. The correction factor varies by ≈10%, and the random noise amplitude is ≈2%, i.e., the measurement accuracy increases by a factor of 5 after this sensitivity correction. An improvement in the certainty of the spatial-derivative inference was also demonstrated.

  1. Expert systems for diagnostic purposes, prospected applications to the radar field

    NASA Astrophysics Data System (ADS)

    Filippi, Riccardo

    Expert systems applied to fault diagnosis, particularly electrical circuit troubleshooting, are introduced. Diagnostic systems consisting of sequences of rules of the symptom-disease type (rule based system) and systems based upon a physical and functional description of the unit subjected to fault diagnosis are treated. Application of such systems to radar equipment troubleshooting, in particular to the transmitter, is discussed.

  2. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    PubMed Central

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation entails a coarse graining of chemical kinetics, because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
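
    A minimal sketch of the rule-as-generator idea, with hypothetical molecule and site names (real rule languages include BioNetGen and Kappa): a binding rule that constrains only one site of A implies one concrete reaction per state of the unmentioned site, and every implied reaction inherits the rule's single rate.

    ```python
    from itertools import product

    # A has two phosphosites; a rule that mentions only site y leaves site s
    # unconstrained, so one rule implies one reaction per state of s.
    A_states = [{"y": y, "s": s} for y, s in product("UP", "UP")]

    k_on = 1e-3   # all reactions implied by the rule inherit this one rate
    reactions = []
    for a in A_states:
        if a["y"] == "P":                       # the rule's only constraint on A
            lhs = f"A(y~P,s~{a['s']}) + B(d~free)"
            rhs = f"A(y~P,s~{a['s']}).B(d~bound)"
            reactions.append((lhs, rhs, k_on))
    for r in reactions:
        print(r)   # two concrete reactions generated from a single rule
    ```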

  3. Combining Computational Modeling and Neuroimaging to Examine Multiple Category Learning Systems in the Brain

    PubMed Central

    Nomura, Emi M.; Reber, Paul J.

    2012-01-01

    Considerable evidence has argued in favor of multiple neural systems supporting human category learning, one based on conscious rule inference and one based on implicit information integration. However, there have been few attempts to study potential system interactions during category learning. The PINNACLE (Parallel Interactive Neural Networks Active in Category Learning) model incorporates multiple categorization systems that compete to provide categorization judgments about visual stimuli. Incorporating competing systems requires inclusion of cognitive mechanisms associated with resolving this competition and creates a potential credit assignment problem in handling feedback. The hypothesized mechanisms make predictions about internal mental states that are not always reflected in choice behavior, but may be reflected in neural activity. Two prior functional magnetic resonance imaging (fMRI) studies of category learning were re-analyzed using PINNACLE to identify neural correlates of internal cognitive states on each trial. These analyses identified additional brain regions supporting the two types of category learning, regions particularly active when the systems are hypothesized to be in maximal competition, and found evidence of covert learning activity in the “off system” (the category learning system not currently driving behavior). These results suggest that PINNACLE provides a plausible framework for how competing multiple category learning systems are organized in the brain and shows how computational modeling approaches and fMRI can be used synergistically to gain access to cognitive processes that support complex decision-making machinery. PMID:24962771

  4. On the fusion of tuning parameters of fuzzy rules and neural network

    NASA Astrophysics Data System (ADS)

    Mamuda, Mamman; Sathasivam, Saratha

    2017-08-01

    Learning a fuzzy rule-based system with a neural network can lead to a precise and valuable understanding of several problems. Fuzzy logic offers a simple way to reach a definite conclusion based upon vague, ambiguous, imprecise, noisy, or missing input information. Conventional learning algorithms for tuning the parameters of fuzzy rules from training input-output data usually end in a weak firing state; this weakens the fuzzy rules and makes them unreliable for a multiple-input fuzzy system. In this paper, we introduce a new learning algorithm, based on the gradient descent method, for tuning the parameters of the fuzzy rules alongside a radial basis function neural network (RBFNN) on training input-output data. The new learning algorithm addresses the weak-firing problem of the conventional method. We illustrate the efficiency of our new learning algorithm by means of numerical examples, simulated in MATLAB R2014(a). The results show that the new learning method has the advantage of training the fuzzy rules without tampering with the fuzzy rule table, which allows a membership function of a rule to be used more than once in the fuzzy rule base.
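
    The abstract does not give the update equations, but a generic sketch of gradient-descent tuning of Gaussian membership parameters (centres and widths) together with rule consequents, for a one-input additive fuzzy model, could look like this; the target function, learning rate, and rule count are illustrative, not the paper's.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)

    # One-input, two-rule fuzzy model with Gaussian memberships:
    #   y(x) = sum_i b_i * exp(-(x - c_i)^2 / (2 * s_i^2))
    c = np.array([0.0, 1.0])    # membership centres
    s = np.array([0.5, 0.5])    # membership widths
    b = np.array([0.0, 0.0])    # rule consequents
    lr = 0.05
    target = lambda x: np.sin(2 * x)

    for _ in range(20000):
        x = rng.uniform(-1.0, 2.0)
        w = np.exp(-(x - c) ** 2 / (2 * s ** 2))   # rule firing strengths
        e = np.dot(w, b) - target(x)               # prediction error
        grad_b = e * w                             # d(loss)/d(b_i)
        grad_c = e * b * w * (x - c) / s ** 2      # chain rule through w_i
        grad_s = e * b * w * (x - c) ** 2 / s ** 3
        b -= lr * grad_b
        c -= lr * grad_c
        s = np.maximum(s - lr * grad_s, 0.1)       # keep widths positive
    print("centres", c, "widths", s, "weights", b)
    ```

    Tuning the membership parameters themselves, rather than only the consequents, is what keeps rules from settling into the weak-firing regime the abstract describes.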

  5. eHive: an artificial intelligence workflow system for genomic analysis.

    PubMed

    Severin, Jessica; Beal, Kathryn; Vilella, Albert J; Fitzgerald, Stephen; Schuster, Michael; Gordon, Leo; Ureta-Vidal, Abel; Flicek, Paul; Herrero, Javier

    2010-05-11

    The Ensembl project produces updates to its comparative genomics resources with each of its several releases per year. During each release cycle approximately two weeks are allocated to generate all the genomic alignments and the protein homology predictions. The number of calculations required for this task grows approximately quadratically with the number of species. We currently support 50 species in Ensembl and we expect the number to continue to grow in the future. We present eHive, a new fault tolerant distributed processing system initially designed to support comparative genomic analysis, based on blackboard systems, network distributed autonomous agents, dataflow graphs and block-branch diagrams. In the eHive system a MySQL database serves as the central blackboard and the autonomous agent, a Perl script, queries the system and runs jobs as required. The system allows us to define dataflow and branching rules to suit all our production pipelines. We describe the implementation of three pipelines: (1) pairwise whole genome alignments, (2) multiple whole genome alignments and (3) gene trees with protein homology inference. Finally, we show the efficiency of the system in real case scenarios. eHive allows us to produce computationally demanding results in a reliable and efficient way with minimal supervision and high throughput. Further documentation is available at: http://www.ensembl.org/info/docs/eHive/.

  6. Modelling of Reservoir Operations using Fuzzy Logic and ANNs

    NASA Astrophysics Data System (ADS)

    Van De Giesen, N.; Coerver, B.; Rutten, M.

    2015-12-01

    Today, almost 40,000 large reservoirs, containing approximately 6,000 km3 of water and inundating an area of almost 400,000 km2, can be found on earth. Since these reservoirs have a storage capacity of almost one-sixth of the global annual river discharge, they have a large impact on the timing, volume and peaks of river discharges. Global Hydrological Models (GHM) are thus significantly influenced by these anthropogenic changes in river flows. We developed a parametrically parsimonious method to extract operational rules based on historical reservoir storage and inflow time-series. Managing a reservoir is an imprecise and vague undertaking. Operators always face uncertainties about inflows, evaporation, seepage losses and various water demands to be met. They often base their decisions on experience and on available information, like reservoir storage and the previous period's inflow. We modeled this decision-making process through a combination of fuzzy logic and artificial neural networks in an Adaptive-Network-based Fuzzy Inference System (ANFIS). In a sensitivity analysis, we compared results for reservoirs in Vietnam, Central Asia and the USA. ANFIS can indeed capture reservoir operations adequately when fed with a historical monthly time-series of inflows and storage. It was shown that using ANFIS, operational rules of existing reservoirs can be derived without much prior knowledge about the reservoirs. Their validity was tested by comparing actual and simulated releases with each other. For the eleven reservoirs modelled, the normalised outflow (on [0,1]) was predicted with an MSE of 0.002 to 0.044. The rules can be incorporated into GHMs. After a network for a specific reservoir has been trained, the inflow calculated by the hydrological model can be combined with the release and initial storage to calculate the storage for the next time-step using a mass balance, as sketched below. Subsequently, the release can be predicted one time-step ahead using the inflow and storage.
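
    In symbols, the coupling described in the last two sentences amounts to the following (notation ours, not the paper's):

    ```latex
    R_t = f_{\mathrm{ANFIS}}(I_t,\, S_t),
    \qquad
    S_{t+1} = S_t + \bigl(I_t - R_t\bigr)\,\Delta t
    ```

    Here I_t is the inflow supplied by the hydrological model, S_t the storage, and R_t the release predicted by the trained network; iterating the mass balance propagates the reservoir state through the simulation.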

  7. The perception of rational, goal-directed action in nonhuman primates.

    PubMed

    Wood, Justin N; Glynn, David D; Phillips, Brenda C; Hauser, Marc D

    2007-09-07

    Humans are capable of making inferences about other individuals' intentions and goals by evaluating their actions in relation to the constraints imposed by the environment. This capacity enables humans to go beyond the surface appearance of behavior to draw inferences about an individual's mental states. Presently unclear is whether this capacity is uniquely human or is shared with other animals. We show that cotton-top tamarins, rhesus macaques, and chimpanzees all make spontaneous inferences about a human experimenter's goal by attending to the environmental constraints that guide rational action. These findings rule out simple associative accounts of action perception and show that our capacity to infer rational, goal-directed action likely arose at least as far back as the New World monkeys, some 40 million years ago.

  8. Analysis of Rules for Islamic Inheritance Law in Indonesia Using Hybrid Rule Based Learning

    NASA Astrophysics Data System (ADS)

    Khosyi'ah, S.; Irfan, M.; Maylawati, D. S.; Mukhlas, O. S.

    2018-01-01

    Along with the development of human civilization in Indonesia, the changes and reform of Islamic inheritance law to conform to local conditions and culture cannot be denied. The distribution of inheritance in Indonesia can be done automatically by storing the rules of Islamic inheritance law in an expert system. In this study, we analyze the knowledge of experts in Islamic inheritance in Indonesia and represent it in the form of rules using rule-based Forward Chaining (FC) and Davis-Putnam-Logemann-Loveland (DPLL) algorithms. By hybridizing the FC and DPLL algorithms, the rules of Islamic inheritance law in Indonesia are clearly defined and measured. The rules were conceptually validated by experts in Islamic law and informatics. The results revealed that generally all rules were ready for use in an expert system.
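
    The abstract does not list the rules themselves, so the following is only a shape sketch of forward chaining with invented placeholder facts, not a statement of the actual inheritance rules:

    ```python
    def forward_chain(facts, rules):
        """Naive forward chaining: fire every rule whose antecedents are all
        satisfied, add its consequent, and repeat until a fixed point."""
        facts = set(facts)
        changed = True
        while changed:
            changed = False
            for antecedents, consequent in rules:
                if set(antecedents) <= facts and consequent not in facts:
                    facts.add(consequent)
                    changed = True
        return facts

    # Illustrative placeholder rules (not actual fiqh), in the spirit of
    # heir determination:
    rules = [
        ({"deceased-has-son"}, "son-inherits"),
        ({"son-inherits", "deceased-has-daughter"}, "daughter-share-derived"),
    ]
    print(forward_chain({"deceased-has-son", "deceased-has-daughter"}, rules))
    ```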

  9. Decision support system for triage management: A hybrid approach using rule-based reasoning and fuzzy logic.

    PubMed

    Dehghani Soufi, Mahsa; Samad-Soltani, Taha; Shams Vahdati, Samad; Rezaei-Hachesu, Peyman

    2018-06-01

    Fast and accurate patient triage for the response process is a critical first step in emergency situations. This process is often performed using a paper-based mode, which intensifies workload and difficulty, wastes time, and is at risk of human errors. This study aims to design and evaluate a decision support system (DSS) to determine the triage level. A combination of the Rule-Based Reasoning (RBR) and Fuzzy Logic Classifier (FLC) approaches was used to predict the triage level of patients according to the triage specialists' opinions and Emergency Severity Index (ESI) guidelines. RBR was applied for modeling the first to fourth decision points of the ESI algorithm. The data relating to vital signs were used as input variables and modeled using fuzzy logic. Narrative knowledge was converted to If-Then rules using XML. The extracted rules were then used to create the rule-based engine and predict the triage levels. Fourteen RBR and 27 fuzzy rules were extracted and used in the rule-based engine. The performance of the system was evaluated using three methods with real triage data. The accuracy of the clinical decision support system (CDSS, on the test data) was 99.44%. The evaluation of the error rate revealed that, when using the traditional method, 13.4% of the patients were mis-triaged, a statistically significant difference. The completeness of the documentation also improved from 76.72% to 98.5%. The designed system was effective in determining patients' triage levels and proved helpful to nurses as they made decisions and generated nursing diagnoses based on triage guidelines. The hybrid approach can reduce triage misdiagnosis in a highly accurate manner and improve triage outcomes. Copyright © 2018 Elsevier B.V. All rights reserved.

  10. Generative models for discovering sparse distributed representations.

    PubMed Central

    Hinton, G E; Ghahramani, Z

    1997-01-01

    We describe a hierarchical, generative model that can be viewed as a nonlinear generalization of factor analysis and can be implemented in a neural network. The model uses bottom-up, top-down and lateral connections to perform Bayesian perceptual inference correctly. Once perceptual inference has been performed the connection strengths can be updated using a very simple learning rule that only requires locally available information. We demonstrate that the network learns to extract sparse, distributed, hierarchical representations. PMID:9304685

  11. Cardiovascular oscillations: in search of a nonlinear parametric model

    NASA Astrophysics Data System (ADS)

    Bandrivskyy, Andriy; Luchinsky, Dmitry; McClintock, Peter V.; Smelyanskiy, Vadim; Stefanovska, Aneta; Timucin, Dogan

    2003-05-01

    We suggest a fresh approach to the modeling of the human cardiovascular system. Taking advantage of a new Bayesian inference technique, able to deal with stochastic nonlinear systems, we show that one can estimate parameters for models of the cardiovascular system directly from measured time series. We present preliminary results of inference of parameters of a model of coupled oscillators from measured cardiovascular data addressing cardiorespiratory interaction. We argue that the inference technique offers a very promising modeling tool, able to contribute significantly towards the solution of a long-standing challenge: the development of new diagnostic techniques based on noninvasive measurements.

  12. A rule-based expert system for chemical prioritization using effects-based chemical categories

    EPA Science Inventory

    A rule-based expert system (ES) was developed to predict chemical binding to the estrogen receptor (ER) patterned on the research approaches championed by Gilman Veith to whom this article and journal issue are dedicated. The ERES was built to be mechanistically-transparent and m...

  13. Reveal, A General Reverse Engineering Algorithm for Inference of Genetic Network Architectures

    NASA Technical Reports Server (NTRS)

    Liang, Shoudan; Fuhrman, Stefanie; Somogyi, Roland

    1998-01-01

    Given the immanent gene expression mapping covering whole genomes during development, health and disease, we seek computational methods to maximize functional inference from such large data sets. Is it possible, in principle, to completely infer a complex regulatory network architecture from input/output patterns of its variables? We investigated this possibility using binary models of genetic networks. Trajectories, or state transition tables of Boolean nets, resemble time series of gene expression. By systematically analyzing the mutual information between input states and output states, one is able to infer the sets of input elements controlling each element or gene in the network. This process is unequivocal and exact for complete state transition tables. We implemented this REVerse Engineering ALgorithm (REVEAL) in a C program, and found the problem to be tractable within the conditions tested so far. For n = 50 (elements) and k = 3 (inputs per element), the analysis of incomplete state transition tables (100 state transition pairs out of a possible 10^15) reliably produced the original rule and wiring sets. While this study is limited to synchronous Boolean networks, the algorithm is generalizable to include multi-state models, essentially allowing direct application to realistic biological data sets. The ability to adequately solve the inverse problem may enable in-depth analysis of complex dynamic systems in biology and other fields.
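
    A compact sketch of the core computation, for a toy 3-element network in which element 0 at time t+1 is the XOR of elements 1 and 2 at time t: scoring candidate input sets by mutual information singles out the set (1, 2), which attains the full 1 bit of output entropy and thus identifies the wiring.

    ```python
    import math
    from collections import Counter
    from itertools import combinations

    def mutual_information(pairs):
        """I(X;Y) in bits from a list of (x, y) observations."""
        n = len(pairs)
        pxy = Counter(pairs)
        px = Counter(x for x, _ in pairs)
        py = Counter(y for _, y in pairs)
        return sum(c / n * math.log2(c * n / (px[x] * py[y]))
                   for (x, y), c in pxy.items())

    # Complete state transition table for a 3-element Boolean net where
    # element 0 at t+1 is the XOR of elements 1 and 2 at time t.
    transitions = [((a, b, c), (b ^ c, a, a))
                   for a in (0, 1) for b in (0, 1) for c in (0, 1)]

    # Score every candidate input set for target element 0:
    target = 0
    for k in (1, 2):
        for inputs in combinations(range(3), k):
            pairs = [(tuple(s[i] for i in inputs), t[target])
                     for s, t in transitions]
            print(inputs, round(mutual_information(pairs), 3))
    ```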

  14. C-Language Integrated Production System, Version 5.1

    NASA Technical Reports Server (NTRS)

    Riley, Gary; Donnell, Brian; Ly, Huyen-Anh VU; Culbert, Chris; Savely, Robert T.; Mccoy, Daniel J.; Giarratano, Joseph

    1992-01-01

    CLIPS 5.1 provides a cohesive software tool for handling a wide variety of knowledge, with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming provides representation of knowledge by use of heuristics. Object-oriented programming enables modeling of complex systems as modular components. Procedural programming enables CLIPS to represent knowledge in ways similar to those allowed in such languages as C, Pascal, Ada, and LISP. Working with CLIPS 5.1, one can develop expert-system software by use of rule-based programming only, object-oriented programming only, procedural programming only, or combinations of the three.

  15. A rule-based smart automated fertilization and irrigation systems

    NASA Astrophysics Data System (ADS)

    Yousif, Musab El-Rashid; Ghafar, Khairuddin; Zahari, Rahimi; Lim, Tiong Hoo

    2018-04-01

    Smart automation in industries has become very important as it can improve the reliability and efficiency of systems. The use of smart technologies in agriculture has increased over the years to ensure and control crop production and to address food security. However, it is important to use proper irrigation systems to avoid water wastage and overfeeding of the plants. In this paper, a Smart Rule-based Automated Fertilization and Irrigation System is proposed and evaluated. We propose a rule-based decision-making algorithm to monitor and control the food supply to the plant and the soil quality. A built-in alert system is also used to update the farmer with a text message. The system is developed and evaluated using real hardware.
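
    A sketch of this style of rule loop, with entirely hypothetical thresholds and a print stand-in for the SMS alert module:

    ```python
    # Hypothetical thresholds; a real deployment would calibrate per crop/soil.
    RULES = [
        (lambda s: s["moisture"] < 30, "open irrigation valve"),
        (lambda s: s["nitrogen"] < 10, "dose fertilizer"),
        (lambda s: s["moisture"] > 80, "alert farmer: possible overwatering"),
    ]

    def step(sensor_readings, send_sms=print):
        """One monitoring cycle: evaluate every rule against the readings."""
        actions = [action for cond, action in RULES if cond(sensor_readings)]
        for action in actions:
            if action.startswith("alert"):
                send_sms(action)          # stand-in for the text-message module
        return actions

    print(step({"moisture": 25, "nitrogen": 8}))
    # ['open irrigation valve', 'dose fertilizer']
    ```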

  16. Stock and option portfolio using fuzzy logic approach

    NASA Astrophysics Data System (ADS)

    Sumarti, Novriana; Wahyudi, Nanang

    2014-03-01

    Fuzzy logic in decision-making processes has been widely implemented in various problems in industry. It is a theory of imprecision and uncertainty that is not based on probability theory. Fuzzy logic adds degrees of value between absolute true and absolute false. It starts with and builds on a set of human-language rules supplied by the user. The fuzzy system converts these rules to their mathematical equivalents. This could simplify the job of the system designer and the computer, and results in much more accurate representations of the way systems behave in the real world. In this paper we examine the decision-making process of stock and option trading using MACD (Moving Average Convergence Divergence) technical analysis and option pricing with a fuzzy logic approach. MACD technical analysis is used to predict the trends of underlying stock prices, such as bearish (going downward), bullish (going upward), and sideways. Using the Fuzzy C-Means technique and a Mamdani Fuzzy Inference System, we define the decision output such that when the value of MACD is high the decision is "Strong Sell", and when the value of MACD is low the decision is "Strong Buy". We also implement a fuzzification of the Black-Scholes option-pricing formula. The stock and option methods are implemented on a portfolio of one stock and its options. Even though the values of input data, such as interest rates, stock price and its volatility, cannot be obtained accurately, these fuzzy methods can give a belief degree for the calculated Black-Scholes formula so that we can make decisions on option trading. The results show the good capability of the methods in predicting stock price trends. The performance of the simulated portfolio for a particular period of time also shows a good return.
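
    A two-rule Mamdani sketch of the MACD decision described above, with illustrative membership ramps and centroid defuzzification (the paper's actual membership functions and universes are not given):

    ```python
    import numpy as np

    def ramp_up(x, a, b):    # 0 below a, 1 above b
        return np.clip((x - a) / (b - a), 0.0, 1.0)

    def ramp_down(x, a, b):  # 1 below a, 0 above b
        return np.clip((b - x) / (b - a), 0.0, 1.0)

    def decide(macd):
        """Rules: MACD low -> Strong Buy (-1), MACD high -> Strong Sell (+1)."""
        u = np.linspace(-1.0, 1.0, 201)            # decision universe
        mu_low = ramp_down(macd, -2.0, 0.0)        # rule firing strengths
        mu_high = ramp_up(macd, 0.0, 2.0)
        buy, sell = ramp_down(u, -1.0, 0.0), ramp_up(u, 0.0, 1.0)
        out = np.maximum(np.minimum(mu_low, buy),  # clip and aggregate
                         np.minimum(mu_high, sell))
        return float(np.sum(u * out) / np.sum(out))   # centroid defuzzification

    print(decide(-1.5), decide(0.1), decide(1.8))  # buy ... mild sell ... strong sell
    ```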

  17. Representations of the language recognition problem for a theorem prover

    NASA Technical Reports Server (NTRS)

    Minker, J.; Vanderbrug, G. J.

    1972-01-01

    Two representations of the language recognition problem for a theorem prover in first order logic are presented and contrasted. One of the representations is based on the familiar method of generating sentential forms of the language, and the other is based on the Cocke parsing algorithm. An augmented theorem prover is described which permits recognition of recursive languages. The state-transformation method developed by Cordell Green to construct problem solutions in resolution-based systems can be used to obtain the parse tree. In particular, the end-order traversal of the parse tree is derived in one of the representations. An inference system, termed the cycle inference system, is defined which makes it possible for the theorem prover to model the method on which the representation is based. The general applicability of the cycle inference system to state space problems is discussed. Given an unsatisfiable set S, where each clause has at most one positive literal, it is shown that there exists an input proof. The clauses for the two representations satisfy these conditions, as do many state space problems.

  18. Bayesian parameter inference for stochastic biochemical network models using particle Markov chain Monte Carlo

    PubMed Central

    Golightly, Andrew; Wilkinson, Darren J.

    2011-01-01

    Computational systems biology is concerned with the development of detailed mechanistic models of biological processes. Such models are often stochastic and analytically intractable, containing uncertain parameters that must be estimated from time course data. In this article, we consider the task of inferring the parameters of a stochastic kinetic model defined as a Markov (jump) process. Inference for the parameters of complex nonlinear multivariate stochastic process models is a challenging problem, but we find here that algorithms based on particle Markov chain Monte Carlo turn out to be a very effective computationally intensive approach to the problem. Approximations to the inferential model based on stochastic differential equations (SDEs) are considered, as well as improvements to the inference scheme that exploit the SDE structure. We apply the methodology to a Lotka–Volterra system and a prokaryotic auto-regulatory network. PMID:23226583

  19. TRICARE revision to CHAMPUS DRG-based payment system, pricing of hospital claims. Final rule.

    PubMed

    2014-05-21

    This Final rule changes TRICARE's current regulatory provision for inpatient hospital claims priced under the DRG-based payment system. Claims are currently priced by using the rates and weights that are in effect on a beneficiary's date of admission. This Final rule changes that provision to price such claims by using the rates and weights that are in effect on a beneficiary's date of discharge.

  20. Phenomenology-Based Inverse Scattering for Sensor Information Fusion

    DTIC Science & Technology

    2006-09-15

    abilities in the past. Rule-based systems and mathematics of logic implied significant similarities between the two: Thoughts, words, and phrases... all are logical statements. The situation has changed, in part due to the fact that logic-rule systems have not been sufficiently powerful to explain... references]. 3 Language mechanisms of our mind include abilities to acquire a large vocabulary, rules of grammar, and to use the finite set of

  1. Personalization of Rule-based Web Services.

    PubMed

    Choi, Okkyung; Han, Sang Yong

    2008-04-04

    Nowadays Web users have clearly expressed their wishes to receive personalized services directly. Personalization is the way to tailor services directly to the immediate requirements of the user. However, the current Web Services System does not provide features supporting this, such as personalization of services and intelligent matchmaking. In this research, a flexible, personalized rule-based Web Services System is proposed to address these problems and to enable efficient search, discovery and construction across general Web documents and Semantic Web documents. This system utilizes matchmaking among service requesters', service providers' and users' preferences using a Rule-based Search Method, and subsequently ranks search results. A prototype of efficient Web Services search and construction for the suggested system is developed based on the current work.

  2. An analytical fuzzy-based approach to L2-gain optimal control of input-affine nonlinear systems using Newton-type algorithm

    NASA Astrophysics Data System (ADS)

    Milic, Vladimir; Kasac, Josip; Novakovic, Branko

    2015-10-01

    This paper is concerned with L2-gain optimisation of input-affine nonlinear systems controlled by an analytic fuzzy logic system. Unlike conventional fuzzy-based strategies, the non-conventional analytic fuzzy control method does not require an explicit fuzzy rule base. As the first contribution of this paper, we prove, by using the Stone-Weierstrass theorem, that the proposed fuzzy system without a rule base is a universal approximator. The second contribution of this paper is an algorithm for solving a finite-horizon minimax problem for L2-gain optimisation. The proposed algorithm consists of a recursive chain rule for first- and second-order derivatives, Newton's method, a multi-step Adams method and automatic differentiation. Finally, the results of this paper are evaluated on a second-order nonlinear system.

  3. Retinal Connectomics: Towards Complete, Accurate Networks

    PubMed Central

    Marc, Robert E.; Jones, Bryan W.; Watt, Carl B.; Anderson, James R.; Sigulinsky, Crystal; Lauritzen, Scott

    2013-01-01

    Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10^12–10^15 byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferrable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532

  4. Life insurance risk assessment using a fuzzy logic expert system

    NASA Technical Reports Server (NTRS)

    Carreno, Luis A.; Steel, Roy A.

    1992-01-01

    In this paper, we present a knowledge based system that combines fuzzy processing with rule-based processing to form an improved decision aid for evaluating risk for life insurance. This application illustrates the use of FuzzyCLIPS to build a knowledge based decision support system possessing fuzzy components to improve user interactions and KBS performance. The results employing FuzzyCLIPS are compared with the results obtained from the solution of the problem using traditional numerical equations. The design of the fuzzy solution consists of a CLIPS rule-based system for some factors combined with fuzzy logic rules for others. This paper describes the problem, proposes a solution, presents the results, and provides a sample output of the software product.

  5. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system into Ada code and detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint based description of the required performance for the system.

  6. Runtime Verification of Pacemaker Functionality Using Hierarchical Fuzzy Colored Petri-nets.

    PubMed

    Majma, Negar; Babamir, Seyed Morteza; Monadjemi, Amirhassan

    2017-02-01

    Today, implanted medical devices are increasingly used for many patients and in case of diverse health problems. However, several runtime problems and errors are reported by the relevant organizations, even resulting in patient death. One of those devices is the pacemaker: a device that helps the patient regulate the heartbeat by connecting to the cardiac vessels. This device is directed by its software, so any failure in this software causes a serious malfunction. Therefore, this study seeks a better way to monitor the device's software behavior in order to decrease the failure risk. Accordingly, we supervise the runtime function and status of the software. Software verification here means examining the limitations and needs of the system's users against the running software. In this paper, a method to verify the pacemaker software, based on the fuzzy function of the device, is presented. The functional limitations of the device are identified and expressed as fuzzy rules, and the device is then verified using a hierarchical Fuzzy Colored Petri-net (HFCPN) formed from these software limits. Building on the experience gained in our previous studies, which used 1) Fuzzy Petri-nets (FPN) to verify insulin pumps, 2) Colored Petri-nets (CPN) to verify the pacemaker, and 3) a software agent with Petri-net-based knowledge to verify the pacemaker, the runtime behavior of the pacemaker software is examined here by HFCPN. This is a step forward compared to the earlier work: the HFCPN reduces the complexity relative to the FPN and CPN used in our previous studies. By presenting the Petri-net (PN) in a hierarchical form, the verification runtime decreased by 90.61% compared to the verification runtime in the earlier work. Since an inference engine is needed in runtime verification, we used the HFCPN to enhance the performance of the inference engine.

  7. Debugging expert systems using a dynamically created hypertext network

    NASA Technical Reports Server (NTRS)

    Boyle, Craig D. B.; Schuette, John F.

    1991-01-01

    The labor-intensive nature of expert system writing and debugging motivated this study. The hypothesis is that a hypertext-based debugging tool is easier and faster to use than one traditional tool, the graphical execution trace. HESDE (Hypertext Expert System Debugging Environment) uses hypertext nodes and links to represent the objects, and their relationships, created during the execution of a rule-based expert system. HESDE operates transparently on top of the CLIPS (C Language Integrated Production System) rule-based system environment and is used during the knowledge base debugging process. During execution, HESDE builds an execution trace: the facts, rules, and values used are automatically stored in a hypertext network for each execution cycle. After execution, the knowledge engineer may access and browse the network created, viewing it in terms of rules, facts, and values. An experiment was conducted to compare HESDE with a graphical debugging environment, in which subjects were given representative tasks. In eight of the eleven tasks, HESDE was significantly better in terms of speed and accuracy.
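
    The trace structure lends itself to a small sketch. The Python below (node naming and link types are invented; HESDE's actual network is richer) records rule firings as a browsable network of rule, fact and cycle nodes:

        from collections import defaultdict

        class TraceNetwork:
            """Hypertext-style execution trace: nodes linked by typed edges."""

            def __init__(self):
                self.links = defaultdict(list)   # node -> [(relation, node)]

            def record_firing(self, cycle, rule, matched, asserted):
                self.links[f"cycle:{cycle}"].append(("fired", f"rule:{rule}"))
                for fact in matched:
                    self.links[f"rule:{rule}"].append(("matched", f"fact:{fact}"))
                for fact in asserted:
                    self.links[f"rule:{rule}"].append(("asserted", f"fact:{fact}"))

            def browse(self, node):
                return self.links[node]          # follow links hypertext-style

        net = TraceNetwork()
        net.record_firing(1, "diagnose-fault", ["temp-high"], ["fault-suspected"])
        print(net.browse("rule:diagnose-fault"))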

  8. Integration of Hospital Information and Clinical Decision Support Systems to Enable the Reuse of Electronic Health Record Data.

    PubMed

    Kopanitsa, Georgy

    2017-05-18

    The efficiency and acceptance of clinical decision support systems (CDSS) can increase if they reuse medical data captured during health care delivery. The high heterogeneity of existing legacy data formats has become the main barrier to the reuse of data. Thus, we need to apply data modeling mechanisms that provide standardization, transformation, accumulation and querying of medical data to allow its reuse. In this paper, we focus on the interoperability issues of hospital information system (HIS) and CDSS data integration. Our study is based on the approach proposed by Marcos et al., where archetypes are used as a standardized mechanism for the interaction of a CDSS with an electronic health record (EHR). We built an integration tool that enables CDSSs to collect data from various institutions without requiring modifications to the implementation. The approach implies the development of a conceptual level as a set of archetypes representing the concepts required by a CDSS. Treatment case data from the Regional Clinical Hospital in Tomsk, Russia was extracted, transformed and loaded into the archetype database of a clinical decision support system. Normalization of the test records was performed by defining transformation and aggregation rules between the EHR data and the archetypes. These mapping rules were used to automatically generate openEHR compliant data. After the transformation, archetype data instances were loaded into the CDSS's archetype-based data storage. Measured times showed acceptable performance for the extraction stage, with a mean of 17.428 s per year of data (3436 case records). Transformation times were also acceptable at 136.954 s per year (0.039 s per instance). The accuracy evaluation confirmed the correctness and applicability of the method for a wide range of HISes. These operations were performed without interrupting the HIS workflow, so that service provision to users was not disturbed. The project results show that archetype-based technologies are mature enough to be applied in routine operations that require extracting, transforming, loading and querying medical data from heterogeneous EHR systems. Inference models in clinical research and CDSS can benefit from this by defining queries against a valid data set with a known structure and constraints. The standards-based nature of the archetype approach allows easy integration of CDSSs with existing EHR systems.
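
    The transformation step can be caricatured as a table of field-level mapping rules. In the Python sketch below the legacy field names, target paths and transforms are invented placeholders, not real openEHR archetype definitions:

        # Each mapping rule pairs a legacy EHR field with a target archetype
        # path and a value transform; unmapped fields are simply skipped.

        MAPPING_RULES = {
            "sys_bp": ("blood_pressure/systolic",  float),
            "dia_bp": ("blood_pressure/diastolic", float),
            "adm_dt": ("encounter/start_time",     str),
        }

        def normalize(legacy_record):
            instance = {}
            for field, (path, transform) in MAPPING_RULES.items():
                if field in legacy_record:
                    instance[path] = transform(legacy_record[field])
            return instance

        print(normalize({"sys_bp": "142", "dia_bp": "91",
                         "adm_dt": "2016-03-02"}))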

  9. Explorations of electric current system in solar active regions. I - Empirical inferences of the current flows

    NASA Technical Reports Server (NTRS)

    Ding, Y. J.; Hong, Q. F.; Hagyard, M. J.; Deloach, A. C.; Liu, X. P.

    1987-01-01

    Techniques to identify sources of electric current systems and their channels of flow in solar active regions are explored. Measured photospheric vector magnetic fields together with high-resolution white-light and H-alpha filtergrams provide the data base to derive the current systems in the photosphere and chromosphere. As an example, the techniques are then applied to infer current systems in AR 2372 in early April 1980.
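
    A standard ingredient of such inferences is the vertical current density obtained from Ampère's law, J_z = (dB_y/dx - dB_x/dy) / mu_0, applied to the measured transverse field. The sketch below (Python/NumPy, with a synthetic field on an assumed 100 km grid) is a generic illustration of that computation, not the paper's reduction pipeline:

        import numpy as np

        MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

        def vertical_current_density(bx, by, dx, dy):
            """J_z from the finite-difference curl of the transverse field."""
            dby_dx = np.gradient(by, dx, axis=1)
            dbx_dy = np.gradient(bx, dy, axis=0)
            return (dby_dx - dbx_dy) / MU0

        y, x = np.mgrid[0:64, 0:64] * 1e5        # 100 km pixels (assumed)
        bx = 0.01 * np.sin(y / 1e6)              # synthetic field, tesla
        by = 0.01 * np.cos(x / 1e6)
        print(vertical_current_density(bx, by, 1e5, 1e5).max())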

  10. Exact Hybrid Particle/Population Simulation of Rule-Based Models of Biochemical Systems

    PubMed Central

    Stover, Lori J.; Nair, Niketh S.; Faeder, James R.

    2014-01-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error-prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This “network-free” approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run-time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of “partial network expansion” into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and the resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach, and a monetary cost analysis provides a practical measure of its utility. PMID:24699269
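
    The particle/population split can be caricatured in a few lines. In the sketch below (Python) an abundant, unstructured ligand is tracked as a single integer count while receptors keep per-particle state; the naive fixed-step update, rates and rule are invented, and this is not the exact stochastic method of the paper or NFsim:

        import random

        L_count = 10000                                    # population variable
        receptors = [{"bound": False} for _ in range(50)]  # particles
        K_ON = 1e-6                                        # assumed rate constant

        def step():
            global L_count
            for r in receptors:
                # Rule: R(free) + L -> R(bound); binding propensity scales
                # with the ligand count rather than per-ligand particles.
                if not r["bound"] and random.random() < K_ON * L_count:
                    r["bound"] = True
                    L_count -= 1

        for _ in range(100):
            step()
        print(L_count, sum(r["bound"] for r in receptors))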

  11. Exact hybrid particle/population simulation of rule-based models of biochemical systems.

    PubMed

    Hogg, Justin S; Harris, Leonard A; Stover, Lori J; Nair, Niketh S; Faeder, James R

    2014-04-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error-prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This "network-free" approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run-time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of "partial network expansion" into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and the resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach, and a monetary cost analysis provides a practical measure of its utility.

  12. Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing

    NASA Technical Reports Server (NTRS)

    Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau

    2005-01-01

    The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user-defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far, with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and deployment of the product. We also present ongoing work with other condition monitoring agents.
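
    A condition-action monitoring rule of this kind is easy to picture. The Python sketch below uses invented measurement names, limits and alerting; the deployed agent expresses such rules in Jess rather than Python:

        RULES = [
            ("hydraulic pressure out of range",
             lambda frame: frame.get("hyd_press", 0.0) > 3200.0),
            ("fuel cell temperature high",
             lambda frame: frame.get("fc_temp", 0.0) > 90.0),
        ]

        def monitor(frame):
            """Fire every rule whose condition holds on this telemetry frame."""
            for description, condition in RULES:
                if condition(frame):
                    print(f"ALERT: {description}: {frame}")

        monitor({"hyd_press": 3350.0, "fc_temp": 71.2})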

  13. Proving Properties of Rule-Based Systems

    DTIC Science & Technology

    1990-12-01

    in these systems and enable us to use them with more confidence. Each system of rules is encoded as a set of axioms that define the system theory. The ... operation of the rule language and information about the subject domain are also described in the system theory. Validation tasks, such as ... the validity of the conjecture in the system theory, we have carried out the corresponding validation task. If the proof is restricted to be
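
    One way to picture this style of validation, as a loose stand-in for the report's theorem-proving approach, is to encode rules as Horn clauses, compute the derivable facts, and test a consistency conjecture against them; the rule content below is invented:

        # Rules as (body, head) Horn clauses; closure() derives all facts.
        RULES = [({"bird", "healthy"}, "can_fly"),
                 ({"penguin"}, "bird"),
                 ({"penguin"}, "cannot_fly")]

        def closure(facts):
            facts, changed = set(facts), True
            while changed:
                changed = False
                for body, head in RULES:
                    if body <= facts and head not in facts:
                        facts.add(head)
                        changed = True
            return facts

        # Conjecture: the rule set never derives both can_fly and cannot_fly.
        derived = closure({"penguin", "healthy"})
        print({"can_fly", "cannot_fly"} <= derived)   # True -> conjecture fails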

  14. Metadata behind the Interoperability of Wireless Sensor Networks

    PubMed Central

    Ballari, Daniela; Wachowicz, Monica; Callejo, Miguel Angel Manso

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of WSN status inferred from metadata elements, which in turn leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability. PMID:22412330
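
    A contextualising rule can be sketched as a simple mapping from metadata elements to a context label. In the Python below the field names, thresholds and labels are assumptions, not the paper's model:

        def node_context(metadata):
            """Infer a node-level context from two metadata elements."""
            if metadata["battery_pct"] < 15:
                return "node:energy-critical"
            if metadata["secs_since_report"] > 600:
                return "node:possibly-disconnected"
            return "node:nominal"

        print(node_context({"battery_pct": 12, "secs_since_report": 30}))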

  15. Metadata behind the Interoperability of Wireless Sensor Networks.

    PubMed

    Ballari, Daniela; Wachowicz, Monica; Callejo, Miguel Angel Manso

    2009-01-01

    Wireless Sensor Networks (WSNs) produce changes of status that are frequent, dynamic and unpredictable, and cannot be represented using a linear cause-effect approach. Consequently, a new approach is needed to handle these changes in order to support dynamic interoperability. Our approach is to introduce the notion of context as an explicit representation of changes of WSN status inferred from metadata elements, which in turn leads towards a decision-making process about how to maintain dynamic interoperability. This paper describes the developed context model to represent and reason over different WSN statuses based on four types of contexts, which have been identified as sensing, node, network and organisational contexts. The reasoning has been addressed by developing contextualising and bridging rules. As a result, we were able to demonstrate how contextualising rules have been used to reason on changes of WSN status as a first step towards maintaining dynamic interoperability.

  16. Simple spatial scaling rules behind complex cities.

    PubMed

    Li, Ruiqi; Dong, Lei; Zhang, Jiang; Wang, Xinran; Wang, Wen-Xu; Di, Zengru; Stanley, H Eugene

    2017-11-28

    Although most wealth and innovation have been the result of human interaction and cooperation, we are not yet able to quantitatively predict the spatial distributions of the three main elements of cities: population, roads, and socioeconomic interactions. Using a simple model based mainly on spatial attraction and matching growth mechanisms, we reveal that the spatial scaling rules of these three elements fit within a consistent framework, which allows us to use any single observation to infer the others. All numerical and theoretical results are consistent with empirical data from ten representative cities. In addition, our model can also provide a general explanation of the origins of the universal super- and sub-linear aggregate scaling laws and accurately predict kilometre-level socioeconomic activity. Our work opens a new avenue for uncovering the evolution of cities in terms of the interplay among urban elements, and it has a broad range of applications.
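
    The aggregate scaling laws referred to here have the form Y = Y0 * N**beta, with beta > 1 super-linear and beta < 1 sub-linear. A common way to estimate beta is a least-squares fit in log-log space, sketched below in Python on synthetic data (the populations, outputs and exponent are invented):

        import numpy as np

        rng = np.random.default_rng(1)
        pop = np.array([1e5, 5e5, 1e6, 5e6, 1e7])               # city populations
        gdp = 2.0 * pop**1.15 * np.exp(rng.normal(0, 0.05, 5))  # synthetic output

        # Fit log(Y) = beta * log(N) + log(Y0).
        beta, log_y0 = np.polyfit(np.log(pop), np.log(gdp), 1)
        print(f"estimated beta = {beta:.3f}")                   # ~1.15, super-linear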

  17. CT Image Sequence Analysis for Object Recognition - A Rule-Based 3-D Computer Vision System

    Treesearch

    Dongping Zhu; Richard W. Conners; Daniel L. Schmoldt; Philip A. Araman

    1991-01-01

    Research is now underway to create a vision system for hardwood log inspection using a knowledge-based approach. In this paper, we present a rule-based, 3-D vision system for locating and identifying wood defects using topological, geometric, and statistical attributes. A number of different features can be derived from the 3-D input scenes. These features and evidence...
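
    An attribute-based defect rule of this kind can be pictured with a toy classifier; the feature names and thresholds below are invented, not the paper's actual attributes:

        def classify_region(features):
            """Toy rules over geometric/statistical attributes of a 3-D region."""
            if features["sphericity"] > 0.7 and features["mean_density"] > 1.2:
                return "knot"                 # compact, dense region (assumed)
            if features["elongation"] > 3.0:
                return "split"                # long thin void (assumed)
            return "clear wood"

        print(classify_region({"sphericity": 0.82, "mean_density": 1.4,
                               "elongation": 1.1}))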

  18. An analysis of multi-type relational interactions in FMA using graph motifs with disjointness constraints.

    PubMed

    Zhang, Guo-Qiang; Luo, Lingyun; Ogbuji, Chime; Joslyn, Cliff; Mejino, Jose; Sahoo, Satya S

    2012-01-01

    The interaction of multiple types of relationships among anatomical classes in the Foundational Model of Anatomy (FMA) can provide inferred information valuable for quality assurance. This paper introduces a method called Motif Checking (MOCH) to study the effects of such multi-relation type interactions for detecting logical inconsistencies as well as other anomalies represented by the motifs. MOCH represents patterns of multi-type interaction as small labeled (with multiple types of edges) sub-graph motifs, whose nodes represent class variables and whose labeled edges represent relational types. By representing the FMA as an RDF graph and motifs as SPARQL queries, fragments of the FMA are automatically obtained as auditing candidates. Leveraging the scalability and reconfigurability of Semantic Web technology, we performed exhaustive analyses of a variety of labeled sub-graph motifs. The quality assurance feature of MOCH comes from the distinct use of a subset of the edges of the graph motifs as constraints for disjointness, thereby bringing a rule-based flavor to the approach as well. With possible disjointness implied by antonyms, we performed manual inspection of the resulting FMA fragments and tracked down the sources of abnormal inferred conclusions (logical inconsistencies), which are amenable to programmatic revision of the FMA. Our results demonstrate that MOCH provides a unique source of valuable information for quality assurance. Since our approach is general, it is applicable to any ontological system with an OWL representation.
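
    A motif query of this flavor is easy to sketch with rdflib (assumed here as the RDF library; the predicates, data and the planted anomaly are toy stand-ins for FMA relation types):

        from rdflib import Graph, Namespace

        EX = Namespace("http://example.org/")
        g = Graph()
        g.add((EX.Heart, EX.part_of, EX.Thorax))
        g.add((EX.Heart, EX.contained_in, EX.Abdomen))   # planted anomaly

        # Two-edge motif: the same node reaches two regions via different
        # relation types; the FILTER plays the disjointness constraint.
        MOTIF = """
        SELECT ?x ?a ?b WHERE {
            ?x <http://example.org/part_of>      ?a .
            ?x <http://example.org/contained_in> ?b .
            FILTER (?a != ?b)
        }"""
        for row in g.query(MOTIF):
            print("audit candidate:", row)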

  19. An Analysis of Multi-type Relational Interactions in FMA Using Graph Motifs with Disjointness Constraints

    PubMed Central

    Zhang, Guo-Qiang; Luo, Lingyun; Ogbuji, Chime; Joslyn, Cliff; Mejino, Jose; Sahoo, Satya S

    2012-01-01

    The interaction of multiple types of relationships among anatomical classes in the Foundational Model of Anatomy (FMA) can provide inferred information valuable for quality assurance. This paper introduces a method called Motif Checking (MOCH) to study the effects of such multi-relation type interactions for detecting logical inconsistencies as well as other anomalies represented by the motifs. MOCH represents patterns of multi-type interaction as small labeled (with multiple types of edges) sub-graph motifs, whose nodes represent class variables and whose labeled edges represent relational types. By representing the FMA as an RDF graph and motifs as SPARQL queries, fragments of the FMA are automatically obtained as auditing candidates. Leveraging the scalability and reconfigurability of Semantic Web technology, we performed exhaustive analyses of a variety of labeled sub-graph motifs. The quality assurance feature of MOCH comes from the distinct use of a subset of the edges of the graph motifs as constraints for disjointness, thereby bringing a rule-based flavor to the approach as well. With possible disjointness implied by antonyms, we performed manual inspection of the resulting FMA fragments and tracked down the sources of abnormal inferred conclusions (logical inconsistencies), which are amenable to programmatic revision of the FMA. Our results demonstrate that MOCH provides a unique source of valuable information for quality assurance. Since our approach is general, it is applicable to any ontological system with an OWL representation. PMID:23304382

  20. Rule Extraction Based on Extreme Learning Machine and an Improved Ant-Miner Algorithm for Transient Stability Assessment.

    PubMed

    Li, Yang; Li, Guoqing; Wang, Zhenhao

    2015-01-01

    To overcome the poor understandability of pattern recognition-based transient stability assessment (PRTSA) methods, a new rule extraction method based on an extreme learning machine (ELM) and an improved Ant-miner (IAM) algorithm is presented in this paper. First, the basic principles of the ELM and the Ant-miner algorithm are introduced. Then, based on the selected optimal feature subset, an example sample set is generated by the trained ELM-based PRTSA model. Finally, a set of classification rules is obtained by the IAM algorithm to replace the original ELM network. The novelty of this proposal is that transient stability rules are extracted, using the IAM algorithm, from an example sample set generated by the trained ELM-based transient stability assessment model. The effectiveness of the proposed method is shown by application results on the New England 39-bus power system and a practical power system, the southern power system of Hebei province.
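
    The example-sample-set step can be sketched compactly: a random hidden layer with least-squares output weights stands in for the ELM, and the trained model then labels fresh samples that a rule miner would compress into rules (the data are synthetic and the Ant-miner step is omitted):

        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 4))
        y = (X[:, 0] + X[:, 1] > 0).astype(float)     # synthetic stability label

        W = rng.normal(size=(4, 30))                  # fixed random hidden layer
        b = rng.normal(size=30)
        H = np.tanh(X @ W + b)
        beta = np.linalg.pinv(H) @ y                  # least-squares output weights

        def elm_predict(samples):
            return (np.tanh(samples @ W + b) @ beta > 0.5).astype(int)

        grid = rng.uniform(-2, 2, size=(1000, 4))     # example sample set
        print(elm_predict(grid).mean())               # class balance for mining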
