Sample records for rule-based cell systems

  1. Rule-based modeling: a computational approach for studying biomolecular site dynamics in cell signaling systems

    PubMed Central

    Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.

    2013-01-01

    Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
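
    The rule-to-reaction expansion described in this abstract (every reaction implied by a rule inherits that rule's rate law) can be illustrated with a minimal, hypothetical Python sketch; the molecule, site and rate names below are invented for illustration and do not correspond to any particular tool's syntax.

    ```python
    from itertools import product

    # Minimal illustration of rule-based model expansion: one binding rule
    # applied to every compatible molecular state generates several concrete
    # reactions, and every generated reaction inherits the rule's rate constant.

    # Receptor states: two phosphosites, each 'U' (unmodified) or 'P' (phosphorylated)
    receptor_states = [{"Y1": y1, "Y2": y2} for y1, y2 in product("UP", repeat=2)]

    def binding_rule(state, site, rate_constant=0.1):
        """Rule: an adaptor binds any phosphorylated site, at one shared rate."""
        if state[site] == "P":
            return {
                "reactants": (f"R({site}~P)", "Adaptor"),
                "product": f"R({site}!Adaptor)",
                "rate_constant": rate_constant,   # inherited from the rule
            }
        return None

    reactions = []
    for state in receptor_states:
        for site in ("Y1", "Y2"):
            rxn = binding_rule(state, site)
            if rxn:
                reactions.append(rxn)

    # 4 receptor states x 2 sites -> the rule implies 4 concrete binding reactions,
    # all sharing the same coarse-grained rate constant.
    print(len(reactions))
    for r in reactions:
        print(r["reactants"], "->", r["product"], "k =", r["rate_constant"])
    ```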

  2. Systematic reconstruction of TRANSPATH data into Cell System Markup Language

    PubMed Central

    Nagasaki, Masao; Saito, Ayumu; Li, Chen; Jeong, Euna; Miyano, Satoru

    2008-01-01

    Background: Many biological repositories store information based on experimental study of the biological processes within a cell, such as protein-protein interactions, metabolic pathways, signal transduction pathways, or the regulation of transcription factors and miRNA. Unfortunately, it is difficult to use such information directly when generating simulation-based models. Thus, modeling rules for encoding biological knowledge into system-dynamics-oriented standardized formats would be very useful for fully understanding cellular dynamics at the system level. Results: We selected the TRANSPATH database, a manually curated high-quality pathway database, which provides a plentiful source of cellular events in humans, mice, and rats, collected from over 31,500 publications. In this work, we have developed 16 modeling rules based on hybrid functional Petri net with extension (HFPNe), which is suitable for graphically representing and simulating biological processes. In the modeling rules, each Petri net element is annotated with the Cell System Ontology (CSO) to enable semantic interoperability of models. As a formal ontology for biological pathway modeling with dynamics, CSO also defines biological terminology and corresponding icons. By combining HFPNe with the CSO features, it is possible to convert TRANSPATH data into simulation-based and semantically valid models. The results are encoded into a biological pathway format, Cell System Markup Language (CSML), which eases the exchange and integration of biological data and models. Conclusion: By using the 16 modeling rules, 97% of the reactions in TRANSPATH are converted into simulation-based models represented in CSML. This reconstruction demonstrates that it is possible to use our rules to generate quantitative models from static pathway descriptions. PMID:18570683

  3. Systematic reconstruction of TRANSPATH data into cell system markup language.

    PubMed

    Nagasaki, Masao; Saito, Ayumu; Li, Chen; Jeong, Euna; Miyano, Satoru

    2008-06-23

    Many biological repositories store information based on experimental study of the biological processes within a cell, such as protein-protein interactions, metabolic pathways, signal transduction pathways, or the regulation of transcription factors and miRNA. Unfortunately, it is difficult to use such information directly when generating simulation-based models. Thus, modeling rules for encoding biological knowledge into system-dynamics-oriented standardized formats would be very useful for fully understanding cellular dynamics at the system level. We selected the TRANSPATH database, a manually curated high-quality pathway database, which provides a plentiful source of cellular events in humans, mice, and rats, collected from over 31,500 publications. In this work, we have developed 16 modeling rules based on hybrid functional Petri net with extension (HFPNe), which is suitable for graphically representing and simulating biological processes. In the modeling rules, each Petri net element is annotated with the Cell System Ontology (CSO) to enable semantic interoperability of models. As a formal ontology for biological pathway modeling with dynamics, CSO also defines biological terminology and corresponding icons. By combining HFPNe with the CSO features, it is possible to convert TRANSPATH data into simulation-based and semantically valid models. The results are encoded into a biological pathway format, Cell System Markup Language (CSML), which eases the exchange and integration of biological data and models. By using the 16 modeling rules, 97% of the reactions in TRANSPATH are converted into simulation-based models represented in CSML. This reconstruction demonstrates that it is possible to use our rules to generate quantitative models from static pathway descriptions.
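
    The hybrid functional Petri net (HFPNe) formalism referenced in the two TRANSPATH records above mixes continuous places (updated by rate functions) with discrete places (switched by threshold rules). The following toy sketch illustrates only that mixture; it is not CSML or HFPNe syntax, and all names and numbers are invented.

    ```python
    # Toy hybrid Petri-net-style step: continuous places are updated by rate
    # functions (ODE-like Euler steps), while a discrete place fires when a
    # threshold condition is met. Names and numbers are purely illustrative.

    places = {"mRNA": 0.0, "Protein": 0.0, "GeneOn": 1}   # continuous, continuous, discrete

    def transcription_rate(p):      # continuous transition, gated by a discrete place
        return 0.5 * p["GeneOn"]

    def translation_rate(p):
        return 0.2 * p["mRNA"]

    def degradation_rate(p):
        return 0.1 * p["Protein"]

    dt = 0.1
    for step in range(100):
        p = dict(places)
        places["mRNA"]    += dt * transcription_rate(p)
        places["Protein"] += dt * (translation_rate(p) - degradation_rate(p))
        # discrete transition: switch the gene off once enough protein accumulates
        if p["Protein"] > 5.0:
            places["GeneOn"] = 0

    print({k: round(v, 3) for k, v in places.items()})
    ```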

  4. An expert system to manage the operation of the Space Shuttle's fuel cell cryogenic reactant tanks

    NASA Technical Reports Server (NTRS)

    Murphey, Amy Y.

    1990-01-01

    This paper describes a rule-based expert system to manage the operation of the Space Shuttle's cryogenic fuel system. Rules are based on standard fuel tank operating procedures described in the EECOM Console Handbook. The problem of configuring the operation of the Space Shuttle's fuel tanks is well bounded and well defined. Moreover, the solution of this problem can be encoded in a knowledge-based system. Therefore, a rule-based expert system is the appropriate paradigm. Furthermore, the expert system could be used in coordination with power system simulation software to design operating procedures for specific missions.

  5. Compartmental and Spatial Rule-Based Modeling with Virtual Cell.

    PubMed

    Blinov, Michael L; Schaff, James C; Vasilescu, Dan; Moraru, Ion I; Bloom, Judy E; Loew, Leslie M

    2017-10-03

    In rule-based modeling, molecular interactions are systematically specified in the form of reaction rules that serve as generators of reactions. This provides a way to account for all the potential molecular complexes and interactions among multivalent or multistate molecules. Recently, we introduced rule-based modeling into the Virtual Cell (VCell) modeling framework, permitting graphical specification of rules and merger of networks generated automatically (using the BioNetGen modeling engine) with hand-specified reaction networks. VCell provides a number of ordinary differential equation and stochastic numerical solvers for single-compartment simulations of the kinetic systems derived from these networks, and agent-based network-free simulation of the rules. In this work, compartmental and spatial modeling of rule-based models has been implemented within VCell. To enable rule-based deterministic and stochastic spatial simulations and network-free agent-based compartmental simulations, the BioNetGen and NFSim engines were each modified to support compartments. In the new rule-based formalism, every reactant and product pattern and every reaction rule are assigned locations. We also introduce the rule-based concept of molecular anchors. This assures that any species that has a molecule anchored to a predefined compartment will remain in this compartment. Importantly, in addition to formulation of compartmental models, this now permits VCell users to seamlessly connect reaction networks derived from rules to explicit geometries to automatically generate a system of reaction-diffusion equations. These may then be simulated using either the VCell partial differential equations deterministic solvers or the Smoldyn stochastic simulator. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.

  6. A CLIPS expert system for clinical flow cytometry data analysis

    NASA Technical Reports Server (NTRS)

    Salzman, G. C.; Duque, R. E.; Braylan, R. C.; Stewart, C. C.

    1990-01-01

    An expert system is being developed using CLIPS to assist clinicians in the analysis of multivariate flow cytometry data from cancer patients. Cluster analysis is used to find subpopulations representing various cell types in multiple datasets each consisting of four to five measurements on each of 5000 cells. CLIPS facts are derived from results of the clustering. CLIPS rules are based on the expertise of Drs. Stewart, Duque, and Braylan. The rules incorporate certainty factors based on case histories.

  7. A hybrid agent-based approach for modeling microbiological systems.

    PubMed

    Guo, Zaiyi; Sloot, Peter M A; Tay, Joc Cing

    2008-11-21

    Models for systems biology commonly adopt differential equations or agent-based modeling approaches for simulating the processes as a whole. Models based on differential equations presuppose phenomenological intracellular behavioral mechanisms, while models based on the multi-agent approach often use directly translated, and quantitatively less precise, if-then logical rule constructs. We propose an extendible systems model based on a hybrid agent-based approach in which biological cells are modeled as individuals (agents) while molecules are represented by quantities. This hybridization in entity representation entails a combined modeling strategy with agent-based behavioral rules and differential equations, thereby balancing the requirements of extendible model granularity with computational tractability. We demonstrate the efficacy of this approach with models of chemotaxis involving an assay of 10^3 cells and 1.2x10^6 molecules. The model produces cell migration patterns that are comparable to laboratory observations.
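
    A rough sketch of the hybrid representation described above, with cells as discrete agents following an if-then movement rule and the molecule field as a continuous quantity updated by a simple diffusion-decay equation. The geometry, parameters and rules are illustrative, not those of the published chemotaxis model.

    ```python
    import random

    # Hybrid sketch: cells are agents with an if-then movement rule; the
    # chemoattractant is a continuous quantity per 1-D grid site, updated by an
    # explicit Euler step of a diffusion-decay equation. All values illustrative.

    N_SITES = 50
    chemo = [0.0] * N_SITES
    chemo[-1] = 10.0                                      # attractant source at the right edge
    cells = [random.randint(0, 10) for _ in range(30)]    # agent positions

    DT, DECAY, DIFFUSION = 0.1, 0.05, 0.2

    def update_field(c):
        """One Euler step of diffusion plus first-order decay for the molecule field."""
        new = c[:]
        for i in range(1, N_SITES - 1):
            lap = c[i - 1] - 2 * c[i] + c[i + 1]
            new[i] = c[i] + DT * (DIFFUSION * lap - DECAY * c[i])
        new[-1] = 10.0                                    # source held constant
        return new

    def move(cell, c):
        """Agent rule: step toward the neighbouring site with more attractant,
        with a small random component."""
        if random.random() < 0.1:
            return max(0, min(N_SITES - 1, cell + random.choice((-1, 1))))
        left = c[cell - 1] if cell > 0 else -1.0
        right = c[cell + 1] if cell < N_SITES - 1 else -1.0
        return cell + (1 if right >= left else -1)

    for _ in range(500):
        chemo = update_field(chemo)
        cells = [move(cell, chemo) for cell in cells]

    print("mean cell position:", sum(cells) / len(cells))
    ```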

  8. Rule-Based Simulation of Multi-Cellular Biological Systems—A Review of Modeling Techniques

    PubMed Central

    Hwang, Minki; Garbey, Marc; Berceli, Scott A.; Tran-Son-Tay, Roger

    2011-01-01

    Emergent behaviors of multi-cellular biological systems (MCBS) result from the behaviors of the individual cells and their interactions with other cells and with the environment. Modeling MCBS requires incorporating these complex interactions among the individual cells and the environment. Modeling approaches for MCBS can be grouped into two categories: continuum models and cell-based models. Continuum models usually take the form of partial differential equations, and the model equations provide insight into the relationships among the components in the system. Cell-based models simulate each individual cell's behavior and the interactions among cells, enabling observation of the emergent system behavior. This review focuses on cell-based models of MCBS and, in particular, on the technical aspects of the rule-based simulation method for MCBS. How to implement cell behaviors and interactions with other cells and with the environment in the computational domain is discussed. The cell behaviors reviewed in this paper are division, migration, apoptosis/necrosis, and differentiation. Environmental factors such as extracellular matrix, chemicals, microvasculature, and forces are also discussed. Application examples of these cell behaviors and interactions are presented. PMID:21369345
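
    The per-cell behaviors this review enumerates (division, migration, apoptosis/necrosis) are commonly encoded as lattice update rules. The sketch below is a generic, illustrative example of such rules; the probabilities and lattice are not drawn from any specific model in the review.

    ```python
    import random

    # Minimal lattice sketch of per-cell rules: each step a cell may divide
    # into an empty neighbour, migrate, or die. Probabilities are illustrative.

    SIZE = 20
    grid = [[0] * SIZE for _ in range(SIZE)]
    grid[SIZE // 2][SIZE // 2] = 1          # seed one cell in the centre

    P_DIVIDE, P_MIGRATE, P_DIE = 0.3, 0.2, 0.02
    NEIGHBOURS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

    def empty_neighbours(g, r, c):
        out = []
        for dr, dc in NEIGHBOURS:
            nr, nc = r + dr, c + dc
            if 0 <= nr < SIZE and 0 <= nc < SIZE and g[nr][nc] == 0:
                out.append((nr, nc))
        return out

    for step in range(100):
        occupied = [(r, c) for r in range(SIZE) for c in range(SIZE) if grid[r][c]]
        random.shuffle(occupied)
        for r, c in occupied:
            u = random.random()
            free = empty_neighbours(grid, r, c)
            if u < P_DIE:
                grid[r][c] = 0                       # apoptosis/necrosis rule
            elif u < P_DIE + P_DIVIDE and free:
                nr, nc = random.choice(free)
                grid[nr][nc] = 1                     # division into empty space
            elif u < P_DIE + P_DIVIDE + P_MIGRATE and free:
                nr, nc = random.choice(free)
                grid[r][c], grid[nr][nc] = 0, 1      # migration rule

    print("cells after 100 steps:", sum(map(sum, grid)))
    ```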

  9. Creating an ontology driven rules base for an expert system for medical diagnosis.

    PubMed

    Bertaud Gounot, Valérie; Donfack, Valéry; Lasbleiz, Jérémy; Bourde, Annabel; Duvauferrier, Régis

    2011-01-01

    Expert systems of the 1980s failed because of the difficulty of maintaining large rule bases. The current work proposes a method to build and maintain rule bases grounded on ontologies (like NCIT). The process described here for an expert system on plasma cell disorders encompasses extraction of a sub-ontology and automatic, comprehensive generation of production rules. The creation of rules is not based directly on classes, but on individuals (instances). Instances can be considered as prototypes of diseases formally defined by "restrictions" in the ontology. Thus, it is possible to use this process to make diagnoses of diseases. The perspectives of this work are considered: the process described with an ontology formalized in OWL1 can be extended by using an ontology in OWL2, which would allow reasoning about numerical data in addition to symbolic data.

  10. Simple rules for a "simple" nervous system? Molecular and biomathematical approaches to enteric nervous system formation and malformation.

    PubMed

    Newgreen, Donald F; Dufour, Sylvie; Howard, Marthe J; Landman, Kerry A

    2013-10-01

    We review morphogenesis of the enteric nervous system from migratory neural crest cells, and defects of this process such as Hirschsprung disease, centering on cell motility and assembly, and cell adhesion and extracellular matrix molecules, along with cell proliferation and growth factors. We then review continuum and agent-based (cellular automata) models with rules of cell movement and logistical proliferation. Both movement and proliferation at the individual cell level are modeled with stochastic components from which stereotyped outcomes emerge at the population level. These models reproduced the wave-like colonization of the intestine by enteric neural crest cells, and several new properties emerged, such as colonization by frontal expansion, which were later confirmed biologically. These models predict a surprising level of clonal heterogeneity in terms of both the number and the distribution of daughter cells. Biologically, migrating cells form stable chains made up of unstable cells, but this is not seen in the initial model. We outline additional rules for cell differentiation into neurons, axon extension, cell-axon and cell-cell adhesions, chemotaxis and repulsion, which can reproduce chain migration. After the migration stage, the cells re-arrange as a network of ganglia. Changes in cell adhesion molecules parallel this, and we describe additional rules based on Steinberg's Differential Adhesion Hypothesis, reflecting changing levels of adhesion in neural crest cells and neurons. This was able to reproduce enteric ganglionation in a model. Mouse mutants with disturbances of enteric nervous system morphogenesis are discussed, and these suggest future refinement of the models. The modeling suggests that a relatively simple set of cell behavioral rules could account for complex patterns of morphogenesis. The model has allowed the proposal that Hirschsprung disease is mostly an enteric neural crest cell proliferation defect, not a defect of cell migration. In addition, the model suggests an explanation for the zonal and skip-segment variants of Hirschsprung disease, and also gives a novel stochastic explanation for the observed discordancy of Hirschsprung disease in identical twins. © 2013 Elsevier Inc. All rights reserved.

  11. Deep Reinforcement Learning of Cell Movement in the Early Stage of C. elegans Embryogenesis.

    PubMed

    Wang, Zi; Wang, Dali; Li, Chengcheng; Xu, Yichi; Li, Husheng; Bao, Zhirong

    2018-04-25

    Cell movement in the early phase of C. elegans development is regulated by a highly complex process in which a set of rules and connections are formulated at distinct scales. Previous efforts have demonstrated that agent-based, multi-scale modeling systems can integrate physical and biological rules and provide new avenues to study developmental systems. However, the application of these systems to model cell movement is still challenging and requires a comprehensive understanding of regulatory networks at the right scales. Recent developments in deep learning and reinforcement learning provide an unprecedented opportunity to explore cell movement using 3D time-lapse microscopy images. We present a deep reinforcement learning approach within an agent-based modeling system to characterize cell movement in the embryonic development of C. elegans. Our modeling system captures the complexity of cell movement patterns in the embryo and overcomes the local optimization problem encountered by traditional rule-based, agent-based modeling that uses greedy algorithms. We tested our model with two real developmental processes: the anterior movement of the Cpaaa cell via intercalation and the rearrangement of the superficial left-right asymmetry. In the first case, the model results suggested that Cpaaa's intercalation is an active directional cell movement caused by the continuous effects from a longer distance (farther than the length of two adjacent cells), as opposed to a passive movement caused by neighbor cell movements. In the second case, a leader-follower mechanism well explained the collective cell movement pattern in the asymmetry rearrangement. These results showed that our approach of introducing deep reinforcement learning into agent-based modeling can test regulatory mechanisms by exploring cell migration paths from a reverse-engineering perspective. This model opens new doors to explore the large datasets generated by live imaging. Source code is available at https://github.com/zwang84/drl4cellmovement. Contact: dwang7@utk.edu, baoz@mskcc.org. Supplementary data are available at Bioinformatics online.
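
    As a loose illustration of the learning loop (not the paper's deep network), the sketch below substitutes tabular Q-learning for deep reinforcement learning and trains a toy agent to move a "cell" along a one-dimensional axis toward a target position; every detail is hypothetical.

    ```python
    import random
    from collections import defaultdict

    # Stand-in for the paper's deep RL: a tabular Q-learning agent learns to
    # move a "cell" along a 1-D axis toward a target position. This is a toy
    # illustration of the reinforcement-learning loop, not the embryo model.

    TARGET, N_POS = 9, 10
    ACTIONS = (-1, 0, 1)                     # move left, stay, move right
    Q = defaultdict(float)                   # Q[(state, action)]
    ALPHA, GAMMA, EPS = 0.5, 0.9, 0.2

    def step(pos, action):
        new = max(0, min(N_POS - 1, pos + action))
        reward = 1.0 if new == TARGET else -0.1   # reward shaping is illustrative
        return new, reward

    for episode in range(500):
        pos = random.randrange(N_POS)
        for _ in range(30):
            if random.random() < EPS:
                a = random.choice(ACTIONS)               # explore
            else:
                a = max(ACTIONS, key=lambda x: Q[(pos, x)])   # exploit
            new, r = step(pos, a)
            best_next = max(Q[(new, x)] for x in ACTIONS)
            Q[(pos, a)] += ALPHA * (r + GAMMA * best_next - Q[(pos, a)])
            pos = new

    policy = {s: max(ACTIONS, key=lambda x: Q[(s, x)]) for s in range(N_POS)}
    print(policy)   # learned moves should point toward the target position
    ```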

  12. Evolving rule-based systems in two medical domains using genetic programming.

    PubMed

    Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan; Axer, Hubertus; Bjerregaard, Beth; von Keyserlingk, Diedrich Graf

    2004-11-01

    Objective: to demonstrate and compare the application of different genetic programming (GP) based intelligent methodologies for the construction of rule-based systems in two medical domains: the diagnosis of aphasia's subtypes and the classification of pap-smear examinations. The data used represent (a) successful diagnoses of aphasia's subtypes obtained from collaborating medical experts through a free interview per patient, and (b) smears (images of cells), previously stained using the Papanicolaou method, correctly classified by cyto-technologists. Initially, a hybrid approach is proposed, which combines standard genetic programming and heuristic hierarchical crisp rule-base construction. Then, genetic programming for the production of crisp rule-based systems is attempted. Finally, another hybrid intelligent model is composed from a grammar-driven genetic programming system for the generation of fuzzy rule-based systems. Results show the effectiveness of the proposed systems, which are also compared, in terms of efficiency, accuracy and comprehensibility, to an inductive machine learning approach as well as to a standard genetic programming symbolic expression approach. The proposed GP-based intelligent methodologies are able to produce accurate and comprehensible results for medical experts, performing competitively with other intelligent approaches. The aim of the authors was the production of accurate but also sensible decision rules that could potentially help medical doctors to draw conclusions, even at the expense of achieving a higher classification score.

  13. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks

    PubMed Central

    Walpole, J.; Chappell, J.C.; Cluceru, J.G.; Mac Gabhann, F.; Bautch, V.L.; Peirce, S. M.

    2015-01-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes, a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods. PMID:26158406

  14. Agent-based model of angiogenesis simulates capillary sprout initiation in multicellular networks.

    PubMed

    Walpole, J; Chappell, J C; Cluceru, J G; Mac Gabhann, F; Bautch, V L; Peirce, S M

    2015-09-01

    Many biological processes are controlled by both deterministic and stochastic influences. However, efforts to model these systems often rely on either purely stochastic or purely rule-based methods. To better understand the balance between stochasticity and determinism in biological processes, a computational approach that incorporates both influences may afford additional insight into underlying biological mechanisms that give rise to emergent system properties. We apply a combined approach to the simulation and study of angiogenesis, the growth of new blood vessels from existing networks. This complex multicellular process begins with selection of an initiating endothelial cell, or tip cell, which sprouts from the parent vessels in response to stimulation by exogenous cues. We have constructed an agent-based model of sprouting angiogenesis to evaluate endothelial cell sprout initiation frequency and location, and we have experimentally validated it using high-resolution time-lapse confocal microscopy. ABM simulations were then compared to a Monte Carlo model, revealing that purely stochastic simulations could not generate sprout locations as accurately as the rule-informed agent-based model. These findings support the use of rule-based approaches for modeling the complex mechanisms underlying sprouting angiogenesis over purely stochastic methods.

  15. Prediction of linear B-cell epitopes of hepatitis C virus for vaccine development

    PubMed Central

    2015-01-01

    Background: High genetic heterogeneity in the hepatitis C virus (HCV) is the major challenge of the development of an effective vaccine. Existing studies for developing HCV vaccines have mainly focused on the T-cell immune response. However, identification of linear B-cell epitopes that can stimulate the B-cell response is one of the major tasks of peptide-based vaccine development. Owing to the variability in B-cell epitope length, the prediction of B-cell epitopes is much more complex than that of T-cell epitopes. Furthermore, the motifs of linear B-cell epitopes in different pathogens are quite different (e.g., HCV and hepatitis B virus). To cope with this challenge, this work aims to propose an HCV-customized sequence-based prediction method to identify B-cell epitopes of HCV. Results: This work establishes an experimentally verified HCV B-cell response dataset consisting of 774 linear B-cell epitopes and 774 non-B-cell epitopes from the Immune Epitope Database. An interpretable rule mining system of B-cell epitopes (IRMS-BE) is proposed to select informative physicochemical properties (PCPs) and then extract if-then rule-based knowledge for identifying B-cell epitopes. A web server Bcell-HCV was implemented using an SVM with the 34 informative PCPs, which achieved a training accuracy of 79.7% and a test accuracy of 70.7%, better than the SVM-based methods for identifying B-cell epitopes of HCV and the two general-purpose methods. This work performs advanced analysis of the 34 informative properties, and the results indicate that the most effective property is the alpha-helix structure of epitopes, which influences the connection between host cells and the E2 proteins of HCV. Furthermore, 12 interpretable rules are acquired from the top five PCPs and achieve a sensitivity of 75.6% and a specificity of 71.3%. Finally, a conserved, promising vaccine candidate, PDREMVLYQE, is identified for inclusion in a vaccine against HCV. Conclusions: This work proposes an interpretable rule mining system IRMS-BE for extracting interpretable rules using informative physicochemical properties and a web server Bcell-HCV for predicting linear B-cell epitopes of HCV. IRMS-BE may also be applied to predict B-cell epitopes of other viruses, which could benefit vaccine development for these viruses without significant modification. Bcell-HCV, which is available at http://e045.life.nctu.edu.tw/BcellHCV, is useful for identifying B-cell epitopes of HCV antigens to help vaccine development. PMID:26680271
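
    The "interpretable if-then rules over physicochemical properties" idea can be sketched generically as a small voting rule set; the property names, thresholds and values below are invented for illustration and are not the published IRMS-BE rules or the real properties of the listed peptides.

    ```python
    # Generic sketch of rule-based epitope screening: each candidate peptide is
    # scored by a few if-then rules over per-peptide physicochemical property
    # values. Property names, thresholds and data are invented for illustration.

    rules = [
        # (property, comparator, threshold) -> one vote for "epitope"
        ("alpha_helix_propensity", ">=", 0.45),
        ("hydrophilicity",         ">=", 0.30),
        ("flexibility",            ">=", 0.50),
    ]

    def passes(value, comparator, threshold):
        return value >= threshold if comparator == ">=" else value < threshold

    def classify(peptide_props, min_votes=2):
        votes = sum(passes(peptide_props.get(p, 0.0), c, t) for p, c, t in rules)
        return "epitope" if votes >= min_votes else "non-epitope"

    candidates = {   # property values below are made up, not measured
        "PDREMVLYQE": {"alpha_helix_propensity": 0.52, "hydrophilicity": 0.41, "flexibility": 0.35},
        "AAAAAAAAAA": {"alpha_helix_propensity": 0.60, "hydrophilicity": 0.05, "flexibility": 0.10},
    }
    for seq, props in candidates.items():
        print(seq, classify(props))
    ```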

  16. Simulation-based model checking approach to cell fate specification during Caenorhabditis elegans vulval development by hybrid functional Petri net with extension.

    PubMed

    Li, Chen; Nagasaki, Masao; Ueno, Kazuko; Miyano, Satoru

    2009-04-27

    Model checking approaches were first applied to biological pathway validation around 2003. Recently, Fisher et al. demonstrated the importance of the model checking approach by inferring new regulation of signaling crosstalk in C. elegans and confirming the regulation with biological experiments. They took a discrete, state-based approach to explore all possible states of the system underlying vulval precursor cell (VPC) fate specification for desired properties. However, since both discrete and continuous features appear to be indispensable parts of biological processes, it is more appropriate to use quantitative models to capture the dynamics of biological systems. Our key motivation in this paper is to establish a quantitative methodology to model and analyze in silico models incorporating the model checking approach. A novel method of modeling and simulating biological systems with the model checking approach is proposed, based on hybrid functional Petri net with extension (HFPNe) as a framework that deals with both discrete and continuous events. First, we construct a quantitative VPC fate model with 1761 components using HFPNe. Second, we apply two major biological fate determination rules - Rule I and Rule II - to the VPC fate model. We then conduct 10,000 simulations for each of 48 sets of different genotypes, investigate variations of cell fate patterns under each genotype, and validate the two rules by comparing three simulation targets consisting of fate patterns obtained from in silico and in vivo experiments. In particular, an evaluation was successfully performed by using our VPC fate model to investigate one target derived from biological experiments involving hybrid lineage observations. However, hybrid lineages are hard to capture with a discrete model because they occur when the system comes close to certain thresholds, as discussed by Sternberg and Horvitz in 1986. Our simulation results suggest that Rule I, which cannot be applied with qualitative model checking, is more reasonable than Rule II owing to the high coverage of predicted fate patterns (except for the genotype of lin-15ko; lin-12ko double mutants). Further insights are also suggested. The quantitative simulation-based model checking approach is a useful means of providing valuable biological insights and a better understanding of biological systems and observation data that may be hard to capture with the qualitative approach.

  17. Fuzzy energy management for hybrid fuel cell/battery systems for more electric aircraft

    NASA Astrophysics Data System (ADS)

    Corcau, Jenica-Ileana; Dinca, Liviu; Grigorie, Teodor Lucian; Tudosie, Alexandru-Nicolae

    2017-06-01

    This paper presents the simulation and analysis of a fuzzy energy management strategy for hybrid fuel cell/battery systems used in More Electric Aircraft. The fuel cell hybrid system consists of a fuel cell and lithium-ion batteries along with the associated DC-DC boost converters. In this configuration the battery has its own DC-DC converter because it is an active element in the system. The energy management scheme includes a rule-based fuzzy logic strategy. This scheme has a faster response to load changes and is more robust to measurement imprecision. Simulations are performed using Matlab/Simulink-based models. Simulation results are given to show the overall system performance.
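
    A bare-bones sketch of a rule-based fuzzy power-split decision like the one described above, using triangular memberships and weighted-average defuzzification; the membership shapes, rule table and numbers are illustrative, not the paper's controller.

    ```python
    # Minimal fuzzy rule-based energy management sketch: given load demand and
    # battery state of charge (SOC), rules decide how much of the load the fuel
    # cell should cover. Membership shapes and rules are illustrative.

    def tri(x, a, b, c):
        """Triangular membership function on [a, c] peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

    def fuzzify_load(load_kw):          # assumed range 0..60 kW
        return {"low": tri(load_kw, -1, 0, 30), "high": tri(load_kw, 20, 60, 61)}

    def fuzzify_soc(soc):               # 0..1
        return {"low": tri(soc, -0.01, 0.0, 0.6), "high": tri(soc, 0.4, 1.0, 1.01)}

    # Rule base: (load term, SOC term) -> fuel-cell share of the load
    RULES = {
        ("low",  "high"): 0.2,   # battery can take most of a small load
        ("low",  "low"):  0.8,   # protect a depleted battery
        ("high", "high"): 0.6,
        ("high", "low"):  1.0,   # fuel cell covers everything
    }

    def fuel_cell_share(load_kw, soc):
        mu_load, mu_soc = fuzzify_load(load_kw), fuzzify_soc(soc)
        num = den = 0.0
        for (lt, st), share in RULES.items():
            w = min(mu_load[lt], mu_soc[st])       # rule firing strength
            num += w * share
            den += w
        return num / den if den else 0.0

    print(round(fuel_cell_share(load_kw=45, soc=0.35), 2))
    ```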

  18. A Fuzzy Rule Based Decision Support System for Identifying Location of Water Harvesting Technologies in Rainfed Agricultural Regions

    NASA Astrophysics Data System (ADS)

    Chaubey, I.; Vema, V. K.; Sudheer, K.

    2016-12-01

    Site suitability evaluation of water conservation structures in water-scarce rainfed agricultural areas consists of assessing various landscape characteristics against multiple criteria. Many of these landscape attributes are conveyed through linguistic terms rather than precise numeric values. Fuzzy rule-based systems are capable of incorporating this uncertainty and vagueness when the various decision-making criteria expressed in linguistic terms are encoded as fuzzy rules. In this study a fuzzy rule-based decision support system is developed for optimal site selection of water harvesting technologies. Water conservation technologies such as farm ponds, check dams, rock-filled dams and percolation ponds aid in conserving water for irrigation and recharging aquifers, and the development of such a system will aid in improving the efficiency of these structures. The attributes and criteria involved in decision making are classified into different groups to estimate the suitability of each technology. The developed model is applied and tested on an Indian watershed. The input attributes are prepared in raster format in ArcGIS software, the suitability of each raster cell is calculated, and the output is generated in the form of a thematic map showing the suitability of the cells for the different technologies. The output of the developed model is compared against already existing structures, and the results are satisfactory. The developed model will aid in improving the sustainability and efficiency of watershed management programs aimed at enhancing in situ moisture content.

  19. Cell Phones: Rule-Setting, Rule-Breaking, and Relationships in Classrooms

    ERIC Educational Resources Information Center

    Charles, Anita S.

    2012-01-01

    Based on a small qualitative study, this article focuses on understanding the rules for cell phones and other social networking media in schools, an aspect of broader research that led to important understandings of teacher-student negotiations. It considers the rules that schools and teachers make, the rampant breaking of these rules, the…

  20. Efficiency of reactant site sampling in network-free simulation of rule-based models for biochemical systems

    PubMed Central

    Yang, Jin; Hlavacek, William S.

    2011-01-01

    Rule-based models, which are typically formulated to represent cell signaling systems, can now be simulated via various network-free simulation methods. In a network-free method, reaction rates are calculated for rules that characterize molecular interactions, and these rule rates, which each correspond to the cumulative rate of all reactions implied by a rule, are used to perform a stochastic simulation of reaction kinetics. Network-free methods, which can be viewed as generalizations of Gillespie’s method, are so named because these methods do not require that a list of individual reactions implied by a set of rules be explicitly generated, which is a requirement of other methods for simulating rule-based models. This requirement is impractical for rule sets that imply large reaction networks (i.e., long lists of individual reactions), as reaction network generation is expensive. Here, we compare the network-free simulation methods implemented in RuleMonkey and NFsim, general-purpose software tools for simulating rule-based models encoded in the BioNetGen language. The method implemented in NFsim uses rejection sampling to correct overestimates of rule rates, which introduces null events (i.e., time steps that do not change the state of the system being simulated). The method implemented in RuleMonkey uses iterative updates to track rule rates exactly, which avoids null events. To ensure a fair comparison of the two methods, we developed implementations of the rejection and rejection-free methods specific to a particular class of kinetic models for multivalent ligand-receptor interactions. These implementations were written with the intention of making them as much alike as possible, minimizing the contribution of irrelevant coding differences to efficiency differences. Simulation results show that performance of the rejection method is equal to or better than that of the rejection-free method over wide parameter ranges. However, when parameter values are such that ligand-induced aggregation of receptors yields a large connected receptor cluster, the rejection-free method is more efficient. PMID:21832806
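
    The rejection versus rejection-free distinction described above can be shown schematically on a single rule whose true rate is proportional to the number of remaining free sites: the rejection variant samples event times with a fixed upper-bound rate and discards some draws as null events, while the exact variant recomputes the rate after every firing. This is a toy illustration of the sampling step only, not RuleMonkey or NFsim code.

    ```python
    import random

    # Schematic of two network-free sampling strategies for one "binding" rule
    # whose true rate is K * (number of free sites). Toy parameters only.

    K, TOTAL_SITES, T_END = 0.05, 100, 10.0

    def exact(seed):
        """Rejection-free: the rule rate is recomputed after every event."""
        random.seed(seed)
        t, free = 0.0, TOTAL_SITES
        while free:
            rate = K * free                            # exact current rate
            t += random.expovariate(rate)
            if t > T_END:
                break
            free -= 1
        return TOTAL_SITES - free

    def rejection(seed):
        """Rejection method: sample with an upper-bound rate, thin with nulls."""
        random.seed(seed)
        t, free, nulls = 0.0, TOTAL_SITES, 0
        max_rate = K * TOTAL_SITES                     # fixed overestimate
        while free:
            t += random.expovariate(max_rate)
            if t > T_END:
                break
            if random.random() < (K * free) / max_rate:   # accept with true/bound ratio
                free -= 1
            else:
                nulls += 1                             # null event: state unchanged
        return TOTAL_SITES - free, nulls

    print("exact, bound events:", exact(1))
    print("rejection, (bound events, null events):", rejection(1))
    ```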

  1. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.

  2. ConGEMs: Condensed Gene Co-Expression Module Discovery Through Rule-Based Clustering and Its Application to Carcinogenesis.

    PubMed

    Mallik, Saurav; Zhao, Zhongming

    2017-12-28

    For transcriptomic analysis, there are numerous microarray-based genomic data, especially those generated for cancer research. The typical analysis measures the difference between a cancer sample group and a matched control group for each transcript or gene. Association rule mining is used to discover interesting item sets through a rule-based methodology, and thus has advantages in finding causal-effect relationships between the transcripts. In this work, we introduce two new rule-based similarity measures, weighted rank-based Jaccard and Cosine measures, and then propose a novel computational framework to detect condensed gene co-expression modules (ConGEMs) through the association rule-based learning system and the weighted similarity scores. In practice, the list of evolved condensed markers, which consists of both singular and complex markers in nature, depends on the corresponding condensed gene sets in either the antecedent or the consequent of the rules of the resultant modules. In our evaluation, these markers could be supported by literature evidence, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathway and Gene Ontology annotations. Specifically, we preliminarily identified differentially expressed genes using an empirical Bayes test. A recently developed algorithm, RANWAR, was then utilized to determine the association rules from these genes. Based on that, we computed the integrated similarity scores of these rule-based similarity measures between each rule pair, and the resultant scores were used for clustering to identify the co-expressed rule modules. We applied our method to a gene expression dataset for lung squamous cell carcinoma and a genome methylation dataset for uterine cervical carcinogenesis. Our proposed module discovery method produced better results than the traditional gene-module discovery measures. In summary, our proposed rule-based method is useful for exploring biomarker modules from transcriptomic data.
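
    Generic weighted Jaccard and cosine similarities between two rules (each represented as an item-to-weight map) can be written as below; these are the standard formulas used for illustration, and the paper's exact rank-based weighting may differ.

    ```python
    from math import sqrt

    # Standard weighted Jaccard and cosine similarities between two rules, each
    # represented as a dict mapping gene/item -> weight (e.g., a rank-derived
    # weight). Gene names and weights below are illustrative.

    def weighted_jaccard(a, b):
        keys = set(a) | set(b)
        num = sum(min(a.get(k, 0.0), b.get(k, 0.0)) for k in keys)
        den = sum(max(a.get(k, 0.0), b.get(k, 0.0)) for k in keys)
        return num / den if den else 0.0

    def cosine(a, b):
        keys = set(a) | set(b)
        dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
        na = sqrt(sum(v * v for v in a.values()))
        nb = sqrt(sum(v * v for v in b.values()))
        return dot / (na * nb) if na and nb else 0.0

    rule1 = {"TP53": 1.0, "EGFR": 0.5, "KRAS": 0.25}   # illustrative weights
    rule2 = {"TP53": 0.8, "KRAS": 0.5, "MYC": 0.3}

    print(round(weighted_jaccard(rule1, rule2), 3), round(cosine(rule1, rule2), 3))
    ```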

  3. HERB: A production system for programming with hierarchical expert rule bases: User's manual, HERB Version 1. 0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hummel, K.E.

    1987-12-01

    Expert systems are artificial intelligence programs that solve problems requiring large amounts of heuristic knowledge, based on years of experience and tradition. Production systems are domain-independent tools that support the development of rule-based expert systems. This document describes a general purpose production system known as HERB. This system was developed to support the programming of expert systems using hierarchically structured rule bases. HERB encourages the partitioning of rules into multiple rule bases and supports the use of multiple conflict resolution strategies. Multiple rule bases can also be placed on a system stack and simultaneously searched during each interpreter cycle. Both backward and forward chaining rules are supported by HERB. The condition portion of each rule can contain both patterns, which are matched with facts in a data base, and LISP expressions, which are explicitly evaluated in the LISP environment. Properties of objects can also be stored in the HERB data base and referenced within the scope of each rule. This document serves both as an introduction to the principles of LISP-based production systems and as a user's manual for the HERB system. 6 refs., 17 figs.

  4. An improved cellular automaton method to model multispecies biofilms.

    PubMed

    Tang, Youneng; Valocchi, Albert J

    2013-10-01

    Biomass-spreading rules used in previous cellular automaton methods to simulate multispecies biofilm introduced extensive mixing between different biomass species or resulted in spatially discontinuous biomass concentration and distribution; this caused results based on the cellular automaton methods to deviate from experimental results and those from the more computationally intensive continuous method. To overcome the problems, we propose new biomass-spreading rules in this work: Excess biomass spreads by pushing a line of grid cells that are on the shortest path from the source grid cell to the destination grid cell, and the fractions of different biomass species in the grid cells on the path change due to the spreading. To evaluate the new rules, three two-dimensional simulation examples are used to compare the biomass distribution computed using the continuous method and three cellular automaton methods, one based on the new rules and the other two based on rules presented in two previous studies. The relationship between the biomass species is syntrophic in one example and competitive in the other two examples. Simulation results generated using the cellular automaton method based on the new rules agree much better with the continuous method than do results using the other two cellular automaton methods. The new biomass-spreading rules are no more complex to implement than the existing rules. Copyright © 2013 Elsevier Ltd. All rights reserved.
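
    A one-dimensional, two-species illustration of the "push a line of grid cells" spreading rule described above: the excess at an over-full cell displaces an equal amount of biomass from each cell along the path to the nearest cell with spare capacity, so species fractions change along the path instead of species being scattered. All capacities and amounts are invented.

    ```python
    # Toy 1-D version of shortest-path biomass spreading with two species.

    CAPACITY = 10.0
    # each grid cell holds amounts of two biomass species
    cells = [
        {"A": 8.0, "B": 4.0},   # over capacity by 2.0 -> source of the push
        {"A": 2.0, "B": 8.0},   # exactly full -> lies on the path
        {"A": 5.0, "B": 1.0},   # has spare capacity -> destination
    ]

    def total(cell):
        return sum(cell.values())

    def push_right(cells, source=0):
        """Push the excess at `source` rightward along the line of cells until a
        cell with spare capacity absorbs it (assumes such a cell exists)."""
        excess = total(cells[source]) - CAPACITY
        if excess <= 0:
            return cells
        tot = total(cells[source])
        carry = {}
        for sp, amt in list(cells[source].items()):     # remove excess, by fraction
            share = excess * amt / tot
            cells[source][sp] -= share
            carry[sp] = share
        i = source + 1
        # full cells on the path absorb the carry and pass on an equal amount
        # drawn from their own updated composition, so their fractions change
        while total(cells[i]) + sum(carry.values()) > CAPACITY:
            for sp, amt in carry.items():
                cells[i][sp] = cells[i].get(sp, 0.0) + amt
            tot = total(cells[i])
            overflow = tot - CAPACITY
            new_carry = {}
            for sp, amt in list(cells[i].items()):
                share = overflow * amt / tot
                cells[i][sp] -= share
                new_carry[sp] = share
            carry = new_carry
            i += 1
        for sp, amt in carry.items():                   # destination keeps the rest
            cells[i][sp] = cells[i].get(sp, 0.0) + amt
        return cells

    for cell in push_right(cells):
        print({sp: round(v, 2) for sp, v in cell.items()})
    ```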

  5. Hierarchical graphs for better annotations of rule-based models of biochemical systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Bin; Hlavacek, William

    2009-01-01

    In the graph-based formalism of the BioNetGen language (BNGL), graphs are used to represent molecules, with a colored vertex representing a component of a molecule, a vertex label representing the internal state of a component, and an edge representing a bond between components. Components of a molecule share the same color. Furthermore, graph-rewriting rules are used to represent molecular interactions, with a rule that specifies addition (removal) of an edge representing a class of association (dissociation) reactions and with a rule that specifies a change of vertex label representing a class of reactions that affect the internal state of a molecular component. A set of rules comprises a mathematical/computational model that can be used to determine, through various means, the system-level dynamics of molecular interactions in a biochemical system. Here, for purposes of model annotation, we propose an extension of BNGL that involves the use of hierarchical graphs to represent (1) relationships among components and subcomponents of molecules and (2) relationships among classes of reactions defined by rules. We illustrate how hierarchical graphs can be used to naturally document the structural organization of the functional components and subcomponents of two proteins: the protein tyrosine kinase Lck and the T cell receptor (TCR)/CD3 complex. Likewise, we illustrate how hierarchical graphs can be used to document the similarity of two related rules for kinase-catalyzed phosphorylation of a protein substrate. We also demonstrate how a hierarchical graph representing a protein can be encoded in an XML-based format.

  6. Inferring rules of lineage commitment in haematopoiesis.

    PubMed

    Pina, Cristina; Fugazza, Cristina; Tipping, Alex J; Brown, John; Soneji, Shamit; Teles, Jose; Peterson, Carsten; Enver, Tariq

    2012-02-19

    How the molecular programs of differentiated cells develop as cells transit from multipotency through lineage commitment remains unexplored. This reflects the inability to access cells undergoing commitment or located in the immediate vicinity of commitment boundaries. It remains unclear whether commitment constitutes a gradual process, or else represents a discrete transition. Analyses of in vitro self-renewing multipotent systems have revealed cellular heterogeneity with individual cells transiently exhibiting distinct biases for lineage commitment. Such systems can be used to molecularly interrogate early stages of lineage affiliation and infer rules of lineage commitment. In haematopoiesis, population-based studies have indicated that lineage choice is governed by global transcriptional noise, with self-renewing multipotent cells reversibly activating transcriptome-wide lineage-affiliated programs. We examine this hypothesis through functional and molecular analysis of individual blood cells captured from self-renewal cultures, during cytokine-driven differentiation and from primary stem and progenitor bone marrow compartments. We show dissociation between self-renewal potential and transcriptome-wide activation of lineage programs, and instead suggest that multipotent cells experience independent activation of individual regulators resulting in a low probability of transition to the committed state.

  7. Integrated control strategy for autonomous decentralized conveyance systems based on distributed MEMS arrays

    NASA Astrophysics Data System (ADS)

    Zhou, Lingfei; Chapuis, Yves-Andre; Blonde, Jean-Philippe; Bervillier, Herve; Fukuta, Yamato; Fujita, Hiroyuki

    2004-07-01

    In this paper, the authors propose to study a model and a control strategy for a two-dimensional conveyance system based on the principles of Autonomous Decentralized Microsystems (ADM). The microconveyance system is based on distributed cooperative MEMS actuators which can produce a force field on the surface of the device to grip and move a micro-object. The modeling approach proposed here is based on a simple model of a microconveyance system represented by a 5 x 5 matrix of cells. Each cell consists of a microactuator, a microsensor, and a microprocessor, providing actuation, autonomy and decentralized intelligence to the cell. Thus, each cell is able to identify a micro-object crossing over it and to decide by itself the appropriate control strategy to convey the micro-object to its destination target. The control strategy can be established through five simple decision rules that each cell has to respect at every computation cycle. Simulation and FPGA implementation results are given at the end of the paper in order to validate the model and control approach of the microconveyance system.

  8. Lecture Rule No. 1: Cell Phones ON, Please! A Low-Cost Personal Response System for Learning and Teaching

    ERIC Educational Resources Information Center

    Lee, Albert W. M.; Ng, Joseph K. Y.; Wong, Eva Y. W.; Tan, Alfred; Lau, April K. Y.; Lai, Stephen F. Y.

    2013-01-01

    phone, that can be used to replace the "clicker" as a personal response device. Our mobile phone-based response system (iQlickers) collects and analyzes the answers or opinions sent in by the students as SMS (short message service) messages. The statistic of the…

  9. Hierarchical graphs for rule-based modeling of biochemical systems

    PubMed Central

    2011-01-01

    Background In rule-based modeling, graphs are used to represent molecules: a colored vertex represents a component of a molecule, a vertex attribute represents the internal state of a component, and an edge represents a bond between components. Components of a molecule share the same color. Furthermore, graph-rewriting rules are used to represent molecular interactions. A rule that specifies addition (removal) of an edge represents a class of association (dissociation) reactions, and a rule that specifies a change of a vertex attribute represents a class of reactions that affect the internal state of a molecular component. A set of rules comprises an executable model that can be used to determine, through various means, the system-level dynamics of molecular interactions in a biochemical system. Results For purposes of model annotation, we propose the use of hierarchical graphs to represent structural relationships among components and subcomponents of molecules. We illustrate how hierarchical graphs can be used to naturally document the structural organization of the functional components and subcomponents of two proteins: the protein tyrosine kinase Lck and the T cell receptor (TCR) complex. We also show that computational methods developed for regular graphs can be applied to hierarchical graphs. In particular, we describe a generalization of Nauty, a graph isomorphism and canonical labeling algorithm. The generalized version of the Nauty procedure, which we call HNauty, can be used to assign canonical labels to hierarchical graphs or more generally to graphs with multiple edge types. The difference between the Nauty and HNauty procedures is minor, but for completeness, we provide an explanation of the entire HNauty algorithm. Conclusions Hierarchical graphs provide more intuitive formal representations of proteins and other structured molecules with multiple functional components than do the regular graphs of current languages for specifying rule-based models, such as the BioNetGen language (BNGL). Thus, the proposed use of hierarchical graphs should promote clarity and better understanding of rule-based models. PMID:21288338
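
    The hierarchical-graph idea (components nested inside molecules, with bonds drawn between components) can be sketched as nested data plus an edge list referencing hierarchical paths; this is a generic illustration, not BNGL, HNauty or the paper's XML encoding.

    ```python
    # Generic sketch of a hierarchical graph for model annotation: molecules
    # contain components, components may contain subcomponents, and bonds are
    # edges between component paths. Structure and labels are illustrative.

    lck = {
        "name": "Lck",
        "children": [
            {"name": "SH2", "children": []},
            {"name": "SH3", "children": []},
            {"name": "kinase_domain", "children": [
                {"name": "Y394", "children": []},   # activation-loop tyrosine
                {"name": "Y505", "children": []},   # regulatory tyrosine
            ]},
        ],
    }

    def paths(node, prefix=""):
        """Flatten the hierarchy into slash-separated component paths."""
        here = f"{prefix}/{node['name']}" if prefix else node["name"]
        yield here
        for child in node["children"]:
            yield from paths(child, here)

    # bonds (regular-graph edges) reference components by their hierarchical path
    bonds = [("Lck/SH2", "TCR/CD3zeta/pITAM")]   # hypothetical bond

    for p in paths(lck):
        print(p)
    print("bond:", bonds[0])
    ```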

  10. Rule groupings: A software engineering approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Currently, most expert system shells do not address software engineering issues for developing or maintaining expert systems. As a result, large expert systems tend to be incomprehensible, difficult to debug or modify and almost impossible to verify or validate. Partitioning rule based systems into rule groups which reflect the underlying subdomains of the problem should enhance the comprehensibility, maintainability, and reliability of expert system software. Attempts were made to semiautomatically structure a CLIPS rule base into groups of related rules that carry the same type of information. Different distance metrics that capture relevant information from the rules for grouping are discussed. Two clustering algorithms that partition the rule base into groups of related rules are given. Two independent evaluation criteria are developed to measure the effectiveness of the grouping strategies. Results of the experiment with three sample rule bases are presented.

  11. Automated revision of CLIPS rule-bases

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick M.; Pazzani, Michael J.

    1994-01-01

    This paper describes CLIPS-R, a theory revision system for the revision of CLIPS rule-bases. CLIPS-R may be used for a variety of knowledge-base revision tasks, such as refining a prototype system, adapting an existing system to slightly different operating conditions, or improving an operational system that makes occasional errors. We present a description of how CLIPS-R revises rule-bases, and an evaluation of the system on three rule-bases.

  12. An evaluation and implementation of rule-based Home Energy Management System using the Rete algorithm.

    PubMed

    Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko

    2014-01-01

    In recent years, sensors have become popular and the Home Energy Management System (HEMS) plays an important role in saving energy without a decrease in QoL (Quality of Life). Many rule-based HEMSs have been proposed, and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern matching algorithm for IF-THEN rules. We have proposed a rule-based Home Energy Management System (HEMS) using the Rete algorithm. In the proposed system, rules for managing energy are processed by smart taps in the network, and the loads of processing rules and collecting data are distributed among the smart taps. In addition, the number of processes and the amount of collected data are reduced by processing rules based on the Rete algorithm. In this paper, we evaluate the proposed system by simulation. In the simulation environment, rules are processed by the smart tap that relates to the action part of each rule. In addition, we implemented the proposed system as a HEMS using smart taps.
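
    A naive forward-chaining evaluation of IF-THEN energy-management rules over sensor facts is sketched below as a stand-in for the Rete-based system described above; the Rete algorithm improves on this naive loop by caching partial matches across cycles, and the facts and rules shown are invented.

    ```python
    # Naive forward chaining over IF-THEN home-energy rules (not Rete itself):
    # every cycle re-tests all rules until no rule changes the fact base.
    # Facts, devices and thresholds are illustrative.

    facts = {
        ("room", "temperature"): 29.0,     # degrees C
        ("room", "occupied"): True,
        ("aircon", "state"): "off",
        ("tv", "state"): "on",
    }

    def rule_cool_room(f, actions):
        if f[("room", "temperature")] > 28 and f[("room", "occupied")] and \
           f[("aircon", "state")] == "off":
            actions.append(("aircon", "turn_on"))
            f[("aircon", "state")] = "on"

    def rule_idle_tv(f, actions):
        if not f[("room", "occupied")] and f[("tv", "state")] == "on":
            actions.append(("tv", "turn_off"))
            f[("tv", "state")] = "off"

    RULES = [rule_cool_room, rule_idle_tv]

    def run_cycle(facts):
        actions = []
        changed = True
        while changed:                      # keep firing until facts stop changing
            before = dict(facts)
            for rule in RULES:
                rule(facts, actions)
            changed = facts != before
        return actions

    print(run_cycle(facts))    # -> [('aircon', 'turn_on')]
    ```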

  13. Rule-based modeling with Virtual Cell

    PubMed Central

    Schaff, James C.; Vasilescu, Dan; Moraru, Ion I.; Loew, Leslie M.; Blinov, Michael L.

    2016-01-01

    Summary: Rule-based modeling is invaluable when the number of possible species and reactions in a model become too large to allow convenient manual specification. The popular rule-based software tools BioNetGen and NFSim provide powerful modeling and simulation capabilities at the cost of learning a complex scripting language which is used to specify these models. Here, we introduce a modeling tool that combines new graphical rule-based model specification with existing simulation engines in a seamless way within the familiar Virtual Cell (VCell) modeling environment. A mathematical model can be built integrating explicit reaction networks with reaction rules. In addition to offering a large choice of ODE and stochastic solvers, a model can be simulated using a network free approach through the NFSim simulation engine. Availability and implementation: Available as VCell (versions 6.0 and later) at the Virtual Cell web site (http://vcell.org/). The application installs and runs on all major platforms and does not require registration for use on the user’s computer. Tutorials are available at the Virtual Cell website and Help is provided within the software. Source code is available at Sourceforge. Contact: vcell_support@uchc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:27497444

  14. Agent-based modeling of the interaction between CD8+ T cells and Beta cells in type 1 diabetes.

    PubMed

    Ozturk, Mustafa Cagdas; Xu, Qian; Cinar, Ali

    2018-01-01

    We propose an agent-based model for the simulation of the autoimmune response in T1D. The model incorporates cell behavior from various rules derived from the current literature and is implemented on a high-performance computing system, which enables the simulation of a significant portion of the islets in the mouse pancreas. Simulation results indicate that the model is able to capture the trends that emerge during the progression of the autoimmunity. The multi-scale nature of the model enables definition of rules or equations that govern cellular or sub-cellular level phenomena and observation of the outcomes at the tissue scale. It is expected that such a model would facilitate in vivo clinical studies through rapid testing of hypotheses and planning of future experiments by providing insight into disease progression at different scales, some of which may not be obtained easily in clinical studies. Furthermore, the modular structure of the model simplifies tasks such as the addition of new cell types, and the definition or modification of different behaviors of the environment and the cells with ease.

  15. A fuzzy classifier system for process control

    NASA Technical Reports Server (NTRS)

    Karr, C. L.; Phillips, J. C.

    1994-01-01

    A fuzzy classifier system that discovers rules for controlling a mathematical model of a pH titration system was developed by researchers at the U.S. Bureau of Mines (USBM). Fuzzy classifier systems successfully combine the strengths of learning classifier systems and fuzzy logic controllers. Learning classifier systems resemble familiar production rule-based systems, but they represent their IF-THEN rules by strings of characters rather than in the traditional linguistic terms. Fuzzy logic is a tool that allows for the incorporation of abstract concepts into rule-based systems, thereby allowing the rules to resemble the familiar 'rules-of-thumb' commonly used by humans when solving difficult process control and reasoning problems. Like learning classifier systems, fuzzy classifier systems employ a genetic algorithm to explore and sample new rules for manipulating the problem environment. Like fuzzy logic controllers, fuzzy classifier systems encapsulate knowledge in the form of production rules. The results presented in this paper demonstrate the ability of fuzzy classifier systems to generate a fuzzy logic-based process control system.
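
    A toy flavour of the fuzzy-classifier-system idea described above, with rules encoded as short character strings and a genetic algorithm mutating and selecting them; the encoding, training cases and fitness function are illustrative, not the USBM pH controller.

    ```python
    import random

    # Toy classifier-system flavour: each rule is a 3-character string encoding
    # terms for (error, delta_error) -> action, where L/M/H are linguistic terms
    # and '#' is a wildcard. A simple GA mutates rules and selects rule bases by
    # classification fitness. Everything here is illustrative.

    TERMS = "LMH#"

    def random_rule():
        return random.choice(TERMS) + random.choice(TERMS) + random.choice("LMH")

    def matches(rule, error_term, derror_term):
        return rule[0] in ("#", error_term) and rule[1] in ("#", derror_term)

    def fitness(rulebase, cases):
        """Fraction of cases where the most common matching action is correct."""
        score = 0
        for e, de, want in cases:
            actions = [r[2] for r in rulebase if matches(r, e, de)]
            if actions and max(set(actions), key=actions.count) == want:
                score += 1
        return score / len(cases)

    def mutate(rule):
        i = random.randrange(3)
        pool = TERMS if i < 2 else "LMH"
        return rule[:i] + random.choice(pool) + rule[i + 1:]

    cases = [("H", "L", "L"), ("L", "H", "H"), ("M", "M", "M")]   # made-up cases
    population = [[random_rule() for _ in range(5)] for _ in range(20)]

    for generation in range(50):
        population.sort(key=lambda rb: fitness(rb, cases), reverse=True)
        parents = population[:10]
        children = [[mutate(rule) for rule in random.choice(parents)] for _ in range(10)]
        population = parents + children

    print("best rule base fitness:", max(fitness(rb, cases) for rb in population))
    ```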

  16. Reducing the Conflict Factors Strategies in Question Answering System

    NASA Astrophysics Data System (ADS)

    Suwarningsih, W.; Purwarianti, A.; Supriana, I.

    2017-03-01

    A rule-based system is prone to conflict because new knowledge continually emerges and must be added to the knowledge base used by the system. A conflict between rules in the knowledge base can lead to reasoning errors or circular reasoning. Therefore, newly added rules must be checked for conflicts with other rules, and only conflict-free rules can actually be added to the knowledge base. Under these conditions, this paper proposes a conflict resolution strategy for a medical debriefing system, based on analyzing runtime scenarios, to improve the efficiency and reliability of the system.

  17. Rule-based simulation models

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Seraphine, Kathleen M.

    1991-01-01

    Procedural modeling systems, rule based modeling systems, and a method for converting a procedural model to a rule based model are described. Simulation models are used to represent real time engineering systems. A real time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling system languages are based on FORTRAN or some other procedural language. Therefore, they must be enhanced with a reaction capability. Rule based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule based system can be generated by a knowledge acquisition tool or a source level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule based system. Neural models can provide the high-capacity data manipulation required by the most complex real-time models.
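
    The decomposition-into-a-knowledge-network idea can be sketched in a few lines of Python: a procedural relation is broken into elementary calculations, each stored as a rule that fires whenever one of its inputs changes. The equation, variable names, and firing scheme are hypothetical examples, not the paper's compiler output.

```python
# Hypothetical sketch: the relation z = (a + b) * c is decomposed into
# elementary calculations, each stored as a "rule" that fires whenever one of
# its inputs changes -- the reactive behavior the abstract attributes to
# rule-based systems.
values = {"a": 2.0, "b": 3.0, "c": 4.0}

# knowledge network: output <- (function, input names)
network = {
    "t1": (lambda a, b: a + b, ("a", "b")),
    "z":  (lambda t1, c: t1 * c, ("t1", "c")),
}

def propagate(changed):
    """Fire every rule whose inputs include a changed value (forward chaining)."""
    pending = set(changed)
    while pending:
        name = pending.pop()
        for out, (fn, ins) in network.items():
            if name in ins and all(i in values for i in ins):
                new = fn(*(values[i] for i in ins))
                if values.get(out) != new:
                    values[out] = new
                    pending.add(out)

propagate({"a", "b", "c"})
print(values["z"])        # 20.0
values["a"] = 10.0
propagate({"a"})
print(values["z"])        # 52.0
```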

  18. On Decision-Making Among Multiple Rule-Bases in Fuzzy Control Systems

    NASA Technical Reports Server (NTRS)

    Tunstel, Edward; Jamshidi, Mo

    1997-01-01

    Intelligent control of complex multi-variable systems can be a challenge for single fuzzy rule-based controllers. This class of problems can often be managed with less difficulty by distributing intelligent decision-making amongst a collection of rule-bases. Such an approach requires that a mechanism be chosen to ensure goal-oriented interaction between the multiple rule-bases. In this paper, a hierarchical rule-based approach is described. Decision-making mechanisms based on generalized concepts from single-rule-based fuzzy control are described. Finally, the effects of different aggregation operators on multi-rule-base decision-making are examined in a navigation control problem for mobile robots.
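
    A tiny Python sketch of what "aggregation operators over multiple rule-bases" means in practice is given below; the three rule-base names and their membership values are invented, and the operators shown (min, max, mean) are generic choices rather than the ones studied in the paper.

```python
# Hypothetical sketch: three rule bases (goal seeking, obstacle avoidance,
# wall following) each output a fuzzy membership value for the action
# "turn left"; different aggregation operators combine them into one decision.
recommendations = {"goal_seek": 0.7, "avoid_obstacle": 0.2, "wall_follow": 0.5}

def aggregate(values, operator):
    if operator == "min":          # conjunctive: every rule base must agree
        return min(values)
    if operator == "max":          # disjunctive: any rule base suffices
        return max(values)
    if operator == "mean":         # compensatory compromise
        return sum(values) / len(values)
    raise ValueError(operator)

for op in ("min", "max", "mean"):
    print(op, round(aggregate(list(recommendations.values()), op), 3))
```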

  19. Non-Hermitian Operator Modelling of Basic Cancer Cell Dynamics

    NASA Astrophysics Data System (ADS)

    Bagarello, Fabio; Gargano, Francesco

    2018-04-01

    We propose a dynamical model of tumor cell proliferation based on operatorial methods. The approach we propose is quantum-like: we use ladder and number operators to describe the birth and death of healthy and tumor cells, and the evolution is ruled by a non-Hermitian Hamiltonian which includes, in a non-reversible way, the basic biological mechanisms we consider for the system. We show that this approach is rather efficient in describing some processes of the cells. We further add a medical treatment, described by a suitable term in the Hamiltonian, which controls and limits the growth of tumor cells, and we propose an optimal approach to stop, and reverse, this growth.

  20. Challenges for Rule Systems on the Web

    NASA Astrophysics Data System (ADS)

    Hu, Yuh-Jong; Yeh, Ching-Long; Laun, Wolfgang

    The RuleML Challenge started in 2007 with the objective of stimulating work on the implementation issues of managing, integrating, interoperating, and interchanging rules in an open distributed environment, such as the Web. Rules are usually classified into three types: deductive rules, normative rules, and reactive rules. Reactive rules are further classified into ECA rules and production rules. The study of combining rules and ontologies traces back to earlier active rule systems for relational and object-oriented (OO) databases. Recently, this issue has become one of the most important research problems in the Semantic Web. Once we consider a computer-executable policy as a declarative set of rules and ontologies that guides the behavior of entities within a system, we have a flexible way to implement real-world policies without rewriting the computer code, as we did before. Fortunately, we have de facto rule markup languages, such as RuleML or RIF, to achieve the portability and interchange of rules across different rule systems. Otherwise, executing real-life rule-based applications on the Web is almost impossible. Several commercial or open-source rule engines are available for rule-based applications. However, we still need a standard rule language and benchmarks, not only to compare rule systems but also to measure progress in the field. Finally, a number of real-life rule-based use cases will be investigated to demonstrate the applicability of current rule systems on the Web.

  1. Incorporation of negative rules and evolution of a fuzzy controller for yeast fermentation process.

    PubMed

    Birle, Stephan; Hussein, Mohamed Ahmed; Becker, Thomas

    2016-08-01

    The control of bioprocesses can be very challenging because these kinds of processes are highly affected by various sources of uncertainty, such as the intrinsic behavior of the microorganisms used. Because these process uncertainties are in most cases not directly measurable, the overall control is either performed manually on the basis of operator experience, or intelligent expert systems are applied, e.g., on the basis of fuzzy logic theory. In the latter case, however, the control concept is mainly represented using merely positive rules, e.g., "If A then do B". As this does not reflect the semantics of the human decision-making process, which also includes negative experience in the form of constraints or prohibitions, the incorporation of negative rules for process control based on fuzzy logic is emphasized. In this work, an approach to fuzzy logic control of the yeast propagation process based on a combination of positive and negative rules is presented. The process is guided along a reference trajectory for yeast cell concentration by altering the process temperature. The incorporation of negative rules leads to a much more stable and accurate control of the process, as the root mean squared error between the reference trajectory and the system response was reduced by an average of 62.8% compared to the controller using only positive rules.
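
    The following Python sketch illustrates one simple way positive and negative fuzzy rules can be combined, with the negative (prohibiting) rule suppressing the positive recommendation; the membership functions, thresholds, and combination scheme are invented for illustration and are not the authors' controller.

```python
# Hypothetical sketch: a positive rule recommends raising the temperature when
# cell growth lags the reference, while a negative rule vetoes the increase
# when the temperature is already near an assumed upper constraint.
def tri(x, a, b, c):
    """Triangular fuzzy membership on (a, b, c)."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def control_action(growth_error, temperature):
    # positive rule: IF growth is "too slow" THEN increase temperature
    positive = tri(growth_error, 0.0, 0.5, 1.0)
    # negative rule: IF temperature is "too high" THEN do NOT increase it
    negative = tri(temperature, 28.0, 32.0, 36.0)
    # the negative activation suppresses the positive recommendation
    return max(0.0, positive - negative)

print(control_action(growth_error=0.5, temperature=22.0))   # unconstrained: 1.0
print(control_action(growth_error=0.5, temperature=31.0))   # vetoed: 0.25
```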

  2. The Danger Model Approach to the Pathogenesis of the Rheumatic Diseases

    PubMed Central

    Pacheco-Tena, César; González-Chávez, Susana Aideé

    2015-01-01

    The danger model was proposed by Polly Matzinger as a complement to the traditional self-non-self (SNS) model to explain immunoreactivity. The danger model proposes a central role for tissular cells' discomfort as an element that primes immune response processes, in opposition to the traditional SNS model in which foreignness is a prerequisite. However, recent insights in the proteomics of diverse tissular cells have revealed that under stressful conditions they have a significant potential to initiate, coordinate, and perpetuate autoimmune processes, in many cases ruling over the adaptive immune response cells; this ruling potential can also be confirmed by observations in several genetically manipulated animal models. Here, we review the pathogenesis of rheumatic diseases such as systemic lupus erythematosus, rheumatoid arthritis, spondyloarthritis including ankylosing spondylitis, psoriasis, and Crohn's disease and provide realistic approaches based on the logic of the danger model. We assume that tissular dysfunction is a prerequisite for chronic autoimmunity and propose two genetically conferred hypothetical roles for the tissular cells causing the disease: (A) the impaired cell and (B) the paranoid cell. The two roles are not mutually exclusive. Some examples in human disease and in animal models are provided based on current evidence. PMID:25973436

  3. The S(c)ensory Immune System Theory.

    PubMed

    Veiga-Fernandes, Henrique; Freitas, António A

    2017-10-01

    Viewpoints on the immune system have evolved across different paradigms, including the clonal selection theory, the idiotypic network, and the danger and tolerance models. Herein, we propose that in multicellular organisms, where panoplies of cells from different germ layers interact and immune cells are constantly generated, the behavior of the immune system is defined by the rules governing cell survival, systems physiology and organismic homeostasis. Initially, these rules were imprinted at the single cell-protist level, but underwent modifications in the transition to multicellular organisms. This context determined the emergence of the 'sensory immune system', which operates in a s(c)ensor mode to ensure systems physiology, organismic homeostasis, and perpetuation of its replicating molecules. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Enhanced Weather Radar (EWxR) System

    NASA Technical Reports Server (NTRS)

    Kronfeld, Kevin M. (Technical Monitor)

    2003-01-01

    An airborne weather radar system, the Enhanced Weather Radar (EWxR), with enhanced on-board weather radar data processing was developed and tested. The system features additional weather data that is uplinked from ground-based sources, specialized data processing, and limited automatic radar control to search for hazardous weather. National Weather Service (NWS) ground-based Next Generation Radar (NEXRAD) information is used by the EWxR system to augment the on-board weather radar information. The system will simultaneously display NEXRAD and on-board weather radar information in a split-view format. The on-board weather radar includes an automated or hands-free storm-finding feature that optimizes the radar returns by automatically adjusting the tilt and range settings for the current altitude above the terrain and searches for storm cells near the atmospheric 0-degree isotherm. A rule-based decision aid was developed to automatically characterize cells as hazardous, possibly-hazardous, or non-hazardous based upon attributes of that cell. Cell attributes are determined based on data from the on-board radar and from ground-based radars. A flight path impact prediction algorithm was developed to help pilots to avoid hazardous weather along their flight plan and their mission. During development the system was tested on the NASA B757 aircraft and final tests were conducted on the Rockwell Collins Sabreliner.
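
    A rule-based decision aid of the kind described can be sketched in a few lines of Python; the cell attributes, thresholds, and rules below are invented for illustration only and are not the EWxR system's actual decision logic.

```python
# Hypothetical sketch of a rule-based decision aid that labels a storm cell
# as hazardous, possibly hazardous, or non-hazardous from its attributes.
def classify_cell(cell):
    reflectivity = cell["reflectivity_dbz"]     # e.g., from on-board radar
    tops_kft = cell["echo_tops_kft"]            # e.g., from ground-based NEXRAD
    vil = cell["vil_kg_m2"]                     # vertically integrated liquid

    if reflectivity >= 50 or vil >= 45:
        return "hazardous"
    if reflectivity >= 40 or tops_kft >= 35:
        return "possibly-hazardous"
    return "non-hazardous"

print(classify_cell({"reflectivity_dbz": 52, "echo_tops_kft": 42, "vil_kg_m2": 50}))
print(classify_cell({"reflectivity_dbz": 43, "echo_tops_kft": 30, "vil_kg_m2": 20}))
print(classify_cell({"reflectivity_dbz": 25, "echo_tops_kft": 18, "vil_kg_m2": 5}))
```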

  5. Rule groupings in expert systems using nearest neighbour decision rules, and convex hulls

    NASA Technical Reports Server (NTRS)

    Anastasiadis, Stergios

    1991-01-01

    Expert System shells are lacking in many areas of software engineering. Large rule-based systems are not semantically comprehensible, difficult to debug, and impossible to modify or validate. Partitioning a set of rules found in CLIPS (C Language Integrated Production System) into groups of rules which reflect the underlying semantic subdomains of the problem will adequately address the concerns stated above. Techniques are introduced to structure a CLIPS rule base into groups of rules that inherently have common semantic information. The concepts involved are imported from the fields of A.I., Pattern Recognition, and Statistical Inference. Techniques focus on the areas of feature selection, classification, and a criterion for how 'good' the classification technique is, based on Bayesian Decision Theory. A variety of distance metrics are discussed for measuring the 'closeness' of CLIPS rules, and various Nearest Neighbor classification algorithms are described based on these metrics.
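
    A minimal Python sketch of the nearest-neighbor grouping idea follows; the feature encoding (facts referenced by each rule), the Jaccard-style distance, and the similarity threshold are assumptions made for illustration, not the paper's actual metrics.

```python
# Hypothetical sketch: each CLIPS rule is reduced to a feature set (the facts
# it references), a Jaccard distance measures rule "closeness", and a
# nearest-neighbour pass assigns each rule to the group of its most similar
# already-grouped rule.
rules = {
    "r1": {"valve-open", "pressure-high"},
    "r2": {"pressure-high", "alarm"},
    "r3": {"telemetry-lost", "antenna"},
    "r4": {"antenna", "signal-weak"},
}

def distance(a, b):
    return 1.0 - len(a & b) / len(a | b)        # Jaccard distance

groups = {}                                      # rule name -> group id
for name, feats in rules.items():
    if not groups:
        groups[name] = 0
        continue
    nearest = min(groups, key=lambda g: distance(feats, rules[g]))
    if distance(feats, rules[nearest]) < 0.9:    # similarity threshold (assumed)
        groups[name] = groups[nearest]
    else:
        groups[name] = max(groups.values()) + 1

print(groups)   # r1/r2 share a group; r3/r4 share another
```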

  6. A logical model of cooperating rule-based systems

    NASA Technical Reports Server (NTRS)

    Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.

    1989-01-01

    A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.

  7. A neural network architecture for implementation of expert systems for real time monitoring

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.

    1991-01-01

    Since neural networks have the advantages of massive parallelism and simple architecture, they are good tools for implementing real time expert systems. In a rule based expert system, the antecedents of rules are in the conjunctive or disjunctive form. We constructed a multilayer feedforward type network in which neurons represent AND or OR operations of rules. Further, we developed a translator which can automatically map a given rule base into the network. Also, we proposed a new and powerful yet flexible architecture that combines the advantages of both fuzzy expert systems and neural networks. This architecture uses the fuzzy logic concepts to separate input data domains into several smaller and overlapped regions. Rule-based expert systems for time critical applications using neural networks, the automated implementation of rule-based expert systems with neural nets, and fuzzy expert systems vs. neural nets are covered.
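
    The mapping of conjunctive and disjunctive rule antecedents onto AND/OR neurons can be illustrated with the short Python sketch below; the rules, weights, and thresholds are the textbook choices for Boolean threshold gates, not the paper's translated network.

```python
# Hypothetical sketch: binary inputs feed threshold neurons that realize the
# AND/OR structure of rule antecedents, so firing the network is equivalent
# to firing the rules.
def neuron(inputs, weights, threshold):
    return 1 if sum(i * w for i, w in zip(inputs, weights)) >= threshold else 0

def fire(rule_inputs):
    a, b, c = rule_inputs
    # Rule 1: IF a AND b THEN h1   -> AND neuron (threshold = number of inputs)
    h1 = neuron([a, b], [1, 1], 2)
    # Rule 2: IF c THEN h2
    h2 = neuron([c], [1], 1)
    # Rule 3: IF h1 OR h2 THEN alarm -> OR neuron (threshold = 1)
    return neuron([h1, h2], [1, 1], 1)

print(fire([1, 1, 0]))   # 1: rule 1 fires
print(fire([0, 0, 1]))   # 1: rule 2 fires
print(fire([1, 0, 0]))   # 0: no rule fires
```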

  8. SIRE: A Simple Interactive Rule Editor for NICBES

    NASA Technical Reports Server (NTRS)

    Bykat, Alex

    1988-01-01

    To support evolution of domain expertise, and its representation in an expert system knowledge base, a user-friendly rule base editor is mandatory. The Nickel Cadmium Battery Expert System (NICBES), a prototype of an expert system for the Hubble Space Telescope power storage management system, does not provide such an editor. In the following, a Simple Interactive Rule Base Editor (SIRE) for NICBES is described. The SIRE provides a consistent internal representation of the NICBES knowledge base. It supports knowledge presentation and provides a user-friendly and code-language-independent medium for rule addition and modification. The SIRE is integrated with NICBES via an interface module. This module provides translation of the internal representation to Prolog-type rules (Horn clauses), subsequent rule assertion, and a simple mechanism for rule selection for its Prolog inference engine.

  9. Color image encryption based on hybrid hyper-chaotic system and cellular automata

    NASA Astrophysics Data System (ADS)

    Yaghouti Niyat, Abolfazl; Moattar, Mohammad Hossein; Niazi Torshiz, Masood

    2017-03-01

    This paper proposes an image encryption scheme based on Cellular Automata (CA). A CA is a self-organizing structure with a set of cells in which each cell is updated by certain rules that depend on a limited number of neighboring cells. The major disadvantages of cellular automata in cryptography are the limited number of reversible rules and the inability to produce long sequences of states with these rules. In this paper, a non-uniform cellular automata framework is proposed to solve this problem. The proposed scheme consists of confusion and diffusion steps. In the confusion step, the positions of the original image pixels are scrambled by a chaotic map. A key image is created using non-uniform cellular automata, and the hyper-chaotic map is then used to select random numbers from the key image for encryption. The main contribution of the paper is the application of hyper-chaotic functions and non-uniform CA for robust key image generation. Security analysis and experimental results show that the proposed method has a very large key space and is resistant to noise and attacks. The correlation between adjacent pixels in the encrypted image is reduced, and the entropy is 7.9991, very close to the ideal value of 8.
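
    To show what a chaos-driven confusion step looks like in code, the Python sketch below permutes pixel positions using a logistic map (a simple chaotic map standing in for the paper's hyper-chaotic system); it covers only the confusion step, and the key values and toy image are invented. The non-uniform CA key-image generation and the diffusion step are not sketched here.

```python
# Hypothetical sketch of a confusion step only: a logistic map generates a
# chaotic sequence whose sort order permutes the pixel positions of a
# flattened image; the inverse permutation recovers the original.
def logistic_sequence(x0, r, n):
    xs, x = [], x0
    for _ in range(n):
        x = r * x * (1.0 - x)
        xs.append(x)
    return xs

def confuse(pixels, key=(0.3141, 3.9999)):
    chaos = logistic_sequence(key[0], key[1], len(pixels))
    order = sorted(range(len(pixels)), key=lambda i: chaos[i])
    return [pixels[i] for i in order], order

def deconfuse(scrambled, order):
    original = [0] * len(scrambled)
    for new_pos, old_pos in enumerate(order):
        original[old_pos] = scrambled[new_pos]
    return original

image = [10, 20, 30, 40, 50, 60, 70, 80]          # a toy 8-pixel "image"
scrambled, order = confuse(image)
print(scrambled)
print(deconfuse(scrambled, order) == image)        # True: permutation is invertible
```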

  10. Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris

    2010-01-01

    Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.

  11. N Reasons Why Production-Rules are Insufficient Models for Expert System Knowledge Representation Schemes

    DTIC Science & Technology

    1991-02-01

    [Abstract not available; only fragments of the report's front matter were extracted. The recoverable headings indicate sections on the limitations of rule-based knowledge representation, propositional logic as the simplest form of production rules, and hybrid rule/fact schemas (also known as predicate calculus relationships).]

  12. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, suffers from low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is a new calibration method for the pixel equivalent and decontamination of the steel rule surface. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly demonstrate that these methods not only meet the precision required by the verification regulation, but also improve the reliability and efficiency of the verification system.

  13. Automated implementation of rule-based expert systems with neural networks for time-critical applications

    NASA Technical Reports Server (NTRS)

    Ramamoorthy, P. A.; Huang, Song; Govind, Girish

    1991-01-01

    In fault diagnosis, control and real-time monitoring, both timing and accuracy are critical for operators or machines to reach proper solutions or appropriate actions. Expert systems are becoming more popular in the manufacturing community for dealing with such problems. In recent years, neural networks have revived and their applications have spread to many areas of science and engineering. A method of using neural networks to implement rule-based expert systems for time-critical applications is discussed here. This method can convert a given rule-based system into a neural network with fixed weights and thresholds. The rules governing the translation are presented along with some examples. We also present the results of automated machine implementation of such networks from the given rule-base. This significantly simplifies the translation process to neural network expert systems from conventional rule-based systems. Results comparing the performance of the proposed approach based on neural networks vs. the classical approach are given. The possibility of very large scale integration (VLSI) realization of such neural network expert systems is also discussed.

  14. Transition Flight Control Room Automation

    NASA Technical Reports Server (NTRS)

    Welborn, Curtis Ray

    1990-01-01

    The Workstation Prototype Laboratory is currently working on a number of projects which we feel can have a direct impact on ground operations automation. These projects include: The Fuel Cell Monitoring System (FCMS), which will monitor and detect problems with the fuel cells on the Shuttle. FCMS will use a combination of rules (forward/backward) and multi-threaded procedures which run concurrently with the rules, to implement the malfunction algorithms of the EGIL flight controllers. The combination of rule-based reasoning and procedural reasoning allows us to more easily map the malfunction algorithms into a real-time system implementation. A graphical computation language (AGCOMPL). AGCOMPL is an experimental prototype to determine the benefits and drawbacks of using a graphical language to design computations (algorithms) to work on Shuttle or Space Station telemetry and trajectory data. The design of a system which will allow a model of an electrical system, including telemetry sensors, to be configured on the screen graphically using previously defined electrical icons. This electrical model would then be used to generate rules and procedures for detecting malfunctions in the electrical components of the model. A generic message management (GMM) system. GMM is being designed as a message management system for real-time applications which send advisory messages to a user. The primary purpose of GMM is to reduce the risk of overloading a user with information when multiple failures occur and in assisting the developer in devising an explanation facility. The emphasis of our work is to develop practical tools and techniques, while determining the feasibility of a given approach, including identification of appropriate software tools to support research, application and tool building activities.

  15. Transition flight control room automation

    NASA Technical Reports Server (NTRS)

    Welborn, Curtis Ray

    1990-01-01

    The Workstation Prototype Laboratory is currently working on a number of projects which can have a direct impact on ground operations automation. These projects include: (1) The fuel cell monitoring system (FCMS), which will monitor and detect problems with the fuel cells on the shuttle. FCMS will use a combination of rules (forward/backward) and multithreaded procedures, which run concurrently with the rules, to implement the malfunction algorithms of the EGIL flight controllers. The combination of rule-based reasoning and procedural reasoning allows us to more easily map the malfunction algorithms into a real-time system implementation. (2) A graphical computation language (AGCOMPL) is an experimental prototype to determine the benefits and drawbacks of using a graphical language to design computations (algorithms) to work on shuttle or space station telemetry and trajectory data. (3) The design of a system that will allow a model of an electrical system, including telemetry sensors, to be configured on the screen graphically using previously defined electrical icons. This electrical model would then be used to generate rules and procedures for detecting malfunctions in the electrical components of the model. (4) A generic message management (GMM) system is being designed for real-time applications that send advisory messages to a user. The primary purpose of GMM is to reduce the risk of overloading a user with information when multiple failures occur and to assist the developer in devising an explanation facility. The emphasis of our work is to develop practical tools and techniques, including identification of appropriate software tools to support research, application, and tool building activities, while determining the feasibility of a given approach.

  16. Weakly coupled map lattice models for multicellular patterning and collective normalization of abnormal single-cell states

    NASA Astrophysics Data System (ADS)

    García-Morales, Vladimir; Manzanares, José A.; Mafe, Salvador

    2017-04-01

    We present a weakly coupled map lattice model for patterning that explores the effects exerted by weakening the local dynamic rules on model biological and artificial networks composed of two-state building blocks (cells). To this end, we use two cellular automata models based on (i) a smooth majority rule (model I) and (ii) a set of rules similar to those of Conway's Game of Life (model II). The normal and abnormal cell states evolve according to local rules that are modulated by a parameter κ . This parameter quantifies the effective weakening of the prescribed rules due to the limited coupling of each cell to its neighborhood and can be experimentally controlled by appropriate external agents. The emergent spatiotemporal maps of single-cell states should be of significance for positional information processes as well as for intercellular communication in tumorigenesis, where the collective normalization of abnormal single-cell states by a predominantly normal neighborhood may be crucial.
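
    As a simplified, hedged reading of the weakened majority rule (not the authors' exact map), the Python sketch below updates a two-state lattice toward the neighborhood majority with probability κ, so κ = 1 recovers the strict majority rule and smaller κ weakens the coupling; all sizes, densities, and the functional form of the weakening are assumptions.

```python
import random

random.seed(0)

# Simplified sketch: cells are two-state (0 = abnormal, 1 = normal) on a
# periodic lattice; at each step a cell adopts the majority state of its
# 3x3 neighborhood with probability kappa and otherwise keeps its own state.
SIZE = 30

def step(grid, kappa):
    new = [row[:] for row in grid]
    for x in range(SIZE):
        for y in range(SIZE):
            nbhd = [grid[(x + dx) % SIZE][(y + dy) % SIZE]
                    for dx in (-1, 0, 1) for dy in (-1, 0, 1)]
            majority = 1 if sum(nbhd) > len(nbhd) / 2 else 0
            if random.random() < kappa:
                new[x][y] = majority
    return new

for kappa in (1.0, 0.5, 0.1):
    # start with 30% abnormal cells scattered in a mostly normal tissue
    grid = [[0 if random.random() < 0.3 else 1 for _ in range(SIZE)]
            for _ in range(SIZE)]
    for _ in range(20):
        grid = step(grid, kappa)
    abnormal = sum(row.count(0) for row in grid)
    print(f"kappa={kappa:3.1f}: abnormal cells remaining after 20 steps = {abnormal}")
```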

  17. Weakly coupled map lattice models for multicellular patterning and collective normalization of abnormal single-cell states.

    PubMed

    García-Morales, Vladimir; Manzanares, José A; Mafe, Salvador

    2017-04-01

    We present a weakly coupled map lattice model for patterning that explores the effects exerted by weakening the local dynamic rules on model biological and artificial networks composed of two-state building blocks (cells). To this end, we use two cellular automata models based on (i) a smooth majority rule (model I) and (ii) a set of rules similar to those of Conway's Game of Life (model II). The normal and abnormal cell states evolve according to local rules that are modulated by a parameter κ. This parameter quantifies the effective weakening of the prescribed rules due to the limited coupling of each cell to its neighborhood and can be experimentally controlled by appropriate external agents. The emergent spatiotemporal maps of single-cell states should be of significance for positional information processes as well as for intercellular communication in tumorigenesis, where the collective normalization of abnormal single-cell states by a predominantly normal neighborhood may be crucial.

  18. Systems biology by the rules: hybrid intelligent systems for pathway modeling and discovery.

    PubMed

    Bosl, William J

    2007-02-15

    Expert knowledge in journal articles is an important source of data for reconstructing biological pathways and creating new hypotheses. An important need for medical research is to integrate this data with high throughput sources to build useful models that span several scales. Researchers traditionally use mental models of pathways to integrate information and develop new hypotheses. Unfortunately, the amount of information is often overwhelming and mental models are inadequate for predicting the dynamic response of complex pathways. Hierarchical computational models that allow exploration of semi-quantitative dynamics are useful systems biology tools for theoreticians, experimentalists and clinicians and may provide a means for cross-communication. A novel approach for biological pathway modeling based on hybrid intelligent systems or soft computing technologies is presented here. Intelligent hybrid systems, which refer to several related computing methods such as fuzzy logic, neural nets, genetic algorithms, and statistical analysis, have become ubiquitous in engineering applications for complex control system modeling and design. Biological pathways may be considered to be complex control systems, which medicine tries to manipulate to achieve desired results. Thus, hybrid intelligent systems may provide a useful tool for modeling biological system dynamics and computational exploration of new drug targets. A new modeling approach based on these methods is presented in the context of hedgehog regulation of the cell cycle in granule cells. Code and input files can be found at the Bionet website: www.chip.ord/~wbosl/Software/Bionet. This paper presents the algorithmic methods needed for modeling complicated biochemical dynamics using rule-based models to represent expert knowledge in the context of cell cycle regulation and tumor growth. A notable feature of this modeling approach is that it allows biologists to build complex models from their knowledge base without the need to translate that knowledge into mathematical form. Dynamics on several levels, from molecular pathways to tissue growth, are seamlessly integrated. A number of common network motifs are examined and used to build a model of hedgehog regulation of the cell cycle in cerebellar neurons, which is believed to play a key role in the etiology of medulloblastoma, a devastating childhood brain cancer.

  19. Strategies for adding adaptive learning mechanisms to rule-based diagnostic expert systems

    NASA Technical Reports Server (NTRS)

    Stclair, D. C.; Sabharwal, C. L.; Bond, W. E.; Hacke, Keith

    1988-01-01

    Rule-based diagnostic expert systems can be used to perform many of the diagnostic chores necessary in today's complex space systems. These expert systems typically take a set of symptoms as input and produce diagnostic advice as output. The primary objective of such expert systems is to provide accurate and comprehensive advice which can be used to help return the space system in question to nominal operation. The development and maintenance of diagnostic expert systems is time and labor intensive since the services of both knowledge engineer(s) and domain expert(s) are required. The use of adaptive learning mechanisms to incrementally evaluate and refine rules promises to reduce both time and labor costs associated with such systems. This paper describes the basic adaptive learning mechanisms of strengthening, weakening, generalization, discrimination, and discovery. Next, basic strategies are discussed for adding these learning mechanisms to rule-based diagnostic expert systems. These strategies support the incremental evaluation and refinement of rules in the knowledge base by comparing the set of advice given by the expert system (A) with the correct diagnosis (C). Techniques are described for selecting those rules in the knowledge base which should participate in adaptive learning. The strategies presented may be used with a wide variety of learning algorithms. Further, these strategies are applicable to a large number of rule-based diagnostic expert systems. They may be used to provide either immediate or deferred updating of the knowledge base.
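
    A minimal Python sketch of the strengthening/weakening idea is given below: each rule carries a strength that is adjusted after comparing the advice set A with the correct diagnosis C. The rule contents and learning rate are invented for illustration; generalization, discrimination, and discovery (which edit the rule conditions themselves) are not sketched.

```python
# Hypothetical sketch: each diagnostic rule carries a strength; after a case,
# the advice set A produced by the fired rules is compared with the correct
# diagnosis C, and the strengths of the contributing rules are adjusted.
rules = {
    "r1": {"if": {"low_voltage", "high_temp"}, "then": "battery_fault", "strength": 0.5},
    "r2": {"if": {"low_voltage"},              "then": "sensor_fault",  "strength": 0.5},
}
ALPHA = 0.1   # strengthening/weakening rate (assumed)

def diagnose_and_learn(symptoms, correct_diagnosis):
    fired = [name for name, r in rules.items() if r["if"] <= symptoms]
    advice = {rules[name]["then"] for name in fired}          # advice set A
    for name in fired:
        if rules[name]["then"] == correct_diagnosis:          # supported C: strengthen
            rules[name]["strength"] = min(1.0, rules[name]["strength"] + ALPHA)
        else:                                                 # advice not in C: weaken
            rules[name]["strength"] = max(0.0, rules[name]["strength"] - ALPHA)
    return advice

print(diagnose_and_learn({"low_voltage", "high_temp"}, "battery_fault"))
print({name: r["strength"] for name, r in rules.items()})
# r1 is strengthened (0.6), r2 weakened (0.4).
```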

  20. Recommendation System Based On Association Rules For Distributed E-Learning Management Systems

    NASA Astrophysics Data System (ADS)

    Mihai, Gabroveanu

    2015-09-01

    Traditional Learning Management Systems are installed on a single server where learning materials and user data are kept. To increase its performance, the Learning Management System can be installed on multiple servers; learning materials and user data could be distributed across these servers, yielding a Distributed Learning Management System. In this paper, a prototype of a recommendation system based on association rules for a Distributed Learning Management System is proposed. Information from LMS databases is analyzed using distributed data mining algorithms in order to extract the association rules. Then the extracted rules are used as inference rules to provide personalized recommendations. The quality of the provided recommendations is improved because the rules used to make the inferences are more accurate, since these rules aggregate knowledge from all e-Learning systems included in the Distributed Learning Management System.
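
    The sketch below illustrates, in plain Python, how mined association rules can be turned into inference rules for recommendations: pooled access sessions are scanned for single-antecedent rules meeting minimum support and confidence, and matching rules then suggest unseen materials. The sessions, material names, and thresholds are invented, and the distributed mining itself is not sketched.

```python
from itertools import combinations

# Hypothetical sketch: access logs pooled from several LMS servers.
sessions = [
    {"intro", "variables", "loops"},
    {"intro", "variables"},
    {"variables", "loops", "functions"},
    {"intro", "loops"},
]
MIN_SUPPORT, MIN_CONFIDENCE = 0.5, 0.6

def support(itemset):
    return sum(itemset <= s for s in sessions) / len(sessions)

items = set().union(*sessions)
rules = []
for a, b in combinations(items, 2):
    for antecedent, consequent in ((a, b), (b, a)):
        supp = support({antecedent, consequent})
        if supp >= MIN_SUPPORT and supp / support({antecedent}) >= MIN_CONFIDENCE:
            rules.append((antecedent, consequent))

def recommend(viewed):
    # use the association rules as inference rules for personalized suggestions
    return {c for a, c in rules if a in viewed} - viewed

print(sorted(rules))
print(recommend({"intro"}))     # materials suggested to a student who viewed "intro"
```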

  1. Architecture For The Optimization Of A Machining Process In Real Time Through Rule-Based Expert System

    NASA Astrophysics Data System (ADS)

    Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús

    2009-11-01

    Under the SENSOR-IA project, which received financial funding from the Order of Incentives to the Regional Technology Centers of the Council of Innovation, Science and Enterprise of Andalusia, an architecture for the optimization of a machining process in real time through a rule-based expert system has been developed. The architecture consists of a sensor data acquisition and processing engine (SATD) and a rule-based expert system (SE) that communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens model of goal-oriented rules. The pilot test demonstrated that it is possible to govern the machining process in real time based on rules contained in the SE. The tests have been done with approximate rules. Future work includes an exhaustive collection of data with different tool materials and geometries in a database to extract more precise rules.
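
    A modus ponens inference loop of the general kind described can be sketched very compactly in Python; the sensor facts, actions, and rules below are invented examples, not the SENSOR-IA rule base.

```python
# Hypothetical sketch of forward chaining by modus ponens: each rule fires when
# all of its antecedent facts are present, asserting its consequent until no
# new facts appear.
rules = [
    ({"vibration_high", "tool_worn"}, "reduce_feed_rate"),
    ({"temperature_high"},            "increase_coolant"),
    ({"reduce_feed_rate"},            "notify_operator"),
]

def forward_chain(facts):
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)         # modus ponens: A, A -> B  |-  B
                changed = True
    return facts

print(forward_chain({"vibration_high", "tool_worn"}))
# {'vibration_high', 'tool_worn', 'reduce_feed_rate', 'notify_operator'}
```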

  2. An expert system for natural language processing

    NASA Technical Reports Server (NTRS)

    Hennessy, John F.

    1988-01-01

    A solution to the natural language processing problem is proposed that uses a rule-based system, written in OPS5, to replace the traditional parsing method. The advantages of using a rule-based system are explored. Specifically, the extensibility of a rule-based solution is discussed, as well as the value of maintaining rules that function independently. Finally, the power of using semantics to supplement the syntactic analysis of a sentence is considered.

  3. Research and development for Onboard Navigation (ONAV) ground based expert/trainer system: ONAV entry expert system code

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.

    1988-01-01

    A complete listing is given of the expert system rules for the Entry phase of the Onboard Navigation (ONAV) Ground Based Expert Trainer System for aircraft/space shuttle navigation. These source listings appear in the same format as utilized and required by the C Language Integrated Production System (CLIPS) expert system shell which is the basis for the ONAV entry system. A schematic overview is given of how the rules are organized. These groups result from a partitioning of the rules according to the overall function which a given set of rules performs. This partitioning was established and maintained according to that established in the knowledge specification document. In addition, four other groups of rules are specified. The four groups (control flow, operator inputs, output management, and data tables) perform functions that affect all the other functional rule groups. As the name implies, control flow ensures that the rule groups are executed in the order required for proper operation; operator input rules control the introduction into the CLIPS fact base of various kinds of data required by the expert system; output management rules control the updating of the ONAV expert system user display screen during execution of the system; and data tables are static information utilized by many different rule sets gathered in one convenient place.

  4. Implementation of artificial intelligence rules in a data base management system

    NASA Technical Reports Server (NTRS)

    Feyock, S.

    1986-01-01

    The intelligent front end prototype was transformed into a RIM-integrated system. A RIM-based expert system was written which demonstrated the developed capability. The use of rules to provide extensibility of the intelligent front end, including the concepts of demons and rule-manipulation rules, was investigated. Innovative approaches such as syntax programming were also to be considered.

  5. Techniques and implementation of the embedded rule-based expert system using Ada

    NASA Technical Reports Server (NTRS)

    Liberman, Eugene M.; Jones, Robert E.

    1991-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language, specifically a rule-based expert system using the ART-Ada (Automated Reasoning Tool for Ada) system shell, is discussed. The NASA Lewis Research Center was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert system, a graphics user interface, and communications software, make up SMART-Ada (Systems fault Management with ART-Ada). The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  6. A Rule-Based System Implementing a Method for Translating FOL Formulas into NL Sentences

    NASA Astrophysics Data System (ADS)

    Mpagouli, Aikaterini; Hatzilygeroudis, Ioannis

    In this paper, we mainly present the implementation of a system that translates first order logic (FOL) formulas into natural language (NL) sentences. The motivation comes from an intelligent tutoring system teaching logic as a knowledge representation language, where it is used as a means for feedback to the students-users. FOL to NL conversion is achieved by using a rule-based approach, where we exploit the pattern matching capabilities of rules. So, the system consists of rule-based modules corresponding to the phases of our translation methodology. Facts are used in a lexicon providing lexical and grammatical information that helps in producing the NL sentences. The whole system is implemented in Jess, a java-implemented rule-based programming tool. Experimental results confirm the success of our choices.

  7. Transcranial infrared laser stimulation improves rule-based, but not information-integration, category learning in humans.

    PubMed

    Blanco, Nathaniel J; Saucedo, Celeste L; Gonzalez-Lima, F

    2017-03-01

    This is the first randomized, controlled study comparing the cognitive effects of transcranial laser stimulation on category learning tasks. Transcranial infrared laser stimulation is a new non-invasive form of brain stimulation that shows promise for wide-ranging experimental and neuropsychological applications. It involves using infrared laser to enhance cerebral oxygenation and energy metabolism through upregulation of the respiratory enzyme cytochrome oxidase, the primary infrared photon acceptor in cells. Previous research found that transcranial infrared laser stimulation aimed at the prefrontal cortex can improve sustained attention, short-term memory, and executive function. In this study, we directly investigated the influence of transcranial infrared laser stimulation on two neurobiologically dissociable systems of category learning: a prefrontal cortex mediated reflective system that learns categories using explicit rules, and a striatally mediated reflexive learning system that forms gradual stimulus-response associations. Participants (n=118) received either active infrared laser to the lateral prefrontal cortex or sham (placebo) stimulation, and then learned one of two category structures-a rule-based structure optimally learned by the reflective system, or an information-integration structure optimally learned by the reflexive system. We found that prefrontal rule-based learning was substantially improved following transcranial infrared laser stimulation as compared to placebo (treatment X block interaction: F(1, 298)=5.117, p=0.024), while information-integration learning did not show significant group differences (treatment X block interaction: F(1, 288)=1.633, p=0.202). These results highlight the exciting potential of transcranial infrared laser stimulation for cognitive enhancement and provide insight into the neurobiological underpinnings of category learning. Copyright © 2017 Elsevier Inc. All rights reserved.

  8. Intelligent control of a smart walker and its performance evaluation.

    PubMed

    Grondin, Simon L; Li, Qingguo

    2013-06-01

    Recent technological advances have allowed the development of force-dependent, intelligently controlled smart walkers that are able to provide users with enhanced mobility, support and gait assistance. The purpose of this study was to develop an intelligent rule-based controller for a smart walker to achieve a smooth interaction between the user and the walker. This study developed a rule-based mapping between the interaction force, measured by a load cell attached to the walker handle, and the acceleration of the walker. Ten young, healthy subjects were used to evaluate the performance of the proposed controller compared to a well-known admittance-based control system. There were no significant differences between the two control systems concerning their user experience, velocity profiles or average cost of transportation. However, the admittance-based control system required a 1.2N lower average interaction force to maintain the 1m/s target speed (p = 0.002). Metabolic data also indicated that smart walker-assisted gait could considerably reduce the metabolic demand of walking with a four-legged walker.

  9. Electron lithography STAR design guidelines. Part 2: The design of a STAR for space applications

    NASA Technical Reports Server (NTRS)

    Trotter, J. D.; Newman, W.

    1982-01-01

    The STAR design system developed by NASA enables any user with a logic diagram to design a semicustom digital MOS integrated circuit. The system comprises a library of standard logic cells and computer programs to place, route, and display designs implemented with cells from the library. Also described is the development of a radiation-hard array designed for the STAR system. The design is based on the CMOS silicon gate technology developed by SANDIA National Laboratories. The design rules used are given as well as the model parameters developed for the basic array element. Library cells of the CMOS metal gate and CMOS silicon gate technologies were simulated using SPICE, and the results are shown and compared.

  10. Verification and Validation of KBS with Neural Network Components

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Callahan, John

    1996-01-01

    Artificial Neural Networks (ANNs) play an important role in developing robust Knowledge Based Systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS that depends on a rule database and a production engine, the ANN-based system mimics the decisions of an expert without explicitly formulating the if-then type of rules. In fact, ANNs demonstrate their superiority when such if-then rules are hard for a human expert to generate. Verification of a traditional knowledge based system is based on proof of the consistency and completeness of the rule knowledge base and the correctness of the production engine. These techniques, however, cannot be directly applied to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.

  11. Cell division plane orientation based on tensile stress in Arabidopsis thaliana

    PubMed Central

    Louveaux, Marion; Julien, Jean-Daniel; Mirabet, Vincent; Boudaoud, Arezki; Hamant, Olivier

    2016-01-01

    Cell geometry has long been proposed to play a key role in the orientation of symmetric cell division planes. In particular, the recently proposed Besson–Dumais rule generalizes Errera’s rule and predicts that cells divide along one of the local minima of plane area. However, this rule has been tested only on tissues with rather local spherical shape and homogeneous growth. Here, we tested the application of the Besson–Dumais rule to the divisions occurring in the Arabidopsis shoot apex, which contains domains with anisotropic curvature and differential growth. We found that the Besson–Dumais rule works well in the central part of the apex, but fails to account for cell division planes in the saddle-shaped boundary region. Because curvature anisotropy and differential growth prescribe directional tensile stress in that region, we tested the putative contribution of anisotropic stress fields to cell division plane orientation at the shoot apex. To do so, we compared two division rules: geometrical (new plane along the shortest path) and mechanical (new plane along maximal tension). The mechanical division rule reproduced the enrichment of long planes observed in the boundary region. Experimental perturbation of mechanical stress pattern further supported a contribution of anisotropic tensile stress in division plane orientation. Importantly, simulations of tissues growing in an isotropic stress field, and dividing along maximal tension, provided division plane distributions comparable to those obtained with the geometrical rule. We thus propose that division plane orientation by tensile stress offers a general rule for symmetric cell division in plants. PMID:27436908

  12. Monte Carlo dose calculation using a cell processor based PlayStation 3 system

    NASA Astrophysics Data System (ADS)

    Chow, James C. L.; Lam, Phil; Jaffray, David A.

    2012-02-01

    This study investigates the performance of the EGSnrc computer code coupled with Cell-based hardware in Monte Carlo simulation of radiation dose in radiotherapy. Performance evaluations of two processor-intensive functions, namely HOWNEAR and RANMAR_GET in the EGSnrc code, were carried out based on the 20-80 rule (Pareto principle). The execution speeds of the two functions were measured with the profiler gprof, which records the number of executions and the total time spent in the functions. A testing architecture designed for the Cell processor was implemented in the evaluation using a PlayStation 3 (PS3) system. The evaluation results show that the algorithms examined are readily parallelizable on the Cell platform, provided that an architectural change of EGSnrc is made. However, as the EGSnrc performance was limited by the PowerPC Processing Element in the PS3, a PC coupled with graphics processing units (GPGPU) may provide a more viable avenue for acceleration.

  13. An Embedded Rule-Based Diagnostic Expert System in Ada

    NASA Technical Reports Server (NTRS)

    Jones, Robert E.; Liberman, Eugene M.

    1992-01-01

    Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada, with its portability, transportability, and maintainability, lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with the Ada programming language, especially a rule-based expert system using the ART-Ada (Automated Reasoning Tool for Ada) system shell, is discussed. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert system, a graphics user interface, and communications software, make up SMART-Ada (Systems fault Management with ART-Ada). The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and communications code was incorporated into an Ada expert system that reads the data from a power distribution test bed, applies the rules to determine whether a fault exists, and graphically displays it on the screen. The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to straight Ada code when an application requires a rule-based or knowledge-based approach.

  14. Rule based artificial intelligence expert system for determination of upper extremity impairment rating.

    PubMed

    Lim, I; Walkup, R K; Vannier, M W

    1993-04-01

    Quantitative evaluation of upper extremity impairment, a percentage rating most often determined using a rule based procedure, has been implemented on a personal computer using an artificial intelligence, rule-based expert system (AI system). In this study, the rules given in Chapter 3 of the AMA Guides to the Evaluation of Permanent Impairment (Third Edition) were used to develop such an AI system for the Apple Macintosh. The program applies the rules from the Guides in a consistent and systematic fashion. It is faster and less error-prone than the manual method, and the results have a higher degree of precision, since intermediate values are not truncated.

  15. Comparison of conventional rule based flow control with control processes based on fuzzy logic in a combined sewer system.

    PubMed

    Klepiszewski, K; Schmitt, T G

    2002-01-01

    While conventional rule-based, real-time flow control of sewer systems is in common use, control systems based on fuzzy logic have been used only rarely, but successfully. The intention of this study is to compare a conventional rule-based control of a combined sewer system with a fuzzy logic control using hydrodynamic simulation. The objective of both control strategies is to reduce the combined sewer overflow volume by optimizing the utilized storage capacities of four combined sewer overflow tanks. The control systems adjust the outflow of the four combined sewer overflow tanks depending on the water levels inside the structures. Both systems use an identical rule base. The developed control systems are tested and optimized for a single storm event which produces heterogeneous hydraulic load conditions and local discharge. Finally, the efficiencies of the two control systems are compared for two more storm events. The results indicate that the conventional rule-based control and the fuzzy control reach the objective of the control strategy equally well. In spite of the higher expense of designing the fuzzy control system, its use provides no advantage in this case.

  16. Model-based approach for design verification and co-optimization of catastrophic and parametric-related defects due to systematic manufacturing variations

    NASA Astrophysics Data System (ADS)

    Perry, Dan; Nakamoto, Mark; Verghese, Nishath; Hurat, Philippe; Rouse, Rich

    2007-03-01

    Model-based hotspot detection and silicon-aware parametric analysis help designers optimize their chips for yield, area and performance without the high cost of applying foundries' recommended design rules. This set of DFM/ recommended rules is primarily litho-driven, but cannot guarantee a manufacturable design without imposing overly restrictive design requirements. This rule-based methodology of making design decisions based on idealized polygons that no longer represent what is on silicon needs to be replaced. Using model-based simulation of the lithography, OPC, RET and etch effects, followed by electrical evaluation of the resulting shapes, leads to a more realistic and accurate analysis. This analysis can be used to evaluate intelligent design trade-offs and identify potential failures due to systematic manufacturing defects during the design phase. The successful DFM design methodology consists of three parts: 1. Achieve a more aggressive layout through limited usage of litho-related recommended design rules. A 10% to 15% area reduction is achieved by using more aggressive design rules. DFM/recommended design rules are used only if there is no impact on cell size. 2. Identify and fix hotspots using a model-based layout printability checker. Model-based litho and etch simulation are done at the cell level to identify hotspots. Violations of recommended rules may cause additional hotspots, which are then fixed. The resulting design is ready for step 3. 3. Improve timing accuracy with a process-aware parametric analysis tool for transistors and interconnect. Contours of diffusion, poly and metal layers are used for parametric analysis. In this paper, we show the results of this physical and electrical DFM methodology at Qualcomm. We describe how Qualcomm was able to develop more aggressive cell designs that yielded a 10% to 15% area reduction using this methodology. Model-based shape simulation was employed during library development to validate architecture choices and to optimize cell layout. At the physical verification stage, the shape simulator was run at full-chip level to identify and fix residual hotspots on interconnect layers, on poly or metal 1 due to interaction between adjacent cells, or on metal 1 due to interaction between routing (via and via cover) and cell geometry. To determine an appropriate electrical DFM solution, Qualcomm developed an experiment to examine various electrical effects. After reporting the silicon results of this experiment, which showed sizeable delay variations due to lithography-related systematic effects, we also explain how contours of diffusion, poly and metal can be used for silicon-aware parametric analysis of transistors and interconnect at the cell-, block- and chip-level.

  17. Rule-based topology system for spatial databases to validate complex geographic datasets

    NASA Astrophysics Data System (ADS)

    Martinez-Llario, J.; Coll, E.; Núñez-Andrés, M.; Femenia-Ribera, C.

    2017-06-01

    A rule-based topology software system providing a highly flexible and fast procedure to enforce integrity in spatial relationships among datasets is presented. This improved topology rule system is built over the spatial extension Jaspa. Both projects are open source, freely available software developed by the corresponding author of this paper. Currently, there is no spatial DBMS that implements a rule-based topology engine (considering that the topology rules are designed and performed in the spatial backend). If the topology rules are applied in the frontend (as in many GIS desktop programs), ArcGIS is the most advanced solution. The system presented in this paper has several major advantages over the ArcGIS approach: it can be extended with new topology rules, it has a much wider set of rules, and it can mix feature attributes with topology rules as filters. In addition, the topology rule system can work with various DBMSs, including PostgreSQL, H2 or Oracle, and the logic is performed in the spatial backend. The proposed topology system allows users to check the complex spatial relationships among features (from one or several spatial layers) that require some complex cartographic datasets, such as the data specifications proposed by INSPIRE in Europe and the Land Administration Domain Model (LADM) for Cadastral data.

  18. Universal rule for the symmetric division of plant cells

    PubMed Central

    Besson, Sébastien; Dumais, Jacques

    2011-01-01

    The division of eukaryotic cells involves the assembly of complex cytoskeletal structures to exert the forces required for chromosome segregation and cytokinesis. In plants, empirical evidence suggests that tensional forces within the cytoskeleton cause cells to divide along the plane that minimizes the surface area of the cell plate (Errera’s rule) while creating daughter cells of equal size. However, exceptions to Errera’s rule cast doubt on whether a broadly applicable rule can be formulated for plant cell division. Here, we show that the selection of the plane of division involves a competition between alternative configurations whose geometries represent local area minima. We find that the probability of observing a particular division configuration increases inversely with its relative area according to an exponential probability distribution known as the Gibbs measure. Moreover, a comparison across land plants and their most recent algal ancestors confirms that the probability distribution is widely conserved and independent of cell shape and size. Using a maximum entropy formulation, we show that this empirical division rule is predicted by the dynamics of the tense cytoskeletal elements that lead to the positioning of the preprophase band. Based on the fact that the division plane is selected from the sole interaction of the cytoskeleton with cell shape, we posit that the new rule represents the default mechanism for plant cell division when internal or external cues are absent. PMID:21383128
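
    One plausible reading of the Gibbs measure described above is that a candidate division plane of area A_i is selected with probability proportional to exp(-beta * A_i / mean(A)). The sketch below computes these probabilities for a toy set of candidate planes; the constant beta is illustrative, not the fitted value from the paper.

      # Hedged sketch of an exponential (Gibbs-measure) division rule:
      # smaller candidate walls are exponentially favoured, but larger local
      # minima are still selected with non-zero probability.
      import math

      def division_probabilities(areas, beta=20.0):
          """Candidate plane areas -> selection probabilities under a Gibbs measure."""
          mean_area = sum(areas) / len(areas)
          weights = [math.exp(-beta * a / mean_area) for a in areas]
          total = sum(weights)
          return [w / total for w in weights]

      # Two local area minima: the shorter wall dominates, the longer one
      # remains an observable (rarer) outcome.
      print(division_probabilities([1.0, 1.2]))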

  19. A new hybrid case-based reasoning approach for medical diagnosis systems.

    PubMed

    Sharaf-El-Deen, Dina A; Moawad, Ibrahim F; Khalifa, M E

    2014-02-01

    Case-Based Reasoning (CBR) has been applied in many different medical applications. Due to the complexity and diversity of this domain, most medical CBR systems become hybrid. Moreover, the case adaptation process in CBR is often a challenging issue, as it is traditionally carried out manually by domain experts. In this paper, a new hybrid case-based reasoning approach for medical diagnosis systems is proposed to improve the accuracy of retrieval-only CBR systems. The approach integrates case-based reasoning and rule-based reasoning, and applies the adaptation process automatically by exploiting adaptation rules. Both adaptation rules and reasoning rules are generated from the case base. After solving a new case, the case base is expanded, and both adaptation and reasoning rules are updated. To evaluate the proposed approach, a prototype was implemented and tested on the diagnosis of breast cancer and thyroid diseases. The results show that the proposed approach increases the diagnostic accuracy of retrieval-only CBR systems and provides reliable accuracy compared to current breast cancer and thyroid diagnosis systems.
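
    A minimal sketch of the retrieve-then-adapt idea, assuming a hypothetical numeric case representation (this is not the authors' implementation): the nearest stored case is retrieved, and automatically generated adaptation rules may override its solution.

      # Sketch: nearest-neighbour retrieval followed by rule-based adaptation.
      def retrieve(case_base, query, k=1):
          """Nearest-neighbour retrieval over numeric feature vectors."""
          def distance(a, b):
              return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
          return sorted(case_base, key=lambda c: distance(c["features"], query))[:k]

      def adapt(solution, query, adaptation_rules):
          """Apply adaptation rules: each rule is (condition(query), new_solution)."""
          for condition, new_solution in adaptation_rules:
              if condition(query):
                  return new_solution
          return solution

      case_base = [
          {"features": [0.2, 0.1], "diagnosis": "benign"},
          {"features": [0.9, 0.8], "diagnosis": "malignant"},
      ]
      adaptation_rules = [
          # Hypothetical rule mined from the case base: a very high second
          # feature overrides a retrieved "benign" label.
          (lambda q: q[1] > 0.95, "malignant"),
      ]
      query = [0.3, 0.97]
      best = retrieve(case_base, query)[0]
      print(adapt(best["diagnosis"], query, adaptation_rules))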

  20. A self-learning rule base for command following in dynamical systems

    NASA Technical Reports Server (NTRS)

    Tsai, Wei K.; Lee, Hon-Mun; Parlos, Alexander

    1992-01-01

    In this paper, a self-learning Rule Base for command following in dynamical systems is presented. The learning is accomplished through reinforcement learning using an associative memory called SAM. The main advantage of SAM is that it is a function approximator with explicit storage of training samples. A learning algorithm patterned after dynamic programming is proposed. Two artificially created, unstable dynamical systems are used for testing, and the Rule Base is used to generate a feedback control to improve the command-following ability of the otherwise uncontrolled systems. The numerical results are very encouraging. The controlled systems exhibit more stable behavior and a better capability to follow reference commands. The rules resulting from the reinforcement learning are explicitly stored, and they can be modified or augmented by human experts. Due to the overlapping storage scheme of SAM, the stored rules are similar to fuzzy rules.

  1. Empirical Analysis and Refinement of Expert System Knowledge Bases

    DTIC Science & Technology

    1988-08-31

    refinement. Both a simulated case generation program, and a random rule basher were developed to enhance rule refinement experimentation. *Substantial...the second fiscal year 88 objective was fully met. Rule Refinement System Simulated Rule Basher Case Generator Stored Cases Expert System Knowledge...generated until the rule is satisfied. Cases may be randomly generated for a given rule or hypothesis. Rule Basher Given that one has a correct

  2. Expert networks in CLIPS

    NASA Technical Reports Server (NTRS)

    Hruska, S. I.; Dalke, A.; Ferguson, J. J.; Lacher, R. C.

    1991-01-01

    Rule-based expert systems may be structurally and functionally mapped onto a special class of neural networks called expert networks. This mapping lends itself to adaptation of connectionist learning strategies for the expert networks. A parsing algorithm to translate C Language Integrated Production System (CLIPS) rules into a network of interconnected assertion and operation nodes has been developed. The translation of CLIPS rules to an expert network and back again is illustrated. Measures of uncertainty similar to those used in MYCIN-like systems are introduced into the CLIPS system, and techniques for combining and firing nodes in the network based on rule-firing with these certainty factors in the expert system are presented. Several learning algorithms are under study which automate the process of attaching certainty factors to rules.
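
    For readers unfamiliar with MYCIN-style certainty factors, the sketch below shows the standard scheme for combining two pieces of evidence bearing on the same assertion. The abstract does not give the exact formulas used in the CLIPS extension, so this is illustrative only.

      # MYCIN-style certainty-factor combination for one conclusion.
      def combine_cf(cf1, cf2):
          """Combine two certainty factors in [-1, 1] for the same assertion."""
          if cf1 >= 0 and cf2 >= 0:
              return cf1 + cf2 * (1 - cf1)
          if cf1 < 0 and cf2 < 0:
              return cf1 + cf2 * (1 + cf1)
          return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

      # Two rules supporting the same conclusion with CF 0.6 and 0.4
      print(combine_cf(0.6, 0.4))   # 0.76
      # Conflicting evidence
      print(combine_cf(0.6, -0.3))  # ~0.43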

  3. Agent-based modelling in synthetic biology.

    PubMed

    Gorochowski, Thomas E

    2016-11-30

    Biological systems exhibit complex behaviours that emerge at many different levels of organization. These span the regulation of gene expression within single cells to the use of quorum sensing to co-ordinate the action of entire bacterial colonies. Synthetic biology aims to make the engineering of biology easier, offering an opportunity to control natural systems and develop new synthetic systems with useful prescribed behaviours. However, in many cases, it is not understood how individual cells should be programmed to ensure the emergence of a required collective behaviour. Agent-based modelling aims to tackle this problem, offering a framework in which to simulate such systems and explore cellular design rules. In this article, I review the use of agent-based models in synthetic biology, outline the available computational tools, and provide details on recently engineered biological systems that are amenable to this approach. I further highlight the challenges facing this methodology and some of the potential future directions. © 2016 The Author(s).

  4. Learning and disrupting invariance in visual recognition with a temporal association rule

    PubMed Central

    Isik, Leyla; Leibo, Joel Z.; Poggio, Tomaso

    2012-01-01

    Learning by temporal association rules such as Foldiak's trace rule is an attractive hypothesis that explains the development of invariance in visual recognition. Consistent with these rules, several recent experiments have shown that invariance can be broken at both the psychophysical and single cell levels. We show (1) that temporal association learning provides appropriate invariance in models of object recognition inspired by the visual cortex, (2) that we can replicate the “invariance disruption” experiments using these models with a temporal association learning rule to develop and maintain invariance, and (3) that despite dramatic single cell effects, a population of cells is very robust to these disruptions. We argue that these models account for the stability of perceptual invariance despite the underlying plasticity of the system, the variability of the visual world and expected noise in the biological mechanisms. PMID:22754523
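
    A hedged sketch of a trace rule of the Foldiak type referred to above: the unit keeps a decaying trace of its recent activity, and the Hebbian weight change is gated by that trace, so temporally adjacent views of an object become associated with the same unit. The parameters and normalization step are illustrative, not taken from the paper.

      # Trace-rule sketch: temporal association learning for one output unit.
      import numpy as np

      def trace_rule_update(w, x, y_trace, y, eta=0.05, delta=0.8):
          """One step of temporal-association learning.

          w       : weight vector of the output unit
          x       : current input (e.g., feature vector of the current view)
          y_trace : trace (running average) of the unit's past activity
          y       : current activity of the unit
          """
          y_trace = (1.0 - delta) * y + delta * y_trace   # update the memory trace
          w = w + eta * y_trace * x                       # Hebbian update gated by the trace
          return w / np.linalg.norm(w), y_trace           # keep weights bounded

      rng = np.random.default_rng(0)
      w, y_trace = rng.normal(size=16), 0.0
      for view in rng.normal(size=(5, 16)):               # five temporally adjacent views
          y = float(w @ view)
          w, y_trace = trace_rule_update(w, view, y_trace, y)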

  5. Research on complex 3D tree modeling based on L-system

    NASA Astrophysics Data System (ADS)

    Gang, Chen; Bin, Chen; Yuming, Liu; Hui, Li

    2018-03-01

    L-systems, as fractal iterative systems, can simulate complex geometric patterns. Based on field observation data of trees and the knowledge of forestry experts, this paper extracted modeling constraint rules and obtained an L-system rule set. Using self-developed L-system modeling software, the L-system rule set was parsed to generate complex 3D tree models. The results show that the geometrical modeling method based on L-systems can be used to describe the morphological structure of complex trees and to generate 3D tree models.
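
    The core of any L-system is iterative string rewriting. The sketch below uses a textbook bracketed grammar rather than the forestry-derived rule set described above, purely to show the mechanism that a 3D interpreter would then turn into branch geometry.

      # Minimal L-system: rewrite the axiom in parallel for a fixed number of iterations.
      def lsystem(axiom, rules, iterations):
          s = axiom
          for _ in range(iterations):
              s = "".join(rules.get(ch, ch) for ch in s)
          return s

      # Classic branching grammar: F = grow a segment, [ ] = push/pop a branch, + - = turn
      rules = {"F": "F[+F]F[-F]F"}
      print(lsystem("F", rules, 2))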

  6. RuleMonkey: software for stochastic simulation of rule-based models

    PubMed Central

    2010-01-01

    Background The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions RuleMonkey enables the simulation of rule-based models for which the underlying reaction networks are large. It is typically faster than DYNSTOC for benchmark problems that we have examined. RuleMonkey is freely available as a stand-alone application http://public.tgen.org/rulemonkey. It is also available as a simulation engine within GetBonNie, a web-based environment for building, analyzing and sharing rule-based models. PMID:20673321
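
    For orientation, the sketch below shows a plain Gillespie direct-method loop on a tiny, pre-enumerated network (reversible A + B binding). RuleMonkey itself is network-free and samples rule events without enumerating reactions, but the core stochastic loop (compute propensities, draw an exponential waiting time, fire one event) is the same.

      # Gillespie direct method on a two-reaction toy system, species [A, B, AB].
      import random, math

      def gillespie(x, rates, stoich, t_end):
          t = 0.0
          while t < t_end:
              # propensities: binding needs one A and one B, unbinding needs one AB
              a = [rates[0] * x[0] * x[1], rates[1] * x[2]]
              a0 = sum(a)
              if a0 == 0:
                  break
              t += -math.log(random.random()) / a0     # exponential waiting time
              r = random.random() * a0                 # choose which event fires
              event = 0 if r < a[0] else 1
              x = [xi + d for xi, d in zip(x, stoich[event])]
          return x

      print(gillespie([100, 100, 0], rates=[0.01, 0.1],
                      stoich=[[-1, -1, +1], [+1, +1, -1]], t_end=10.0))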

  7. A Swarm Optimization approach for clinical knowledge mining.

    PubMed

    Christopher, J Jabez; Nehemiah, H Khanna; Kannan, A

    2015-10-01

    Rule-based classification is a typical data mining task used in several medical diagnosis and decision support systems. The rules stored in the rule base have an impact on classification efficiency. Rule sets extracted with data mining tools and techniques are optimized using heuristic or meta-heuristic approaches in order to improve the quality of the rule base. In this work, a meta-heuristic approach called Wind-driven Swarm Optimization (WSO) is used. The uniqueness of this work lies in the biological inspiration that underlies the algorithm. WSO uses Jval, a new metric, to evaluate the efficiency of a rule-based classifier. Rules are extracted from decision trees. WSO is used to obtain different permutations and combinations of rules, whereby the optimal rule set that satisfies the developer's requirements is used for predicting the test data. The performance of various extensions of decision trees, namely RIPPER, PART, FURIA and Decision Tables, is analyzed. The efficiency of WSO is also compared with traditional Particle Swarm Optimization (PSO). Experiments were carried out with six benchmark medical datasets. The traditional C4.5 algorithm yields 62.89% accuracy with 43 rules for the liver disorders dataset, whereas WSO yields 64.60% with 19 rules. For the heart disease dataset, C4.5 is 68.64% accurate with 98 rules, whereas WSO is 77.8% accurate with 34 rules. The normalized standard deviations for the accuracy of PSO and WSO are 0.5921 and 0.5846, respectively. WSO provides accurate and concise rule sets. PSO yields results similar to those of WSO, but the novelty of WSO lies in its biological motivation and its customization for rule-base optimization. The trade-off between prediction accuracy and the size of the rule base is optimized during the design and development of a rule-based clinical decision support system. The efficiency of a decision support system relies on the content of the rule base and its classification accuracy. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.

  8. Robust and irreversible development in cell society as a general consequence of intra-inter dynamics

    NASA Astrophysics Data System (ADS)

    Kaneko, Kunihiko; Furusawa, Chikara

    2000-05-01

    A dynamical systems scenario for developmental cell biology is proposed, based on numerical studies of a system with interacting units with internal dynamics and reproduction. Diversification, formation of discrete and recursive types, and rules for differentiation are found as a natural consequence of such a system. “Stem cells” that either proliferate or differentiate to different types stochastically are found to appear when intra-cellular dynamics are chaotic. Robustness of the developmental process against microscopic and macroscopic perturbations is shown to be a natural consequence of such intra-inter dynamics, while irreversibility in developmental process is discussed in terms of the gain of stability, loss of diversity and chaotic instability.

  9. System diagnostic builder: a rule-generation tool for expert systems that do intelligent data evaluation

    NASA Astrophysics Data System (ADS)

    Nieten, Joseph L.; Burke, Roger

    1993-03-01

    The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state- of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.

  10. Real time AI expert system for robotic applications

    NASA Technical Reports Server (NTRS)

    Follin, John F.

    1987-01-01

    A computer controlled multi-robot process cell to demonstrate advanced technologies for the demilitarization of obsolete chemical munitions was developed. The methods through which the vision system and other sensory inputs were used by the artificial intelligence to provide the information required to direct the robots to complete the desired task are discussed. The mechanisms that the expert system uses to solve problems (goals), the different rule data base, and the methods for adapting this control system to any device that can be controlled or programmed through a high level computer interface are discussed.

  11. Research of Litchi Diseases Diagnosis Expertsystem Based on Rbr and Cbr

    NASA Astrophysics Data System (ADS)

    Xu, Bing; Liu, Liqun

    To overcome the bottlenecks of traditional rule-based reasoning disease diagnosis systems, such as low reasoning efficiency and lack of flexibility, this work investigates the integration of case-based reasoning (CBR) and rule-based reasoning (RBR) and puts forward a litchi disease diagnosis expert system (LDDES) with an integrated reasoning method. The method uses data mining and knowledge acquisition techniques to establish the knowledge base and case library. It adopts rules to guide retrieval and matching for CBR, and uses association rules and decision tree algorithms to calculate case similarity. Experiments show that the method increases the system's flexibility and reasoning ability, and improves the accuracy of litchi disease diagnosis.

  12. Comparative analysis of expert and machine-learning methods for classification of body cavity effusions in companion animals.

    PubMed

    Hotz, Christine S; Templeton, Steven J; Christopher, Mary M

    2005-03-01

    A rule-based expert system using CLIPS programming language was created to classify body cavity effusions as transudates, modified transudates, exudates, chylous, and hemorrhagic effusions. The diagnostic accuracy of the rule-based system was compared with that produced by 2 machine-learning methods: Rosetta, a rough sets algorithm and RIPPER, a rule-induction method. Results of 508 body cavity fluid analyses (canine, feline, equine) obtained from the University of California-Davis Veterinary Medical Teaching Hospital computerized patient database were used to test CLIPS and to test and train RIPPER and Rosetta. The CLIPS system, using 17 rules, achieved an accuracy of 93.5% compared with pathologist consensus diagnoses. Rosetta accurately classified 91% of effusions by using 5,479 rules. RIPPER achieved the greatest accuracy (95.5%) using only 10 rules. When the original rules of the CLIPS application were replaced with those of RIPPER, the accuracy rates were identical. These results suggest that both rule-based expert systems and machine-learning methods hold promise for the preliminary classification of body fluids in the clinical laboratory.

  13. Implementing a Commercial Rule Base as a Medication Order Safety Net

    PubMed Central

    Reichley, Richard M.; Seaton, Terry L.; Resetar, Ervina; Micek, Scott T.; Scott, Karen L.; Fraser, Victoria J.; Dunagan, W. Claiborne; Bailey, Thomas C.

    2005-01-01

    A commercial rule base (Cerner Multum) was used to identify medication orders exceeding recommended dosage limits at five hospitals within BJC HealthCare, an integrated health care system. During initial testing, clinical pharmacists determined that there was an excessive number of nuisance and clinically insignificant alerts, with an overall alert rate of 9.2%. A method for customizing the commercial rule base was implemented to increase rule specificity for problematic rules. The system was subsequently deployed at two facilities and achieved alert rates of less than 1%. Pharmacists screened these alerts and contacted ordering physicians in 21% of cases. Physicians made therapeutic changes in response to 38% of alerts presented to them. By applying simple techniques to customize rules, commercial rule bases can be used to rapidly deploy a safety net to screen drug orders for excessive dosages, while preserving the rule architecture for later implementations of more finely tuned clinical decision support. PMID:15802481

  14. Rule-based mechanisms of learning for intelligent adaptive flight control

    NASA Technical Reports Server (NTRS)

    Handelman, David A.; Stengel, Robert F.

    1990-01-01

    How certain aspects of human learning can be used to characterize learning in intelligent adaptive control systems is investigated. Reflexive and declarative memory and learning are described. It is shown that model-based systems-theoretic adaptive control methods exhibit attributes of reflexive learning, whereas the problem-solving capabilities of knowledge-based systems of artificial intelligence are naturally suited for implementing declarative learning. Issues related to learning in knowledge-based control systems are addressed, with particular attention given to rule-based systems. A mechanism for real-time rule-based knowledge acquisition is suggested, and utilization of this mechanism within the context of failure diagnosis for fault-tolerant flight control is demonstrated.

  15. Dopaminergic neurons write and update memories with cell-type-specific rules

    PubMed Central

    Aso, Yoshinori; Rubin, Gerald M

    2016-01-01

    Associative learning is thought to involve parallel and distributed mechanisms of memory formation and storage. In Drosophila, the mushroom body (MB) is the major site of associative odor memory formation. Previously we described the anatomy of the adult MB and defined 20 types of dopaminergic neurons (DANs) that each innervate distinct MB compartments (Aso et al., 2014a, 2014b). Here we compare the properties of memories formed by optogenetic activation of individual DAN cell types. We found extensive differences in training requirements for memory formation, decay dynamics, storage capacity and flexibility to learn new associations. Even a single DAN cell type can either write or reduce an aversive memory, or write an appetitive memory, depending on when it is activated relative to odor delivery. Our results show that different learning rules are executed in seemingly parallel memory systems, providing multiple distinct circuit-based strategies to predict future events from past experiences. DOI: http://dx.doi.org/10.7554/eLife.16135.001 PMID:27441388

  16. Automatic de-identification of French clinical records: comparison of rule-based and machine-learning approaches.

    PubMed

    Grouin, Cyril; Zweigenbaum, Pierre

    2013-01-01

    In this paper, we present a comparison of two approaches to automatically de-identify medical records written in French: a rule-based system and a machine-learning based system using a conditional random fields (CRF) formalism. Both systems have been designed to process nine identifiers in a corpus of medical records in cardiology. We performed two evaluations: first, on 62 documents in cardiology, and on 10 documents in foetopathology - produced by optical character recognition (OCR) - to evaluate the robustness of our systems. We achieved a 0.843 (rule-based) and 0.883 (machine-learning) exact match overall F-measure in cardiology. While the rule-based system allowed us to achieve good results on nominative (first and last names) and numerical data (dates, phone numbers, and zip codes), the machine-learning approach performed best on more complex categories (postal addresses, hospital names, medical devices, and towns). On the foetopathology corpus, although our systems have not been designed for this corpus and despite OCR character recognition errors, we obtained promising results: a 0.681 (rule-based) and 0.638 (machine-learning) exact-match overall F-measure. This demonstrates that existing tools can be applied to process new documents of lower quality.
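
    A few of the rule-based patterns for the numeric identifier categories mentioned above (dates, phone numbers, zip codes) can be written as regular expressions. The sketch below is illustrative only and is not the authors' rule set.

      # Toy rule-based de-identification pass for French clinical text.
      import re

      RULES = [
          (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "<DATE>"),
          (re.compile(r"\b0\d(?:[ .-]?\d{2}){4}\b"), "<PHONE>"),   # French phone format
          (re.compile(r"\b\d{5}\b"), "<ZIP>"),
      ]

      def deidentify(text):
          for pattern, tag in RULES:
              text = pattern.sub(tag, text)
          return text

      print(deidentify("Patient vu le 03/05/2012, tel 01 23 45 67 89, 75013 Paris."))
      # Patient vu le <DATE>, tel <PHONE>, <ZIP> Paris.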

  17. Using an improved association rules mining optimization algorithm in web-based mobile-learning system

    NASA Astrophysics Data System (ADS)

    Huang, Yin; Chen, Jianhua; Xiong, Shaojun

    2009-07-01

    Mobile-Learning (M-learning) gives many learners the advantages of both traditional learning and E-learning. Currently, Web-based Mobile-Learning Systems have created many new ways of learning and defined new relationships between educators and learners. Association rule mining is one of the most important fields in data mining and knowledge discovery in databases. Rule explosion is a serious problem that causes great concern, as conventional mining algorithms often produce too many rules for decision makers to digest. Since a Web-based Mobile-Learning System collects vast amounts of student profile data, data mining and knowledge discovery techniques can be applied to find interesting relationships between attributes of learners, assessments, the solution strategies adopted by learners, and so on. Therefore, this paper focuses on a new data-mining algorithm that combines the advantages of genetic algorithms and simulated annealing, called ARGSA (Association rule mining based on an improved Genetic Simulated Annealing Algorithm), to mine association rules. The paper first takes advantage of a parallel genetic algorithm and a simulated annealing algorithm designed specifically for discovering association rules. Moreover, analysis and experiments show that the proposed method is superior to the Apriori algorithm in this Mobile-Learning system.
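
    Whatever search strategy is used (Apriori or the ARGSA approach above), each candidate rule X -> Y is judged by its support and confidence over the transaction data. The sketch below computes both for hypothetical learner-profile transactions.

      # Support and confidence for a candidate association rule X -> Y.
      def support(transactions, itemset):
          return sum(itemset <= t for t in transactions) / len(transactions)

      def confidence(transactions, antecedent, consequent):
          return support(transactions, antecedent | consequent) / support(transactions, antecedent)

      # Hypothetical mobile-learning transactions (attributes observed per learner)
      transactions = [
          {"video_lessons", "evening_study", "high_score"},
          {"video_lessons", "high_score"},
          {"text_lessons", "evening_study"},
          {"video_lessons", "evening_study", "high_score"},
      ]
      rule_x, rule_y = {"video_lessons"}, {"high_score"}
      print(support(transactions, rule_x | rule_y))     # 0.75
      print(confidence(transactions, rule_x, rule_y))   # 1.0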

  18. Integrating policy-based management and SLA performance monitoring

    NASA Astrophysics Data System (ADS)

    Liu, Tzong-Jye; Lin, Chin-Yi; Chang, Shu-Hsin; Yen, Meng-Tzu

    2001-10-01

    Policy-based management systems give system administrators the configuration capability to focus on the requirements of customers. A service-level agreement (SLA) performance monitoring mechanism helps system administrators verify the correctness of policies. However, it is difficult for a device to process policies directly, because policies are a management-level concept. This paper proposes a mechanism to decompose a policy into rules that can be efficiently processed by a device. Thus, the device may process the rules and collect performance statistics efficiently, and the policy-based management system may collect these statistics and report service-level agreement performance monitoring information to the system administrator. The proposed policy-based management system achieves both the policy configuration and service-level agreement performance monitoring requirements. A policy consists of a condition part and an action part. The condition part is a Boolean expression over a source host IP group, a destination host IP group, etc. The action part contains the parameters of services. We say that an address group is compact if it consists only of a range of IP addresses that can be denoted by a pair of an IP address and a corresponding IP mask. If the condition part of a policy consists only of compact address groups, we say that the policy is a rule. Since a device can efficiently process a compact address and a system administrator prefers to define a range of IP addresses, the policy-based management system has to translate policies into rules and bridge the gaps between policies and rules. The proposed policy-based management system builds the relationships between VPNs and policies, and between policies and rules. Since the system administrator wants to monitor the performance information of VPNs and policies, the proposed policy-based management system downloads the relationships among VPNs, policies and rules to the SNMP agents. The SNMP agents build the management information base (MIB) of all VPNs, policies and rules according to the relationships obtained from the management server. Thus, the proposed policy-based management system may obtain all performance monitoring information of VPNs and policies from the agents. The proposed policy-based manager achieves two goals: a) it provides a management environment in which the system administrator can configure the network while considering only policy-level requirements, and b) it lets the device simply process packets and collect the required performance information. Together, these make the proposed management system satisfy both user and device requirements.
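
    The policy-to-rule decomposition described above can be illustrated with the Python standard library: an arbitrary IP address range taken from a policy condition is split into compact groups, each expressible as a single network address and mask that a device can process. The sketch is illustrative and is not the paper's implementation.

      # Decompose a policy address range into "compact" (address, mask) rules.
      import ipaddress

      def policy_range_to_rules(first_ip, last_ip):
          first = ipaddress.IPv4Address(first_ip)
          last = ipaddress.IPv4Address(last_ip)
          # Each summarized network corresponds to one device-level rule
          return [str(net) for net in ipaddress.summarize_address_range(first, last)]

      # A range a system administrator might write in a policy condition
      print(policy_range_to_rules("192.168.1.10", "192.168.1.25"))
      # ['192.168.1.10/31', '192.168.1.12/30', '192.168.1.16/29', '192.168.1.24/31']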

  19. eFSM--a novel online neural-fuzzy semantic memory model.

    PubMed

    Tung, Whye Loon; Quek, Chai

    2010-01-01

    Fuzzy rule-based systems (FRBSs) have been successfully applied to many areas. However, traditional fuzzy systems are often manually crafted, and their rule bases that represent the acquired knowledge are static and cannot be trained to improve the modeling performance. This subsequently leads to intensive research on the autonomous construction and tuning of a fuzzy system directly from the observed training data to address the knowledge acquisition bottleneck, resulting in well-established hybrids such as neural-fuzzy systems (NFSs) and genetic fuzzy systems (GFSs). However, the complex and dynamic nature of real-world problems demands that fuzzy rule-based systems and models be able to adapt their parameters and ultimately evolve their rule bases to address the nonstationary (time-varying) characteristics of their operating environments. Recently, considerable research efforts have been directed to the study of evolving Takagi-Sugeno (T-S)-type NFSs based on the concept of incremental learning. In contrast, there are very few incremental learning Mamdani-type NFSs reported in the literature. Hence, this paper presents the evolving neural-fuzzy semantic memory (eFSM) model, a neural-fuzzy Mamdani architecture with a data-driven progressively adaptive structure (i.e., rule base) based on incremental learning. Issues related to the incremental learning of the eFSM rule base are carefully investigated, and a novel parameter learning approach is proposed for the tuning of the fuzzy set parameters in eFSM. The proposed eFSM model elicits highly interpretable semantic knowledge in the form of Mamdani-type if-then fuzzy rules from low-level numeric training data. These Mamdani fuzzy rules define the computing structure of eFSM and are incrementally learned with the arrival of each training data sample. New rules are constructed from the emergence of novel training data and obsolete fuzzy rules that no longer describe the recently observed data trends are pruned. This enables eFSM to maintain a current and compact set of Mamdani-type if-then fuzzy rules that collectively generalizes and describes the salient associative mappings between the inputs and outputs of the underlying process being modeled. The learning and modeling performances of the proposed eFSM are evaluated using several benchmark applications and the results are encouraging.

  20. Organizational Knowledge Transfer Using Ontologies and a Rule-Based System

    NASA Astrophysics Data System (ADS)

    Okabe, Masao; Yoshioka, Akiko; Kobayashi, Keido; Yamaguchi, Takahira

    In recent automated and integrated manufacturing, so-called intelligence skill is becoming more and more important and its efficient transfer to next-generation engineers is one of the urgent issues. In this paper, we propose a new approach without costly OJT (on-the-job training), that is, combinational usage of a domain ontology, a rule ontology and a rule-based system. Intelligence skill can be decomposed into pieces of simple engineering rules. A rule ontology consists of these engineering rules as primitives and the semantic relations among them. A domain ontology consists of technical terms in the engineering rules and the semantic relations among them. A rule ontology helps novices get the total picture of the intelligence skill and a domain ontology helps them understand the exact meanings of the engineering rules. A rule-based system helps domain experts externalize their tacit intelligence skill to ontologies and also helps novices internalize them. As a case study, we applied our proposal to some actual job at a remote control and maintenance office of hydroelectric power stations in Tokyo Electric Power Co., Inc. We also did an evaluation experiment for this case study and the result supports our proposal.

  1. Computationally assisted screening and design of cell-interactive peptides by a cell-based assay using peptide arrays and a fuzzy neural network algorithm.

    PubMed

    Kaga, Chiaki; Okochi, Mina; Tomita, Yasuyuki; Kato, Ryuji; Honda, Hiroyuki

    2008-03-01

    We developed a method of effective peptide screening that combines experiments and computational analysis. The method is based on the concept that screening efficiency can be enhanced from even limited data by use of a model derived from computational analysis that serves as a guide to screening and combining the model with subsequent repeated experiments. Here we focus on cell-adhesion peptides as a model application of this peptide-screening strategy. Cell-adhesion peptides were screened by use of a cell-based assay of a peptide array. Starting with the screening data obtained from a limited, random 5-mer library (643 sequences), a rule regarding structural characteristics of cell-adhesion peptides was extracted by fuzzy neural network (FNN) analysis. According to this rule, peptides with unfavored residues in certain positions that led to inefficient binding were eliminated from the random sequences. In the restricted, second random library (273 sequences), the yield of cell-adhesion peptides having an adhesion rate more than 1.5-fold to that of the basal array support was significantly high (31%) compared with the unrestricted random library (20%). In the restricted third library (50 sequences), the yield of cell-adhesion peptides increased to 84%. We conclude that a repeated cycle of experiments screening limited numbers of peptides can be assisted by the rule-extracting feature of FNN.

  2. A multilayer perceptron solution to the match phase problem in rule-based artificial intelligence systems

    NASA Technical Reports Server (NTRS)

    Sartori, Michael A.; Passino, Kevin M.; Antsaklis, Panos J.

    1992-01-01

    In rule-based AI planning, expert, and learning systems, it is often the case that the left-hand-sides of the rules must be repeatedly compared to the contents of some 'working memory'. The traditional approach to solve such a 'match phase problem' for production systems is to use the Rete Match Algorithm. Here, a new technique using a multilayer perceptron, a particular artificial neural network model, is presented to solve the match phase problem for rule-based AI systems. A syntax for premise formulas (i.e., the left-hand-sides of the rules) is defined, and working memory is specified. From this, it is shown how to construct a multilayer perceptron that finds all of the rules which can be executed for the current situation in working memory. The complexity of the constructed multilayer perceptron is derived in terms of the maximum number of nodes and the required number of layers. A method for reducing the number of layers to at most three is also presented.
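
    A minimal sketch of the construction: each premise becomes one threshold unit whose weights pick out the working-memory facts it requires, so a single forward pass reports every rule whose left-hand side is currently satisfied. The facts and rules below are hypothetical, and the full premise syntax of the paper (negation, variables, layer-reduction) is omitted.

      # One-layer threshold network that solves a toy match phase.
      import numpy as np

      facts = ["temperature_high", "pressure_low", "valve_open", "pump_on"]

      # Premises as conjunctions of required facts (hypothetical rules)
      premises = [
          {"temperature_high", "pressure_low"},   # rule 0
          {"valve_open"},                         # rule 1
          {"pressure_low", "pump_on"},            # rule 2
      ]

      W = np.array([[1.0 if f in p else 0.0 for f in facts] for p in premises])
      thresholds = np.array([len(p) for p in premises])    # all conditions must hold

      def matching_rules(working_memory):
          x = np.array([1.0 if f in working_memory else 0.0 for f in facts])
          activation = W @ x
          return [i for i, (a, th) in enumerate(zip(activation, thresholds)) if a >= th]

      print(matching_rules({"temperature_high", "pressure_low", "pump_on"}))  # [0, 2]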

  3. ARROWSMITH-P: A prototype expert system for software engineering management

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Ramsey, Connie Loggia

    1985-01-01

    Although the field of software engineering is relatively new, it can benefit from the use of expert systems. Two prototype expert systems were developed to aid in software engineering management. Given the values for certain metrics, these systems will provide interpretations which explain any abnormal patterns of these values during the development of a software project. The two systems, which solve the same problem, were built using different methods, rule-based deduction and frame-based abduction. A comparison was done to see which method was better suited to the needs of this field. It was found that both systems performed moderately well, but the rule-based deduction system using simple rules provided more complete solutions than did the frame-based abduction system.

  4. Web-based Weather Expert System (WES) for Space Shuttle Launch

    NASA Technical Reports Server (NTRS)

    Bardina, Jorge E.; Rajkumar, T.

    2003-01-01

    The Web-based Weather Expert System (WES) is a critical module of the Virtual Test Bed development to support 'go/no go' decisions for Space Shuttle operations in the Intelligent Launch and Range Operations program of NASA. The weather rules characterize certain aspects of the environment related to the launching or landing site, the time of day or night, the pad or runway conditions, the mission durations, the runway equipment and the landing type. Expert system rules are derived from weather contingency rules, which were developed over many years by NASA. Backward chaining, a goal-directed inference method, is adopted: a particular consequence or goal clause is evaluated first and then chained backward through the rules. Once a rule is satisfied, that rule is fired and the decision is expressed. The expert system continuously verifies the rules against the past hour's weather conditions and makes the corresponding decisions. The normal procedure of operations requires a formal pre-launch weather briefing held on Launch minus 1 day, which is a specific weather briefing for all areas of Space Shuttle launch operations. In this paper, the Web-based Weather Expert System of the Intelligent Launch and Range Operations program is presented.
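
    The backward-chaining control strategy described above can be sketched in a few lines: a goal is proved either because it is an observed fact or because all conditions of some rule for it can themselves be proved. The weather facts and rules below are invented for illustration and are not NASA's actual launch-commit criteria.

      # Toy backward chainer over go/no-go style rules.
      RULES = {
          # goal: list of alternative condition sets that establish it
          "launch_go": [{"winds_acceptable", "no_lightning_risk"}],
          "no_lightning_risk": [{"no_cumulus_near_pad"}],
      }

      FACTS = {"winds_acceptable", "no_cumulus_near_pad"}   # last hour's observations

      def prove(goal, facts, rules):
          """Return True if the goal can be established from facts via the rules."""
          if goal in facts:
              return True
          for conditions in rules.get(goal, []):
              # chain backward: every condition of this rule becomes a subgoal
              if all(prove(c, facts, rules) for c in conditions):
                  return True
          return False

      print(prove("launch_go", FACTS, RULES))   # True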

  5. An Expert-System Engine With Operative Probabilities

    NASA Technical Reports Server (NTRS)

    Orlando, N. E.; Palmer, M. T.; Wallace, R. S.

    1986-01-01

    Program enables proof-of-concept tests of expert systems under development. AESOP is a rule-based inference engine for expert systems, which makes decisions about a particular situation given user-supplied hypotheses, rules, and answers to questions drawn from the rules. If a knowledge base containing hypotheses and rules governing an environment is available to AESOP, almost any situation within that environment can be resolved by answering questions asked by AESOP. Questions are answered with YES, NO, MAYBE, DON'T KNOW, DON'T CARE, or with a probability factor ranging from 0 to 10. AESOP is written in Franz LISP for interactive execution.

  6. Four simple rules that are sufficient to generate the mammalian blastocyst

    PubMed Central

    Nissen, Silas Boye; Perera, Marta; Gonzalez, Javier Martin; Morgani, Sophie M.; Jensen, Mogens H.; Sneppen, Kim; Brickman, Joshua M.

    2017-01-01

    Early mammalian development is both highly regulative and self-organizing. It involves the interplay of cell position, predetermined gene regulatory networks, and environmental interactions to generate the physical arrangement of the blastocyst with precise timing. However, this process occurs in the absence of maternal information and in the presence of transcriptional stochasticity. How does the preimplantation embryo ensure robust, reproducible development in this context? It utilizes a versatile toolbox that includes complex intracellular networks coupled to cell-cell communication, segregation by differential adhesion, and apoptosis. Here, we ask whether a minimal set of developmental rules based on this toolbox is sufficient for successful blastocyst development, and to what extent these rules can explain mutant and experimental phenotypes. We implemented experimentally reported mechanisms for polarity, cell-cell signaling, adhesion, and apoptosis as a set of developmental rules in an agent-based in silico model of physically interacting cells. We find that this model quantitatively reproduces specific mutant phenotypes and provides an explanation for the emergence of heterogeneity without requiring any initial transcriptional variation. It also suggests that a fixed time point for the cells' competence of fibroblast growth factor (FGF)/extracellular signal-regulated kinase (ERK) sets an embryonic clock that enables certain scaling phenomena, a concept that we evaluate quantitatively by manipulating embryos in vitro. Based on these observations, we conclude that the minimal set of rules enables the embryo to experiment with stochastic gene expression and could provide the robustness necessary for the evolutionary diversification of the preimplantation gene regulatory network. PMID:28700688

  7. Analysis of mesenchymal stem cell differentiation in vitro using classification association rule mining.

    PubMed

    Wang, Weiqi; Wang, Yanbo Justin; Bañares-Alcántara, René; Coenen, Frans; Cui, Zhanfeng

    2009-12-01

    In this paper, data mining is used to analyze the data on the differentiation of mammalian Mesenchymal Stem Cells (MSCs), aiming at discovering known and hidden rules governing MSC differentiation, following the establishment of a web-based public database containing experimental data on the MSC proliferation and differentiation. To this effect, a web-based public interactive database comprising the key parameters which influence the fate and destiny of mammalian MSCs has been constructed and analyzed using Classification Association Rule Mining (CARM) as a data-mining technique. The results show that the proposed approach is technically feasible and performs well with respect to the accuracy of (classification) prediction. Key rules mined from the constructed MSC database are consistent with experimental observations, indicating the validity of the method developed and the first step in the application of data mining to the study of MSCs.

  8. Simulation of operating rules and discretional decisions using a fuzzy rule-based system integrated into a water resources management model

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2013-04-01

    Water resources systems are mostly operated using a set of pre-defined rules that usually reflect historical and institutional reasons rather than an optimal allocation in terms of water use or economic benefits. These operating policies are commonly reproduced as hedging rules, pack rules or zone-based operations, and simulation models can be used to test their performance under a wide range of hydrological and/or socio-economic hypotheses. Despite the high degree of acceptance and testing that these models have achieved, the actual operation of water resources systems rarely follows the pre-defined rules at all times, with consequent uncertainty about system performance. Real-world reservoir operation is very complex: it is affected by input uncertainty (imprecision in forecast inflow, seepage and evaporation losses, etc.) and filtered by the reservoir operator's experience and natural risk aversion, while the different physical and legal/institutional constraints must be respected in order to meet the different demands and system requirements. The aim of this work is to present a fuzzy logic approach to derive and assess the historical operation of a system. This framework uses a fuzzy rule-based system to reproduce pre-defined rules and also to match as closely as possible the actual decisions made by managers. Once built, the fuzzy rule-based system can be integrated into a water resources management model, making it possible to assess system performance at the basin scale. The case study of the Mijares basin (eastern Spain) is used to illustrate the method. A reservoir operating curve regulates the two main reservoir releases (operated conjunctively) with the purpose of guaranteeing a high reliability of supply to the traditional irrigation districts with higher priority (more senior demands that funded the reservoir construction). A fuzzy rule-based system has been created to reproduce the operating curve's performance, defining the system state (total water stored in the reservoirs) and the month of the year as inputs, and the demand deliveries as outputs. The developed simulation management model integrates the fuzzy rule-based operation of the two main reservoirs of the basin with the corresponding mass balance equations, the physical or boundary conditions, and the water allocation rules among the competing demands. Historical inflow time series are used as inputs to the model simulation, which is trained and validated using historical information on reservoir storage levels and flows in several streams of the Mijares river. This methodology provides a more flexible approach that is closer to real policies. The model is easy to develop and to understand due to its rule-based structure, which mimics the human way of thinking. This can improve cooperation and negotiation between managers, decision-makers and stakeholders. The approach can also be applied to analyze the historical operation of the reservoir (what we have called a reservoir operation "audit").
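
    A toy Mamdani-style sketch of the kind of rule base described above, with storage and month as inputs and the delivered fraction of demand as output; the membership functions, rules and numbers are invented for illustration and are not calibrated to the Mijares system.

      # Two-rule fuzzy sketch: deliveries depend on storage and the time of year.
      def shoulder_low(x, full, zero):
          """Membership of 'low storage': 1 below `full`, falling to 0 at `zero`."""
          if x <= full:
              return 1.0
          if x >= zero:
              return 0.0
          return (zero - x) / (zero - full)

      def delivery_fraction(storage_hm3, month):
          low = shoulder_low(storage_hm3, 40, 160)       # degree to which storage is "low"
          high = 1.0 - low                               # complementary "high" set, for simplicity
          irrigation_season = 1.0 if month in (6, 7, 8, 9) else 0.3

          # Rule 1: IF storage is high THEN deliver the full demand
          # Rule 2: IF storage is low AND it is irrigation season THEN deliver a reduced demand
          rules = [(high, 1.0), (min(low, irrigation_season), 0.6)]
          num = sum(strength * output for strength, output in rules)
          den = sum(strength for strength, _ in rules)
          return num / den if den else 0.0

      print(round(delivery_fraction(100, 7), 2))   # weighted-average defuzzification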

  9. Designing boosting ensemble of relational fuzzy systems.

    PubMed

    Scherer, Rafał

    2010-10-01

    A method frequently used in classification systems for improving classification accuracy is to combine the outputs of several classifiers. Among various types of classifiers, fuzzy ones are attractive because they use intelligible fuzzy if-then rules. In this paper, we build an AdaBoost ensemble of relational neuro-fuzzy classifiers. Relational fuzzy systems link input and output fuzzy linguistic values by a binary relation; thus, compared to traditional fuzzy systems, fuzzy rules carry additional weights: the elements of a fuzzy relation matrix. Thanks to this, the system adjusts better to the data during learning. In the paper, an ensemble of relational fuzzy systems is proposed. The problem is that such an ensemble contains separate rule bases which cannot be directly merged; as the systems are separate, fuzzy rules coming from different systems cannot be treated as rules from the same (single) system. The problem is addressed by a novel design of the fuzzy systems constituting the ensemble, resulting in normalization of the individual rule bases during learning. The method is tested on several known benchmarks and compared with other machine learning solutions from the literature.

  10. Analysis of aminoacids pattern in receptor tyrosine kinase using Boolean association rule.

    PubMed

    Kalita, Pranjal; Kumar, Brindha Senthil; Krishnaswamy, Soundararajan; Nachimuthu, Senthil Kumar

    2012-01-01

    Cancers are characterized by unrestricted cell division and independence from growth factors and other external signals. The eukaryotic parental cells of tumors, on the other hand, constitute tissues and higher structures such as organs and systems and are capable of performing various functions in a highly coordinated fashion. Hence, cancer cells may be considered entities capable of incessant growth and cell division but lacking any evolutionarily advanced intracellular or intercellular regulation. Since receptor tyrosine kinases (RTKs) are highly altered and exist in deregulated/constitutively active forms in cancer cells - achieved through various epigenetic mechanisms - we hypothesize that functional RTKs in cancer cells resemble their counterparts in more primitive species. Analysis of RTK sequences of various species and of cancer is therefore expected to test this hypothesis. Association rule mining can reveal hidden biological information. This study uses Boolean association rules to mine the occurrence patterns of glycine, arginine and alanine in RTKs of invertebrates and vertebrates and in cancer-related vertebrate RTKs, based on protein sequence information. The results reveal that vertebrate cancer RTKs resemble prokaryotic and invertebrate RTKs, showing an increasing trend in glycine and alanine and a decreasing trend in arginine composition. With respect to glycine (>=6.1), the values for vertebrates : invertebrates : prokaryotes : vertebrate cancer were 42.86 : 50.0 : 85.71 : 100%; for alanine (>=6.2) they were 10.72 : 66.67 : 85.71 : 100%; whereas for arginine (>=5.9) they were 21.43 : 16.67 : 14.29 : 0%, respectively. In conclusion, the results of this study support our hypothesis that cancer cells may resemble lower organisms, since cancer cells are functionally unresponsive to external signals and various regulatory mechanisms typically found in higher eukaryotes are largely absent.
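
    The underlying per-sequence measurement is simple amino acid composition; the sketch below computes it and applies the thresholds quoted above (one plausible reading is that the reported percentages are the fractions of sequences in each group exceeding these thresholds). The sequence shown is a toy string, not an RTK.

      # Per-sequence amino acid composition check against the quoted thresholds.
      def composition_percent(sequence, residue):
          sequence = sequence.upper()
          return 100.0 * sequence.count(residue) / len(sequence)

      seq = "GAVLRKGATRGG"   # toy sequence for illustration only
      for residue, threshold in (("G", 6.1), ("A", 6.2), ("R", 5.9)):
          pct = composition_percent(seq, residue)
          print(residue, round(pct, 1), pct >= threshold)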

  11. System Diagnostic Builder - A rule generation tool for expert systems that do intelligent data evaluation. [applied to Shuttle Mission Simulator

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph; Burke, Roger

    1993-01-01

    Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.

  12. Intrusion Detection Systems with Live Knowledge System

    DTIC Science & Technology

    2016-05-31

    Ripple -down Rule (RDR) to maintain the knowledge from human experts with knowledge base generated by the Induct RDR, which is a machine-learning based RDR...propose novel approach that uses Ripple -down Rule (RDR) to maintain the knowledge from human experts with knowledge base generated by the Induct RDR...detection model by applying Induct RDR approach. The proposed induct RDR ( Ripple Down Rules) approach allows to acquire the phishing detection

  13. Modeling formalisms in Systems Biology

    PubMed Central

    2011-01-01

    Systems Biology has taken advantage of computational tools and high-throughput experimental data to model several biological processes. These include signaling, gene regulatory, and metabolic networks. However, most of these models are specific to each kind of network. Their interconnection demands a whole-cell modeling framework for a complete understanding of cellular systems. We describe the features required by an integrated framework for modeling, analyzing and simulating biological processes, and review several modeling formalisms that have been used in Systems Biology including Boolean networks, Bayesian networks, Petri nets, process algebras, constraint-based models, differential equations, rule-based models, interacting state machines, cellular automata, and agent-based models. We compare the features provided by different formalisms, and discuss recent approaches in the integration of these formalisms, as well as possible directions for the future. PMID:22141422

  14. Automated visualization of rule-based models

    PubMed Central

    Tapia, Jose-Juan; Faeder, James R.

    2017-01-01

    Frameworks such as BioNetGen, Kappa and Simmune use “reaction rules” to specify biochemical interactions compactly, where each rule specifies a mechanism such as binding or phosphorylation and its structural requirements. Current rule-based models of signaling pathways have tens to hundreds of rules, and these numbers are expected to increase as more molecule types and pathways are added. Visual representations are critical for conveying rule-based models, but current approaches to show rules and interactions between rules scale poorly with model size. Also, inferring design motifs that emerge from biochemical interactions is an open problem, so current approaches to visualize model architecture rely on manual interpretation of the model. Here, we present three new visualization tools that constitute an automated visualization framework for rule-based models: (i) a compact rule visualization that efficiently displays each rule, (ii) the atom-rule graph that conveys regulatory interactions in the model as a bipartite network, and (iii) a tunable compression pipeline that incorporates expert knowledge and produces compact diagrams of model architecture when applied to the atom-rule graph. The compressed graphs convey network motifs and architectural features useful for understanding both small and large rule-based models, as we show by application to specific examples. Our tools also produce more readable diagrams than current approaches, as we show by comparing visualizations of 27 published models using standard graph metrics. We provide an implementation in the open source and freely available BioNetGen framework, but the underlying methods are general and can be applied to rule-based models from the Kappa and Simmune frameworks also. We expect that these tools will promote communication and analysis of rule-based models and their eventual integration into comprehensive whole-cell models. PMID:29131816

  15. Developmental engineering: a new paradigm for the design and manufacturing of cell-based products. Part II: from genes to networks: tissue engineering from the viewpoint of systems biology and network science.

    PubMed

    Lenas, Petros; Moos, Malcolm; Luyten, Frank P

    2009-12-01

    The field of tissue engineering is moving toward a new concept of "in vitro biomimetics of in vivo tissue development." In Part I of this series, we proposed a theoretical framework integrating the concepts of developmental biology with those of process design to provide the rules for the design of biomimetic processes. We named this methodology "developmental engineering" to emphasize that it is not the tissue but the process of in vitro tissue development that has to be engineered. To formulate the process design rules in a rigorous way that will allow a computational design, we should refer to mathematical methods to model the biological process taking place in vitro. Tissue functions cannot be attributed to individual molecules but rather to complex interactions between the numerous components of a cell and interactions between cells in a tissue that form a network. For tissue engineering to advance to the level of a technologically driven discipline amenable to well-established principles of process engineering, a scientifically rigorous formulation is needed of the general design rules so that the behavior of networks of genes, proteins, or cells that govern the unfolding of developmental processes could be related to the design parameters. Now that sufficient experimental data exist to construct plausible mathematical models of many biological control circuits, explicit hypotheses can be evaluated using computational approaches to facilitate process design. Recent progress in systems biology has shown that the empirical concepts of developmental biology that we used in Part I to extract the rules of biomimetic process design can be expressed in rigorous mathematical terms. This allows the accurate characterization of manufacturing processes in tissue engineering as well as the properties of the artificial tissues themselves. In addition, network science has recently shown that the behavior of biological networks strongly depends on their topology and has developed the necessary concepts and methods to describe it, allowing therefore a deeper understanding of the behavior of networks during biomimetic processes. These advances thus open the door to a transition for tissue engineering from a substantially empirical endeavor to a technology-based discipline comparable to other branches of engineering.

  16. Mammogram segmentation using maximal cell strength updation in cellular automata.

    PubMed

    Anitha, J; Peter, J Dinesh

    2015-08-01

    Breast cancer is the most frequently diagnosed type of cancer among women. Mammogram is one of the most effective tools for early detection of the breast cancer. Various computer-aided systems have been introduced to detect the breast cancer from mammogram images. In a computer-aided diagnosis system, detection and segmentation of breast masses from the background tissues is an important issue. In this paper, an automatic segmentation method is proposed to identify and segment the suspicious mass regions of mammogram using a modified transition rule named maximal cell strength updation in cellular automata (CA). In coarse-level segmentation, the proposed method performs an adaptive global thresholding based on the histogram peak analysis to obtain the rough region of interest. An automatic seed point selection is proposed using gray-level co-occurrence matrix-based sum average feature in the coarse segmented image. Finally, the method utilizes CA with the identified initial seed point and the modified transition rule to segment the mass region. The proposed approach is evaluated over the dataset of 70 mammograms with mass from mini-MIAS database. Experimental results show that the proposed approach yields promising results to segment the mass region in the mammograms with the sensitivity of 92.25% and accuracy of 93.48%.

  17. Probabilistic Cellular Automata

    PubMed Central

    Agapie, Alexandru; Giuclea, Marius

    2014-01-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case—connecting the probability of a configuration in the stationary distribution to its number of zero-one borders—the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata. PMID:24999557
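
    The sketch below is a minimal synchronous probabilistic cellular automaton of the kind described in the abstract: each cell becomes 1 with a probability that depends only on the number of ones in its three-cell neighborhood on a ring. The particular transition probabilities are illustrative; iterating the update approximates sampling the long-run (stationary) behavior discussed above.

```python
import numpy as np

rng = np.random.default_rng(0)

def step(config, p_on):
    """One synchronous update of a binary CA on a ring.

    config : 1-D array of 0/1 cell states
    p_on   : p_on[k] = probability that a cell becomes 1 when the number of
             ones among {left neighbour, itself, right neighbour} equals k
    """
    ones = np.roll(config, 1) + config + np.roll(config, -1)   # neighbourhood counts
    return (rng.random(config.size) < p_on[ones]).astype(int)

# Illustrative rule: the more ones in the neighbourhood, the likelier the cell is on.
p_on = np.array([0.05, 0.3, 0.7, 0.95])

config = rng.integers(0, 2, size=200)
for _ in range(500):
    config = step(config, p_on)
print("fraction of ones after 500 steps:", config.mean())
```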

  18. Probabilistic cellular automata.

    PubMed

    Agapie, Alexandru; Andreica, Anca; Giuclea, Marius

    2014-09-01

    Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case-connecting the probability of a configuration in the stationary distribution to its number of zero-one borders-the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.

  19. Marginal Contribution-Based Distributed Subchannel Allocation in Small Cell Networks.

    PubMed

    Shah, Shashi; Kittipiyakul, Somsak; Lim, Yuto; Tan, Yasuo

    2018-05-10

    The paper presents a game-theoretic solution for the distributed subchannel allocation problem in small cell networks (SCNs) analyzed under the physical interference model. The objective is to find a distributed solution that maximizes the welfare of the SCNs, defined as the total system capacity. Although the problem can be addressed through best-response (BR) dynamics, the existence of a steady-state solution, i.e., a pure strategy Nash equilibrium (NE), cannot be guaranteed. Potential games (PGs) ensure convergence to a pure strategy NE when players rationally play according to some specified learning rules. However, such a performance guarantee comes at the expense of complete knowledge of the SCNs. To overcome such requirements, properties of PGs are exploited for scalable implementations, where we utilize the concept of marginal contribution (MC) as a tool to design the learning rules of players' utilities and propose the marginal contribution-based best-response (MCBR) algorithm of low computational complexity for the distributed subchannel allocation problem. Finally, we validate and evaluate the proposed scheme through simulations for various performance metrics.
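
    A toy sketch of the best-response idea behind the MCBR approach is given below: each small cell in turn switches to the subchannel that maximizes its marginal contribution to the total Shannon capacity, which for a fixed cell reduces to maximizing the total capacity over its own choice. The channel gains, power, noise level, and the simplified interference model are assumptions made for illustration, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(1)
N_CELLS, N_SUBCH = 6, 3
power, noise = 1.0, 1e-3
# gain[i, j]: channel gain from transmitter j to the receiver served by cell i (illustrative)
gain = rng.uniform(0.01, 1.0, size=(N_CELLS, N_CELLS))
np.fill_diagonal(gain, rng.uniform(0.5, 1.0, size=N_CELLS))

def total_capacity(assign):
    """Sum-rate welfare: cells sharing a subchannel interfere with each other."""
    cap = 0.0
    for i in range(N_CELLS):
        interferers = [j for j in range(N_CELLS) if j != i and assign[j] == assign[i]]
        sinr = power * gain[i, i] / (noise + power * sum(gain[i, j] for j in interferers))
        cap += np.log2(1.0 + sinr)
    return cap

def mcbr(max_rounds=50):
    assign = rng.integers(0, N_SUBCH, size=N_CELLS)
    for _ in range(max_rounds):
        changed = False
        for i in range(N_CELLS):
            # Marginal contribution of cell i on channel c = welfare with i present
            # minus welfare with i removed; the second term does not depend on c,
            # so maximising MC reduces to maximising total capacity over i's choice.
            best_c = max(range(N_SUBCH),
                         key=lambda c: total_capacity(np.where(np.arange(N_CELLS) == i, c, assign)))
            if best_c != assign[i]:
                assign[i], changed = best_c, True
        if not changed:          # no cell wants to deviate: a pure-strategy equilibrium
            break
    return assign, total_capacity(assign)

assignment, welfare = mcbr()
print(assignment, round(welfare, 2))
```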

  20. Expert system shell to reason on large amounts of data

    NASA Technical Reports Server (NTRS)

    Giuffrida, Gionanni

    1994-01-01

    Current database management systems (DBMSs) do not provide a sophisticated environment for developing rule-based expert system applications. Some of the newer DBMSs come with some sort of rule mechanism; these are active and deductive database systems. However, neither is featured enough to support a full rule-based implementation. On the other hand, current expert system shells do not provide any link with external databases. That is, all the data are kept in the system working memory, which is maintained in main memory. For some applications the reduced size of the available working memory can be a constraint on development. Typically these are applications that require reasoning on huge amounts of data, which do not fit into the computer's main memory. Moreover, in some cases these data may already be available in database systems and be continuously updated while the expert system is running. This paper proposes an architecture which employs knowledge discovery techniques to reduce the amount of data to be stored in main memory; in this architecture a standard DBMS is coupled with a rule-based language. The data are stored in the DBMS. An interface between the two systems is responsible for inducing knowledge from the set of relations. Such induced knowledge is then transferred to the rule-based language's working memory.

  1. Automated rule-base creation via CLIPS-Induce

    NASA Technical Reports Server (NTRS)

    Murphy, Patrick M.

    1994-01-01

    Many CLIPS rule-bases contain one or more rule groups that perform classification. In this paper we describe CLIPS-Induce, an automated system for the creation of a CLIPS classification rule-base from a set of test cases. CLIPS-Induce consists of two components, a decision tree induction component and a CLIPS production extraction component. ID3, a popular decision tree induction algorithm, is used to induce a decision tree from the test cases. CLIPS production extraction is accomplished through a top-down traversal of the decision tree. Nodes of the tree are used to construct query rules, and branches of the tree are used to construct classification rules. The learned CLIPS productions may easily be incorporated into a large CLIPS system that performs tasks such as accessing a database or displaying information.
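
    The rule-extraction step described above can be pictured as a top-down traversal that turns each root-to-leaf path of the induced decision tree into one classification rule. The sketch below does this for a hypothetical tree encoded as nested dictionaries and prints rules in a generic IF/THEN form rather than CLIPS syntax; it illustrates the idea only and is not the CLIPS-Induce implementation.

```python
# A hypothetical induced decision tree: internal nodes test an attribute,
# branches are attribute values, leaves carry a class label.
tree = {
    "attribute": "outlook",
    "branches": {
        "sunny": {"attribute": "humidity",
                  "branches": {"high": "no", "normal": "yes"}},
        "overcast": "yes",
        "rain": {"attribute": "wind",
                 "branches": {"strong": "no", "weak": "yes"}},
    },
}

def extract_rules(node, conditions=()):
    """Top-down traversal: each root-to-leaf path becomes one classification rule."""
    if not isinstance(node, dict):                      # reached a leaf
        yield conditions, node
        return
    for value, child in node["branches"].items():
        yield from extract_rules(child, conditions + ((node["attribute"], value),))

for conds, label in extract_rules(tree):
    lhs = " and ".join(f"{a} == {v!r}" for a, v in conds)
    print(f"IF {lhs} THEN class = {label!r}")
```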

  2. An intelligent knowledge-based and customizable home care system framework with ubiquitous patient monitoring and alerting techniques.

    PubMed

    Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for determining efficiently and rapidly the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions.

  3. An Intelligent Knowledge-Based and Customizable Home Care System Framework with Ubiquitous Patient Monitoring and Alerting Techniques

    PubMed Central

    Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur

    2012-01-01

    This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for determining efficiently and rapidly the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions. PMID:23112650

  4. Rule Based Expert System for Monitoring Real Time Drug Supply in Hospital Using Radio Frequency Identification Technology

    NASA Astrophysics Data System (ADS)

    Driandanu, Galih; Surarso, Bayu; Suryono

    2018-02-01

    Radio frequency identification (RFID) has obtained increasing attention with the emergence of various applications. This study aims to examine the implementation of a rule-based expert system, supported by RFID technology, in an information system for monitoring drug supply in a hospital. The system facilitates monitoring of the real-time drug supply using data samples from the hospital pharmacy. It is able to identify and count the number of drugs and to provide warnings and reports in real time. The conclusion is that the rule-based expert system and RFID technology can facilitate monitoring of the drug supply quickly and precisely.

  5. Fuzzylot: a novel self-organising fuzzy-neural rule-based pilot system for automated vehicles.

    PubMed

    Pasquier, M; Quek, C; Toh, M

    2001-10-01

    This paper presents part of our research work concerned with the realisation of an Intelligent Vehicle and the technologies required for its routing, navigation, and control. An automated driver prototype has been developed using a self-organising fuzzy rule-based system (POPFNN-CRI(S)) to model and subsequently emulate human driving expertise. The ability of fuzzy logic to represent vague information using linguistic variables makes it a powerful tool to develop rule-based control systems when an exact working model is not available, as is the case of any vehicle-driving task. Designing a fuzzy system, however, is a complex endeavour, due to the need to define the variables and their associated fuzzy sets, and determine a suitable rule base. Many efforts have thus been devoted to automating this process, yielding the development of learning and optimisation techniques. One of them is the family of POP-FNNs, or Pseudo-Outer Product Fuzzy Neural Networks (TVR, AARS(S), AARS(NS), CRI, Yager). These generic self-organising neural networks developed at the Intelligent Systems Laboratory (ISL/NTU) are based on formal fuzzy mathematical theory and are able to objectively extract a fuzzy rule base from training data. In this application, a driving simulator has been developed that integrates a detailed model of the car dynamics, complete with engine characteristics and environmental parameters, and an OpenGL-based 3D-simulation interface coupled with a driving wheel and accelerator/brake pedals. The simulator has been used on various road scenarios to record, from a human pilot, driving data consisting of steering and speed control actions associated with road features. Specifically, the POPFNN-CRI(S) system is used to cluster the data and extract a fuzzy rule base modelling the human driving behaviour. Finally, the effectiveness of the generated rule base has been validated using the simulator in autopilot mode.

  6. Evaluating the Effectiveness of Auditing Rules for Electronic Health Record Systems

    PubMed Central

    Hedda, Monica; Malin, Bradley A.; Yan, Chao; Fabbri, Daniel

    2017-01-01

    Healthcare organizations (HCOs) often deploy rule-based auditing systems to detect insider threats to sensitive patient health information in electronic health record (EHR) systems. These rule-based systems define behavior deemed to be high-risk a priori (e.g., family member, co-worker access). While such rules seem logical, there has been little scientific investigation into the effectiveness of these auditing rules in identifying inappropriate behavior. Thus, in this paper, we introduce an approach to evaluate the effectiveness of individual high-risk rules and rank them according to their potential risk. We investigate the rate of high-risk access patterns and minimum rate of high-risk accesses that can be explained with appropriate clinical reasons in a large EHR system. An analysis of 8M accesses from one-week of data shows that specific high-risk flags occur more frequently than theoretically expected and the rate at which accesses can be explained away with five simple reasons is 16 - 43%. PMID:29854153

  7. Evaluating the Effectiveness of Auditing Rules for Electronic Health Record Systems.

    PubMed

    Hedda, Monica; Malin, Bradley A; Yan, Chao; Fabbri, Daniel

    2017-01-01

    Healthcare organizations (HCOs) often deploy rule-based auditing systems to detect insider threats to sensitive patient health information in electronic health record (EHR) systems. These rule-based systems define behavior deemed to be high-risk a priori (e.g., family member, co-worker access). While such rules seem logical, there has been little scientific investigation into the effectiveness of these auditing rules in identifying inappropriate behavior. Thus, in this paper, we introduce an approach to evaluate the effectiveness of individual high-risk rules and rank them according to their potential risk. We investigate the rate of high-risk access patterns and minimum rate of high-risk accesses that can be explained with appropriate clinical reasons in a large EHR system. An analysis of 8M accesses from one-week of data shows that specific high-risk flags occur more frequently than theoretically expected and the rate at which accesses can be explained away with five simple reasons is 16 - 43%.

  8. Engineering Cell-Cell Signaling

    PubMed Central

    Milano, Daniel F.; Natividad, Robert J.; Asthagiri, Anand R.

    2014-01-01

    Juxtacrine cell-cell signaling mediated by the direct interaction of adjoining mammalian cells is arguably the mode of cell communication that is most recalcitrant to engineering. Overcoming this challenge is crucial for progress in biomedical applications, such as tissue engineering, regenerative medicine, immune system engineering and therapeutic design. Here, we describe the significant advances that have been made in developing synthetic platforms (materials and devices) and synthetic cells (cell surface engineering and synthetic gene circuits) to modulate juxtacrine cell-cell signaling. In addition, significant progress has been made in elucidating design rules and strategies to modulate juxtacrine signaling based on quantitative, engineering analysis of the mechanical and regulatory role of juxtacrine signals in the context of other cues and physical constraints in the microenvironment. These advances in engineering juxtacrine signaling lay a strong foundation for an integrative approach to utilizing synthetic cells, advanced ‘chassis’ and predictive modeling to engineer the form and function of living tissues. PMID:23856592

  9. RB-ARD: A proof of concept rule-based abort

    NASA Technical Reports Server (NTRS)

    Smith, Richard; Marinuzzi, John

    1987-01-01

    The Abort Region Determinator (ARD) is a console program in the space shuttle mission control center. During shuttle ascent, the Flight Dynamics Officer (FDO) uses the ARD to determine the possible abort modes and make abort calls for the crew. The goal of the Rule-Based Abort Region Determinator (RB-ARD) project was to test the concept of providing an onboard ARD for the shuttle or an automated ARD for the mission control center (MCC). A proof-of-concept rule-based system was developed on a LMI Lambda computer using PICON, a knowledge-based system shell. Knowledge derived from documented flight rules and ARD operation procedures was coded in PICON rules. These rules, in conjunction with modules of conventional code, enable the RB-ARD to carry out key parts of the ARD task. Current capabilities of the RB-ARD include: continuous updating of the available abort modes, recognition of a limited number of main engine faults and recommendation of safing actions. Safing actions recommended by the RB-ARD concern the Space Shuttle Main Engine (SSME) limit shutdown system and powerdown of the SSME AC buses.

  10. A new vapor-liquid equilibrium apparatus for hydrogen fluoride containing systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jongcheon Lee; Hwayong Kim; Jong Sung Lim

    1996-12-31

    A new circulating-type apparatus has been constructed to obtain reliable equilibrium PTxy data for hydrogen fluoride (HF)-containing systems. An equilibrium cell with Pyrex windows protected by Teflon PFA sheets was used to prevent corrosion. Isothermal vapor-liquid equilibrium data for the 1,1-difluoroethane (HFC-152a) + HF system at 288.23 and 298.35 K were obtained and compared with PTx measurement results. Experimental data were correlated using the Lencka and Anderko equation of state for HF with the Wong-Sandler mixing rule as well as the van der Waals one-fluid mixing rule. The Wong-Sandler mixing rule gives better results.
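
    For orientation, commonly quoted textbook forms of the two mixing rules compared above are sketched below; the exact expressions used with the Lencka-Anderko equation of state may differ, and the constant C in the Wong-Sandler rule depends on the underlying equation of state.

```latex
% van der Waals one-fluid mixing rule (with binary interaction parameter k_ij)
a_m = \sum_i \sum_j x_i x_j a_{ij}, \qquad
a_{ij} = (1 - k_{ij})\sqrt{a_i a_j}, \qquad
b_m = \sum_i x_i b_i

% Wong-Sandler mixing rule (general form; A^E_\infty is the excess Helmholtz energy
% at infinite pressure and C is an equation-of-state-specific constant)
b_m - \frac{a_m}{RT} = \sum_i \sum_j x_i x_j \left( b - \frac{a}{RT} \right)_{ij}, \qquad
\frac{a_m}{b_m} = \sum_i x_i \frac{a_i}{b_i} + \frac{A^E_\infty}{C}
```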

  11. Expert systems for diagnostic purposes, prospected applications to the radar field

    NASA Astrophysics Data System (ADS)

    Filippi, Riccardo

    Expert systems applied to fault diagnosis, particularly electrical circuit troubleshooting, are introduced. Diagnostic systems consisting of sequences of rules of the symptom-disease type (rule based system) and systems based upon a physical and functional description of the unit subjected to fault diagnosis are treated. Application of such systems to radar equipment troubleshooting, in particular to the transmitter, is discussed.

  12. Knowledge-based reasoning in the Paladin tactical decision generation system

    NASA Technical Reports Server (NTRS)

    Chappell, Alan R.

    1993-01-01

    A real-time tactical decision generation system for air combat engagements, Paladin, has been developed. A pilot's job in air combat includes tasks that are largely symbolic. These symbolic tasks are generally performed through the application of experience and training (i.e. knowledge) gathered over years of flying a fighter aircraft. Two such tasks, situation assessment and throttle control, are identified and broken out in Paladin to be handled by specialized knowledge based systems. Knowledge pertaining to these tasks is encoded into rule-bases to provide the foundation for decisions. Paladin uses a custom built inference engine and a partitioned rule-base structure to give these symbolic results in real-time. This paper provides an overview of knowledge-based reasoning systems as a subset of rule-based systems. The knowledge used by Paladin in generating results as well as the system design for real-time execution is discussed.

  13. On the fusion of tuning parameters of fuzzy rules and neural network

    NASA Astrophysics Data System (ADS)

    Mamuda, Mamman; Sathasivam, Saratha

    2017-08-01

    Learning a fuzzy rule-based system with a neural network can lead to a precise and valuable understanding of several problems. Fuzzy logic offers a simple way to reach a definite conclusion based upon vague, ambiguous, imprecise, noisy or missing input information. Conventional learning algorithms for tuning the parameters of fuzzy rules using training input-output data usually end in a weak firing state; this weakens the fuzzy rules and makes them unreliable for a multiple-input fuzzy system. In this paper, we introduce a new learning algorithm for tuning the parameters of the fuzzy rules alongside a radial basis function neural network (RBFNN) on training input-output data based on the gradient descent method. With the new learning algorithm, the problem of weak firing under the conventional method is addressed. We illustrate the efficiency of the new learning algorithm by means of numerical examples. MATLAB R2014(a) software was used to simulate our results. The results show that the new learning method has the advantage of training the fuzzy rules without tampering with the fuzzy rule table, which allows a membership function of a rule to be used more than once in the fuzzy rule base.
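
    A minimal sketch of gradient-descent tuning of Gaussian membership (radial basis) parameters is shown below for a toy one-dimensional regression problem. The network form (a weighted sum of Gaussian firing strengths), the learning rate, and the number of rules are illustrative assumptions and do not reproduce the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy 1-D regression target
X = np.linspace(-3, 3, 200)
y = np.sin(X)

# RBF network / zero-order fuzzy system: output(x) = sum_k w_k * exp(-(x - c_k)^2 / (2 s_k^2))
K = 7
c = np.linspace(-3, 3, K)          # rule centres (Gaussian membership functions)
s = np.full(K, 0.8)                # rule widths
w = rng.normal(scale=0.1, size=K)  # consequent weights

lr = 0.05
for epoch in range(2000):
    phi = np.exp(-((X[:, None] - c) ** 2) / (2 * s ** 2))   # firing strengths, shape (N, K)
    err = phi @ w - y
    # gradients of 0.5 * mean(err^2) with respect to w, c and s
    grad_w = phi.T @ err / len(X)
    grad_c = ((err[:, None] * w * phi) * (X[:, None] - c) / s ** 2).mean(axis=0)
    grad_s = ((err[:, None] * w * phi) * (X[:, None] - c) ** 2 / s ** 3).mean(axis=0)
    w -= lr * grad_w
    c -= lr * grad_c
    s -= lr * grad_s

phi = np.exp(-((X[:, None] - c) ** 2) / (2 * s ** 2))
print("final RMSE:", np.sqrt(np.mean((phi @ w - y) ** 2)))
```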

  14. Complex-learning Induced Modifications in Synaptic Inhibition: Mechanisms and Functional Significance.

    PubMed

    Reuveni, Iris; Lin, Longnian; Barkai, Edi

    2018-06-15

    Following training in a difficult olfactory-discrimination (OD) task, rats acquire the capability to perform the task easily, with little effort. This newly acquired skill of 'learning how to learn' is termed 'rule learning'. At the single-cell level, rule learning is manifested in long-term enhancement of the intrinsic neuronal excitability of piriform cortex (PC) pyramidal neurons and in the excitatory synaptic connections between these neurons. To maintain cortical stability, such a long-lasting increase in excitability must be accompanied by a parallel increase in inhibitory processes that would prevent hyper-excitable activation. In this review we describe the cellular and molecular mechanisms underlying complex-learning-induced long-lasting modifications in GABAA-receptor- and GABAB-receptor-mediated synaptic inhibition. Subsequently we discuss how such modifications support the induction and preservation of long-term memories in the mammalian brain. Based on experimental results, computational analysis and modeling, we propose that rule learning is maintained by doubling the strength of synaptic inputs, excitatory as well as inhibitory, in a sub-group of neurons. This enhanced synaptic transmission, which occurs in all (or almost all) synaptic inputs onto these neurons, activates specific stored memories. At the molecular level, such rule-learning-relevant synaptic strengthening is mediated by doubling the conductance of synaptic channels, but not their numbers. This postsynaptic process is controlled by a whole-cell mechanism via particular second messenger systems. This whole-cell mechanism enables memory amplification when required and memory extinction when not relevant. Copyright © 2018 IBRO. Published by Elsevier Ltd. All rights reserved.

  15. Analysis of Rules for Islamic Inheritance Law in Indonesia Using Hybrid Rule Based Learning

    NASA Astrophysics Data System (ADS)

    Khosyi'ah, S.; Irfan, M.; Maylawati, D. S.; Mukhlas, O. S.

    2018-01-01

    Along with the development of human civilization in Indonesia, changes to and reform of Islamic inheritance law so as to conform to local conditions and culture cannot be denied. The distribution of inheritance in Indonesia can be done automatically by storing the rules of Islamic inheritance law in an expert system. In this study, we analyze the knowledge of experts in Islamic inheritance in Indonesia and represent it in the form of rules using rule-based Forward Chaining (FC) and Davis-Putnam-Logemann-Loveland (DPLL) algorithms. By hybridizing the FC and DPLL algorithms, the rules of Islamic inheritance law in Indonesia are clearly defined and measured. The rules were conceptually validated by experts in Islamic law and informatics. The results revealed that generally all rules were ready for use in an expert system.
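
    The forward-chaining half of the hybrid approach can be sketched as a naive loop that keeps firing rules whose antecedents are satisfied until no new fact is derived. The facts and rules below are invented placeholders, not the actual Indonesian inheritance rule base analyzed in the paper.

```python
# Rules: (set of antecedent facts, consequent fact). Purely illustrative placeholders.
rules = [
    ({"deceased_has_son"}, "children_inherit"),
    ({"children_inherit", "spouse_alive"}, "spouse_share_is_one_eighth"),
    ({"no_children", "spouse_alive"}, "spouse_share_is_one_quarter"),
]

def forward_chain(facts, rules):
    """Fire rules until no new fact can be derived (naive forward chaining)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for antecedents, consequent in rules:
            if antecedents <= facts and consequent not in facts:
                facts.add(consequent)
                changed = True
    return facts

print(forward_chain({"deceased_has_son", "spouse_alive"}, rules))
```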

  16. Autotransporter-based cell surface display in Gram-negative bacteria.

    PubMed

    Nicolay, Toon; Vanderleyden, Jos; Spaepen, Stijn

    2015-02-01

    Cell surface display of proteins can be used for several biotechnological applications such as the screening of protein libraries, whole cell biocatalysis and live vaccine development. Amongst all secretion systems and surface appendages of Gram-negative bacteria, the autotransporter secretion pathway holds great potential for surface display because of its modular structure and apparent simplicity. Autotransporters are polypeptides made up of an N-terminal signal peptide, a secreted or surface-displayed passenger domain and a membrane-anchored C-terminal translocation unit. Genetic replacement of the passenger domain allows for the surface display of heterologous passengers. An autotransporter-based surface expression module essentially consists of an application-dependent promoter system, a signal peptide, a passenger domain of interest and the autotransporter translocation unit. The passenger domain needs to be compatible with surface translocation although till now no general rules have been determined to test this compatibility. The autotransporter technology for surface display of heterologous passenger domains is critically discussed for various applications.

  17. Decision support system for triage management: A hybrid approach using rule-based reasoning and fuzzy logic.

    PubMed

    Dehghani Soufi, Mahsa; Samad-Soltani, Taha; Shams Vahdati, Samad; Rezaei-Hachesu, Peyman

    2018-06-01

    Fast and accurate patient triage for the response process is a critical first step in emergency situations. This process is often performed using a paper-based mode, which intensifies workload and difficulty, wastes time, and is at risk of human errors. This study aims to design and evaluate a decision support system (DSS) to determine the triage level. A combination of the Rule-Based Reasoning (RBR) and Fuzzy Logic Classifier (FLC) approaches was used to predict the triage level of patients according to the triage specialist's opinions and Emergency Severity Index (ESI) guidelines. RBR was applied for modeling the first to fourth decision points of the ESI algorithm. The data relating to vital signs were used as input variables and modeled using fuzzy logic. Narrative knowledge was converted to If-Then rules using XML. The extracted rules were then used to create the rule-based engine and predict the triage levels. Fourteen RBR and 27 fuzzy rules were extracted and used in the rule-based engine. The performance of the system was evaluated using three methods with real triage data. The accuracy of the clinical decision support system (CDSS) on the test data was 99.44%. The evaluation of the error rate revealed that, when using the traditional method, 13.4% of the patients were mis-triaged, which is statistically significant. The completeness of the documentation also improved from 76.72% to 98.5%. The designed system was effective in determining the triage level of patients, and it proved helpful for nurses as they made decisions and generated nursing diagnoses based on triage guidelines. The hybrid approach can reduce triage misdiagnosis in a highly accurate manner and improve triage outcomes. Copyright © 2018 Elsevier B.V. All rights reserved.
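
    A minimal sketch of the hybrid idea, crisp ESI-style decision points combined with fuzzified vital signs, is given below. The membership breakpoints, the single vital sign used, and the escalation thresholds are illustrative assumptions, not the ESI guideline values or the system's actual rule base.

```python
def tri_mf(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def fuzzy_heart_rate_risk(hr):
    """Fuzzify heart rate into risk degrees (illustrative breakpoints)."""
    return {
        "normal": tri_mf(hr, 50, 75, 100),
        "elevated": tri_mf(hr, 90, 115, 140),
        "critical": tri_mf(hr, 130, 170, 210),
    }

def triage_level(requires_immediate_lifesaving, high_risk_presentation, hr):
    # Crisp rule-based part, loosely mirroring the first ESI decision points
    if requires_immediate_lifesaving:
        return 1
    if high_risk_presentation:
        return 2
    # Fuzzy part: escalate when the vital sign leans toward "critical" or "elevated"
    risk = fuzzy_heart_rate_risk(hr)
    if risk["critical"] >= 0.5:
        return 2
    if risk["elevated"] >= 0.5:
        return 3
    return 4

print(triage_level(False, False, hr=125))   # -> 3 with these illustrative memberships
```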

  18. Evaluation of Potential Continuation Rules for Mepolizumab Treatment of Severe Eosinophilic Asthma.

    PubMed

    Gunsoy, Necdet B; Cockle, Sarah M; Yancey, Steven W; Keene, Oliver N; Bradford, Eric S; Albers, Frank C; Pavord, Ian D

    Mepolizumab significantly reduces exacerbations in patients with severe eosinophilic asthma. The early identification of patients likely to receive long-term benefit from treatment could ensure effective resource allocation. To assess potential continuation rules for mepolizumab in addition to initiation criteria defined as 2 or more exacerbations in the previous year and blood eosinophil counts of 150 cells/μL or more at initiation or 300 cells/μL or more in the previous year. This post hoc analysis included data from 2 randomized, double-blind, placebo-controlled studies (NCT01000506 and NCT01691521) of mepolizumab in patients with severe eosinophilic asthma (N = 1,192). Rules based on blood eosinophils, physician-rated response to treatment, FEV1, Asthma Control Questionnaire (ACQ-5) score, and exacerbation reduction were assessed at week 16. To assess these rules, 2 key metrics accounting for the effects observed in the placebo arm were developed. Patients not meeting continuation rules based on physician-rated response, FEV1, and the ACQ-5 score still derived long-term benefit from mepolizumab. Nearly all patients failing to reduce blood eosinophils had counts of 150 cells/μL or less at baseline. For exacerbations, assessment after 16 weeks was potentially premature for predicting future exacerbations. There was no evidence of a reliable physician-rated response, ACQ-5 score, or lung function-based continuation rule. The added value of changes in blood eosinophils at week 16 over baseline was marginal. Initiation criteria for mepolizumab treatment provide the best method for assessing patient benefit from mepolizumab treatment, and treatment continuation should be reviewed on the basis of a predefined reduction in long-term exacerbation frequency and/or oral corticosteroid dose. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  19. A rule-based expert system for chemical prioritization using effects-based chemical categories

    EPA Science Inventory

    A rule-based expert system (ES) was developed to predict chemical binding to the estrogen receptor (ER) patterned on the research approaches championed by Gilman Veith to whom this article and journal issue are dedicated. The ERES was built to be mechanistically-transparent and m...

  20. C-Language Integrated Production System, Version 5.1

    NASA Technical Reports Server (NTRS)

    Riley, Gary; Donnell, Brian; Ly, Huyen-Anh VU; Culbert, Chris; Savely, Robert T.; Mccoy, Daniel J.; Giarratano, Joseph

    1992-01-01

    CLIPS 5.1 provides a cohesive software tool for handling a wide variety of knowledge, with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming provides representation of knowledge by use of heuristics. Object-oriented programming enables modeling of complex systems as modular components. Procedural programming enables CLIPS to represent knowledge in ways similar to those allowed in such languages as C, Pascal, Ada, and LISP. Working with CLIPS 5.1, one can develop expert-system software by use of rule-based programming only, object-oriented programming only, procedural programming only, or combinations of the three.

  1. A rule-based smart automated fertilization and irrigation systems

    NASA Astrophysics Data System (ADS)

    Yousif, Musab El-Rashid; Ghafar, Khairuddin; Zahari, Rahimi; Lim, Tiong Hoo

    2018-04-01

    Smart automation in industries has become very important as it can improve the reliability and efficiency of systems. The use of smart technologies in agriculture has increased over the years to ensure and control crop production and to address food security. However, it is important to use proper irrigation systems to avoid water wastage and overfeeding of the plants. In this paper, a Smart Rule-based Automated Fertilization and Irrigation System is proposed and evaluated. We propose a rule-based decision-making algorithm to monitor and control the food supply to the plant and the soil quality. A built-in alert system is also used to update the farmer with a text message. The system is developed and evaluated using real hardware.

  2. TRICARE revision to CHAMPUS DRG-based payment system, pricing of hospital claims. Final rule.

    PubMed

    2014-05-21

    This Final rule changes TRICARE's current regulatory provision for inpatient hospital claims priced under the DRG-based payment system. Claims are currently priced by using the rates and weights that are in effect on a beneficiary's date of admission. This Final rule changes that provision to price such claims by using the rates and weights that are in effect on a beneficiary's date of discharge.

  3. Phenomenology-Based Inverse Scattering for Sensor Information Fusion

    DTIC Science & Technology

    2006-09-15

    abilities in the past. Rule-based systems and mathematics of logic implied significant similarities between the two: thoughts, words, and phrases ... all are logical statements. The situation has changed, in part due to the fact that logic-rule systems have not been sufficiently powerful to explain ... Language mechanisms of our mind include abilities to acquire a large vocabulary, rules of grammar, and to use the finite set of ...

  4. Personalization of Rule-based Web Services.

    PubMed

    Choi, Okkyung; Han, Sang Yong

    2008-04-04

    Nowadays Web users have clearly expressed their wish to receive personalized services directly. Personalization is the way to tailor services directly to the immediate requirements of the user. However, the current Web Services system does not provide any features supporting this, such as personalization of services and intelligent matchmaking. In this research, a flexible, personalized rule-based Web Services system is proposed to address these problems and to enable efficient search, discovery and construction across general Web documents and Semantic Web documents. This system performs matchmaking among service requesters', service providers' and users' preferences using a Rule-based Search Method and subsequently ranks the search results. A prototype of efficient Web Services search and construction for the suggested system has been developed based on the current work.

  5. An analytical fuzzy-based approach to L2-gain optimal control of input-affine nonlinear systems using Newton-type algorithm

    NASA Astrophysics Data System (ADS)

    Milic, Vladimir; Kasac, Josip; Novakovic, Branko

    2015-10-01

    This paper is concerned with L2-gain optimisation of input-affine nonlinear systems controlled by an analytic fuzzy logic system. Unlike conventional fuzzy-based strategies, the non-conventional analytic fuzzy control method does not require an explicit fuzzy rule base. As the first contribution of this paper, we prove, by using the Stone-Weierstrass theorem, that the proposed fuzzy system without a rule base is a universal approximator. The second contribution of this paper is an algorithm for solving a finite-horizon minimax problem for L2-gain optimisation. The proposed algorithm consists of a recursive chain rule for first- and second-order derivatives, Newton's method, the multi-step Adams method and automatic differentiation. Finally, the results of this paper are evaluated on a second-order nonlinear system.

  6. Life insurance risk assessment using a fuzzy logic expert system

    NASA Technical Reports Server (NTRS)

    Carreno, Luis A.; Steel, Roy A.

    1992-01-01

    In this paper, we present a knowledge based system that combines fuzzy processing with rule-based processing to form an improved decision aid for evaluating risk for life insurance. This application illustrates the use of FuzzyCLIPS to build a knowledge based decision support system possessing fuzzy components to improve user interactions and KBS performance. The results employing FuzzyCLIPS are compared with the results obtained from the solution of the problem using traditional numerical equations. The design of the fuzzy solution consists of a CLIPS rule-based system for some factors combined with fuzzy logic rules for others. This paper describes the problem, proposes a solution, presents the results, and provides a sample output of the software product.

  7. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system and which detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code, by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module, are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software, which was used to test the resultant code, is also discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint-based description of the required performance of the system.

  8. Debugging expert systems using a dynamically created hypertext network

    NASA Technical Reports Server (NTRS)

    Boyle, Craig D. B.; Schuette, John F.

    1991-01-01

    The labor intensive nature of expert system writing and debugging motivated this study. The hypothesis is that a hypertext based debugging tool is easier and faster than one traditional tool, the graphical execution trace. HESDE (Hypertext Expert System Debugging Environment) uses Hypertext nodes and links to represent the objects and their relationships created during the execution of a rule based expert system. HESDE operates transparently on top of the CLIPS (C Language Integrated Production System) rule based system environment and is used during the knowledge base debugging process. During the execution process HESDE builds an execution trace. Use of facts, rules, and their values are automatically stored in a Hypertext network for each execution cycle. After the execution process, the knowledge engineer may access the Hypertext network and browse the network created. The network may be viewed in terms of rules, facts, and values. An experiment was conducted to compare HESDE with a graphical debugging environment. Subjects were given representative tasks. For speed and accuracy, in eight of the eleven tasks given to subjects, HESDE was significantly better.

  9. Exact Hybrid Particle/Population Simulation of Rule-Based Models of Biochemical Systems

    PubMed Central

    Stover, Lori J.; Nair, Niketh S.; Faeder, James R.

    2014-01-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This “network-free” approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of “partial network expansion” into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach and a monetary cost analysis provides a practical measure of its utility. PMID:24699269
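
    The abstract notes that an expanded reaction network can be simulated with ordinary differential equations or with Gillespie's algorithm. The sketch below is a minimal direct-method Gillespie simulation of a toy binding/unbinding network, included only to illustrate that network-based stochastic option; it is unrelated to the BioNetGen/NFsim machinery itself, and the rate constants are arbitrary.

```python
import random

# Toy network: A + B -> AB (rate k1), AB -> A + B (rate k2)
k1, k2 = 0.01, 0.1
state = {"A": 100, "B": 80, "AB": 0}

def propensities(s):
    return [k1 * s["A"] * s["B"],    # binding
            k2 * s["AB"]]            # unbinding

def fire(s, index):
    if index == 0:
        s["A"] -= 1; s["B"] -= 1; s["AB"] += 1
    else:
        s["A"] += 1; s["B"] += 1; s["AB"] -= 1

random.seed(0)
t, t_end = 0.0, 10.0
while t < t_end:
    a = propensities(state)
    a0 = sum(a)
    if a0 == 0:
        break
    t += random.expovariate(a0)              # exponentially distributed time to next reaction
    r, cum, j = random.random() * a0, 0.0, 0
    while cum + a[j] < r:                    # choose which reaction fires, proportional to propensity
        cum += a[j]; j += 1
    fire(state, j)

print(round(t, 3), state)
```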

  10. Exact hybrid particle/population simulation of rule-based models of biochemical systems.

    PubMed

    Hogg, Justin S; Harris, Leonard A; Stover, Lori J; Nair, Niketh S; Faeder, James R

    2014-04-01

    Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This "network-free" approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of "partial network expansion" into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach and a monetary cost analysis provides a practical measure of its utility.

  11. A 3-D model of tumor progression based on complex automata driven by particle dynamics.

    PubMed

    Wcisło, Rafał; Dzwinel, Witold; Yuen, David A; Dudek, Arkadiusz Z

    2009-12-01

    The dynamics of a growing tumor involving mechanical remodeling of healthy tissue and vasculature is neglected in most existing tumor models. This is due to the lack of an efficient computational framework allowing for the simulation of mechanical interactions. Meanwhile, it is precisely these interactions that trigger critical changes in tumor growth dynamics and are responsible for its volumetric and directional progression. We describe here a novel 3-D model of tumor growth, which combines particle dynamics with the cellular automata concept. The particles represent both tissue cells and fragments of the vascular network. They interact with their closest neighbors via semi-harmonic central forces simulating the mechanical resistance of the cell walls. The particle dynamics is governed by both the Newtonian laws of motion and the cellular automata rules. These rules can represent the cell life-cycle and other biological interactions involving smaller spatio-temporal scales. We show that our complex-automata, particle-based model can reproduce realistic 3-D dynamics of the entire system consisting of the tumor, normal tissue cells, blood vessels and blood flow. It can explain phenomena such as the inward cell motion in an avascular tumor, stabilization of tumor growth by external pressure, tumor vascularization due to the process of angiogenesis, trapping of healthy cells by the invading tumor, and the influence of external (boundary) conditions on the direction of tumor progression. We conclude that the particle model can serve as a general framework for designing advanced multiscale models of tumor dynamics and is very competitive with the modeling approaches presented before.

  12. Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing

    NASA Technical Reports Server (NTRS)

    Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau

    2005-01-01

    The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and the deployment of the product. We also present ongoing work with other condition monitoring agents.

  13. Proving Properties of Rule-Based Systems

    DTIC Science & Technology

    1990-12-01

    in these systems and enable us to use them with more confidence. Each system of rules is encoded as a set of axioms that define the system theory. The ... operation of the rule language and information about the subject domain are also described in the system theory. Validation tasks, such as ... the validity of the conjecture in the system theory, we have carried out the corresponding validation task. If the proof is restricted to be

  14. Algorithm Optimally Orders Forward-Chaining Inference Rules

    NASA Technical Reports Server (NTRS)

    James, Mark

    2008-01-01

    People typically develop knowledge bases in a somewhat ad hoc manner by incrementally adding rules with no specific organization. This often results in very inefficient execution of those rules, since they are so often order-sensitive. This is relevant to tasks like the Deep Space Network in that it allows the knowledge base to be incrementally developed and then automatically ordered for efficiency. Although data flow analysis was first developed for use in compilers for producing optimal code sequences, its usefulness is now recognized in many software systems, including knowledge-based systems. However, this approach for exhaustively computing data-flow information cannot directly be applied to inference systems because of the ubiquitous execution of the rules. An algorithm is presented that efficiently performs a complete producer/consumer analysis for each antecedent and consequence clause in a knowledge base to optimally order the rules to minimize inference cycles. An algorithm was developed that optimally orders a knowledge base composed of forward-chaining inference rules such that independent inference cycle executions are minimized, thus resulting in significantly faster execution. This algorithm was integrated into the JPL tool Spacecraft Health Inference Engine (SHINE) for verification, and it resulted in a significant reduction in inference cycles for what was previously considered an ordered knowledge base. For a knowledge base that is completely unordered, the improvement is much greater.
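
    The producer/consumer analysis described above can be pictured as building a dependency graph in which a rule that produces a fact is ordered before any rule that consumes it, and then ordering the rules topologically. The sketch below does this for a few hypothetical rules; it illustrates the idea only and is not the SHINE algorithm.

```python
from graphlib import TopologicalSorter   # Python 3.9+

# Hypothetical forward-chaining rules: name -> (consumed facts, produced facts)
rules = {
    "r_alarm":  ({"pressure_high", "valve_closed"}, {"alarm"}),
    "r_valve":  ({"sensor_v1"},                     {"valve_closed"}),
    "r_press":  ({"sensor_p1"},                     {"pressure_high"}),
    "r_notify": ({"alarm"},                         {"operator_paged"}),
}

def order_rules(rules):
    """Topologically order rules so producers come before their consumers,
    which tends to minimise the number of inference cycles needed."""
    deps = {name: set() for name in rules}
    for a, (_, produced_a) in rules.items():
        for b, (consumed_b, _) in rules.items():
            if a != b and produced_a & consumed_b:
                deps[b].add(a)            # b depends on a: a must be ordered first
    return list(TopologicalSorter(deps).static_order())

print(order_rules(rules))   # e.g. ['r_valve', 'r_press', 'r_alarm', 'r_notify']
```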

  15. CT Image Sequence Analysis for Object Recognition - A Rule-Based 3-D Computer Vision System

    Treesearch

    Dongping Zhu; Richard W. Conners; Daniel L. Schmoldt; Philip A. Araman

    1991-01-01

    Research is now underway to create a vision system for hardwood log inspection using a knowledge-based approach. In this paper, we present a rule-based, 3-D vision system for locating and identifying wood defects using topological, geometric, and statistical attributes. A number of different features can be derived from the 3-D input scenes. These features and evidence...

  16. A discrete cell model with adaptive signalling for aggregation of Dictyostelium discoideum.

    PubMed Central

    Dallon, J C; Othmer, H G

    1997-01-01

    Dictyostelium discoideum (Dd) is a widely studied model system from which fundamental insights into cell movement, chemotaxis, aggregation and pattern formation can be gained. In this system aggregation results from the chemotactic response by dispersed amoebae to a travelling wave of the chemoattractant cAMP. We have developed a model in which the cells are treated as discrete points in a continuum field of the chemoattractant, and transduction of the extracellular cAMP signal into the intracellular signal is based on the G protein model developed by Tang & Othmer. The model reproduces a number of experimental observations and gives further insight into the aggregation process. We investigate different rules for cell movement, the factors that influence stream formation, the effect on aggregation of noise in the choice of the direction of movement, and when spiral waves of chemoattractant and cell density are likely to occur. Our results give new insight into the origin of spiral waves and suggest that streaming is due to a finite amplitude instability. PMID:9134569

  17. Rule Extraction Based on Extreme Learning Machine and an Improved Ant-Miner Algorithm for Transient Stability Assessment.

    PubMed

    Li, Yang; Li, Guoqing; Wang, Zhenhao

    2015-01-01

    In order to overcome the problems of poor understandability of the pattern recognition-based transient stability assessment (PRTSA) methods, a new rule extraction method based on extreme learning machine (ELM) and an improved Ant-miner (IAM) algorithm is presented in this paper. First, the basic principles of ELM and Ant-miner algorithm are respectively introduced. Then, based on the selected optimal feature subset, an example sample set is generated by the trained ELM-based PRTSA model. And finally, a set of classification rules are obtained by IAM algorithm to replace the original ELM network. The novelty of this proposal is that transient stability rules are extracted from an example sample set generated by the trained ELM-based transient stability assessment model by using IAM algorithm. The effectiveness of the proposed method is shown by the application results on the New England 39-bus power system and a practical power system--the southern power system of Hebei province.

  18. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
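
    In the spirit of the trace analysis performed by RuleR and LogScope, the sketch below monitors a log for the property "every dispatched command eventually completes" and reports violations. The event format and the property are illustrative assumptions and do not use the RuleR or LogScope specification syntax.

```python
def monitor(trace):
    """Check the property: every ('dispatch', cmd) is eventually followed by
    ('complete', cmd). Returns the set of commands that never completed."""
    pending = set()
    for event, cmd in trace:
        if event == "dispatch":
            pending.add(cmd)
        elif event == "complete":
            pending.discard(cmd)
    return pending

trace = [("dispatch", "TAKE_IMAGE"),
         ("dispatch", "DOWNLINK"),
         ("complete", "TAKE_IMAGE")]

print("unfinished commands:", monitor(trace))   # -> {'DOWNLINK'}
```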

  19. Conceptual model of knowledge base system

    NASA Astrophysics Data System (ADS)

    Naykhanova, L. V.; Naykhanova, I. V.

    2018-05-01

    In this article, a conceptual model of a knowledge-based system of the production-system type is provided. The production system is intended for the automation of problems whose solution is rigidly conditioned by legislation. A core component of the system is the knowledge base, which consists of a set of facts, a set of rules, a cognitive map and an ontology. The cognitive map is developed to implement the control strategy, and the ontology serves as the explanation mechanism. Representing knowledge about the recognition of a situation in the form of rules makes it possible to describe knowledge of the pension legislation. This approach provides flexibility, originality and scalability of the system. If the legislation changes, it is only necessary to change the rule set, which means that a change in the legislation would not be a big problem. The main advantage of the system is that it can easily be adapted to changes in the legislation.

  20. ML-Space: Hybrid Spatial Gillespie and Particle Simulation of Multi-Level Rule-Based Models in Cell Biology.

    PubMed

    Bittig, Arne T; Uhrmacher, Adelinde M

    2017-01-01

    Spatio-temporal dynamics of cellular processes can be simulated at different levels of detail, from (deterministic) partial differential equations via the spatial stochastic simulation algorithm to tracking Brownian trajectories of individual particles. We present a spatial simulation approach for multi-level rule-based models, which includes dynamically and hierarchically nested cellular compartments and entities. Our approach, ML-Space, combines discrete compartmental dynamics, stochastic spatial approaches in discrete space, and particles moving in continuous space. The rule-based specification language of ML-Space supports concise and compact descriptions of models and makes it easy to adapt the spatial resolution of models.

  1. Grid occupancy estimation for environment perception based on belief functions and PCR6

    NASA Astrophysics Data System (ADS)

    Moras, Julien; Dezert, Jean; Pannetier, Benjamin

    2015-05-01

    In this contribution, we propose to improve the grid map occupancy estimation method developed so far, based on belief function modeling and the classical Dempster's rule of combination. A grid map offers a useful representation of the perceived world for mobile robotics navigation. It will play a major role for the security (obstacle avoidance) of next generations of terrestrial vehicles, as well as for future autonomous navigation systems. In a grid map, the occupancy of each cell representing a small piece of the surrounding area of the robot must first be estimated from sensor measurements (typically LIDAR, or camera), and then it must also be classified into different classes in order to get a complete and precise perception of the dynamic environment where the robot moves. So far, the estimation and the grid map updating have been done using fusion techniques based on the probabilistic framework, or on the classical belief function framework with an inverse model of the sensors, mainly because the latter offers an interesting management of uncertainties when the quality of available information is low and when the sources of information appear to be conflicting. To improve the performance of the grid map estimation, we propose in this paper to replace Dempster's rule of combination by the PCR6 rule (Proportional Conflict Redistribution rule #6) proposed in DSmT (Dezert-Smarandache Theory). As an illustrating scenario, we consider a platform moving in a dynamic area, and we compare our new realistic simulation results (based on a LIDAR sensor) with those obtained by the probabilistic and the classical belief-based approaches.
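
    The difference between the two combination rules can be illustrated for a single grid cell with the frame {occupied, empty}. The Python sketch below handles only this two-source, two-hypothesis special case, in which PCR6 coincides with PCR5; the general PCR6 rule covers arbitrary frames and numbers of sources, and the mass values here are made up.

    ```python
    # Dempster's rule vs. PCR6 for one grid cell, two sources, and the frame
    # {O (occupied), E (empty)} with ignorance mass on Theta = {O, E}.
    # For two sources PCR6 coincides with PCR5; the general rule handles
    # arbitrary frames and numbers of sources.

    def combine(m1, m2):
        # Unnormalized conjunctive combination.
        conj = {
            "O": m1["O"]*m2["O"] + m1["O"]*m2["Theta"] + m1["Theta"]*m2["O"],
            "E": m1["E"]*m2["E"] + m1["E"]*m2["Theta"] + m1["Theta"]*m2["E"],
            "Theta": m1["Theta"]*m2["Theta"],
        }
        conflict = m1["O"]*m2["E"] + m1["E"]*m2["O"]

        # Dempster: normalize by the non-conflicting mass.
        dempster = {k: v / (1.0 - conflict) for k, v in conj.items()}

        # PCR6: redistribute each partial conflict back to the two hypotheses
        # involved, proportionally to the masses that generated it.
        pcr6 = dict(conj)
        for a, b in (("O", "E"), ("E", "O")):
            partial = m1[a] * m2[b]
            if partial > 0:
                s = m1[a] + m2[b]
                pcr6[a] += m1[a]**2 * m2[b] / s
                pcr6[b] += m2[b]**2 * m1[a] / s
        return dempster, pcr6

    m_lidar = {"O": 0.7, "E": 0.1, "Theta": 0.2}   # inverse sensor model output
    m_prior = {"O": 0.2, "E": 0.6, "Theta": 0.2}   # previous grid estimate
    print(combine(m_lidar, m_prior))
    ```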

  2. XBONE: a hybrid expert system for supporting diagnosis of bone diseases.

    PubMed

    Hatzilygeroudis, I; Vassilakos, P J; Tsakalidis, A

    1997-01-01

    In this paper, XBONE, a hybrid medical expert system that supports the diagnosis of bone diseases, is presented. Diagnosis is based on various patient data and is performed in two stages. In the early stage, diagnosis is based on demographic and clinical data of the patient, whereas in the late stage it is mainly based on nuclear medicine image data. Knowledge is represented via an integrated formalism that combines production rules and the Adaline artificial neural unit. Each condition of a rule is assigned a number, called its significance factor, representing its significance in drawing the conclusion of the rule. This results in better representation, reduces the size of the knowledge base, and gives the system learning capabilities.
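
    A rule whose conditions carry significance factors can be read as an Adaline-like unit that compares a weighted sum against a threshold. The Python sketch below illustrates this reading; the condition names, weights, and threshold are invented and are not taken from XBONE's knowledge base.

    ```python
    # Sketch of a rule whose conditions carry significance factors and whose
    # conclusion is drawn by an Adaline-like weighted-sum unit. The condition
    # names, weights and threshold are illustrative only.

    def fire_rule(findings, significance, threshold):
        """findings: condition -> True/False; significance: condition -> weight."""
        activation = sum(significance[c] * (1 if findings[c] else -1)
                         for c in significance)
        return activation >= threshold

    significance = {"focal_uptake": 0.9, "elevated_alk_phos": 0.5, "bone_pain": 0.3}
    findings = {"focal_uptake": True, "elevated_alk_phos": True, "bone_pain": False}
    print(fire_rule(findings, significance, threshold=0.8))   # True (0.9 + 0.5 - 0.3 = 1.1)
    ```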

  3. Agent-Based Modeling of Cancer Stem Cell Driven Solid Tumor Growth.

    PubMed

    Poleszczuk, Jan; Macklin, Paul; Enderling, Heiko

    2016-01-01

    Computational modeling of tumor growth has become an invaluable tool to simulate complex cell-cell interactions and emerging population-level dynamics. Agent-based models are commonly used to describe the behavior and interaction of individual cells in different environments. Behavioral rules can be informed and calibrated by in vitro assays, and emerging population-level dynamics may be validated with both in vitro and in vivo experiments. Here, we describe the design and implementation of a lattice-based agent-based model of cancer stem cell driven tumor growth.
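
    A minimal sketch of such a lattice-based model is given below in Python: stem cells divide indefinitely and occasionally produce new stem cells, while progenitor cells exhaust a division budget. All parameter values and update rules are illustrative assumptions, not the calibrated model described in the chapter.

    ```python
    # Minimal sketch of a lattice-based agent-based model of cancer stem cell
    # (CSC) driven growth. Parameters and update rules are illustrative.
    import random

    N = 50                      # lattice size
    CSC, PROG = 1, 2            # cell types
    grid = {(N // 2, N // 2): (CSC, 0)}   # position -> (type, divisions so far)

    P_DIV, P_SYMMETRIC, MAX_PROG_DIV = 0.3, 0.1, 5

    def free_neighbors(x, y):
        nbrs = [(x + dx, y + dy) for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))]
        return [p for p in nbrs if 0 <= p[0] < N and 0 <= p[1] < N and p not in grid]

    for step in range(200):
        for (x, y), (kind, divs) in list(grid.items()):
            if random.random() > P_DIV:
                continue
            spots = free_neighbors(x, y)
            if not spots:                       # contact inhibition: no free site
                continue
            target = random.choice(spots)
            if kind == CSC:
                # symmetric division creates a new CSC, otherwise a progenitor
                daughter = CSC if random.random() < P_SYMMETRIC else PROG
                grid[target] = (daughter, 0)
            elif divs < MAX_PROG_DIV:           # progenitors have a division budget
                grid[(x, y)] = (PROG, divs + 1)
                grid[target] = (PROG, divs + 1)

    print(len(grid), "cells after 200 steps")
    ```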

  4. Reconstruction of DNA sequences using genetic algorithms and cellular automata: towards mutation prediction?

    PubMed

    Mizas, Ch; Sirakoulis, G Ch; Mardiris, V; Karafyllidis, I; Glykos, N; Sandaltzopoulos, R

    2008-04-01

    Change of DNA sequence that fuels evolution is, to a certain extent, a deterministic process because mutagenesis does not occur in an absolutely random manner. So far, it has not been possible to decipher the rules that govern DNA sequence evolution due to the extreme complexity of the entire process. In our attempt to approach this issue we focus solely on the mechanisms of mutagenesis and deliberately disregard the role of natural selection. Hence, in this analysis, evolution refers to the accumulation of genetic alterations that originate from mutations and are transmitted through generations without being subjected to natural selection. We have developed a software tool that allows modelling of a DNA sequence as a one-dimensional cellular automaton (CA) with four states per cell which correspond to the four DNA bases, i.e. A, C, T and G. The four states are represented by numbers of the quaternary number system. Moreover, we have developed genetic algorithms (GAs) in order to determine the rules of CA evolution that simulate the DNA evolution process. Linear evolution rules were considered and square matrices were used to represent them. If DNA sequences of different evolution steps are available, our approach allows the determination of the underlying evolution rule(s). Conversely, once the evolution rules are deciphered, our tool may reconstruct the DNA sequence in any previous evolution step for which the exact sequence information was unknown. The developed tool may be used to test various parameters that could influence evolution. We describe a paradigm relying on the assumption that mutagenesis is governed by a near-neighbour-dependent mechanism. Based on the satisfactory performance of our system in the deliberately simplified example, we propose that our approach could offer a starting point for future attempts to understand the mechanisms that govern evolution. The developed software is open-source and has a user-friendly graphical input interface.
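
    The sketch below shows a DNA sequence evolving as a one-dimensional, 4-state cellular automaton under a near-neighbour-dependent linear rule modulo 4 (Python). The coefficients are arbitrary placeholders; the tool described above represents linear rules as square matrices and searches for them with genetic algorithms.

    ```python
    # Sketch of a 1D, 4-state cellular automaton over DNA bases evolving under
    # a near-neighbour-dependent linear rule modulo 4. The coefficients a, b, c
    # are arbitrary; the paper's tool searches for the actual rule with a GA.
    BASE_TO_NUM = {"A": 0, "C": 1, "T": 2, "G": 3}
    NUM_TO_BASE = "ACTG"

    def ca_step(seq, a=1, b=2, c=1):
        s = [BASE_TO_NUM[ch] for ch in seq]
        out = []
        for i in range(len(s)):
            left = s[i - 1] if i > 0 else 0            # fixed boundary
            right = s[i + 1] if i < len(s) - 1 else 0
            out.append((a * left + b * s[i] + c * right) % 4)
        return "".join(NUM_TO_BASE[v] for v in out)

    seq = "ACGTAC"
    for _ in range(3):
        seq = ca_step(seq)
        print(seq)
    ```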

  5. Modern architectures for intelligent systems: reusable ontologies and problem-solving methods.

    PubMed Central

    Musen, M. A.

    1998-01-01

    When interest in intelligent systems for clinical medicine soared in the 1970s, workers in medical informatics became particularly attracted to rule-based systems. Although many successful rule-based applications were constructed, development and maintenance of large rule bases remained quite problematic. In the 1980s, an entire industry dedicated to the marketing of tools for creating rule-based systems rose and fell, as workers in medical informatics began to appreciate deeply why knowledge acquisition and maintenance for such systems are difficult problems. During this time period, investigators began to explore alternative programming abstractions that could be used to develop intelligent systems. The notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) domain-independent problem-solving methods (standard algorithms for automating stereotypical tasks) and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper will highlight how intelligent systems for diverse tasks can be efficiently automated using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community. PMID:9929181

  6. Modern architectures for intelligent systems: reusable ontologies and problem-solving methods.

    PubMed

    Musen, M A

    1998-01-01

    When interest in intelligent systems for clinical medicine soared in the 1970s, workers in medical informatics became particularly attracted to rule-based systems. Although many successful rule-based applications were constructed, development and maintenance of large rule bases remained quite problematic. In the 1980s, an entire industry dedicated to the marketing of tools for creating rule-based systems rose and fell, as workers in medical informatics began to appreciate deeply why knowledge acquisition and maintenance for such systems are difficult problems. During this time period, investigators began to explore alternative programming abstractions that could be used to develop intelligent systems. The notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) domain-independent problem-solving methods (standard algorithms for automating stereotypical tasks) and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper will highlight how intelligent systems for diverse tasks can be efficiently automated using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community.

  7. Genetic reinforcement learning through symbiotic evolution for fuzzy controller design.

    PubMed

    Juang, C F; Lin, J Y; Lin, C T

    2000-01-01

    An efficient genetic reinforcement learning algorithm for designing fuzzy controllers is proposed in this paper. The genetic algorithm (GA) adopted in this paper is based upon symbiotic evolution which, when applied to fuzzy controller design, complements the local mapping property of a fuzzy rule. Using this Symbiotic-Evolution-based Fuzzy Controller (SEFC) design method, the number of control trials, as well as consumed CPU time, are considerably reduced when compared to traditional GA-based fuzzy controller design methods and other types of genetic reinforcement learning schemes. Moreover, unlike traditional fuzzy controllers, which partition the input space into a grid, SEFC partitions the input space in a flexible way, thus creating fewer fuzzy rules. In SEFC, different types of fuzzy rules whose consequent parts are singletons, fuzzy sets, or linear equations (TSK-type fuzzy rules) are allowed. Further, the free parameters (e.g., centers and widths of membership functions) and fuzzy rules are all tuned automatically. Especially for TSK-type fuzzy rules, when the proposed learning algorithm is used, only the significant input variables are selected to participate in the consequent of a rule. The proposed SEFC design method has been applied to different simulated control problems, including the cart-pole balancing system, a magnetic levitation system, and a water bath temperature control system. The proposed SEFC has been verified to be efficient and superior on these control problems and in comparisons with some traditional GA-based fuzzy systems.

  8. Local behavioral rules sustain the cell allocation pattern in the combs of honey bee colonies (Apis mellifera).

    PubMed

    Montovan, Kathryn J; Karst, Nathaniel; Jones, Laura E; Seeley, Thomas D

    2013-11-07

    In the beeswax combs of honey bees, the cells of brood, pollen, and honey have a consistent spatial pattern that is sustained throughout the life of a colony. This spatial pattern is believed to emerge from simple behavioral rules that specify how the queen moves, where foragers deposit honey/pollen and how honey/pollen is consumed from cells. Prior work has shown that a set of such rules can explain the formation of the allocation pattern starting from an empty comb. We show that these rules cannot maintain the pattern once the brood start to vacate their cells, and we propose new, biologically realistic rules that better sustain the observed allocation pattern. We analyze the three resulting models by performing hundreds of simulation runs over many gestational periods and a wide range of parameter values. We develop new metrics for pattern assessment and employ them in analyzing pattern retention over each simulation run. Applied to our simulation results, these metrics show that alteration of an accepted model for honey/pollen consumption based on local information can stabilize the cell allocation pattern over time. We also show that adding global information, by biasing the queen's movements towards the center of the comb, expands the parameter regime over which pattern retention occurs. © 2013 Published by Elsevier Ltd. All rights reserved.

  9. 48 CFR 6103.306 - Decisions [Rule 306].

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    48 CFR 6103.306, Federal Acquisition Regulations System, Civilian Board of Contract Appeals, General Services Administration, Transportation Rate Cases, Decisions [Rule 306] (2010-10-01): The judge will issue a written decision based upon the record,...

  10. Rule-based support system for multiple UMLS semantic type assignments

    PubMed Central

    Geller, James; He, Zhe; Perl, Yehoshua; Morrey, C. Paul; Xu, Julia

    2012-01-01

    Background When new concepts are inserted into the UMLS, they are assigned one or several semantic types from the UMLS Semantic Network by the UMLS editors. However, not every combination of semantic types is permissible. It was observed that many concepts with rare combinations of semantic types have erroneous semantic type assignments or prohibited combinations of semantic types. The correction of such errors is resource-intensive. Objective We design a computational system to inform UMLS editors as to whether a specific combination of two, three, four, or five semantic types is permissible, prohibited, or questionable. Methods We identify a set of inclusion and exclusion instructions in the UMLS Semantic Network documentation and derive corresponding rule-categories, as well as rule-categories from the UMLS concept content. We then design an algorithm, adviseEditor, based on these rule-categories. The algorithm specifies rules that tell an editor how to proceed when considering a tuple (pair, triple, quadruple, quintuple) of semantic types to be assigned to a concept. Results Eight rule-categories were identified. A Web-based system was developed to implement the adviseEditor algorithm, which returns for an input combination of semantic types whether it is permitted, prohibited or (in a few cases) requires more research. The numbers of semantic type pairs assigned to each rule-category are reported. Interesting examples for each rule-category are illustrated. Cases of semantic type assignments that contradict rules are listed, including recently introduced ones. Conclusion The adviseEditor system implements explicit and implicit knowledge available in the UMLS in a system that informs UMLS editors about the permissibility of a desired combination of semantic types. Using adviseEditor might help accelerate the work of the UMLS editors and prevent erroneous semantic type assignments. PMID:23041716
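
    In essence, adviseEditor maps a combination of semantic types to a verdict. The Python sketch below shows that lookup shape only; the example combinations are placeholders, whereas the real system derives its eight rule-categories from the UMLS Semantic Network documentation and UMLS content.

    ```python
    # Sketch of an adviseEditor-style check for a combination of UMLS semantic
    # types. The example tuples are placeholders, not actual UMLS rule-categories.
    PERMITTED = {
        frozenset({"Pharmacologic Substance", "Organic Chemical"}),
    }
    PROHIBITED = {
        frozenset({"Plant", "Disease or Syndrome"}),
    }

    def advise_editor(semantic_types):
        combo = frozenset(semantic_types)
        if combo in PERMITTED:
            return "permitted"
        if combo in PROHIBITED:
            return "prohibited"
        return "needs further review"

    print(advise_editor(["Organic Chemical", "Pharmacologic Substance"]))  # permitted
    ```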

  11. Criterion learning in rule-based categorization: Simulation of neural mechanism and new data

    PubMed Central

    Helie, Sebastien; Ell, Shawn W.; Filoteo, J. Vincent; Maddox, W. Todd

    2015-01-01

    In perceptual categorization, rule selection consists of selecting one or several stimulus-dimensions to be used to categorize the stimuli (e.g., categorize lines according to their length). Once a rule has been selected, criterion learning consists of defining how stimuli will be grouped using the selected dimension(s) (e.g., if the selected rule is line length, define ‘long’ and ‘short’). Very little is known about the neuroscience of criterion learning, and most existing computational models do not provide a biological mechanism for this process. In this article, we introduce a new model of rule learning called Heterosynaptic Inhibitory Criterion Learning (HICL). HICL includes a biologically-based explanation of criterion learning, and we use new category-learning data to test key aspects of the model. In HICL, rule selective cells in prefrontal cortex modulate stimulus-response associations using pre-synaptic inhibition. Criterion learning is implemented by a new type of heterosynaptic error-driven Hebbian learning at inhibitory synapses that uses feedback to drive cell activation above/below thresholds representing ionic gating mechanisms. The model is used to account for new human categorization data from two experiments showing that: (1) changing rule criterion on a given dimension is easier if irrelevant dimensions are also changing (Experiment 1), and (2) changing the relevant rule dimension and learning a new criterion is more difficult, but is also facilitated by a change in the irrelevant dimension (Experiment 2). We conclude with a discussion of some of HICL’s implications for future research on rule learning. PMID:25682349

  12. Criterion learning in rule-based categorization: simulation of neural mechanism and new data.

    PubMed

    Helie, Sebastien; Ell, Shawn W; Filoteo, J Vincent; Maddox, W Todd

    2015-04-01

    In perceptual categorization, rule selection consists of selecting one or several stimulus-dimensions to be used to categorize the stimuli (e.g., categorize lines according to their length). Once a rule has been selected, criterion learning consists of defining how stimuli will be grouped using the selected dimension(s) (e.g., if the selected rule is line length, define 'long' and 'short'). Very little is known about the neuroscience of criterion learning, and most existing computational models do not provide a biological mechanism for this process. In this article, we introduce a new model of rule learning called Heterosynaptic Inhibitory Criterion Learning (HICL). HICL includes a biologically-based explanation of criterion learning, and we use new category-learning data to test key aspects of the model. In HICL, rule selective cells in prefrontal cortex modulate stimulus-response associations using pre-synaptic inhibition. Criterion learning is implemented by a new type of heterosynaptic error-driven Hebbian learning at inhibitory synapses that uses feedback to drive cell activation above/below thresholds representing ionic gating mechanisms. The model is used to account for new human categorization data from two experiments showing that: (1) changing rule criterion on a given dimension is easier if irrelevant dimensions are also changing (Experiment 1), and (2) changing the relevant rule dimension and learning a new criterion is more difficult, but is also facilitated by a change in the irrelevant dimension (Experiment 2). We conclude with a discussion of some of HICL's implications for future research on rule learning. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pin, F.G.

    Sensor-based operation of autonomous robots in unstructured and/or outdoor environments has proven to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecision and unpredictability of the environment, i.e., lack of full knowledge of the environment's characteristics and dynamics. An approach, which we have named the "Fuzzy Behaviorist Approach" (FBA), is proposed in an attempt to remedy some of these difficulties. This approach is based on the representation of the system's uncertainties using Fuzzy Set Theory-based approximations and on the representation of the reasoning and control schemes as sets of elemental behaviors. Using the FBA, a formalism for rule base development and an automated generator of fuzzy rules have been developed. This automated system can automatically construct the set of membership functions corresponding to fuzzy behaviors, once these have been expressed in qualitative terms by the user. The system also checks for completeness of the rule base and for non-redundancy of the rules (which has traditionally been a major hurdle in rule base development). Two major conceptual features, the suppression and inhibition mechanisms which allow a dominance between behaviors to be expressed, are discussed in detail. Some experimental results obtained with the automated fuzzy rule generator, applied to the domain of sensor-based navigation in a priori unknown environments using one of our autonomous test-bed robots as well as a real car in outdoor environments, are then reviewed and discussed to illustrate the feasibility of large-scale automatic fuzzy rule generation using the "Fuzzy Behaviorist" concepts.

  14. Modeling mechanical inhomogeneities in small populations of proliferating monolayers and spheroids.

    PubMed

    Lejeune, Emma; Linder, Christian

    2018-06-01

    Understanding the mechanical behavior of multicellular monolayers and spheroids is fundamental to tissue culture, organism development, and the early stages of tumor growth. Proliferating cells in monolayers and spheroids experience mechanical forces as they grow and divide and local inhomogeneities in the mechanical microenvironment can cause individual cells within the multicellular system to grow and divide at different rates. This differential growth, combined with cell division and reorganization, leads to residual stress. Multiple different modeling approaches have been taken to understand and predict the residual stresses that arise in growing multicellular systems, particularly tumor spheroids. Here, we show that by using a mechanically robust agent-based model constructed with the peridynamic framework, we gain a better understanding of residual stresses in multicellular systems as they grow from a single cell. In particular, we focus on small populations of cells (1 to 100s of cells) where population behavior is highly stochastic and prior investigation has been limited. We compare the average strain energy density of cells in monolayers and spheroids using different growth and division rules and find that, on average, cells in spheroids have a higher strain energy density than cells in monolayers. We also find that cells in the interior of a growing spheroid are, on average, in compression. Finally, we demonstrate the importance of accounting for stochastic fluctuations in the mechanical environment, particularly when the cellular response to mechanical cues is nonlinear. The results presented here serve as a starting point for both further investigation with agent-based models, and for the incorporation of major findings from agent-based models into continuum scale models when explicit representation of individual cells is not computationally feasible.

  15. Effective Design of Multifunctional Peptides by Combining Compatible Functions

    PubMed Central

    Diener, Christian; Garza Ramos Martínez, Georgina; Moreno Blas, Daniel; Castillo González, David A.; Corzo, Gerardo; Castro-Obregon, Susana; Del Rio, Gabriel

    2016-01-01

    Multifunctionality is a common trait of many natural proteins and peptides, yet the rules to generate such multifunctionality remain unclear. We propose that the rules defining some protein/peptide functions are compatible. To explore this hypothesis, we trained a computational method to predict cell-penetrating peptides at the sequence level and learned that antimicrobial peptides and DNA-binding proteins are compatible with the rules of our predictor. Based on this finding, we expected that designing peptides for CPP activity may render AMP and DNA-binding activities. To test this prediction, we designed peptides that embedded two independent functional domains (nuclear localization and yeast pheromone activity), linked by optimizing their composition to fit the rules characterizing cell-penetrating peptides. These peptides presented effective cell penetration, DNA-binding, pheromone and antimicrobial activities, thus confirming the effectiveness of our computational approach to design multifunctional peptides with potential therapeutic uses. Our computational implementation is available at http://bis.ifc.unam.mx/en/software/dcf. PMID:27096600

  16. A CLIPS-based expert system for the evaluation and selection of robots

    NASA Technical Reports Server (NTRS)

    Nour, Mohamed A.; Offodile, Felix O.; Madey, Gregory R.

    1994-01-01

    This paper describes the development of a prototype expert system for intelligent selection of robots for manufacturing operations. The paper first develops a comprehensive, three-stage process to model the robot selection problem. The decisions involved in this model easily lend themselves to an expert system application. A rule-based system, based on the selection model, is developed using the CLIPS expert system shell. Data about actual robots is used to test the performance of the prototype system. Further extensions to the rule-based system for data handling and interfacing capabilities are suggested.

  17. Common-Sense Rule Inference

    NASA Astrophysics Data System (ADS)

    Lombardi, Ilaria; Console, Luca

    In the paper we show how rule-based inference can be made more flexible by exploiting semantic information associated with the concepts involved in the rules. We introduce flexible forms of common sense reasoning in which whenever no rule applies to a given situation, the inference engine can fire rules that apply to more general or to similar situations. This can be obtained by defining new forms of match between rules and the facts in the working memory and new forms of conflict resolution. We claim that in this way we can overcome some of the brittleness problems that are common in rule-based systems.
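
    The following Python sketch illustrates the generalized-match idea: when no rule applies to a specific concept, the engine climbs an is-a hierarchy and fires a rule for a more general concept. The hierarchy and rules are invented examples, not the matching machinery described in the paper.

    ```python
    # Sketch of firing a rule for a more general concept when no rule matches
    # the specific one. The is-a hierarchy and rules are illustrative.

    IS_A = {"espresso_machine": "coffee_maker", "coffee_maker": "appliance"}

    RULES = {  # concept -> advice
        "coffee_maker": "descale regularly",
        "appliance": "check the power supply",
    }

    def infer(concept):
        while concept is not None:
            if concept in RULES:            # exact or generalized match
                return RULES[concept]
            concept = IS_A.get(concept)     # climb to a more general concept
        return None

    print(infer("espresso_machine"))   # falls back to the coffee_maker rule
    ```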

  18. Rule-Based Event Processing and Reaction Rules

    NASA Astrophysics Data System (ADS)

    Paschke, Adrian; Kozlenkov, Alexander

    Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.

  19. SURE (Science User Resource Expert): A science planning and scheduling assistant for a resource based environment

    NASA Technical Reports Server (NTRS)

    Thalman, Nancy E.; Sparn, Thomas P.

    1990-01-01

    SURE (Science User Resource Expert) is one of three components that compose the SURPASS (Science User Resource Planning and Scheduling System). This system is a planning and scheduling tool which supports distributed planning and scheduling, based on resource allocation and optimization. Currently SURE is being used within the SURPASS by the UARS (Upper Atmospheric Research Satellite) SOLSTICE instrument to build a daily science plan and activity schedule and in a prototyping effort with NASA GSFC to demonstrate distributed planning and scheduling for the SOLSTICE II instrument on the EOS platform. For the SOLSTICE application the SURE utilizes a rule-based system. Development of a rule-based program using Ada CLIPS, as opposed to conventional programming, allows for capture of the science planning and scheduling heuristics in rules and provides flexibility in inserting or removing rules as the scientific objectives and mission constraints change. The SURE system's role as a component in the SURPASS, the purpose of the SURE planning and scheduling tool, the SURE knowledge base, and the software architecture of the SURE component are described.

  20. A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS

    NASA Technical Reports Server (NTRS)

    Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.

    1989-01-01

    In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes are found which would be enhanced by the use of knowledge-based system methods of implementation. The processes most suitable for prototyping using rule-based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule-based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts, and knowledge-based system requirements. A quick prototype knowledge-based system environment is researched and developed.

  1. A Novel Range-Extended Strategy for Fuel Cell/Battery Electric Vehicles.

    PubMed

    Hwang, Jenn-Jiang; Hu, Jia-Sheng; Lin, Chih-Hong

    2015-01-01

    The range-extended electric vehicle is proposed to alleviate the range anxiety that drivers have about electric vehicles. Conventionally, a gasoline/diesel generator increases the range of an electric vehicle. Due to the zero-CO2 emission stipulations, utilizing fuel cells as generators raises concerns in society. This paper presents a novel charging strategy for fuel cell/battery electric vehicles. In comparison to the conventional switch control, a fuzzy control approach is employed to enhance the battery's state of charge (SOC). This approach mitigates the problem of rapid SOC loss and thus can achieve an extended driving range. A smooth steering experience and range extension are the main indexes for the development of the fuzzy rules, which are mainly based on the energy management in the urban driving model. Evaluation of the entire control system is performed by simulation, which demonstrates its effectiveness and feasibility.
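
    A minimal sketch of the fuzzy idea, assuming triangular membership functions and singleton outputs, is shown below in Python: the generator power rises as the battery SOC falls. The membership functions, rule table, and power levels are illustrative, not the controller reported in the paper.

    ```python
    # Minimal sketch of a fuzzy rule that raises the fuel-cell generator power
    # when the battery SOC drops. Memberships, rules and outputs are illustrative.

    def tri(x, a, b, c):
        """Triangular membership function peaking at b."""
        if x <= a or x >= c:
            return 0.0
        return (x - a) / (b - a) if x < b else (c - x) / (c - b)

    def fuel_cell_power(soc):
        # Degrees of membership of the SOC in three fuzzy sets.
        low = tri(soc, -0.2, 0.2, 0.5)
        medium = tri(soc, 0.3, 0.5, 0.7)
        high = tri(soc, 0.5, 0.8, 1.2)
        # Rule table: LOW -> 10 kW, MEDIUM -> 5 kW, HIGH -> 0 kW (singleton outputs).
        weights, outputs = [low, medium, high], [10.0, 5.0, 0.0]
        total = sum(weights)
        return sum(w * o for w, o in zip(weights, outputs)) / total if total else 0.0

    for soc in (0.25, 0.5, 0.75):
        print(soc, round(fuel_cell_power(soc), 2))
    ```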

  2. A Novel Range-Extended Strategy for Fuel Cell/Battery Electric Vehicles

    PubMed Central

    Hwang, Jenn-Jiang; Lin, Chih-Hong

    2015-01-01

    The range-extended electric vehicle is proposed to alleviate the range anxiety that drivers have about electric vehicles. Conventionally, a gasoline/diesel generator increases the range of an electric vehicle. Due to the zero-CO2 emission stipulations, utilizing fuel cells as generators raises concerns in society. This paper presents a novel charging strategy for fuel cell/battery electric vehicles. In comparison to the conventional switch control, a fuzzy control approach is employed to enhance the battery's state of charge (SOC). This approach mitigates the problem of rapid SOC loss and thus can achieve an extended driving range. A smooth steering experience and range extension are the main indexes for the development of the fuzzy rules, which are mainly based on the energy management in the urban driving model. Evaluation of the entire control system is performed by simulation, which demonstrates its effectiveness and feasibility. PMID:26236771

  3. Medicare Program: Hospital Outpatient Prospective Payment and Ambulatory Surgical Center Payment Systems and Quality Reporting Programs; Organ Procurement Organization Reporting and Communication; Transplant Outcome Measures and Documentation Requirements; Electronic Health Record (EHR) Incentive Programs; Payment to Nonexcepted Off-Campus Provider-Based Department of a Hospital; Hospital Value-Based Purchasing (VBP) Program; Establishment of Payment Rates Under the Medicare Physician Fee Schedule for Nonexcepted Items and Services Furnished by an Off-Campus Provider-Based Department of a Hospital. Final rule with comment period and interim final rule with comment period.

    PubMed

    2016-11-14

    This final rule with comment period revises the Medicare hospital outpatient prospective payment system (OPPS) and the Medicare ambulatory surgical center (ASC) payment system for CY 2017 to implement applicable statutory requirements and changes arising from our continuing experience with these systems. In this final rule with comment period, we describe the changes to the amounts and factors used to determine the payment rates for Medicare services paid under the OPPS and those paid under the ASC payment system. In addition, this final rule with comment period updates and refines the requirements for the Hospital Outpatient Quality Reporting (OQR) Program and the ASC Quality Reporting (ASCQR) Program. Further, in this final rule with comment period, we are making changes to tolerance thresholds for clinical outcomes for solid organ transplant programs; to Organ Procurement Organizations (OPOs) definitions, outcome measures, and organ transport documentation; and to the Medicare and Medicaid Electronic Health Record Incentive Programs. We also are removing the HCAHPS Pain Management dimension from the Hospital Value-Based Purchasing (VBP) Program. In addition, we are implementing section 603 of the Bipartisan Budget Act of 2015 relating to payment for certain items and services furnished by certain off-campus provider-based departments of a provider. In this document, we also are issuing an interim final rule with comment period to establish the Medicare Physician Fee Schedule payment rates for the nonexcepted items and services billed by a nonexcepted off-campus provider-based department of a hospital in accordance with the provisions of section 603.

  4. Assessment of geometry in 2D immune systems using high accuracy laser-based bioprinting techniques (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Lauzurica, Sara; Márquez, Andrés.; Molpeceres, Carlos; Notario, Laura; Gómez-Fontela, Miguel; Lauzurica, Pilar

    2017-02-01

    The immune system is a very complex system that comprises a network of genetic and signaling pathways subtending a network of interacting cells. The location of the cells in a network, along with the gene products they interact with, rules the behavior of the immune system. Therefore, there is great interest in properly understanding the role of a cell in such networks to increase our knowledge of the immune system response. In order to acquire a better understanding of these processes, cell printing with high spatial resolution emerges as one of the promising approaches to organize cells in two- and three-dimensional patterns, enabling the study of the influence of geometry on these interactions. In particular, laser-assisted bio-printing techniques using sub-nanosecond laser sources have better characteristics for application in this field, mainly due to their higher spatial resolution, cell viability percentage and process automation. This work presents laser-assisted bio-printing of antigen-presenting cells (APCs) in two-dimensional geometries, placing cellular components on a matrix previously generated on demand, permitting the molecular interactions between APCs and lymphocytes to be tested, as well as the generation of two-dimensional structures designed ad hoc in order to study the mechanisms of mobilization of immune system cells. The use of laser-assisted bio-printing, along with APCs and lymphocytes, emulates the structure of different niches of the immune system so that we can analyse the functional requirements of these interactions.

  5. Novel flowcytometry-based approach of malignant cell detection in body fluids using an automated hematology analyzer

    PubMed Central

    Tabe, Yoko; Takemura, Hiroyuki; Kimura, Konobu; Takahashi, Toshihiro; Yang, Haeun; Tsuchiya, Koji; Konishi, Aya; Uchihashi, Kinya; Horii, Takashi; Ohsaka, Akimichi

    2018-01-01

    Morphological microscopic examinations of nucleated cells in body fluid (BF) samples are performed to screen malignancy. However, the morphological differentiation is time-consuming and labor-intensive. This study aimed to develop a new flowcytometry-based gating analysis mode “XN-BF gating algorithm” to detect malignant cells using an automated hematology analyzer, Sysmex XN-1000. The XN-BF mode was equipped with the WDF white blood cell (WBC) differential channel. We added two algorithms to the WDF channel: Rule 1 detects larger and clumped cell signals compared to the leukocytes, targeting the clustered malignant cells; Rule 2 detects middle-sized mononuclear cells containing fewer granules than neutrophils, with a fluorescence signal similar to monocytes, targeting hematological malignant cells and solid tumor cells. BF samples that met at least one rule were flagged as malignant. To evaluate this novel gating algorithm, 92 BF samples of various types were collected. Manual microscopic differentiation with the May-Grunwald Giemsa stain and WBC count with hemocytometer were also performed. The performance of these three methods was evaluated by comparison with the cytological diagnosis. The XN-BF gating algorithm achieved sensitivity of 63.0% and specificity of 87.8%, with 68.0% for positive predictive value and 85.1% for negative predictive value in detecting malignant-cell positive samples. Manual microscopic WBC differentiation and WBC count demonstrated sensitivities of 70.4% and 66.7%, and specificities of 96.9% and 92.3%, respectively. The XN-BF gating algorithm can be a feasible tool in hematology laboratories for prompt screening of malignant cells in various BF samples. PMID:29425230
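
    The two gating rules can be pictured as threshold predicates over per-event features, as in the Python sketch below; the feature names and cut-off values are hypothetical and are not the analyzer's actual scattergram parameters.

    ```python
    # Sketch of two threshold rules in the spirit of the XN-BF gating algorithm.
    # Feature names and cut-offs are hypothetical placeholders.

    def rule1(event, leukocyte_size):
        """Flag signals much larger than leukocytes (clustered malignant cells)."""
        return event["size"] > 2.0 * leukocyte_size

    def rule2(event, neutrophil_granularity, monocyte_fluorescence):
        """Flag mononuclear cells with fewer granules than neutrophils and
        monocyte-like fluorescence."""
        return (event["granularity"] < neutrophil_granularity
                and abs(event["fluorescence"] - monocyte_fluorescence) < 10)

    def sample_is_malignant(events, **refs):
        return any(rule1(e, refs["leukocyte_size"])
                   or rule2(e, refs["neutrophil_granularity"], refs["monocyte_fluorescence"])
                   for e in events)

    events = [{"size": 95, "granularity": 40, "fluorescence": 70},
              {"size": 310, "granularity": 55, "fluorescence": 30}]
    print(sample_is_malignant(events, leukocyte_size=120,
                              neutrophil_granularity=60, monocyte_fluorescence=75))
    ```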

  6. Novel flowcytometry-based approach of malignant cell detection in body fluids using an automated hematology analyzer.

    PubMed

    Ai, Tomohiko; Tabe, Yoko; Takemura, Hiroyuki; Kimura, Konobu; Takahashi, Toshihiro; Yang, Haeun; Tsuchiya, Koji; Konishi, Aya; Uchihashi, Kinya; Horii, Takashi; Ohsaka, Akimichi

    2018-01-01

    Morphological microscopic examinations of nucleated cells in body fluid (BF) samples are performed to screen malignancy. However, the morphological differentiation is time-consuming and labor-intensive. This study aimed to develop a new flowcytometry-based gating analysis mode "XN-BF gating algorithm" to detect malignant cells using an automated hematology analyzer, Sysmex XN-1000. The XN-BF mode was equipped with the WDF white blood cell (WBC) differential channel. We added two algorithms to the WDF channel: Rule 1 detects larger and clumped cell signals compared to the leukocytes, targeting the clustered malignant cells; Rule 2 detects middle-sized mononuclear cells containing fewer granules than neutrophils, with a fluorescence signal similar to monocytes, targeting hematological malignant cells and solid tumor cells. BF samples that met at least one rule were flagged as malignant. To evaluate this novel gating algorithm, 92 BF samples of various types were collected. Manual microscopic differentiation with the May-Grunwald Giemsa stain and WBC count with hemocytometer were also performed. The performance of these three methods was evaluated by comparison with the cytological diagnosis. The XN-BF gating algorithm achieved sensitivity of 63.0% and specificity of 87.8%, with 68.0% for positive predictive value and 85.1% for negative predictive value in detecting malignant-cell positive samples. Manual microscopic WBC differentiation and WBC count demonstrated sensitivities of 70.4% and 66.7%, and specificities of 96.9% and 92.3%, respectively. The XN-BF gating algorithm can be a feasible tool in hematology laboratories for prompt screening of malignant cells in various BF samples.

  7. An architecture for rule based system explanation

    NASA Technical Reports Server (NTRS)

    Fennel, T. R.; Johannes, James D.

    1990-01-01

    A system architecture is presented which incorporates both graphics and text into explanations provided by rule-based expert systems. This architecture facilitates explanation of the knowledge base content, the control strategies employed by the system, and the conclusions made by the system. The suggested approach combines hypermedia and inference engine capabilities. Advantages include: closer integration of user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and more direct control of explanation depth and duration by the user. User models are suggested to control the type, amount, and order of information presented.

  8. Knowledge base rule partitioning design for CLIPS

    NASA Technical Reports Server (NTRS)

    Mainardi, Joseph D.; Szatkowski, G. P.

    1990-01-01

    This paper describes a knowledge base (KB) partitioning approach to solve the problem of real-time performance when using the CLIPS AI shell with large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The main objective of the Expert System advanced development project (ADP-2302) is to provide robust systems responding to new data frames at 0.1 to 1.0 second intervals. The intelligent system control must be performed within the specified real-time window in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance without undue impact during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification and validation task. Partitioning the KB reduces rule interaction complexity overall. Reduced interaction simplifies the V&V testing necessary by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading rules that directly apply to that condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts, while allowing real-time operations to run freely. Timing tests for real-time performance for specific machines under R/T operating systems have not been completed but are planned as part of the analysis process to validate the design.
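
    A minimal sketch of the partitioning idea, in Python rather than CLIPS: rules are clustered into named blocks, and only the blocks loaded for the current phase are matched each cycle. The phases, rules, and facts are invented for illustration.

    ```python
    # Sketch of knowledge-base partitioning: rules are clustered into blocks and
    # only the blocks loaded for the current phase are matched each cycle.

    RULE_BLOCKS = {
        "prelaunch": [lambda f: "hold" if f.get("tank_pressure", 0) > 100 else None],
        "ascent":    [lambda f: "abort" if f.get("engine_temp", 0) > 900 else None],
    }

    active_blocks = set()

    def load_block(name):      # analogous to adding a block of rules/facts
        active_blocks.add(name)

    def unload_block(name):    # strip rules not used during this cycle
        active_blocks.discard(name)

    def run_cycle(facts):
        actions = []
        for name in active_blocks:
            for rule in RULE_BLOCKS[name]:
                result = rule(facts)
                if result:
                    actions.append(result)
        return actions

    load_block("prelaunch")
    print(run_cycle({"tank_pressure": 120}))   # ['hold']
    unload_block("prelaunch"); load_block("ascent")
    print(run_cycle({"engine_temp": 950}))     # ['abort']
    ```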

  9. Program for Experimentation With Expert Systems

    NASA Technical Reports Server (NTRS)

    Engle, S. W.

    1986-01-01

    CERBERUS is forward-chaining, knowledge-based system program useful for experimentation with expert systems. Inference-engine mechanism performs deductions according to user-supplied rule set. Information stored in intermediate area, and user interrogated only when no applicable data found in storage. Each assertion posed by CERBERUS answered with certainty ranging from 0 to 100 percent. Rule processor stops investigating applicable rules when goal reaches certainty of 95 percent or higher. Capable of operating for wide variety of domains. Sample rule files included for animal identification, pixel classification in image processing, and rudimentary car repair for novice mechanic. User supplies set of end goals or actions. System complexity decided by user's rule file. CERBERUS written in FORTRAN 77.
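
    A rough sketch of the certainty-handling idea follows (in Python; CERBERUS itself is written in FORTRAN 77): rules contribute evidence for a goal, certainties are combined, and rule processing stops once the goal exceeds 95 percent. The rules, certainty values, and the parallel-combination formula are illustrative assumptions, not CERBERUS internals.

    ```python
    # Sketch of certainty accumulation with a 95% stopping threshold, in the
    # spirit of CERBERUS. Rules and the combination formula are illustrative.

    RULES = [  # (conclusion, premise, rule certainty)
        ("infection", "fever", 0.7),
        ("infection", "elevated_wbc", 0.9),
        ("infection", "positive_culture", 0.9),
    ]
    GOAL, STOP = "infection", 0.95
    evidence = {"fever": 1.0, "elevated_wbc": 1.0, "positive_culture": 1.0}

    goal_cf = 0.0
    for conclusion, premise, cf_rule in RULES:
        if goal_cf >= STOP:                    # stop investigating at 95% certainty
            print("remaining rules never investigated")
            break
        if conclusion == GOAL and evidence.get(premise, 0.0) > 0:
            cf = cf_rule * evidence[premise]
            goal_cf = goal_cf + cf * (1.0 - goal_cf)   # parallel combination
    print(round(goal_cf, 2))                   # 0.97 after the first two rules
    ```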

  10. Detection of pseudosinusoidal epileptic seizure segments in the neonatal EEG by cascading a rule-based algorithm with a neural network.

    PubMed

    Karayiannis, Nicolaos B; Mukherjee, Amit; Glover, John R; Ktonas, Periklis Y; Frost, James D; Hrachovy, Richard A; Mizrahi, Eli M

    2006-04-01

    This paper presents an approach to detect epileptic seizure segments in the neonatal electroencephalogram (EEG) by characterizing the spectral features of the EEG waveform using a rule-based algorithm cascaded with a neural network. A rule-based algorithm screens out short segments of pseudosinusoidal EEG patterns as epileptic based on features in the power spectrum. The output of the rule-based algorithm is used to train and compare the performance of conventional feedforward neural networks and quantum neural networks. The results indicate that the trained neural networks, cascaded with the rule-based algorithm, improved the performance of the rule-based algorithm acting by itself. The evaluation of the proposed cascaded scheme for the detection of pseudosinusoidal seizure segments reveals its potential as a building block of the automated seizure detection system under development.

  11. Annotation of rule-based models with formal semantics to enable creation, analysis, reuse and visualization.

    PubMed

    Misirli, Goksel; Cavaliere, Matteo; Waites, William; Pocock, Matthew; Madsen, Curtis; Gilfellon, Owen; Honorato-Zimmer, Ricardo; Zuliani, Paolo; Danos, Vincent; Wipat, Anil

    2016-03-15

    Biological systems are complex and challenging to model and therefore model reuse is highly desirable. To promote model reuse, models should include both information about the specifics of simulations and the underlying biology in the form of metadata. The availability of computationally tractable metadata is especially important for the effective automated interpretation and processing of models. Metadata are typically represented as machine-readable annotations which enhance programmatic access to information about models. Rule-based languages have emerged as a modelling framework to represent the complexity of biological systems. Annotation approaches have been widely used for reaction-based formalisms such as SBML. However, rule-based languages still lack a rich annotation framework to add semantic information, such as machine-readable descriptions, to the components of a model. We present an annotation framework and guidelines for annotating rule-based models, encoded in the commonly used Kappa and BioNetGen languages. We adapt widely adopted annotation approaches to rule-based models. We initially propose a syntax to store machine-readable annotations and describe a mapping between rule-based modelling entities, such as agents and rules, and their annotations. We then describe an ontology to both annotate these models and capture the information contained therein, and demonstrate annotating these models using examples. Finally, we present a proof of concept tool for extracting annotations from a model that can be queried and analyzed in a uniform way. The uniform representation of the annotations can be used to facilitate the creation, analysis, reuse and visualization of rule-based models. Although examples are given, using specific implementations the proposed techniques can be applied to rule-based models in general. The annotation ontology for rule-based models can be found at http://purl.org/rbm/rbmo. The krdf tool and associated executable examples are available at http://purl.org/rbm/rbmo/krdf. Contact: anil.wipat@newcastle.ac.uk or vdanos@inf.ed.ac.uk. © The Author 2015. Published by Oxford University Press.

  12. One Giant Leap for Categorizers: One Small Step for Categorization Theory

    PubMed Central

    Smith, J. David; Ell, Shawn W.

    2015-01-01

    We explore humans’ rule-based category learning using analytic approaches that highlight their psychological transitions during learning. These approaches confirm that humans show qualitatively sudden psychological transitions during rule learning. These transitions contribute to the theoretical literature contrasting single vs. multiple category-learning systems, because they seem to reveal a distinctive learning process of explicit rule discovery. A complete psychology of categorization must describe this learning process, too. Yet extensive formal-modeling analyses confirm that a wide range of current (gradient-descent) models cannot reproduce these transitions, including influential rule-based models (e.g., COVIS) and exemplar models (e.g., ALCOVE). It is an important theoretical conclusion that existing models cannot explain humans’ rule-based category learning. The problem these models have is the incremental algorithm by which learning is simulated. Humans descend no gradient in rule-based tasks. Very different formal-modeling systems will be required to explain humans’ psychology in these tasks. An important next step will be to build a new generation of models that can do so. PMID:26332587

  13. Real-time Geographic Information System (GIS) for Monitoring the Area of Potential Water Level Using Rule Based System

    NASA Astrophysics Data System (ADS)

    Anugrah, Wirdah; Suryono; Suseno, Jatmiko Endro

    2018-02-01

    Management of water resources based on a Geographic Information System can provide substantial benefits to water availability settings. Monitoring the potential water level is needed in the development sector, agriculture, energy and others. In this research, a web-based water resource information system is developed using a real-time Geographic Information System concept for monitoring the potential water level of an area by applying a rule-based system method. GIS consists of hardware, software, and a database. Based on the web-based GIS architecture, this study uses a set of computers connected to a network, running on the Apache web server with the PHP programming language and a MySQL database. An ultrasound wireless sensor system is used as the water level data input; it also includes time and geographic location information. The GIS maps the five sensor locations. The data are processed through a rule-based system to determine the potential water level of the area. Water level monitoring results can be displayed on thematic maps by overlaying more than one layer, as tables generated from the database, and as graphs based on the timing of events and the water level values.
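
    The rule-based step can be as simple as mapping each sensor reading to a potential-water-level class, as in the Python sketch below; the thresholds and class names are illustrative, not those used in the deployed system.

    ```python
    # Sketch of the rule-based step: each sensor reading is mapped to a
    # potential-water-level class by threshold rules. Thresholds (in metres)
    # and class names are illustrative.

    RULES = [
        (lambda level: level < 1.0, "LOW"),
        (lambda level: level < 2.5, "MEDIUM"),
        (lambda level: True,        "HIGH"),
    ]

    def classify(level_m):
        for condition, label in RULES:      # first matching rule wins
            if condition(level_m):
                return label

    readings = [  # (sensor id, latitude, longitude, water level in metres)
        ("S1", -7.05, 110.44, 0.6),
        ("S2", -7.01, 110.47, 2.9),
    ]
    for sensor, lat, lon, level in readings:
        print(sensor, lat, lon, classify(level))
    ```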

  14. Timescale analysis of rule-based biochemical reaction networks

    PubMed Central

    Klinke, David J.; Finley, Stacey D.

    2012-01-01

    The flow of information within a cell is governed by a series of protein-protein interactions that can be described as a reaction network. Mathematical models of biochemical reaction networks can be constructed by repetitively applying specific rules that define how reactants interact and what new species are formed upon reaction. To aid in understanding the underlying biochemistry, timescale analysis is one method developed to prune the size of the reaction network. In this work, we extend the methods associated with timescale analysis to reaction rules instead of the species contained within the network. To illustrate this approach, we applied timescale analysis to a simple receptor-ligand binding model and a rule-based model of Interleukin-12 (IL-12) signaling in naïve CD4+ T cells. The IL-12 signaling pathway includes multiple protein-protein interactions that collectively transmit information; however, the level of mechanistic detail sufficient to capture the observed dynamics has not been justified based upon the available data. The analysis correctly predicted that reactions associated with JAK2 and TYK2 binding to their corresponding receptor exist at a pseudo-equilibrium. In contrast, reactions associated with ligand binding and receptor turnover regulate cellular response to IL-12. An empirical Bayesian approach was used to estimate the uncertainty in the timescales. This approach complements existing rank- and flux-based methods that can be used to interrogate complex reaction networks. Ultimately, timescale analysis of rule-based models is a computational tool that can be used to reveal the biochemical steps that regulate signaling dynamics. PMID:21954150

  15. Automatic Learning of Fine Operating Rules for Online Power System Security Control.

    PubMed

    Sun, Hongbin; Zhao, Feng; Wang, Hao; Wang, Kang; Jiang, Weiyong; Guo, Qinglai; Zhang, Boming; Wehenkel, Louis

    2016-08-01

    Fine operating rules for security control and an automatic system for their online discovery were developed to adapt to the development of smart grids. The automatic system uses the real-time system state to determine critical flowgates, and then a continuation power flow-based security analysis is used to compute the initial transfer capability of critical flowgates. Next, the system applies Monte Carlo simulations of expected short-term operating condition changes, followed by feature selection and a linear least-squares fitting of the fine operating rules. The proposed system was validated both on an academic test system and on a provincial power system in China. The results indicated that the derived rules provide accuracy and good interpretability and are suitable for real-time power system security control. The use of high-performance computing systems enables these fine operating rules to be refreshed online every 15 min.
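
    The final fitting step can be illustrated with a linear least-squares fit of a flowgate's transfer capability against selected features, as sketched below in Python with synthetic Monte Carlo samples; the feature choice and coefficients are invented for illustration.

    ```python
    # Sketch of the fitting step: fit a linear operating rule for the transfer
    # capability of a critical flowgate from Monte Carlo samples of selected
    # features. Data here are synthetic.
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200
    features = rng.uniform(size=(n, 2))                    # selected state features
    true_rule = np.array([120.0, -35.0])                   # assumed sensitivities (MW)
    capability = 400.0 + features @ true_rule + rng.normal(0, 2.0, n)

    X = np.hstack([np.ones((n, 1)), features])             # intercept + features
    coeffs, *_ = np.linalg.lstsq(X, capability, rcond=None)
    print(np.round(coeffs, 1))    # approximately [400. 120. -35.]
    ```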

  16. An Interval Type-2 Neural Fuzzy System for Online System Identification and Feature Elimination.

    PubMed

    Lin, Chin-Teng; Pal, Nikhil R; Wu, Shang-Lin; Liu, Yu-Ting; Lin, Yang-Yin

    2015-07-01

    We propose an integrated mechanism for discarding derogatory features and extraction of fuzzy rules based on an interval type-2 neural fuzzy system (NFS); in fact, it is a more general scheme that can discard bad features, irrelevant antecedent clauses, and even irrelevant rules. High-dimensional input variables and a large number of rules not only increase the computational complexity of NFSs but also reduce their interpretability. Therefore, a mechanism for simultaneous extraction of fuzzy rules and reducing the impact of (or eliminating) the inferior features is necessary. The proposed approach, namely an interval type-2 Neural Fuzzy System for online System Identification and Feature Elimination (IT2NFS-SIFE), uses type-2 fuzzy sets to model uncertainties associated with information and data in designing the knowledge base. The consequent part of the IT2NFS-SIFE is of Takagi-Sugeno-Kang type with interval weights. The IT2NFS-SIFE possesses a self-evolving property that can automatically generate fuzzy rules. The poor features can be discarded through the concept of a membership modulator. The antecedent and modulator weights are learned using a gradient descent algorithm. The consequent part weights are tuned via the rule-ordered Kalman filter algorithm to enhance learning effectiveness. Simulation results show that IT2NFS-SIFE not only simplifies the system architecture by eliminating derogatory/irrelevant antecedent clauses, rules, and features but also maintains excellent performance.

  17. Object-oriented fault tree models applied to system diagnosis

    NASA Technical Reports Server (NTRS)

    Iverson, David L.; Patterson-Hine, F. A.

    1990-01-01

    When a diagnosis system is used in a dynamic environment, such as the distributed computer system planned for use on Space Station Freedom, it must execute quickly and its knowledge base must be easily updated. Representing system knowledge as object-oriented augmented fault trees provides both features. The diagnosis system described here is based on the failure cause identification process of the diagnostic system described by Narayanan and Viswanadham. Their system has been enhanced in this implementation by replacing the knowledge base of if-then rules with an object-oriented fault tree representation. This allows the system to perform its task much faster and facilitates dynamic updating of the knowledge base in a changing diagnosis environment. Accessing the information contained in the objects is more efficient than performing a lookup operation on an indexed rule base. Additionally, the object-oriented fault trees can be easily updated to represent current system status. This paper describes the fault tree representation, the diagnosis algorithm extensions, and an example application of this system. Comparisons are made between the object-oriented fault tree knowledge structure solution and one implementation of a rule-based solution. Plans for future work on this system are also discussed.
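
    A minimal sketch of the object-oriented representation, in Python rather than the original implementation: gate and event objects are traversed directly to decide whether a fault occurs and which basic events explain it. The tree and class design are illustrative assumptions.

    ```python
    # Sketch of an object-oriented fault tree used for diagnosis: gate and
    # event objects are traversed directly instead of looking up if-then rules.

    class BasicEvent:
        def __init__(self, name, failed=False):
            self.name, self.failed = name, failed
        def occurs(self):
            return self.failed
        def causes(self):
            return [self.name] if self.failed else []

    class Gate:
        def __init__(self, name, children, kind="OR"):
            self.name, self.children, self.kind = name, children, kind
        def occurs(self):
            states = [c.occurs() for c in self.children]
            return all(states) if self.kind == "AND" else any(states)
        def causes(self):
            if not self.occurs():
                return []
            return [c for child in self.children for c in child.causes()]

    power = BasicEvent("power supply fault", failed=True)
    link = BasicEvent("network link down", failed=False)
    node_loss = Gate("node unreachable", [power, link], kind="OR")
    print(node_loss.occurs(), node_loss.causes())   # True ['power supply fault']
    ```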

  18. Extending rule-based methods to model molecular geometry and 3D model resolution.

    PubMed

    Hoard, Brittany; Jacobson, Bruna; Manavi, Kasra; Tapia, Lydia

    2016-08-01

    Computational modeling is an important tool for the study of complex biochemical processes associated with cell signaling networks. However, it is challenging to simulate processes that involve hundreds of large molecules due to the high computational cost of such simulations. Rule-based modeling is a method that can be used to simulate these processes with reasonably low computational cost, but traditional rule-based modeling approaches do not include details of molecular geometry. The incorporation of geometry into biochemical models can more accurately capture details of these processes, and may lead to insights into how geometry affects the products that form. Furthermore, geometric rule-based modeling can be used to complement other computational methods that explicitly represent molecular geometry in order to quantify binding site accessibility and steric effects. We propose a novel implementation of rule-based modeling that encodes details of molecular geometry into the rules and binding rates. We demonstrate how rules are constructed according to the molecular curvature. We then perform a study of antigen-antibody aggregation using our proposed method. We simulate the binding of antibody complexes to binding regions of the shrimp allergen Pen a 1 using a previously developed 3D rigid-body Monte Carlo simulation, and we analyze the aggregate sizes. Then, using our novel approach, we optimize a rule-based model according to the geometry of the Pen a 1 molecule and the data from the Monte Carlo simulation. We use the distances between the binding regions of Pen a 1 to optimize the rules and binding rates. We perform this procedure for multiple conformations of Pen a 1 and analyze the impact of conformation and resolution on the optimal rule-based model. We find that the optimized rule-based models provide information about the average steric hindrance between binding regions and the probability that antibodies will bind to these regions. These optimized models quantify the variation in aggregate size that results from differences in molecular geometry and from model resolution.

  19. Automatic information extraction from unstructured mammography reports using distributed semantics.

    PubMed

    Gupta, Anupama; Banerjee, Imon; Rubin, Daniel L

    2018-02-01

    To date, the methods developed for automated extraction of information from radiology reports have been mainly rule-based or dictionary-based and therefore require substantial manual effort to build. Recent efforts to develop automated systems for entity detection have been undertaken, but little work has been done to automatically extract relations and their associated named entities from narrative radiology reports with accuracy comparable to rule-based methods. Our goal is to extract relations in an unsupervised way from radiology reports without specifying prior domain knowledge. We propose a hybrid approach for information extraction that combines a dependency-based parse tree with distributed semantics to generate structured information frames about particular findings/abnormalities from free-text mammography reports. The proposed IE system obtains an F1-score of 0.94 in terms of completeness of the content in the information frames, which outperforms a state-of-the-art rule-based system in this domain by a significant margin. The proposed system can be leveraged in a variety of applications, such as decision support and information retrieval, and may also easily scale to other radiology domains, since there is no need to tune the system with hand-crafted information extraction rules. Copyright © 2018 Elsevier Inc. All rights reserved.

  20. Rule-based modeling and simulations of the inner kinetochore structure.

    PubMed

    Tschernyschkow, Sergej; Herda, Sabine; Gruenert, Gerd; Döring, Volker; Görlich, Dennis; Hofmeister, Antje; Hoischen, Christian; Dittrich, Peter; Diekmann, Stephan; Ibrahim, Bashar

    2013-09-01

    Combinatorial complexity is a central problem when modeling biochemical reaction networks, since the association of a few components can give rise to a large variation of protein complexes. Available classical modeling approaches are often insufficient for the detailed analysis of very large and complex networks. Recently, we developed a new rule-based modeling approach that facilitates the analysis of spatial and combinatorially complex problems. Here, we explore for the first time how this approach can be applied to a specific biological system, the human kinetochore, which is a multi-protein complex involving over 100 proteins. Applying our freely available SRSim software to a large data set on kinetochore proteins in human cells, we construct a spatial rule-based simulation model of the human inner kinetochore. The model generates an estimation of the probability distribution of the inner kinetochore 3D architecture, and we show how to analyze this distribution using information theory. In our model, the formation of a bridge between CenpA and an H3-containing nucleosome occurs efficiently only at the higher protein concentrations realized during S-phase, but possibly not in G1. Above a certain nucleosome distance, the protein bridge barely formed, pointing towards the importance of chromatin structure for kinetochore complex formation. We define a metric for the distance between structures that allows us to identify structural clusters. Using this modeling technique, we explore different hypothetical chromatin layouts. Applying a rule-based network analysis to the spatial kinetochore complex geometry allowed us to integrate experimental data on kinetochore proteins, suggesting a 3D model of the human inner kinetochore architecture that is governed by a combinatorial algebraic reaction network. This reaction network can serve as a bridge between multiple scales of modeling. Our approach can be applied to other systems beyond kinetochores. Copyright © 2013 Elsevier Ltd. All rights reserved.

  1. Neural activity in superior parietal cortex during rule-based visual-motor transformations.

    PubMed

    Hawkins, Kara M; Sayegh, Patricia; Yan, Xiaogang; Crawford, J Douglas; Sergio, Lauren E

    2013-03-01

    Cognition allows for the use of different rule-based sensorimotor strategies, but the neural underpinnings of such strategies are poorly understood. The purpose of this study was to compare neural activity in the superior parietal lobule during a standard (direct interaction) reaching task, with two nonstandard (gaze and reach spatially incongruent) reaching tasks requiring the integration of rule-based information. Specifically, these nonstandard tasks involved dissociating the planes of reach and vision or rotating visual feedback by 180°. Single unit activity, gaze, and reach trajectories were recorded from two female Macaca mulattas. In all three conditions, we observed a temporal discharge pattern at the population level reflecting early reach planning and on-line reach monitoring. In the plane-dissociated task, we found a significant overall attenuation in the discharge rate of cells from deep recording sites, relative to standard reaching. We also found that cells modulated by reach direction tended to be significantly tuned either during the standard or the plane-dissociated task but rarely during both. In the standard versus feedback reversal comparison, we observed some cells that shifted their preferred direction by 180° between conditions, reflecting maintenance of directional tuning with respect to the reach goal. Our findings suggest that the superior parietal lobule plays an important role in processing information about the nonstandard nature of a task, which, through reciprocal connections with precentral motor areas, contributes to the accurate transformation of incongruent sensory inputs into an appropriate motor output. Such processing is crucial for the integration of rule-based information into a motor act.

  2. Modeling reliability measurement of interface on information system: Towards the forensic of rules

    NASA Astrophysics Data System (ADS)

    Nasution, M. K. M.; Sitompul, Darwin; Harahap, Marwan

    2018-02-01

    Today almost all machines depend on software, and a combined software and hardware system also depends on the rules, that is, the procedures governing its use. If such procedures or programs can be characterized reliably in terms of graphs, logic, and probability, then the strength of the rules can be measured accordingly. Therefore, this paper initiates an enumeration model to measure the reliability of interfaces, based on the case of information systems supported by rules of use issued by the relevant agencies. The enumeration model is obtained from a software reliability calculation.

  3. Interacting damage models mapped onto ising and percolation models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toussaint, Renaud; Pride, Steven R.

    The authors introduce a class of damage models on regular lattices with isotropic interactions between the broken cells of the lattice. Quasistatic fiber bundles are an example. The interactions are assumed to be weak, in the sense that the stress perturbation from a broken cell is much smaller than the mean stress in the system. The system starts intact with a surface-energy threshold required to break any cell sampled from an uncorrelated quenched-disorder distribution. The evolution of this heterogeneous system is ruled by Griffith's principle, which states that a cell breaks when the release in potential (elastic) energy in the system exceeds the surface-energy barrier necessary to break the cell. By direct integration over all possible realizations of the quenched disorder, they obtain the probability distribution of each damage configuration at any level of the imposed external deformation. They demonstrate an isomorphism between the distributions so obtained and standard generalized Ising models, in which the coupling constants and effective temperature in the Ising model are functions of the nature of the quenched-disorder distribution and the extent of accumulated damage. In particular, they show that damage models with global load sharing are isomorphic to standard percolation theory, that damage models with a local load-sharing rule are isomorphic to the standard Ising model, and draw consequences thereof for the universality class and behavior of the autocorrelation length of the breakdown transitions corresponding to these models. They also treat damage models having more general power-law interactions, and classify the breakdown process as a function of the power-law interaction exponent. Last, they show that the probability distribution over configurations is a maximum of Shannon's entropy under some specific constraints related to the energetic balance of the fracture process, which firmly relates this type of quenched-disorder-based damage model to standard statistical mechanics.
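    As a concrete illustration of the simplest member of this model class, the sketch below simulates a strain-controlled quasistatic fiber bundle with quenched breaking thresholds under global load sharing; the parameters, the threshold distribution, and the strain-controlled simplification are assumptions for illustration, not the authors' formulation.

    ```python
    # Sketch (assumptions noted): a quasistatic fiber-bundle model with global load
    # sharing and quenched-disorder breaking thresholds, the simplest member of the
    # class of damage models described above. Parameter choices are illustrative.
    import numpy as np

    rng = np.random.default_rng(0)
    N = 10_000                         # number of cells (fibers) on the lattice
    thresholds = rng.uniform(0, 1, N)  # quenched surface-energy-like thresholds
    intact = np.ones(N, dtype=bool)

    strains = np.linspace(0.0, 1.0, 200)
    damage = []
    for eps in strains:                # imposed external deformation, increased quasistatically
        # Global load sharing: every intact fiber carries the same strain eps,
        # and a fiber breaks once its quenched threshold is exceeded.
        intact &= thresholds > eps
        damage.append(1.0 - intact.mean())

    # The damage fraction as a function of eps follows the threshold distribution,
    # the percolation-like behaviour expected for global load sharing.
    print(f"damage at eps ~ 0.5: {damage[len(strains) // 2]:.3f}")
    ```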

  4. Object-Driven and Temporal Action Rules Mining

    ERIC Educational Resources Information Center

    Hajja, Ayman

    2013-01-01

    In this thesis, I present my complete research work in the field of action rules, more precisely object-driven and temporal action rules. The drive behind the introduction of object-driven and temporally based action rules is to bring forth an adapted approach to extract action rules from a subclass of systems that have a specific nature, in which…

  5. Discovering Fine-grained Sentiment in Suicide Notes

    PubMed Central

    Wang, Wenbo; Chen, Lu; Tan, Ming; Wang, Shaojun; Sheth, Amit P.

    2012-01-01

    This paper presents our solution for the i2b2 sentiment classification challenge. Our hybrid system consists of machine learning and rule-based classifiers. For the machine learning classifier, we investigate a variety of lexical, syntactic and knowledge-based features, and show how much these features contribute to the performance of the classifier through experiments. For the rule-based classifier, we propose an algorithm to automatically extract effective syntactic and lexical patterns from training examples. The experimental results show that the rule-based classifier outperforms the baseline machine learning classifier using unigram features. By combining the machine learning classifier and the rule-based classifier, the hybrid system gains a better trade-off between precision and recall, and yields the highest micro-averaged F-measure (0.5038), which is better than the mean (0.4875) and median (0.5027) micro-average F-measures among all participating teams. PMID:22879770

  6. Current Features of Secondary (Acquired) Types of Immune Deficiency.

    PubMed

    Kovalchuk, Leonid V.; Pinegin, Boris V.

    1999-12-01

    Secondary (acquired) immune deficiencies (SID) occupy a leading place in the practice of modern clinical immunology. The causes of SID are extremely variable. Particular attention is being paid to accumulating evidence on the targeted action of microorganisms, above all viruses, on specific processes in the immune system. The damaging action of HIV-1 on cells expressing CD4 molecules is the most precisely characterized example. Identifying the actual molecular defects that microorganisms induce in the immune system remains necessary. It cannot be ruled out that an increased level of apoptosis of immune system cells is one of the causes of SID, the underlying basis being an imbalance between positive and negative activation processes in immunocompetent cells. Multiple factors may act as apoptogens, including drugs (e.g., glucocorticoids), xenobiotics, and physical factors such as radiation. For clinical laboratories, a defined spectrum of immunological investigations is recommended that allows the degree of immunopathology to be diagnosed. At present, these methods center on flow cytometry (immunophenotyping), immunodiffusion and immunoenzyme tests (determination of immunoglobulins, cytokines, and other soluble components of the immune system), and tests of immunocompetent cell activation, proliferation, and differentiation. Prospectively, methods based on the identification of molecular defects in cells and in soluble factors of the immune system may also be taken into consideration.

  7. Tissue interactions with nonionizing electromagnetic fields. Final report, March 1979-February 1986

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adey, W.R.; Bawin, S.M.; Byus, C.V.

    1986-08-01

    This report provides an overview of this research program focused on basic research in nervous system responses to electric fields at 60 Hz. The emphasis in this project was to determine the fundamental mechanisms underlying some phenomena of electric field interactions in neural systems. The five studies of the initial program were tests of behavioral responses in the rat based upon the hypothesis that electric field detection might follow psychophysical rules known from prior research with light, sound and other stimuli; tests of electrophysiological responses to "normal" forms of stimulation in rat brain tissue exposed in vitro to electric fields, based on the hypothesis that the excitability of brain tissue might be affected by fields in the extracellular environment; tests of electrophysiological responses of spontaneously active pacemaker neurons of the Aplysia abdominal ganglion, based on the hypothesis that electric field interactions at the cell membrane might affect the balance among the several membrane-related processes that govern pacemaker activity; studies of mechanisms of low frequency electromagnetic field interactions with bone cells in the context of field therapy of ununited fractures; and manipulation of cell surface receptor proteins in studies of their mobility during EM field exposure.

  8. Distributional benefit analysis of a national air quality rule.

    PubMed

    Post, Ellen S; Belova, Anna; Huang, Jin

    2011-06-01

    Under Executive Order 12898, the U.S. Environmental Protection Agency (EPA) must perform environmental justice (EJ) reviews of its rules and regulations. EJ analyses address the hypothesis that environmental disamenities are experienced disproportionately by poor and/or minority subgroups. Such analyses typically use communities as the unit of analysis. While community-based approaches make sense when considering where polluting sources locate, they are less appropriate for national air quality rules affecting many sources and pollutants that can travel thousands of miles. We compare exposures and health risks of EJ-identified individuals rather than communities to analyze EPA's Heavy Duty Diesel (HDD) rule as an example national air quality rule. Air pollutant exposures are estimated within grid cells by air quality models; all individuals in the same grid cell are assigned the same exposure. Using an inequality index, we find that inequality within racial/ethnic subgroups far outweighs inequality between them. We find, moreover, that the HDD rule leaves between-subgroup inequality essentially unchanged. Changes in health risks depend also on subgroups' baseline incidence rates, which differ across subgroups. Thus, health risk reductions may not follow the same pattern as reductions in exposure. These results are likely representative of other national air quality rules as well.
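    The abstract does not name the inequality index used; the sketch below illustrates the within/between-subgroup decomposition idea with a Theil index, chosen only because it decomposes exactly, applied to made-up grid-cell exposure data.

    ```python
    # Sketch: decomposing exposure inequality into within- and between-subgroup
    # components with a Theil index. The paper does not specify which inequality
    # index it uses; Theil's T is shown here only because it decomposes exactly.
    import numpy as np

    def theil(x):
        x = np.asarray(x, dtype=float)
        mu = x.mean()
        return np.mean((x / mu) * np.log(x / mu))

    def theil_decomposition(exposures, groups):
        exposures = np.asarray(exposures, dtype=float)
        groups = np.asarray(groups)
        mu, n = exposures.mean(), len(exposures)
        within = between = 0.0
        for g in np.unique(groups):
            xg = exposures[groups == g]
            share = (len(xg) / n) * (xg.mean() / mu)   # group share of total exposure
            within += share * theil(xg)
            between += share * np.log(xg.mean() / mu)
        return within, between

    # Illustrative data: grid-cell exposures assigned to individuals in two subgroups
    rng = np.random.default_rng(1)
    exp_a = rng.lognormal(mean=1.0, sigma=0.6, size=5000)
    exp_b = rng.lognormal(mean=1.1, sigma=0.6, size=5000)
    w, b = theil_decomposition(np.concatenate([exp_a, exp_b]),
                               np.array(["A"] * 5000 + ["B"] * 5000))
    print(f"within-group Theil: {w:.4f}, between-group Theil: {b:.4f}")
    ```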

  9. 77 FR 55371 - System Safety Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-09-07

    ...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...

  10. Rule-based navigation control design for autonomous flight

    NASA Astrophysics Data System (ADS)

    Contreras, Hugo; Bassi, Danilo

    2008-04-01

    This article depicts a navigation control system design that is based on a set of rules for following a desired trajectory. The full control of the aircraft considered here comprises a low-level stability control loop, based on a classic PID controller, and the higher-level navigation loop, whose main job is to exercise lateral (course) control and altitude control while trying to follow a desired trajectory. The rules and PID gains were adjusted systematically according to the results of flight simulation. In spite of its simplicity, the rule-based navigation control proved to be robust, even under large perturbations such as crosswinds.
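    A minimal sketch of this two-layer idea is given below: a rule layer maps course error to a commanded bank angle, and a classic PID loop tracks that command. Gains, thresholds, and function names are placeholders, not the authors' values.

    ```python
    # Sketch (illustrative only, not the authors' controller): a rule-based lateral
    # navigation layer that commands a bank angle from course error, sitting on top
    # of a classic PID roll-stability loop. Gains and thresholds are placeholders.

    def navigation_rules(course_error_deg):
        """Higher-level rules: pick a commanded bank angle from the course error."""
        if abs(course_error_deg) < 2:
            return 0.0                      # on course: wings level
        if abs(course_error_deg) < 15:
            return 10.0 * (1 if course_error_deg > 0 else -1)   # gentle correction
        return 25.0 * (1 if course_error_deg > 0 else -1)       # large error: firm bank

    class PID:
        """Low-level stability loop (roll hold)."""
        def __init__(self, kp, ki, kd, dt):
            self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
            self.integral, self.prev_error = 0.0, 0.0

        def step(self, setpoint, measurement):
            error = setpoint - measurement
            self.integral += error * self.dt
            derivative = (error - self.prev_error) / self.dt
            self.prev_error = error
            return self.kp * error + self.ki * self.integral + self.kd * derivative

    roll_loop = PID(kp=0.8, ki=0.05, kd=0.2, dt=0.02)
    bank_cmd = navigation_rules(course_error_deg=8.0)       # rule layer: 10 deg bank
    aileron = roll_loop.step(setpoint=bank_cmd, measurement=2.0)
    print(f"bank command {bank_cmd:.1f} deg -> aileron deflection {aileron:.2f}")
    ```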

  11. Incremental Learning of Context Free Grammars by Parsing-Based Rule Generation and Rule Set Search

    NASA Astrophysics Data System (ADS)

    Nakamura, Katsuhiko; Hoshina, Akemi

    This paper discusses recent improvements and extensions in the Synapse system for inductive inference of context-free grammars (CFGs) from sample strings. Synapse uses incremental learning, rule generation based on bottom-up parsing, and search over rule sets. The form of production rules in the previous system is extended from revised Chomsky normal form A→βγ, where each of β and γ is either a terminal or a nonterminal symbol, to extended Chomsky normal form, which also includes rules of the form A→B. From the result of bottom-up parsing, a rule generation mechanism synthesizes the minimum production rules required for parsing positive samples. Instead of the inductive CYK algorithm used in the previous version of Synapse, the improved version uses a novel rule generation method, called "bridging," which bridges the missing part of the derivation tree for a positive string. The improved version also employs a novel search strategy, called serial search, in addition to minimum rule set search. The synthesis of grammars by the serial search is faster than the minimum set search in most cases. On the other hand, the size of the generated CFGs is generally larger than that found by the minimum set search, and the system can find no appropriate grammar for some context-free languages using the serial search. The paper shows experimental results of incremental learning of several fundamental CFGs and compares the methods of rule generation and the search strategies.

  12. Discontinuous categories affect information-integration but not rule-based category learning.

    PubMed

    Maddox, W Todd; Filoteo, J Vincent; Lauritzen, J Scott; Connally, Emily; Hejl, Kelli D

    2005-07-01

    Three experiments were conducted that provide a direct examination of within-category discontinuity manipulations on the implicit, procedural-based learning and the explicit, hypothesis-testing systems proposed in F. G. Ashby, L. A. Alfonso-Reese, A. U. Turken, and E. M. Waldron's (1998) competition between verbal and implicit systems model. Discontinuous categories adversely affected information-integration but not rule-based category learning. Increasing the magnitude of the discontinuity did not lead to a significant decline in performance. The distance to the bound provides a reasonable description of the generalization profile associated with the hypothesis-testing system, whereas the distance to the bound plus the distance to the trained response region provides a reasonable description of the generalization profile associated with the procedural-based learning system. These results suggest that within-category discontinuity differentially impacts information-integration but not rule-based category learning and provides information regarding the detailed processing characteristics of each category learning system. ((c) 2005 APA, all rights reserved).

  13. Automated process control for plasma etching

    NASA Astrophysics Data System (ADS)

    McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn

    1992-06-01

    This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is to establish a correspondence between a particular data pattern (sensor or status signals) and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule-based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This is achieved by building a process control system (PCS) with a facility to monitor the performance of the process by obtaining and analyzing the data relating to the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points, a pattern which is represented by either regression or scoring. The pattern is passed to the rule-based module. When the rule-based system recognizes a pattern, it starts the diagnostic process using that pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.
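    The sketch below illustrates the data flow just described, with made-up run rules and diagnostic rules standing in for the PLETCHSY knowledge base: the SPC step inspects the last seven control points and, if a trend pattern is found, hands a symbolic pattern to a rule-based lookup that returns a failure mode and corrective advice.

    ```python
    # Sketch of the SPC-to-rule-base data flow described above, with illustrative
    # run rules and diagnostic rules (the actual PLETCHSY rules are not published here).
    import numpy as np

    def spc_trend(last_seven, mean, sigma):
        """Return a symbolic pattern for the last seven control points, if any."""
        pts = np.asarray(last_seven, dtype=float)
        if np.all(np.diff(pts) > 0):
            return "seven_point_rising_trend"
        if np.all(pts > mean + sigma):
            return "run_above_upper_zone"
        return None

    # Rule-based module: pattern -> suspected failure mode and corrective advice
    DIAGNOSTIC_RULES = {
        "seven_point_rising_trend": ("chamber pressure drifting up",
                                     "check throttle valve and pump condition"),
        "run_above_upper_zone": ("RF power out of control",
                                 "verify RF generator calibration"),
    }

    pattern = spc_trend([4.8, 4.9, 5.1, 5.3, 5.4, 5.6, 5.9], mean=5.0, sigma=0.3)
    if pattern is not None:
        failure_mode, advice = DIAGNOSTIC_RULES[pattern]
        print(f"pattern: {pattern} -> {failure_mode}; advice: {advice}")
    ```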

  14. A Bankruptcy Problem Approach to Load-shedding in Multiagent-based Microgrid Operation

    PubMed Central

    Kim, Hak-Man; Kinoshita, Tetsuo; Lim, Yujin; Kim, Tai-Hoon

    2010-01-01

    A microgrid is composed of distributed power generation systems (DGs), distributed energy storage devices (DSs), and loads. To maintain a specific frequency in the islanded mode as an important requirement, the control of DGs’ output and charge action of DSs are used in supply surplus conditions and load-shedding and discharge action of DSs are used in supply shortage conditions. Recently, multiagent systems for autonomous microgrid operation have been studied. Especially, load-shedding, which is intentional reduction of electricity use, is a critical problem in islanded microgrid operation based on the multiagent system. Therefore, effective schemes for load-shedding are required. Meanwhile, the bankruptcy problem deals with dividing short resources among multiple agents. In order to solve the bankruptcy problem, division rules, such as the constrained equal awards rule (CEA), the constrained equal losses rule (CEL), and the random arrival rule (RA), have been used. In this paper, we approach load-shedding as a bankruptcy problem. We compare load-shedding results by above-mentioned rules in islanded microgrid operation based on wireless sensor network (WSN) as the communication link for an agent’s interactions. PMID:22163386
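    For concreteness, the sketch below implements two of the division rules named above, the constrained equal awards (CEA) and constrained equal losses (CEL) rules, applied to a toy load-shedding case in which the available supply is divided among load agents whose claims are their demands; the multiagent and WSN machinery of the paper is omitted.

    ```python
    # Sketch of two of the division rules named above, applied to load-shedding:
    # the available supply E is divided among agents whose claims are their loads.
    # (Illustrative implementation; the paper's agent/WSN machinery is omitted.)

    def cea(claims, estate, iters=100):
        """Constrained Equal Awards: give min(claim_i, lam) with sum equal to estate."""
        lo, hi = 0.0, max(claims)
        for _ in range(iters):                 # bisection on the common award lam
            lam = (lo + hi) / 2
            if sum(min(c, lam) for c in claims) < estate:
                lo = lam
            else:
                hi = lam
        return [min(c, lam) for c in claims]

    def cel(claims, estate, iters=100):
        """Constrained Equal Losses: give max(claim_i - lam, 0) with sum equal to estate."""
        lo, hi = 0.0, max(claims)
        for _ in range(iters):                 # bisection on the common loss lam
            lam = (lo + hi) / 2
            if sum(max(c - lam, 0.0) for c in claims) > estate:
                lo = lam
            else:
                hi = lam
        return [max(c - lam, 0.0) for c in claims]

    loads = [30.0, 20.0, 10.0]   # kW demanded by three load agents
    supply = 36.0                # kW available in the islanded microgrid
    print("CEA allocation:", [round(x, 2) for x in cea(loads, supply)])   # ~[13, 13, 10]
    print("CEL allocation:", [round(x, 2) for x in cel(loads, supply)])   # ~[22, 12, 2]
    ```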

  15. A bankruptcy problem approach to load-shedding in multiagent-based microgrid operation.

    PubMed

    Kim, Hak-Man; Kinoshita, Tetsuo; Lim, Yujin; Kim, Tai-Hoon

    2010-01-01

    A microgrid is composed of distributed power generation systems (DGs), distributed energy storage devices (DSs), and loads. To maintain a specific frequency in the islanded mode as an important requirement, the control of DGs' output and charge action of DSs are used in supply surplus conditions and load-shedding and discharge action of DSs are used in supply shortage conditions. Recently, multiagent systems for autonomous microgrid operation have been studied. Especially, load-shedding, which is intentional reduction of electricity use, is a critical problem in islanded microgrid operation based on the multiagent system. Therefore, effective schemes for load-shedding are required. Meanwhile, the bankruptcy problem deals with dividing short resources among multiple agents. In order to solve the bankruptcy problem, division rules, such as the constrained equal awards rule (CEA), the constrained equal losses rule (CEL), and the random arrival rule (RA), have been used. In this paper, we approach load-shedding as a bankruptcy problem. We compare load-shedding results by above-mentioned rules in islanded microgrid operation based on wireless sensor network (WSN) as the communication link for an agent's interactions.

  16. On implementing clinical decision support: achieving scalability and maintainability by combining business rules and ontologies.

    PubMed

    Kashyap, Vipul; Morales, Alfredo; Hongsermeier, Tonya

    2006-01-01

    We present an approach and architecture for implementing scalable and maintainable clinical decision support at the Partners HealthCare System. The architecture integrates a business rules engine that executes declarative if-then rules stored in a rule-base referencing objects and methods in a business object model. The rules engine executes object methods by invoking services implemented on the clinical data repository. Specialized inferences that support classification of data and instances into classes are identified and an approach to implement these inferences using an OWL based ontology engine is presented. Alternative representations of these specialized inferences as if-then rules or OWL axioms are explored and their impact on the scalability and maintenance of the system is presented. Architectural alternatives for integration of clinical decision support functionality with the invoking application and the underlying clinical data repository; and their associated trade-offs are discussed and presented.

  17. The glia of the adult Drosophila nervous system

    PubMed Central

    Kremer, Malte C.; Jung, Christophe; Batelli, Sara; Rubin, Gerald M.

    2017-01-01

    Glia play crucial roles in the development and homeostasis of the nervous system. While the glia in the Drosophila embryo have been well characterized, their study in the adult nervous system has been limited. Here, we present a detailed description of the glia in the adult nervous system, based on the analysis of some 500 glial drivers we identified within a collection of synthetic GAL4 lines. We find that glia make up ∼10% of the cells in the nervous system and envelop all compartments of neurons (soma, dendrites, axons) as well as the nervous system as a whole. Our morphological analysis suggests a set of simple rules governing the morphogenesis of glia and their interactions with other cells. All glial subtypes minimize contact with their glial neighbors but maximize their contact with neurons and adapt their macromorphology and micromorphology to the neuronal entities they envelop. Finally, glial cells show no obvious spatial organization or registration with neuronal entities. Our detailed description of all glial subtypes and their regional specializations, together with the powerful genetic toolkit we provide, will facilitate the functional analysis of glia in the mature nervous system. PMID:28133822

  18. The arginylation branch of the N-end rule pathway positively regulates cellular autophagic flux and clearance of proteotoxic proteins

    PubMed Central

    Jiang, Yanxialei; Lee, Jeeyoung; Lee, Jung Hoon; Lee, Joon Won; Kim, Ji Hyeon; Choi, Won Hoon; Yoo, Young Dong; Cha-Molstad, Hyunjoo; Kim, Bo Yeon; Kwon, Yong Tae; Noh, Sue Ah; Kim, Kwang Pyo; Lee, Min Jae

    2016-01-01

    The N-terminal amino acid of a protein is an essential determinant of ubiquitination and subsequent proteasomal degradation in the N-end rule pathway. Using para-chloroamphetamine (PCA), a specific inhibitor of the arginylation branch of the pathway (Arg/N-end rule pathway), we identified that blocking the Arg/N-end rule pathway significantly impaired the fusion of autophagosomes with lysosomes. Under ER stress, ATE1-encoded Arg-tRNA-protein transferases carry out the N-terminal arginylation of the ER heat shock protein HSPA5 that initially targets cargo proteins, along with SQSTM1, to the autophagosome. At the late stage of autophagy, however, proteasomal degradation of arginylated HSPA5 might function as a critical checkpoint for the proper progression of autophagic flux in the cells. Consistently, the inhibition of the Arg/N-end rule pathway with PCA significantly elevated levels of MAPT and huntingtin aggregates, accompanied by increased numbers of LC3 and SQSTM1 puncta. Cells treated with the Arg/N-end rule inhibitor became more sensitized to proteotoxic stress-induced cytotoxicity. SILAC-based quantitative proteomics also revealed that PCA significantly alters various biological pathways, including cellular responses to stress, nutrient, and DNA damage, which are also closely involved in modulation of autophagic responses. Thus, our results indicate that the Arg/N-end rule pathway may function to actively protect cells from detrimental effects of cellular stresses, including proteotoxic protein accumulation, by positively regulating autophagic flux. PMID:27560450

  19. 77 FR 12055 - Comment Sought on Petition for Declaratory Ruling Interpreting the Definition of “Commercial...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-02-28

    ... Networks of California, Inc.'s Distributed Antenna Systems and Other ``Small-Cell'' Solutions AGENCY.... Petitioner states that it provides telecommunications service via Distributed Antenna Systems (DAS) and other...

  20. Expert system for web based collaborative CAE

    NASA Astrophysics Data System (ADS)

    Hou, Liang; Lin, Zusheng

    2006-11-01

    An expert system for web-based collaborative CAE was developed based on knowledge engineering, a relational database, and commercial FEA (finite element analysis) software. The architecture of the system is illustrated. In this system, the experts' experience, theories, typical examples, and other related knowledge used in the pre-processing stage of FEA were categorized into analysis-process knowledge and object knowledge. An integrated knowledge model based on object-oriented and rule-based methods is then described. The integrated reasoning process based on CBR (case-based reasoning) and rule-based reasoning is presented. Finally, the analysis process of this expert system in a web-based CAE application is illustrated, and an analysis example of a machine tool column is presented to demonstrate the validity of the system.

  1. Development of an expert system for analysis of Shuttle atmospheric revitalization and pressure control subsystem anomalies

    NASA Technical Reports Server (NTRS)

    Lafuse, Sharon A.

    1991-01-01

    The paper describes the Shuttle Leak Management Expert System (SLMES), a preprototype expert system developed to enable the ECLSS subsystem manager to analyze subsystem anomalies and to formulate flight procedures based on flight data. The SLMES combines the rule-based expert system technology with the traditional FORTRAN-based software into an integrated system. SLMES analyzes the data using rules, and, when it detects a problem that requires simulation, it sets up the input for the FORTRAN-based simulation program ARPCS2AT2, which predicts the cabin total pressure and composition as a function of time. The program simulates the pressure control system, the crew oxygen masks, the airlock repress/depress valves, and the leakage. When the simulation has completed, other SLMES rules are triggered to examine the results of simulation contrary to flight data and to suggest methods for correcting the problem. Results are then presented in form of graphs and tables.

  2. Medicare Program; Prospective Payment System and Consolidated Billing for Skilled Nursing Facilities for FY 2017, SNF Value-Based Purchasing Program, SNF Quality Reporting Program, and SNF Payment Models Research. Final rule.

    PubMed

    2016-08-05

    This final rule updates the payment rates used under the prospective payment system (PPS) for skilled nursing facilities (SNFs) for fiscal year (FY) 2017. In addition, it specifies a potentially preventable readmission measure for the Skilled Nursing Facility Value-Based Purchasing Program (SNF VBP) and implements requirements for that program, including performance standards, a scoring methodology, and a review and correction process for performance information to be made public, aimed at implementing value-based purchasing for SNFs. Additionally, this final rule includes additional policies and measures in the Skilled Nursing Facility Quality Reporting Program (SNF QRP). This final rule also responds to comments on the SNF Payment Models Research (PMR) project.

  3. Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0

    NASA Technical Reports Server (NTRS)

    Schmidt, Conrad K.

    2013-01-01

    Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.

  4. Automatic rule generation for high-level vision

    NASA Technical Reports Server (NTRS)

    Rhee, Frank Chung-Hoon; Krishnapuram, Raghu

    1992-01-01

    Many high-level vision systems use rule-based approaches to solving problems such as autonomous navigation and image understanding. The rules are usually elaborated by experts. However, this procedure may be rather tedious. In this paper, we propose a method to generate such rules automatically from training data. The proposed method is also capable of filtering out irrelevant features and criteria from the rules.

  5. PAM: Particle automata model in simulation of Fusarium graminearum pathogen expansion.

    PubMed

    Wcisło, Rafał; Miller, S Shea; Dzwinel, Witold

    2016-01-21

    The multi-scale nature and inherent complexity of biological systems are a great challenge for computer modeling and classical modeling paradigms. We present a novel particle automata modeling metaphor in the context of developing a 3D model of Fusarium graminearum infection in wheat. The system consisting of the host plant and Fusarium pathogen cells can be represented by an ensemble of discrete particles defined by a set of attributes. The cells-particles can interact with each other mimicking mechanical resistance of the cell walls and cell coalescence. The particles can move, while some of their attributes can be changed according to prescribed rules. The rules can represent cellular scales of a complex system, while the integrated particle automata model (PAM) simulates its overall multi-scale behavior. We show that due to the ability of mimicking mechanical interactions of Fusarium tip cells with the host tissue, the model is able to simulate realistic penetration properties of the colonization process reproducing both vertical and lateral Fusarium invasion scenarios. The comparison of simulation results with micrographs from laboratory experiments shows encouraging qualitative agreement between the two. Copyright © 2015 Elsevier Ltd. All rights reserved.
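    A toy sketch of the particle-automata idea follows: discrete particles carry attributes, move, and change state through local interaction rules. The particle kinds, movement bias, and energy bookkeeping are illustrative placeholders, not the authors' F. graminearum model.

    ```python
    # Minimal sketch of the particle-automata idea (illustrative names and rules,
    # not the authors' F. graminearum model): discrete particles carry attributes,
    # move, and change state according to local rules.
    import random

    class Particle:
        def __init__(self, kind, x, y, energy=1.0):
            self.kind = kind          # e.g. "host_cell" or "pathogen_tip"
            self.x, self.y = x, y
            self.energy = energy
            self.alive = True

    def step(particles, rng):
        for p in particles:
            if p.kind == "pathogen_tip" and p.alive:
                p.x += rng.choice([-1, 0, 1])          # random tip movement
                p.y += rng.choice([0, 1])              # biased invasion direction
                # interaction rule: a host cell at the same site resists mechanically
                for q in particles:
                    if q.kind == "host_cell" and q.alive and (q.x, q.y) == (p.x, p.y):
                        p.energy -= 0.3                # cost of penetrating the wall
                        if p.energy > 0:
                            q.alive = False            # wall breached, cell colonised
                p.alive = p.energy > 0

    rng = random.Random(7)
    particles = [Particle("host_cell", x, y) for x in range(5) for y in range(1, 4)]
    particles += [Particle("pathogen_tip", x, 0) for x in range(5)]
    for _ in range(10):
        step(particles, rng)
    colonised = sum(1 for p in particles if p.kind == "host_cell" and not p.alive)
    print(f"host cells breached after 10 steps: {colonised}")
    ```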

  6. Peptide array-based interaction assay of solid-bound peptides and anchorage-dependant cells and its effectiveness in cell-adhesive peptide design.

    PubMed

    Kato, Ryuji; Kaga, Chiaki; Kunimatsu, Mitoshi; Kobayashi, Takeshi; Honda, Hiroyuki

    2006-06-01

    A peptide array, a designable peptide library covalently synthesized on a cellulose support, was applied to assay the interaction between solid-bound peptides and anchorage-dependent cells for objective peptide design. As a model case, cell-adhesive peptides that could enhance cell growth as tissue-engineering scaffold materials were studied. On the peptide array, the relative cell-adhesion ratio of NIH/3T3 cells was 2.5-fold higher on the RGDS (Arg-Gly-Asp-Ser) peptide spot than on the spot with no peptide, indicating integrin-mediated peptide-cell interaction. Such strong cell adhesion mediated by the RGDS peptide was easily disrupted by single-residue substitution on the peptide array, indicating that the sequence recognition accuracy of cells was strictly conserved in our optimized scheme. The observed cellular morphological extension, with active actin stress fibers on RGD motif-containing peptides, supported our strategy that a peptide array-based interaction assay of solid-bound peptides and anchorage-dependent cells (PIASPAC) can provide quantitative data on biological peptide-cell interactions. The analysis of 180 peptides obtained from the fibronectin type III domain (no. 1447-1629) yielded 18 novel cell-adhesive peptides without the RGD motif. Together with the novel candidates, representative rules of ineffective amino acid usage were obtained from the non-effective candidate sequences for the effective design of cell-adhesive peptides. On comparing the amino acid usage of the top 20 and last 20 of the 180 peptides, the following four brief design rules were indicated: (i) Arg or Lys, positively charged amino acids (excepting His), could enhance cell adhesion; (ii) small hydrophilic amino acids are favored in cell-adhesion peptides; (iii) negatively charged amino acids and small amino acids (except Gly) could reduce cell adhesion; and (iv) Cys and Met could be excluded from the sequence combinations since they have little influence on the peptide design. Such rules, indicative of the nature of functional peptide sequences, can be obtained only by the mass comparison analysis of PIASPAC using a peptide array. By following such indicative rules, numerous amino acid combinations can be screened effectively for further examination of novel peptide designs.
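    As a rough illustration of how such design rules could be applied computationally, the sketch below scores candidate peptide sequences against rules (i)-(iv); the numeric weights are invented for illustration, and rule (iii)'s "small amino acids" clause is simplified to avoid overlap with rule (ii).

    ```python
    # Sketch: scoring candidate peptides against the four qualitative design rules
    # summarised above. The weights are illustrative, not taken from the paper.

    POSITIVE = set("RK")             # rule (i): Arg/Lys enhance adhesion (His excluded)
    SMALL_HYDROPHILIC = set("STNQ")  # rule (ii): small hydrophilic residues favoured
    NEGATIVE = set("DE")             # rule (iii): negatively charged residues reduce adhesion
    EXCLUDED = set("CM")             # rule (iv): Cys and Met excluded (little influence)

    def adhesion_score(peptide):
        """Illustrative score: higher means predicted to be more cell-adhesive."""
        score = 0.0
        for aa in peptide.upper():
            if aa in EXCLUDED:
                continue
            if aa in POSITIVE:
                score += 1.0
            elif aa in SMALL_HYDROPHILIC:
                score += 0.5
            elif aa in NEGATIVE:
                score -= 1.0
        return score

    for pep in ["RGDS", "KRSQT", "DEAAG"]:
        print(pep, adhesion_score(pep))
    ```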

  7. Evolving fuzzy rules in a learning classifier system

    NASA Technical Reports Server (NTRS)

    Valenzuela-Rendon, Manuel

    1993-01-01

    The fuzzy classifier system (FCS) combines the ideas of fuzzy logic controllers (FLC's) and learning classifier systems (LCS's). It brings together the expressive powers of fuzzy logic as it has been applied in fuzzy controllers to express relations between continuous variables, and the ability of LCS's to evolve co-adapted sets of rules. The goal of the FCS is to develop a rule-based system capable of learning in a reinforcement regime, and that can potentially be used for process control.

  8. Parallel inferencing method and apparatus for rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M. (Inventor); Moldovan, Dan (Inventor); Kuo, Steve (Inventor)

    1993-01-01

    The invention analyzes areas of conditions with an expert knowledge base of rules using plural separate nodes which fire respective rules of said knowledge base, each of said rules upon being fired altering certain of said conditions predicated upon the existence of other said conditions. The invention operates by constructing a P representation of all pairs of said rules which are input dependent or output dependent; constructing a C representation of all pairs of said rules which are communication dependent or input dependent; determining which of the rules are ready to fire by matching the predicate conditions of each rule with the conditions of said set; enabling said node means to simultaneously fire those of the rules ready to fire which are defined by said P representation as being free of input and output dependencies; and communicating from each node enabled by said enabling step the alteration of conditions by the corresponding rule to other nodes whose rules are defined by said C matrix means as being input or communication dependent upon the rule of said enabled node.
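    The sketch below illustrates, in simplified form, the dependency analysis described in the claim: rules are treated as P-dependent when they share inputs or outputs, and only mutually independent ready rules are fired in the same parallel step. The rule contents and the greedy selection are illustrative, not taken from the patent.

    ```python
    # Sketch of the dependency analysis described above (illustrative, simplified
    # from the patent): rules are marked P-dependent when they share inputs or
    # outputs, and only mutually independent ready rules are fired together.

    RULES = {
        "r1": {"if": {"A"}, "then": {"B"}},
        "r2": {"if": {"A"}, "then": {"C"}},
        "r3": {"if": {"D"}, "then": {"E"}},
        "r4": {"if": {"B"}, "then": {"F"}},
    }

    def p_dependent(r, s):
        """Input-dependent (shared conditions) or output-dependent (shared actions)."""
        return bool(RULES[r]["if"] & RULES[s]["if"]) or bool(RULES[r]["then"] & RULES[s]["then"])

    def parallel_firing_set(ready):
        """Greedily pick ready rules that are pairwise free of P dependencies."""
        chosen = []
        for r in ready:
            if all(not p_dependent(r, s) for s in chosen):
                chosen.append(r)
        return chosen

    facts = {"A", "D"}
    ready = [r for r, body in RULES.items() if body["if"] <= facts]
    fire_now = parallel_firing_set(ready)       # e.g. ['r1', 'r3']; r2 waits (shares input A)
    for r in fire_now:
        facts |= RULES[r]["then"]               # firings communicated back to the fact base
    print("fired in parallel:", fire_now, "-> facts:", sorted(facts))
    ```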

  9. An XML-Based Manipulation and Query Language for Rule-Based Information

    NASA Astrophysics Data System (ADS)

    Mansour, Essam; Höpfner, Hagen

    Rules are utilized to assist in the monitoring process that is required in activities such as disease management and customer relationship management. These rules are specified according to the application's best practices. Most research efforts emphasize the specification and execution of these rules; few focus on managing these rules as one object that has a management life-cycle. This paper presents our manipulation and query language, which was developed to facilitate the maintenance of this object during its life-cycle and to query the information contained in it. The language is based on an XML-based model. Furthermore, we evaluate the model and language using a prototype system applied to a clinical case study.

  10. Influence of dispatching rules on average production lead time for multi-stage production systems.

    PubMed

    Hübl, Alexander; Jodlbauer, Herbert; Altendorfer, Klaus

    2013-08-01

    In this paper the influence of different dispatching rules on the average production lead time is investigated. Two theorems based on the covariance between processing time and production lead time are formulated and proved theoretically. Theorem 1 analytically links the average production lead time to the "processing time weighted production lead time" for multi-stage production systems. The influence of different dispatching rules on average lead time, which is well known from simulation and empirical studies, is proved theoretically in Theorem 2 for a single-stage production system. A simulation study is conducted to gain more insight into the influence of dispatching rules on average production lead time in a multi-stage production system. We find that the "processing time weighted average production lead time" for a multi-stage production system is not invariant of the applied dispatching rule, whereas it can be used as a dispatching-rule-independent indicator for single-stage production systems.
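    The single-stage statement can be illustrated numerically: with all jobs released at time zero on one machine, the plain average lead time depends on the dispatching rule, while the processing-time-weighted average does not. The job data in the sketch below are made up, and the multi-stage analysis of the paper is not reproduced.

    ```python
    # Numerical illustration of the single-stage case: the average lead time changes
    # with the dispatching rule (FIFO vs. shortest processing time), but the
    # processing-time-weighted average lead time stays the same.

    def lead_times(processing_times):
        """Completion (lead) times for jobs processed in the given order, released at t=0."""
        t, out = 0.0, []
        for p in processing_times:
            t += p
            out.append(t)
        return out

    jobs = [5.0, 1.0, 3.0, 7.0, 2.0]            # processing times in arrival (FIFO) order

    for rule, order in [("FIFO", jobs), ("SPT", sorted(jobs))]:
        L = lead_times(order)
        avg = sum(L) / len(L)
        weighted = sum(p * l for p, l in zip(order, L)) / sum(order)
        print(f"{rule}: average lead time = {avg:.2f}, "
              f"processing-time-weighted lead time = {weighted:.2f}")
    ```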

  11. An expert system design to diagnose cancer by using a new method reduced rule base.

    PubMed

    Başçiftçi, Fatih; Avuçlu, Emre

    2018-04-01

    A Medical Expert System (MES) was developed which uses a Reduced Rule Base to diagnose cancer risk according to the symptoms present in an individual. A total of 13 symptoms were used. With the new MES, the reduced rules are checked instead of all possibilities (2^13 = 8192 different symptom combinations). By checking reduced rules, results are found more quickly. The method of two-level simplification of Boolean functions was used to obtain the Reduced Rule Base. Thanks to the developed application, which supports a configurable number of inputs and outputs on different platforms, anyone can easily test their own cancer risk. More accurate results were obtained by considering all the possibilities related to cancer. Thirteen different risk factors were determined to identify the type of cancer. The truth table produced in our study has 13 inputs and 4 outputs. The Boolean function minimization method is used to obtain fewer cases by simplifying the logical functions. Cancer is diagnosed quickly thanks to the evaluation of the four simplified output functions. Diagnosis made with the 4 output values obtained using the Reduced Rule Base was found to be quicker than diagnosis made by screening all 2^13 = 8192 possibilities. With the improved MES, more probabilities were added to the process and more accurate diagnostic results were obtained. As a result of the simplification process, a diagnosis speed gain of 100% was obtained for breast and renal cancer, and a gain of 99% for cervical and lung cancer. With Boolean function minimization, a small number of rules is evaluated instead of a large number of rules. Reducing the number of rules allows the designed system to work more efficiently, saves time, and facilitates the transfer of the rules to the designed expert system. Interfaces were developed on different software platforms to enable users to test the accuracy of the application. Anyone is able to assess their own cancer risk using the determinative risk factors, and is thereby more likely to beat the cancer through early diagnosis. Copyright © 2018 Elsevier B.V. All rights reserved.
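    The sketch below illustrates the reduced-rule-base idea on a toy truth table (four symptoms, one output) using two-level Boolean minimisation via sympy's SOPform; the paper's 13-input, 4-output table and its medical content are not reproduced.

    ```python
    # Sketch of the reduced-rule-base idea with a toy truth table (4 symptoms, one
    # output), using two-level Boolean minimisation; the paper's 13-input, 4-output
    # table and its medical content are not reproduced here.
    from sympy import symbols
    from sympy.logic import SOPform

    s1, s2, s3, s4 = symbols("s1 s2 s3 s4")

    # Rows of the toy truth table for which the output (high risk) is 1.
    minterms = [
        [1, 1, 0, 0],
        [1, 1, 0, 1],
        [1, 1, 1, 0],
        [1, 1, 1, 1],
        [0, 1, 1, 1],
    ]

    reduced = SOPform([s1, s2, s3, s4], minterms)
    print(reduced)   # e.g. (s1 & s2) | (s2 & s3 & s4): far fewer rules than the 2**4 rows

    # Diagnosis now evaluates the reduced expression instead of scanning every row.
    print(bool(reduced.subs({s1: True, s2: True, s3: False, s4: False})))   # True
    ```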

  12. Integrating the ECG power-line interference removal methods with rule-based system.

    PubMed

    Kumaravel, N; Senthil, A; Sridhar, K S; Nithiyanandam, N

    1995-01-01

    The power-line frequency interference in electrocardiographic signals is eliminated to enhance the signal characteristics for diagnosis. The power-line frequency normally varies +/- 1.5 Hz from its standard value of 50 Hz. In the present work, the performances of the linear FIR filter, the wave digital filter (WDF), and the adaptive filter are studied for power-line frequency variations from 48.5 to 51.5 Hz in steps of 0.5 Hz. The advantage of the LMS adaptive filter over other fixed-frequency filters, namely its ability to remove the power-line interference even when the interference frequency varies by +/- 1.5 Hz from its nominal value of 50 Hz, is clearly demonstrated. A novel method of integrating a rule-based system approach with the linear FIR filter and also with the wave digital filter is proposed. The performances of the rule-based FIR filter and the rule-based wave digital filter are compared with the LMS adaptive filter.
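    A minimal sketch of the kind of LMS adaptive canceller being compared is given below: a two-weight filter driven by a quadrature reference at the nominal power-line frequency subtracts its estimate of the interference from the recording. The signal, amplitudes, and step size are illustrative, not taken from the paper.

    ```python
    # Sketch of a two-weight LMS canceller for power-line interference, the kind of
    # adaptive filter the study compares against fixed-frequency FIR/wave digital
    # filters. Signal, amplitudes, and step size are illustrative.
    import numpy as np

    fs = 500.0                                  # sampling rate (Hz)
    t = np.arange(0, 10, 1 / fs)
    ecg = 0.8 * np.sin(2 * np.pi * 1.2 * t)     # stand-in for the ECG component
    interference = 0.5 * np.sin(2 * np.pi * 50.0 * t + 0.7)
    x = ecg + interference                      # contaminated recording

    # Reference inputs: quadrature pair at the nominal power-line frequency.
    ref = np.stack([np.sin(2 * np.pi * 50.0 * t), np.cos(2 * np.pi * 50.0 * t)])
    w = np.zeros(2)
    mu = 0.01                                   # LMS step size
    clean = np.empty_like(x)
    for n in range(len(x)):
        y = w @ ref[:, n]                       # current estimate of the interference
        e = x[n] - y                            # error = cleaned sample
        w += 2 * mu * e * ref[:, n]             # LMS weight update
        clean[n] = e

    # After convergence the 50 Hz component is strongly attenuated.
    print("residual RMS error vs ECG:", np.sqrt(np.mean((clean[2000:] - ecg[2000:]) ** 2)))
    ```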

  13. Gene regulatory network identification from the yeast cell cycle based on a neuro-fuzzy system.

    PubMed

    Wang, B H; Lim, J W; Lim, J S

    2016-08-30

    Many studies exist for reconstructing gene regulatory networks (GRNs). In this paper, we propose a method based on an advanced neuro-fuzzy system, for gene regulatory network reconstruction from microarray time-series data. This approach uses a neural network with a weighted fuzzy function to model the relationships between genes. Fuzzy rules, which determine the regulators of genes, are very simplified through this method. Additionally, a regulator selection procedure is proposed, which extracts the exact dynamic relationship between genes, using the information obtained from the weighted fuzzy function. Time-series related features are extracted from the original data to employ the characteristics of temporal data that are useful for accurate GRN reconstruction. The microarray dataset of the yeast cell cycle was used for our study. We measured the mean squared prediction error for the efficiency of the proposed approach and evaluated the accuracy in terms of precision, sensitivity, and F-score. The proposed method outperformed the other existing approaches.

  14. Structure-activity relationships and prediction of the phototoxicity and phototoxic potential of new drugs.

    PubMed

    Barratt, Martin D

    2004-11-01

    Relationships between the structure and properties of chemicals can be programmed into knowledge-based systems such as DEREK for Windows (DEREK is an acronym for "Deductive Estimation of Risk from Existing Knowledge"). The DEREK for Windows computer system contains a subset of over 60 rules describing chemical substructures (toxophores) responsible for skin sensitisation. As part of the European Phototox Project, the rule base was supplemented by a number of rules for the prospective identification of photoallergens, either by extension of the scope of existing rules or by the generation of new rules where a sound mechanistic rationale for the biological activity could be established. The scope of the rules for photoallergenicity was then further refined by assessment against a list of chemicals identified as photosensitisers by the Centro de Farmacovigilancia de la Comunidad Valenciana, Valencia, Spain. This paper contains an analysis of the mechanistic bases of activity for eight important groups of photoallergens and phototoxins, together with rules for the prospective identification of the photobiological activity of new or untested chemicals belonging to those classes. The mechanism of action of one additional chemical, nitrofurantoin, is well established; however, it was deemed inappropriate to write a rule on the basis of a single chemical structure.

  15. A new intuitionistic fuzzy rule-based decision-making system for an operating system process scheduler.

    PubMed

    Butt, Muhammad Arif; Akram, Muhammad

    2016-01-01

    We present a new intuitionistic fuzzy rule-based decision-making system, based on intuitionistic fuzzy sets, for a process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzifies the input values, triggers the appropriate rules of our intuitionistic fuzzy inference engine, and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process has been calculated, the ready queue is sorted in decreasing order of dp. The process with the maximum dp value is sent to the central processing unit for execution. Finally, we show the complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
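    The sketch below is a toy rendering of the scheduling flow just described: fuzzify nice value and burst time with membership and non-membership degrees, apply a rule, compute dp, and sort the ready queue. The membership functions, hesitation margin, rule, and defuzzification are placeholders, not the authors' inference engine.

    ```python
    # Toy sketch of the scheduling flow described above: fuzzify nice value and
    # burst time with membership/non-membership degrees, apply a rule, compute a
    # dynamic priority dp, and sort the ready queue. All numbers are placeholders.

    def ifs_low(x, lo, hi, hesitation=0.1):
        """Intuitionistic fuzzy 'low' set: (membership, non-membership)."""
        mu = max(0.0, min(1.0, (hi - x) / (hi - lo)))
        nu = max(0.0, 1.0 - mu - hesitation)         # leave room for a hesitation margin
        return mu, nu

    def dynamic_priority(nice, burst):
        mu_n, nu_n = ifs_low(nice, -20, 19)          # low nice value -> important process
        mu_b, nu_b = ifs_low(burst, 0, 100)          # short burst -> favour it
        # Rule: IF nice is low AND burst is short THEN dp is high (min/max aggregation).
        mu_rule = min(mu_n, mu_b)
        nu_rule = max(nu_n, nu_b)
        return mu_rule - nu_rule                     # simple defuzzification score

    ready_queue = [("backup", 10, 90), ("editor", -5, 12), ("compiler", 0, 45)]
    ranked = sorted(ready_queue, key=lambda p: dynamic_priority(p[1], p[2]), reverse=True)
    print("dispatch order:", [name for name, _, _ in ranked])
    ```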

  16. Enhancing apoptosis in TRAIL-resistant cancer cells using fundamental response rules

    PubMed Central

    Piras, Vincent; Hayashi, Kentaro; Tomita, Masaru; Selvarajoo, Kumar

    2011-01-01

    The tumor necrosis factor-related apoptosis-inducing ligand (TRAIL) induces apoptosis in malignant cells, while leaving other cells mostly unharmed. However, several carcinomas remain resistant to TRAIL. To investigate the resistance mechanisms in TRAIL-stimulated human fibrosarcoma (HT1080) cells, we developed a computational model to analyze the temporal activation profiles of cell survival (IκB, JNK, p38) and apoptotic (caspase-8 and -3) molecules in wildtype and several (FADD, RIP1, TRAF2 and caspase-8) knock-down conditions. Based on a perturbation-response approach utilizing the law of information (signaling flux) conservation, we derived response rules for the population-level average cell response. From this approach, i) a FADD-independent pathway to activate p38 and JNK, ii) a crosstalk between RIP1 and p38, and iii) a crosstalk between p62 and JNK are predicted. Notably, subsequent simulations suggest that targeting a novel molecule at the p62/sequestosome-1 junction will optimize apoptosis through signaling flux redistribution. This study offers a valuable perspective for sensitizing cells to TRAIL-based therapy. PMID:22355661

  17. A novel prosodic-information synthesizer based on recurrent fuzzy neural network for the Chinese TTS system.

    PubMed

    Lin, Chin-Teng; Wu, Rui-Cheng; Chang, Jyh-Yeong; Liang, Sheng-Fu

    2004-02-01

    In this paper, a new technique for the Chinese text-to-speech (TTS) system is proposed. Our major effort focuses on prosodic information generation. New methodologies for constructing fuzzy rules in a prosodic model simulating humans' pronunciation rules are developed. The proposed recurrent fuzzy neural network (RFNN) is a multilayer recurrent neural network (RNN) which integrates a Self-Constructing Neural Fuzzy Inference Network (SONFIN) into a recurrent connectionist structure. The RFNN can be functionally divided into two parts. The first part adopts the SONFIN as a prosodic model to explore the relationship between high-level linguistic features and prosodic information based on fuzzy inference rules. As compared to conventional neural networks, the SONFIN can always construct itself with an economic network size at high learning speed. The second part employs a five-layer network to generate all prosodic parameters by directly using the prosodic fuzzy rules inferred from the first part as well as other important features of syllables. The TTS system combined with the proposed method can reproduce not only sandhi rules but also the other prosodic phenomena present in traditional TTS systems. Moreover, the proposed scheme can even find some new rules about prosodic phrase structure. The performance of the proposed RFNN-based prosodic model is verified by embedding it into a Chinese TTS system with a Chinese monosyllable database based on the time-domain pitch synchronous overlap add (TD-PSOLA) method. Our experimental results show that the proposed RFNN can generate proper prosodic parameters including pitch means, pitch shapes, maximum energy levels, syllable duration, and pause duration. Some synthetic sounds are available online for demonstration.

  18. Solutions to time variant problems of real-time expert systems

    NASA Technical Reports Server (NTRS)

    Yeh, Show-Way; Wu, Chuan-Lin; Hung, Chaw-Kwei

    1988-01-01

    Real-time expert systems for monitoring and control are driven by input data which changes with time. One of the subtle problems of this field is the propagation of time-variant problems from rule to rule. This propagation problem is further complicated in a multiprogramming environment where the expert system may issue test commands to the system to get data and may access time-consuming devices to retrieve data for concurrent reasoning. Two approaches are used to handle the flood of input data. Snapshots can be taken to freeze the system from time to time. The expert system treats the system as a stationary one and traces changes by comparing consecutive snapshots. In the other approach, when an input is available, the rules associated with it are evaluated. For both approaches, if the premise condition of a fired rule changes to false, the downstream rules should be deactivated. If the status change is due to the disappearance of a transient problem, actions taken by the fired downstream rules which are no longer true may need to be undone. If a downstream rule is being evaluated, it should not be fired. Three mechanisms for solving this problem are discussed: tracing, backward checking, and censor setting. In the forward tracing mechanism, when the premise conditions of a fired rule become false, the premise conditions of downstream rules which have been fired or are being evaluated due to the firing of that rule are reevaluated. A tree with its root at the rule being deactivated is traversed. In the backward checking mechanism, when a rule is being fired, the expert system checks back on the premise conditions of the upstream rules that resulted in evaluation of the rule to see whether it should be fired. The root of the tree being traversed is the rule being fired. In the censor setting mechanism, when a rule is to be evaluated, a censor is constructed based on the premise conditions of the upstream rules and the censor is evaluated just before the rule is fired. Unlike the backward checking mechanism, this one does not search the upstream rules. This paper explores the details of implementation of the three mechanisms.
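    The forward tracing mechanism can be sketched as follows: when the premise of a fired rule is found to be false, the tree of rules fired downstream of it is traversed and their assertions are retracted. The rules and facts below are illustrative, not from the paper.

    ```python
    # Sketch of the forward-tracing mechanism: when the premise of a fired rule
    # becomes false, the tree of downstream rules fired because of it is revisited
    # and deactivated. Rule contents are illustrative.

    RULES = {
        "sensor_high":  {"premise": lambda f: f["temp"] > 80, "asserts": "overheat"},
        "overheat_act": {"premise": lambda f: f.get("overheat", False), "asserts": "throttle"},
        "throttle_act": {"premise": lambda f: f.get("throttle", False), "asserts": "alarm"},
    }
    DOWNSTREAM = {"sensor_high": ["overheat_act"], "overheat_act": ["throttle_act"],
                  "throttle_act": []}

    def fire(rule, facts, fired):
        if RULES[rule]["premise"](facts):
            facts[RULES[rule]["asserts"]] = True
            fired.add(rule)
            for r in DOWNSTREAM[rule]:
                fire(r, facts, fired)

    def forward_trace_retract(rule, facts, fired):
        """Re-evaluate a fired rule; if its premise no longer holds, undo the whole subtree."""
        if rule in fired and not RULES[rule]["premise"](facts):
            fired.discard(rule)
            facts[RULES[rule]["asserts"]] = False
            for r in DOWNSTREAM[rule]:
                forward_trace_retract(r, facts, fired)

    facts, fired = {"temp": 95}, set()
    fire("sensor_high", facts, fired)        # transient spike fires the whole chain
    facts["temp"] = 60                       # the transient problem disappears
    forward_trace_retract("sensor_high", facts, fired)
    print("still fired:", fired, "| facts:", facts)
    ```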

  19. A Logical Framework for Service Migration Based Survivability

    DTIC Science & Technology

    2016-06-24

    Our approach uses expert knowledge as linguistic reasoning rules and takes service program damage assessment, service program complexity, and available network capability as input. The fuzzy inference system includes four components, among them a knowledge base of fuzzy rules representing domain expert knowledge about the implications of a service migration strategy.

  20. Construction of a Clinical Decision Support System for Undergoing Surgery Based on Domain Ontology and Rules Reasoning

    PubMed Central

    Bau, Cho-Tsan; Huang, Chung-Yi

    2014-01-01

    Objective: To construct a clinical decision support system (CDSS) for undergoing surgery based on domain ontology and rules reasoning in the setting of hospitalized diabetic patients. Materials and Methods: The ontology was created with a modified ontology development method, including specification and conceptualization, formalization, implementation, and evaluation and maintenance. The Protégé–Web Ontology Language editor was used to implement the ontology. Embedded clinical knowledge was elicited to complement the domain ontology with formal concept analysis. The decision rules were translated into JENA format, which JENA can use to infer recommendations based on patient clinical situations. Results: The ontology includes 31 classes and 13 properties, plus 38 JENA rules that were built to generate recommendations. The evaluation studies confirmed the correctness of the ontology, acceptance of recommendations, satisfaction with the system, and usefulness of the ontology for glycemic management of diabetic patients undergoing surgery, especially for domain experts. Conclusions: The contribution of this research is to set up an evidence-based hybrid ontology and an evaluation method for CDSS. The system can help clinicians to achieve inpatient glycemic control in diabetic patients undergoing surgery while avoiding hypoglycemia. PMID:24730353

  1. Construction of a clinical decision support system for undergoing surgery based on domain ontology and rules reasoning.

    PubMed

    Bau, Cho-Tsan; Chen, Rung-Ching; Huang, Chung-Yi

    2014-05-01

    To construct a clinical decision support system (CDSS) for undergoing surgery based on domain ontology and rules reasoning in the setting of hospitalized diabetic patients. The ontology was created with a modified ontology development method, including specification and conceptualization, formalization, implementation, and evaluation and maintenance. The Protégé-Web Ontology Language editor was used to implement the ontology. Embedded clinical knowledge was elicited to complement the domain ontology with formal concept analysis. The decision rules were translated into JENA format, which JENA can use to infer recommendations based on patient clinical situations. The ontology includes 31 classes and 13 properties, plus 38 JENA rules that were built to generate recommendations. The evaluation studies confirmed the correctness of the ontology, acceptance of recommendations, satisfaction with the system, and usefulness of the ontology for glycemic management of diabetic patients undergoing surgery, especially for domain experts. The contribution of this research is to set up an evidence-based hybrid ontology and an evaluation method for CDSS. The system can help clinicians to achieve inpatient glycemic control in diabetic patients undergoing surgery while avoiding hypoglycemia.
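
    The paper encodes its decision logic as JENA rules over an OWL ontology; the Python sketch below is only a rough analogue of how premise conditions on a patient's clinical situation can map to recommendations. The patient attributes, thresholds, and recommendation texts are hypothetical and are not taken from the system's 38 rules.

```python
# Rough Python analogue of ontology-driven decision rules (illustrative only;
# the actual system expresses such logic as JENA rules over an OWL ontology,
# and the thresholds and recommendation texts here are hypothetical).

from dataclasses import dataclass

@dataclass
class Patient:
    fasting_glucose_mg_dl: float
    on_insulin: bool
    scheduled_for_surgery: bool

def rule_hypoglycemia_risk(p):
    if p.scheduled_for_surgery and p.on_insulin and p.fasting_glucose_mg_dl < 70:
        return "Hold insulin and recheck glucose before surgery (hypothetical rule)."

def rule_hyperglycemia(p):
    if p.scheduled_for_surgery and p.fasting_glucose_mg_dl > 180:
        return "Consider insulin adjustment per glycemic protocol (hypothetical rule)."

RULES = [rule_hypoglycemia_risk, rule_hyperglycemia]

def recommend(patient):
    """Fire every rule whose premise holds and collect its recommendation."""
    return [msg for rule in RULES if (msg := rule(patient))]

print(recommend(Patient(fasting_glucose_mg_dl=65, on_insulin=True,
                        scheduled_for_surgery=True)))
```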

  2. Use of an expert system data analysis manager for space shuttle main engine test evaluation

    NASA Technical Reports Server (NTRS)

    Abernethy, Ken

    1988-01-01

    The ability to articulate, collect, and automate the application of the expertise needed for the analysis of space shuttle main engine (SSME) test data would be of great benefit to NASA liquid rocket engine experts. This paper describes a project whose goal is to build a rule-based expert system which incorporates such expertise. Experiential expertise, collected directly from the experts currently involved in SSME data analysis, is used to build a rule base to identify engine anomalies similar to those analyzed previously. Additionally, an alternate method of expertise capture is being explored. This method would generate rules inductively based on calculations made using a theoretical model of the SSME's operation. The latter rules would be capable of diagnosing anomalies which may not have appeared before, but whose effects can be predicted by the theoretical model.

  3. Differential impact of relevant and irrelevant dimension primes on rule-based and information-integration category learning.

    PubMed

    Grimm, Lisa R; Maddox, W Todd

    2013-11-01

    Research has identified multiple category-learning systems with each being "tuned" for learning categories with different task demands and each governed by different neurobiological systems. Rule-based (RB) classification involves testing verbalizable rules for category membership while information-integration (II) classification requires the implicit learning of stimulus-response mappings. In the first study to directly test rule priming with RB and II category learning, we investigated the influence of the availability of information presented at the beginning of the task. Participants viewed lines that varied in length, orientation, and position on the screen, and were primed to focus on stimulus dimensions that were relevant or irrelevant to the correct classification rule. In Experiment 1, we used an RB category structure, and in Experiment 2, we used an II category structure. Accuracy and model-based analyses suggested that a focus on relevant dimensions improves RB task performance later in learning while a focus on an irrelevant dimension improves II task performance early in learning.

  4. CLIPS: A tool for the development and delivery of expert systems

    NASA Technical Reports Server (NTRS)

    Riley, Gary

    1991-01-01

    The C Language Integrated Production System (CLIPS) is a forward chaining rule-based language developed by the Software Technology Branch at the Johnson Space Center. CLIPS provides a complete environment for the construction of rule-based expert systems. CLIPS was designed specifically to provide high portability, low cost, and easy integration with external systems. Other key features of CLIPS include a powerful rule syntax, an interactive development environment, high performance, extensibility, a verification/validation tool, extensive documentation, and source code availability. The current release of CLIPS, version 4.3, is being used by over 2,500 users throughout the public and private community, including all NASA sites and branches of the military, numerous Federal bureaus, government contractors, 140 universities, and many companies.

  5. CATS - A process-based model for turbulent turbidite systems at the reservoir scale

    NASA Astrophysics Data System (ADS)

    Teles, Vanessa; Chauveau, Benoît; Joseph, Philippe; Weill, Pierre; Maktouf, Fakher

    2016-09-01

    The Cellular Automata for Turbidite systems (CATS) model is intended to simulate the fine architecture and facies distribution of turbidite reservoirs with a multi-event and process-based approach. The main processes of low-density turbulent turbidity flow are modeled: downslope sediment-laden flow, entrainment of ambient water, erosion and deposition of several distinct lithologies. This numerical model, derived from (Salles, 2006; Salles et al., 2007), proposes a new approach based on the Rouse concentration profile to consider the flow capacity to carry the sediment load in suspension. In CATS, the flow distribution on a given topography is modeled with local rules between neighboring cells (cellular automata) based on potential and kinetic energy balance and diffusion concepts. Input parameters are the initial flow parameters and a 3D topography at depositional time. An overview of CATS capabilities in different contexts is presented and discussed.

  6. Intelligent control for modeling of real-time reservoir operation, part II: artificial neural network with operating rule curves

    NASA Astrophysics Data System (ADS)

    Chang, Ya-Ting; Chang, Li-Chiu; Chang, Fi-John

    2005-04-01

    To bridge the gap between academic research and actual operation, we propose an intelligent control system for reservoir operation. The methodology includes two major processes: knowledge acquisition and implementation, and the inference system. In this study, a genetic algorithm (GA) and a fuzzy rule base (FRB) are used to extract knowledge from the historical inflow data, with a design objective function, and from the operating rule curves, respectively. The adaptive network-based fuzzy inference system (ANFIS) is then used to implement the knowledge, to create the fuzzy inference system, and to estimate the optimal reservoir operation. To investigate its applicability and practicability, the Shihmen reservoir, Taiwan, is used as a case study. For the purpose of comparison, a simulation of the currently used M-5 operating rule curve is also performed. The results demonstrate that (1) the GA is an efficient way to search for the optimal input-output patterns, (2) the FRB can extract the knowledge from the operating rule curves, and (3) the ANFIS models built on different types of knowledge can produce much better performance than the traditional M-5 curves in real-time reservoir operation. Moreover, we show that the model can be more intelligent for reservoir operation if more information (or knowledge) is involved.

  7. The Role of Age and Executive Function in Auditory Category Learning

    PubMed Central

    Reetzke, Rachel; Maddox, W. Todd; Chandrasekaran, Bharath

    2015-01-01

    Auditory categorization is a natural and adaptive process that allows for the organization of high-dimensional, continuous acoustic information into discrete representations. Studies in the visual domain have identified a rule-based learning system that learns and reasons via a hypothesis-testing process that requires working memory and executive attention. The rule-based learning system in vision shows a protracted development, reflecting the influence of maturing prefrontal function on visual categorization. The aim of the current study is two-fold: (a) to examine the developmental trajectory of rule-based auditory category learning from childhood through adolescence, into early adulthood; and (b) to examine the extent to which individual differences in rule-based category learning relate to individual differences in executive function. Sixty participants with normal hearing, 20 children (age range, 7–12), 21 adolescents (age range, 13–19), and 19 young adults (age range, 20–23), learned to categorize novel dynamic ripple sounds using trial-by-trial feedback. The spectrotemporally modulated ripple sounds are considered the auditory equivalent of the well-studied Gabor patches in the visual domain. Results revealed that auditory categorization accuracy improved with age, with young adults outperforming children and adolescents. Computational modeling analyses indicated that the use of the task-optimal strategy (i.e. a conjunctive rule-based learning strategy) improved with age. Notably, individual differences in executive flexibility significantly predicted auditory category learning success. The current findings demonstrate a protracted development of rule-based auditory categorization. The results further suggest that executive flexibility coupled with perceptual processes play important roles in successful rule-based auditory category learning. PMID:26491987

  8. 78 FR 55339 - Regulatory Capital Rules: Regulatory Capital, Implementation of Basel III, Capital Adequacy...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-10

    ...The Federal Deposit Insurance Corporation (FDIC) is adopting an interim final rule that revises its risk-based and leverage capital requirements for FDIC-supervised institutions. This interim final rule is substantially identical to a joint final rule issued by the Office of the Comptroller of the Currency (OCC) and the Board of Governors of the Federal Reserve System (Federal Reserve) (together, with the FDIC, the agencies). The interim final rule consolidates three separate notices of proposed rulemaking that the agencies jointly published in the Federal Register on August 30, 2012, with selected changes. The interim final rule implements a revised definition of regulatory capital, a new common equity tier 1 minimum capital requirement, a higher minimum tier 1 capital requirement, and, for FDIC-supervised institutions subject to the advanced approaches risk-based capital rules, a supplementary leverage ratio that incorporates a broader set of exposures in the denominator. The interim final rule incorporates these new requirements into the FDIC's prompt corrective action (PCA) framework. In addition, the interim final rule establishes limits on FDIC-supervised institutions' capital distributions and certain discretionary bonus payments if the FDIC-supervised institution does not hold a specified amount of common equity tier 1 capital in addition to the amount necessary to meet its minimum risk-based capital requirements. The interim final rule amends the methodologies for determining risk-weighted assets for all FDIC-supervised institutions. The interim final rule also adopts changes to the FDIC's regulatory capital requirements that meet the requirements of section 171 and section 939A of the Dodd-Frank Wall Street Reform and Consumer Protection Act. The interim final rule also codifies the FDIC's regulatory capital rules, which have previously resided in various appendices to their respective regulations, into a harmonized integrated regulatory framework. In addition, the FDIC is amending the market risk capital rule (market risk rule) to apply to state savings associations. The FDIC is issuing these revisions to its capital regulations as an interim final rule. The FDIC invites comments on the interaction of this rule with other proposed leverage ratio requirements applicable to large, systemically important banking organizations. This interim final rule otherwise contains regulatory text that is identical to the common rule text adopted as a final rule by the Federal Reserve and the OCC. This interim final rule enables the FDIC to proceed on a unified, expedited basis with the other federal banking agencies pending consideration of other issues. Specifically, the FDIC intends to evaluate this interim final rule in the context of the proposed well- capitalized and buffer levels of the supplementary leverage ratio applicable to large, systemically important banking organizations, as described in a separate Notice of Proposed Rulemaking (NPR) published in the Federal Register August 20, 2013. The FDIC is seeking commenters' views on the interaction of this interim final rule with the proposed rule regarding the supplementary leverage ratio for large, systemically important banking organizations.

  9. The Epistemology of a Rule-Based Expert System: A Framework for Explanation.

    DTIC Science & Technology

    1982-01-01

    [Garbled search-result excerpt from the report: it appears to reproduce fragments of a table relating hypotheses and rules, e.g., hypotheses (e.coli, cryptococcus) linked to rules (Rule543, Rule535) by a "concluded by" relation and hypotheses (meningitis, bacterial, steroids, alcoholic) related as "more general", and notes that considering the hypothesis "e.coli is causing meningitis" before "cryptococcus is causing meningitis" is strategic.]

  10. SCADA-based Operator Support System for Power Plant Equipment Fault Forecasting

    NASA Astrophysics Data System (ADS)

    Mayadevi, N.; Ushakumari, S. S.; Vinodchandra, S. S.

    2014-12-01

    Power plant equipment must be monitored closely to prevent failures from disrupting plant availability. Online monitoring technology integrated with hybrid forecasting techniques can be used to prevent plant equipment faults. A self-learning rule-based expert system is proposed in this paper for fault forecasting in power plants controlled by a supervisory control and data acquisition (SCADA) system. Self-learning utilizes associative data mining algorithms on the SCADA history database to form new rules that can dynamically update the knowledge base of the rule-based expert system. In this study, a number of popular associative learning algorithms are considered for rule formation. Data mining results show that the Tertius algorithm is best suited for developing a learning engine for power plants. For real-time monitoring of the plant condition, graphical models are constructed by K-means clustering. To build a time-series forecasting model, a multilayer perceptron (MLP) is used. Once created, the models are updated in the model library to provide an adaptive environment for the proposed system. A graphical user interface (GUI) illustrates the variation of all sensor values affecting a particular alarm/fault, as well as the step-by-step procedure for avoiding critical situations and consequent plant shutdown. The forecasting performance is evaluated by computing the mean absolute error and root mean square error of the predictions.
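
    The sketch below illustrates, under simplifying assumptions, only the forecasting-plus-alarm part of such a system: a multilayer perceptron trained on a sliding window of synthetic sensor history produces a one-step-ahead forecast that is checked against a hypothetical alarm threshold, and the MAE/RMSE evaluation metrics are computed. The Tertius rule mining and the K-means model library are not reproduced.

```python
# Illustrative sketch of MLP-based time-series forecasting feeding a simple
# alarm rule (not the paper's system; the data, threshold, and window size
# are hypothetical).

import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
# Synthetic sensor history: slow upward drift plus noise.
history = 60.0 + 0.05 * np.arange(500) + rng.normal(0, 0.5, 500)

WINDOW = 10
X = np.array([history[i:i + WINDOW] for i in range(len(history) - WINDOW)])
y = history[WINDOW:]

model = MLPRegressor(hidden_layer_sizes=(32,), max_iter=2000, random_state=0)
model.fit(X, y)

# One-step-ahead forecast from the most recent window.
forecast = model.predict(history[-WINDOW:].reshape(1, -1))[0]

ALARM_THRESHOLD = 90.0  # hypothetical limit for this sensor
print(f"forecast={forecast:.2f}",
      "ALARM expected" if forecast > ALARM_THRESHOLD else "normal")

# Forecast error metrics of the kind used for evaluation in the paper:
pred = model.predict(X)
mae = np.mean(np.abs(pred - y))
rmse = np.sqrt(np.mean((pred - y) ** 2))
print(f"MAE={mae:.3f}  RMSE={rmse:.3f}")
```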

  11. Self-organized, near-critical behavior during aggregation in Dictyostelium discoideum

    NASA Astrophysics Data System (ADS)

    de Palo, Giovanna; Yi, Darvin; Gregor, Thomas; Endres, Robert

    During starvation, the social amoeba Dictyostelium discoideum aggregates artfully via pattern formation into a multicellular slug and finally spores. The aggregation process is mediated by the secretion and sensing of cyclic adenosine monophosphate, leading to the synchronized movement of cells. The whole process is a remarkable example of collective behavior, spontaneously emerging from single-cell chemotaxis. Despite this phenomenon being broadly studied, a precise characterization of the transition from single cells to multicellularity has been elusive. Here, using fluorescence imaging data of thousands of cells, we investigate the role of cell shape in aggregation, demonstrating remarkable transitions in cell behavior. To better understand their functional role, we analyze cell-cell correlations and provide evidence for self-organization at the onset of aggregation (as opposed to leader cells), with features of criticality in this finite system. To capture the mechanism of self-organization, we extend a detailed single-cell model of D.discoideum chemotaxis by adding cell-cell communication. We then use these results to extract a minimal set of rules leading to aggregation in the population model. If universal, similar rules may explain other types of collective cell behavior.

  12. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW which were designed to automate functions and decisions associated with a combat aircraft's subsystems, are discussed. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Simulation and comparative workload results for two mission scenarios are given. The scenarios are inbound surface-to-air-missile attack on the aircraft and pilot incapacitation. The methodology used to develop the AUTOCREW knowledge bases is summarized. Issues involved in designing the navigation sensor selection expert in AUTOCREW's NAVIGATOR knowledge base are discussed in detail. The performance of seven navigation systems aiding a medium-accuracy INS was investigated using Kalman filter covariance analyses. A navigation sensor management (NSM) expert system was formulated from covariance simulation data using the analysis of variance (ANOVA) method and the ID3 algorithm. ANOVA results show that statistically different position accuracies are obtained when different navaids are used, the number of navaids aiding the INS is varied, the aircraft's trajectory is varied, and the performance history is varied. The ID3 algorithm determines the NSM expert's classification rules in the form of decision trees. The performance of these decision trees was assessed on two arbitrary trajectories, and the results demonstrate that the NSM expert adapts to new situations and provides reasonable estimates of the expected hybrid performance.

  13. Cellular automaton simulation examining progenitor hierarchy structure effects on mammary ductal carcinoma in situ.

    PubMed

    Bankhead, Armand; Magnuson, Nancy S; Heckendorn, Robert B

    2007-06-07

    A computer simulation is used to model ductal carcinoma in situ, a form of non-invasive breast cancer. The simulation uses known histological morphology, cell types, and stochastic cell proliferation to evolve tumorous growth within a duct. The ductal simulation is based on a hybrid cellular automaton design using genetic rules to determine each cell's behavior. The genetic rules are a mutable abstraction that demonstrate genetic heterogeneity in a population. Our goal was to examine the role (if any) that recently discovered mammary stem cell hierarchies play in genetic heterogeneity, DCIS initiation and aggressiveness. Results show that simpler progenitor hierarchies result in greater genetic heterogeneity and evolve DCIS significantly faster. However, the more complex progenitor hierarchy structure was able to sustain the rapid reproduction of a cancer cell population for longer periods of time.
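
    A toy sketch of the core idea follows, hedged heavily: each cell in a small one-dimensional automaton carries a mutable "genome" that sets its division probability, and mutation at division produces lineage heterogeneity. The duct morphology, cell types, and progenitor hierarchies of the actual simulation are not modeled.

```python
# Toy hybrid cellular automaton with mutable "genetic rules" (illustrative only).
# Each cell carries a genome that sets its division probability; mutation at
# division produces the genetic heterogeneity discussed above.

import random
random.seed(1)

GRID = 40
grid = [None] * GRID
grid[GRID // 2] = {"p_divide": 0.1, "lineage": 0}   # founder cell
next_lineage = 1

def step(grid, next_lineage, mutation_rate=0.05):
    for i, cell in enumerate(list(grid)):
        if cell is None or random.random() > cell["p_divide"]:
            continue
        empty = [j for j in (i - 1, i + 1) if 0 <= j < GRID and grid[j] is None]
        if not empty:
            continue
        child = dict(cell)
        if random.random() < mutation_rate:           # mutate the genetic rule
            child["p_divide"] = min(1.0, child["p_divide"] * 1.5)
            child["lineage"] = next_lineage
            next_lineage += 1
        grid[random.choice(empty)] = child
    return next_lineage

for _ in range(200):
    next_lineage = step(grid, next_lineage)

occupied = [c for c in grid if c]
print("cells:", len(occupied),
      "distinct lineages:", len({c["lineage"] for c in occupied}))
```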

  14. A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.

    PubMed

    Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie

    2018-06-04

    Designing effective dispatching rules for production systems is a difficult and time-consuming task if it is done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible, and automatically discovered rules are competitive with or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may restrict genetic programming from finding near-optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.
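
    The sketch below shows only how an evolved dispatching rule is applied on the shop floor, assuming the rule is a priority function over job attributes; the genetic programming search, local search heuristic, and fitness evaluators from the paper are not reproduced, and the example expression is hand-written rather than evolved.

```python
# Sketch of applying a dispatching rule (illustrative only). A rule is a
# priority function over job attributes; GP evolves the expression, and the
# shop floor applies it greedily whenever a machine becomes free.

from dataclasses import dataclass

@dataclass
class Job:
    name: str
    processing_time: float
    due_date: float
    arrival_time: float

# Hand-written stand-in for a GP-evolved expression tree, e.g.
# priority = -(processing_time + max(due_date - now, 0))
def evolved_rule(job, now):
    slack = max(job.due_date - now, 0.0)
    return -(job.processing_time + slack)

def dispatch(queue, now, rule):
    """Pick the job with the highest priority under the given rule."""
    return max(queue, key=lambda j: rule(j, now))

queue = [Job("A", 5.0, 20.0, 0.0), Job("B", 2.0, 8.0, 1.0), Job("C", 7.0, 30.0, 2.0)]
print(dispatch(queue, now=4.0, rule=evolved_rule).name)   # -> B
```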

  15. Simulation Of Combat With An Expert System

    NASA Technical Reports Server (NTRS)

    Provenzano, J. P.

    1989-01-01

    A proposed expert system predicts the outcomes of combat situations. Called COBRA (Combat Outcome Based on Rules for Attrition), the system selects rules for the mathematical modeling of losses and discrete events in combat according to previous experiences. It is used with another software module known as the "Game". The Game/COBRA software system, consisting of the Game and COBRA modules, provides for both quantitative and qualitative aspects in simulations of battles. Although COBRA is intended for the simulation of large-scale military exercises, the concepts embodied in it have much broader applicability. In industrial research, the knowledge-based system enables qualitative as well as quantitative simulations.

  16. Construction of living cellular automata using the Physarum plasmodium

    NASA Astrophysics Data System (ADS)

    Shirakawa, Tomohiro; Sato, Hiroshi; Ishiguro, Shinji

    2015-04-01

    The plasmodium of Physarum polycephalum is a unicellular and multinuclear giant amoeba that has an amorphous cell body. To clearly observe how the plasmodium makes decisions in its motile and exploratory behaviours, we developed a new experimental system to pseudo-discretize the motility of the organism. In our experimental space, which has agar surfaces arranged in a two-dimensional lattice, the continuous, omnidirectional movement of the plasmodium was restricted to stepwise moves, and the direction of locomotion was limited to the four neighbouring sites. In this way, a cellular-automaton-like system was constructed from the living cell. We further analysed the exploratory behaviours of the plasmodium by duplicating the experimental results in simulation models of cellular automata. As a result, it was revealed that the behaviours of the plasmodium cannot be reproduced by local state transition rules alone; reproducing them requires a kind of history-dependent rule setting.

  17. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1989-01-01

    The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  18. Adaptive WTA with an analog VLSI neuromorphic learning chip.

    PubMed

    Häfliger, Philipp

    2007-03-01

    In this paper, we demonstrate how a particular spike-based learning rule (where exact temporal relations between input and output spikes of a spiking model neuron determine the changes of the synaptic weights) can be tuned to express rate-based classical Hebbian learning behavior (where the average input and output spike rates are sufficient to describe the synaptic changes). This shift in behavior is controlled by the input statistic and by a single time constant. The learning rule has been implemented in a neuromorphic very large scale integration (VLSI) chip as part of a neurally inspired spike signal image processing system. The latter is the result of the European Union research project Convolution AER Vision Architecture for Real-Time (CAVIAR). Since it is implemented as a spike-based learning rule (which is most convenient in the overall spike-based system), even if it is tuned to show rate behavior, no explicit long-term average signals are computed on the chip. We show the rule's rate-based Hebbian learning ability in a classification task in both simulation and chip experiment, first with artificial stimuli and then with sensor input from the CAVIAR system.

  19. Combination Rules for Morse-Based van der Waals Force Fields.

    PubMed

    Yang, Li; Sun, Lei; Deng, Wei-Qiao

    2018-02-15

    In traditional force fields (FFs), van der Waals interactions have usually been described by Lennard-Jones potentials. Conventional combination rules for the parameters of van der Waals (VDW) cross-term interactions were developed for Lennard-Jones-based FFs. Here, we report that Morse potentials are a better function for describing VDW interactions calculated by highly precise quantum mechanics methods. A new set of combination rules was developed for Morse-based FFs, in which VDW interactions are described by Morse potentials. The new set of combination rules has been verified by comparing the second virial coefficients of 11 noble gas mixtures. For all of the mixed binaries considered in this work, the combination rules work very well and are superior to all three other existing sets of combination rules reported in the literature. We further used the Morse-based FF with the combination rules to simulate the adsorption isotherms of CH4 at 298 K in four covalent-organic frameworks (COFs). The overall agreement is very good, which supports further applications of this new set of combination rules in more realistic simulation systems.
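
    For reference, the Morse form of a pair potential is the standard textbook expression below; the paper's specific parameterization and its new cross-term combination rules are not reproduced here.

```latex
% Standard Morse pair potential between atoms i and j (textbook form; the
% cross-term combination rules proposed in the paper are not reproduced here).
V_{ij}(r) \;=\; D_{e,ij}\!\left[\bigl(1 - e^{-a_{ij}(r - r_{e,ij})}\bigr)^{2} - 1\right],
\qquad
V_{ij}(r_{e,ij}) = -D_{e,ij}, \quad V_{ij}(r \to \infty) = 0 .
```

    A combination rule then specifies how the cross-term parameters D_{e,ij}, a_{ij}, and r_{e,ij} are constructed from the pure-component values, in the same spirit as the Lorentz–Berthelot rules commonly used with Lennard-Jones potentials.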

  20. Rules for biocatalyst and reaction engineering to implement effective, NAD(P)H-dependent, whole cell bioreductions

    PubMed Central

    Kratzer, Regina; Woodley, John M.; Nidetzky, Bernd

    2016-01-01

    Access to chiral alcohols of high optical purity is today frequently provided by the enzymatic reduction of precursor ketones. However, bioreductions are complicated by the need for reducing equivalents in the form of NAD(P)H. The high price and molecular weight of NAD(P)H necessitate in situ recycling of catalytic quantities, which is mostly accomplished by enzymatic oxidation of a cheap co-substrate. The coupled oxidoreduction can be either performed by free enzymes in solution or by whole cells. Reductase selection, the decision between cell-free and whole cell reduction system, coenzyme recycling mode and reaction conditions represent design options that strongly affect bioreduction efficiency. In this paper, each option was critically scrutinized and decision rules formulated based on well-described literature examples. The development chain was visualized as a decision-tree that can be used to identify the most promising route towards the production of a specific chiral alcohol. General methods, applications and bottlenecks in the set-up are presented and key experiments required to “test” for decision-making attributes are defined. The reduction of o-chloroacetophenone to (S)-1-(2-chlorophenyl)ethanol was used as one example to demonstrate all the development steps. Detailed analysis of reported large scale bioreductions identified product isolation as a major bottleneck in process design. PMID:26343336

  1. Representation and the Rules of the Game: An Electoral Simulation

    ERIC Educational Resources Information Center

    Hoffman, Donna R.

    2009-01-01

    It is often a difficult proposition for introductory American government students to comprehend different electoral systems and how the rules of the game affect the representation that results. I have developed a simulation in which different proportional-based electoral systems are compared with a single-member plurality electoral system. In…

  2. Heat exchanger expert system logic

    NASA Technical Reports Server (NTRS)

    Cormier, R.

    1988-01-01

    The reduction is described of the operation and fault diagnostics of a Deep Space Network heat exchanger to a rule base by the application of propositional calculus to a set of logic statements. The value of this approach lies in the ease of converting the logic and subsequently implementing it on a computer as an expert system. The rule base was written in Process Intelligent Control software.

  3. Applications of Machine Learning and Rule Induction,

    DTIC Science & Technology

    1995-02-15

    An important area of application for machine learning is in automating the acquisition of knowledge bases required for expert systems. In this paper...we review the major paradigms for machine learning, including neural networks, instance-based methods, genetic learning, rule induction, and analytic

  4. Robust Strategy for Rocket Engine Health Monitoring

    NASA Technical Reports Server (NTRS)

    Santi, L. Michael

    2001-01-01

    Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.
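
    As an illustration of the rule-based category only, the sketch below checks a few hypothetical sensor measurements against simple limit and combination rules and reports a diagnosis; the sensor names, limits, and causal attributions are invented and far simpler than the causality assignment discussed above.

```python
# Minimal sketch of rule-based health assessment over sensor measurements
# (illustrative only; sensor names, limits, and diagnoses are hypothetical).

measurements = {
    "chamber_pressure_psi": 2980.0,
    "turbine_discharge_temp_K": 1150.0,
    "fuel_flow_kg_s": 67.2,
}

# Each rule: (name, predicate over measurements, diagnosis if triggered)
RULES = [
    ("low_chamber_pressure",
     lambda m: m["chamber_pressure_psi"] < 2900.0,
     "possible injector or feed-system degradation"),
    ("high_turbine_temp",
     lambda m: m["turbine_discharge_temp_K"] > 1100.0,
     "possible turbine blade erosion or mixture-ratio shift"),
    ("combined_anomaly",
     lambda m: m["chamber_pressure_psi"] < 2900.0
               and m["fuel_flow_kg_s"] > 70.0,
     "pressure/flow mismatch: suspect leak downstream of flowmeter"),
]

findings = [(name, diag) for name, pred, diag in RULES if pred(measurements)]
print(findings if findings else "no rule triggered: nominal")
```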

  5. Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Havelund, Klaus; Morris, Robert A.

    2011-01-01

    Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground-based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TraceContract, to write executable specifications of flight rules. TraceContract can generally be applied to the analysis of, for example, log files, or to the online monitoring of executing systems.

  6. A robotically constructed production and supply base on Phobos

    NASA Astrophysics Data System (ADS)

    1989-05-01

    PHOBIA Corporation is involved with the design of a man-tenable robotically constructed, bootstrap base on Mars' moon, Phobos. This base will be a pit-stop for future manned missions to Mars and beyond and will be a control facility during the robotic construction of a Martian base. An introduction is given to the concepts and the ground rules followed during the design process. Details of a base design and its location are given along with information about some of the subsystems. Since a major purpose of the base is to supply fuel to spacecraft so they can limit their fuel mass, mining and production systems are discussed. Surface support activities such as docks, anchors, and surface transportation systems are detailed. Several power supplies for the base are investigated and include fuel cells and a nuclear reactor. Tasks for the robots are defined along with descriptions of the robots capable of completing the tasks. Finally, failure modes for the entire PHOBIA Corporation design are presented along with an effects analysis and preventative recommendations.

  7. A robotically constructed production and supply base on Phobos

    NASA Technical Reports Server (NTRS)

    1989-01-01

    PHOBIA Corporation is involved with the design of a man-tenable robotically constructed, bootstrap base on Mars' moon, Phobos. This base will be a pit-stop for future manned missions to Mars and beyond and will be a control facility during the robotic construction of a Martian base. An introduction is given to the concepts and the ground rules followed during the design process. Details of a base design and its location are given along with information about some of the subsystems. Since a major purpose of the base is to supply fuel to spacecraft so they can limit their fuel mass, mining and production systems are discussed. Surface support activities such as docks, anchors, and surface transportation systems are detailed. Several power supplies for the base are investigated and include fuel cells and a nuclear reactor. Tasks for the robots are defined along with descriptions of the robots capable of completing the tasks. Finally, failure modes for the entire PHOBIA Corporation design are presented along with an effects analysis and preventative recommendations.

  8. Medicare and Medicaid programs; fire safety requirements for certain health care facilities; amendment. Final rule.

    PubMed

    2006-09-22

    This final rule adopts the substance of the April 15, 2004 tentative interim amendment (TIA) 00-1 (101), Alcohol Based Hand Rub Solutions, an amendment to the 2000 edition of the Life Safety Code, published by the National Fire Protection Association (NFPA). This amendment allows certain health care facilities to place alcohol-based hand rub dispensers in egress corridors under specified conditions. This final rule also requires that nursing facilities at least install battery-operated single station smoke alarms in resident rooms and common areas if they are not fully sprinklered or they do not have system-based smoke detectors in those areas. Finally, this final rule confirms as final the provisions of the March 25, 2005 interim final rule with changes and responds to public comments on that rule.

  9. Symbolic rule-based classification of lung cancer stages from free-text pathology reports.

    PubMed

    Nguyen, Anthony N; Lawley, Michael J; Hansen, David P; Bowman, Rayleen V; Clarke, Belinda E; Duhig, Edwina E; Colquist, Shoni

    2010-01-01

    To automatically classify lung tumor-node-metastases (TNM) cancer stages from free-text pathology reports using symbolic rule-based classification. By exploiting report substructure and the symbolic manipulation of systematized nomenclature of medicine-clinical terms (SNOMED CT) concepts in reports, statements in free text can be evaluated for relevance against factors relating to the staging guidelines. Post-coordinated SNOMED CT expressions based on templates were defined and populated by concepts in reports, and tested for subsumption by staging factors. The subsumption results were used to build logic according to the staging guidelines to calculate the TNM stage. The accuracy measure and confusion matrices were used to evaluate the TNM stages classified by the symbolic rule-based system. The system was evaluated against a database of multidisciplinary team staging decisions and a machine learning-based text classification system using support vector machines. Overall accuracies on a corpus of pathology reports for 718 lung cancer patients, measured against a database of pathological TNM staging decisions, were 72%, 78%, and 94% for T, N, and M staging, respectively. The system's performance was also comparable to support vector machine classification approaches. A system to classify lung TNM stages from free-text pathology reports was developed, and it was verified that the symbolic rule-based approach using SNOMED CT can be used for the extraction of key lung cancer characteristics from free-text reports. Future work will investigate the applicability of using the proposed methodology for extracting other cancer characteristics and types.
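
    A much-simplified sketch of the staging logic is given below; the real system evaluates post-coordinated SNOMED CT expressions for subsumption by staging factors, whereas this Python analogue works from already-extracted factors, and the size thresholds and node criteria are only rough approximations of the guidelines.

```python
# Simplified sketch of rule-based TNM stage assignment from extracted report
# factors (illustrative only; thresholds and criteria are approximations and
# omit many T/N/M conditions, and the SNOMED CT subsumption step is not shown).

def t_stage(tumor_size_mm, invades_chest_wall):
    if invades_chest_wall:
        return "T3"                      # simplified: ignores other T3/T4 criteria
    if tumor_size_mm <= 30:
        return "T1"
    if tumor_size_mm <= 70:
        return "T2"
    return "T3"

def n_stage(ipsilateral_hilar_nodes, mediastinal_nodes):
    if mediastinal_nodes:
        return "N2"
    return "N1" if ipsilateral_hilar_nodes else "N0"

def m_stage(distant_metastasis):
    return "M1" if distant_metastasis else "M0"

report_factors = {                       # hypothetical extracted factors
    "tumor_size_mm": 42,
    "invades_chest_wall": False,
    "ipsilateral_hilar_nodes": True,
    "mediastinal_nodes": False,
    "distant_metastasis": False,
}

stage = (t_stage(report_factors["tumor_size_mm"], report_factors["invades_chest_wall"]),
         n_stage(report_factors["ipsilateral_hilar_nodes"], report_factors["mediastinal_nodes"]),
         m_stage(report_factors["distant_metastasis"]))
print("".join(stage))                    # e.g. T2N1M0
```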

  10. On the effects of adaptive reservoir operating rules in hydrological physically-based models

    NASA Astrophysics Data System (ADS)

    Giudici, Federico; Anghileri, Daniela; Castelletti, Andrea; Burlando, Paolo

    2017-04-01

    Recent years have seen a significant increase of the human influence on the natural systems both at the global and local scale. Accurately modeling the human component and its interaction with the natural environment is key to characterize the real system dynamics and anticipate future potential changes to the hydrological regimes. Modern distributed, physically-based hydrological models are able to describe hydrological processes with high level of detail and high spatiotemporal resolution. Yet, they lack in sophistication for the behavior component and human decisions are usually described by very simplistic rules, which might underperform in reproducing the catchment dynamics. In the case of water reservoir operators, these simplistic rules usually consist of target-level rule curves, which represent the average historical level trajectory. Whilst these rules can reasonably reproduce the average seasonal water volume shifts due to the reservoirs' operation, they cannot properly represent peculiar conditions, which influence the actual reservoirs' operation, e.g., variations in energy price or water demand, dry or wet meteorological conditions. Moreover, target-level rule curves are not suitable to explore the water system response to climate and socio economic changing contexts, because they assume a business-as-usual operation. In this work, we quantitatively assess how the inclusion of adaptive reservoirs' operating rules into physically-based hydrological models contribute to the proper representation of the hydrological regime at the catchment scale. In particular, we contrast target-level rule curves and detailed optimization-based behavioral models. We, first, perform the comparison on past observational records, showing that target-level rule curves underperform in representing the hydrological regime over multiple time scales (e.g., weekly, seasonal, inter-annual). Then, we compare how future hydrological changes are affected by the two modeling approaches by considering different future scenarios comprising climate change projections of precipitation and temperature and projections of electricity prices. We perform this comparative assessment on the real-world water system of Lake Como catchment in the Italian Alps, which is characterized by the massive presence of artificial hydropower reservoirs heavily altering the natural hydrological regime. The results show how different behavioral model approaches affect the system representation in terms of hydropower performance, reservoirs dynamics and hydrological regime under different future scenarios.

  11. Benchmarking CRISPR on-target sgRNA design.

    PubMed

    Yan, Jifang; Chuai, Guohui; Zhou, Chi; Zhu, Chenyu; Yang, Jing; Zhang, Chao; Gu, Feng; Xu, Han; Wei, Jia; Liu, Qi

    2017-02-15

    CRISPR (Clustered Regularly Interspaced Short Palindromic Repeats)-based gene editing has been widely implemented in various cell types and organisms. A major challenge in the effective application of the CRISPR system is the need to design highly efficient single-guide RNA (sgRNA) with minimal off-target cleavage. Several tools are available for sgRNA design, but only limited comparisons among them have been performed. In our opinion, benchmarking the performance of the available tools and indicating their applicable scenarios are important issues. Moreover, whether the reported sgRNA design rules are reproducible across different sgRNA libraries, cell types and organisms remains unclear. In our study, a systematic and unbiased benchmark of the sgRNA predicting efficacy was performed on nine representative on-target design tools, based on six benchmark data sets covering five different cell types. The benchmark study presented here provides novel quantitative insights into the available CRISPR tools.

  12. Integration of object-oriented knowledge representation with the CLIPS rule based system

    NASA Technical Reports Server (NTRS)

    Logie, David S.; Kamil, Hasan

    1990-01-01

    The paper describes a portion of the work aimed at developing an integrated, knowledge based environment for the development of engineering-oriented applications. An Object Representation Language (ORL) was implemented in C++ which is used to build and modify an object-oriented knowledge base. The ORL was designed in such a way so as to be easily integrated with other representation schemes that could effectively reason with the object base. Specifically, the integration of the ORL with the rule based system C Language Production Systems (CLIPS), developed at the NASA Johnson Space Center, will be discussed. The object-oriented knowledge representation provides a natural means of representing problem data as a collection of related objects. Objects are comprised of descriptive properties and interrelationships. The object-oriented model promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects. Data is inherited through an object network via the relationship links. Together, the two schemes complement each other in that the object-oriented approach efficiently handles problem data while the rule based knowledge is used to simulate the reasoning process. Alone, the object based knowledge is little more than an object-oriented data storage scheme; however, the CLIPS inference engine adds the mechanism to directly and automatically reason with that knowledge. In this hybrid scheme, the expert system dynamically queries for data and can modify the object base with complete access to all the functionality of the ORL from rules.

  13. Knowledge-Based Motion Control of AN Intelligent Mobile Autonomous System

    NASA Astrophysics Data System (ADS)

    Isik, Can

    An Intelligent Mobile Autonomous System (IMAS), which is equipped with vision and low level sensors to cope with unknown obstacles, is modeled as a hierarchy of path planning and motion control. This dissertation concentrates on the lower level of this hierarchy (Pilot) with a knowledge-based controller. The basis of a theory of knowledge-based controllers is established, using the example of the Pilot level motion control of IMAS. In this context, the knowledge-based controller with a linguistic world concept is shown to be adequate for the minimum time control of an autonomous mobile robot motion. The Pilot level motion control of IMAS is approached in the framework of production systems. The three major components of the knowledge-based control that are included here are the hierarchies of the database, the rule base and the rule evaluator. The database, which is the representation of the state of the world, is organized as a semantic network, using a concept of minimal admissible vocabulary. The hierarchy of rule base is derived from the analytical formulation of minimum-time control of IMAS motion. The procedure introduced for rule derivation, which is called analytical model verbalization, utilizes the concept of causalities to describe the system behavior. A realistic analytical system model is developed and the minimum-time motion control in an obstacle strewn environment is decomposed to a hierarchy of motion planning and control. The conditions for the validity of the hierarchical problem decomposition are established, and the consistency of operation is maintained by detecting the long term conflicting decisions of the levels of the hierarchy. The imprecision in the world description is modeled using the theory of fuzzy sets. The method developed for the choice of the rule that prescribes the minimum-time motion control among the redundant set of applicable rules is explained and the usage of fuzzy set operators is justified. Also included in the dissertation are the description of the computer simulation of Pilot within the hierarchy of IMAS control and the simulated experiments that demonstrate the theoretical work.

  14. CLIPS: A tool for corn disease diagnostic system and an aid to neural network for automated knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Wu, Cathy; Taylor, Pam; Whitson, George; Smith, Cathy

    1990-01-01

    This paper describes the building of a corn disease diagnostic expert system using CLIPS, and the development of a neural expert system using the fact representation method of CLIPS for automated knowledge acquisition. The CLIPS corn expert system diagnoses 21 diseases from 52 symptoms and signs with certainty factors. CLIPS has several unique features. It allows the facts in rules to be broken down into object-attribute-value (OAV) triples, allows rule grouping, and fires rules based on pattern matching. These features, combined with the chained inference engine, result in a natural user query system and speedy execution. In order to develop a method for automated knowledge acquisition, an Artificial Neural Expert System (ANES) is developed by a direct mapping from the CLIPS system. The ANES corn expert system uses the same OAV triples in the CLIPS system for its facts. The LHS and RHS facts of the CLIPS rules are mapped into the input and output layers of the ANES, respectively, and the inference engine of the rules is embedded in the hidden layer. The fact representation by OAV triples gives a natural grouping of the rules. These features allow the ANES system to automate rule generation, and make it efficient to execute and easy to expand for a large and complex domain.

  15. Implementation of a spike-based perceptron learning rule using TiO2-x memristors.

    PubMed

    Mostafa, Hesham; Khiat, Ali; Serb, Alexander; Mayr, Christian G; Indiveri, Giacomo; Prodromakis, Themis

    2015-01-01

    Synaptic plasticity plays a crucial role in allowing neural networks to learn and adapt to various input environments. Neuromorphic systems need to implement plastic synapses to obtain basic "cognitive" capabilities such as learning. One promising and scalable approach for implementing neuromorphic synapses is to use nano-scale memristors as synaptic elements. In this paper we propose a hybrid CMOS-memristor system comprising CMOS neurons interconnected through TiO2-x memristors, and spike-based learning circuits that modulate the conductance of the memristive synapse elements according to a spike-based Perceptron plasticity rule. We highlight a number of advantages for using this spike-based plasticity rule as compared to other forms of spike timing dependent plasticity (STDP) rules. We provide experimental proof-of-concept results with two silicon neurons connected through a memristive synapse that show how the CMOS plasticity circuits can induce stable changes in memristor conductances, giving rise to increased synaptic strength after a potentiation episode and to decreased strength after a depression episode.
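
    The following is an abstract software sketch of a perceptron-style plasticity rule driven by spike events, under stated assumptions: the memristive synapse circuits, the CMOS neurons, and the chip's exact timing conditions are not modeled, and the rates, threshold, and learning rate are hypothetical.

```python
# Abstract sketch of a perceptron-style plasticity rule driven by spike
# activity (illustrative only; not the memristor circuit described above).

import random
random.seed(0)

def spikes(rate, steps):
    """Poisson-like spike train as a list of 0/1 events."""
    return [1 if random.random() < rate else 0 for _ in range(steps)]

def train(patterns, steps=200, lr=0.01, threshold=5.0):
    w = [0.0, 0.0]
    for _ in range(50):                          # training epochs
        for rates, target in patterns:
            trains = [spikes(r, steps) for r in rates]
            drive = sum(wi * sum(tr) for wi, tr in zip(w, trains))
            output = 1 if drive > threshold else 0
            err = target - output                # perceptron error term
            # Potentiate/depress each synapse in proportion to its input
            # activity, mimicking a spike-gated change of synaptic strength.
            w = [wi + lr * err * sum(tr) / steps for wi, tr in zip(w, trains)]
    return w

# Two input channels; the task is to respond to a high rate on channel 0 only.
patterns = [((0.8, 0.1), 1), ((0.1, 0.8), 0)]
print([round(wi, 3) for wi in train(patterns)])   # w[0] grows, w[1] stays small
```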

  16. Simple methods of exploiting the underlying structure of rule-based systems

    NASA Technical Reports Server (NTRS)

    Hendler, James

    1986-01-01

    Much recent work in the field of expert systems research has aimed at exploiting the underlying structures of the rule base for reasons of analysis. Such techniques as Petri-nets and GAGs have been proposed as representational structures that will allow complete analysis. Much has been made of proving isomorphisms between the rule bases and the mechanisms, and in examining the theoretical power of this analysis. In this paper we describe some early work in a new system which has much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and a FSA description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.

  17. A hybrid intelligence approach to artifact recognition in digital publishing

    NASA Astrophysics Data System (ADS)

    Vega-Riveros, J. Fernando; Santos Villalobos, Hector J.

    2006-02-01

    The system presented integrates rule-based and case-based reasoning for artifact recognition in Digital Publishing. In Variable Data Printing (VDP), human proofing can be prohibitive, since a job may contain millions of different instances exhibiting two types of artifacts: (1) evident defects, such as a text overflow or overlapping; and (2) style-dependent artifacts, subtle defects that show up as inconsistencies with regard to the original job design. We designed a Knowledge-Based Artifact Recognition tool for document segmentation, layout understanding, artifact detection, and document design quality assessment. Document evaluation is constrained by reference to one instance of the VDP job proofed by a human expert against the remaining instances. Fundamental rules of document design are used in the rule-based component for document segmentation and layout understanding. Ambiguities in the design principles not covered by the rule-based system are analyzed by case-based reasoning, using the Nearest Neighbor Algorithm, where features from previous jobs are used to detect artifacts and inconsistencies within the document layout. We used a subset of XSL-FO and assembled a set of 44 document samples. The system detected all the job layout changes, while obtaining an overall average accuracy of 84.56%, with the highest accuracy, 92.82%, for overlapping and the lowest, 66.7%, for lack of white space.
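
    The hybrid decision step can be sketched as follows, with hypothetical features and cases: a hard layout rule is checked first, and if no rule fires, a nearest-neighbour comparison against features from previously proofed instances supplies the style judgement.

```python
# Sketch of a hybrid rule-based / case-based decision step (illustrative only;
# feature names, values, and case labels are hypothetical).

import math

def rule_based_check(features):
    """Hard design rules, e.g. text must not overflow its frame."""
    if features["text_overflow_px"] > 0:
        return "defect: text overflow"
    return None

def nearest_neighbour_check(features, cases):
    """1-NN over style features from previously proofed instances."""
    def dist(a, b):
        return math.dist([a["font_size_pt"], a["whitespace_ratio"]],
                         [b["font_size_pt"], b["whitespace_ratio"]])
    best = min(cases, key=lambda c: dist(features, c))
    return best["label"]

cases = [
    {"font_size_pt": 10.0, "whitespace_ratio": 0.30, "label": "consistent with design"},
    {"font_size_pt": 14.0, "whitespace_ratio": 0.05, "label": "style artifact: lack of white space"},
]

instance = {"text_overflow_px": 0, "font_size_pt": 13.5, "whitespace_ratio": 0.08}
print(rule_based_check(instance) or nearest_neighbour_check(instance, cases))
```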

  18. Breaking the rules? X-ray examination of hematopoietic stem cell grafts at international airports.

    PubMed

    Petzer, Andreas L; Speth, Hans-Georg; Hoflehner, Elisabeth; Clausen, Johannes; Nachbaur, David; Gastl, Günther; Gunsilius, Eberhard

    2002-06-15

    Hematopoietic stem cell grafts from unrelated donors are commonly transported by aircraft. They must not be subjected to x-rays during security checks, which may cause inconvenient discussions between the courier and the airport security staff. We exposed hematopoietic stem cells from mobilized peripheral blood to a widely used x-ray hand-luggage control system. Cell viability as well as growth in vitro of mature progenitor cells (colony-forming cells), primitive progenitor cells (long-term culture-initiating cells), and lymphocytes were not altered even after 10 passages through the hand-luggage control system. Thus, repeated exposure to the low radiation dose of hand-luggage control systems (1.5 +/- 0.6 microSv per exposure) seems to be harmless for hematopoietic stem cells, which should simplify the international transport of stem cell grafts.

  19. Equations for Scoring Rules When Data Are Missing

    NASA Technical Reports Server (NTRS)

    James, Mark

    2006-01-01

    A document presents equations for scoring rules in a diagnostic and/or prognostic artificial-intelligence software system of the rule-based inference-engine type. The equations define a set of metrics that characterize the evaluation of a rule when data required for the antecedent clause(s) of the rule are missing. The metrics include a primary measure denoted the rule completeness metric (RCM) plus a number of subsidiary measures that contribute to the RCM. The RCM is derived from an analysis of a rule with respect to its truth and a measure of the completeness of its input data. The derivation is such that the truth value of an antecedent is independent of the measure of its completeness. The RCM can be used to compare the degree of completeness of two or more rules with respect to a given set of data. Hence, the RCM can be used as a guide to choosing among rules during the rule-selection phase of operation of the artificial-intelligence system.
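
    Since the document's actual equations are not reproduced in this abstract, the sketch below only conveys the idea with a hypothetical completeness measure: the fraction of a rule's antecedent clauses whose data are available, computed independently of the truth value obtained from the clauses that can be evaluated.

```python
# Hypothetical illustration of a rule-completeness measure (the document's
# actual RCM equations are not reproduced; this only conveys the idea of
# scoring data availability separately from truth).

def completeness(antecedents, data):
    """Fraction of antecedent clauses whose required datum is available."""
    have = sum(1 for clause in antecedents if clause["signal"] in data)
    return have / len(antecedents)

rule = {
    "name": "pump_cavitation_warning",
    "antecedents": [
        {"signal": "inlet_pressure", "test": lambda v: v < 30.0},
        {"signal": "pump_speed",     "test": lambda v: v > 9000.0},
        {"signal": "vibration_rms",  "test": lambda v: v > 2.5},
    ],
}

data = {"inlet_pressure": 25.0, "pump_speed": 9500.0}   # vibration_rms missing

rcm = completeness(rule["antecedents"], data)
truth = all(c["test"](data[c["signal"]]) for c in rule["antecedents"]
            if c["signal"] in data)                      # truth over available data only
print(f"completeness={rcm:.2f}, truth-of-available={truth}")
```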

  20. Retinal Connectomics: Towards Complete, Accurate Networks

    PubMed Central

    Marc, Robert E.; Jones, Bryan W.; Watt, Carl B.; Anderson, James R.; Sigulinsky, Crystal; Lauritzen, Scott

    2013-01-01

    Connectomics is a strategy for mapping complex neural networks based on high-speed automated electron optical imaging, computational assembly of neural data volumes, web-based navigational tools to explore 10^12–10^15 byte (terabyte to petabyte) image volumes, and annotation and markup tools to convert images into rich networks with cellular metadata. These collections of network data and associated metadata, analyzed using tools from graph theory and classification theory, can be merged with classical systems theory, giving a more completely parameterized view of how biologic information processing systems are implemented in retina and brain. Networks have two separable features: topology and connection attributes. The first findings from connectomics strongly validate the idea that the topologies of complete retinal networks are far more complex than the simple schematics that emerged from classical anatomy. In particular, connectomics has permitted an aggressive refactoring of the retinal inner plexiform layer, demonstrating that network function cannot be simply inferred from stratification; exposing the complex geometric rules for inserting different cells into a shared network; revealing unexpected bidirectional signaling pathways between mammalian rod and cone systems; documenting selective feedforward systems, novel candidate signaling architectures, new coupling motifs, and the highly complex architecture of the mammalian AII amacrine cell. This is but the beginning, as the underlying principles of connectomics are readily transferrable to non-neural cell complexes and provide new contexts for assessing intercellular communication. PMID:24016532

  1. A rule-based system for real-time analysis of control systems

    NASA Astrophysics Data System (ADS)

    Larson, Richard R.; Millard, D. Edward

    1992-10-01

    An approach to automate the real-time analysis of flight critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic of real-time data. This technique has been applied for ground verification and validation testing and flight testing monitoring where quick, real-time, safety-of-flight decisions can be very critical. In many cases post processing and manual analysis of flight system data are not required. The processing is described of real-time data for analysis along with the output format which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.

  2. Model-Based Anomaly Detection for a Transparent Optical Transmission System

    NASA Astrophysics Data System (ADS)

    Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.

    In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.
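
    The chapter's actual models are not reproduced in this summary, so the following Python sketch only illustrates the general pattern under stated assumptions: a physics-based model supplies an expected span loss, measurement noise is treated as roughly Gaussian, and the alarm decision rule compares the model-measurement residual against a k-sigma threshold. The function name and all numbers are hypothetical.

    ```python
    # Minimal sketch, not the chapter's actual method: a physics-based prediction of
    # span loss (dB) is compared with a measurement, and an alarm rule is derived from
    # the residual relative to an assumed Gaussian measurement-noise level.

    def alarm_rule(predicted_loss_db, measured_loss_db, noise_sigma_db, k=3.0):
        """Flag a measurement whose deviation from the physics model exceeds k sigma."""
        residual = measured_loss_db - predicted_loss_db
        if abs(residual) <= k * noise_sigma_db:
            return "normal (deviation within expected measurement error)"
        # Further rules (not shown) would use pump telemetry to separate
        # anomalous span losses from pump failures.
        return "alarm: deviation of %.2f dB exceeds %.1f-sigma threshold" % (residual, k)

    # Hypothetical numbers for illustration only.
    print(alarm_rule(predicted_loss_db=21.0, measured_loss_db=21.2, noise_sigma_db=0.1))
    print(alarm_rule(predicted_loss_db=21.0, measured_loss_db=22.5, noise_sigma_db=0.1))
    ```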

  3. A rule-based system for real-time analysis of control systems

    NASA Technical Reports Server (NTRS)

    Larson, Richard R.; Millard, D. Edward

    1992-01-01

    An approach to automate the real-time analysis of flight-critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic of real-time data. This technique has been applied to ground verification and validation testing and to flight-test monitoring, where quick, real-time, safety-of-flight decisions can be very critical. In many cases, post-processing and manual analysis of flight system data are not required. The processing of real-time data for analysis is described, along with the output format, which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.

  4. META II Complex Systems Design and Analysis (CODA)

    DTIC Science & Technology

    2011-08-01

    Excerpt from the report's front matter (table of contents and list of figures): 3.8.7 Variables, Parameters and Constraints; 3.8.8 Objective; Figure 7: Inputs, States, Outputs and Parameters of System Requirements Specifications; Design Rule Based on Device Parameter; Figure 35: AEE Device Design Rules (excerpt).

  5. Search for Minimal and Semi-Minimal Rule Sets in Incremental Learning of Context-Free and Definite Clause Grammars

    NASA Astrophysics Data System (ADS)

    Imada, Keita; Nakamura, Katsuhiko

    This paper describes recent improvements to the Synapse system for incremental learning of general context-free grammars (CFGs) and definite clause grammars (DCGs) from positive and negative sample strings. An important feature of our approach is incremental learning, which is realized by a rule generation mechanism called “bridging”, based on bottom-up parsing of positive samples and the search for rule sets. The sizes of rule sets and the computation time depend on the search strategies. In addition to the global search for synthesizing minimal rule sets and the serial search, another method that synthesizes semi-optimum rule sets, we incorporate beam search into the system for synthesizing semi-minimal rule sets. The paper shows several experimental results on learning CFGs and DCGs, and we analyze the sizes of rule sets and the computation time.

  6. Handedness in shearing auxetics creates rigid and compliant structures

    NASA Astrophysics Data System (ADS)

    Lipton, Jeffrey Ian; MacCurdy, Robert; Manchester, Zachary; Chin, Lillian; Cellucci, Daniel; Rus, Daniela

    2018-05-01

    In nature, repeated base units produce handed structures that selectively bond to make rigid or compliant materials. Auxetic tilings are scale-independent frameworks made from repeated unit cells that expand under tension. We discovered how to produce handedness in auxetic unit cells that shear as they expand by changing the symmetries and alignments of auxetic tilings. Using the symmetry and alignment rules that we developed, we made handed shearing auxetics that tile planes, cylinders, and spheres. By compositing the handed shearing auxetics in a manner inspired by keratin and collagen, we produce both compliant structures that expand while twisting and deployable structures that can rigidly lock. This work opens up new possibilities in designing chemical frameworks, medical devices like stents, robotic systems, and deployable engineering structures.

  7. A clinical decision support system for diagnosis of Allergic Rhinitis based on intradermal skin tests.

    PubMed

    Jabez Christopher, J; Khanna Nehemiah, H; Kannan, A

    2015-10-01

    Allergic Rhinitis is a universal common disease, especially in populated cities and urban areas. Diagnosis and treatment of Allergic Rhinitis will improve the quality of life of allergic patients. Though skin tests remain the gold standard test for diagnosis of allergic disorders, clinical experts are required for accurate interpretation of test outcomes. This work presents a clinical decision support system (CDSS) to assist junior clinicians in the diagnosis of Allergic Rhinitis. Intradermal skin tests were performed on patients who had plausible allergic symptoms. Based on the patient's history, 40 clinically relevant allergens were tested. 872 patients who had allergic symptoms were considered for this study. The rule-based classification approach and the clinical test results were used to develop and validate the CDSS. Clinical relevance of the CDSS was compared with the Score for Allergic Rhinitis (SFAR). Tests were conducted for junior clinicians to assess their diagnostic capability in the absence of an expert. The class-based association rule generation approach provides a concise set of rules that is further validated by clinical experts. The interpretations of the experts are considered as the gold standard. The CDSS diagnoses the presence or absence of rhinitis with an accuracy of 88.31%. The allergy specialist and the junior clinicians prefer the rule-based approach for its comprehensible knowledge model. The clinical decision support system with the rule-based classification approach assists junior doctors and clinicians in the diagnosis of Allergic Rhinitis, helping them make reliable decisions based on the reports of intradermal skin tests. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. A three solar cell system based on a self-supporting, transparent AlGaAs top solar cell

    NASA Technical Reports Server (NTRS)

    Negley, Gerald H.; Rhoads, Sandra L.; Terranova, Nancy E.; Mcneely, James B.; Barnett, Allen M.

    1989-01-01

    Development of a three solar cell stack can lead to practical efficiencies greater than 30 percent (1x,AM0). A theoretical efficiency limitation of 43.7 percent at AM0 and one sun is predicted by this model. Including expected losses, a practical system efficiency of 36.8 percent is anticipated. These calculations are based on a 1.93eV/1.43eV/0.89eV energy band gap combination. AlGaAs/GaAs/GaInAsP materials can be used with a six-terminal wiring configuration. The key issues for multijunction solar cells are the top and middle solar cell performance and the sub-bandgap transparency. AstroPower has developed a technique to fabricate AlGaAs solar cells on rugged, self-supporting, transparent AlGaAs substrates. Top solar cell efficiencies greater than 11 percent AM0 have been achieved. State-of-the-art GaAs or InP devices will be used for the middle solar cell. GaInAsP will be used to fabricate the bottom solar cell. This material is lattice-matched to InP and offers a wide range of bandgaps for optimization of the three solar cell stack. Liquid phase epitaxy is being used to grow the quaternary material. Initial solar cells have shown open-circuit voltages of 462 mV for a bandgap of 0.92eV. Design rules for the multijunction three solar cell stack are discussed. The progress in the development of the self-supporting AlGaAs top solar cell and the GaInAsP bottom solar cell is presented.

  9. Style-independent document labeling: design and performance evaluation

    NASA Astrophysics Data System (ADS)

    Mao, Song; Kim, Jong Woo; Thoma, George R.

    2003-12-01

    The Medical Article Records System or MARS has been developed at the U.S. National Library of Medicine (NLM) for automated data entry of bibliographical information from medical journals into MEDLINE, the premier bibliographic citation database at NLM. Currently, a rule-based algorithm (called ZoneCzar) is used for labeling important bibliographical fields (title, author, affiliation, and abstract) on medical journal article page images. While rules have been created for medical journals with regular layout types, new rules have to be manually created for any input journals with arbitrary or new layout types. Therefore, it is of interest to label any journal articles independent of their layout styles. In this paper, we first describe a system (called ZoneMatch) for automated generation of crucial geometric and non-geometric features of important bibliographical fields based on string-matching and clustering techniques. The rule-based algorithm is then modified to use these features to perform style-independent labeling. We then describe a performance evaluation method for quantitatively evaluating our algorithm and characterizing its error distributions. Experimental results show that the labeling performance of the rule-based algorithm is significantly improved when the generated features are used.

  10. A simple and accurate rule-based modeling framework for simulation of autocrine/paracrine stimulation of glioblastoma cell motility and proliferation by L1CAM in 2-D culture.

    PubMed

    Caccavale, Justin; Fiumara, David; Stapf, Michael; Sweitzer, Liedeke; Anderson, Hannah J; Gorky, Jonathan; Dhurjati, Prasad; Galileo, Deni S

    2017-12-11

    Glioblastoma multiforme (GBM) is a devastating brain cancer for which there is no known cure. Its malignancy is due to rapid cell division along with high motility and invasiveness of cells into the brain tissue. Simple 2-dimensional laboratory assays (e.g., a scratch assay) commonly are used to measure the effects of various experimental perturbations, such as treatment with chemical inhibitors. Several mathematical models have been developed to aid the understanding of the motile behavior and proliferation of GBM cells. However, many are mathematically complicated, look at multiple interdependent phenomena, and/or use modeling software not freely available to the research community. These attributes make the adoption of models and simulations of even simple 2-dimensional cell behavior an uncommon practice by cancer cell biologists. Herein, we developed an accurate, yet simple, rule-based modeling framework to describe the in vitro behavior of GBM cells that are stimulated by the L1CAM protein using freely available NetLogo software. In our model L1CAM is released by cells to act through two cell surface receptors and a point of signaling convergence to increase cell motility and proliferation. A simple graphical interface is provided so that changes can be made easily to several parameters controlling cell behavior, and behavior of the cells is viewed both pictorially and with dedicated graphs. We fully describe the hierarchical rule-based modeling framework, show simulation results under several settings, describe the accuracy compared to experimental data, and discuss the potential usefulness for predicting future experimental outcomes and for use as a teaching tool for cell biology students. It is concluded that this simple modeling framework and its simulations accurately reflect much of the GBM cell motility behavior observed experimentally in vitro in the laboratory. Our framework can be modified easily to suit the needs of investigators interested in other similar intrinsic or extrinsic stimuli that influence cancer or other cell behavior. This modeling framework of a commonly used experimental motility assay (scratch assay) should be useful to both researchers of cell motility and students in a cell biology teaching laboratory.
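
    The authors' model is implemented in NetLogo and is not reproduced here; the Python sketch below is a heavily simplified re-sketch of the kind of rule-based agent update the abstract describes, in which an L1CAM-like autocrine signal raises both motility and proliferation. All parameter names and values (step sizes, probabilities, signal effect) are illustrative assumptions.

    ```python
    # Heavily simplified sketch of a rule-based cell-agent update loop, inspired by the
    # description above (autocrine signal increases motility and proliferation).
    # Not the authors' NetLogo model; every parameter here is an assumption.
    import math, random

    def step(cells, signal_level, base_speed=1.0, base_div_prob=0.01):
        """One update of all cell agents under simple signal-dependent rules."""
        speed = base_speed * (1.0 + signal_level)        # rule: signal raises motility
        div_prob = base_div_prob * (1.0 + signal_level)  # rule: signal raises proliferation
        updated = []
        for x, y in cells:
            angle = random.uniform(0.0, 2.0 * math.pi)   # random-walk direction
            x, y = x + speed * math.cos(angle), y + speed * math.sin(angle)
            updated.append((x, y))
            if random.random() < div_prob:               # division places a daughter nearby
                updated.append((x + 0.5, y + 0.5))
        return updated

    cells = [(0.0, 0.0)] * 50
    for _ in range(100):
        cells = step(cells, signal_level=0.8)
    print(len(cells))   # population grows faster at higher signal levels
    ```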

  11. The load shedding advisor: An example of a crisis-response expert system

    NASA Technical Reports Server (NTRS)

    Bollinger, Terry B.; Lightner, Eric; Laverty, John; Ambrose, Edward

    1987-01-01

    A Prolog-based prototype expert system is described that was implemented by the Network Operations Branch of the NASA Goddard Space Flight Center. The purpose of the prototype was to test whether a small, inexpensive computer system could be used to host a load shedding advisor, a system which would monitor major physical environment parameters in a computer facility and then recommend appropriate operator responses whenever a serious condition was detected. The performance of the resulting prototype was due in significant part to efficiency gains achieved by replacing a purely rule-based design methodology with a hybrid approach that combined procedural, entity-relationship, and rule-based methods.

  12. Rules and Self-Organizing Properties of Post-embryonic Plant Organ Cell Division Patterns.

    PubMed

    von Wangenheim, Daniel; Fangerau, Jens; Schmitz, Alexander; Smith, Richard S; Leitte, Heike; Stelzer, Ernst H K; Maizel, Alexis

    2016-02-22

    Plants form new organs with patterned tissue organization throughout their lifespan. It is unknown whether this robust post-embryonic organ formation results from stereotypic dynamic processes, in which the arrangement of cells follows rigid rules. Here, we combine modeling with empirical observations of whole-organ development to identify the principles governing lateral root formation in Arabidopsis. Lateral roots derive from a small pool of founder cells in which some take a dominant role as seen by lineage tracing. The first division of the founders is asymmetric, tightly regulated, and determines the formation of a layered structure. Whereas the pattern of subsequent cell divisions is not stereotypic between different samples, it is characterized by a regular switch in division plane orientation. This switch is also necessary for the appearance of patterned layers as a result of the apical growth of the primordium. Our data suggest that lateral root morphogenesis is based on a limited set of rules. They determine cell growth and division orientation. The organ-level coupling of the cell behavior ensures the emergence of the lateral root's characteristic features. We propose that self-organizing, non-deterministic modes of development account for the robustness of plant organ morphogenesis. Copyright © 2016 Elsevier Ltd. All rights reserved.

  13. An on-line expert system for diagnosing environmentally induced spacecraft anomalies using CLIPS

    NASA Technical Reports Server (NTRS)

    Lauriente, Michael; Rolincik, Mark; Koons, Harry C; Gorney, David

    1993-01-01

    A new rule-based, expert system for diagnosing spacecraft anomalies is under development. The knowledge base consists of over two hundred rules and provides links to historical and environmental databases. Environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to respond with varying degrees of confidence in an answer, or with 'unknown', to any question. The expert system not only provides scientists with needed risk analysis and confidence estimates not available in standard numerical models or databases, but it is also an effective learning tool. In addition, the architecture of the expert system allows easy additions to the knowledge base and the database. For example, new frames concerning orbital debris and ionospheric scintillation are being considered. The system currently runs on a MicroVAX and uses the C Language Integrated Production System (CLIPS).

  14. A fuzzy hill-climbing algorithm for the development of a compact associative classifier

    NASA Astrophysics Data System (ADS)

    Mitra, Soumyaroop; Lam, Sarah S.

    2012-02-01

    Classification, a data mining technique, has widespread applications including medical diagnosis, targeted marketing, and others. Knowledge discovery from databases in the form of association rules is one of the important data mining tasks. An integrated approach, classification based on association rules, has drawn the attention of the data mining community over the last decade. While attention has been mainly focused on increasing classifier accuracies, not much effort has been devoted to building interpretable and less complex models. This paper discusses the development of a compact associative classification model using a hill-climbing approach and fuzzy sets. The proposed methodology builds the rule-base by selecting rules which contribute towards increasing training accuracy, thus balancing classification accuracy with the number of classification association rules. The results indicated that the proposed associative classification model can achieve competitive accuracies on benchmark datasets with continuous attributes and lends better interpretability when compared with other rule-based systems.
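
    As a conceptual illustration of the hill-climbing idea described above, the sketch below greedily adds the candidate classification association rule that most improves training accuracy and stops when no rule helps. The paper's fuzzy-membership handling is omitted, and the rule and record formats are assumptions made purely for illustration.

    ```python
    # Simplified greedy hill-climbing over candidate class association rules.
    # Fuzzy sets are omitted; rules are (condition_set, class) pairs and a record
    # matches a rule when the condition set is a subset of its attribute set.

    def accuracy(rules, records, default_class):
        correct = 0
        for attrs, label in records:
            pred = next((cls for cond, cls in rules if cond <= attrs), default_class)
            correct += (pred == label)
        return correct / len(records)

    def hill_climb(candidates, records, default_class):
        selected, best = [], accuracy([], records, default_class)
        improved = True
        while improved:
            improved = False
            for rule in candidates:
                if rule in selected:
                    continue
                score = accuracy(selected + [rule], records, default_class)
                if score > best:            # keep the rule only if accuracy increases
                    best, selected, improved = score, selected + [rule], True
        return selected, best

    # Tiny illustrative data: conditions are attribute-value sets such as {"age=young"}.
    records = [({"age=young", "income=low"}, "no"), ({"age=old", "income=high"}, "yes")]
    candidates = [({"income=high"}, "yes"), ({"age=young"}, "no")]
    print(hill_climb(candidates, records, default_class="no"))
    ```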

  15. Prediction of Land use changes using CA in GIS Environment

    NASA Astrophysics Data System (ADS)

    Kiavarz Moghaddam, H.; Samadzadegan, F.

    2009-04-01

    Urban growth is a typical self-organized system that results from the interaction between three defined systems: the developed urban system, the natural non-urban system, and the planned urban system. Urban growth simulation for an artificial city is carried out first. It evaluates a number of urban sprawl parameters, including the size and shape of the neighborhood, besides testing different types of constraints on urban growth simulation. The results indicate that a circular-type neighborhood shows smoother but faster urban growth compared to a nine-cell Moore neighborhood. Cellular automata prove to be very efficient in simulating urban growth over time. The strength of this technology comes from the ability of the urban modeler to implement the growth simulation model, evaluate the results, and present the output simulation results in a visually interpretable environment. The artificial city simulation model provides an excellent environment to test a number of simulation parameters, such as the influence of the neighborhood on growth results and the role of constraints in driving urban growth. Also, the definition of CA rules is a critical stage in simulating the urban growth pattern in a manner close to reality. CA urban growth simulation and prediction for Tehran over the last four decades succeeds in simulating the specified test growth years at a high level of accuracy. Some real data layers, such as 1995, have been used in the CA simulation training phase, while others, such as 2002, were used for testing the prediction results. Tuning the CA growth rules by comparing the simulated images with the real data to obtain feedback is important. An important note is that CA rules also need to be modified over time to adapt to the urban growth pattern. The evaluation method used on a region basis has the advantage of covering the spatial distribution component of the urban growth process. The next step includes running the developed CA simulation over classified raster data for three years in a developed ArcGIS extension. A set of crisp rules is defined and calibrated based on the real urban growth pattern. Uncertainty analysis is performed to evaluate the accuracy of the simulated results as compared to the historical real data. The evaluation shows promising results, represented by the high average accuracies achieved. The average accuracy for the predicted growth images of 1964 and 2002 is over 80%. Modifying the CA growth rules over time to match changes in the growth pattern is important for obtaining an accurate simulation. This modification is based on the urban growth relationship for Tehran over time, as can be seen in the historical raster data. The feedback obtained from comparing the simulated and real data is crucial in identifying the optimal set of CA rules for reliable simulation and calibrating growth steps.
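
    The study's calibrated rule set is not given in the abstract, so the sketch below only shows the general form of a constrained cellular-automaton growth rule of the kind described: a non-urban cell in a Moore neighborhood becomes urban when enough neighbors are already urban and no constraint layer (e.g., water or protected land) blocks it. The threshold and grid values are assumptions chosen for illustration.

    ```python
    # Minimal constrained CA growth step with a Moore neighborhood and a crisp rule.
    # The neighbor threshold and example grids are illustrative, not the study's rules.

    def grow(grid, constraints, threshold=2):
        """grid: 2-D list of 0/1 (non-urban/urban); constraints: 2-D list, 1 = growth blocked."""
        rows, cols = len(grid), len(grid[0])
        nxt = [row[:] for row in grid]
        for r in range(rows):
            for c in range(cols):
                if grid[r][c] == 1 or constraints[r][c] == 1:
                    continue
                neighbors = sum(grid[rr][cc]
                                for rr in range(max(0, r - 1), min(rows, r + 2))
                                for cc in range(max(0, c - 1), min(cols, c + 2))
                                if (rr, cc) != (r, c))
                if neighbors >= threshold:      # crisp rule: enough urban neighbors -> urbanize
                    nxt[r][c] = 1
        return nxt

    grid = [[0, 0, 0, 0], [0, 1, 1, 0], [0, 1, 1, 0], [0, 0, 0, 0]]
    constraints = [[0] * 4 for _ in range(4)]   # 1 would mark, e.g., water bodies
    print(grow(grid, constraints))
    ```

    Calibration in the study amounts to adjusting such thresholds and constraint layers over time until the simulated rasters match the historical ones.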

  16. Learning invariance from natural images inspired by observations in the primary visual cortex.

    PubMed

    Teichmann, Michael; Wiltschut, Jan; Hamker, Fred

    2012-05-01

    The human visual system has the remarkable ability to largely recognize objects invariant of their position, rotation, and scale. A good interpretation of neurobiological findings involves a computational model that simulates signal processing of the visual cortex. In part, this is likely achieved step by step from early to late areas of visual perception. While several algorithms have been proposed for learning feature detectors, only a few studies cover the issue of biologically plausible learning of such invariance. In this study, a set of Hebbian learning rules based on calcium dynamics and homeostatic regulations of single neurons is proposed. Their performance is verified within a simple model of the primary visual cortex to learn so-called complex cells, based on a sequence of static images. As a result, the learned complex-cell responses are largely invariant to phase and position.
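
    The paper's calcium-dynamics-based rules are more elaborate than can be shown here; the sketch below is only a generic Hebbian update with a simple homeostatic normalization step, included to make the interplay of the two mechanisms concrete. The learning rate, target norm, and random inputs are illustrative assumptions.

    ```python
    # Conceptual sketch: generic Hebbian strengthening of co-active inputs combined
    # with a homeostatic regulation that keeps the weight norm fixed. Not the
    # paper's calcium-based rules; all parameters are assumptions.
    import numpy as np

    def hebbian_step(w, pre, post, lr=0.01, target_norm=1.0):
        """w: weight vector; pre: presynaptic activity vector; post: postsynaptic scalar."""
        w = w + lr * post * pre                     # Hebbian term
        w = w * (target_norm / np.linalg.norm(w))   # homeostatic regulation of total strength
        return w

    rng = np.random.default_rng(0)
    w = rng.normal(size=16)
    for _ in range(1000):
        pre = rng.random(16)                        # stand-in for natural-image patch features
        post = float(w @ pre)                       # linear neuron response
        w = hebbian_step(w, pre, post)
    print(np.linalg.norm(w))                        # stays at the homeostatic target (~1.0)
    ```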

  17. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1988-01-01

    This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  18. ASICs Approach for the Implementation of a Symmetric Triangular Fuzzy Coprocessor and Its Application to Adaptive Filtering

    NASA Technical Reports Server (NTRS)

    Starks, Scott; Abdel-Hafeez, Saleh; Usevitch, Bryan

    1997-01-01

    This paper discusses the implementation of a fuzzy logic system using an ASICs design approach. The approach is based upon combining the inherent advantages of symmetric triangular membership functions and fuzzy singleton sets to obtain a novel structure for fuzzy logic system application development. The resulting structure utilizes a fuzzy static RAM to store the rule-base and the end-points of the triangular membership functions. This provides advantages over other approaches in which all sampled values of membership functions for all universes must be stored. The fuzzy coprocessor structure implements the fuzzification and defuzzification processes through a two-stage parallel pipeline architecture which is capable of executing complex fuzzy computations in less than 0.55 µs with an accuracy of more than 95%, thus making it suitable for a wide range of applications. Using the approach presented in this paper, a fuzzy logic rule-base can be directly downloaded via a host processor to an on-chip rule-base memory with a size of 64 words. The fuzzy coprocessor's design supports up to 49 rules for seven fuzzy membership functions associated with each of the chip's two input variables. This feature allows designers to create fuzzy logic systems without the need for additional on-board memory. Finally, the paper reports on simulation studies that were conducted for several adaptive filter applications using the least mean squares adaptive algorithm for adjusting the knowledge rule-base.
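
    The storage advantage the paper exploits can be illustrated in software: a symmetric triangular membership function is fully described by its center and half-width (its end-points), so only those parameters need to be stored rather than sampled membership values, and singleton consequents allow a simple weighted-average defuzzification. The rule values below are placeholders, not the chip's actual 64-word rule base.

    ```python
    # Sketch of inference with symmetric triangular membership functions stored as
    # (center, half_width) pairs and singleton outputs. Values are illustrative only.

    def tri(x, center, half_width):
        """Membership of x in a symmetric triangle defined by two stored parameters."""
        return max(0.0, 1.0 - abs(x - center) / half_width)

    def fuzzy_inference(x, rules):
        """rules: list of (center, half_width, singleton_output)."""
        num = sum(tri(x, c, hw) * out for c, hw, out in rules)
        den = sum(tri(x, c, hw) for c, hw, out in rules)
        return num / den if den else 0.0

    rules = [(-1.0, 1.0, -0.5), (0.0, 1.0, 0.0), (1.0, 1.0, 0.5)]  # e.g., coefficient updates
    print(fuzzy_inference(0.25, rules))
    ```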

  19. Three CLIPS-based expert systems for solving engineering problems

    NASA Technical Reports Server (NTRS)

    Parkinson, W. J.; Luger, G. F.; Bretz, R. E.

    1990-01-01

    We have written three expert systems using the CLIPS PC-based expert system shell. These three expert systems are rule-based and are relatively small, with the largest containing slightly less than 200 rules. The first expert system is an expert assistant that was written to help users of the ASPEN computer code choose the proper thermodynamic package to use with their particular vapor-liquid equilibrium problem. The second expert system was designed to help petroleum engineers choose the proper enhanced oil recovery method to be used with a given reservoir. The effectiveness of each technique is highly dependent upon the reservoir conditions. The third expert system is a combination consultant and control system. This system was designed specifically for silicon carbide whisker growth. Silicon carbide whiskers are an extremely strong product used to make ceramic and metal composites. The manufacture of whiskers is a very complicated process, which to date has defied a good mathematical model. The process was run by experts who had gained their expertise by trial and error. A system of rules was devised by these experts both for procedure setup and for process control. In this paper we discuss the design, development, and evaluation of the CLIPS-based programs for these three problem areas.

  20. On Inference Rules of Logic-Based Information Retrieval Systems.

    ERIC Educational Resources Information Center

    Chen, Patrick Shicheng

    1994-01-01

    Discussion of relevance and the needs of the users in information retrieval focuses on a deductive object-oriented approach and suggests eight inference rules for the deduction. Highlights include characteristics of a deductive object-oriented system, database and data modeling language, implementation, and user interface. (Contains 24…

  1. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2001-01-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic contents, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand-sides of rules contain high level and low level features, while the right-hand-sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball, and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.
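
    For readers unfamiliar with MYCIN's inexact reasoning, the sketch below shows the standard published certainty-factor combination rule that the abstract refers to. How the video classifier maps individual features to certainty factors is not given in the abstract, so the example values are assumptions.

    ```python
    # MYCIN-style certainty-factor (CF) combination. The formula is the standard
    # published rule; the example CF values are illustrative assumptions.

    def combine_cf(cf1, cf2):
        if cf1 >= 0 and cf2 >= 0:
            return cf1 + cf2 * (1 - cf1)
        if cf1 < 0 and cf2 < 0:
            return cf1 + cf2 * (1 + cf1)
        return (cf1 + cf2) / (1 - min(abs(cf1), abs(cf2)))

    # Two rules support the class "news"; a third rule weakly opposes it.
    cf = combine_cf(0.6, 0.4)    # 0.76
    cf = combine_cf(cf, -0.2)    # evidence against lowers the combined certainty
    print(round(cf, 3))          # 0.7
    ```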

  2. Knowledge-based approach to video content classification

    NASA Astrophysics Data System (ADS)

    Chen, Yu; Wong, Edward K.

    2000-12-01

    A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic contents, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand-sides of rules contain high level and low level features, while the right-hand-sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather reporting, commercial, basketball, and football. We use MYCIN's inexact reasoning method for combining evidence and for handling the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, which demonstrated the validity of the proposed approach.

  3. 76 FR 18589 - Self-Regulatory Organizations; The NASDAQ Stock Market LLC; Notice of Filing of Proposed Rule...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-04

    ...) of that Rule. This tie-breaker resolves price disputes based on minimizing order imbalances. In other... Opening Cross, the system will choose that price which minimizes the order imbalance remaining if the... opening cross. NASDAQ initially adopted the imbalance-based tie-breaker based upon its successful use in...

  4. Integration of a knowledge-based system and a clinical documentation system via a data dictionary.

    PubMed

    Eich, H P; Ohmann, C; Keim, E; Lang, K

    1997-01-01

    This paper describes the design and realisation of a knowledge-based system and a clinical documentation system linked via a data dictionary. The software was developed as a shell with object-oriented methods and C++ for IBM-compatible PCs and Windows 3.1/95. The data dictionary covers terminology and document objects with relations to external classifications. It controls the terminology in the documentation program with form-based entry of clinical documents and in the knowledge-based system with scores and rules. The software was applied to the clinical field of acute abdominal pain by implementing a data dictionary with 580 terminology objects, 501 document objects, and 2136 links; a documentation module with 8 clinical documents; and a knowledge-based system with 10 scores and 7 sets of rules.

  5. Migration of cells in a social context.

    PubMed

    Vedel, Søren; Tay, Savaş; Johnston, Darius M; Bruus, Henrik; Quake, Stephen R

    2013-01-02

    In multicellular organisms and complex ecosystems, cells migrate in a social context. Whereas this is essential for the basic processes of life, the influence of neighboring cells on the individual remains poorly understood. Previous work on isolated cells has observed a stereotypical migratory behavior characterized by short-time directional persistence with long-time random movement. We discovered a much richer dynamic in the social context, with significant variations in directionality, displacement, and speed, which are all modulated by local cell density. We developed a mathematical model based on the experimentally identified "cellular traffic rules" and basic physics that revealed that these emergent behaviors are caused by the interplay of single-cell properties and intercellular interactions, the latter being dominated by a pseudopod formation bias mediated by secreted chemicals and pseudopod collapse following collisions. The model demonstrates how aspects of complex biology can be explained by simple rules of physics and constitutes a rapid test bed for future studies of collective migration of individual cells.

  6. Damage detection based on acceleration data using artificial immune system

    NASA Astrophysics Data System (ADS)

    Chartier, Sandra; Mita, Akira

    2009-03-01

    Nowadays, Structural Health Monitoring (SHM) is essential in order to prevent the occurrence of damage in civil structures. This is a particularly important issue as the number of aged structures is increasing. Damage detection algorithms are often based on changes in modal properties such as natural frequencies, modal shapes, and modal damping. In this paper, damage detection is performed by using Artificial Immune System (AIS) theory directly on acceleration data. Inspired by the biological immune system, AIS is composed of several models, such as negative selection, which has great potential for this study. The negative selection process relies on the fact that T-cells, after their maturation, are sensitive to non-self cells and cannot detect self cells. Acceleration data were provided by using the numerical model of a 3-story frame structure. Damage was introduced, at particular times, by reducing a story's stiffness. Based on these acceleration data, undamaged data (equivalent to self data) and damaged data (equivalent to non-self data) can be obtained and represented in the Hamming shape-space with a binary representation. From the undamaged encoded data, detectors (equivalent to T-cells) are derived and are able to detect damaged encoded data very efficiently by using the r-contiguous bits matching rule. Indeed, more than 95% detection can be reached when efficient combinations of parameters are used. According to the number of detected data, the location of damage can even be determined by using the differences between the stories' relative accelerations. Thus, the difference that presents the highest detection rate, generally up to 89%, is directly linked to the location of damage.
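
    The r-contiguous bits matching rule used by the negative-selection detectors is easy to state in code: a detector matches a sample if the two bit strings agree on at least r contiguous positions. The sketch below shows only the matching and flagging steps; detector generation and the encoding of accelerations into bit strings are omitted, and r and the example strings are illustrative assumptions.

    ```python
    # r-contiguous-bits matching for negative-selection anomaly detection.
    # Detector generation and acceleration encoding are not shown; values are examples.

    def r_contiguous_match(detector, sample, r):
        run = best = 0
        for d, s in zip(detector, sample):
            run = run + 1 if d == s else 0
            best = max(best, run)
        return best >= r

    def is_anomalous(sample, detectors, r):
        """A sample is flagged as damaged (non-self) if any detector matches it."""
        return any(r_contiguous_match(d, sample, r) for d in detectors)

    detectors = ["101100", "010011"]   # assumed to have been generated so they miss self data
    print(is_anomalous("101101", detectors, r=4))   # True: 5 contiguous agreements with detector 1
    print(is_anomalous("010101", detectors, r=4))   # False: treated as undamaged (self)
    ```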

  7. Engineering monitoring expert system's developer

    NASA Technical Reports Server (NTRS)

    Lo, Ching F.

    1991-01-01

    This research project is designed to apply artificial intelligence technology, including expert systems, dynamic interfacing of neural networks, and hypertext, to construct an expert system developer. The developer environment is specifically suited to building expert systems which monitor the performance of ground support equipment for propulsion systems and testing facilities. The expert system developer, through the use of a graphics interface and a rule network, will be transparent to the user during rule construction and data scanning of the knowledge base. The project will result in a software system that allows its user to build specific monitoring-type expert systems that monitor various equipment used for propulsion systems or ground testing facilities and accrue system performance information in a dynamic knowledge base.

  8. Random Evolution of Idiotypic Networks: Dynamics and Architecture

    NASA Astrophysics Data System (ADS)

    Brede, Markus; Behn, Ulrich

    The paper deals with modelling a subsystem of the immune system, the so-called idiotypic network (INW). INWs, conceived by N.K. Jerne in 1974, are functional networks of interacting antibodies and B cells. In principle, Jerne's framework provides solutions to many issues in immunology, such as immunological memory, mechanisms for antigen recognition and self/non-self discrimination. Explaining the interconnection between the elementary components, local dynamics, network formation and architecture, and possible modes of global system function appears to be an ideal playground for statistical mechanics. We present a simple cellular automaton model, based on a graph representation of the system. From a simplified description of idiotypic interactions, rules for the random evolution of networks of occupied and empty sites on these graphs are derived. In certain biologically relevant parameter ranges the resultant dynamics leads to stationary states. A stationary state is found to correspond to a specific pattern of network organization. It turns out that even these very simple rules give rise to a multitude of different kinds of patterns. We characterize these networks by classifying `static' and `dynamic' network-patterns. A type of `dynamic' network is found to display many features of real INWs.

  9. Cell Libraries

    NASA Technical Reports Server (NTRS)

    1994-01-01

    A NASA contract led to the development of faster and more energy efficient semiconductor materials for digital integrated circuits. Gallium arsenide (GaAs) conducts electrons 4-6 times faster than silicon and uses less power at frequencies above 100-150 megahertz. However, the material is expensive, brittle, and fragile, and has lacked computer-automated engineering tools. To address this problem, Systems & Processes Engineering Corporation (SPEC) developed a series of GaAs cell libraries for cell layout, design rule checking, logic synthesis, placement and routing, simulation, and chip assembly. The system is marketed by Compare Design Automation.

  10. Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models

    PubMed Central

    2017-01-01

    We describe a fully data driven model that learns to perform a retrosynthetic reaction prediction task, which is treated as a sequence-to-sequence mapping problem. The end-to-end trained model has an encoder–decoder architecture that consists of two recurrent neural networks, which has previously shown great success in solving other sequence-to-sequence prediction tasks such as machine translation. The model is trained on 50,000 experimental reaction examples from the United States patent literature, which span 10 broad reaction types that are commonly used by medicinal chemists. We find that our model performs comparably with a rule-based expert system baseline model, and also overcomes certain limitations associated with rule-based expert systems and with any machine learning approach that contains a rule-based expert system component. Our model provides an important first step toward solving the challenging problem of computational retrosynthetic analysis. PMID:29104927

  11. Rule Based System for Medicine Inventory Control Using Radio Frequency Identification (RFID)

    NASA Astrophysics Data System (ADS)

    Nugraha, Joanna Ardhyanti Mita; Suryono; Suseno, dan Jatmiko Endro

    2018-02-01

    A rule-based system is very efficient for ensuring that the stock of drugs remains available, utilizing Radio Frequency Identification (RFID) as an automatic means of input. This method can ensure the availability of drug stock by analyzing the needs of drug users. The research data were the amounts of drug usage in a hospital over one year. The data were processed using ABC classification to determine the drugs with fast, medium, and slow movement. For each classification result, a rule-based algorithm was applied to determine the safety stock and the Reorder Point (ROP). This research yielded safety stock and ROP values that vary depending on the class of each drug. Validation was done by comparing the calculation of safety stock and reorder point both manually and by the system; it was found that the mean deviation value for safety stock was 0.03 and for ROP was 0.08.
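
    The paper's exact formulas are not given in the abstract, so the sketch below uses a common textbook form of the two quantities mentioned: safety stock as a service-level multiple of demand variability over the lead time, and ROP as mean lead-time demand plus safety stock. The service level, lead time, and usage numbers are assumptions for illustration.

    ```python
    # Illustrative safety-stock and reorder-point rule (textbook form, not the paper's):
    # safety_stock = z * sigma_daily_demand * sqrt(lead_time); ROP = mean demand
    # over the lead time + safety_stock.
    import math, statistics

    def reorder_point(daily_usage, lead_time_days, z=1.65):
        mean_d = statistics.mean(daily_usage)
        sigma_d = statistics.stdev(daily_usage)
        safety_stock = z * sigma_d * math.sqrt(lead_time_days)
        rop = mean_d * lead_time_days + safety_stock
        return safety_stock, rop

    usage = [12, 9, 15, 11, 10, 14, 13, 8, 12, 11]      # units dispensed per day (assumed)
    ss, rop = reorder_point(usage, lead_time_days=3)
    print(round(ss, 1), round(rop, 1))   # reorder when the RFID-counted stock falls below ROP
    ```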

  12. Impact of Operating Rules on Planning Capacity Expansion of Urban Water Supply Systems

    NASA Astrophysics Data System (ADS)

    de Neufville, R.; Galelli, S.; Tian, X.

    2017-12-01

    This study addresses the impact of operating rules on capacity planning of urban water supply systems. The continuous growth of metropolitan areas represents a major challenge for water utilities, which often rely on industrial water supply (e.g., desalination, reclaimed water) to complement natural resources (e.g., reservoirs). These additional sources increase the reliability of supply, equipping operators with additional means to hedge against droughts. How do their rules for using industrial water supply impact the performance of the water supply system? How might they affect long-term plans for capacity expansion? Possibly significantly, as demonstrated by the analysis of the operations and planning of a water supply system inspired by Singapore. Our analysis explores the system dynamics under multiple inflow and management scenarios to understand the extent to which alternative operating rules for the use of industrial water supply affect system performance. Results first show that these operating rules can have a significant impact on the variability in system performance (e.g., reliability, energy use), comparable to that of hydro-climatological conditions. Further analyses of several capacity expansion exercises, based on our original hydrological and management scenarios, show that operating rules significantly affect the timing and magnitude of critical decisions, such as the construction of new desalination plants. These results have two implications: capacity expansion analysis should consider the effect of a priori uncertainty about operating rules; and operators should consider how their flexibility in operating rules can affect their perceived need for capacity.

  13. A personalized health-monitoring system for elderly by combining rules and case-based reasoning.

    PubMed

    Ahmed, Mobyen Uddin

    2015-01-01

    A health-monitoring system for the elderly in the home environment is a promising solution for providing efficient medical services and is of increasing interest to researchers in this area. It is often more challenging when the system is self-served and functions as a personalized provision. This paper proposes a personalized, self-served health-monitoring system for the elderly in the home environment that combines general rules with a case-based reasoning approach. Here, the system generates feedback, recommendations, and alarms in a personalized manner based on the elderly person's medical information and health parameters such as blood pressure, blood glucose, weight, activity, pulse, etc. A set of general rules is used to classify individual health parameters. The case-based reasoning approach is used to combine all the different health parameters, which generates an overall classification of health condition. According to the evaluation results, considering 323 cases and k=2 (i.e., the top 2 most similar retrieved cases), the sensitivity, specificity, and overall accuracy achieved are 90%, 97%, and 96%, respectively. The preliminary result of the system is acceptable since the feedback, recommendation, and alarm messages are personalized and differ from the general messages. Thus, this approach could possibly be adapted for other situations in personalized elderly monitoring.
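
    As a conceptual illustration of the two-stage approach described above, the sketch below classifies each health parameter with a general rule and then retrieves the k most similar past cases by parameter-class agreement, reusing their overall labels by majority vote. The thresholds, similarity measure, labels, and example cases are all assumptions, not the paper's actual rules or case base.

    ```python
    # Simplified sketch: general rules per parameter, then case-based retrieval (top-k)
    # to produce an overall classification. All thresholds and cases are illustrative.

    def classify_parameters(p):
        return {
            "bp": "high" if p["systolic_bp"] > 140 else "normal",        # general rule
            "glucose": "high" if p["blood_glucose"] > 7.0 else "normal", # general rule
            "pulse": "high" if p["pulse"] > 100 else "normal",           # general rule
        }

    def retrieve_and_reuse(query_classes, case_base, k=2):
        def similarity(case_classes):
            return sum(query_classes[key] == case_classes[key] for key in query_classes)
        ranked = sorted(case_base, key=lambda c: similarity(c[0]), reverse=True)[:k]
        labels = [label for _, label in ranked]
        return max(set(labels), key=labels.count)    # majority vote over the top-k cases

    case_base = [
        ({"bp": "high", "glucose": "high", "pulse": "normal"}, "needs attention"),
        ({"bp": "normal", "glucose": "normal", "pulse": "normal"}, "healthy"),
        ({"bp": "high", "glucose": "normal", "pulse": "high"}, "needs attention"),
    ]
    query = classify_parameters({"systolic_bp": 150, "blood_glucose": 6.2, "pulse": 105})
    print(retrieve_and_reuse(query, case_base))
    ```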

  14. The study on dynamic cadastral coding rules based on kinship relationship

    NASA Astrophysics Data System (ADS)

    Xu, Huan; Liu, Nan; Liu, Renyi; Lu, Jingfeng

    2007-06-01

    Cadastral coding rules are an important supplement to the existing national and local standard specifications for building a cadastral database. After analyzing the course of cadastral change, especially parcel change, with the method of object-oriented analysis, a set of dynamic cadastral coding rules based on kinship relationships corresponding to the cadastral change is put forward, and a coding format composed of street code, block code, father parcel code, child parcel code, and grandchild parcel code is worked out within the county administrative area. The coding rules have been applied to the development of an urban cadastral information system called "ReGIS", which is not only able to figure out the cadastral code automatically according to both the type of parcel change and the coding rules, but is also capable of checking whether the code is spatiotemporally unique before the parcel is stored in the database. The system has been used in several cities of Zhejiang Province and has received a favorable response. This verifies the feasibility and effectiveness of the coding rules to some extent.

  15. Correcting groove error in gratings ruled on a 500-mm ruling engine using interferometric control.

    PubMed

    Mi, Xiaotao; Yu, Haili; Yu, Hongzhu; Zhang, Shanwen; Li, Xiaotian; Yao, Xuefeng; Qi, Xiangdong; Bayinhedhig; Wan, Qiuhua

    2017-07-20

    Groove error is one of the most important factors affecting grating quality and spectral performance. To reduce groove error, we propose a new ruling-tool carriage system based on aerostatic guideways. We design a new blank carriage system with double piezoelectric actuators. We also propose a completely closed-loop servo-control system with a new optical measurement system that can control the position of the diamond relative to the blank. To evaluate our proposed methods, we produced several gratings, including an echelle grating with 79 grooves/mm, a grating with 768 grooves/mm, and a high-density grating with 6000 grooves/mm. The results show that our methods effectively reduce groove error in ruled gratings.

  16. Optimal pattern distributions in Rete-based production systems

    NASA Technical Reports Server (NTRS)

    Scott, Stephen L.

    1994-01-01

    Since its introduction into the AI community in the early 1980's, the Rete algorithm has been widely used. This algorithm has formed the basis for many AI tools, including NASA's CLIPS. One drawback of Rete-based implementations, however, is that the network structures used internally by the Rete algorithm make it sensitive to the arrangement of individual patterns within rules. Thus while rules may be more or less arbitrarily placed within source files, the distribution of individual patterns within these rules can significantly affect the overall system performance. Some heuristics have been proposed to optimize pattern placement; however, these suggestions can be conflicting. This paper describes a systematic effort to measure the effect of pattern distribution on production system performance. An overview of the Rete algorithm is presented to provide context. A description of the methods used to explore the pattern ordering problem area is presented, using internal production system metrics such as the number of partial matches, and coarse-grained operating system data such as memory usage and time. The results of this study should be of interest to those developing and optimizing software for Rete-based production systems.

  17. Classifying the Indication for Colonoscopy Procedures: A Comparison of NLP Approaches in a Diverse National Healthcare System.

    PubMed

    Patterson, Olga V; Forbush, Tyler B; Saini, Sameer D; Moser, Stephanie E; DuVall, Scott L

    2015-01-01

    In order to measure the level of utilization of colonoscopy procedures, identifying the primary indication for the procedure is required. Colonoscopies may be utilized not only for screening, but also for diagnostic or therapeutic purposes. To determine whether a colonoscopy was performed for screening, we created a natural language processing system to identify colonoscopy reports in the electronic medical record system and extract indications for the procedure. A rule-based model and three machine-learning models were created using 2,000 manually annotated clinical notes of patients cared for in the Department of Veterans Affairs. Performance of the models was measured and compared. Analysis of the models on a test set of 1,000 documents indicates that the rule-based system's performance stays fairly constant when evaluated on the training and testing sets. However, the machine-learning model without feature selection showed a significant decrease in performance. Therefore, the rule-based classification system appears to be more robust than a machine-learning system in cases where no feature selection is performed.

  18. Comparison of Natural Language Processing Rules-based and Machine-learning Systems to Identify Lumbar Spine Imaging Findings Related to Low Back Pain.

    PubMed

    Tan, W Katherine; Hassanpour, Saeed; Heagerty, Patrick J; Rundell, Sean D; Suri, Pradeep; Huhdanpaa, Hannu T; James, Kathryn; Carrell, David S; Langlotz, Curtis P; Organ, Nancy L; Meier, Eric N; Sherman, Karen J; Kallmes, David F; Luetmer, Patrick H; Griffith, Brent; Nerenz, David R; Jarvik, Jeffrey G

    2018-03-28

    To evaluate a natural language processing (NLP) system built with open-source tools for identification of lumbar spine imaging findings related to low back pain on magnetic resonance and x-ray radiology reports from four health systems. We used a limited data set (de-identified except for dates) sampled from lumbar spine imaging reports of a prospectively assembled cohort of adults. From N = 178,333 reports, we randomly selected N = 871 to form a reference-standard dataset, consisting of N = 413 x-ray reports and N = 458 MR reports. Using standardized criteria, four spine experts annotated the presence of 26 findings, where 71 reports were annotated by all four experts and 800 were each annotated by two experts. We calculated inter-rater agreement and finding prevalence from annotated data. We randomly split the annotated data into development (80%) and testing (20%) sets. We developed an NLP system from both rule-based and machine-learned models. We validated the system using accuracy metrics such as sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). The multirater annotated dataset achieved inter-rater agreement of Cohen's kappa > 0.60 (substantial agreement) for 25 of 26 findings, with finding prevalence ranging from 3% to 89%. In the testing sample, rule-based and machine-learned predictions both had comparable average specificity (0.97 and 0.95, respectively). The machine-learned approach had a higher average sensitivity (0.94, compared to 0.83 for rules-based), and a higher overall AUC (0.98, compared to 0.90 for rules-based). Our NLP system performed well in identifying the 26 lumbar spine findings, as benchmarked by reference-standard annotation by medical experts. Machine-learned models provided substantial gains in model sensitivity with slight loss of specificity, and overall higher AUC. Copyright © 2018 The Association of University Radiologists. All rights reserved.

  19. CNN-based ranking for biomedical entity normalization.

    PubMed

    Li, Haodi; Chen, Qingcai; Tang, Buzhou; Wang, Xiaolong; Xu, Hua; Wang, Baohua; Huang, Dong

    2017-10-03

    Most state-of-the-art biomedical entity normalization systems, such as rule-based systems, merely rely on morphological information of entity mentions, but rarely consider their semantic information. In this paper, we introduce a novel convolutional neural network (CNN) architecture that regards biomedical entity normalization as a ranking problem and benefits from semantic information of biomedical entities. The CNN-based ranking method first generates candidates using handcrafted rules, and then ranks the candidates according to their semantic information modeled by CNN as well as their morphological information. Experiments on two benchmark datasets for biomedical entity normalization show that our proposed CNN-based ranking method outperforms the traditional rule-based method, achieving state-of-the-art performance. We propose a CNN architecture that regards biomedical entity normalization as a ranking problem. Comparison results show that semantic information is beneficial to biomedical entity normalization and can be well combined with morphological information in our CNN architecture for further improvement.

  20. Building a common pipeline for rule-based document classification.

    PubMed

    Patterson, Olga V; Ginter, Thomas; DuVall, Scott L

    2013-01-01

    Instance-based classification of clinical text is a widely used natural language processing task employed as a step for patient classification, document retrieval, or information extraction. Rule-based approaches rely on concept identification and context analysis in order to determine the appropriate class. We propose a five-step process that enables even small research teams to develop simple but powerful rule-based NLP systems by taking advantage of a common UIMA AS based pipeline for classification. Our proposed methodology coupled with the general-purpose solution provides researchers with access to the data locked in clinical text in cases of limited human resources and compact timelines.

  1. How to translate therapeutic recommendations in clinical practice guidelines into rules for critiquing physician prescriptions? Methods and application to five guidelines

    PubMed Central

    2010-01-01

    Background Clinical practice guidelines give recommendations about what to do in various medical situations, including therapeutic recommendations for drug prescription. An effective way to computerize these recommendations is to design critiquing decision support systems, i.e. systems that criticize the physician's prescription when it does not conform to the guidelines. These systems are commonly based on a list of "if conditions then criticism" rules. However, writing these rules from the guidelines is not a trivial task. The objective of this article is to propose methods that (1) simplify the implementation of guidelines' therapeutic recommendations in critiquing systems by automatically translating structured therapeutic recommendations into a list of "if conditions then criticize" rules, and (2) can generate an appropriate textual label to explain to the physician why his/her prescription is not recommended. Methods We worked on the therapeutic recommendations in five clinical practice guidelines concerning chronic diseases related to the management of cardiovascular risk. We evaluated the system using a test base of more than 2000 cases. Results Algorithms for automatically translating therapeutic recommendations into "if conditions then criticize" rules are presented. Eight generic recommendations are also proposed; they are guideline-independent, and can be used as default behaviour for handling various situations that are usually implicit in the guidelines, such as decreasing the dose of a poorly tolerated drug. Finally, we provide models and methods for generating a human-readable textual critique. The system was successfully evaluated on the test base. Conclusion We show that it is possible to criticize physicians' prescriptions starting from a structured clinical guideline, and to provide clear explanations. We are now planning a randomized clinical trial to evaluate the impact of the system on practices. PMID:20509903
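
    The article's translation algorithms and guideline content are not reproduced in this abstract; the sketch below only illustrates the general shape of the idea: a structured recommendation (a clinical condition mapped to recommended drug classes) is turned into an "if conditions then criticize" rule that fires, with a human-readable message, when a prescription falls outside the recommended classes. The condition, drug classes, and message template are illustrative assumptions.

    ```python
    # Minimal sketch of turning one structured therapeutic recommendation into a
    # critiquing rule. The guideline content and message wording are assumptions.

    def recommendation_to_rules(recommendation):
        """recommendation: dict with 'condition' (callable on patient) and 'recommended_classes'."""
        def rule(patient, prescription):
            if recommendation["condition"](patient) and \
               prescription["drug_class"] not in recommendation["recommended_classes"]:
                return ("Prescription of {} is not recommended for this profile; "
                        "the guideline suggests: {}.").format(
                            prescription["drug_class"],
                            ", ".join(recommendation["recommended_classes"]))
            return None          # no criticism: prescription conforms to the recommendation
        return [rule]

    reco = {"condition": lambda p: p["hypertension"] and p["diabetes"],
            "recommended_classes": ["ACE inhibitor", "ARB"]}
    rules = recommendation_to_rules(reco)
    patient = {"hypertension": True, "diabetes": True}
    prescription = {"drug_class": "beta blocker"}
    for rule in rules:
        critique = rule(patient, prescription)
        if critique:
            print(critique)
    ```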

  2. Granular Flow Graph, Adaptive Rule Generation and Tracking.

    PubMed

    Pal, Sankar Kumar; Chakraborty, Debarati Bhunia

    2017-12-01

    A new method of adaptive rule generation in a granular computing framework is described, based on a rough rule base and granular flow graph, and applied to video tracking. In the process, several new concepts and operations are introduced, and methodologies are formulated with superior performance. The flow graph enables the definition of an intelligent technique for rule-base adaptation, where its characteristics in mapping the relevance of attributes and rules in a decision-making system are exploited. Two new features, namely, the expected flow graph and the mutual dependency between flow graphs, are defined to make the flow graph applicable to the tasks of both training and validation. All these techniques are performed at the neighborhood granular level. A way of forming spatio-temporal 3-D granules of arbitrary shape and size is introduced. The rough-flow-graph-based adaptive granular rule-based system, thus produced for unsupervised video tracking, is capable of handling the uncertainties and incompleteness in frames, is able to overcome the incompleteness in information that arises without initial manual interaction, and provides superior performance with gains in computation time. The cases of partial overlapping and detecting unpredictable changes are handled efficiently. It is shown that neighborhood granulation provides a balanced tradeoff between speed and accuracy as compared to pixel-level computation. The quantitative indices used for evaluating the performance of tracking do not require any information on ground truth, unlike the other methods. Superiority of the algorithm to nonadaptive and other recent ones is demonstrated extensively.

  3. Derivation and Validation of a Biomarker-Based Clinical Algorithm to Rule Out Sepsis From Noninfectious Systemic Inflammatory Response Syndrome at Emergency Department Admission: A Multicenter Prospective Study.

    PubMed

    Mearelli, Filippo; Fiotti, Nicola; Giansante, Carlo; Casarsa, Chiara; Orso, Daniele; De Helmersen, Marco; Altamura, Nicola; Ruscio, Maurizio; Castello, Luigi Mario; Colonetti, Efrem; Marino, Rossella; Barbati, Giulia; Bregnocchi, Andrea; Ronco, Claudio; Lupia, Enrico; Montrucchio, Giuseppe; Muiesan, Maria Lorenza; Di Somma, Salvatore; Avanzi, Gian Carlo; Biolo, Gianni

    2018-05-07

    To derive and validate a predictive algorithm integrating a nomogram-based prediction of the pretest probability of infection with a panel of serum biomarkers, which could robustly differentiate sepsis/septic shock from noninfectious systemic inflammatory response syndrome. Multicenter prospective study. At emergency department admission in five University hospitals. Nine hundred forty-seven adults in the inception cohort and 185 adults in the validation cohort. None. A nomogram, including age, Sequential Organ Failure Assessment score, recent antimicrobial therapy, hyperthermia, leukocytosis, and high C-reactive protein values, was built from data on 716 infected patients and 120 patients with noninfectious systemic inflammatory response syndrome in order to predict the pretest probability of infection. Then, the best combination of procalcitonin, soluble phospholipase A2 group IIA, presepsin, soluble interleukin-2 receptor α, and soluble triggering receptor expressed on myeloid cell-1 was applied in order to categorize patients as "likely" or "unlikely" to be infected. The predictive algorithm required only procalcitonin, backed up with soluble phospholipase A2 group IIA in 29% of the patients, to rule out sepsis/septic shock with a negative predictive value of 93%. In a validation cohort of 158 patients, the predictive algorithm reached a negative predictive value of 100% while requiring biomarker measurements in 18% of the population. We have developed and validated a high-performing, reproducible, and parsimonious algorithm to assist emergency department physicians in distinguishing sepsis/septic shock from noninfectious systemic inflammatory response syndrome.
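
    Purely as an illustration of the two-step logic described above (a pretest probability first, biomarkers only as a backup), the sketch below uses invented weights and cut-offs; it is not the published nomogram and its thresholds are not the study's decision rules.

    ```python
    # Hypothetical two-step rule-out workflow: a nomogram-like pretest score backed
    # up by biomarkers only when needed. All coefficients and cut-offs are placeholders.

    def pretest_probability(age, sofa, recent_antibiotics, hyperthermia, leukocytosis, high_crp):
        # stand-in for the published nomogram; weights are NOT the study's coefficients
        score = (0.02 * age + 0.3 * sofa + 1.0 * recent_antibiotics
                 + 0.8 * hyperthermia + 0.6 * leukocytosis + 0.7 * high_crp)
        return min(score / 10.0, 1.0)

    def rule_out_sepsis(patient, pct, spla2_iia=None, low_risk_cutoff=0.3, pct_cutoff=0.5):
        p = pretest_probability(**patient)
        if p < low_risk_cutoff and pct < pct_cutoff:
            # back up procalcitonin with sPLA2-IIA only in equivocal cases
            if spla2_iia is None or spla2_iia < 1.0:
                return "sepsis unlikely"
        return "sepsis not ruled out"

    patient = {"age": 54, "sofa": 1, "recent_antibiotics": 0,
               "hyperthermia": 0, "leukocytosis": 1, "high_crp": 0}
    print(rule_out_sepsis(patient, pct=0.2))
    ```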

  4. Performance of Case-Based Reasoning Retrieval Using Classification Based on Associations versus Jcolibri and FreeCBR: A Further Validation Study

    NASA Astrophysics Data System (ADS)

    Aljuboori, Ahmed S.; Coenen, Frans; Nsaif, Mohammed; Parsons, David J.

    2018-05-01

    Case-Based Reasoning (CBR) plays a major role in expert system research. However, a critical problem arises when a CBR system retrieves incorrect cases. Class Association Rules (CARs) have been utilized to offer a potential solution in a previous work. The aim of this paper was to perform further validation of Case-Based Reasoning using a Classification based on Association Rules (CBRAR) to enhance the performance of Similarity Based Retrieval (SBR). The CBRAR strategy uses a classed frequent pattern tree algorithm (FP-CAR) in order to disambiguate wrongly retrieved cases in CBR. The research reported in this paper makes contributions to both the CBR and Association Rule Mining (ARM) fields, in that full target cases can be extracted from the FP-CAR algorithm without invoking P-trees and union operations. On the dataset used in this paper, the approach provided more efficient results when the SBR retrieves unrelated answers. The accuracy of the proposed CBRAR system outperforms the results obtained by existing CBR tools such as Jcolibri and FreeCBR.

  5. Systematic methods for knowledge acquisition and expert system development

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystem. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base, and to assess the cooperation between the rule-bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, a satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM Expert was developed using the Analysis of Variance (ANOVA) and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM Expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM Expert adapts to new situations, and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design when experimental or simulation data is available.

  6. A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base

    NASA Technical Reports Server (NTRS)

    Kautzmann, Frank N., III

    1988-01-01

    Expert systems that support knowledge representation by qualitative modeling techniques experience problems when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule-model resolution, and multiple uses of knowledge representation are included. A series of prototypes is being developed to demonstrate the feasibility of automating the processes of systems engineering, design and configuration, and diagnosis and fault management. Such a study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it involves models of description and explanation for each level. This multiple-model feature requires the development of control methods between rule systems and heuristics at a meta-level for each expert system involved in an integrated and larger class of expert systems. The broadest possible category of interacting expert systems is described, along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.

  7. 75 FR 76393 - Notice of Request for a New Information Collection (Public Health Information System)

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-08

    ... intention to request a new information collection concerning its Web-based Public Health Information System... Building, Washington, DC 20250; (202) 720-0345. SUPPLEMENTARY INFORMATION: Title: Public Health Information... documents other than rules or proposed rules that are applicable to the public. Notices of hearings...

  8. The Virtual Factory Teaching System (VFTS): Project Review and Results.

    ERIC Educational Resources Information Center

    Kazlauskas, E. J.; Boyd, E. F., III; Dessouky, M. M.

    This paper presents a review of the Virtual Factory Teaching (VFTS) project, a Web-based, multimedia collaborative learning network. The system allows students, working alone or in teams, to build factories, forecast demand for products, plan production, establish release rules for new work into the factory, and set scheduling rules for…

  9. Genetic learning in rule-based and neural systems

    NASA Technical Reports Server (NTRS)

    Smith, Robert E.

    1993-01-01

    The design of neural networks and fuzzy systems can involve complex, nonlinear, and ill-conditioned optimization problems. Often, traditional optimization schemes are inadequate or inapplicable for such tasks. Genetic Algorithms (GAs) are a class of optimization procedures whose mechanics are based on those of natural genetics. Mathematical arguments show how GAs bring substantial computational leverage to search problems, without requiring the mathematical characteristics often necessary for traditional optimization schemes (e.g., modality, continuity, availability of derivative information, etc.). GAs have proven effective in a variety of search tasks that arise in neural networks and fuzzy systems. This presentation begins by introducing the mechanism and theoretical underpinnings of GAs. GAs are then related to a class of rule-based machine learning systems called learning classifier systems (LCSs). An LCS implements a low-level production system that uses a GA as its primary rule discovery mechanism. This presentation illustrates how, despite its rule-based framework, an LCS can be thought of as a competitive neural network. Neural network simulator code for an LCS is presented. In this context, the GA is doing more than optimizing an objective function. It is searching for an ecology of hidden nodes with limited connectivity. The GA attempts to evolve this ecology such that effective neural network performance results. The GA is particularly well adapted to this task, given its naturally inspired basis. The LCS/neural network analogy extends itself to other, more traditional neural networks. Conclusions to the presentation discuss the implications of using GAs in ecological search problems that arise in neural and fuzzy systems.
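
    To make the GA mechanics mentioned above concrete, here is a minimal, self-contained sketch of selection, crossover and mutation on bit strings. The fitness function is a toy stand-in and is not related to the LCS or neural-network applications discussed in the abstract.

    ```python
    # Minimal genetic-algorithm sketch: roulette-wheel selection, one-point
    # crossover and bitwise mutation on a toy "count the ones" objective.
    import random

    def fitness(bits):           # toy objective: number of ones in the string
        return sum(bits)

    def evolve(pop_size=20, length=16, generations=50, p_mut=0.02):
        pop = [[random.randint(0, 1) for _ in range(length)] for _ in range(pop_size)]
        for _ in range(generations):
            # fitness-proportionate (roulette-wheel) selection
            weights = [fitness(ind) + 1e-9 for ind in pop]
            parents = random.choices(pop, weights=weights, k=pop_size)
            nxt = []
            for a, b in zip(parents[::2], parents[1::2]):
                cut = random.randrange(1, length)          # one-point crossover
                nxt += [a[:cut] + b[cut:], b[:cut] + a[cut:]]
            # bitwise mutation
            pop = [[1 - g if random.random() < p_mut else g for g in ind] for ind in nxt]
        return max(pop, key=fitness)

    print(evolve())
    ```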

  10. Derivation of optimal joint operating rules for multi-purpose multi-reservoir water-supply system

    NASA Astrophysics Data System (ADS)

    Tan, Qiao-feng; Wang, Xu; Wang, Hao; Wang, Chao; Lei, Xiao-hui; Xiong, Yi-song; Zhang, Wei

    2017-08-01

    The derivation of joint operating policy is a challenging task for a multi-purpose multi-reservoir system. This study proposed an aggregation-decomposition model to guide the joint operation of multi-purpose multi-reservoir system, including: (1) an aggregated model based on the improved hedging rule to ensure the long-term water-supply operating benefit; (2) a decomposed model to allocate the limited release to individual reservoirs for the purpose of maximizing the total profit of the facing period; and (3) a double-layer simulation-based optimization model to obtain the optimal time-varying hedging rules using the non-dominated sorting genetic algorithm II, whose objectives were to minimize maximum water deficit and maximize water supply reliability. The water-supply system of Li River in Guangxi Province, China, was selected for the case study. The results show that the operating policy proposed in this study is better than conventional operating rules and aggregated standard operating policy for both water supply and hydropower generation due to the use of hedging mechanism and effective coordination among multiple objectives.
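
    As an illustration of the hedging mechanism referred to above, the sketch below implements a generic two-point hedging rule for water-supply releases; the thresholds and rationing factor are invented placeholders, not the study's optimized time-varying values.

    ```python
    # Generic two-point hedging rule sketch: releases are rationed when available
    # water lies between a starting and an ending hedging threshold.

    def hedging_release(available_water, demand, swa, ewa, rationing=0.6):
        """available_water = storage + forecast inflow; swa/ewa = start/end of hedging."""
        if available_water <= swa:
            return rationing * min(demand, available_water)   # deep rationing
        if available_water >= ewa:
            return min(demand, available_water)               # full delivery
        # linear transition between the two hedging points
        frac = (available_water - swa) / (ewa - swa)
        return (rationing + (1 - rationing) * frac) * demand

    print(hedging_release(available_water=80, demand=50, swa=60, ewa=120))
    ```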

  11. From data mining rules to medical logical modules and medical advices.

    PubMed

    Gomoi, Valentin; Vida, Mihaela; Robu, Raul; Stoicu-Tivadar, Vasile; Bernad, Elena; Lupşe, Oana

    2013-01-01

    Using data mining in collaboration with Clinical Decision Support Systems adds new knowledge as support for medical diagnosis. The current work presents a tool which translates data mining rules supporting generation of medical advices to Arden Syntax formalism. The developed system was tested with data related to 2326 births that took place in 2010 at the Bega Obstetrics - Gynaecology Hospital, Timişoara. Based on processing these data, 14 medical rules regarding the Apgar score were generated and then translated in Arden Syntax language.

  12. Integration of perception and reasoning in fast neural modules

    NASA Technical Reports Server (NTRS)

    Fritz, David G.

    1989-01-01

    Artificial neural systems promise to integrate symbolic and sub-symbolic processing to achieve real time control of physical systems. Two potential alternatives exist. In one, neural nets can be used to front-end expert systems. The expert systems, in turn, are developed with varying degrees of parallelism, including their implementation in neural nets. In the other, rule-based reasoning and sensor data can be integrated within a single hybrid neural system. The hybrid system reacts as a unit to provide decisions (problem solutions) based on the simultaneous evaluation of data and rules. Discussed here is a model hybrid system based on the fuzzy cognitive map (FCM). The operation of the model is illustrated with the control of a hypothetical satellite that intelligently alters its attitude in space in response to an intersecting micrometeorite shower.
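
    A fuzzy cognitive map can be simulated with a few lines of linear algebra: concept activations are repeatedly combined through a signed weight matrix and squashed. The sketch below is a generic FCM update; the three concepts and the weights are invented for illustration and are not the satellite model described above.

    ```python
    # Sketch of a fuzzy cognitive map (FCM) update step with sigmoid squashing.
    import numpy as np

    def fcm_step(activations, weights):
        # each concept keeps its previous activation (self-memory) plus weighted inputs
        return 1.0 / (1.0 + np.exp(-(activations + activations @ weights)))

    # 3 toy concepts: [shower detected, attitude deviation, thruster command]
    W = np.array([[0.0, 0.8, 0.0],
                  [0.0, 0.0, 0.9],
                  [0.0, -0.7, 0.0]])
    a = np.array([1.0, 0.0, 0.0])
    for _ in range(5):
        a = fcm_step(a, W)
    print(a)
    ```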

  13. Mechanical Forces Program the Orientation of Cell Division during Airway Tube Morphogenesis.

    PubMed

    Tang, Zan; Hu, Yucheng; Wang, Zheng; Jiang, Kewu; Zhan, Cheng; Marshall, Wallace F; Tang, Nan

    2018-02-05

    Oriented cell division plays a key role in controlling organogenesis. The mechanisms for regulating division orientation at the whole-organ level are only starting to become understood. By combining 3D time-lapse imaging, mouse genetics, and mathematical modeling, we find that global orientation of cell division is the result of a combination of two types of spindles with distinct spindle dynamic behaviors in the developing airway epithelium. Fixed spindles follow the classic long-axis rule and establish their division orientation before metaphase. In contrast, rotating spindles do not strictly follow the long-axis rule and determine their division orientation during metaphase. By using both a cell-based mechanical model and stretching-lung-explant experiments, we showed that mechanical force can function as a regulatory signal in maintaining the stable ratio between fixed spindles and rotating spindles. Our findings demonstrate that mechanical forces, cell geometry, and oriented cell division function together in a highly coordinated manner to ensure normal airway tube morphogenesis. Copyright © 2017 Elsevier Inc. All rights reserved.

  14. Reversible elementary cellular automaton with rule number 150 and periodic boundary conditions over 𝔽p

    NASA Astrophysics Data System (ADS)

    Martín Del Rey, A.; Rodríguez Sánchez, G.

    2015-03-01

    The reversibility of elementary cellular automata with rule number 150 over the finite state set 𝔽p and endowed with periodic boundary conditions is studied. The dynamics of such discrete dynamical systems are characterized by means of characteristic circulant matrices, and their analysis allows us to state that reversibility depends on the number of cells of the cellular space and to explicitly compute the corresponding inverse cellular automata.
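
    A small sketch of the objects involved: the rule-150 analogue over 𝔽p maps each cell to the sum of itself and its two neighbours mod p, and reversibility for a given ring size can be probed by testing whether the associated circulant matrix is invertible mod p. The determinant test below uses floating-point arithmetic and is only reliable for small lattices; it is an illustration, not the paper's characterization.

    ```python
    # Rule-150 step over F_p with periodic boundaries, plus a naive reversibility check.
    import numpy as np

    def rule150_step(cells, p):
        return [(cells[i - 1] + cells[i] + cells[(i + 1) % len(cells)]) % p
                for i in range(len(cells))]

    def is_reversible(n, p):
        # circulant matrix with first row (1, 1, 0, ..., 0, 1)
        M = np.zeros((n, n), dtype=int)
        for i in range(n):
            M[i, i] = M[i, (i - 1) % n] = M[i, (i + 1) % n] = 1
        det = int(round(np.linalg.det(M)))   # exact only for small n
        return det % p != 0

    print(rule150_step([1, 0, 2, 1], p=3), is_reversible(4, p=3))
    ```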

  15. A proposed computer diagnostic system for malignant melanoma (CDSMM).

    PubMed

    Shao, S; Grams, R R

    1994-04-01

    This paper describes a computer diagnostic system for malignant melanoma. The diagnostic system is a rule-based system built on image analysis that works in the PC Windows environment. It consists of seven modules: an I/O module, a Patient/Clinic database, an image processing module, a classification module, a rule-base module, and a system control module. In the system, the image analyses are carried out automatically, and database management is efficient and fast. Both final clinical results and intermediate results from the various modules, such as measured features, feature pictures, and history records of the disease lesion, can be presented on screen or printed out from each corresponding module or from the I/O module. The system can also work as a doctor's office-based tool to aid dermatologists with details not perceivable by the human eye. Since the system operates on a general-purpose PC, it can be made portable if the I/O module is disconnected.

  16. Improved Personalized Recommendation Based on Causal Association Rule and Collaborative Filtering

    ERIC Educational Resources Information Center

    Lei, Wu; Qing, Fang; Zhou, Jin

    2016-01-01

    User evaluations of resources on a recommender system are usually limited, which results in an extremely sparse user rating matrix and greatly reduces the accuracy of personalized recommendation, especially for new users or new items. This paper presents a recommendation method based on rating prediction using causal association rules.…

  17. 76 FR 10628 - Self-Regulatory Organizations; the Depository Trust Company; Notice of Filing and Immediate...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-25

    ... Change Regarding Providing Participants With a New Optional Settlement Web Interface February 22, 2011... Rule Change The proposed rule change will establish a new browser-based interface, the ``Settlement Web... Browser System (``PBS'').\\4\\ Based on request from its Participants, DTC has created a more user-friendly...

  18. Combining human and machine intelligence to derive agents' behavioral rules for groundwater irrigation

    NASA Astrophysics Data System (ADS)

    Hu, Yao; Quinn, Christopher J.; Cai, Ximing; Garfinkle, Noah W.

    2017-11-01

    For agent-based modeling, the major challenges in deriving agents' behavioral rules arise from agents' bounded rationality and data scarcity. This study proposes a "gray box" approach to address the challenge by incorporating expert domain knowledge (i.e., human intelligence) with machine learning techniques (i.e., machine intelligence). Specifically, we propose using directed information graphs (DIG), boosted regression trees (BRT), and domain knowledge to infer causal factors and identify behavioral rules from data. A case study is conducted to investigate farmers' pumping behavior in the Midwest, U.S.A. Results show that four factors identified by the DIG algorithm (corn price, underlying groundwater level, monthly mean temperature, and precipitation) have the main causal influences on agents' decisions about monthly groundwater irrigation depth. The agent-based model is then developed based on the behavioral rules represented by three DIGs and modeled by BRTs, and coupled with a physically-based groundwater model to investigate the impacts of agents' pumping behavior on the underlying groundwater system in the context of coupled human and environmental systems.
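
    The sketch below illustrates the general idea of using a boosted regression tree as an agent's pumping rule over the four factors named above. The synthetic data, ranges and response function are invented stand-ins for the study's observations, and default scikit-learn settings are used rather than the paper's configuration.

    ```python
    # Illustrative "gray box" agent rule: a boosted regression tree fitted on
    # synthetic data maps (corn price, groundwater level, temperature, precipitation)
    # to a monthly pumping depth.
    import numpy as np
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    X = rng.uniform([3.0, 5.0, 10.0, 0.0],      # corn price, groundwater level,
                    [8.0, 30.0, 35.0, 200.0],   # mean temperature, precipitation
                    size=(500, 4))
    # toy response: pumping rises with price and temperature, falls with rain
    y = 0.5 * X[:, 0] + 0.2 * X[:, 2] - 0.01 * X[:, 3] + rng.normal(0, 0.2, 500)

    pumping_rule = GradientBoostingRegressor().fit(X, y)

    def agent_decision(corn_price, gw_level, temperature, precipitation):
        return float(pumping_rule.predict([[corn_price, gw_level, temperature, precipitation]])[0])

    print(agent_decision(5.5, 20.0, 28.0, 40.0))
    ```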

  19. 77 FR 38729 - Alternate Tonnage Threshold for Oil Spill Response Vessels

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-29

    ...The Coast Guard is establishing an alternate size threshold based on the measurement system established under the International Convention on Tonnage Measurement of Ships, 1969, for oil spill response vessels, which are properly certificated under 46 CFR chapter I, subchapter L. The present size threshold of 500 gross register tons is based on the U.S. regulatory measurement system. This final rule provides an alternative for owners and operators of offshore supply vessels that may result in an increase in oil spill response capacity and capability. This final rule adopts, without change, the interim rule amending 46 CFR part 126 published in the Federal Register on Monday, December 12, 2011.

  20. Accelerating population balance-Monte Carlo simulation for coagulation dynamics from the Markov jump model, stochastic algorithm and GPU parallel computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xu, Zuwei; Zhao, Haibo, E-mail: klinsmannzhb@163.com; Zheng, Chuguang

    2015-01-15

    This paper proposes a comprehensive framework for accelerating population balance-Monte Carlo (PBMC) simulation of particle coagulation dynamics. By combining a Markov jump model, a weighted majorant kernel and GPU (graphics processing unit) parallel computing, a significant gain in computational efficiency is achieved. The Markov jump model constructs a coagulation-rule matrix of differentially-weighted simulation particles, so as to capture the time evolution of particle size distribution with low statistical noise over the full size range and as far as possible to reduce the number of time loopings. Here three coagulation rules are highlighted and it is found that constructing an appropriate coagulation rule provides a route to attain the compromise between accuracy and cost of PBMC methods. Further, in order to avoid double looping over all simulation particles when considering the two-particle events (typically, particle coagulation), the weighted majorant kernel is introduced to estimate the maximum coagulation rates being used for acceptance–rejection processes by single-looping over all particles, and meanwhile the mean time-step of a coagulation event is estimated by summing the coagulation kernels of rejected and accepted particle pairs. The computational load of these fast differentially-weighted PBMC simulations (based on the Markov jump model) is reduced greatly to be proportional to the number of simulation particles in a zero-dimensional system (single cell). Finally, for a spatially inhomogeneous multi-dimensional (multi-cell) simulation, the proposed fast PBMC is performed in each cell, and multiple cells are processed in parallel by multiple cores on a GPU that can implement the massively threaded data-parallel tasks to obtain a remarkable speedup ratio (compared with CPU computation, the speedup ratio of GPU parallel computing is as high as 200 in a case of 100 cells with 10 000 simulation particles per cell). These accelerating approaches of PBMC are demonstrated in a physically realistic Brownian coagulation case. The computational accuracy is validated against the benchmark solution of the discrete-sectional method. The simulation results show that the comprehensive approach can attain very favorable improvement in cost without sacrificing computational accuracy.
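
    A much-simplified sketch of the acceptance-rejection step with a majorant kernel follows, for a single unweighted coagulation event. The toy kernel, the crude majorant bound, and the omission of particle weighting, time-step estimation and GPU parallelism are all simplifications for illustration; this is not the paper's algorithm.

    ```python
    # Acceptance-rejection with a majorant (upper-bound) kernel for one coagulation event.
    import random

    def kernel(v1, v2):                     # toy Brownian-type coagulation kernel
        return (v1 ** (1/3) + v2 ** (1/3)) * (v1 ** (-1/3) + v2 ** (-1/3))

    def majorant(vmin, vmax):               # safe upper bound over the population
        return kernel(vmin, vmax) * 2.0

    def coagulation_event(volumes):
        kmax = majorant(min(volumes), max(volumes))
        while True:
            i, j = random.sample(range(len(volumes)), 2)      # candidate pair
            if random.random() < kernel(volumes[i], volumes[j]) / kmax:
                volumes[i] += volumes[j]                      # accept: merge particles
                del volumes[j]
                return volumes

    print(coagulation_event([1.0, 2.0, 4.0, 8.0]))
    ```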

  1. Application of a rule-based knowledge system using CLIPS for the taxonomy of selected Opuntia species

    NASA Technical Reports Server (NTRS)

    Heymans, Bart C.; Onema, Joel P.; Kuti, Joseph O.

    1991-01-01

    A rule-based knowledge system was developed in CLIPS (C Language Integrated Production System) for identifying Opuntia species in the family Cactaceae, which contains approximately 1500 different species. This botanist expert tool system is capable of identifying selected Opuntia plants from the family level down to the species level when given some basic characteristics of the plants. Many of these plants are of increasing importance because of their nutritional and human health potential, especially in the treatment of diabetes mellitus. The expert tool system described can be extremely useful for the unequivocal identification of many useful Opuntia species.

  2. Alternative Theoretical Bases for the Study of Human Communication: The Rules Perspective.

    ERIC Educational Resources Information Center

    Cushman, Donald P.

    Three potentially useful perspectives for the scientific development of human communication theory are the law model, the systems approach, and the rules paradigm. It is the purpose of this paper to indicate the utility of the rules perspective. For the purposes of this analysis, human communication is viewed as the successful transfer of symbolic…

  3. Fusion of classifiers for REIS-based detection of suspicious breast lesions

    NASA Astrophysics Data System (ADS)

    Lederman, Dror; Wang, Xingwei; Zheng, Bin; Sumkin, Jules H.; Tublin, Mitchell; Gur, David

    2011-03-01

    After developing a multi-probe resonance-frequency electrical impedance spectroscopy (REIS) system aimed at detecting women with breast abnormalities that may indicate a developing breast cancer, we have been conducting a prospective clinical study to explore the feasibility of applying this REIS system to classify younger women (< 50 years old) into two groups of "higher-than-average risk" and "average risk" of having or developing breast cancer. The system comprises one central probe placed in contact with the nipple, and six additional probes uniformly distributed along an outside circle to be placed in contact with six points on the outer breast skin surface. In this preliminary study, we selected an initial set of 174 examinations on participants who had completed REIS examinations and had clinical status verification. Among these, 66 examinations were recommended for biopsy due to findings of a highly suspicious breast lesion ("positives"), and 108 were determined as negative during imaging-based procedures ("negatives"). A set of REIS-based features, extracted using a mirror-matched approach, was computed and fed into five machine learning classifiers. A genetic algorithm was used to select an optimal subset of features for each of the five classifiers. Three fusion rules, namely the sum rule, weighted sum rule and weighted median rule, were used to combine the results of the classifiers. Performance evaluation was performed using a leave-one-case-out cross-validation method. The results indicated that REIS may provide a new technology to identify younger women with higher than average risk of having or developing breast cancer. Furthermore, it was shown that fusion rules, such as the weighted median and weighted sum fusion rules, may improve performance as compared with the highest-performing single classifier.
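
    A minimal sketch of the three fusion rules named above, applied to per-classifier probabilities of the "positive" class; the scores and weights are placeholders, not values from the study.

    ```python
    # Sum, weighted sum, and weighted median fusion of classifier scores.
    import numpy as np

    def sum_rule(scores):
        return float(np.mean(scores))

    def weighted_sum_rule(scores, weights):
        w = np.asarray(weights, dtype=float)
        return float(np.dot(scores, w) / w.sum())

    def weighted_median_rule(scores, weights):
        order = np.argsort(scores)
        cum = np.cumsum(np.asarray(weights, dtype=float)[order])
        idx = np.searchsorted(cum, cum[-1] / 2.0)   # first score whose cumulative weight passes 50%
        return float(np.asarray(scores)[order][idx])

    scores = [0.62, 0.40, 0.75, 0.55, 0.80]   # five classifiers' outputs (illustrative)
    weights = [0.3, 0.1, 0.25, 0.15, 0.2]
    print(sum_rule(scores), weighted_sum_rule(scores, weights), weighted_median_rule(scores, weights))
    ```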

  4. C-Language Integrated Production System, Version 6.0

    NASA Technical Reports Server (NTRS)

    Riley, Gary; Donnell, Brian; Ly, Huyen-Anh Bebe; Ortiz, Chris

    1995-01-01

    C Language Integrated Production System (CLIPS) computer programs are specifically intended to model human expertise or other knowledge. CLIPS is designed to enable research on, and development and delivery of, artificial intelligence on conventional computers. CLIPS 6.0 provides a cohesive software tool for handling a wide variety of knowledge, with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming: representation of knowledge as heuristics, essentially rules of thumb that specify a set of actions to be performed in a given situation. Object-oriented programming: modeling of complex systems composed of modular components that can easily be reused to model other systems or to create new components. Procedural programming: representation of knowledge in ways similar to those of such languages as C, Pascal, Ada, and LISP. The version of CLIPS 6.0 for IBM PC-compatible computers requires DOS v3.3 or later and/or Windows 3.1 or later.

  5. Multiple systems of category learning.

    PubMed

    Smith, Edward E; Grossman, Murray

    2008-01-01

    We review neuropsychological and neuroimaging evidence for the existence of three qualitatively different categorization systems. These categorization systems are themselves based on three distinct memory systems: working memory (WM), explicit long-term memory (explicit LTM), and implicit long-term memory (implicit LTM). We first contrast categorization based on WM with that based on explicit LTM, where the former typically involves applying rules to a test item and the latter involves determining the similarity between stored exemplars or prototypes and a test item. Neuroimaging studies show differences between brain activity in normal participants as a function of whether they are instructed to categorize novel test items by rule or by similarity to known category members. Rule instructions typically lead to more activation in frontal or parietal areas, associated with WM and selective attention, whereas similarity instructions may activate parietal areas associated with the integration of perceptual features. Studies with neurological patients in the same paradigms provide converging evidence, e.g., patients with Alzheimer's disease, who have damage in prefrontal regions, are more impaired with rule than similarity instructions. Our second contrast is between categorization based on explicit LTM with that based on implicit LTM. Neuropsychological studies with patients with medial-temporal lobe damage show that patients are impaired on tasks requiring explicit LTM, but perform relatively normally on an implicit categorization task. Neuroimaging studies provide converging evidence: whereas explicit categorization is mediated by activation in numerous frontal and parietal areas, implicit categorization is mediated by a deactivation in posterior cortex.

  6. A center for commercial development of space: Real-time satellite mapping. Remote sensing-based agricultural information expert system

    NASA Technical Reports Server (NTRS)

    Hadipriono, Fabian C.; Diaz, Carlos F.; Merritt, Earl S.

    1989-01-01

    The research project results in a powerful yet user-friendly CROPCAST expert system for use by a client to determine the crop yield production of a certain crop field. The study is based on the facts that heuristic assessment and decision making in agriculture are significant and dominate much of agribusiness. Transfer of the expert knowledge concerning remote-sensing-based crop yield production into a specific expert system is the key program in this study. A knowledge base consisting of a root frame, CROP-YIELD-FORECAST, and four subframes, namely SATELLITE, PLANT-PHYSIOLOGY, GROUND, and MODEL, was developed to accommodate the production rules obtained from the domain expert. The expert system shell Personal Consultant Plus version 4.0 was used for this purpose. An external geographic program was integrated into the system. This project is the first part of a completely built expert system. The study reveals that much effort was given to the development of the rules. Such effort is inevitable if workable, efficient, and accurate rules are desired. Furthermore, abundant help statements and graphics were included. Internal and external display routines add to the visual capability of the system. The work results in a useful tool for the client for making decisions on crop yield production.

  7. State Identification of Hoisting Motors Based on Association Rules for Quayside Container Crane

    NASA Astrophysics Data System (ADS)

    Li, Q. Z.; Gang, T.; Pan, H. Y.; Xiong, H.

    2017-07-01

    The hoisting motor of a quayside container crane is a complex system whose running status evolves and changes over the long term according to regularities that can be identified and exploited. Using association rule analysis, this paper introduces a similarity measure between association rules and applies it to the status identification of quay container crane hoisting motors. The approach is validated with an example: the amplitude of change of some rules is small, so the changes are not easy to detect through regular monitoring, yet it is precisely these small changes that lead to mechanical failure. Using changes in the association rules to monitor motor status therefore has very strong practical significance.

  8. An Intelligent computer-aided tutoring system for diagnosing anomalies of spacecraft in operation

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark; Lauriente, Michael; Koons, Harry C.; Gorney, David

    1993-01-01

    A new rule-based, expert system for diagnosing spacecraft anomalies is under development. The knowledge base consists of over two-hundred (200) rules and provides links to historical and environmental databases. Environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. When the user selects the novice mode, the system automatically gives detailed explanations and descriptions of terms and reasoning as the session progresses, in a sense teaching the user. As such it is an effective tutoring tool. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The system is available on-line and uses C Language Integrated Production System (CLIPS), an expert shell developed by the NASA Johnson Space Center AI Laboratory in Houston.

  9. CMedTEX: A Rule-based Temporal Expression Extraction and Normalization System for Chinese Clinical Notes.

    PubMed

    Liu, Zengjian; Tang, Buzhou; Wang, Xiaolong; Chen, Qingcai; Li, Haodi; Bu, Junzhao; Jiang, Jingzhi; Deng, Qiwen; Zhu, Suisong

    2016-01-01

    Time is an important aspect of information and is very useful for information utilization. The goal of this study was to analyze the challenges of temporal expression (TE) extraction and normalization in Chinese clinical notes by assessing the performance of a rule-based system developed by us on a manually annotated corpus (including 1,778 clinical notes of 281 hospitalized patients). In order to develop the system conveniently, we divided TEs into three categories (direct, indirect, and uncertain) and designed different rules for each category. Evaluation on the independent test set shows that our system achieves an F-score of 93.40% on TE extraction, and an accuracy of 92.58% on TE normalization under the "exact-match" criterion. Compared with HeidelTime for Chinese newswire text, our system performs much better, indicating that it is necessary to develop a specific TE extraction and normalization system for Chinese clinical notes because of domain differences.
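
    A toy sketch of rule-based TE extraction and normalization follows, handling only one "direct" date pattern; the real system's rules for Chinese clinical text are far richer, and the pattern and output format below are illustrative.

    ```python
    # Regex rule for a direct Chinese date expression, normalized to ISO format.
    import re

    DATE_RE = re.compile(r"(\d{4})年(\d{1,2})月(\d{1,2})日")

    def extract_and_normalize(text):
        results = []
        for m in DATE_RE.finditer(text):
            year, month, day = (int(g) for g in m.groups())
            results.append((m.group(0), "%04d-%02d-%02d" % (year, month, day)))
        return results

    print(extract_and_normalize("患者于2015年3月7日入院"))
    ```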

  10. Molnets: An Artificial Chemistry Based on Neural Networks

    NASA Technical Reports Server (NTRS)

    Colombano, Silvano; Luk, Johnny; Segovia-Juarez, Jose L.; Lohn, Jason; Clancy, Daniel (Technical Monitor)

    2002-01-01

    The fundamental problem in the evolution of matter is to understand how structure-function relationships are formed and increase in complexity from the molecular level all the way to a genetic system. We have created a system where structure-function relationships arise naturally and without the need for ad hoc function assignments to given structures. The idea was inspired by neural networks, where the structure of the net embodies specific computational properties. In this system, networks interact with other networks to create connections between the inputs of one net and the outputs of another. The newly created net then recomputes its own synaptic weights, based on anti-Hebbian rules. As a result, some connections may be cut, and multiple nets can emerge as products of a 'reaction'. The idea is to study emergent reaction behaviors, based on simple rules that constitute a pseudophysics of the system. These simple rules are parameterized to produce behaviors that emulate chemical reactions. We find that these simple rules lead to a gradual increase in the size and complexity of molecules. We have been building a virtual artificial chemistry laboratory for discovering interesting reactions and for testing further ideas on the evolution of primitive molecules. Some of these ideas include the potential effect of membranes and selective diffusion according to molecular size.
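
    The sketch below illustrates the kind of anti-Hebbian weight update and connection pruning mentioned above; the learning rate, pruning cutoff and matrices are invented parameters, not those of the Molnets system.

    ```python
    # Anti-Hebbian update: weaken co-active connections, then cut the weak ones.
    import numpy as np

    def anti_hebbian_update(W, x, y, lr=0.1, cutoff=0.05):
        W = W - lr * np.outer(y, x)          # anti-Hebbian: decrease co-active connections
        W[np.abs(W) < cutoff] = 0.0          # prune weak connections ("cut" them)
        return W

    W = np.array([[0.4, 0.1], [0.06, 0.5]])
    print(anti_hebbian_update(W, x=np.array([1.0, 0.0]), y=np.array([1.0, 1.0])))
    ```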

  11. Design a Fuzzy Rule-based Expert System to Aid Earlier Diagnosis of Gastric Cancer.

    PubMed

    Safdari, Reza; Arpanahi, Hadi Kazemi; Langarizadeh, Mostafa; Ghazisaiedi, Marjan; Dargahi, Hossein; Zendehdel, Kazem

    2018-01-01

    Screening and health check-up programs are among the most important public health priorities and should be undertaken to control dangerous diseases such as gastric cancer, which is affected by many different factors. More than 50% of gastric cancer diagnoses are made during the advanced stage, and there is currently no systematic approach for the early diagnosis of gastric cancer. The objective was to develop a fuzzy expert system that can identify gastric cancer risk levels in individuals. The system was implemented in MATLAB software; the Mamdani inference technique was applied to simulate the reasoning of experts in the field, and a total of 67 fuzzy rules were extracted as a rule base from medical experts' opinions. Fifty case scenarios were used to evaluate the system: the information from each case report was given to the system to find its risk level, and the obtained results were compared with the experts' diagnoses. Results revealed that sensitivity was 92.1% and specificity was 83.1%. The results show that it is possible to develop a system that can identify individuals at high risk for gastric cancer. The system can lead to earlier diagnosis, which may facilitate early treatment and reduce the gastric cancer mortality rate.
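
    A very small Mamdani-style sketch in the spirit of such a system: two illustrative inputs, triangular memberships, min implication, max aggregation, and centroid defuzzification. The variables, memberships and the two rules are invented for illustration and are not the 67 expert rules of the paper.

    ```python
    # Toy Mamdani fuzzy inference for a "gastric cancer risk" output on [0, 1].
    import numpy as np

    def tri(x, a, b, c):
        # triangular membership function
        return np.maximum(np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)), 0.0)

    def risk_level(age, epigastric_pain):
        universe = np.linspace(0, 1, 101)                 # output universe: risk level
        older = tri(age, 45, 70, 95)                      # membership of "older age"
        severe = tri(epigastric_pain, 5, 8, 10)           # membership of "severe pain" (0-10 scale)
        # Rule 1: IF age is older AND pain is severe THEN risk is high (min implication)
        r1 = np.minimum(min(older, severe), tri(universe, 0.6, 0.8, 1.0))
        # Rule 2: IF age is NOT older THEN risk is low
        r2 = np.minimum(1.0 - older, tri(universe, 0.0, 0.2, 0.4))
        aggregated = np.maximum(r1, r2)                   # max aggregation
        return float((aggregated * universe).sum() / (aggregated.sum() + 1e-9))  # centroid

    print(risk_level(age=68, epigastric_pain=7))
    ```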

  12. Agent-based re-engineering of ErbB signaling: a modeling pipeline for integrative systems biology.

    PubMed

    Das, Arya A; Ajayakumar Darsana, T; Jacob, Elizabeth

    2017-03-01

    Experiments in systems biology are generally supported by a computational model which quantitatively estimates the parameters of the system by finding the best fit to the experiment. Mathematical models have proved to be successful in reverse engineering the system. The data generated are interpreted to understand the dynamics of the underlying phenomena. The questions we have sought to answer are these: is it possible to use an agent-based approach to re-engineer a biological process, making use of the available knowledge from experimental and modelling efforts? Can the bottom-up approach benefit from the top-down exercise so as to create an integrated modelling formalism for systems biology? We propose a modelling pipeline that learns from the data given by reverse engineering and uses it for re-engineering the system, to carry out in-silico experiments. A mathematical model that quantitatively predicts co-expression of EGFR-HER2 receptors in activation and trafficking has been taken for this study. The pipeline architecture takes cues from the population model that gives the rates of biochemical reactions to formulate knowledge-based rules for the particle model. Agent-based simulations using these rules support the existing facts on EGFR-HER2 dynamics. We conclude that re-engineering models built using the results of reverse engineering opens up the possibility of harnessing the power pack of data which now lies scattered in the literature. Virtual experiments could then become more realistic when empowered with the findings of empirical cell biology and modelling studies. Implemented on the Agent Modelling Framework developed in-house. C++ code templates are available in the Supplementary material. liz.csir@gmail.com. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
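
    One common way to turn a population-level rate constant into a per-agent rule, in the general spirit of the pipeline described above, is to convert a first-order rate k into the probability that an individual agent reacts during a time step dt. The rate value, time step and agent states below are placeholders, not the study's parameters.

    ```python
    # Rate-to-probability conversion for an agent-based rule: p = 1 - exp(-k * dt).
    import math, random

    def reaction_probability(k_per_s, dt_s):
        return 1.0 - math.exp(-k_per_s * dt_s)

    def step(agents, k_activation=0.05, dt=0.1):
        p = reaction_probability(k_activation, dt)
        for agent in agents:
            if agent["state"] == "inactive" and random.random() < p:
                agent["state"] = "active"            # the rule fires for this agent
        return agents

    receptors = [{"state": "inactive"} for _ in range(1000)]
    for _ in range(100):
        step(receptors)
    print(sum(a["state"] == "active" for a in receptors))
    ```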

  13. Automating the design of scientific computing software

    NASA Technical Reports Server (NTRS)

    Kant, Elaine

    1992-01-01

    SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.

  14. A novel on-line spatial-temporal k-anonymity method for location privacy protection from sequence rules-based inference attacks.

    PubMed

    Zhang, Haitao; Wu, Chenxue; Chen, Zewei; Liu, Zhao; Zhu, Yunhong

    2017-01-01

    Analyzing large-scale spatial-temporal k-anonymity datasets recorded in location-based service (LBS) application servers can benefit some LBS applications. However, such analyses can allow adversaries to make inference attacks that cannot be handled by spatial-temporal k-anonymity methods or other methods for protecting sensitive knowledge. In response to this challenge, first we defined a destination location prediction attack model based on privacy-sensitive sequence rules mined from large scale anonymity datasets. Then we proposed a novel on-line spatial-temporal k-anonymity method that can resist such inference attacks. Our anti-attack technique generates new anonymity datasets with awareness of privacy-sensitive sequence rules. The new datasets extend the original sequence database of anonymity datasets to hide the privacy-sensitive rules progressively. The process includes two phases: off-line analysis and on-line application. In the off-line phase, sequence rules are mined from an original sequence database of anonymity datasets, and privacy-sensitive sequence rules are developed by correlating privacy-sensitive spatial regions with spatial grid cells among the sequence rules. In the on-line phase, new anonymity datasets are generated upon LBS requests by adopting specific generalization and avoidance principles to hide the privacy-sensitive sequence rules progressively from the extended sequence anonymity datasets database. We conducted extensive experiments to test the performance of the proposed method, and to explore the influence of the parameter K value. The results demonstrated that our proposed approach is faster and more effective for hiding privacy-sensitive sequence rules in terms of hiding sensitive rules ratios to eliminate inference attacks. Our method also had fewer side effects in terms of generating new sensitive rules ratios than the traditional spatial-temporal k-anonymity method, and had basically the same side effects in terms of non-sensitive rules variation ratios with the traditional spatial-temporal k-anonymity method. Furthermore, we also found the performance variation tendency from the parameter K value, which can help achieve the goal of hiding the maximum number of original sensitive rules while generating a minimum of new sensitive rules and affecting a minimum number of non-sensitive rules.

  15. A novel on-line spatial-temporal k-anonymity method for location privacy protection from sequence rules-based inference attacks

    PubMed Central

    Wu, Chenxue; Liu, Zhao; Zhu, Yunhong

    2017-01-01

    Analyzing large-scale spatial-temporal k-anonymity datasets recorded in location-based service (LBS) application servers can benefit some LBS applications. However, such analyses can allow adversaries to make inference attacks that cannot be handled by spatial-temporal k-anonymity methods or other methods for protecting sensitive knowledge. In response to this challenge, first we defined a destination location prediction attack model based on privacy-sensitive sequence rules mined from large scale anonymity datasets. Then we proposed a novel on-line spatial-temporal k-anonymity method that can resist such inference attacks. Our anti-attack technique generates new anonymity datasets with awareness of privacy-sensitive sequence rules. The new datasets extend the original sequence database of anonymity datasets to hide the privacy-sensitive rules progressively. The process includes two phases: off-line analysis and on-line application. In the off-line phase, sequence rules are mined from an original sequence database of anonymity datasets, and privacy-sensitive sequence rules are developed by correlating privacy-sensitive spatial regions with spatial grid cells among the sequence rules. In the on-line phase, new anonymity datasets are generated upon LBS requests by adopting specific generalization and avoidance principles to hide the privacy-sensitive sequence rules progressively from the extended sequence anonymity datasets database. We conducted extensive experiments to test the performance of the proposed method, and to explore the influence of the parameter K value. The results demonstrated that our proposed approach is faster and more effective for hiding privacy-sensitive sequence rules in terms of hiding sensitive rules ratios to eliminate inference attacks. Our method also had fewer side effects in terms of generating new sensitive rules ratios than the traditional spatial-temporal k-anonymity method, and had basically the same side effects in terms of non-sensitive rules variation ratios with the traditional spatial-temporal k-anonymity method. Furthermore, we also found the performance variation tendency from the parameter K value, which can help achieve the goal of hiding the maximum number of original sensitive rules while generating a minimum of new sensitive rules and affecting a minimum number of non-sensitive rules. PMID:28767687

  16. Modeling for (physical) biologists: an introduction to the rule-based approach

    PubMed Central

    Chylek, Lily A; Harris, Leonard A; Faeder, James R; Hlavacek, William S

    2015-01-01

    Models that capture the chemical kinetics of cellular regulatory networks can be specified in terms of rules for biomolecular interactions. A rule defines a generalized reaction, meaning a reaction that permits multiple reactants, each capable of participating in a characteristic transformation and each possessing certain, specified properties, which may be local, such as the state of a particular site or domain of a protein. In other words, a rule defines a transformation and the properties that reactants must possess to participate in the transformation. A rule also provides a rate law. A rule-based approach to modeling enables consideration of mechanistic details at the level of functional sites of biomolecules and provides a facile and visual means for constructing computational models, which can be analyzed to study how system-level behaviors emerge from component interactions. PMID:26178138
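
    As a purely illustrative companion to this description, the sketch below encodes one hypothetical rule as reactant patterns (required local site states) plus a transformation and a rate constant. Real rule-based tools provide dedicated model-specification languages for this; the dictionary encoding here is not the syntax of any such language.

    ```python
    # Toy encoding of a rule: patterns specify only the local properties a molecule
    # must have; the transformation and rate law complete the rule.

    phosphorylation_rule = {
        "reactants": [
            {"molecule": "Kinase", "sites": {"active": True}},         # pattern 1
            {"molecule": "Substrate", "sites": {"Y100": "unphos"}},    # pattern 2
        ],
        "transformation": {"molecule": "Substrate", "set": {"Y100": "phos"}},
        "rate_constant": 1e-3,   # every reaction implied by the rule inherits this rate law
    }

    def matches(molecule, pattern):
        return (molecule["name"] == pattern["molecule"] and
                all(molecule["sites"].get(s) == v for s, v in pattern["sites"].items()))

    kinase = {"name": "Kinase", "sites": {"active": True, "docking": "bound"}}
    substrate = {"name": "Substrate", "sites": {"Y100": "unphos", "Y200": "phos"}}
    print(matches(kinase, phosphorylation_rule["reactants"][0]),
          matches(substrate, phosphorylation_rule["reactants"][1]))
    ```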

  17. A sensitive HIV-1 envelope induced fusion assay identifies fusion enhancement of thrombin

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, De-Chun; Zhong, Guo-Cai; Su, Ju-Xiang

    2010-01-22

    To evaluate the interaction between HIV-1 envelope glycoprotein (Env) and target cell receptors, various cell-cell fusion assays have been developed. In the present study, we established a novel fusion system. In this system, the expression of a sensitive reporter gene, the firefly luciferase (FL) gene, in the target cells was used to evaluate the cell fusion event. Simultaneously, a constitutively expressed Renilla luciferase (RL) gene was used to monitor effector cell number and viability. FL gave a wider dynamic range than other known reporters, and the introduction of RL made the assay accurate and reproducible. This system is especially beneficial for the investigation of potential entry-influencing agents, because of its power to rule out false inhibition or enhancement caused by artificial cell-number variation. As a case study, we applied this fusion system to observe the effect of a serine protease, thrombin, on HIV Env-mediated cell-cell fusion and found fusion-enhancement activity of thrombin on two R5-tropic HIV strains.

  18. Reliability and performance evaluation of systems containing embedded rule-based expert systems

    NASA Technical Reports Server (NTRS)

    Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.

    1989-01-01

    A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three-stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge base or by using sensitivity analyses. In the third and final stage, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performance of an FDI algorithm associated with an aircraft longitudinal flight-control system.
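
    As a small illustration of the first stage, the sketch below propagates state probabilities through a toy Markov reliability model; the states and transition probabilities are invented placeholders, not values from the paper.

    ```python
    # Toy Markov reliability model: probability of not having failed after N steps.
    import numpy as np

    # states: 0 = nominal, 1 = degraded (expert system gives poor advice), 2 = failed
    P = np.array([[0.995, 0.004, 0.001],    # one-step transition probabilities
                  [0.000, 0.990, 0.010],
                  [0.000, 0.000, 1.000]])

    def reliability(steps):
        p = np.array([1.0, 0.0, 0.0])       # start in the nominal state
        for _ in range(steps):
            p = p @ P
        return 1.0 - p[2]                   # probability of not being in the failed state

    print(reliability(1000))
    ```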

  19. Automated diagnosis of coronary artery disease based on data mining and fuzzy modeling.

    PubMed

    Tsipouras, Markos G; Exarchos, Themis P; Fotiadis, Dimitrios I; Kotsia, Anna P; Vakalis, Konstantinos V; Naka, Katerina K; Michalis, Lampros K

    2008-07-01

    A fuzzy rule-based decision support system (DSS) is presented for the diagnosis of coronary artery disease (CAD). The system is automatically generated from an initial annotated dataset, using a four stage methodology: 1) induction of a decision tree from the data; 2) extraction of a set of rules from the decision tree, in disjunctive normal form and formulation of a crisp model; 3) transformation of the crisp set of rules into a fuzzy model; and 4) optimization of the parameters of the fuzzy model. The dataset used for the DSS generation and evaluation consists of 199 subjects, each one characterized by 19 features, including demographic and history data, as well as laboratory examinations. Tenfold cross validation is employed, and the average sensitivity and specificity obtained is 62% and 54%, respectively, using the set of rules extracted from the decision tree (first and second stages), while the average sensitivity and specificity increase to 80% and 65%, respectively, when the fuzzification and optimization stages are used. The system offers several advantages since it is automatically generated, it provides CAD diagnosis based on easily and noninvasively acquired features, and is able to provide interpretation for the decisions made.
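
    The sketch below illustrates only the first two stages of such a methodology (tree induction and rule extraction) on synthetic data; the three features stand in for the 19 clinical features of the study, and the fuzzification and optimization stages are not shown.

    ```python
    # Induce a decision tree and read off its rules in text form (stages 1-2 only).
    import numpy as np
    from sklearn.tree import DecisionTreeClassifier, export_text

    rng = np.random.default_rng(1)
    X = rng.normal(size=(199, 3))           # e.g. age, cholesterol, max heart rate (standardized)
    y = (0.8 * X[:, 0] + 0.6 * X[:, 1] - 0.5 * X[:, 2] + rng.normal(0, 0.5, 199) > 0).astype(int)

    tree = DecisionTreeClassifier(max_depth=3).fit(X, y)
    print(export_text(tree, feature_names=["age", "cholesterol", "max_heart_rate"]))
    ```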

  20. Coreference analysis in clinical notes: a multi-pass sieve with alternate anaphora resolution modules.

    PubMed

    Jonnalagadda, Siddhartha Reddy; Li, Dingcheng; Sohn, Sunghwan; Wu, Stephen Tze-Inn; Wagholikar, Kavishwar; Torii, Manabu; Liu, Hongfang

    2012-01-01

    This paper describes the coreference resolution system submitted by Mayo Clinic for the 2011 i2b2/VA/Cincinnati shared task Track 1C. The goal of the task was to construct a system that links the markables corresponding to the same entity. The task organizers provided progress notes and discharge summaries that were annotated with the markables of treatment, problem, test, person, and pronoun. We used a multi-pass sieve algorithm that applies deterministic rules in the order of preciseness and simultaneously gathers information about the entities in the documents. Our system, MedCoref, also uses a state-of-the-art machine learning framework as an alternative to the final, rule-based pronoun resolution sieve. The best system that uses a multi-pass sieve has an overall score of 0.836 (average of B(3), MUC, Blanc, and CEAF F score) for the training set and 0.843 for the test set. A supervised machine learning system that typically uses a single function to find coreferents cannot accommodate irregularities encountered in data especially given the insufficient number of examples. On the other hand, a completely deterministic system could lead to a decrease in recall (sensitivity) when the rules are not exhaustive. The sieve-based framework allows one to combine reliable machine learning components with rules designed by experts. Using relatively simple rules, part-of-speech information, and semantic type properties, an effective coreference resolution system could be designed. The source code of the system described is available at https://sourceforge.net/projects/ohnlp/files/MedCoref.
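
    The sketch below captures the multi-pass sieve idea in miniature: deterministic passes are applied in decreasing order of precision, and each pass may merge markables into the clusters built by earlier passes. The two passes shown are simplified stand-ins, not MedCoref's sieves.

    ```python
    # Minimal multi-pass sieve: most precise rule first, weaker evidence later.

    def exact_match_pass(markables, clusters):
        for i, m in enumerate(markables):
            for j in range(i):
                if m["text"].lower() == markables[j]["text"].lower():
                    clusters[i] = clusters[j]          # link identical mentions
                    break
        return clusters

    def head_word_pass(markables, clusters):
        for i, m in enumerate(markables):
            for j in range(i):
                if clusters[i] != clusters[j] and m["head"] == markables[j]["head"]:
                    clusters[i] = clusters[j]          # weaker evidence, applied later
                    break
        return clusters

    def resolve(markables, sieves=(exact_match_pass, head_word_pass)):
        clusters = list(range(len(markables)))         # each markable starts alone
        for sieve in sieves:                           # most precise sieve first
            clusters = sieve(markables, clusters)
        return clusters

    mentions = [{"text": "the chest X-ray", "head": "X-ray"},
                {"text": "The chest x-ray", "head": "X-ray"},
                {"text": "a repeat X-ray", "head": "X-ray"}]
    print(resolve(mentions))
    ```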

  1. Medicare and Medicaid Programs; CY 2016 Home Health Prospective Payment System Rate Update; Home Health Value-Based Purchasing Model; and Home Health Quality Reporting Requirements. Final rule.

    PubMed

    2015-11-05

    This final rule will update Home Health Prospective Payment System (HH PPS) rates, including the national, standardized 60-day episode payment rates, the national per-visit rates, and the non-routine medical supply (NRS) conversion factor under the Medicare prospective payment system for home health agencies (HHAs), effective for episodes ending on or after January 1, 2016. As required by the Affordable Care Act, this rule implements the 3rd year of the 4-year phase-in of the rebasing adjustments to the HH PPS payment rates. This rule updates the HH PPS case-mix weights using the most current, complete data available at the time of rulemaking and provides a clarification regarding the use of the "initial encounter'' seventh character applicable to certain ICD-10-CM code categories. This final rule will also finalize reductions to the national, standardized 60-day episode payment rate in CY 2016, CY 2017, and CY 2018 of 0.97 percent in each year to account for estimated case-mix growth unrelated to increases in patient acuity (nominal case-mix growth) between CY 2012 and CY 2014. In addition, this rule implements a HH value-based purchasing (HHVBP) model, beginning January 1, 2016, in which all Medicare-certified HHAs in selected states will be required to participate. Finally, this rule finalizes minor changes to the home health quality reporting program and minor technical regulations text changes.

  2. Optimal operating rules definition in complex water resource systems combining fuzzy logic, expert criteria and stochastic programming

    NASA Astrophysics Data System (ADS)

    Macian-Sorribes, Hector; Pulido-Velazquez, Manuel

    2016-04-01

    This contribution presents a methodology for defining optimal seasonal operating rules in multireservoir systems coupling expert criteria and stochastic optimization. Both sources of information are combined using fuzzy logic. The structure of the operating rules is defined based on expert criteria, via a joint expert-technician framework consisting of a series of meetings, workshops and surveys carried out between reservoir managers and modelers. As a result, the decision-making process used by managers can be assessed and expressed using fuzzy logic: fuzzy rule-based systems are employed to represent the operating rules and fuzzy regression procedures are used for forecasting future inflows. Once that is done, a stochastic optimization algorithm can be used to define optimal decisions and transform them into fuzzy rules. Finally, the optimal fuzzy rules and the inflow prediction scheme are combined into a Decision Support System for making seasonal forecasts and simulating the effect of different alternatives in response to the initial system state and the foreseen inflows. The approach presented has been applied to the Jucar River Basin (Spain). Reservoir managers explained how the system is operated, taking into account the reservoirs' states at the beginning of the irrigation season and the inflows foreseen during that season. According to the information given by them, the Jucar River Basin operating policies were expressed via two fuzzy rule-based (FRB) systems that estimate the amount of water to be allocated to the users and how the reservoir storages should be balanced to guarantee those deliveries. A stochastic optimization model using Stochastic Dual Dynamic Programming (SDDP) was developed to define optimal decisions, which are transformed into optimal operating rules by embedding them into the two FRBs previously created. As a benchmark, historical records are used to develop alternative operating rules. A fuzzy linear regression procedure was employed to foresee future inflows depending on present and past hydrological and meteorological variables actually used by the reservoir managers to define likely inflow scenarios. A Decision Support System (DSS) was created coupling the FRB systems and the inflow prediction scheme in order to give the user a set of possible optimal releases in response to the reservoir states at the beginning of the irrigation season and the fuzzy inflow projections made using hydrological and meteorological information. The results show that the optimal DSS created using the FRB operating policies is able to increase the amount of water allocated to the users by 20 to 50 Mm3 per irrigation season with respect to the current policies. Consequently, the mechanism used to define optimal operating rules and transform them into a DSS is able to increase the water deliveries in the Jucar River Basin, combining expert criteria and optimization algorithms in an efficient way. This study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) and FEDER funds. It also has received funding from the European Union's Horizon 2020 research and innovation programme under the IMPREX project (grant agreement no: 641.811).

  3. Autonomous Flight Safety System

    NASA Technical Reports Server (NTRS)

    Simpson, James

    2010-01-01

    The Autonomous Flight Safety System (AFSS) is an independent self-contained subsystem mounted onboard a launch vehicle. AFSS has been developed by and is owned by the US Government. It autonomously makes flight termination/destruct decisions using configurable software-based rules implemented on redundant flight processors, using data from redundant GPS/IMU navigation sensors. AFSS implements rules determined by the appropriate Range Safety officials.

  4. Forward-Chaining Versus A Graph Approach As The Inference Engine In Expert Systems

    NASA Astrophysics Data System (ADS)

    Neapolitan, Richard E.

    1986-03-01

    Rule-based expert systems are those in which a certain number of IF-THEN rules are assumed to be true. Based on the verity of some assertions, the rules deduce as many new conclusions as possible. A standard technique used to make these deductions is forward-chaining. In forward-chaining, the program or 'inference engine' cycles through the rules. At each rule, the premises for the rule are checked against the current true assertions. If all the premises are found, the conclusion is added to the list of true assertions. At that point it is necessary to start over at the first rule, since the new conclusion may be a premise in a rule already checked. Therefore, each time a new conclusion is deduced it is necessary to start the rule checking procedure over. This process continues until no new conclusions are added and the end of the list of rules is reached. The above process, although quite costly in terms of CPU cycles due to the necessity of repeatedly starting the process over, is necessary if the rules contain 'pattern variables'. An example of such a rule is, 'IF X IS A BACTERIA, THEN X CAN BE TREATED WITH ANTIBIOTICS'. Since the rule can lead to conclusions for many values of X, it is necessary to check each premise in the rule against every true assertion, producing an association list to be used in the checking of the next premise. However, if the rule does not contain variable data, as is the case in many current expert systems, then a rule can lead to only one conclusion. In this case, the rules can be stored in a graph, and the true assertions in an assertion list. The assertion list is traversed only once; at each assertion a premise is triggered in all the rules which have that assertion as a premise. When all premises for a rule trigger, the rule's conclusion is added to the END of the list of assertions. It must be added at the end so that it will eventually be used to make further deductions. In the current paper, the two methods are described in detail, the relative advantages of each are discussed, and a benchmark comparing the CPU cycles consumed by each is included. It is also shown that, in the case of reasoning under uncertainty, it is possible to properly combine the certainties derived from rules arguing for the same conclusion when the graph approach is used.
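    The forward-chaining loop described above can be summarized in a few lines for the propositional case (no pattern variables). The rules and facts below are invented for illustration; the paper's benchmark implementation and its graph-based alternative are not reproduced here.

    ```python
    # Minimal propositional forward-chaining sketch: keep scanning the rules
    # until a full pass over the rule list adds no new conclusion.

    rules = [
        ({"gram_negative", "rod_shaped"}, "bacteria"),
        ({"bacteria"}, "treat_with_antibiotics"),
    ]

    def forward_chain(facts, rules):
        facts = set(facts)
        changed = True
        while changed:                     # restart until nothing new is deduced
            changed = False
            for premises, conclusion in rules:
                if premises <= facts and conclusion not in facts:
                    facts.add(conclusion)  # a new conclusion may enable earlier rules
                    changed = True
        return facts

    print(forward_chain({"gram_negative", "rod_shaped"}, rules))
    ```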

  5. Medicare and Medicaid programs: hospital outpatient prospective payment; ambulatory surgical center payment; hospital value-based purchasing program; physician self-referral; and patient notification requirements in provider agreements. Final rule with comment period.

    PubMed

    2011-11-30

    This final rule with comment period revises the Medicare hospital outpatient prospective payment system (OPPS) for CY 2012 to implement applicable statutory requirements and changes arising from our continuing experience with this system. In this final rule with comment period, we describe the changes to the amounts and factors used to determine the payment rates for Medicare hospital outpatient services paid under the OPPS. In addition, this final rule with comment period updates the revised Medicare ambulatory surgical center (ASC) payment system to implement applicable statutory requirements and changes arising from our continuing experience with this system. In this final rule with comment period, we set forth the relative payment weights and payment amounts for services furnished in ASCs, specific HCPCS codes to which these changes apply, and other ratesetting information for the CY 2012 ASC payment system. We are revising the requirements for the Hospital Outpatient Quality Reporting (OQR) Program, adding new requirements for ASC Quality Reporting System, and making additional changes to provisions of the Hospital Inpatient Value-Based Purchasing (VBP) Program. We also are allowing eligible hospitals and CAHs participating in the Medicare Electronic Health Record (EHR) Incentive Program to meet the clinical quality measure reporting requirement of the EHR Incentive Program for payment year 2012 by participating in the 2012 Medicare EHR Incentive Program Electronic Reporting Pilot. Finally, we are making changes to the rules governing the whole hospital and rural provider exceptions to the physician self-referral prohibition for expansion of facility capacity and changes to provider agreement regulations on patient notification requirements.

  6. Dynamic association rules for gene expression data analysis.

    PubMed

    Chen, Shu-Chuan; Tsai, Tsung-Hsien; Chung, Cheng-Han; Li, Wen-Hsiung

    2015-10-14

    The purpose of gene expression analysis is to look for the association between regulation of gene expression levels and phenotypic variations. This association based on gene expression profile has been used to determine whether the induction/repression of genes corresponds to phenotypic variations including cell regulations, clinical diagnoses and drug development. Statistical analyses on microarray data have been developed to resolve the gene selection issue. However, these methods do not inform us of causality between genes and phenotypes. In this paper, we propose the dynamic association rule algorithm (DAR algorithm) which helps one efficiently select a subset of significant genes for subsequent analysis. The DAR algorithm is based on association rules from market basket analysis in marketing. We first propose a statistical way, based on constructing a one-sided confidence interval and hypothesis testing, to determine if an association rule is meaningful. Based on the proposed statistical method, we then developed the DAR algorithm for gene expression data analysis. The method was applied to analyze four microarray datasets and one Next Generation Sequencing (NGS) dataset: the Mice Apo A1 dataset, the whole genome expression dataset of mouse embryonic stem cells, expression profiling of the bone marrow of Leukemia patients, the Microarray Quality Control (MAQC) dataset and the RNA-seq dataset of a mouse genomic imprinting study. A comparison of the proposed method with the t-test on the expression profiling of the bone marrow of Leukemia patients was conducted. We developed a statistical way, based on the concept of confidence interval, to determine the minimum support and minimum confidence for mining association relationships among items. With the minimum support and minimum confidence, one can find significant rules in one single step. The DAR algorithm was then developed for gene expression data analysis. Four gene expression datasets showed that the proposed DAR algorithm not only was able to identify a set of differentially expressed genes that largely agreed with that of other methods, but also provided an efficient and accurate way to find influential genes of a disease. In the paper, the well-established association rule mining technique from marketing has been successfully modified to determine the minimum support and minimum confidence based on the concept of confidence interval and hypothesis testing. It can be applied to gene expression data to mine significant association rules between gene regulation and phenotype. The proposed DAR algorithm provides an efficient way to find influential genes that underlie the phenotypic variance.
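    As a hedged illustration of the underlying association-rule quantities, the sketch below computes support and confidence for a rule over binarized expression profiles; the item names are invented, and the fixed thresholds stand in for the minimum support and minimum confidence that the DAR algorithm would instead derive from a one-sided confidence interval.

    ```python
    # Support/confidence for an association rule "A -> B" over binarized samples.

    samples = [
        {"geneA_up", "geneB_up", "disease"},
        {"geneA_up", "disease"},
        {"geneB_up"},
        {"geneA_up", "geneB_up", "disease"},
    ]

    def support(itemset, samples):
        return sum(itemset <= s for s in samples) / len(samples)

    def confidence(antecedent, consequent, samples):
        return support(antecedent | consequent, samples) / support(antecedent, samples)

    rule = ({"geneA_up"}, {"disease"})
    sup = support(rule[0] | rule[1], samples)
    conf = confidence(*rule, samples)
    min_support, min_confidence = 0.5, 0.8   # placeholders for the derived thresholds
    print(f"support={sup:.2f}, confidence={conf:.2f}, "
          f"significant={sup >= min_support and conf >= min_confidence}")
    ```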

  7. The role of the interaction network in the emergence of diversity of behavior

    PubMed Central

    Tabacof, Pedro; Von Zuben, Fernando J.

    2017-01-01

    How can systems in which individuals’ inner workings are very similar to each other, as neural networks or ant colonies, produce so many qualitatively different behaviors, giving rise to roles and specialization? In this work, we bring new perspectives to this question by focusing on the underlying network that defines how individuals in these systems interact. We applied a genetic algorithm to optimize rules and connections of cellular automata in order to solve the density classification task, a classical problem used to study emergent behaviors in decentralized computational systems. The networks used were all generated by the introduction of shortcuts in an originally regular topology, following the small-world model. Even though all cells follow the exact same rules, we observed the existence of different classes of cells’ behaviors in the best cellular automata found—most cells were responsible for memory and others for integration of information. Through the analysis of structural measures and patterns of connections (motifs) in successful cellular automata, we observed that the distribution of shortcuts between distant regions and the speed in which a cell can gather information from different parts of the system seem to be the main factors for the specialization we observed, demonstrating how heterogeneity in a network can create heterogeneity of behavior. PMID:28234962
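    The following toy sketch shows the flavor of the system studied: binary cells on a ring with a few random shortcuts, updated by a simple majority rule, applied to the density classification task. The actual study evolves the update rules and connections with a genetic algorithm; the majority rule and all parameters below are placeholders.

    ```python
    # Toy density-classification setup on a small-world-style ring.
    import random

    def make_neighbours(n, radius=2, shortcuts=5, seed=1):
        """Ring lattice with 2*radius regular neighbours per cell plus a few
        random long-range shortcuts added on top of the regular topology."""
        random.seed(seed)
        nbrs = [[(i + d) % n for d in range(-radius, radius + 1) if d != 0]
                for i in range(n)]
        for _ in range(shortcuts):
            a, b = random.randrange(n), random.randrange(n)
            nbrs[a].append(b)
        return nbrs

    def step(state, nbrs):
        """Each cell adopts the majority state of its neighbourhood (ties -> 0)."""
        return [1 if 2 * sum(state[j] for j in nbrs[i]) > len(nbrs[i]) else 0
                for i in range(len(state))]

    n = 49
    nbrs = make_neighbours(n)
    state = [random.choice([0, 1]) for _ in range(n)]
    initial_majority = int(2 * sum(state) > n)
    for _ in range(50):
        state = step(state, nbrs)
    print("initial majority:", initial_majority, "cells agree:", len(set(state)) == 1)
    ```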

  8. Merging a Terrain-Based Parameter and Snow Particle Counter Data for the Assessment of Snow Redistribution in the Col du Lac Blanc Area

    NASA Astrophysics Data System (ADS)

    Schön, Peter; Prokop, Alexander; Naaim-Bouvet, Florence; Vionnet, Vincent; Guyomarc'h, Gilbert; Heiser, Micha; Nishimura, Kouichi

    2015-04-01

    Wind and the associated snow drift are dominating factors determining the snow distribution and accumulation in alpine areas, resulting in a high spatial variability of snow depth that is difficult to evaluate and quantify. The terrain-based parameter Sx characterizes the degree of shelter or exposure of a grid point provided by the upwind terrain, without the computational complexity of numerical wind field models. The parameter has been shown to qualitatively predict snow redistribution with good reproduction of spatial patterns. It does not, however, provide a quantitative estimate of changes in snow depths. The objective of our research was to introduce a new parameter to quantify changes in snow depths in our research area, the Col du Lac Blanc in the French Alps. The area is at an elevation of 2700 m and particularly suited for our study due to its consistently bi-modal wind directions. Our work focused on two pronounced, approximately 10 m high terrain breaks, and we worked with 1 m resolution digital snow surface models (DSM). The DSM and measured changes in snow depths were obtained with high-accuracy terrestrial laser scan (TLS) measurements. First, we calculated the terrain-based parameter Sx on a digital snow surface model and correlated Sx with measured changes in snow depths (ΔSH). Results showed that ΔSH can be approximated by ΔSH_estimated = α * Sx, where α is a newly introduced parameter. The parameter α has been shown to be linked to the amount of snow deposited, which is influenced by blowing snow flux. At the Col du Lac Blanc test site, blowing snow flux is recorded with snow particle counters (SPC). Snow flux is the number of drifting snow particles per time and area. Hence, the SPC provide data about the duration and intensity of drifting snow events, two important factors not accounted for by the terrain parameter Sx. We analyse how the SPC snow flux data can be used to estimate the magnitude of the new variable parameter α. To simulate the development of the snow surface as a function of Sx, SPC flux and time, we apply a simple cellular automata system. The system consists of raster cells that develop through discrete time steps according to a set of rules. The rules are based on the states of neighboring cells. Our model assumes snow transport as a function of Sx gradients between neighboring cells. The cells evolve based on difference quotients between neighboring cells. Our analyses and results are steps towards using the terrain-based parameter Sx, coupled with SPC data, to quantitatively estimate changes in snow depths, using high raster resolutions of 1 m.
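    A small sketch of the linear estimate introduced above, ΔSH_estimated = α * Sx, with a purely hypothetical scaling of α from snow particle counter (SPC) flux records; the numbers and the scaling function are invented and only illustrate how the two data sources could be combined.

    ```python
    # Illustrative dSH = alpha * Sx estimate with a hypothetical alpha scaling.

    def estimate_alpha(spc_flux_series, scale=1e-4):
        """Hypothetical scaling of alpha from SPC flux records: sum the flux
        over the recorded time steps of a drifting snow event."""
        return scale * sum(spc_flux_series)

    def estimate_depth_change(sx_grid, alpha):
        """Apply dSH = alpha * Sx cell by cell on a 1 m resolution raster."""
        return [[alpha * sx for sx in row] for row in sx_grid]

    sx_grid = [[-0.4, 0.1], [0.3, 0.8]]        # sheltered (>0) vs exposed (<0) cells
    alpha = estimate_alpha([1200, 900, 1500])   # three SPC records of one drift event
    print(estimate_depth_change(sx_grid, alpha))
    ```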

  9. Unipolar distributions of junctional Myosin II identify cell stripe boundaries that drive cell intercalation throughout Drosophila axis extension

    PubMed Central

    Tetley, Robert J; Blanchard, Guy B; Fletcher, Alexander G; Adams, Richard J; Sanson, Bénédicte

    2016-01-01

    Convergence and extension movements elongate tissues during development. Drosophila germ-band extension (GBE) is one example, which requires active cell rearrangements driven by Myosin II planar polarisation. Here, we develop novel computational methods to analyse the spatiotemporal dynamics of Myosin II during GBE, at the scale of the tissue. We show that initial Myosin II bipolar cell polarization gives way to unipolar enrichment at parasegmental boundaries and two further boundaries within each parasegment, concomitant with a doubling of cell number as the tissue elongates. These boundaries are the primary sites of cell intercalation, behaving as mechanical barriers and providing a mechanism for how cells remain ordered during GBE. Enrichment at parasegment boundaries during GBE is independent of Wingless signaling, suggesting pair-rule gene control. Our results are consistent with recent work showing that a combinatorial code of Toll-like receptors downstream of pair-rule genes contributes to Myosin II polarization via local cell-cell interactions. We propose an updated cell-cell interaction model for Myosin II polarization that we tested in a vertex-based simulation. DOI: http://dx.doi.org/10.7554/eLife.12094.001 PMID:27183005

  10. Evaluation of a rule base for decision making in general practice.

    PubMed Central

    Essex, B; Healy, M

    1994-01-01

    BACKGROUND. Decision making in general practice relies heavily on judgmental expertise. It should be possible to codify this expertise into rules and principles. AIM. A study was undertaken to evaluate the effectiveness of rules from a rule base designed to improve students' and trainees' management decisions relating to patients seen in general practice. METHOD. The rule base was developed after studying decisions about and management of thousands of patients seen in one general practice over an eight-year period. Vignettes were presented to 93 fourth-year medical students and 179 general practitioner trainees. They recorded their perception and management of each case before and after being presented with a selection of relevant rules. Participants also commented on their level of agreement with each of the rules provided with the vignettes. A panel of five independent assessors then rated as good, acceptable or poor the participants' perception and management of each case before and after seeing the rules. RESULTS. Exposure to a few selected rules of thumb improved the problem perception and management decisions of both undergraduates and trainees. The degree of improvement was not related to previous experience or to the stated level of agreement with the proposed rules. The assessors identified difficulties students and trainees experienced in changing their perceptions and management decisions when the rules suggested options they had not considered. CONCLUSION. The rules developed to improve decision making skills in general practice are effective when used with vignettes. The next phase is to transform the rule base into an expert system to train students and doctors to acquire decision making skills. It could also be used to provide decision support when confronted with difficult management decisions in general practice. PMID:8204334

  11. 78 FR 18252 - Prevailing Rate Systems; North American Industry Classification System Based Federal Wage System...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-26

    ...-AM78 Prevailing Rate Systems; North American Industry Classification System Based Federal Wage System... 2007 North American Industry Classification System (NAICS) codes currently used in Federal Wage System... (OPM) issued a final rule (73 FR 45853) to update the 2002 North American Industry Classification...

  12. An Intelligent Decision Support System for Workforce Forecast

    DTIC Science & Technology

    2011-01-01

    ARIMA) model to forecast the demand for construction skills in Hong Kong. This model was based... Decision Trees, ARIMA, Rule-Based Forecasting, Segmentation Forecasting, Regression Analysis, Simulation Modeling, Input-Output Models, LP and NLP, Markovian... data • When results are needed as a set of easily interpretable rules. 4.1.4 ARIMA: Auto-regressive, integrated, moving-average (ARIMA) models

  13. Knowledge acquisition for case-based reasoning systems

    NASA Technical Reports Server (NTRS)

    Riesbeck, Christopher K.

    1988-01-01

    Case-based reasoning (CBR) is a simple idea: solve new problems by adapting old solutions to similar problems. The CBR approach offers several potential advantages over rule-based reasoning: rules are not combined blindly in a search for solutions, solutions can be explained in terms of concrete examples, and performance can improve automatically as new problems are solved and added to the case library. Moving CBR from the university research environment to the real world requires smooth interfaces for getting knowledge from experts. Described are the basic elements of an interface for acquiring three basic bodies of knowledge that any case-based reasoner requires: the case library of problems and their solutions, the analysis rules that flesh out input problem specifications so that relevant cases can be retrieved, and the adaptation rules that adjust old solutions to fit new problems.
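    A minimal sketch of two of the run-time knowledge bodies mentioned above, a case library queried by similarity and an adaptation rule that adjusts the retrieved solution; the domain, features and adaptation rule are invented and only stand in for whatever the acquisition interface would elicit from experts.

    ```python
    # Toy case-based reasoner: retrieve the most similar case, then adapt it.

    case_library = [
        ({"cuisine": "italian", "guests": 4}, {"dish": "lasagne", "portions": 4}),
        ({"cuisine": "mexican", "guests": 2}, {"dish": "tacos", "portions": 2}),
    ]

    def similarity(query, case_problem):
        """Count matching feature values between the query and a stored problem."""
        return sum(query.get(k) == case_problem.get(k) for k in query)

    def adapt(solution, problem):
        """Single invented adaptation rule: scale portions to the new guest count."""
        adapted = dict(solution)
        adapted["portions"] = problem["guests"]
        return adapted

    def solve(problem):
        _, best_solution = max(case_library, key=lambda c: similarity(problem, c[0]))
        return adapt(best_solution, problem)

    print(solve({"cuisine": "italian", "guests": 6}))
    ```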

  14. Knowledge From Pictures (KFP)

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt; Paterra, Frank; Bailin, Sidney

    1993-01-01

    The old maxim goes: 'A picture is worth a thousand words'. The objective of the research reported in this paper is to demonstrate this idea as it relates to the knowledge acquisition process and the automated development of an expert system's rule base. A prototype tool, the Knowledge From Pictures (KFP) tool, has been developed which configures an expert system's rule base by an automated analysis of and reasoning about a 'picture', i.e., a graphical representation of some target system to be supported by the diagnostic capabilities of the expert system under development. This rule base, when refined, could then be used by the expert system for target system monitoring and fault analysis in an operational setting. Most people, when faced with the problem of understanding the behavior of a complicated system, resort to the use of some picture or graphical representation of the system as an aid in thinking about it. This depiction provides a means of helping the individual to visualize the behavior and dynamics of the system under study. An analysis of the picture, augmented with the individual's background information, allows the problem solver to codify knowledge about the system. This knowledge can, in turn, be used to develop computer programs to automatically monitor the system's performance. The approach taken in this research was to mimic this knowledge acquisition paradigm. A prototype tool was developed which provides the user: (1) a mechanism for graphically representing sample system-configurations appropriate for the domain, and (2) a linguistic device for annotating the graphical representation with the behaviors and mutual influences of the components depicted in the graphic. The KFP tool, reasoning from the graphical depiction along with user-supplied annotations of component behaviors and inter-component influences, generates a rule base that could be used in automating the fault detection, isolation, and repair of the system.

  15. Putting theory to the test: which regulatory mechanisms can drive realistic growth of a root?

    PubMed

    De Vos, Dirk; Vissenberg, Kris; Broeckhove, Jan; Beemster, Gerrit T S

    2014-10-01

    In recent years there has been a strong development of computational approaches to mechanistically understand organ growth regulation in plants. In this study, simulation methods were used to explore which regulatory mechanisms can lead to realistic output at the cell and whole organ scale and which other possibilities must be discarded as they result in cellular patterns and kinematic characteristics that are not consistent with experimental observations for the Arabidopsis thaliana primary root. To aid in this analysis, a 'Uniform Longitudinal Strain Rule' (ULSR) was formulated as a necessary condition for stable, unidirectional, symplastic growth. Our simulations indicate that symplastic structures are robust to differences in longitudinal strain rates along the growth axis only if these differences are small and short-lived. Whereas simple cell-autonomous regulatory rules based on counters and timers can produce stable growth, it was found that steady developmental zones and smooth transitions in cell lengths are not feasible. By introducing spatial cues into growth regulation, those inadequacies could be avoided and experimental data could be faithfully reproduced. Nevertheless, a root growth model based on previous polar auxin-transport mechanisms violates the proposed ULSR due to the presence of lateral gradients. Models with layer-specific regulation or layer-driven growth offer potential solutions. Alternatively, a model representing the known cross-talk between auxin, as the cell proliferation promoting factor, and cytokinin, as the cell differentiation promoting factor, predicts the effect of hormone-perturbations on meristem size. By down-regulating PIN-mediated transport through the transcription factor SHY2, cytokinin effectively flattens the lateral auxin gradient, at the basal boundary of the division zone, (thereby imposing the ULSR) to signal the exit of proliferation and start of elongation. This model exploration underlines the value of generating virtual root growth kinematics to dissect and understand the mechanisms controlling this biological system.

  16. Integrated modelling of stormwater treatment systems uptake.

    PubMed

    Castonguay, A C; Iftekhar, M S; Urich, C; Bach, P M; Deletic, A

    2018-05-24

    Nature-based solutions provide a variety of benefits in growing cities, ranging from stormwater treatment to amenity provision such as aesthetics. However, the decision-making process involved in the installation of such green infrastructure is not straightforward, as much uncertainty around the location, size, costs and benefits impedes systematic decision-making. We developed a model to simulate decision rules used by local municipalities to install nature-based stormwater treatment systems, namely constructed wetlands, ponds/basins and raingardens. The model was used to test twenty-four scenarios of policy-making, by combining four asset-selection, two location-selection and three budget-constraint decision rules. Based on the case study of a local municipality in Metropolitan Melbourne, Australia, the modelled uptake of stormwater treatment systems was compared with attributes of real-world systems for the simulation period. Results show that the actual budgeted funding is not a reliable predictor of systems' uptake and that policy-makers are more likely to plan expenditures based on installation costs. The model was able to replicate the cumulative treatment capacity and the location of systems. As such, it offers a novel approach to investigate the impact of using different decision rules to provide environmental services considering biophysical and economic factors. Copyright © 2018 Elsevier Ltd. All rights reserved.
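    The twenty-four scenarios follow directly from the factorial combination of decision rules (4 x 2 x 3 = 24). The sketch below enumerates such combinations; the rule names are placeholders, not the ones used in the study.

    ```python
    # Enumerate the factorial combination of decision rules (placeholder names).
    from itertools import product

    asset_rules = ["wetland_first", "pond_first", "raingarden_first", "cheapest_first"]
    location_rules = ["largest_catchment", "closest_to_outfall"]
    budget_rules = ["fixed_budget", "installation_cost_cap", "unconstrained"]

    scenarios = list(product(asset_rules, location_rules, budget_rules))
    print(len(scenarios))      # 24
    print(scenarios[0])        # one example scenario
    ```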

  17. Evaluating Machine Learning Classifiers for Hybrid Network Intrusion Detection Systems

    DTIC Science & Technology

    2015-03-26

    VRT: Vulnerability Research Team... and the Talos (formerly the Vulnerability Research Team (VRT)) [7] ruleset libraries are the two leading rulesets in use. Both libraries offer paid... rule sets to load for the signature-based IDS. Snort is selected as the IDS engine using the “VRT and ET No/GPL” rule set. The total rule count in the

  18. st-Alphabets: On the Feasibility in the Explicit Use of Extended Relational Alphabets in Classifier Systems

    NASA Astrophysics Data System (ADS)

    Toledo-Suárez, Carlos D.

    A way of increasing the cardinality of the alphabet used to write rules in a learning classifier system is proposed, extending the idea of relational schemata. Theoretical justifications are given for the possible reduction in the number of rules needed to solve problems that such extended alphabets (st-alphabets) imply. It is shown that, when expressed as bipolar neural networks, the matching process of rules over st-alphabets strongly resembles a gene expression mechanism applied to a system over {0,1,#}. In spite of the apparent drawbacks that the explicit use of such relational alphabets would imply, their successful implementation in an information-gain-based classifier system (IGCS) is presented.
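    For context, the baseline ternary alphabet {0,1,#} that st-alphabets extend can be matched as in the sketch below, with '#' acting as a wildcard; the rules and message are invented, and the relational symbols of st-alphabets themselves are not implemented here.

    ```python
    # Classifier-system condition matching over the ternary alphabet {0, 1, #}.

    def matches(condition, message):
        """A condition matches when every position is '#' or equals the message bit."""
        return len(condition) == len(message) and all(
            c == "#" or c == m for c, m in zip(condition, message))

    rules = {"1#0#": "action_A", "0###": "action_B"}
    message = "1101"
    fired = [action for cond, action in rules.items() if matches(cond, message)]
    print(fired)    # ['action_A']
    ```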

  19. A Decision Making Methodology in Support of the Business Rules Lifecycle

    NASA Technical Reports Server (NTRS)

    Wild, Christopher; Rosca, Daniela

    1998-01-01

    The business rules that underlie an enterprise emerge as a new category of system requirements that represent decisions about how to run the business, and which are characterized by their business-orientation and their propensity for change. In this report, we introduce a decision making methodology which addresses several aspects of the business rules lifecycle: acquisition, deployment and evolution. We describe a meta-model for representing business rules in terms of an enterprise model, and also a decision support submodel for reasoning about and deriving the rules. The possibility of automated lifecycle assistance is demonstrated in terms of the automatic extraction of business rules from the decision structure. A system based on the meta-model has been implemented, including the extraction algorithm. This is the final report for Daniela Rosca's PhD fellowship. It describes the work we have done over the past year, current research, and the list of publications associated with her thesis topic.

  20. Multi-criteria objective based climate change impact assessment for multi-purpose multi-reservoir systems

    NASA Astrophysics Data System (ADS)

    Müller, Ruben; Schütze, Niels

    2014-05-01

    Water resources systems with reservoirs are expected to be sensitive to climate change. Assessment studies that analyze the impact of climate change on the performance of reservoirs can be divided in two groups: (1) Studies that simulate the operation under projected inflows with the current set of operational rules. Due to non adapted operational rules the future performance of these reservoirs can be underestimated and the impact overestimated. (2) Studies that optimize the operational rules for best adaption of the system to the projected conditions before the assessment of the impact. The latter allows for estimating more realistically future performance and adaption strategies based on new operation rules are available if required. Multi-purpose reservoirs serve various, often conflicting functions. If all functions cannot be served simultaneously at a maximum level, an effective compromise between multiple objectives of the reservoir operation has to be provided. Yet under climate change the historically preferenced compromise may no longer be the most suitable compromise in the future. Therefore a multi-objective based climate change impact assessment approach for multi-purpose multi-reservoir systems is proposed in the study. Projected inflows are provided in a first step using a physically based rainfall-runoff model. In a second step, a time series model is applied to generate long-term inflow time series. Finally, the long-term inflow series are used as driving variables for a simulation-based multi-objective optimization of the reservoir system in order to derive optimal operation rules. As a result, the adapted Pareto-optimal set of diverse best compromise solutions can be presented to the decision maker in order to assist him in assessing climate change adaption measures with respect to the future performance of the multi-purpose reservoir system. The approach is tested on a multi-purpose multi-reservoir system in a mountainous catchment in Germany. A climate change assessment is performed for climate change scenarios based on the SRES emission scenarios A1B, B1 and A2 for a set of statistically downscaled meteorological data. The future performance of the multi-purpose multi-reservoir system is quantified and possible intensifications of trade-offs between management goals or reservoir utilizations are shown.

  1. Kinetics of cell division in epidermal maintenance

    NASA Astrophysics Data System (ADS)

    Klein, Allon M.; Doupé, David P.; Jones, Phillip H.; Simons, Benjamin D.

    2007-08-01

    The rules governing cell division and differentiation are central to understanding the mechanisms of development, aging, and cancer. By utilizing inducible genetic labeling, recent studies have shown that the clonal population in transgenic mouse epidermis can be tracked in vivo. Drawing on these results, we explain how clonal fate data may be used to infer the rules of cell division and differentiation underlying the maintenance of adult murine tail-skin. We show that the rates of cell division and differentiation may be evaluated by considering the long-time and short-time clone fate data, and that the data is consistent with cells dividing independently rather than synchronously. Motivated by these findings, we consider a mechanism for cancer onset based closely on the model for normal adult skin. By analyzing the expected changes to clonal fate in cancer emerging from a simple two-stage mutation, we propose that clonal fate data may provide a novel method for studying the earliest stages of the disease.

  2. The Evolvement of Automobile Steering System Based on TRIZ

    NASA Astrophysics Data System (ADS)

    Zhao, Xinjun; Zhang, Shuang

    Products and techniques pass through a process of birth, growth, maturity and death, and then exit the stage, much like a biological evolution process. The development of products and techniques conforms to certain evolvement rules. If people know and apply these rules, they can design new kinds of products and forecast the development trends of the products. Thereby, enterprises can grasp the future technical directions of products and pursue product and technique innovation. Below, based on TRIZ theory, the mechanism evolvement, the function evolvement and the appearance evolvement of the automobile steering system are analyzed, and some new ideas about the future automobile steering system are put forward.

  3. An operation support expert system based on on-line dynamics simulation and fuzzy reasoning for startup schedule optimization in fossil power plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matsumoto, H.; Eki, Y.; Kaji, A.

    1993-12-01

    An expert system which can support operators of fossil power plants in creating the optimum startup schedule and executing it accurately is described. The optimum turbine speed-up and load-up pattern is obtained in an iterative manner based on fuzzy reasoning, using quantitative calculations from plant dynamics models and qualitative knowledge in the form of schedule optimization rules with fuzziness. The rules represent relationships between stress margins and modification rates of the schedule parameters. Simulation analysis shows that the system provides quick and accurate plant startups.

  4. Medicare and Medicaid programs; fire safety requirements for certain health care facilities; amendment. Interim final rule with comment period.

    PubMed

    2005-03-25

    This interim final rule with comment period adopts the substance of the April 15, 2004 temporary interim amendment (TIA) 00-1 (101), Alcohol Based Hand Rub Solutions, an amendment to the 2000 edition of the Life Safety Code, published by the National Fire Protection Association (NFPA). This amendment will allow certain health care facilities to place alcohol-based hand rub dispensers in egress corridors under specified conditions. This interim final rule with comment period also requires that nursing facilities install smoke detectors in resident rooms and public areas if they do not have a sprinkler system installed throughout the facility or a hard-wired smoke detection system in those areas.

  5. Constructing Compact Takagi-Sugeno Rule Systems: Identification of Complex Interactions in Epidemiological Data

    PubMed Central

    Zhou, Shang-Ming; Lyons, Ronan A.; Brophy, Sinead; Gravenor, Mike B.

    2012-01-01

    The Takagi-Sugeno (TS) fuzzy rule system is a widely used data mining technique, and is of particular use in the identification of non-linear interactions between variables. However the number of rules increases dramatically when applied to high dimensional data sets (the curse of dimensionality). Few robust methods are available to identify important rules while removing redundant ones, and this results in limited applicability in fields such as epidemiology or bioinformatics where the interaction of many variables must be considered. Here, we develop a new parsimonious TS rule system. We propose three statistics: R, L, and ω-values, to rank the importance of each TS rule, and a forward selection procedure to construct a final model. We use our method to predict how key components of childhood deprivation combine to influence educational achievement outcome. We show that a parsimonious TS model can be constructed, based on a small subset of rules, that provides an accurate description of the relationship between deprivation indices and educational outcomes. The selected rules shed light on the synergistic relationships between the variables, and reveal that the effect of targeting specific domains of deprivation is crucially dependent on the state of the other domains. Policy decisions need to incorporate these interactions, and deprivation indices should not be considered in isolation. The TS rule system provides a basis for such decision making, and has wide applicability for the identification of non-linear interactions in complex biomedical data. PMID:23272108
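    A minimal first-order Takagi-Sugeno rule system looks like the sketch below: Gaussian memberships on the inputs, linear consequents, and a firing-strength weighted average as the output. All parameters are illustrative and unrelated to the fitted deprivation/education model described above.

    ```python
    # Two-rule first-order Takagi-Sugeno system with invented parameters.
    import math

    def gauss(x, centre, sigma):
        return math.exp(-((x - centre) ** 2) / (2 * sigma ** 2))

    # Each rule: ((membership centres per input, sigma), linear consequent coefficients)
    rules = [
        (((0.2, 0.3), 0.2), (0.5, 0.1, -0.2)),  # IF x1 low AND x2 low THEN y = 0.5 + 0.1*x1 - 0.2*x2
        (((0.8, 0.7), 0.3), (0.1, 0.6,  0.4)),  # IF x1 high AND x2 high THEN y = 0.1 + 0.6*x1 + 0.4*x2
    ]

    def ts_output(x1, x2):
        num = den = 0.0
        for (centres, sigma), (b0, b1, b2) in rules:
            w = gauss(x1, centres[0], sigma) * gauss(x2, centres[1], sigma)  # firing strength
            num += w * (b0 + b1 * x1 + b2 * x2)
            den += w
        return num / den

    print(round(ts_output(0.4, 0.5), 3))
    ```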

  6. Constructing compact Takagi-Sugeno rule systems: identification of complex interactions in epidemiological data.

    PubMed

    Zhou, Shang-Ming; Lyons, Ronan A; Brophy, Sinead; Gravenor, Mike B

    2012-01-01

    The Takagi-Sugeno (TS) fuzzy rule system is a widely used data mining technique, and is of particular use in the identification of non-linear interactions between variables. However the number of rules increases dramatically when applied to high dimensional data sets (the curse of dimensionality). Few robust methods are available to identify important rules while removing redundant ones, and this results in limited applicability in fields such as epidemiology or bioinformatics where the interaction of many variables must be considered. Here, we develop a new parsimonious TS rule system. We propose three statistics: R, L, and ω-values, to rank the importance of each TS rule, and a forward selection procedure to construct a final model. We use our method to predict how key components of childhood deprivation combine to influence educational achievement outcome. We show that a parsimonious TS model can be constructed, based on a small subset of rules, that provides an accurate description of the relationship between deprivation indices and educational outcomes. The selected rules shed light on the synergistic relationships between the variables, and reveal that the effect of targeting specific domains of deprivation is crucially dependent on the state of the other domains. Policy decisions need to incorporate these interactions, and deprivation indices should not be considered in isolation. The TS rule system provides a basis for such decision making, and has wide applicability for the identification of non-linear interactions in complex biomedical data.

  7. An Ada inference engine for expert systems

    NASA Technical Reports Server (NTRS)

    Lavallee, David B.

    1986-01-01

    The purpose is to investigate the feasibility of using Ada for rule-based expert systems with real-time performance requirements. This includes exploring the Ada features which give improved performance to expert systems as well as optimizing the tradeoffs or workarounds that the use of Ada may require. A prototype inference engine was built using Ada, and rule firing rates in excess of 500 per second were demonstrated on a single MC68000 processor. The knowledge base uses a directed acyclic graph to represent production rules. The graph allows the use of AND, OR, and NOT logical operators. The inference engine uses a combination of both forward and backward chaining in order to reach goals as quickly as possible. Future efforts will include additional investigation of multiprocessing to improve performance and the creation of a user interface allowing rule input in an Ada-like syntax. Investigation of multitasking and alternate knowledge base representations will help to analyze some of the performance issues as they relate to larger problems.

  8. TOXPERT: An Expert System for Risk Assessment

    PubMed Central

    Soto, R. J.; Osimitz, T. G.; Oleson, A.

    1988-01-01

    TOXPERT is an artificial intelligence-based system used to model product safety, toxicology (TOX) and regulatory (REG) decision processes. An expert system shell uses backward chaining rule control to link “marketing approval” goals to the type of product, REG agency, exposure conditions and TOX. Marketing risks are primarily a function of the TOX, hazards and exposure potential. The method employed differentiates between REG requirements in goal-seeking control for various types of products. This is accomplished by controlling rule execution by defining frames for each REG agency. In addition, TOXPERT produces classifications of TOX ratings and suggested product labeling. This production rule system uses principles of TOX, REGs, corporate guidelines and internal “rules of thumb.” TOXPERT acts as an advisor for this narrow domain. Advantages are that it can make routine decisions, freeing professionals' time for more complex problem solving, and provide backup and training.

  9. COMPUTERIZED RISK AND BIOACCUMULATION SYSTEM (VERSION 1.0)

    EPA Science Inventory

    CRABS is a combination of a rule-based expert system and more traditional procedural programming techniques. Rule-based expert systems attempt to emulate the decision making process of human experts within a clearly defined subject area. Expert systems consist of an "inference engine...

  10. Adaptive Critic-based Neurofuzzy Controller for the Steam Generator Water Level

    NASA Astrophysics Data System (ADS)

    Fakhrazari, Amin; Boroushaki, Mehrdad

    2008-06-01

    In this paper, an adaptive critic-based neurofuzzy controller is presented for water level regulation of nuclear steam generators. The problem has been of great concern for many years as the steam generator is a highly nonlinear system showing inverse response dynamics especially at low operating power levels. Fuzzy critic-based learning is a reinforcement learning method based on dynamic programming. The only information available for the critic agent is the system feedback, which is interpreted as the last action the controller has performed in the previous state. The signal produced by the critic agent is used alongside the backpropagation of error algorithm to tune online the conclusion parts of the fuzzy inference rules. The critic agent here has a proportional-derivative structure and the fuzzy rule base has nine rules. The proposed controller shows satisfactory transient responses, disturbance rejection and robustness to model uncertainty. Its simple design procedure and structure nominate it as one of the suitable controller designs for steam generator water level control in the nuclear power plant industry.

  11. Generalizing Gillespie’s Direct Method to Enable Network-Free Simulations

    DOE PAGES

    Suderman, Ryan T.; Mitra, Eshan David; Lin, Yen Ting; ...

    2018-03-28

    Gillespie’s direct method for stochastic simulation of chemical kinetics is a staple of computational systems biology research. However, the algorithm requires explicit enumeration of all reactions and all chemical species that may arise in the system. In many cases, this is not feasible due to the combinatorial explosion of reactions and species in biological networks. Rule-based modeling frameworks provide a way to exactly represent networks containing such combinatorial complexity, and generalizations of Gillespie’s direct method have been developed as simulation engines for rule-based modeling languages. Here, we provide both a high-level description of the algorithms underlying the simulation engines, termed network-free simulation algorithms, and how they have been applied in systems biology research. We also define a generic rule-based modeling framework and describe a number of technical details required for adapting Gillespie’s direct method for network-free simulation. Lastly, we briefly discuss potential avenues for advancing network-free simulation and the role they continue to play in modeling dynamical systems in biology.
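    For reference, the direct method itself reduces to two sampling steps per iteration: draw an exponential waiting time from the total propensity, then pick a reaction channel with probability proportional to its propensity. The sketch below applies it to a small, fully enumerated reversible dimerisation with invented rate constants; network-free simulators avoid this up-front enumeration but keep the same core step.

    ```python
    # Textbook Gillespie direct method for A + A <-> A2 (rate constants invented).
    import math, random

    def gillespie_direct(x, t_end, seed=0):
        random.seed(seed)
        k_bind, k_unbind = 0.01, 0.1
        t, trajectory = 0.0, [(0.0, dict(x))]
        while t < t_end:
            a = [k_bind * x["A"] * (x["A"] - 1) / 2,   # propensity of A + A -> A2
                 k_unbind * x["A2"]]                    # propensity of A2 -> A + A
            a0 = sum(a)
            if a0 == 0:
                break
            t += -math.log(1.0 - random.random()) / a0  # exponential waiting time
            if random.random() * a0 < a[0]:             # pick channel ~ propensity
                x["A"] -= 2; x["A2"] += 1
            else:
                x["A"] += 2; x["A2"] -= 1
            trajectory.append((t, dict(x)))
        return trajectory

    print(gillespie_direct({"A": 100, "A2": 0}, t_end=5.0)[-1])
    ```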

  12. Generalizing Gillespie’s Direct Method to Enable Network-Free Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Suderman, Ryan T.; Mitra, Eshan David; Lin, Yen Ting

    Gillespie’s direct method for stochastic simulation of chemical kinetics is a staple of computational systems biology research. However, the algorithm requires explicit enumeration of all reactions and all chemical species that may arise in the system. In many cases, this is not feasible due to the combinatorial explosion of reactions and species in biological networks. Rule-based modeling frameworks provide a way to exactly represent networks containing such combinatorial complexity, and generalizations of Gillespie’s direct method have been developed as simulation engines for rule-based modeling languages. Here, we provide both a high-level description of the algorithms underlying the simulation engines, termed network-free simulation algorithms, and how they have been applied in systems biology research. We also define a generic rule-based modeling framework and describe a number of technical details required for adapting Gillespie’s direct method for network-free simulation. Lastly, we briefly discuss potential avenues for advancing network-free simulation and the role they continue to play in modeling dynamical systems in biology.

  13. SAMS--a systems architecture for developing intelligent health information systems.

    PubMed

    Yılmaz, Özgün; Erdur, Rıza Cenk; Türksever, Mustafa

    2013-12-01

    In this paper, SAMS, a novel health information system architecture for developing intelligent health information systems is proposed and also some strategies for developing such systems are discussed. The systems fulfilling this architecture will be able to store electronic health records of the patients using OWL ontologies, share patient records among different hospitals and provide physicians expertise to assist them in making decisions. The system is intelligent because it is rule-based, makes use of rule-based reasoning and has the ability to learn and evolve itself. The learning capability is provided by extracting rules from previously given decisions by the physicians and then adding the extracted rules to the system. The proposed system is novel and original in all of these aspects. As a case study, a system is implemented conforming to SAMS architecture for use by dentists in the dental domain. The use of the developed system is described with a scenario. For evaluation, the developed dental information system will be used and tried by a group of dentists. The development of this system proves the applicability of SAMS architecture. By getting decision support from a system derived from this architecture, the cognitive gap between experienced and inexperienced physicians can be compensated. Thus, patient satisfaction can be achieved, inexperienced physicians are supported in decision making and the personnel can improve their knowledge. A physician can diagnose a case, which he/she has never diagnosed before, using this system. With the help of this system, it will be possible to store general domain knowledge in this system and the personnel's need to medical guideline documents will be reduced.

  14. A mobile asset sharing policy for hospitals with real time locating systems.

    PubMed

    Demircan-Yıldız, Ece Arzu; Fescioglu-Unver, Nilgun

    2016-01-01

    Each year, hospitals lose a considerable amount of time and money due to misplaced mobile assets. In addition the assets which remain in departments that frequently use them depreciate early, while other assets of the same type in different departments are rarely used. A real time locating system can prevent these losses when used with appropriate asset sharing policies. This research quantifies the amount of time a medium size hospital saves by using real time locating system and proposes an asset selection rule to eliminate the asset usage imbalance problem. The asset selection rule proposed is based on multi objective optimization techniques. The effectiveness of this rule on asset to patient time and asset utilization rate variance performance measures were tested using discrete event simulation method. Results show that the proposed asset selection rule improved the usage balance significantly. Sensitivity analysis showed that the proposed rule is robust to changes in demand rates and user preferences. Real time locating systems enable saving considerable amount of time in hospitals, and they can still be improved by integrating decision support mechanisms. Combining tracking technology and asset selection rules helps improve healthcare services.
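    A hedged sketch of an asset-selection rule in the spirit described above: among candidate assets, minimise a weighted sum of normalised travel distance and recent utilisation so that usage is balanced. The weights, data and scoring are invented, not the multi-objective formulation used in the paper.

    ```python
    # Toy asset-selection rule balancing distance and utilisation (data invented).

    assets = [
        {"id": "pump-01", "distance_m": 40,  "utilisation": 0.85},
        {"id": "pump-02", "distance_m": 120, "utilisation": 0.20},
        {"id": "pump-03", "distance_m": 60,  "utilisation": 0.55},
    ]

    def select_asset(assets, w_distance=0.5, w_utilisation=0.5):
        max_d = max(a["distance_m"] for a in assets)
        def score(a):   # lower is better on both criteria
            return w_distance * a["distance_m"] / max_d + w_utilisation * a["utilisation"]
        return min(assets, key=score)

    print(select_asset(assets)["id"])
    ```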

  15. An HL7-CDA wrapper for facilitating semantic interoperability to rule-based Clinical Decision Support Systems.

    PubMed

    Sáez, Carlos; Bresó, Adrián; Vicente, Javier; Robles, Montserrat; García-Gómez, Juan Miguel

    2013-03-01

    The success of Clinical Decision Support Systems (CDSS) greatly depends on their capability of being integrated into Health Information Systems (HIS). Several proposals have been published to date to permit CDSS to gather patient data from HIS. Some base the CDSS data input on the HL7 reference model; however, they are tailored to specific CDSS or clinical guideline technologies, or do not focus on standardizing the resultant CDSS knowledge. We propose a solution for facilitating semantic interoperability to rule-based CDSS focusing on standardized input and output documents conforming to an HL7-CDA wrapper. We define the HL7-CDA restrictions in an HL7-CDA implementation guide. Patient data and rule inference results are mapped respectively to and from the CDSS by means of a binding method based on an XML binding file. As an independent clinical document, the results of a CDSS can present clinical and legal validity. The proposed solution is being applied in a CDSS for providing patient-specific recommendations for the care management of outpatients with diabetes mellitus. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.

  16. EXADS - EXPERT SYSTEM FOR AUTOMATED DESIGN SYNTHESIS

    NASA Technical Reports Server (NTRS)

    Rogers, J. L.

    1994-01-01

    The expert system called EXADS was developed to aid users of the Automated Design Synthesis (ADS) general purpose optimization program. Because of the general purpose nature of ADS, it is difficult for a nonexpert to select the best choice of strategy, optimizer, and one-dimensional search options from the one hundred or so combinations that are available. EXADS aids engineers in determining the best combination based on their knowledge of the problem and the expert knowledge previously stored by experts who developed ADS. EXADS is a customized application of the AESOP artificial intelligence program (the general version of AESOP is available separately from COSMIC. The ADS program is also available from COSMIC.) The expert system consists of two main components. The knowledge base contains about 200 rules and is divided into three categories: constrained, unconstrained, and constrained treated as unconstrained. The EXADS inference engine is rule-based and makes decisions about a particular situation using hypotheses (potential solutions), rules, and answers to questions drawn from the rule base. EXADS is backward-chaining, that is, it works from hypothesis to facts. The rule base was compiled from sources such as literature searches, ADS documentation, and engineer surveys. EXADS will accept answers such as yes, no, maybe, likely, and don't know, or a certainty factor ranging from 0 to 10. When any hypothesis reaches a confidence level of 90% or more, it is deemed as the best choice and displayed to the user. If no hypothesis is confirmed, the user can examine explanations of why the hypotheses failed to reach the 90% level. The IBM PC version of EXADS is written in IQ-LISP for execution under DOS 2.0 or higher with a central memory requirement of approximately 512K of 8 bit bytes. This program was developed in 1986.

  17. Quality Leadership and Quality Control

    PubMed Central

    Badrick, Tony

    2003-01-01

    Different quality control rules detect different analytical errors with varying levels of efficiency depending on the type of error present, its prevalence and the number of observations. The efficiency of a rule can be gauged by inspection of a power function graph. Control rules are only part of a process and not an end in itself; just as important are the trouble-shooting systems employed when a failure occurs. 'Average of patient normals' may develop as a usual adjunct to conventional quality control serum based programmes. Acceptable error can be based on various criteria; biological variation is probably the most sensible. Once determined, acceptable error can be used as limits in quality control rule systems. A key aspect of an organisation is leadership, which links the various components of the quality system. Leadership is difficult to characterise but its key aspects include trust, setting an example, developing staff and critically setting the vision for the organisation. Organisations also have internal characteristics such as the degree of formalisation, centralisation, and complexity. Medical organisations can have internal tensions because of the dichotomy between the bureaucratic and the shadow medical structures. PMID:18568046

  18. Using Social Media Data to Identify Potential Candidates for Drug Repurposing: A Feasibility Study.

    PubMed

    Rastegar-Mojarad, Majid; Liu, Hongfang; Nambisan, Priya

    2016-06-16

    Drug repurposing (defined as discovering new indications for existing drugs) could play a significant role in drug development, especially considering the declining success rates of developing novel drugs. Typically, new indications for existing medications are identified by accident. However, new technologies and a large number of available resources enable the development of systematic approaches to identify and validate drug-repurposing candidates. Patients today report their experiences with medications on social media and reveal side effects as well as beneficial effects of those medications. Our aim was to assess the feasibility of using patient reviews from social media to identify potential candidates for drug repurposing. We retrieved patient reviews of 180 medications from an online forum, WebMD. Using dictionary-based and machine learning approaches, we identified disease names in the reviews. Several publicly available resources were used to exclude comments containing known indications and adverse drug effects. After manually reviewing some of the remaining comments, we implemented a rule-based system to identify beneficial effects. The dictionary-based system and machine learning system identified 2178 and 6171 disease names respectively in 64,616 patient comments. We provided a list of 10 common patterns that patients used to report any beneficial effects or uses of medication. After manually reviewing the comments tagged by our rule-based system, we identified five potential drug repurposing candidates. To our knowledge, this is the first study to consider using social media data to identify drug-repurposing candidates. We found that even a rule-based system, with a limited number of rules, could identify beneficial effect mentions in patient comments. Our preliminary study shows that social media has the potential to be used in drug repurposing.

  19. Systems, methods and apparatus for verification of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Rash, James L. (Inventor); Gracinin, Denis (Inventor); Erickson, John D. (Inventor); Rouff, Christopher A. (Inventor); Hinchey, Michael G. (Inventor)

    2010-01-01

    Systems, methods and apparatus are provided through which in some embodiments, domain knowledge is translated into a knowledge-based system. In some embodiments, a formal specification is derived from rules of a knowledge-based system, the formal specification is analyzed, and flaws in the formal specification are used to identify and correct errors in the domain knowledge, from which a knowledge-based system is translated.

  20. Research of Uncertainty Reasoning in Pineapple Disease Identification System

    NASA Astrophysics Data System (ADS)

    Liu, Liqun; Fan, Haifeng

    In order to deal with the uncertainty of evidence commonly present in a pineapple disease identification system, a reasoning model based on evidence credibility factors was established. The uncertainty reasoning method is discussed, including uncertain representation of knowledge, uncertain representation of rules, uncertain representation of multiple evidences, and updating of reasoning rules. The reasoning can fully reflect the uncertainty in disease identification and reduce the influence of subjective factors on the accuracy of the system.
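    One common way to combine uncertain supporting evidence is the incremental certainty-factor update CF = CF1 + CF2*(1 - CF1); the sketch below uses it purely as an illustration, since the exact credibility-factor update used by the pineapple identification system is not specified here, and the observation values are invented.

    ```python
    # Illustrative combination of evidence credibility factors for one hypothesis.

    def combine(cf1, cf2):
        """Combine two positive credibility factors in [0, 1]."""
        return cf1 + cf2 * (1 - cf1)

    def hypothesis_credibility(evidence_cfs):
        cf = 0.0
        for e in evidence_cfs:
            cf = combine(cf, e)
        return cf

    # Three uncertain observations, each partially supporting the same disease hypothesis.
    print(round(hypothesis_credibility([0.6, 0.5, 0.3]), 3))   # 0.86
    ```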

  1. Merit-Based Incentive Payment System: Meaningful Changes in the Final Rule Brings Cautious Optimism.

    PubMed

    Manchikanti, Laxmaiah; Helm Ii, Standiford; Calodney, Aaron K; Hirsch, Joshua A

    2017-01-01

    The Medicare Access and CHIP Reauthorization Act of 2015 (MACRA) eliminated the flawed Sustainable Growth Rate (SGR) act formula - a longstanding crucial issue of concern for health care providers and Medicare beneficiaries. MACRA also included a quality improvement program entitled, "The Merit-Based Incentive Payment System, or MIPS." The proposed rule of MIPS sought to streamline existing federal quality efforts and therefore linked 4 distinct programs into one. Three existing programs, meaningful use (MU), the Physician Quality Reporting System (PQRS), and the value-based payment (VBP) system, were merged, with the addition of a Clinical Improvement Activity category. The proposed rule also changed the name of MU to Advancing Care Information, or ACI. ACI contributes 25% of the composite score of the four programs, PQRS contributes 50% of the composite score, while the VBP system, which deals with resource use or cost, contributes 10% of the composite score. The newest category, Improvement Activities or IA, contributes 15% to the composite score. The proposed rule also created what it called a design incentive that drives movement to delivery system reform principles with the inclusion of Advanced Alternative Payment Models (APMs). Following the release of the proposed rule, the medical community, as well as Congress, provided substantial input to the Centers for Medicare and Medicaid Services (CMS), expressing their concern. The American Society of Interventional Pain Physicians (ASIPP) focused on 3 important aspects: delay the implementation, provide a 3-month performance period, and provide the ability to submit meaningful quality measures in a timely and economic manner. The final rule accepted many of the comments from various organizations, including several of those specifically emphasized by ASIPP, with acceptance of a 3-month reporting period, as well as the ability to submit non-MIPS measures to improve real quality and make the system meaningful. CMS also provided a mechanism for physicians to avoid penalties for non-reporting with reporting of just a single patient. In summary, CMS has provided substantial flexibility with mechanisms to avoid penalties, reporting for 90 continuous days, increasing the low volume threshold, changing the reporting burden and data thresholds and, finally, coordination between performance categories. The final rule has made MIPS more meaningful with bonuses for exceptional performance, the ability to report for 90 days, and to report on 50% of the patients in 2017 and 60% of the patients in 2018. The final rule also reduced the quality measures to 6, including only one outcome or high priority measure with elimination of the cross-cutting measure requirement. In addition, the final rule reduced the burden of ACI, improved the coordination of performance, reduced the improvement activities burden from 60 points to 40 points, and finally improved coordination between performance categories. Multiple concerns remain regarding the reduction in scoring for quality improvement in future years, the increase in the proportion of MIPS scoring for resource use utilizing flawed, claims-based methodology, and the continuation of the disproportionate importance of ACI, an expensive program that can be onerous for providers and which in many ways has not lived up to its promise. Key words: Medicare Access and CHIP Reauthorization Act of 2015, merit-based incentive payment system, quality performance measures, resource use, improvement activities, advancing care information performance category.

  2. An Autonomous Flight Safety System

    DTIC Science & Technology

    2008-11-01

    are taken. AFSS can take vehicle navigation data from redundant onboard sensors and make flight termination decisions using software-based rules...implemented on redundant flight processors. By basing these decisions on actual Instantaneous Impact Predictions and by providing for an arbitrary...number of mission rules, it is the contention of the AFSS development team that the decision making process used by Missile Flight Control Officers

  3. Spatial Rule-Based Modeling: A Method and Its Application to the Human Mitotic Kinetochore

    PubMed Central

    Ibrahim, Bashar; Henze, Richard; Gruenert, Gerd; Egbert, Matthew; Huwald, Jan; Dittrich, Peter

    2013-01-01

    A common problem in the analysis of biological systems is the combinatorial explosion that emerges from the complexity of multi-protein assemblies. Conventional formalisms, like differential equations, Boolean networks and Bayesian networks, are unsuitable for dealing with the combinatorial explosion, because they are designed for a restricted state space with fixed dimensionality. To overcome this problem, the rule-based modeling language, BioNetGen, and the spatial extension, SRSim, have been developed. Here, we describe how to apply rule-based modeling to integrate experimental data from different sources into a single spatial simulation model and how to analyze the output of that model. The starting point for this approach can be a combination of molecular interaction data, reaction network data, proximities, binding and diffusion kinetics and molecular geometries at different levels of detail. We describe the technique and then use it to construct a model of the human mitotic inner and outer kinetochore, including the spindle assembly checkpoint signaling pathway. This allows us to demonstrate the utility of the procedure, show how a novel perspective for understanding such complex systems becomes accessible and elaborate on challenges that arise in the formulation, simulation and analysis of spatial rule-based models. PMID:24709796
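
    To illustrate the combinatorial point made above, the toy sketch below expands a single binding rule over a scaffold with several independent sites, showing how one rule implicitly generates many species and reactions that all inherit the same rate. This is plain Python for illustration only, not BioNetGen or SRSim syntax, and the site names and rate are invented.

```python
# Toy sketch (not BioNetGen/SRSim syntax): a single binding rule applied to a
# scaffold with several independent sites implicitly generates many reactions,
# illustrating the combinatorial explosion that rule-based modelling keeps implicit.
from itertools import product

SITES = ["s1", "s2", "s3"]          # hypothetical binding sites on one scaffold
RATE = 1e-3                          # every reaction generated by the rule inherits this rate

def scaffold_states(sites):
    """All occupancy states: each site is either free (0) or bound (1)."""
    return [dict(zip(sites, occ)) for occ in product([0, 1], repeat=len(sites))]

def reactions_from_binding_rule(sites):
    """Expand the rule 'free site + ligand -> bound site' over every context."""
    rxns = []
    for state in scaffold_states(sites):
        for s in sites:
            if state[s] == 0:                      # rule pattern: the site is free
                bound = dict(state, **{s: 1})      # rule action: the site becomes bound
                rxns.append((state, s, bound, RATE))
    return rxns

rxns = reactions_from_binding_rule(SITES)
print(len(scaffold_states(SITES)), "species,", len(rxns), "reactions from one rule")
# 8 species, 12 reactions from one rule
```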

  4. Statistical Properties of Cell Topology and Geometry in a Tissue-Growth Model

    NASA Astrophysics Data System (ADS)

    Sahlin, Patrik; Hamant, Olivier; Jönsson, Henrik

    Statistical properties of cell topologies in two-dimensional tissues have recently been suggested to be a consequence of cell divisions. Different rules for the positioning of new walls in plants have been proposed, where, e.g., Errera's rule states that new walls are added along the shortest possible path dividing the mother cell's volume into two equal parts. Here, we show that for an isotropically growing tissue Errera's rule results in the correct distributions of the number of cell neighbors as well as cellular geometries, in contrast to a random division rule. Further, we show that wall mechanics constrain the isotropic growth such that the resulting cell shape distributions agree more closely with experimental data extracted from the shoot apex of Arabidopsis thaliana.
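
    As a minimal illustration of the two division rules compared above, the sketch below contrasts Errera's rule with a random division rule for the special case of a rectangular mother cell, where the shortest area-halving wall is simply the one parallel to the longer side. The rectangle restriction and names are illustrative assumptions, not the tissue-growth model used in the paper.

```python
# Minimal sketch: Errera's rule vs. a random division rule for a rectangular cell.
# For a rectangle, any straight wall that halves the area and is parallel to a side
# has length equal to the other side, so the shortest halving wall is parallel to
# the longer side. Names and the rectangle restriction are illustrative only.
import random

def errera_division(width, height):
    """Return the orientation of the shortest area-halving wall."""
    # A vertical wall has length `height`, a horizontal wall has length `width`.
    return "vertical" if height < width else "horizontal"

def random_division(width, height, rng=random):
    """A null model: pick either halving wall orientation with equal probability."""
    return rng.choice(["vertical", "horizontal"])

print(errera_division(4.0, 1.0))   # 'vertical' (wall of length 1 splits the long axis)
print(random_division(4.0, 1.0))   # 'vertical' or 'horizontal'
```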

  5. Hierarchy of conversational rule violations involving utterance-based augmentative and alternative communication systems.

    PubMed

    Hoag, Linda A; Bedrosian, Jan L; McCoy, Kathleen F; Johnson, Dallas E

    2008-01-01

    This study examined the effects of using messages with conversational rule violations on attitudes toward people who used utterance-based augmentative and alternative communication (AAC) systems in transactional interactions. Specifically, the ratings were compared across messages with relevance, informativeness, and brevity violations, when latency remained constant (i.e., short). The 96 participating sales clerks viewed scripted, videotaped bookstore conversations and completed an attitude questionnaire. Results indicated that the prestored message with repeated words/phrases was rated the highest, followed by the message with excessive information; next was the message with inadequate information, followed by the message with partly relevant information. The findings may be useful to those using utterance-based systems when making message choices during interactions with service providers. Technological implications point to the development of schema/script-based systems and intelligent editing.

  6. Intelligent fault management for the Space Station active thermal control system

    NASA Technical Reports Server (NTRS)

    Hill, Tim; Faltisco, Robert M.

    1992-01-01

    The Thermal Advanced Automation Project (TAAP) approach and architecture is described for automating the Space Station Freedom (SSF) Active Thermal Control System (ATCS). The baseline functionality and advanced automation techniques for Fault Detection, Isolation, and Recovery (FDIR) will be compared and contrasted. Advanced automation techniques such as rule-based systems and model-based reasoning should be utilized to efficiently control, monitor, and diagnose this extremely complex physical system. TAAP is developing advanced FDIR software for use on the SSF thermal control system. The goal of TAAP is to join Knowledge-Based System (KBS) technology, using a combination of rules and model-based reasoning, with conventional monitoring and control software in order to maximize autonomy of the ATCS. TAAP's predecessor was NASA's Thermal Expert System (TEXSYS) project, which was the first large real-time expert system to use both extensive rules and model-based reasoning to control and perform FDIR on a large, complex physical system. TEXSYS showed that a method is needed for safely and inexpensively testing all possible faults of the ATCS, particularly those potentially damaging to the hardware, in order to develop a fully capable FDIR system. TAAP therefore includes the development of a high-fidelity simulation of the thermal control system. The simulation provides realistic, dynamic ATCS behavior and fault insertion capability for software testing without hardware-related risks or expense. In addition, thermal engineers will gain greater confidence in the KBS FDIR software than was possible prior to this kind of simulation testing. The TAAP KBS will initially be a ground-based extension of the baseline ATCS monitoring and control software and could be migrated on-board as additional computational resources are made available.

  7. Coreference analysis in clinical notes: a multi-pass sieve with alternate anaphora resolution modules

    PubMed Central

    Li, Dingcheng; Sohn, Sunghwan; Wu, Stephen Tze-Inn; Wagholikar, Kavishwar; Torii, Manabu; Liu, Hongfang

    2012-01-01

    Objective This paper describes the coreference resolution system submitted by Mayo Clinic for the 2011 i2b2/VA/Cincinnati shared task Track 1C. The goal of the task was to construct a system that links the markables corresponding to the same entity. Materials and methods The task organizers provided progress notes and discharge summaries that were annotated with the markables of treatment, problem, test, person, and pronoun. We used a multi-pass sieve algorithm that applies deterministic rules in the order of preciseness and simultaneously gathers information about the entities in the documents. Our system, MedCoref, also uses a state-of-the-art machine learning framework as an alternative to the final, rule-based pronoun resolution sieve. Results The best system that uses a multi-pass sieve has an overall score of 0.836 (average of B3, MUC, Blanc, and CEAF F score) for the training set and 0.843 for the test set. Discussion A supervised machine learning system that typically uses a single function to find coreferents cannot accommodate irregularities encountered in data especially given the insufficient number of examples. On the other hand, a completely deterministic system could lead to a decrease in recall (sensitivity) when the rules are not exhaustive. The sieve-based framework allows one to combine reliable machine learning components with rules designed by experts. Conclusion Using relatively simple rules, part-of-speech information, and semantic type properties, an effective coreference resolution system could be designed. The source code of the system described is available at https://sourceforge.net/projects/ohnlp/files/MedCoref. PMID:22707745
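
    The core idea of the multi-pass sieve is sketched below: deterministic passes are applied from most to least precise, and each pass may merge markables into the entity clusters built by earlier passes. The sieve functions and data structures here are hypothetical placeholders for illustration, not the MedCoref implementation.

```python
# Skeletal sketch of a multi-pass sieve (not the MedCoref implementation):
# deterministic passes are applied from most to least precise, and each pass may
# merge markables into the clusters built by earlier passes.

def exact_match_sieve(m1, m2):
    """Highest-precision pass: identical surface strings corefer."""
    return m1["text"].lower() == m2["text"].lower()

def head_match_sieve(m1, m2):
    """Lower-precision pass: markables sharing a head word corefer."""
    return m1["head"].lower() == m2["head"].lower()

SIEVES = [exact_match_sieve, head_match_sieve]   # ordered by decreasing preciseness

def resolve(markables):
    clusters = [{i} for i in range(len(markables))]   # start with singleton clusters
    for sieve in SIEVES:
        for i, m1 in enumerate(markables):
            for j in range(i):
                if sieve(markables[j], m1):
                    a = next(c for c in clusters if i in c)
                    b = next(c for c in clusters if j in c)
                    if a is not b:                    # merge the two entity clusters
                        a |= b
                        clusters.remove(b)
                    break
    return clusters

docs = [{"text": "the chest x-ray", "head": "x-ray"},
        {"text": "The chest X-ray", "head": "x-ray"},
        {"text": "his pain", "head": "pain"}]
print(resolve(docs))   # [{0, 1}, {2}]
```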

  8. iRODS: A Distributed Data Management Cyberinfrastructure for Observatories

    NASA Astrophysics Data System (ADS)

    Rajasekar, A.; Moore, R.; Vernon, F.

    2007-12-01

    Large-scale and long-term preservation of both observational and synthesized data requires a system that virtualizes data management concepts. A methodology is needed that can work across long distances in space (distribution) and long periods in time (preservation). The system needs to manage data stored on multiple types of storage systems, including new systems that become available in the future. This concept is called infrastructure independence, and is typically implemented through virtualization mechanisms. Data grids are built upon concepts of data and trust virtualization. These concepts enable the management of collections of data that are distributed across multiple institutions, stored on multiple types of storage systems, and accessed by multiple types of clients. Data virtualization ensures that the name spaces used to identify files, users, and storage systems are persistent, even when files are migrated onto future technology. This is required to preserve authenticity, the link between the record and descriptive and provenance metadata. Trust virtualization ensures that access controls remain invariant as files are moved within the data grid. This is required to track the chain of custody of records over time. The Storage Resource Broker (http://www.sdsc.edu/srb) is one such data grid used in a wide variety of applications in earth and space sciences such as ROADNet (roadnet.ucsd.edu), SEEK (seek.ecoinformatics.org), GEON (www.geongrid.org) and NOAO (www.noao.edu). Recent extensions to data grids provide one more level of virtualization - policy or management virtualization. Management virtualization ensures that execution of management policies can be automated, and that rules can be created that verify assertions about the shared collections of data. When dealing with distributed large-scale data over long periods of time, the policies used to manage the data and provide assurances about the authenticity of the data become paramount. The integrated Rule-Oriented Data System (iRODS) (http://irods.sdsc.edu) provides the mechanisms needed to describe not only management policies, but also to track how the policies are applied and their execution results. The iRODS data grid maps management policies to rules that control the execution of the remote micro-services. As an example, a rule can be created that automatically creates a replica whenever a file is added to a specific collection, or extracts its metadata automatically and registers it in a searchable catalog. For the replication operation, the persistent state information consists of the replica location, the creation date, the owner, the replica size, etc. The mechanism used by iRODS for providing policy virtualization is based on well-defined functions, called micro-services, which are chained into alternative workflows using rules. A rule engine, based on the event-condition-action paradigm, executes the rule-based workflows after an event. Rules can be deferred to a pre-determined time or executed on a periodic basis. As the data management policies evolve, the iRODS system can implement new rules, new micro-services, and new state information (metadata content) needed to manage the new policies. Each sub-collection can be managed using a different set of policies. The discussion of the concepts in rule-based policy virtualization and its application to long-term and large-scale data management for observatories such as ORION and NEON will be the basis of the paper.
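
    To make the event-condition-action pattern described above concrete, the sketch below fires a chain of hypothetical micro-services (replication, metadata registration) whenever a file is added to a given collection. This is plain Python for illustration; the micro-service names and event structure are invented placeholders, not the iRODS rule language.

```python
# Sketch of the event-condition-action pattern described above (illustrative only;
# this is plain Python, not the iRODS rule language, and the micro-service names
# are invented placeholders).

def replicate(event, state):
    """Hypothetical micro-service: record a second replica for the new file."""
    state.setdefault("replicas", {}).setdefault(event["path"], []).append("resc2")

def extract_metadata(event, state):
    """Hypothetical micro-service: register searchable metadata for the new file."""
    state.setdefault("catalog", {})[event["path"]] = {"owner": event["owner"]}

RULES = [
    # (event type, condition on the event, chain of micro-services to run)
    ("put", lambda e: e["path"].startswith("/obs/seismic/"), [replicate, extract_metadata]),
]

def rule_engine(event, state):
    for ev_type, condition, actions in RULES:
        if event["type"] == ev_type and condition(event):
            for action in actions:          # execute the micro-service workflow
                action(event, state)

state = {}
rule_engine({"type": "put", "path": "/obs/seismic/day001.mseed", "owner": "fvernon"}, state)
print(state)
```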

  9. Health-Mining: a Disease Management Support Service based on Data Mining and Rule Extraction.

    PubMed

    Bei, Andrea; De Luca, Stefano; Ruscitti, Giancarlo; Salamon, Diego

    2005-01-01

    Disease management is the collection of processes aimed at controlling health care and improving quality while reducing the overall cost of procedures. Our system, Health-Mining, is a Decision Support System with the objective of controlling the adequacy of hospitalization and therapies, determining the effective use of standard guidelines, and eventually identifying better ones that emerge from medical practice (Evidence Based Medicine). In realizing the system, our aim is to create a path toward the construction of admission-appropriateness criteria that are valid at an international level. A main goal of the project is rule extraction and the identification of the rules that are adequate in terms of efficacy, quality and cost reduction, especially in view of fast-changing technologies and medicines. We tested Health-Mining in a real test case for an Italian region, Regione Veneto, on the installation of pacemakers and ICDs.

  10. General Rule of Negative Effective Ueff System & Materials Design of High-Tc Superconductors by ab initio Calculations

    NASA Astrophysics Data System (ADS)

    Katayama-Yoshida, Hiroshi; Nakanishi, Akitaka; Uede, Hiroki; Takawashi, Yuki; Fukushima, Tetsuya; Sato, Kazunori

    2014-03-01

    Based upon ab initio electronic structure calculation, I will discuss the general rule of negative effective U system by (1) exchange-correlation-induced negative effective U caused by the stability of the exchange-correlation energy in Hund's rule with high-spin ground states of d5 configuration, and (2) charge-excitation-induced negative effective U caused by the stability of chemical bond in the closed-shell of s2, p6, and d10 configurations. I will show the calculated results of negative effective U systems such as hole-doped CuAlO2 and CuFeS2. Based on the total energy calculations of antiferromagnetic and ferromagnetic states, I will discuss the magnetic phase diagram and superconductivity upon hole doping. I also discuss the computational materials design method of high-Tc superconductors by ab initio calculation to go beyond LDA and multi-scale simulations.

  11. Learning and tuning fuzzy logic controllers through reinforcements.

    PubMed

    Berenji, H R; Khedkar, P

    1992-01-01

    A method for learning and tuning a fuzzy logic controller based on reinforcements from a dynamic system is presented. It is shown that the generalized approximate-reasoning-based intelligent control (GARIC) architecture learns and tunes a fuzzy logic controller even when only weak reinforcement, such as a binary failure signal, is available; it introduces a new conjunction operator in computing the rule strengths of fuzzy control rules; it introduces a new localized mean of maximum (LMOM) method in combining the conclusions of several firing control rules; and it learns to produce real-valued control actions. Learning is achieved by integrating fuzzy inference into a feedforward network, which can then adaptively improve performance by using gradient descent methods. The GARIC architecture is applied to a cart-pole balancing system and demonstrates significant improvements over previous schemes for cart-pole balancing in terms of the speed of learning and robustness to changes in the dynamic system's parameters.

  12. Syntactic sequencing in Hebbian cell assemblies.

    PubMed

    Wennekers, Thomas; Palm, Günther

    2009-12-01

    Hebbian cell assemblies provide a theoretical framework for the modeling of cognitive processes that grounds them in the underlying physiological neural circuits. Recently we have presented an extension of cell assemblies by operational components which makes it possible to model aspects of language, rules, and complex behaviour. In the present work we study the generation of syntactic sequences using operational cell assemblies timed by unspecific trigger signals. Syntactic patterns are implemented in terms of hetero-associative transition graphs in attractor networks which cause a directed flow of activity through the neural state space. We provide regimes for parameters that enable an unspecific excitatory control signal to switch reliably between attractors in accordance with the implemented syntactic rules. If several target attractors are possible in a given state, noise in the system in conjunction with a winner-takes-all mechanism can randomly choose a target. Disambiguation can also be guided by context signals or specific additional external signals. Given a permanently elevated level of external excitation, the model can enter an autonomous mode, where it generates temporal grammatical patterns continuously.

  13. Expert systems for automated maintenance of a Mars oxygen production system

    NASA Astrophysics Data System (ADS)

    Huang, Jen-Kuang; Ho, Ming-Tsang; Ash, Robert L.

    1992-08-01

    Application of expert system concepts to a breadboard Mars oxygen processor unit has been studied and tested. The research was directed toward developing the methodology required to enable autonomous operation and control of these simple chemical processors at Mars. Failure detection and isolation was the key area of concern, and schemes using forward chaining, backward chaining, knowledge-based expert systems, and rule-based expert systems were examined. Tests and simulations were conducted that investigated self-health checkout, emergency shutdown, and fault detection, in addition to normal control activities. A dynamic system model was developed using the Bond-Graph technique. The dynamic model agreed well with tests involving sudden reductions in throughput. However, nonlinear effects were observed during tests that incorporated step-function increases in flow variables. Computer simulations and experiments have demonstrated the feasibility of expert systems utilizing rule-based diagnosis and decision-making algorithms.

  14. A living mesoscopic cellular automaton made of skin scales.

    PubMed

    Manukyan, Liana; Montandon, Sophie A; Fofonjka, Anamarija; Smirnov, Stanislav; Milinkovitch, Michel C

    2017-04-12

    In vertebrates, skin colour patterns emerge from nonlinear dynamical microscopic systems of cell interactions. Here we show that in ocellated lizards a quasi-hexagonal lattice of skin scales, rather than individual chromatophore cells, establishes a green and black labyrinthine pattern of skin colour. We analysed time series of lizard scale colour dynamics over four years of their development and demonstrate that this pattern is produced by a cellular automaton (a grid of elements whose states are iterated according to a set of rules based on the states of neighbouring elements) that dynamically computes the colour states of individual mesoscopic skin scales to produce the corresponding macroscopic colour pattern. Using numerical simulations and mathematical derivation, we identify how a discrete von Neumann cellular automaton emerges from a continuous Turing reaction-diffusion system. Skin thickness variation generated by three-dimensional morphogenesis of skin scales causes the underlying reaction-diffusion dynamics to separate into microscopic and mesoscopic spatial scales, the latter generating a cellular automaton. Our study indicates that cellular automata are not merely abstract computational systems, but can directly correspond to processes generated by biological evolution.
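
    As a toy companion to the mechanism described above, the sketch below iterates a discrete cellular automaton on a small hexagonal lattice in which each scale flips colour depending on how many of its six neighbours share its colour. The lattice size, periodic wrap, and flipping threshold are invented for illustration; they are not the probabilistic rule the authors derive from the underlying reaction-diffusion model.

```python
# Toy cellular automaton on a small hexagonal lattice (axial coordinates).
# The update rule and threshold below are invented for illustration; they are not
# the rule derived in the paper, which the authors obtain from a reaction-diffusion model.
import random

HEX_DIRS = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]   # 6 hex neighbours
SIZE = 8

def neighbours(q, r):
    return [((q + dq) % SIZE, (r + dr) % SIZE) for dq, dr in HEX_DIRS]   # periodic wrap

def step(grid):
    """Synchronous update: a scale flips colour if too many neighbours share its colour."""
    new = {}
    for cell, colour in grid.items():
        same = sum(grid[n] == colour for n in neighbours(*cell))
        new[cell] = 1 - colour if same >= 4 else colour    # hypothetical threshold
    return new

random.seed(0)
grid = {(q, r): random.randint(0, 1) for q in range(SIZE) for r in range(SIZE)}
for _ in range(20):
    grid = step(grid)
print(sum(grid.values()), "of", SIZE * SIZE, "scales in state 1 after 20 steps")
```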

  15. A living mesoscopic cellular automaton made of skin scales

    NASA Astrophysics Data System (ADS)

    Manukyan, Liana; Montandon, Sophie A.; Fofonjka, Anamarija; Smirnov, Stanislav; Milinkovitch, Michel C.

    2017-04-01

    In vertebrates, skin colour patterns emerge from nonlinear dynamical microscopic systems of cell interactions. Here we show that in ocellated lizards a quasi-hexagonal lattice of skin scales, rather than individual chromatophore cells, establishes a green and black labyrinthine pattern of skin colour. We analysed time series of lizard scale colour dynamics over four years of their development and demonstrate that this pattern is produced by a cellular automaton (a grid of elements whose states are iterated according to a set of rules based on the states of neighbouring elements) that dynamically computes the colour states of individual mesoscopic skin scales to produce the corresponding macroscopic colour pattern. Using numerical simulations and mathematical derivation, we identify how a discrete von Neumann cellular automaton emerges from a continuous Turing reaction-diffusion system. Skin thickness variation generated by three-dimensional morphogenesis of skin scales causes the underlying reaction-diffusion dynamics to separate into microscopic and mesoscopic spatial scales, the latter generating a cellular automaton. Our study indicates that cellular automata are not merely abstract computational systems, but can directly correspond to processes generated by biological evolution.

  16. Overcoming rule-based rigidity and connectionist limitations through massively-parallel case-based reasoning

    NASA Technical Reports Server (NTRS)

    Barnden, John; Srinivas, Kankanahalli

    1990-01-01

    Symbol manipulation as used in traditional Artificial Intelligence has been criticized by neural net researchers for being excessively inflexible and sequential. On the other hand, the application of neural net techniques to the types of high-level cognitive processing studied in traditional artificial intelligence presents major problems as well. A promising way out of this impasse is to build neural net models that accomplish massively parallel case-based reasoning. Case-based reasoning, which has received much attention recently, is essentially the same as analogy-based reasoning, and avoids many of the problems leveled at traditional artificial intelligence. Further problems are avoided by doing many strands of case-based reasoning in parallel, and by implementing the whole system as a neural net. In addition, such a system provides an approach to some aspects of the problems of noise, uncertainty and novelty in reasoning systems. The current neural net system (Conposit), which performs standard rule-based reasoning, is being modified into a massively parallel case-based reasoning version.

  17. Preparation of name and address data for record linkage using hidden Markov models

    PubMed Central

    Churches, Tim; Christen, Peter; Lim, Kim; Zhu, Justin Xi

    2002-01-01

    Background Record linkage refers to the process of joining records that relate to the same entity or event in one or more data collections. In the absence of a shared, unique key, record linkage involves the comparison of ensembles of partially-identifying, non-unique data items between pairs of records. Data items with variable formats, such as names and addresses, need to be transformed and normalised in order to validly carry out these comparisons. Traditionally, deterministic rule-based data processing systems have been used to carry out this pre-processing, which is commonly referred to as "standardisation". This paper describes an alternative approach to standardisation, using a combination of lexicon-based tokenisation and probabilistic hidden Markov models (HMMs). Methods HMMs were trained to standardise typical Australian name and address data drawn from a range of health data collections. The accuracy of the results was compared to that produced by rule-based systems. Results Training of HMMs was found to be quick and did not require any specialised skills. For addresses, HMMs produced equal or better standardisation accuracy than a widely-used rule-based system. However, accuracy was worse when used with simpler name data. Possible reasons for this poorer performance are discussed. Conclusion Lexicon-based tokenisation and HMMs provide a viable and effort-effective alternative to rule-based systems for pre-processing more complex variably formatted data such as addresses. Further work is required to improve the performance of this approach with simpler data such as names. Software which implements the methods described in this paper is freely available under an open source license for other researchers to use and improve. PMID:12482326
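
    To illustrate the HMM standardisation idea, the compact Viterbi decoder below assigns hidden states (house number, street name, street type) to lexicon-derived observation classes of an address. The states, observation classes, and probabilities are made up for the example; they are not the trained models described in the paper.

```python
# Compact Viterbi decoder illustrating HMM-based address standardisation.
# States, observation classes, and probabilities below are invented for the example;
# the paper's HMMs are trained on real Australian health data.

STATES = ["house_number", "street_name", "street_type"]
START  = {"house_number": 0.8, "street_name": 0.15, "street_type": 0.05}
TRANS  = {"house_number": {"house_number": 0.1, "street_name": 0.85, "street_type": 0.05},
          "street_name":  {"house_number": 0.05, "street_name": 0.45, "street_type": 0.5},
          "street_type":  {"house_number": 0.3, "street_name": 0.4, "street_type": 0.3}}
EMIT   = {"house_number": {"NUMBER": 0.9, "WORD": 0.05, "TYPEWORD": 0.05},
          "street_name":  {"NUMBER": 0.05, "WORD": 0.9, "TYPEWORD": 0.05},
          "street_type":  {"NUMBER": 0.05, "WORD": 0.15, "TYPEWORD": 0.8}}

def viterbi(observations):
    """Return the most probable state sequence for a list of observation classes."""
    trellis = [{s: (START[s] * EMIT[s][observations[0]], [s]) for s in STATES}]
    for obs in observations[1:]:
        column = {}
        for s in STATES:
            prob, path = max(
                (trellis[-1][p][0] * TRANS[p][s] * EMIT[s][obs], trellis[-1][p][1] + [s])
                for p in STATES)
            column[s] = (prob, path)
        trellis.append(column)
    return max(trellis[-1].values())[1]

# "42 Wallaby Way" tokenised by a lexicon into observation classes:
print(viterbi(["NUMBER", "WORD", "TYPEWORD"]))   # ['house_number', 'street_name', 'street_type']
```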

  18. (abstract) Scaling Nominal Solar Cell Impedances for Array Design

    NASA Technical Reports Server (NTRS)

    Mueller, Robert L; Wallace, Matthew T.; Iles, Peter

    1994-01-01

    This paper discusses a task whose objective is to characterize solar cell array AC impedance and develop scaling rules for impedance characterization of large arrays by testing single solar cells and small arrays. This effort is aimed at formulating a methodology for estimating the AC impedance of the Mars Pathfinder (MPF) cruise and lander solar arrays based upon testing single cells and small solar cell arrays, and at creating a basis for the design of a single shunt limiter for MPF power control of flight solar arrays having very different impedances.
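
    As a first-order point of reference, a linear series/parallel network scales impedance as sketched below: cells in series add, and identical parallel strings combine as the reciprocal of summed reciprocals. This generic scaling sketch is an assumption for illustration, not the MPF-specific methodology developed in the abstract above; the example cell impedance is invented.

```python
# First-order scaling sketch for a linear array model (illustrative only; not the
# Mars Pathfinder-specific methodology): cells in series add impedances, and
# parallel strings combine as the reciprocal of summed reciprocals.

def string_impedance(z_cell: complex, n_series: int) -> complex:
    """Impedance of one string of n_series identical cells."""
    return n_series * z_cell

def array_impedance(z_cell: complex, n_series: int, n_parallel: int) -> complex:
    """Impedance of n_parallel identical strings connected in parallel."""
    z_string = string_impedance(z_cell, n_series)
    return 1.0 / sum(1.0 / z_string for _ in range(n_parallel))

# Example: a hypothetical cell measured as 2 + 0.5j ohms at some test frequency.
z = array_impedance(2 + 0.5j, n_series=36, n_parallel=4)
print(z)   # (18+4.5j): 36 cells in series, scaled down by 4 parallel strings
```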

  19. A probabilistic method to diagnose faults of air handling units

    NASA Astrophysics Data System (ADS)

    Dey, Debashis

    The air handling unit (AHU) is one of the most extensively used pieces of equipment in large commercial buildings. This device is typically customized and lacks quality system integration, which can result in hardwire failures and controller errors. Air handling unit Performance Assessment Rules (APAR) is a fault detection tool that uses a set of expert rules derived from mass and energy balances to detect faults in air handling units. APAR is computationally simple enough that it can be embedded in commercial building automation and control systems and relies only upon sensor data and control signals that are commonly available in these systems. Although APAR has many advantages over other methods, for example requiring no training data and being easy to implement commercially, most of the time it is unable to provide a diagnosis of the faults. For instance, a fault on a temperature sensor could be a fixed bias, a drifting bias, an inappropriate location, or a complete failure. Also, a fault in the mixing box can be a return or outdoor damper that leaks or is stuck. In addition, when multiple rules are satisfied the list of candidate faults grows. There is no proper way to obtain the correct diagnosis from a rule-based fault detection system alone. To overcome this limitation, we propose the Bayesian Belief Network (BBN) as a diagnostic tool. A BBN can be used to simulate the diagnostic thinking of FDD experts in a probabilistic way. In this study we developed a new way to detect and diagnose faults in AHUs by combining APAR rules and a Bayesian Belief Network. The Bayesian Belief Network is used as a decision support tool for the rule-based expert system. A BBN is highly capable of prioritizing faults when multiple rules are satisfied simultaneously. It can also use information from previous AHU operating conditions and maintenance records to provide a proper diagnosis. The proposed model is validated with real-time measured data from a campus building at the University of Texas at San Antonio (UTSA). The results show that the BBN is able to correctly prioritize faults, which can be verified by manual investigation.
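
    The sketch below shows, in miniature, how a single APAR-style energy-balance check can feed a simple Bayesian update over candidate faults so that faults are prioritized rather than merely listed. The rule, fault list, and probabilities are invented placeholders for illustration, not the actual APAR rule set or the BBN developed in this work.

```python
# Hedged sketch: one APAR-style energy-balance check feeding a naive Bayes update
# over candidate faults. The rule, fault list, and probabilities are invented
# placeholders, not the actual APAR rule set or the BBN developed in this work.

def rule_mixed_air(t_supply, t_mixed, tolerance=1.5):
    """APAR-style check: flag a fault if supply air is much warmer than mixed air
    while the unit is assumed to be in mechanical cooling."""
    return t_supply > t_mixed + tolerance          # True means the rule fires (fault flag)

PRIOR = {"cooling_coil_valve_stuck": 0.2, "temp_sensor_bias": 0.3, "damper_stuck": 0.5}
# P(rule fires | fault) -- assumed conditional probabilities for illustration.
LIKELIHOOD = {"cooling_coil_valve_stuck": 0.9, "temp_sensor_bias": 0.7, "damper_stuck": 0.2}

def posterior(rule_fired):
    unnorm = {f: PRIOR[f] * (LIKELIHOOD[f] if rule_fired else 1 - LIKELIHOOD[f])
              for f in PRIOR}
    z = sum(unnorm.values())
    return {f: p / z for f, p in unnorm.items()}

fired = rule_mixed_air(t_supply=18.0, t_mixed=15.0)
for fault, p in sorted(posterior(fired).items(), key=lambda kv: -kv[1]):
    print(f"{fault}: {p:.2f}")
```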

  20. Toward micro-scale spatial modeling of gentrification

    NASA Astrophysics Data System (ADS)

    O'Sullivan, David

    A simple preliminary model of gentrification is presented. The model is based on an irregular cellular automaton architecture drawing on the concept of proximal space, which is well suited to the spatial externalities present in housing markets at the local scale. The rent gap hypothesis on which the model's cell transition rules are based is discussed. The model's transition rules are described in detail. Practical difficulties in configuring and initializing the model are described and its typical behavior reported. Prospects for further development of the model are discussed. The current model structure, while inadequate, is well suited to further elaboration and the incorporation of other interesting and relevant effects.

  1. Teaching artificial neural systems to drive: Manual training techniques for autonomous systems

    NASA Technical Reports Server (NTRS)

    Shepanski, J. F.; Macy, S. A.

    1987-01-01

    A methodology was developed for manually training autonomous control systems based on artificial neural systems (ANS). In applications where the rule set governing an expert's decisions is difficult to formulate, ANS can be used to extract rules by associating the information an expert receives with the actions taken. Properly constructed networks imitate rules of behavior that permit them to function autonomously when they are trained on the spanning set of possible situations. This training can be provided manually, either under the direct supervision of a system trainer, or indirectly using a background mode in which the network assimilates training data as the expert performs its day-to-day tasks. To demonstrate these methods, an ANS network was trained to drive a vehicle through simulated freeway traffic.

  2. An application of object-oriented knowledge representation to engineering expert systems

    NASA Technical Reports Server (NTRS)

    Logie, D. S.; Kamil, H.; Umaretiya, J. R.

    1990-01-01

    The paper describes an object-oriented knowledge representation and its application to engineering expert systems. The object-oriented approach promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects and organized by defining relationships between the objects. An Object Representation Language (ORL) was implemented as a tool for building and manipulating the object base. Rule-based knowledge representation is then used to simulate engineering design reasoning. Using a common object base, very large expert systems can be developed, comprised of small, individually processed, rule sets. The integration of these two schemes makes it easier to develop practical engineering expert systems. The general approach to applying this technology to the domain of the finite element analysis, design, and optimization of aerospace structures is discussed.

  3. Microcomputer-based classification of environmental data in municipal areas

    NASA Astrophysics Data System (ADS)

    Thiergärtner, H.

    1995-10-01

    Multivariate data-processing methods used in mineral resource identification can be used to classify urban regions. Using elements of expert systems, geographical information systems, as well as known classification and prognosis systems, it is possible to outline a single model that consists of resistant and temporary parts of a knowledge base, including graphical input and output treatment, and of resistant and temporary elements of a bank of methods and algorithms. Whereas decision rules created by experts will be stored in expert systems directly, powerful classification rules in the form of resistant but latent (implicit) decision algorithms may be implemented in the suggested model. The latent functions will be transformed into temporary explicit decision rules by learning processes depending on the actual task(s), parameter set(s), pixel selection(s), and expert control(s). This takes place in both supervised and nonsupervised classification of multivariately described pixel sets representing municipal subareas. The model is outlined briefly and illustrated by results obtained in a target area covering a part of the city of Berlin (Germany).

  4. A CLIPS-based tool for aircraft pilot-vehicle interface design

    NASA Technical Reports Server (NTRS)

    Fowler, Thomas D.; Rogers, Steven P.

    1991-01-01

    The Pilot-Vehicle Interface of modern aircraft is the cognitive, sensory, and psychomotor link between the pilot, the avionics modules, and all other systems on board the aircraft. To assist pilot-vehicle interface designers, a C Language Integrated Production System (CLIPS) based tool was developed that allows design information to be stored in a table that can be modified by rules representing design knowledge. Developed for the Apple Macintosh, the tool allows users without any CLIPS programming experience to form simple rules using a point and click interface.

  5. How to select combination operators for fuzzy expert systems using CRI

    NASA Technical Reports Server (NTRS)

    Turksen, I. B.; Tian, Y.

    1992-01-01

    A method to select combination operators for fuzzy expert systems using the Compositional Rule of Inference (CRI) is proposed. First, fuzzy inference processes based on CRI are classified into three categories in terms of their inference results: the Expansion Type Inference, the Reduction Type Inference, and Other Type Inferences. Further, implication operators under Sup-T composition are classified as the Expansion Type Operator, the Reduction Type Operator, and the Other Type Operators. Finally, the combination of rules or their consequences is investigated for inference processes based on CRI.
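
    The Compositional Rule of Inference referred to above computes the inferred consequent as B'(y) = sup_x T(A'(x), R(x, y)). The small numerical sketch below evaluates this with the min t-norm and a Mamdani-style relation; the membership values are illustrative, and the sketch demonstrates only the CRI itself, not the operator classification proposed in the paper.

```python
# Small numerical sketch of the Compositional Rule of Inference (CRI):
# B'(y) = sup_x T(A'(x), R(x, y)), here with the min t-norm and a Mamdani-style
# relation R(x, y) = min(A(x), B(y)). The membership values are illustrative.
import numpy as np

A  = np.array([0.0, 0.5, 1.0, 0.5, 0.0])    # antecedent fuzzy set on X
B  = np.array([0.2, 0.8, 1.0, 0.4])         # consequent fuzzy set on Y
Ap = np.array([0.0, 0.3, 0.9, 0.6, 0.1])    # observed input A' on X

R = np.minimum.outer(A, B)                   # rule relation R(x, y), shape (|X|, |Y|)

# Sup-min composition: for each y, take the max over x of min(A'(x), R(x, y)).
Bp = np.max(np.minimum(Ap[:, None], R), axis=0)
print(Bp)    # inferred consequent B' on Y
```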

  6. Clarifying the paradigm for the ethics of donation and transplantation: Was 'dead' really so clear before organ donation?

    PubMed Central

    Shemie, Sam D

    2007-01-01

    Recent commentaries by Verheijde et al, Evans and Potts suggesting that donation after cardiac death practices routinely violate the dead donor rule are based on flawed presumptions. Cell biology, cardiopulmonary resuscitation, critical care life support technologies, donation and transplantation continue to inform concepts of life and death. The impact of oxygen deprivation to cells, organs and the brain is discussed in relation to death as a biological transition. In the face of advancing organ support and replacement technologies, the reversibility of cardiac arrest is now purely related to the context in which it occurs, in association with the availability and application of support systems to maintain oxygenated circulation. The 'complete and irreversible' lexicon commonly used in death discussions and legal statutes is ambiguous, indefinable and should be replaced by accurate terms. Criticism of controlled DCD on the basis of violating the dead donor rule is not valid in the setting where life support is withdrawn, CPR is not provided, and autoresuscitation has not been described beyond 2 minutes. However, any post mortem intervention that re-establishes brain blood flow should be prohibited. In comparison to traditional practice, organ donation has forced the clarification of the diagnostic criteria for death and improved the rigour of the determinations.

  7. A machine independent expert system for diagnosing environmentally induced spacecraft anomalies

    NASA Technical Reports Server (NTRS)

    Rolincik, Mark J.

    1991-01-01

    A new rule-based, machine-independent analytical tool for diagnosing spacecraft anomalies, the EnviroNET expert system, was developed. Expert systems provide an effective method for storing knowledge, allow computers to sift through large amounts of data pinpointing significant parts, and, most importantly, use heuristics in addition to algorithms, which allows approximate reasoning and inference and the ability to attack problems that are not rigidly defined. The EnviroNET expert system knowledge base currently contains over two hundred rules, and links to databases which include past environmental data, satellite data, and previously known anomalies. The environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose.

  8. User Guide for Unmanned Aerial System (UAS) Operations on the National Ranges

    DTIC Science & Technology

    2007-11-01

    WARFARE CENTER WEAPONS DIVISION, PT. MUGU; NAVAL AIR WARFARE CENTER WEAPONS DIVISION, CHINA LAKE; NAVAL AIR WARFARE CENTER AIRCRAFT DIVISION, PATUXENT... Acronyms: IFR (Instrument Flight Rules); MRTFB (Major Range and Test Facility Base); NAS (National Airspace System); NM (nautical mile); NTIA (National...). ...sectional charts, Instrument Flight Rules (IFR) enroute charts, and terminal area charts. The floor and ceiling, operating hours, and controlling

  9. Ion Channel ElectroPhysiology Ontology (ICEPO) - a case study of text mining assisted ontology development.

    PubMed

    Elayavilli, Ravikumar Komandur; Liu, Hongfang

    2016-01-01

    Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source of quantitative information. Gathering quantitative parameters and values from biomedical text is one significant challenge in the early steps of computational modeling, as it involves huge manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain impedes the normalization of textual extractions to a standard representation. This may render textual extractions less meaningful to domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining into a formal representation that may help in constructing an ontology for ion channel events, using a rule-based approach. We have developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies, such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), and the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting the quantitative data assertions on an independently annotated blind data set. We further made an initial attempt at formalizing the quantitative data assertions extracted from the biomedical text into a formal representation that offers the potential to facilitate the integration of text mining into the ontological workflow, a novel aspect of this study. This work is a case study in which we created a platform that provides formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from the biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
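
    A toy flavour of rule-based extraction of a quantitative electrophysiology assertion is sketched below: a regular-expression rule pulls millivolt values out of a sentence and normalizes them into (parameter, value, unit) triples. The pattern, example sentence, and output structure are invented for illustration and are not the extraction rules actually used to build ICEPO.

```python
# Toy sketch of rule-based extraction of a quantitative electrophysiology assertion.
# The regular expression and the output structure are invented for illustration and
# are not the extraction rules actually used to build ICEPO.
import re

SENTENCE = ("The half-activation voltage of the mutant channel was -32.5 mV, "
            "compared with -45 mV for wild type.")

# Pattern: a signed decimal number immediately followed by a millivolt unit.
VALUE_MV = re.compile(r"(-?\d+(?:\.\d+)?)\s*mV")

def extract_mv_assertions(sentence, parameter="half-activation voltage"):
    """Return normalized (parameter, value, unit) triples found in the sentence."""
    return [{"parameter": parameter, "value": float(v), "unit": "mV"}
            for v in VALUE_MV.findall(sentence)]

for assertion in extract_mv_assertions(SENTENCE):
    print(assertion)
# {'parameter': 'half-activation voltage', 'value': -32.5, 'unit': 'mV'}
# {'parameter': 'half-activation voltage', 'value': -45.0, 'unit': 'mV'}
```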

  10. The Balance-Scale Task Revisited: A Comparison of Statistical Models for Rule-Based and Information-Integration Theories of Proportional Reasoning

    PubMed Central

    Hofman, Abe D.; Visser, Ingmar; Jansen, Brenda R. J.; van der Maas, Han L. J.

    2015-01-01

    We propose and test three statistical models for the analysis of children’s responses to the balance scale task, a seminal task to study proportional reasoning. We use a latent class modelling approach to formulate a rule-based latent class model (RB LCM) following from a rule-based perspective on proportional reasoning and a new statistical model, the Weighted Sum Model, following from an information-integration approach. Moreover, a hybrid LCM using item covariates is proposed, combining aspects of both a rule-based and information-integration perspective. These models are applied to two different datasets, a standard paper-and-pencil test dataset (N = 779), and a dataset collected within an online learning environment that included direct feedback, time-pressure, and a reward system (N = 808). For the paper-and-pencil dataset the RB LCM resulted in the best fit, whereas for the online dataset the hybrid LCM provided the best fit. The standard paper-and-pencil dataset yielded more evidence for distinct solution rules than the online data set in which quantitative item characteristics are more prominent in determining responses. These results shed new light on the discussion on sequential rule-based and information-integration perspectives of cognitive development. PMID:26505905

  11. A rule-based expert system for generating control displays at the Advanced Photon Source

    NASA Astrophysics Data System (ADS)

    Coulter, Karen J.

    1994-12-01

    The integration of a rule-based expert system for generating screen displays for controlling and monitoring instrumentation under the Experimental Physics and Industrial Control System (EPICS) is presented. The expert system is implemented using CLIPS, an expert system shell from the Software Technology Branch at Lyndon B. Johnson Space Center. The user selects the hardware input and output to be displayed and the expert system constructs a graphical control screen appropriate for the data. Such a system provides a method for implementing a common look and feel for displays created by several different users and reduces the amount of time required to create displays for new hardware configurations. Users are able to modify the displays as needed using the EPICS display editor tool.

  12. Expert systems in clinical microbiology.

    PubMed

    Winstanley, Trevor; Courvalin, Patrice

    2011-07-01

    This review aims to discuss expert systems in general and how they may be used in medicine as a whole and clinical microbiology in particular (with the aid of interpretive reading). It considers rule-based systems, pattern-based systems, and data mining and introduces neural nets. A variety of noncommercial systems is described, and the central role played by the EUCAST is stressed. The need for expert rules in the environment of reset EUCAST breakpoints is also questioned. Commercial automated systems with on-board expert systems are considered, with emphasis being placed on the "big three": Vitek 2, BD Phoenix, and MicroScan. By necessity and in places, the review becomes a general review of automated system performances for the detection of specific resistance mechanisms rather than focusing solely on expert systems. Published performance evaluations of each system are drawn together and commented on critically.

  13. Developing a modular architecture for creation of rule-based clinical diagnostic criteria.

    PubMed

    Hong, Na; Pathak, Jyotishman; Chute, Christopher G; Jiang, Guoqian

    2016-01-01

    With recent advances in computerized patient record systems, there is an urgent need for producing computable and standards-based clinical diagnostic criteria. Notably, constructing rule-based clinical diagnosis criteria has become one of the goals of the International Classification of Diseases (ICD)-11 revision. However, few studies have been done on building a unified architecture to support the need for diagnostic criteria computerization. In this study, we present a modular architecture for enabling the creation of rule-based clinical diagnostic criteria leveraging Semantic Web technologies. The architecture consists of two modules: an authoring module that utilizes a standards-based information model and a translation module that leverages the Semantic Web Rule Language (SWRL). In a prototype implementation, we created a diagnostic criteria upper ontology (DCUO) that integrates the ICD-11 content model with the Quality Data Model (QDM). Using the DCUO, we developed a transformation tool that converts QDM-based diagnostic criteria into a SWRL representation. We evaluated the domain coverage of the upper ontology model using randomly selected diagnostic criteria from broad domains (n = 20). We also tested the transformation algorithms using 6 QDM templates for ontology population and 15 QDM-based criteria data for rule generation. As a result, the first draft of the DCUO contains 14 root classes, 21 subclasses, 6 object properties and 1 data property. Investigation Findings and Signs and Symptoms are the two most commonly used element types. All 6 HQMF templates are successfully parsed and populated into their corresponding domain-specific ontologies, and 14 rules (93.3%) passed the rule validation. Our efforts in developing and prototyping a modular architecture provide useful insight into how to build a scalable solution to support diagnostic criteria representation and computerization.

  14. Toward sensor-based context aware systems.

    PubMed

    Sakurai, Yoshitaka; Takada, Kouhei; Anisetti, Marco; Bellandi, Valerio; Ceravolo, Paolo; Damiani, Ernesto; Tsuruta, Setsuo

    2012-01-01

    This paper proposes a methodology for sensor data interpretation that can combine sensor outputs with contexts represented as sets of annotated business rules. Sensor readings are interpreted to generate events labeled with the appropriate type and level of uncertainty. Then, the appropriate context is selected. Reconciliation of different uncertainty types is achieved by a simple technique that moves uncertainty from events to business rules by generating combs of standard Boolean predicates. Finally, context rules are evaluated together with the events to take a decision. The feasibility of our idea is demonstrated via a case study where a context-reasoning engine has been connected to simulated heartbeat sensors using prerecorded experimental data. We use sensor outputs to identify the proper context of operation of a system and trigger decision-making based on context information.

  15. Fuzzy based attitude controller for flexible spacecraft with on/off thrusters

    NASA Astrophysics Data System (ADS)

    Knapp, Roger G.; Adams, Neil J.

    A fuzzy-based attitude controller is designed for attitude control of a generic spacecraft with on/off thrusters. The controller is comprised of packages of rules dedicated to addressing different objectives (e.g., disturbance rejection, low fuel consumption, avoiding the excitation of flexible appendages, etc.). These rule packages can be inserted or removed depending on the requirements of the particular spacecraft and are parameterized based on vehicle parameters such as inertia or operational parameters such as the maneuvering rate. Individual rule packages can be 'weighted' relative to each other to emphasize the importance of one objective relative to another. Finally, the fuzzy controller and rule packages are demonstrated using the high-fidelity Space Shuttle Interactive On-Orbit Simulator (IOS) while performing typical on-orbit operations and are subsequently compared with the existing shuttle flight control system performance.

  16. Fuzzy based attitude controller for flexible spacecraft with on/off thrusters

    NASA Astrophysics Data System (ADS)

    Knapp, Roger Glenn

    1993-05-01

    A fuzzy-based attitude controller is designed for attitude control of a generic spacecraft with on/off thrusters. The controller is comprised of packages of rules dedicated to addressing different objectives (e.g., disturbance rejection, low fuel consumption, avoiding the excitation of flexible appendages, etc.). These rule packages can be inserted or removed depending on the requirements of the particular spacecraft and are parameterized based on vehicle parameters such as inertia or operational parameters such as the maneuvering rate. Individual rule packages can be 'weighted' relative to each other to emphasize the importance of one objective relative to another. Finally, the fuzzy controller and rule packages are demonstrated using the high-fidelity Space Shuttle Interactive On-Orbit Simulator (IOS) while performing typical on-orbit operations and are subsequently compared with the existing shuttle flight control system performance.

  17. Estrogen receptor expert system overview and examples

    EPA Science Inventory

    The estrogen receptor expert system (ERES) is a rule-based system developed to prioritize chemicals based upon their potential for binding to the ER. The ERES was initially developed to predict ER affinity of chemicals from two specific EPA chemical inventories, antimicrobial pe...

  18. VHBuild.com: A Web-Based System for Managing Knowledge in Projects.

    ERIC Educational Resources Information Center

    Li, Heng; Tang, Sandy; Man, K. F.; Love, Peter E. D.

    2002-01-01

    Describes an intelligent Web-based construction project management system called VHBuild.com which integrates project management, knowledge management, and artificial intelligence technologies. Highlights include an information flow model; time-cost optimization based on genetic algorithms; rule-based drawing interpretation; and a case-based…

  19. CARSVM: a class association rule-based classification framework and its application to gene expression data.

    PubMed

    Kianmehr, Keivan; Alhajj, Reda

    2008-09-01

    In this study, we aim at building a classification framework, namely the CARSVM model, which integrates association rule mining and support vector machines (SVM). The goal is to benefit from the advantages of both the discriminative knowledge represented by class association rules and the classification power of the SVM algorithm, in order to construct an efficient and accurate classifier model that improves on the interpretability problem of SVM as a traditional machine learning technique and overcomes the efficiency issues of associative classification algorithms. In our proposed framework, instead of using the original training set, a set of rule-based feature vectors, which are generated based on the discriminative ability of class association rules over the training samples, is presented to the learning component of the SVM algorithm. We show that rule-based feature vectors present a highly qualified source of discrimination knowledge that can substantially impact the prediction power of SVM and associative classification techniques. They also provide users with greater understandability and interpretability. We have used four datasets from the UCI ML repository to evaluate the performance of the developed system in comparison with five well-known existing classification methods. Because of the importance and popularity of gene expression analysis as a real-world application of classification models, we present an extension of CARSVM combined with feature selection to be applied to gene expression data. We then describe how this combination provides biologists with an efficient and understandable classifier model. The reported test results and their biological interpretation demonstrate the applicability, efficiency and effectiveness of the proposed model. From the results, it can be concluded that a considerable increase in classification accuracy can be obtained when the rule-based feature vectors are integrated into the learning process of the SVM algorithm. In the context of applicability, according to the results obtained from gene expression analysis, we can conclude that the CARSVM system can be utilized in a variety of real-world applications with some adjustments.
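
    The central idea, mapping each sample to a binary vector indicating which class association rules it satisfies and then training an SVM on those vectors, is sketched below. The toy rules are written by hand rather than mined by an association-rule algorithm, the gene names and labels are invented, and scikit-learn is assumed to be available; this is an illustration of the idea, not the CARSVM implementation.

```python
# Hedged sketch of the rule-based feature idea: each sample is mapped to a binary
# vector indicating which class association rules its attributes satisfy, and an SVM
# is trained on those vectors. The toy rules below are written by hand rather than
# mined, and scikit-learn is assumed to be available.
from sklearn.svm import SVC

# Hypothetical class association rule antecedents over a discretized
# gene-expression sample (the SVM learns how to weight them).
RULES = [
    lambda s: s["geneA"] == "high" and s["geneB"] == "low",
    lambda s: s["geneC"] == "high",
    lambda s: s["geneB"] == "high",
]

def rule_features(sample):
    """Binary feature vector: 1 if the sample satisfies the rule's antecedent."""
    return [int(rule(sample)) for rule in RULES]

samples = [
    ({"geneA": "high", "geneB": "low",  "geneC": "low"},  "tumor"),
    ({"geneA": "low",  "geneB": "high", "geneC": "low"},  "normal"),
    ({"geneA": "high", "geneB": "low",  "geneC": "high"}, "tumor"),
    ({"geneA": "low",  "geneB": "high", "geneC": "high"}, "normal"),
]

X = [rule_features(s) for s, _ in samples]
y = [label for _, label in samples]

clf = SVC(kernel="linear").fit(X, y)
print(clf.predict([rule_features({"geneA": "high", "geneB": "low", "geneC": "low"})]))
```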

  20. Unified Description of Inelastic Propensity Rules for Electron Transport through Nanoscale Junctions

    NASA Astrophysics Data System (ADS)

    Paulsson, Magnus; Frederiksen, Thomas; Ueba, Hiromu; Lorente, Nicolás; Brandbyge, Mads

    2008-06-01

    We present a method to analyze the results of first-principles based calculations of electronic currents including inelastic electron-phonon effects. This method allows us to determine the electronic and vibrational symmetries in play, and hence to obtain the so-called propensity rules for the studied systems. We show that only a few scattering states—namely those belonging to the most transmitting eigenchannels—need to be considered for a complete description of the electron transport. We apply the method on first-principles calculations of four different systems and obtain the propensity rules in each case.

  1. Proof Rules for Automated Compositional Verification through Learning

    NASA Technical Reports Server (NTRS)

    Barringer, Howard; Giannakopoulou, Dimitra; Pasareanu, Corina S.

    2003-01-01

    Compositional proof systems not only enable the stepwise development of concurrent processes but also provide a basis to alleviate the state explosion problem associated with model checking. An assume-guarantee style of specification and reasoning has long been advocated to achieve compositionality. However, this style of reasoning is often non-trivial, typically requiring human input to determine appropriate assumptions. In this paper, we present novel assume-guarantee rules in the setting of finite labelled transition systems with blocking communication. We show how these rules can be applied in an iterative and fully automated fashion within a framework based on learning.
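
    For reference, the standard non-circular assume-guarantee rule commonly used in this learning-based setting can be written as below, where A is the assumption that the learning framework constructs automatically; this is the textbook form, not a quotation of the novel rules introduced in the paper.

```latex
% Standard (non-circular) assume-guarantee rule; A is the assumption that the
% learning framework constructs automatically. Textbook form, not a quotation
% of the novel rules introduced in the paper.
\[
\frac{\;\langle A\rangle\, M_1\, \langle P\rangle \qquad
      \langle \mathit{true}\rangle\, M_2\, \langle A\rangle\;}
     {\langle \mathit{true}\rangle\, M_1 \parallel M_2\, \langle P\rangle}
\]
```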

  2. IMGT/3Dstructure-DB and IMGT/StructuralQuery, a database and a tool for immunoglobulin, T cell receptor and MHC structural data

    PubMed Central

    Kaas, Quentin; Ruiz, Manuel; Lefranc, Marie-Paule

    2004-01-01

    IMGT/3Dstructure-DB and IMGT/Structural-Query are a novel 3D structure database and a new tool for immunological proteins. They are part of IMGT, the international ImMunoGenetics information system®, a high-quality integrated knowledge resource specializing in immunoglobulins (IG), T cell receptors (TR), major histocompatibility complex (MHC) and related proteins of the immune system (RPI) of human and other vertebrate species, which consists of databases, Web resources and interactive on-line tools. IMGT/3Dstructure-DB data are described according to the IMGT Scientific chart rules based on the IMGT-ONTOLOGY concepts. IMGT/3Dstructure-DB provides IMGT gene and allele identification of IG, TR and MHC proteins with known 3D structures, domain delimitations, amino acid positions according to the IMGT unique numbering and renumbered coordinate flat files. Moreover IMGT/3Dstructure-DB provides 2D graphical representations (or Collier de Perles) and results of contact analysis. The IMGT/StructuralQuery tool allows search of this database based on specific structural characteristics. IMGT/3Dstructure-DB and IMGT/StructuralQuery are freely available at http://imgt.cines.fr. PMID:14681396

  3. A knowledge-based support system for mechanical ventilation of the lungs. The KUSIVAR concept and prototype.

    PubMed

    Rudowski, R; Frostell, C; Gill, H

    1989-09-01

    KUSIVAR is an expert system for mechanical ventilation of adult patients suffering from respiratory insufficiency. Its main objective is to provide guidance in respirator management. The knowledge base includes both qualitative, rule-based knowledge and quantitative knowledge expressed in the form of mathematical models (expert control), which is used for prediction of arterial gas tensions and for optimization purposes. The system is data driven and uses a forward chaining mechanism for rule invocation. The interaction with the user will be performed in advisory, critiquing, semi-automatic and automatic modes. The system is at present in an advanced prototype stage. Prototyping is performed using KEE (Knowledge Engineering Environment) on a Sperry Explorer workstation. For further development and clinical use the expert system will be downloaded to an advanced PC. The system is intended to support therapy with a Siemens-Elema Servoventilator 900 C.

  4. A knowledge-based, concept-oriented view generation system for clinical data.

    PubMed

    Zeng, Q; Cimino, J J

    2001-04-01

    Information overload is a well-known problem for clinicians who must review large amounts of data in patient records. Concept-oriented views, which organize patient data around clinical concepts such as diagnostic strategies and therapeutic goals, may offer a solution to the problem of information overload. However, although concept-oriented views are desirable, they are difficult to create and maintain. We have developed a general-purpose, knowledge-based approach to the generation of concept-oriented views and have developed a system to test our approach. The system creates concept-oriented views through automated identification of relevant patient data. The knowledge in the system is represented by both a semantic network and rules. The key relevant data identification function is accomplished by a rule-based traversal of the semantic network. This paper focuses on the design and implementation of the system; an evaluation of the system is reported separately.

  5. A.I.-based real-time support for high performance aircraft operations

    NASA Technical Reports Server (NTRS)

    Vidal, J. J.

    1985-01-01

    Artificial intelligence (AI) based software and hardware concepts are applied to the handling of system malfunctions during flight tests. A representation of malfunction procedure logic using Boolean normal forms is presented. The representation facilitates the automation of malfunction procedures and provides easy testing for the embedded rules. It also forms a potential basis for a parallel implementation in logic hardware. The extraction of logic control rules from dynamic simulation and their adaptive revision after partial failure are examined. A simplified 2-dimensional aircraft model is used with a controller that adaptively extracts control rules for directional thrust that satisfy a navigational goal without exceeding pre-established position and velocity limits. Failure recovery (rule adjusting) is examined after partial actuator failure. While this experiment was performed with primitive aircraft and mission models, it illustrates an important paradigm and provided complexity extrapolations for the proposed extraction of expertise from simulation, as discussed. The use of relaxation and inexact reasoning in expert systems was also investigated.

  6. Autonomous Flight Safety System September 27, 2005, Aircraft Test

    NASA Technical Reports Server (NTRS)

    Simpson, James C.

    2005-01-01

    This report describes the first aircraft test of the Autonomous Flight Safety System (AFSS). The test was conducted on September 27, 2005, near Kennedy Space Center (KSC) using a privately-owned single-engine plane and evaluated the performance of several basic flight safety rules using real-time data onboard a moving aerial vehicle. This test follows the first road test of AFSS conducted in February 2005 at KSC. AFSS is a joint KSC and Wallops Flight Facility (WFF) project that is in its third phase of development. AFSS is an independent subsystem intended for use with Expendable Launch Vehicles that uses tracking data from redundant onboard sensors to autonomously make flight termination decisions using software-based rules implemented on redundant flight processors. The goals of this project are to increase capabilities by allowing launches from locations that do not have or cannot afford extensive ground-based range safety assets, to decrease range costs, and to decrease reaction time for special situations. The mission rules are configured for each operation by the responsible Range Safety authorities and can be loosely grouped into four major categories: Parameter Threshold Violations, Physical Boundary Violations (present position and instantaneous impact point), Gate Rules (static and dynamic), and a Green-Time Rule. Examples of each of these rules were evaluated during this aircraft test.

  7. [Research on the Clinical Alarm Management Mechanism Based on Closed-loop Control Theory].

    PubMed

    Lin, Zhongkuan; Zheng, Kun; Shen, Yunming; Wu, Yunyun

    2018-05-30

    This paper proposes a clinical alarm management system based on closed-loop control theory. The alarm management mechanism can be divided into the expected standard, improvement of execution rules, rule execution, medical devices with alarm functions, a results-analysis strategy, and the output link. Relevant applications are presented and discussed. Results showed that the mechanism is operable and effective.

  8. Starmind: A Fuzzy Logic Knowledge-Based System for the Automated Classification of Stars in the MK System

    NASA Astrophysics Data System (ADS)

    Manteiga, M.; Carricajo, I.; Rodríguez, A.; Dafonte, C.; Arcay, B.

    2009-02-01

    Astrophysics is evolving toward a more rational use of costly observational data by intelligently exploiting large terrestrial and space-based astronomical databases. In this paper, we present a study showing the suitability of an expert system to perform the classification of stellar spectra in the Morgan and Keenan (MK) system. Using the formalism of artificial intelligence for the development of such a system, we propose a rule base that contains classification criteria and confidence grades, all integrated in an inference engine that emulates human reasoning by means of a hierarchical decision-rule tree that also considers the uncertainty factors associated with rules. Our main objective is to illustrate the formulation and development of such a system for an astrophysical classification problem. An extensive spectral database of MK standard spectra has been collected and used as a reference to determine the spectral indexes that are suitable for classification in the MK system. It is shown that by considering 30 spectral indexes and associating them with uncertainty factors, we can obtain an accurate MK-type diagnosis for a particular spectrum. The system was evaluated against the NOAO-INDO-US spectral catalog.
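
    The kind of rule-plus-confidence-grade reasoning described above can be sketched in a few lines of Python: certainty factors from all rules that fire on the measured spectral indexes are combined into a graded diagnosis. The index names, thresholds, and confidence grades below are hypothetical placeholders, not the paper's actual classification criteria.

      def combine_cf(cf1, cf2):
          """Combine two positive certainty factors, MYCIN-style."""
          return cf1 + cf2 * (1.0 - cf1)

      # Each rule: (predicate over measured spectral indexes, suggested MK type, confidence grade).
      RULES = [
          (lambda idx: idx["CaII_K"] > 0.6 and idx["H_delta"] < 0.3, "K", 0.7),
          (lambda idx: idx["H_delta"] > 0.7, "A", 0.8),
          (lambda idx: 0.3 < idx["H_delta"] <= 0.7 and idx["G_band"] > 0.4, "G", 0.6),
      ]

      def classify(indexes):
          """Accumulate evidence for each MK type from every rule that fires."""
          belief = {}
          for predicate, mk_type, cf in RULES:
              if predicate(indexes):
                  belief[mk_type] = combine_cf(belief.get(mk_type, 0.0), cf)
          return max(belief.items(), key=lambda kv: kv[1]) if belief else (None, 0.0)

      print(classify({"CaII_K": 0.2, "H_delta": 0.8, "G_band": 0.1}))  # -> ('A', 0.8)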

  9. A cell-based study on pedestrian acceleration and overtaking in a transfer station corridor

    NASA Astrophysics Data System (ADS)

    Ji, Xiangfeng; Zhou, Xuemei; Ran, Bin

    2013-04-01

    Pedestrian speeds in a transfer station corridor are higher than usual, and some pedestrians can even be found running. In this paper, pedestrians are divided into two categories: aggressive and conservative. Aggressive pedestrians weaving their way through the crowd in the corridor are the study object of this paper. During recent decades, much attention has been paid to pedestrian behaviors such as overtaking (and deceleration) and collision avoidance, and that focus continues here. After analyzing the characteristics of pedestrian flow in a transfer station corridor, a cell-based model is presented, including acceleration (and deceleration) and overtaking analysis. Acceleration (and deceleration) in the corridor is fixed according to Newton's law, and the speed calculated with a kinematic formula is then discretized into cells based on fuzzy logic. After the speed is updated, overtaking is analyzed explicitly based on the updated speed and force, in contrast to rule-based models, which we here call implicit. During the overtaking analysis, a threshold value that determines the overtaking direction is introduced. The model is thus a two-step one: the first step updates the speed, i.e., the number of cells the pedestrian can move in one time interval, and the second analyzes overtaking. Finally, a comparison among rule-based cellular automata, the model in this paper, and data in HCM 2000 demonstrates that our model achieves a reasonable simulation of acceleration (and deceleration) and overtaking among pedestrians.

  10. Electron microscopy of terminal buds on the barbels of the silurid fish, Corydoras paleatus.

    PubMed

    Fujimotu, S; Yamamoto, K

    1980-06-01

    The terminal buds of the Corydoras paleatus were observed with the electron microscope. Almost all the cells constituting the buds can be classified into two distinct cell types, supporting and receptor cells. In addition, a few cells designated as basal cells exist in the bottom of the buds and appear to be an immature form of each distinct cell type in the course of cell renewal. The receptor cells are characterized by the presence of tubules extending from the apical process. By the application of lanthanum nitrate as an extracellular marker, we demonstrated that the tubular system is in continuity with the extracellular space. The data suggest that the tubular system represents an amplification of the apical cell surface as a particular site of chemoreceptive activities, although we do not rule out a role for active absorptions of ions in a very hypotonic environment.

  11. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.
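
    As a rough illustration of the rule-based view of runtime verification discussed above (and not of LogFire's actual Scala DSL or of the Rete algorithm's internals), the following Python sketch checks an event trace against a single lock-discipline rule by matching incoming events against a fact base.

      # Conceptual sketch: events update a fact base, and a rule reports a
      # violation if a lock is released by a task that was never granted it.
      facts = set()

      def rule_no_release_without_grant(event):
          kind, task, lock = event
          if kind == "grant":
              facts.add(("granted", task, lock))
          elif kind == "release":
              if ("granted", task, lock) in facts:
                  facts.discard(("granted", task, lock))
              else:
                  print(f"violation: {task} released {lock} without holding it")

      trace = [("grant", "T1", "L1"), ("release", "T1", "L1"), ("release", "T2", "L1")]
      for event in trace:
          rule_no_release_without_grant(event)   # reports one violation for T2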

  12. Quantum Tic-Tac-Toe as Metaphor for Quantum Physics

    NASA Astrophysics Data System (ADS)

    Goff, Allan; Lehmann, Dale; Siegel, Joel

    2004-02-01

    Quantum Tic-Tac-Toe is presented as an abstract quantum system derived from the rules of Classical Tic-Tac-Toe. Abstract quantum systems can be constructed from classical systems by the addition of three types of rules: rules of Superposition, rules of Entanglement, and rules of Collapse. This is formally done for Quantum Tic-Tac-Toe. As a part of this construction it is shown that abstract quantum systems can be viewed as an ensemble of classical systems. That is, the state of a quantum game implies a set of simultaneous classical games. The number and evolution of the ensemble of classical games is driven by the superposition, entanglement, and collapse rules. Various aspects and play situations provide excellent metaphors for standard features of quantum mechanics. Several of the more significant metaphors are discussed, including a measurement mechanism, the correspondence principle, Everett's Many Worlds Hypothesis, an ascertainty principle, and spooky action at a distance. Abstract quantum systems also show the consistency of backwards-in-time causality, and the influence on the present of both pasts and futures that never happened. The strongest logical argument against faster-than-light (FTL) phenomena is that since FTL implies backwards-in-time causality, temporal paradox is an unavoidable consequence of FTL; hence FTL is impossible. Since abstract quantum systems support backwards-in-time causality but avoid temporal paradox through pruning of the classical ensemble, it may be that quantum-based FTL schemes are possible, allowing backwards-in-time causality but prohibiting temporal paradox.

  13. An easy-to-use diagnostic system development shell

    NASA Technical Reports Server (NTRS)

    Tsai, L. C.; Ross, J. B.; Han, C. Y.; Wee, W. G.

    1987-01-01

    The Diagnostic System Development Shell (DSDS), an expert system development shell for diagnostic systems, is described. The major objective of building the DSDS is to create a very easy-to-use and friendly environment for knowledge engineers and end-users. The DSDS is written in OPS5 and CommonLisp. It runs on a VAX/VMS system. A set of domain-independent, generalized rules is built into the DSDS, so the users need not be concerned about building the rules. The facts are explicitly represented in a unified format. A powerful checking facility is provided to help the user find errors in the created knowledge bases. A judgement facility and other useful facilities are also available. A diagnostic system based on the DSDS is question driven and can call or be called by other knowledge-based systems written in OPS5 and CommonLisp. A prototype diagnostic system for diagnosing a Philips constant potential X-ray system has been built using the DSDS.

  14. Knowledge-based control of an adaptive interface

    NASA Technical Reports Server (NTRS)

    Lachman, Roy

    1989-01-01

    The analysis, development strategy, and preliminary design for an intelligent, adaptive interface is reported. The design philosophy couples knowledge-based system technology with standard human factors approaches to interface development for computer workstations. An expert system has been designed to drive the interface for application software. The intelligent interface will be linked to application packages, one at a time, that are planned for multiple-application workstations aboard Space Station Freedom. Current requirements call for most Space Station activities to be conducted at the workstation consoles. One set of activities will consist of standard data management services (DMS). DMS software includes text processing, spreadsheets, data base management, etc. Text processing was selected for the first intelligent interface prototype because text-processing software can be developed initially as fully functional but limited with a small set of commands. The program's complexity can then be increased incrementally. The rule base of the intelligent interface includes the operator's behavior and three types of instructions to the underlying application software. A conventional expert-system inference engine searches the data base for antecedents to rules and sends the consequents of fired rules as commands to the underlying software. Plans for putting the expert system on top of a second application, a database management system, will be carried out following behavioral research on the first application. The intelligent interface design is suitable for use with ground-based workstations now common in government, industrial, and educational organizations.
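
    The match-and-fire cycle described above can be illustrated with a minimal Python sketch: the engine checks which rule antecedents are satisfied by a fact base describing operator behavior and sends the consequents of fired rules as commands to the application. The facts, rules, and send_command stub are invented for illustration and are not the actual Space Station interface rules.

      facts = {"operator_pauses_often", "document_open"}

      # Each rule pairs a set of antecedent facts with a single command consequent.
      rules = [
          ({"operator_pauses_often", "document_open"}, "show_command_hints"),
          ({"operator_expert"}, "hide_command_hints"),
      ]

      def send_command(cmd):
          print(f"-> sending '{cmd}' to the application software")

      def forward_chain(facts, rules):
          for antecedents, consequent in rules:
              if antecedents <= facts:          # all antecedents found in the fact base
                  send_command(consequent)

      forward_chain(facts, rules)               # fires only the first rule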

  15. Automatic Fault Recognition of Photovoltaic Modules Based on Statistical Analysis of Uav Thermography

    NASA Astrophysics Data System (ADS)

    Kim, D.; Youn, J.; Kim, C.

    2017-08-01

    As a malfunctioning PV (Photovoltaic) cell has a higher temperature than adjacent normal cells, we can detect it easily with a thermal infrared sensor. However, inspecting large-scale PV power plants with a hand-held thermal infrared sensor is time consuming. This paper presents an algorithm for automatically detecting defective PV panels using images captured with a thermal imaging camera from a UAV (unmanned aerial vehicle). The proposed algorithm uses statistical analysis of the thermal intensity (surface temperature) characteristics of each PV module, taking the mean intensity and standard deviation of each panel as parameters for fault diagnosis. One of the characteristics of thermal infrared imaging is that the larger the distance between sensor and target, the lower the measured temperature of the object. Consequently, a global detection rule using the mean intensity of all panels is not applicable in the fault detection algorithm. Therefore, a local detection rule based on the mean intensity and standard deviation range was developed to detect defective PV modules automatically within each individual array. The performance of the proposed algorithm was tested on three sample images; this verified a detection accuracy for defective panels of 97% or higher. In addition, as the proposed algorithm can adjust the range of threshold values for judging malfunction at the array level, the local detection rule is considered better suited for highly sensitive fault detection than a global detection rule.
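
    A minimal sketch of such a local detection rule is given below: each panel is summarized by the mean of its thermal intensities, and a panel is flagged when its mean deviates from the statistics of its own array, so the distance-dependent temperature offset cancels out. The threshold factor k and the simulated panel data are illustrative assumptions, not the paper's calibrated values.

      import numpy as np

      def detect_defective_panels(panel_intensities, k=2.0):
          """panel_intensities: list of 2-D intensity arrays, one per panel of a single array."""
          means = np.array([p.mean() for p in panel_intensities])
          array_mean, array_std = means.mean(), means.std()
          # Local rule: compare each panel mean against this array only.
          return [i for i, m in enumerate(means) if m > array_mean + k * array_std]

      rng = np.random.default_rng(1)
      panels = [rng.normal(30, 1, (40, 60)) for _ in range(9)]
      panels.append(rng.normal(38, 1, (40, 60)))      # simulated hot (faulty) panel
      print(detect_defective_panels(panels))          # -> [9]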

  16. Prototype of Kepler Processing Workflows For Microscopy And Neuroinformatics

    PubMed Central

    Astakhov, V.; Bandrowski, A.; Gupta, A.; Kulungowski, A.W.; Grethe, J.S.; Bouwer, J.; Molina, T.; Rowley, V.; Penticoff, S.; Terada, M.; Wong, W.; Hakozaki, H.; Kwon, O.; Martone, M.E.; Ellisman, M.

    2016-01-01

    We report on progress of employing the Kepler workflow engine to prototype “end-to-end” application integration workflows that concern data coming from microscopes deployed at the National Center for Microscopy Imaging Research (NCMIR). This system is built upon the mature code base of the Cell Centered Database (CCDB) and integrated rule-oriented data system (IRODS) for distributed storage. It provides integration with external projects such as the Whole Brain Catalog (WBC) and Neuroscience Information Framework (NIF), which benefit from NCMIR data. We also report on specific workflows which spawn from main workflows and perform data fusion and orchestration of Web services specific for the NIF project. This “Brain data flow” presents a user with categorized information about sources that have information on various brain regions. PMID:28479932

  17. A Rules-Based Service for Suggesting Visualizations to Analyze Earth Science Phenomena.

    NASA Astrophysics Data System (ADS)

    Prabhu, A.; Zednik, S.; Fox, P. A.; Ramachandran, R.; Maskey, M.; Shie, C. L.; Shen, S.

    2016-12-01

    Current Earth Science Information Systems lack support for new or interdisciplinary researchers, who may be unfamiliar with the domain vocabulary or the breadth of relevant data available. We need to evolve the current information systems to reduce the time required for data preparation, processing and analysis. This can be done by effectively salvaging the "dark" resources in Earth Science. We assert that Earth science metadata assets are dark resources, information resources that organizations collect, process, and store for regular business or operational activities but fail to utilize for other purposes. In order to effectively use these dark resources, especially for data processing and visualization, we need a combination of domain, data product and processing knowledge, i.e. a knowledge base from which specific data operations can be performed. In this presentation, we describe a semantic, rules-based approach that provides a service to visualize Earth Science phenomena, based on the data variables extracted using the "dark" metadata resources. We use Jena rules to make assertions about compatibility between a phenomenon and various visualizations based on multiple factors. We created separate orthogonal rulesets to map each of these factors to the various phenomena. Some of the factors we have considered include measurements, spatial resolution and time intervals. This approach enables easy additions and deletions based on newly obtained domain knowledge or phenomena-related information, thus improving the overall accuracy of the rules service.
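
    The service itself encodes this knowledge as Jena rules over metadata; the Python sketch below shows the same idea as plain predicates mapping phenomenon attributes (measurements, spatial resolution, time interval) to candidate visualizations. The attribute names, thresholds, and rules are hypothetical.

      RULES = [
          ("time_series_plot", lambda p: p["time_interval_days"] <= 1),
          ("global_map",       lambda p: p["spatial_resolution_km"] <= 50),
          ("vertical_profile", lambda p: "pressure_level" in p["measurements"]),
      ]

      def suggest_visualizations(phenomenon):
          """Return the visualizations whose compatibility rule fires."""
          return [name for name, compatible in RULES if compatible(phenomenon)]

      hurricane = {
          "measurements": ["precipitation", "wind_speed"],
          "spatial_resolution_km": 25,
          "time_interval_days": 0.25,
      }
      print(suggest_visualizations(hurricane))   # -> ['time_series_plot', 'global_map']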

  18. Advancing reservoir operation description in physically based hydrological models

    NASA Astrophysics Data System (ADS)

    Anghileri, Daniela; Giudici, Federico; Castelletti, Andrea; Burlando, Paolo

    2016-04-01

    Recent decades have seen significant advances in our capacity to characterize and reproduce hydrological processes within physically based models. Yet, when the human component is considered (e.g. reservoirs, water distribution systems), the associated decisions are generally modeled with very simplistic rules, which might underperform in reproducing the actual operators' behaviour on a daily or sub-daily basis. For example, reservoir operations are usually described by a target-level rule curve, which represents the level that the reservoir should track during normal operating conditions. The associated release decision is determined by the current state of the reservoir relative to the rule curve. This modeling approach can reasonably reproduce the seasonal water volume shift due to reservoir operation. Still, it cannot capture more complex decision-making processes in response to, e.g., fluctuations of energy prices and demands, the temporary unavailability of power plants, or the varying amount of snow accumulated in the basin. In this work, we link a physically explicit hydrological model with detailed hydropower behavioural models describing the decision-making process of the dam operator. In particular, we consider two categories of behavioural models: explicit or rule-based behavioural models, where reservoir operating rules are empirically inferred from observational data, and implicit or optimization-based behavioural models, where, following a normative economic approach, the decision maker is represented as a rational agent maximising a utility function. We compare these two alternative modelling approaches on the real-world water system of the Lake Como catchment in the Italian Alps. The water system is characterized by the presence of 18 artificial hydropower reservoirs generating almost 13% of the Italian hydropower production. Results show the extent to which the hydrological regime in the catchment is affected by the different behavioural models and reservoir operating strategies.
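
    A minimal sketch of the target-level rule-curve logic described above is shown below; the seasonal target, units, and gain are invented for illustration and do not represent the Lake Como policy or any calibrated model.

      def rule_curve_release(day_of_year, storage, baseline_release=50.0, gain=0.002):
          """Return a release [m3/s] given storage [1e6 m3] and a seasonal target level."""
          # Hypothetical seasonal target: lower in the drawdown season, higher otherwise.
          target = 120.0 if 90 <= day_of_year <= 270 else 160.0
          deviation = storage - target
          # Proportional correction around the baseline release.
          release = baseline_release + gain * deviation * 1e3
          return max(release, 0.0)

      # Storage above the seasonal target -> release more than the baseline.
      print(rule_curve_release(day_of_year=150, storage=140.0))   # -> 90.0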

  19. Quantitative knowledge acquisition for expert systems

    NASA Technical Reports Server (NTRS)

    Belkin, Brenda L.; Stengel, Robert F.

    1991-01-01

    A common problem in the design of expert systems is the definition of rules from data obtained in system operation or simulation. While it is relatively easy to collect data and to log the comments of human operators engaged in experiments, generalizing such information to a set of rules has not previously been a direct task. A statistical method is presented for generating rule bases from numerical data, motivated by an example based on aircraft navigation with multiple sensors. The specific objective is to design an expert system that selects a satisfactory suite of measurements from a dissimilar, redundant set, given an arbitrary navigation geometry and possible sensor failures. The systematic development of a Navigation Sensor Management (NSM) expert system from Kalman filter covariance data is described. The method invokes two statistical techniques: Analysis of Variance (ANOVA) and the ID3 algorithm. The ANOVA technique indicates whether variations of problem parameters give statistically different covariance results, and the ID3 algorithm identifies the relationships between the problem parameters using probabilistic knowledge extracted from a simulation example set. Both are detailed.
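
    The ID3 step referred to above selects, at each node, the attribute with the greatest information gain; the short Python sketch below computes that quantity for a toy example set. The attributes and labels are invented and are not the NSM covariance data.

      import math
      from collections import Counter

      def entropy(labels):
          n = len(labels)
          return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

      def information_gain(examples, attribute):
          """Entropy reduction obtained by splitting the examples on one attribute."""
          labels = [lab for _, lab in examples]
          gain = entropy(labels)
          for value in {ex[attribute] for ex, _ in examples}:
              subset = [lab for ex, lab in examples if ex[attribute] == value]
              gain -= len(subset) / len(examples) * entropy(subset)
          return gain

      examples = [
          ({"geometry": "good", "failed_sensor": "no"},  "satisfactory"),
          ({"geometry": "good", "failed_sensor": "yes"}, "satisfactory"),
          ({"geometry": "poor", "failed_sensor": "no"},  "satisfactory"),
          ({"geometry": "poor", "failed_sensor": "yes"}, "unsatisfactory"),
      ]
      for attr in ("geometry", "failed_sensor"):
          print(attr, round(information_gain(examples, attr), 3))   # both -> 0.311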

  20. Development of an expert system for fractography of environmentally assisted cracking

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minoshima, Kohji; Komai, Kenjiro; Yamasaki, Norimasa

    1997-12-31

    An expert system that diagnoses the causes of failure of environmentally assisted cracking (EAC) based upon fractography has been developed. The system uses the OPS83 programming language, expressing rules in the manner of production rules, and is composed of three independent subsystems, which respectively deal with EACs of high-strength or high-tensile-strength steel, aluminum alloy, and stainless steel in dry and humidified air, water, and aqueous solutions containing Cl, Br, or I ions. The EAC issues covered include stress corrosion cracking (SCC), hydrogen embrittlement, cyclic SCC, dynamic SCC, and corrosion fatigue as well as fatigue and overload fracture. The knowledge base covers the rules relating not only to environments, materials, and loading conditions, but also to macroscopic and microscopic fracture surface morphology. In order to deal with vague expressions of fracture surface morphology, fuzzy set theory is used in the system, and the description of rules about vague fracture surface appearance is thereby possible. Applying the developed expert system to case histories, accurate diagnoses were made. The authors discuss the related diagnosis results and the usefulness of the developed system.

  1. TARGET's role in knowledge acquisition, engineering, validation, and documentation

    NASA Technical Reports Server (NTRS)

    Levi, Keith R.

    1994-01-01

    We investigate the use of the TARGET task analysis tool in the development of rule-based expert systems. We found TARGET to be very helpful in the knowledge acquisition process. It enabled us to perform knowledge acquisition with one knowledge engineer rather than two. In addition, it improved communication between the domain expert and the knowledge engineer. We also found it to be useful for both the rule development and refinement phases of the knowledge engineering process. Using the network in these phases required us to develop guidelines that enabled us to easily translate the network into production rules. A significant requirement for TARGET remaining useful throughout the knowledge engineering process was the need to carefully maintain consistency between the network and the rule representations. Maintaining consistency not only benefited the knowledge engineering process, but also had significant payoffs in the areas of validation of the expert system and documentation of the knowledge in the system.

  2. Automatic inference of indexing rules for MEDLINE

    PubMed Central

    Névéol, Aurélie; Shooshan, Sonya E; Claveau, Vincent

    2008-01-01

    Background: Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. Methods: In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Results: Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. Conclusion: We expect the sets of ILP rules obtained in this experiment to be integrated into MTI. PMID:19025687

  3. Automatic inference of indexing rules for MEDLINE.

    PubMed

    Névéol, Aurélie; Shooshan, Sonya E; Claveau, Vincent

    2008-11-19

    Indexing is a crucial step in any information retrieval system. In MEDLINE, a widely used database of the biomedical literature, the indexing process involves the selection of Medical Subject Headings in order to describe the subject matter of articles. The need for automatic tools to assist MEDLINE indexers in this task is growing with the increasing number of publications being added to MEDLINE. In this paper, we describe the use and the customization of Inductive Logic Programming (ILP) to infer indexing rules that may be used to produce automatic indexing recommendations for MEDLINE indexers. Our results show that this original ILP-based approach outperforms manual rules when they exist. In addition, the use of ILP rules also improves the overall performance of the Medical Text Indexer (MTI), a system producing automatic indexing recommendations for MEDLINE. We expect the sets of ILP rules obtained in this experiment to be integrated into MTI.

  4. Hierarchical fuzzy control of low-energy building systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yu, Zhen; Dexter, Arthur

    2010-04-15

    A hierarchical fuzzy supervisory controller is described that is capable of optimizing the operation of a low-energy building, which uses solar energy to heat and cool its interior spaces. The highest level fuzzy rules choose the most appropriate set of lower level rules according to the weather and occupancy information; the second level fuzzy rules determine an optimal energy profile and the overall modes of operation of the heating, ventilating and air-conditioning system (HVAC); the third level fuzzy rules select the mode of operation of specific equipment, and assign schedules to the local controllers so that the optimal energy profile can be achieved in the most efficient way. Computer simulation is used to compare the hierarchical fuzzy control scheme with a supervisory control scheme based on expert rules. The performance is evaluated by comparing the energy consumption and thermal comfort. (author)

  5. Rule Mining Techniques to Predict Prokaryotic Metabolic Pathways.

    PubMed

    Saidi, Rabie; Boudellioua, Imane; Martin, Maria J; Solovyev, Victor

    2017-01-01

    It is becoming more evident that computational methods are needed for the identification and mapping of pathways in new genomes. We introduce an automatic annotation system (ARBA4Path: Association Rule-Based Annotator for Pathways) that utilizes rule mining techniques to predict metabolic pathways across a wide range of prokaryotes. It was demonstrated that specific combinations of protein domains (recorded in our rules) strongly determine the pathways in which proteins are involved and thus provide information that lets us very accurately assign pathway membership (with precision of 0.999 and recall of 0.966) to proteins of a given prokaryotic taxon. Our system can be used to enhance the quality of automatically generated annotations as well as to annotate proteins with unknown function. The prediction models are represented in the form of human-readable rules, and they can be used effectively to add absent pathway information to many proteins in the UniProtKB/TrEMBL database.
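
    As an illustration of the rule-mining idea (combinations of protein domains that strongly determine a pathway), the sketch below counts support and confidence for candidate domain-to-pathway rules. The Pfam identifiers, pathway names, and thresholds are hypothetical toy values, far smaller than anything the actual system uses.

      from collections import Counter
      from itertools import combinations

      proteins = [
          ({"PF00069", "PF07714"}, "MAPK signaling"),
          ({"PF00069", "PF07714"}, "MAPK signaling"),
          ({"PF00069"},            "Purine metabolism"),
          ({"PF00005", "PF00664"}, "ABC transport"),
      ]

      def mine_rules(data, min_support=2, min_confidence=0.9):
          """Return (domain combination, pathway, support, confidence) rules."""
          rules = []
          for size in (1, 2):
              antecedent_counts, pair_counts = Counter(), Counter()
              for domains, pathway in data:
                  for combo in combinations(sorted(domains), size):
                      antecedent_counts[combo] += 1
                      pair_counts[(combo, pathway)] += 1
              for (combo, pathway), n in pair_counts.items():
                  confidence = n / antecedent_counts[combo]
                  if n >= min_support and confidence >= min_confidence:
                      rules.append((set(combo), pathway, n, confidence))
          return rules

      for antecedent, pathway, support, conf in mine_rules(proteins):
          print(f"{antecedent} => {pathway} (support={support}, confidence={conf:.2f})")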

  6. A RULE-BASED SYSTEM FOR EVALUATING FINAL COVERS FOR HAZARDOUS WASTE LANDFILLS

    EPA Science Inventory

    This chapter examines how rules are used as a knowledge representation formalism in the domain of hazardous waste management. A specific example from this domain involves performance evaluation of final covers used to close hazardous waste landfills. Final cover design and associ...

  7. Timely Diagnostic Feedback for Database Concept Learning

    ERIC Educational Resources Information Center

    Lin, Jian-Wei; Lai, Yuan-Cheng; Chuang, Yuh-Shy

    2013-01-01

    To efficiently learn database concepts, this work adopts association rules to provide diagnostic feedback for drawing an Entity-Relationship Diagram (ERD). Using association rules and Asynchronous JavaScript and XML (AJAX) techniques, this work implements a novel Web-based Timely Diagnosis System (WTDS), which provides timely diagnostic feedback…

  8. 75 FR 67282 - Provisions Common to Registered Entities

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-02

    ...The Commodity Futures Trading Commission (``Commission'' or ``CFTC'') is proposing rules to implement new statutory provisions enacted under Title VII of the Dodd-Frank Wall Street Reform and Consumer Protection Act (``Dodd-Frank Act'') and amend existing rules affected by the passage of the Dodd-Frank Act. These proposed rules apply to designated contract markets (``DCMs''), derivatives clearing organizations (``DCOs''), swap execution facilities (``SEFs'') and swap data repositories (``SDRs''). The proposed rules implement the new statutory framework for certification and approval for new products, new rules and rule amendments submitted to the Commission by registered entities. Furthermore, the proposed rules prohibit event contracts based on certain excluded commodities, establish special procedures for certain rule changes proposed by systemically important derivatives clearing organizations (``SIDCOs''), and provide for the tolling of review periods for certain novel derivative products pending the resolution of jurisdictional determinations.

  9. Exclusive Liquid Repellency: An Open Multi-Liquid-Phase Technology for Rare Cell Culture and Single-Cell Processing.

    PubMed

    Li, Chao; Yu, Jiaquan; Schehr, Jennifer; Berry, Scott M; Leal, Ticiana A; Lang, Joshua M; Beebe, David J

    2018-05-23

    The concept of high liquid repellency in multi-liquid-phase systems (e.g., aqueous droplets in an oil background) has been applied to areas of biomedical research to realize intrinsic advantages not available in single-liquid-phase systems. Such advantages have included minimizing analyte loss, facile manipulation of single-cell samples, elimination of biofouling, and ease of use regarding loading and retrieving of the sample. In this paper, we present generalized design rules for predicting the wettability of solid-liquid-liquid systems (especially for discrimination between exclusive liquid repellency (ELR) and finite liquid repellency) to extend the applications of ELR. We then apply ELR to two model systems with open microfluidic design in cell biology: (1) in situ underoil culture and combinatorial coculture of mammalian cells in order to demonstrate directed single-cell multiencapsulation with minimal waste of samples as compared to stochastic cell seeding and (2) isolation of a pure population of circulating tumor cells, which is required for certain downstream analyses including sequencing and gene expression profiling.

  10. Neurosemantics, neurons and system theory.

    PubMed

    Breidbach, Olaf

    2007-08-01

    Following the concept of internal representations, signal processing in a neuronal system has to be evaluated exclusively based on internal system characteristics. Thus, this approach omits the external observer as a control function for sensory integration. Instead, the configuration of the system and its computational performance are the effects of endogenous factors. Such self-referential operation is due to a strictly local computation in a network and, thereby, computations follow a set of rules that constitute the emergent behaviour of the system. These rules can be shown to correspond to a "logic" that is intrinsic to the system, an idea which provides the basis for neurosemantics.

  11. Equating an expert system to a classifier in order to evaluate the expert system

    NASA Technical Reports Server (NTRS)

    Odell, Patrick L.

    1989-01-01

    A strategy to evaluate an expert system is formulated. The proposed strategy is based on finding a classifier equivalent to the expert system and evaluating that classifier with respect to an optimal classifier, a Bayes classifier. It is shown that an equivalent classifier exists for the rules considered, and a brief consideration of meta- and meta-meta-rules is included. A taxonomy of expert systems is also presented, together with the assertion that, under associated sets of underlying assumptions, an equivalent classifier exists for each type of expert system in the taxonomy.

  12. Quantitative phase imaging of retinal cells (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    LaForest, Timothé; Carpentras, Dino; Kowalczuk, Laura; Behar-Cohen, Francine; Moser, Christophe

    2017-02-01

    The vision process is governed by several cell layers of the retina. Before reaching the photoreceptors, light entering the eye has to pass through a few-hundred-micrometer-thick layer of ganglion cells and neurons. Macular degeneration is a non-curable disease of the macula occurring with age. This disease can be diagnosed at an early stage by imaging neuronal cells in the retina and observing their death chronically. These cells are phase objects located on a background that presents an absorption pattern, and they are therefore difficult to see in vivo with standard imaging techniques. Phase imaging methods usually need the illumination system to be on the opposite side of the sample with respect to the imaging system, which is a constraint and a challenge for phase imaging in vivo. Recently, the possibility of performing phase contrast imaging from one side using properties of scattering media has been shown. This phase contrast imaging is based on the back-illumination generated by the sample itself. Here, we present a reflection phase imaging technique based on oblique back-illumination. The oblique back-illumination creates a dark-field image of the sample. Generating asymmetric oblique illumination allows a differential phase contrast image to be obtained, which in turn can be processed to recover a quantitative phase image. In the case of the eye, transcleral illumination can generate oblique incident light on the retina and the choroidal layer. The back-reflected light is then collected by the eye lens to produce a dark-field image. We show experimental results of retinal phase images in ex vivo samples of human and pig retina.

  13. Health-Terrain: Visualizing Large Scale Health Data

    DTIC Science & Technology

    2014-12-01

    systems can only be realized if the quality of emerging large medical databases can be characterized and the meaning of the data understood. For this... Designed and tested an evaluation procedure for health data visualization system. This visualization framework offers a real time and web-based solution... rule is shown in the table, with the quality measures of each rule including the support, confidence, Laplace, Gain, p-s, lift and Conviction. We

  14. Scalable software architectures for decision support.

    PubMed

    Musen, M A

    1999-12-01

    Interest in decision-support programs for clinical medicine soared in the 1970s. Since that time, workers in medical informatics have been particularly attracted to rule-based systems as a means of providing clinical decision support. Although developers have built many successful applications using production rules, they also have discovered that creation and maintenance of large rule bases is quite problematic. In the 1980s, several groups of investigators began to explore alternative programming abstractions that can be used to build decision-support systems. As a result, the notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) problem-solving methods--domain-independent algorithms for automating stereotypical tasks--and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper highlights how developers can construct large, maintainable decision-support systems using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community.

  15. Deductive Glue Code Synthesis for Embedded Software Systems Based on Code Patterns

    NASA Technical Reports Server (NTRS)

    Liu, Jian; Fu, Jicheng; Zhang, Yansheng; Bastani, Farokh; Yen, I-Ling; Tai, Ann; Chau, Savio N.

    2006-01-01

    Automated code synthesis is a constructive process that can be used to generate programs from specifications. It can, thus, greatly reduce the software development cost and time. The use of a formal code synthesis approach for software generation further increases the dependability of the system. Though code synthesis has many potential benefits, the synthesis techniques are still limited. Meanwhile, components are widely used in embedded system development. Applying code synthesis to the component-based software development (CBSD) process can greatly enhance the capability of code synthesis while reducing the component composition effort. In this paper, we discuss the issues and techniques for applying deductive code synthesis techniques to CBSD. For deductive synthesis in CBSD, a rule base is the key for inferring appropriate component compositions. We use code patterns to guide the development of rules. Code patterns have been proposed to capture the typical usages of the components. Several general composition operations have been identified to facilitate systematic composition. We present the technique for rule development and automated generation of new patterns from existing code patterns. A case study of using this method in building a real-time control system is also presented.

  16. Self-organizing path integration using a linked continuous attractor and competitive network: path integration of head direction.

    PubMed

    Stringer, Simon M; Rolls, Edmund T

    2006-12-01

    A key issue is how networks in the brain learn to perform path integration, that is, update a represented position using a velocity signal. Using head direction cells as an example, we show that a competitive network could self-organize to learn to respond to combinations of head direction and angular head rotation velocity. These combination cells can then be used to drive a continuous attractor network to the next head direction based on the incoming rotation signal. An associative synaptic modification rule with a short-term memory trace enables preceding combination cell activity during training to be associated with the next position in the continuous attractor network. The network accounts for the presence of neurons found in the brain that respond to combinations of head direction and angular head rotation velocity. Analogous networks in the hippocampal system could self-organize to perform path integration of place and spatial view representations.
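
    A minimal sketch of a trace-based associative update of the kind described above is shown below, with invented layer sizes, learning rate, and trace decay; it is not the paper's exact rule or parameterization. The weight change pairs the current postsynaptic activity with a short-term memory trace of preceding presynaptic activity.

      import numpy as np

      rng = np.random.default_rng(0)
      n_pre, n_post = 20, 10            # combination cells -> head-direction cells
      w = rng.uniform(0, 0.1, (n_post, n_pre))
      trace = np.zeros(n_pre)
      eta, lam = 0.05, 0.8              # learning rate, trace decay factor

      def update(pre_rate, post_rate):
          """One associative step: associate current post activity with earlier pre activity."""
          global trace, w
          trace = lam * trace + (1 - lam) * pre_rate    # short-term memory trace of pre activity
          w += eta * np.outer(post_rate, trace)
          w = np.clip(w, 0.0, 1.0)

      # One training step with random placeholder firing rates.
      update(rng.random(n_pre), rng.random(n_post))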

  17. Analysis of cell division patterns in the Arabidopsis shoot apical meristem

    DOE PAGES

    Shapiro, Bruce E.; Tobin, Cory; Mjolsness, Eric; ...

    2015-03-30

    The stereotypic pattern of cell shapes in the Arabidopsis shoot apical meristem (SAM) suggests that strict rules govern the placement of new walls during cell division. When a cell in the SAM divides, a new wall is built that connects existing walls and divides the cytoplasm of the daughter cells. Because features that are determined by the placement of new walls, such as cell size, shape, and number of neighbors, are highly regular, rules must exist for maintaining such order. Here we present a quantitative model of these rules that incorporates different observed features of cell division. Each feature is incorporated into a "potential function" that contributes a single term to a total analog of potential energy. New cell walls are predicted to occur at locations where the potential function is minimized. Quantitative terms that represent the well-known historical rules of plant cell division, such as those given by Hofmeister, Errera, and Sachs, are developed and evaluated against observed cell divisions in the epidermal layer (L1) of the Arabidopsis thaliana SAM. The method is general enough to allow additional terms for nongeometric properties such as internal concentration gradients and mechanical tensile forces.
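
    The potential-function selection step can be sketched as follows: each candidate wall is scored by a weighted sum of terms (here, wall length and daughter-area imbalance, loosely echoing Errera's shortest-wall rule), and the candidate minimizing the total potential is chosen. The candidate walls, terms, and weights are illustrative placeholders rather than the paper's fitted model.

      def total_potential(candidate, w_length=1.0, w_balance=2.0):
          """Weighted sum of per-feature terms for one candidate division wall."""
          length_term = candidate["wall_length"]
          balance_term = abs(candidate["area_1"] - candidate["area_2"])
          return w_length * length_term + w_balance * balance_term

      candidate_walls = [
          {"name": "A", "wall_length": 5.0, "area_1": 48.0, "area_2": 52.0},
          {"name": "B", "wall_length": 4.2, "area_1": 30.0, "area_2": 70.0},
          {"name": "C", "wall_length": 5.4, "area_1": 50.0, "area_2": 50.0},
      ]

      best = min(candidate_walls, key=total_potential)
      print(best["name"])   # -> 'C': a slightly longer wall wins by equalizing daughter areas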

  18. Optics Toolbox: An Intelligent Relational Database System For Optical Designers

    NASA Astrophysics Data System (ADS)

    Weller, Scott W.; Hopkins, Robert E.

    1986-12-01

    Optical designers were among the first to use the computer as an engineering tool. Powerful programs have been written to do ray-trace analysis, third-order layout, and optimization. However, newer computing techniques such as database management and expert systems have not been adopted by the optical design community. For the purpose of this discussion we will define a relational database system as a database which allows the user to specify his requirements using logical relations. For example, to search for all lenses in a lens database with a F/number less than two, and a half field of view near 28 degrees, you might enter the following:

      FNO < 2.0 and FOV of 28 degrees ± 5%

    Again for the purpose of this discussion, we will define an expert system as a program which contains expert knowledge, can ask intelligent questions, and can form conclusions based on the answers given and the knowledge which it contains. Most expert systems store this knowledge in the form of rules-of-thumb, which are written in an English-like language, and which are easily modified by the user. An example rule is:

      IF require microscope objective in air and require NA > 0.9
      THEN suggest the use of an oil immersion objective

    The heart of the expert system is the rule interpreter, sometimes called an inference engine, which reads the rules and forms conclusions based on them. The use of a relational database system containing lens prototypes seems to be a viable prospect. However, it is not clear that expert systems have a place in optical design. In domains such as medical diagnosis and petrology, expert systems are flourishing. These domains are quite different from optical design, however, because optical design is a creative process, and the rules are difficult to write down. We do think that an expert system is feasible in the area of first order layout, which is sufficiently diagnostic in nature to permit useful rules to be written. This first-order expert would emulate an expert designer as he interacted with a customer for the first time: asking the right questions, forming conclusions, and making suggestions. With these objectives in mind, we have developed the Optics Toolbox. Optics Toolbox is actually two programs in one: it is a powerful relational database system with twenty-one search parameters, four search modes, and multi-database support, as well as a first-order optical design expert system with a rule interpreter which has full access to the relational database. The system schematic is shown in Figure 1.

  19. Semi-empirical correlation for binary interaction parameters of the Peng-Robinson equation of state with the van der Waals mixing rules for the prediction of high-pressure vapor-liquid equilibrium.

    PubMed

    Fateen, Seif-Eddeen K; Khalil, Menna M; Elnabawy, Ahmed O

    2013-03-01

    The Peng-Robinson equation of state is widely used with the classical van der Waals mixing rules to predict vapor-liquid equilibria for systems containing hydrocarbons and related compounds. This model requires good values of the binary interaction parameter kij. In this work, we developed a semi-empirical correlation for kij partly based on the Huron-Vidal mixing rules. We obtained values for the adjustable parameters of the developed formula for over 60 binary systems and over 10 categories of components. The predictions of the new equation system were slightly better than those of the constant-kij model in most cases, except for 10 systems whose predictions were considerably improved with the new correlation.
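
    For reference, the classical van der Waals one-fluid mixing rules in which kij appears can be sketched as below; the pure-component a and b values and the kij matrix are placeholders rather than fitted Peng-Robinson parameters.

      import math

      def vdw_mixing(x, a, b, k):
          """x: mole fractions; a, b: pure-component parameters; k: kij matrix.
          a_mix = sum_ij x_i x_j sqrt(a_i a_j) (1 - k_ij),  b_mix = sum_i x_i b_i."""
          n = len(x)
          a_mix = sum(
              x[i] * x[j] * math.sqrt(a[i] * a[j]) * (1.0 - k[i][j])
              for i in range(n) for j in range(n)
          )
          b_mix = sum(x[i] * b[i] for i in range(n))
          return a_mix, b_mix

      x = [0.4, 0.6]
      a = [2.5, 1.2]          # hypothetical attraction parameters
      b = [0.09, 0.05]        # hypothetical covolumes
      k = [[0.0, 0.03],
           [0.03, 0.0]]
      print(vdw_mixing(x, a, b, k))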

  20. Phases, phase equilibria, and phase rules in low-dimensional systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Frolov, T., E-mail: timfrol@berkeley.edu; Mishin, Y., E-mail: ymishin@gmu.edu

    2015-07-28

    We present a unified approach to thermodynamic description of one, two, and three dimensional phases and phase transformations among them. The approach is based on a rigorous definition of a phase applicable to thermodynamic systems of any dimensionality. Within this approach, the same thermodynamic formalism can be applied for the description of phase transformations in bulk systems, interfaces, and line defects separating interface phases. For both lines and interfaces, we rigorously derive an adsorption equation, the phase coexistence equations, and other thermodynamic relations expressed in terms of generalized line and interface excess quantities. As a generalization of the Gibbs phase rule for bulk phases, we derive phase rules for lines and interfaces and predict the maximum number of phases that may coexist in systems of the respective dimensionality.
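
    For orientation, the classical bulk Gibbs phase rule that the derived line and interface phase rules generalize can be written as follows (standard textbook form, not a result reproduced from the paper):

      % Gibbs phase rule for bulk phases: C independent components,
      % P coexisting phases, F thermodynamic degrees of freedom.
      F = C - P + 2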

  1. Intelligent model-based diagnostics for vehicle health management

    NASA Astrophysics Data System (ADS)

    Luo, Jianhui; Tu, Fang; Azam, Mohammad S.; Pattipati, Krishna R.; Willett, Peter K.; Qiao, Liu; Kawamoto, Masayuki

    2003-08-01

    The recent advances in sensor technology, remote communication and computational capabilities, and standardized hardware/software interfaces are creating a dramatic shift in the way the health of vehicles is monitored and managed. These advances facilitate remote monitoring, diagnosis and condition-based maintenance of automotive systems. With the increased sophistication of electronic control systems in vehicles, there is a concomitant increased difficulty in the identification of the malfunction phenomena. Consequently, the current rule-based diagnostic systems are difficult to develop, validate and maintain. New intelligent model-based diagnostic methodologies that exploit the advances in sensor, telecommunications, computing and software technologies are needed. In this paper, we will investigate hybrid model-based techniques that seamlessly employ quantitative (analytical) models and graph-based dependency models for intelligent diagnosis. Automotive engineers have found quantitative simulation (e.g. MATLAB/SIMULINK) to be a vital tool in the development of advanced control systems. The hybrid method exploits this capability to improve the diagnostic system's accuracy and consistency, utilizes existing validated knowledge on rule-based methods, enables remote diagnosis, and responds to the challenges of increased system complexity. The solution is generic and has the potential for application in a wide range of systems.

  2. A semi-symmetric image encryption scheme based on the function projective synchronization of two hyperchaotic systems

    PubMed Central

    Li, Jinqing; Qi, Hui; Cong, Ligang; Yang, Huamin

    2017-01-01

    Both symmetric and asymmetric color image encryption have advantages and disadvantages. In order to combine their advantages and try to overcome their disadvantages, chaos synchronization is used to avoid key transmission in the proposed semi-symmetric image encryption scheme. Our scheme is a hybrid chaotic encryption algorithm, and it consists of a scrambling stage and a diffusion stage. The control law and the update rule of function projective synchronization between the 3-cell quantum cellular neural network (QCNN) response system and the 6th-order cellular neural network (CNN) drive system are formulated. Since function projective synchronization is used to synchronize the response system and drive system, Alice and Bob obtain the key independently from the two different chaotic systems and avoid transmitting the key over extra secure links, which prevents security key leakage during transmission. Both numerical simulations and security analyses such as information entropy analysis and differential attack are conducted to verify the feasibility, security, and efficiency of the proposed scheme. PMID:28910349

  3. A simple attitude control of quadrotor helicopter based on Ziegler-Nichols rules for tuning PD parameters.

    PubMed

    He, ZeFang; Zhao, Long

    2014-01-01

    An attitude control strategy based on Ziegler-Nichols rules for tuning the PD (proportional-derivative) parameters of quadrotor helicopters is presented to address the problem that quadrotors tend to be unstable. This problem is caused by the narrow definition domain of the attitude angles of quadrotor helicopters. The proposed controller is nonlinear and consists of a linear part and a nonlinear part. The linear part is a PD controller, with PD parameters tuned by Ziegler-Nichols rules, acting on the quadrotor's decoupled linear system after feedback linearization; the nonlinear part is a feedback linearization item which converts the nonlinear system into a linear system. It can be seen from the simulation results that the attitude controller proposed in this paper is highly robust, and its control effect is better than that of the other two nonlinear controllers. The nonlinear parts of the other two nonlinear controllers are the same as that of the attitude controller proposed in this paper; their linear parts are, respectively, a PID (proportional-integral-derivative) controller with parameters tuned by Ziegler-Nichols rules and a PD controller with parameters tuned by GA (genetic algorithms). Moreover, this attitude controller is simple and easy to implement.
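
    The classical Ziegler-Nichols ultimate-sensitivity rules for a PD controller can be sketched as below; the ultimate gain Ku and ultimate period Tu shown are placeholders that would in practice be measured on the feedback-linearized attitude loop.

      def ziegler_nichols_pd(Ku, Tu):
          """Classical ZN rules for a PD controller: Kp = 0.8*Ku, Td = Tu/8."""
          Kp = 0.8 * Ku
          Td = Tu / 8.0
          Kd = Kp * Td
          return Kp, Kd

      Ku, Tu = 12.0, 0.9              # hypothetical ultimate gain and period
      Kp, Kd = ziegler_nichols_pd(Ku, Tu)
      print(f"Kp = {Kp:.2f}, Kd = {Kd:.3f}")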

  4. A decision tree-based on-line preventive control strategy for power system transient instability prevention

    NASA Astrophysics Data System (ADS)

    Xu, Yan; Dong, Zhao Yang; Zhang, Rui; Wong, Kit Po

    2014-02-01

    Maintaining transient stability is a basic requirement for secure power system operations. Preventive control deals with modifying the system operating point to withstand probable contingencies. In this article, a decision tree (DT)-based on-line preventive control strategy is proposed for transient instability prevention of power systems. Given a stability database, a distance-based feature estimation algorithm is first applied to identify the critical generators, which are then used as features to develop a DT. By interpreting the splitting rules of the DT, preventive control is realised by formulating the rules in a standard optimal power flow model and solving it. The proposed method is transparent in its control mechanism, compatible with on-line computation and convenient for dealing with multiple contingencies. The effectiveness and efficiency of the method have been verified on the New England 10-machine 39-bus test system.

  5. A knowledge based application of the extended aircraft interrogation and display system

    NASA Technical Reports Server (NTRS)

    Glover, Richard D.; Larson, Richard R.

    1991-01-01

    A family of multiple-processor ground support test equipment was used to test digital flight-control systems on high-performance research aircraft. A unit recently built for the F-18 high alpha research vehicle project is the latest model in a series called the extended aircraft interrogation and display system. The primary feature emphasized here monitors the aircraft MIL-STD-1553B data buses and provides real-time engineering-units displays of flight-control parameters. A customized software package was developed to provide real-time data interpretation based on rules embodied in a highly structured knowledge database. The configuration of this extended aircraft interrogation and display system is briefly described, and the evolution of the rule-based package and its application to failure modes and effects testing on the F-18 high alpha research vehicle is discussed.

  6. Policy recommendations for addressing privacy challenges associated with cell-based research and interventions.

    PubMed

    Ogbogu, Ubaka; Burningham, Sarah; Ollenberger, Adam; Calder, Kathryn; Du, Li; El Emam, Khaled; Hyde-Lay, Robyn; Isasi, Rosario; Joly, Yann; Kerr, Ian; Malin, Bradley; McDonald, Michael; Penney, Steven; Piat, Gayle; Roy, Denis-Claude; Sugarman, Jeremy; Vercauteren, Suzanne; Verhenneman, Griet; West, Lori; Caulfield, Timothy

    2014-02-03

    The increased use of human biological material for cell-based research and clinical interventions poses risks to the privacy of patients and donors, including the possibility of re-identification of individuals from anonymized cell lines and associated genetic data. These risks will increase as technologies and databases used for re-identification become affordable and more sophisticated. Policies that require ongoing linkage of cell lines to donors' clinical information for research and regulatory purposes, and existing practices that limit research participants' ability to control what is done with their genetic data, amplify the privacy concerns. To date, the privacy issues associated with cell-based research and interventions have not received much attention in the academic and policymaking contexts. This paper, arising out of a multi-disciplinary workshop, aims to rectify this by outlining the issues, proposing novel governance strategies and policy recommendations, and identifying areas where further evidence is required to make sound policy decisions. The authors of this paper take the position that existing rules and norms can be reasonably extended to address privacy risks in this context without compromising emerging developments in the research environment, and that exceptions from such rules should be justified using a case-by-case approach. In developing new policies, the broader framework of regulations governing cell-based research and related areas must be taken into account, as well as the views of impacted groups, including scientists, research participants and the general public. This paper outlines deliberations at a policy development workshop focusing on privacy challenges associated with cell-based research and interventions. The paper provides an overview of these challenges, followed by a discussion of key themes and recommendations that emerged from discussions at the workshop. The paper concludes that privacy risks associated with cell-based research and interventions should be addressed through evidence-based policy reforms that account for both well-established legal and ethical norms and current knowledge about actual or anticipated harms. The authors also call for research studies that identify and address gaps in understanding of privacy risks.

  7. Policy recommendations for addressing privacy challenges associated with cell-based research and interventions

    PubMed Central

    2014-01-01

    Background The increased use of human biological material for cell-based research and clinical interventions poses risks to the privacy of patients and donors, including the possibility of re-identification of individuals from anonymized cell lines and associated genetic data. These risks will increase as technologies and databases used for re-identification become affordable and more sophisticated. Policies that require ongoing linkage of cell lines to donors’ clinical information for research and regulatory purposes, and existing practices that limit research participants’ ability to control what is done with their genetic data, amplify the privacy concerns. Discussion To date, the privacy issues associated with cell-based research and interventions have not received much attention in the academic and policymaking contexts. This paper, arising out of a multi-disciplinary workshop, aims to rectify this by outlining the issues, proposing novel governance strategies and policy recommendations, and identifying areas where further evidence is required to make sound policy decisions. The authors of this paper take the position that existing rules and norms can be reasonably extended to address privacy risks in this context without compromising emerging developments in the research environment, and that exceptions from such rules should be justified using a case-by-case approach. In developing new policies, the broader framework of regulations governing cell-based research and related areas must be taken into account, as well as the views of impacted groups, including scientists, research participants and the general public. Summary This paper outlines deliberations at a policy development workshop focusing on privacy challenges associated with cell-based research and interventions. The paper provides an overview of these challenges, followed by a discussion of key themes and recommendations that emerged from discussions at the workshop. The paper concludes that privacy risks associated with cell-based research and interventions should be addressed through evidence-based policy reforms that account for both well-established legal and ethical norms and current knowledge about actual or anticipated harms. The authors also call for research studies that identify and address gaps in understanding of privacy risks. PMID:24485220

  8. Expert Systems in Clinical Microbiology

    PubMed Central

    Winstanley, Trevor; Courvalin, Patrice

    2011-01-01

    Summary: This review aims to discuss expert systems in general and how they may be used in medicine as a whole and clinical microbiology in particular (with the aid of interpretive reading). It considers rule-based systems, pattern-based systems, and data mining and introduces neural nets. A variety of noncommercial systems is described, and the central role played by the EUCAST is stressed. The need for expert rules in the environment of reset EUCAST breakpoints is also questioned. Commercial automated systems with on-board expert systems are considered, with emphasis being placed on the “big three”: Vitek 2, BD Phoenix, and MicroScan. By necessity and in places, the review becomes a general review of automated system performances for the detection of specific resistance mechanisms rather than focusing solely on expert systems. Published performance evaluations of each system are drawn together and commented on critically. PMID:21734247
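
    To make the idea of interpretive reading concrete, the following is a minimal, purely illustrative sketch of an expert-rule step: a marker-drug result triggers an inference about how related agents should be reported. The organism, drugs, and rule are placeholders invented for this example and are not taken from EUCAST or from any commercial system.

        # Minimal sketch of an interpretive-reading rule engine (illustrative only).
        # The organism, drugs, and rule below are placeholders, not EUCAST rules.

        RULES = [
            {
                "if_organism": "Staphylococcus aureus",
                "if_resistant_to": "cefoxitin",
                "then_report_resistant": ["oxacillin", "other beta-lactams"],
                "note": "marker-drug resistance implies class-wide resistance",
            },
        ]

        def apply_expert_rules(organism, phenotypes):
            """phenotypes: dict drug -> 'S' or 'R'; returns the interpreted report."""
            report = dict(phenotypes)
            for rule in RULES:
                if (organism == rule["if_organism"]
                        and phenotypes.get(rule["if_resistant_to"]) == "R"):
                    for drug in rule["then_report_resistant"]:
                        report[drug] = "R"
            return report

        print(apply_expert_rules("Staphylococcus aureus",
                                 {"cefoxitin": "R", "oxacillin": "S"}))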

  9. Elements of decisional dynamics: An agent-based approach applied to artificial financial market

    NASA Astrophysics Data System (ADS)

    Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille

    2018-02-01

    This paper introduces an original mathematical formalism for describing agents' decision-making processes in problems shaped by both individual and collective behaviors, in systems characterized by nonlinear, path-dependent, and self-organizing interactions. An application to artificial financial markets is proposed by designing a multi-agent system based on the proposed formalization. In this application, agents' decision-making is driven by fuzzy logic rules, and the price dynamics is purely deterministic, following the basic matching rules of a central order book. Finally, with most parameters placed under evolutionary control, the computational agent-based system is able to replicate several stylized facts of financial time series (distributions of stock returns showing a heavy tail with positive excess kurtosis, absence of autocorrelation in stock returns, and the volatility clustering phenomenon).

  10. Elements of decisional dynamics: An agent-based approach applied to artificial financial market.

    PubMed

    Lucas, Iris; Cotsaftis, Michel; Bertelle, Cyrille

    2018-02-01

    This paper introduces an original mathematical formalism for describing agents' decision-making processes in problems shaped by both individual and collective behaviors, in systems characterized by nonlinear, path-dependent, and self-organizing interactions. An application to artificial financial markets is proposed by designing a multi-agent system based on the proposed formalization. In this application, agents' decision-making is driven by fuzzy logic rules, and the price dynamics is purely deterministic, following the basic matching rules of a central order book. Finally, with most parameters placed under evolutionary control, the computational agent-based system is able to replicate several stylized facts of financial time series (distributions of stock returns showing a heavy tail with positive excess kurtosis, absence of autocorrelation in stock returns, and the volatility clustering phenomenon).
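
    The deterministic price formation described in the two records above rests on the standard matching rules of a central order book. The sketch below shows a toy continuous-double-auction book with price-time priority; the agents' fuzzy decision rules are omitted and orders are supplied by hand, and all names and figures are illustrative assumptions rather than the authors' implementation.

        # Toy central order book with price-time priority matching (illustrative).
        class OrderBook:
            def __init__(self):
                self.bids = []    # (price, qty), kept sorted descending by price
                self.asks = []    # (price, qty), kept sorted ascending by price
                self.trades = []

            def submit(self, side, price, qty):
                book, other = (self.bids, self.asks) if side == "buy" else (self.asks, self.bids)
                # Match against the best opposite order while the prices cross.
                while qty > 0 and other:
                    best_price, best_qty = other[0]
                    crosses = price >= best_price if side == "buy" else price <= best_price
                    if not crosses:
                        break
                    traded = min(qty, best_qty)
                    self.trades.append((best_price, traded))   # trade at the resting price
                    qty -= traded
                    if best_qty - traded == 0:
                        other.pop(0)
                    else:
                        other[0] = (best_price, best_qty - traded)
                if qty > 0:   # any unmatched remainder rests in the book
                    book.append((price, qty))
                    book.sort(key=lambda o: -o[0] if side == "buy" else o[0])

        ob = OrderBook()
        ob.submit("buy", 101.0, 5)
        ob.submit("sell", 100.5, 3)        # crosses the resting bid and trades
        print(ob.trades, ob.bids, ob.asks)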

  11. TMS for Instantiating a Knowledge Base With Incomplete Data

    NASA Technical Reports Server (NTRS)

    James, Mark

    2007-01-01

    A computer program belonging to the class known among software experts as output truth-maintenance systems (output TMSs) has been devised as one of a number of software tools for reducing the size of the knowledge base that must be searched during execution of rule-based, inference-engine artificial-intelligence software when data are missing. This program determines whether the consequences of activating two or more rules can be combined without causing a logical inconsistency. For example, in a case involving hypothetical scenarios that could lead to turning a given device on or off, the program determines whether a scenario involving a given combination of rules could lead to turning the device both on and off at the same time; if so, that combination of rules would not be included in the scenario.
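
    The consistency test described above can be sketched as follows, under the assumption that each rule's consequence is representable as a set of variable/value assignments; the rule format and names are hypothetical and stand in for the actual TMS data structures.

        # Sketch of the rule-combination consistency check (hypothetical rule format):
        # a combination is rejected if it assigns two different values to one variable.

        def consistent(rule_combination):
            assigned = {}
            for rule in rule_combination:
                for var, value in rule["asserts"].items():
                    if var in assigned and assigned[var] != value:
                        return False          # e.g., device both 'on' and 'off'
                    assigned[var] = value
            return True

        rule_a = {"name": "scenario-A", "asserts": {"valve_heater": "on"}}
        rule_b = {"name": "scenario-B", "asserts": {"valve_heater": "off"}}
        rule_c = {"name": "scenario-C", "asserts": {"pump": "on"}}

        print(consistent([rule_a, rule_c]))   # True  -> combination may be kept
        print(consistent([rule_a, rule_b]))   # False -> pruned from the search space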

  12. Fuzzy based attitude controller for flexible spacecraft with on/off thrusters. M.S. Thesis - M.I.T., 1993

    NASA Technical Reports Server (NTRS)

    Knapp, Roger Glenn

    1993-01-01

    A fuzzy-based attitude controller is designed for attitude control of a generic spacecraft with on/off thrusters. The controller is comprised of packages of rules dedicated to addressing different objectives (e.g., disturbance rejection, low fuel consumption, avoiding the excitation of flexible appendages, etc.). These rule packages can be inserted or removed depending on the requirements of the particular spacecraft and are parameterized based on vehicle parameters such as inertia or operational parameters such as the maneuvering rate. Individual rule packages can be 'weighted' relative to each other to emphasize the importance of one objective relative to another. Finally, the fuzzy controller and rule packages are demonstrated using the high-fidelity Space Shuttle Interactive On-Orbit Simulator (IOS) while performing typical on-orbit operations and are subsequently compared with the existing shuttle flight control system performance.
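
    The weighting of rule packages can be illustrated schematically; the sketch below is not the thesis design. Two toy packages each map the current state to a torque recommendation, a relative weighting blends them, and a threshold converts the result into an on/off thruster command. The gains, weights, and threshold are invented for the example.

        # Schematic weighted-rule-package blending for an on/off thruster (illustrative).
        def disturbance_rejection(err, rate):
            # crude rule: push against attitude error and body rate, clipped to [-1, 1]
            return max(-1.0, min(1.0, -(0.8 * err + 0.5 * rate)))

        def fuel_saving(err, rate):
            # deadband rule: do nothing for small errors
            return 0.0 if abs(err) < 0.05 else -0.3 * err

        PACKAGES = [(disturbance_rejection, 1.0), (fuel_saving, 0.6)]   # (package, weight)

        def thruster_command(err, rate, on_threshold=0.2):
            blended = (sum(w * pkg(err, rate) for pkg, w in PACKAGES)
                       / sum(w for _, w in PACKAGES))
            if blended > on_threshold:
                return +1     # fire positive thruster
            if blended < -on_threshold:
                return -1     # fire negative thruster
            return 0          # thrusters off

        print(thruster_command(err=0.3, rate=0.1))   # -1: counteract the error
        print(thruster_command(err=0.01, rate=0.0))  # 0: inside the deadband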

  13. VIP: A knowledge-based design aid for the engineering of space systems

    NASA Technical Reports Server (NTRS)

    Lewis, Steven M.; Bellman, Kirstie L.

    1990-01-01

    The Vehicles Implementation Project (VIP), a knowledge-based design aid for the engineering of space systems is described. VIP combines qualitative knowledge in the form of rules, quantitative knowledge in the form of equations, and other mathematical modeling tools. The system allows users rapidly to develop and experiment with models of spacecraft system designs. As information becomes available to the system, appropriate equations are solved symbolically and the results are displayed. Users may browse through the system, observing dependencies and the effects of altering specific parameters. The system can also suggest approaches to the derivation of specific parameter values. In addition to providing a tool for the development of specific designs, VIP aims at increasing the user's understanding of the design process. Users may rapidly examine the sensitivity of a given parameter to others in the system and perform tradeoffs or optimizations of specific parameters. A second major goal of VIP is to integrate the existing corporate knowledge base of models and rules into a central, symbolic form.
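
    The pattern of solving model equations symbolically as parameter values become available can be sketched with SymPy; the rocket equation below is merely a stand-in relation chosen for illustration and is not taken from VIP.

        # Sketch: solve a design relation symbolically once enough parameters are known.
        import sympy as sp

        dv, isp, g0, m0, mf = sp.symbols('delta_v I_sp g_0 m_0 m_f', positive=True)
        rocket_eq = sp.Eq(dv, isp * g0 * sp.log(m0 / mf))

        # With delta_v, I_sp, g_0 and m_0 supplied, solve for the final mass m_f.
        known = {dv: 3000.0, isp: 300.0, g0: 9.81, m0: 1000.0}
        solution = sp.solve(rocket_eq.subs(known), mf)
        print(solution)   # the value of m_f consistent with the other parameters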

  14. How should we build a generic open-source water management simulator?

    NASA Astrophysics Data System (ADS)

    Khadem, M.; Meier, P.; Rheinheimer, D. E.; Padula, S.; Matrosov, E.; Selby, P. D.; Knox, S.; Harou, J. J.

    2014-12-01

    Increasing water needs for agriculture, industry and cities mean effective and flexible water resource system management tools will remain in high demand. Currently many regions or countries use simulators that have been adapted over time to their unique system properties and water management rules and realities. Most regions operate with a preferred short-list of water management and planning decision support systems. Is there scope for a simulator, shared within the water management community, that could be adapted to different contexts, integrate community contributions, and connect to generic data and model management software? What role could open-source play in such a project? How could a generic user interface and data/model management software sustainably be attached to this model or suite of models? Finally, how could such a system effectively leverage existing model formulations, modeling technologies and software? These questions are addressed by the initial work presented here. We introduce a generic water resource simulation formulation that enables and integrates both rule-based and optimization-driven technologies. We suggest how it could be linked to other sub-models allowing for detailed agent-based simulation of water management behaviours. An early formulation is applied as an example to the Thames water resource system in the UK. The model uses centralised optimisation to calculate allocations but also allows for rule-based operations, in an effort to represent observed behaviours and rules with fidelity. The model is linked through import/export commands to a generic network model platform named Hydra. Benefits and limitations of the approach are discussed, and planned work and potential use cases are outlined.
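
    A toy illustration of the centralised-allocation step is sketched below, assuming a single supply constraint and weighted demand priorities. The node data, weights, and supply figure are invented; the actual formulation linked to Hydra is considerably richer and also admits rule-based overrides.

        # Toy allocation step: maximize weighted deliveries subject to a supply limit.
        from scipy.optimize import linprog

        demands = [30.0, 50.0, 40.0]      # demand caps per node (e.g., Ml/day)
        weights = [3.0, 2.0, 1.0]         # allocation priorities
        supply = 80.0                     # available release this time step

        # linprog minimizes, so negate the weights to maximize weighted deliveries.
        res = linprog(
            c=[-w for w in weights],
            A_ub=[[1.0, 1.0, 1.0]], b_ub=[supply],
            bounds=[(0.0, d) for d in demands],
            method="highs",
        )
        print(res.x)   # highest-priority demands are met first, e.g. [30, 50, 0]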

  15. Automated Title Page Cataloging: A Feasibility Study.

    ERIC Educational Resources Information Center

    Weibel, Stuart; And Others

    1989-01-01

    Describes the design of a prototype rule-based system for the automation of descriptive cataloging from title pages. The discussion covers the results of tests of the prototype, major impediments to automatic cataloging from title pages, and prospects for further progress. The rules implemented in the prototype are appended. (16 references)…
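
    As a rough illustration of what pattern-based cataloging rules look like, the sketch below applies simple positional and textual heuristics to OCR'd title-page lines. The heuristics and the sample page are hypothetical and are not the rules appended to the report.

        # Illustrative title-page cataloging rules (hypothetical patterns).
        import re

        def catalog_title_page(lines):
            record = {"title": None, "author": None, "imprint": None}
            for line in lines:
                text = line.strip()
                if record["author"] is None and re.match(r"(?i)^by\s+", text):
                    record["author"] = re.sub(r"(?i)^by\s+", "", text)
                elif re.search(r"\b(18|19|20)\d{2}\b", text):
                    record["imprint"] = text      # a line containing a year -> imprint
                elif record["title"] is None and text:
                    record["title"] = text        # first substantive line -> title
            return record

        page = ["A Treatise on Rule-Based Cataloging", "by Jane Q. Librarian",
                "Dublin, Ohio: Example Press, 1989"]
        print(catalog_title_page(page))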

  16. A General Design Rule to Manipulate Photocarrier Transport Path in Solar Cells and Its Realization by the Plasmonic-Electrical Effect

    NASA Astrophysics Data System (ADS)

    Sha, Wei E. I.; Zhu, Hugh L.; Chen, Luzhou; Chew, Weng Cho; Choy, Wallace C. H.

    2015-02-01

    It is well known that the transport paths of photocarriers (electrons and holes) before they are collected by electrodes strongly affect bulk recombination and thus the electrical properties of solar cells, including open-circuit voltage and fill factor. To boost device performance, a general design rule, tailored to an arbitrary electron-to-hole mobility ratio, is proposed for deciding the transport paths of photocarriers. Owing to its unique ability to localize and concentrate light, plasmonics is explored to manipulate photocarrier transport by spatially redistributing light absorption in the active layer of devices. Without changing the active materials, we conceive a plasmonic-electrical concept, which tunes the electrical properties of solar cells via the plasmon-modified optical field distribution, to realize the design rule. Incorporating spectrally and spatially configurable metallic nanostructures, thin-film solar cells are theoretically modelled and experimentally fabricated to validate the design rule and verify the plasmonic-tunable electrical properties. The general design rule, together with the plasmonic-electrical effect, contributes to the evolution of emerging photovoltaics.
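
    The intuition behind such a design rule can be conveyed with a back-of-the-envelope drift transit-time comparison under a uniform-field approximation; the layer thickness, mobilities, and bias below are assumed values, and this sketch is not the paper's device model.

        # Rough sketch: carriers generated at depth x in a layer of thickness d under
        # bias V have drift transit times of roughly t = path / (mu * E), with E = V/d.
        # Concentrating absorption where the slower carrier has the shorter path is the
        # idea behind steering transport paths.

        def transit_times(x, d, mu_e, mu_h, V):
            E = V / d                            # uniform-field approximation
            t_electron = (d - x) / (mu_e * E)    # electrons drift toward the cathode at depth d
            t_hole = x / (mu_h * E)              # holes drift toward the anode at depth 0
            return t_electron, t_hole

        d = 100e-9                # 100 nm active layer (assumed)
        mu_e, mu_h = 1e-7, 1e-8   # hypothetical mobilities, m^2/(V*s); holes 10x slower
        V = 0.6                   # assumed internal bias, V

        for x in (0.2 * d, 0.8 * d):   # absorption concentrated near anode vs cathode
            te, th = transit_times(x, d, mu_e, mu_h, V)
            print(f"x = {x/d:.1f}d: t_e = {te:.2e} s, t_h = {th:.2e} s")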

  17. On a Formal Tool for Reasoning About Flight Software Cost Analysis

    NASA Technical Reports Server (NTRS)

    Spagnuolo, John N., Jr.; Stukes, Sherry A.

    2013-01-01

    A report focuses on the development of flight software (FSW) cost estimates for 16 Discovery-class missions at JPL. The techniques and procedures developed enabled streamlining of the FSW analysis process, and provided instantaneous confirmation that the data and processes used for these estimates were consistent across all missions. The research provides direction as to how to build a prototype rule-based system for FSW cost estimation that would provide (1) FSW cost estimates, (2) explanation of how the estimates were arrived at, (3) mapping of costs, (4) mathematical trend charts with explanations of why the trends are what they are, (5) tables with ancillary FSW data of interest to analysts, (6) a facility for expert modification/enhancement of the rules, and (7) a basis for conceptually convenient expansion into more complex, useful, and general rule-based systems.
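
    A conceptual sketch of items (1), (2), and (6) of such a system follows: rules adjust a baseline estimate, record an explanation, and remain open to modification. The rule list, adjustment factors, and dollar figures are hypothetical placeholders, not JPL data.

        # Conceptual sketch of a rule-based cost estimator with explanations (notional).
        RULES = [
            ("new processor architecture", lambda m: m.get("new_architecture", False), 1.30),
            ("high software heritage",     lambda m: m.get("heritage_pct", 0) > 70,    0.80),
            ("instrument count > 3",       lambda m: m.get("instruments", 0) > 3,      1.15),
        ]

        def estimate_fsw_cost(mission, baseline_cost=10.0):
            cost, explanation = baseline_cost, []
            for name, condition, factor in RULES:
                if condition(mission):
                    cost *= factor
                    explanation.append(f"{name}: x{factor}")
            return cost, explanation

        cost, why = estimate_fsw_cost({"heritage_pct": 85, "instruments": 4})
        print(f"estimate = {cost:.2f} ($M, notional)", why)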

  18. Application of decision rules for empowering of Indonesian telematics services SMEs

    NASA Astrophysics Data System (ADS)

    Tosida, E. T.; Hairlangga, O.; Amirudin, F.; Ridwanah, M.

    2018-03-01

    The independence of the telematics sector is one of Indonesia's visions for 2024. One way to pursue it is to empower SMEs in the telematics field. Such empowerment requires a practical, data-centered mechanism, for example one built on the National Economic Census database (Susenas). From Susenas data, decision rules can be formulated for determining the provision of assistance to telematics SMEs; this was done by generating a rule base through classification techniques. The decision-rule model based on the CART algorithm performs better than the C4.5 and ID3 models. The model's high performance is also in line with the regulations applied by the government, which is a strength of this research, because the resulting model is consistent with existing conditions in Indonesia. The rule bases generated by the three classification techniques differ, and the CART rules match the activities realized by the Ministry of Cooperatives and SMEs. So far, the government has had difficulty in referencing data related to the empowerment of telematics services SMEs. Therefore, the findings of this research can be used as an alternative decision support system for programs that empower SMEs in telematics.
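
    The rule-generation step can be sketched with scikit-learn's CART-style decision tree, which can be read off as human-auditable IF-THEN rules; the features, labels, and records below are synthetic stand-ins for Susenas data.

        # Sketch: derive CART-style decision rules from survey-like records (toy data).
        from sklearn.tree import DecisionTreeClassifier, export_text

        # features: [annual_revenue (million IDR), employees, has_internet (0/1)]
        X = [[50, 2, 0], [300, 8, 1], [120, 4, 1], [20, 1, 0], [500, 15, 1], [80, 3, 0]]
        y = [1, 0, 1, 1, 0, 1]   # 1 = eligible for assistance, 0 = not (toy labels)

        cart = DecisionTreeClassifier(criterion="gini", max_depth=2, random_state=0)
        cart.fit(X, y)

        # Print the learned tree as IF-THEN style rules.
        print(export_text(cart, feature_names=["revenue", "employees", "has_internet"]))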

  19. Stem cell transplantation as a dynamical system: are clinical outcomes deterministic?

    PubMed

    Toor, Amir A; Kobulnicky, Jared D; Salman, Salman; Roberts, Catherine H; Jameson-Lee, Max; Meier, Jeremy; Scalora, Allison; Sheth, Nihar; Koparde, Vishal; Serrano, Myrna; Buck, Gregory A; Clark, William B; McCarty, John M; Chung, Harold M; Manjili, Masoud H; Sabo, Roy T; Neale, Michael C

    2014-01-01

    Outcomes in stem cell transplantation (SCT) are modeled using probability theory. However, the clinical course following SCT appears to demonstrate many characteristics of dynamical systems, especially when outcomes are considered in the context of immune reconstitution. Dynamical systems tend to evolve over time according to mathematically determined rules. Characteristically, the future states of the system are predicated on the states preceding them, and there is sensitivity to initial conditions. In SCT, the interaction between donor T cells and the recipient may be considered as such a system in which, graft source, conditioning, and early immunosuppression profoundly influence immune reconstitution over time. This eventually determines clinical outcomes, either the emergence of tolerance or the development of graft versus host disease. In this paper, parallels between SCT and dynamical systems are explored and a conceptual framework for developing mathematical models to understand disparate transplant outcomes is proposed.

  20. Stem Cell Transplantation as a Dynamical System: Are Clinical Outcomes Deterministic?

    PubMed Central

    Toor, Amir A.; Kobulnicky, Jared D.; Salman, Salman; Roberts, Catherine H.; Jameson-Lee, Max; Meier, Jeremy; Scalora, Allison; Sheth, Nihar; Koparde, Vishal; Serrano, Myrna; Buck, Gregory A.; Clark, William B.; McCarty, John M.; Chung, Harold M.; Manjili, Masoud H.; Sabo, Roy T.; Neale, Michael C.

    2014-01-01

    Outcomes in stem cell transplantation (SCT) are modeled using probability theory. However, the clinical course following SCT appears to demonstrate many characteristics of dynamical systems, especially when outcomes are considered in the context of immune reconstitution. Dynamical systems tend to evolve over time according to mathematically determined rules. Characteristically, the future states of the system are predicated on the states preceding them, and there is sensitivity to initial conditions. In SCT, the interaction between donor T cells and the recipient may be considered as such a system in which, graft source, conditioning, and early immunosuppression profoundly influence immune reconstitution over time. This eventually determines clinical outcomes, either the emergence of tolerance or the development of graft versus host disease. In this paper, parallels between SCT and dynamical systems are explored and a conceptual framework for developing mathematical models to understand disparate transplant outcomes is proposed. PMID:25520720
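
    The two dynamical-systems properties emphasized in the records above, deterministic evolution from preceding states and sensitivity to initial conditions, can be illustrated generically with a logistic-map update rule; this is a stand-in example, not a model of transplantation.

        # Generic illustration: a deterministic update rule whose trajectories from
        # nearly identical starting points diverge over time.

        def iterate(x0, r=3.9, steps=25):
            x, traj = x0, []
            for _ in range(steps):
                x = r * x * (1 - x)          # logistic map as a stand-in update rule
                traj.append(x)
            return traj

        a = iterate(0.200000)
        b = iterate(0.200001)                # initial conditions differing by 1e-6
        for step in (5, 15, 24):
            print(step, abs(a[step] - b[step]))   # the gap grows by orders of magnitude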
