The Evolvement of Automobile Steering System Based on TRIZ
NASA Astrophysics Data System (ADS)
Zhao, Xinjun; Zhang, Shuang
Products and techniques pass through a process of birth, growth, maturity, death and quit the stage like biological evolution process. The developments of products and techniques conform to some evolvement rules. If people know and hold these rules, they can design new kind of products and forecast the develop trends of the products. Thereby, enterprises can grasp the future technique directions of products, and make product and technique innovation. Below, based on TRIZ theory, the mechanism evolvement, the function evolvement and the appearance evolvement of automobile steering system had been analyzed and put forward some new ideas about future automobile steering system.
Evolving fuzzy rules in a learning classifier system
NASA Technical Reports Server (NTRS)
Valenzuela-Rendon, Manuel
1993-01-01
The fuzzy classifier system (FCS) combines the ideas of fuzzy logic controllers (FLC's) and learning classifier systems (LCS's). It brings together the expressive powers of fuzzy logic as it has been applied in fuzzy controllers to express relations between continuous variables, and the ability of LCS's to evolve co-adapted sets of rules. The goal of the FCS is to develop a rule-based system capable of learning in a reinforcement regime, and that can potentially be used for process control.
eFSM--a novel online neural-fuzzy semantic memory model.
Tung, Whye Loon; Quek, Chai
2010-01-01
Fuzzy rule-based systems (FRBSs) have been successfully applied to many areas. However, traditional fuzzy systems are often manually crafted, and their rule bases that represent the acquired knowledge are static and cannot be trained to improve the modeling performance. This subsequently leads to intensive research on the autonomous construction and tuning of a fuzzy system directly from the observed training data to address the knowledge acquisition bottleneck, resulting in well-established hybrids such as neural-fuzzy systems (NFSs) and genetic fuzzy systems (GFSs). However, the complex and dynamic nature of real-world problems demands that fuzzy rule-based systems and models be able to adapt their parameters and ultimately evolve their rule bases to address the nonstationary (time-varying) characteristics of their operating environments. Recently, considerable research efforts have been directed to the study of evolving Tagaki-Sugeno (T-S)-type NFSs based on the concept of incremental learning. In contrast, there are very few incremental learning Mamdani-type NFSs reported in the literature. Hence, this paper presents the evolving neural-fuzzy semantic memory (eFSM) model, a neural-fuzzy Mamdani architecture with a data-driven progressively adaptive structure (i.e., rule base) based on incremental learning. Issues related to the incremental learning of the eFSM rule base are carefully investigated, and a novel parameter learning approach is proposed for the tuning of the fuzzy set parameters in eFSM. The proposed eFSM model elicits highly interpretable semantic knowledge in the form of Mamdani-type if-then fuzzy rules from low-level numeric training data. These Mamdani fuzzy rules define the computing structure of eFSM and are incrementally learned with the arrival of each training data sample. New rules are constructed from the emergence of novel training data and obsolete fuzzy rules that no longer describe the recently observed data trends are pruned. This enables eFSM to maintain a current and compact set of Mamdani-type if-then fuzzy rules that collectively generalizes and describes the salient associative mappings between the inputs and outputs of the underlying process being modeled. The learning and modeling performances of the proposed eFSM are evaluated using several benchmark applications and the results are encouraging.
A Hybrid Genetic Programming Algorithm for Automated Design of Dispatching Rules.
Nguyen, Su; Mei, Yi; Xue, Bing; Zhang, Mengjie
2018-06-04
Designing effective dispatching rules for production systems is a difficult and timeconsuming task if it is done manually. In the last decade, the growth of computing power, advanced machine learning, and optimisation techniques has made the automated design of dispatching rules possible and automatically discovered rules are competitive or outperform existing rules developed by researchers. Genetic programming is one of the most popular approaches to discovering dispatching rules in the literature, especially for complex production systems. However, the large heuristic search space may restrict genetic programming from finding near optimal dispatching rules. This paper develops a new hybrid genetic programming algorithm for dynamic job shop scheduling based on a new representation, a new local search heuristic, and efficient fitness evaluators. Experiments show that the new method is effective regarding the quality of evolved rules. Moreover, evolved rules are also significantly smaller and contain more relevant attributes.
Automating the design of scientific computing software
NASA Technical Reports Server (NTRS)
Kant, Elaine
1992-01-01
SINAPSE is a domain-specific software design system that generates code from specifications of equations and algorithm methods. This paper describes the system's design techniques (planning in a space of knowledge-based refinement and optimization rules), user interaction style (user has option to control decision making), and representation of knowledge (rules and objects). It also summarizes how the system knowledge has evolved over time and suggests some issues in building software design systems to facilitate reuse.
Evolving rule-based systems in two medical domains using genetic programming.
Tsakonas, Athanasios; Dounias, Georgios; Jantzen, Jan; Axer, Hubertus; Bjerregaard, Beth; von Keyserlingk, Diedrich Graf
2004-11-01
To demonstrate and compare the application of different genetic programming (GP) based intelligent methodologies for the construction of rule-based systems in two medical domains: the diagnosis of aphasia's subtypes and the classification of pap-smear examinations. Past data representing (a) successful diagnosis of aphasia's subtypes from collaborating medical experts through a free interview per patient, and (b) correctly classified smears (images of cells) by cyto-technologists, previously stained using the Papanicolaou method. Initially a hybrid approach is proposed, which combines standard genetic programming and heuristic hierarchical crisp rule-base construction. Then, genetic programming for the production of crisp rule based systems is attempted. Finally, another hybrid intelligent model is composed by a grammar driven genetic programming system for the generation of fuzzy rule-based systems. Results denote the effectiveness of the proposed systems, while they are also compared for their efficiency, accuracy and comprehensibility, to those of an inductive machine learning approach as well as to those of a standard genetic programming symbolic expression approach. The proposed GP-based intelligent methodologies are able to produce accurate and comprehensible results for medical experts performing competitive to other intelligent approaches. The aim of the authors was the production of accurate but also sensible decision rules that could potentially help medical doctors to extract conclusions, even at the expense of a higher classification score achievement.
Tightening the Iron Cage: Concertive Control in Self-Managing Teams.
ERIC Educational Resources Information Center
Barker, James R.
1993-01-01
Describes how an (industrial) organization's control system evolved in response to a managerial change from hierarchical, bureaucratic control to concertive control via self-management teams. The organization's members developed a system of value-based normative rules that controlled their actions more powerfully and completely than did the former…
The Evolution of Sonic Ecosystems
NASA Astrophysics Data System (ADS)
McCormack, Jon
This chapter describes a novel type of artistic artificial life software environment. Agents that have the ability to make and listen to sound populate a synthetic world. An evolvable, rule-based classifier system drives agent behavior. Agents compete for limited resources in a virtual environment that is influenced by the presence and movement of people observing the system. Electronic sensors create a link between the real and virtual spaces, virtual agents evolve implicitly to try to maintain the interest of the human audience, whose presence provides them with life-sustaining food.
RANWAR: rank-based weighted association rule mining from gene expression and methylation data.
Mallik, Saurav; Mukhopadhyay, Anirban; Maulik, Ujjwal
2015-01-01
Ranking of association rules is currently an interesting topic in data mining and bioinformatics. The huge number of evolved rules of items (or, genes) by association rule mining (ARM) algorithms makes confusion to the decision maker. In this article, we propose a weighted rule-mining technique (say, RANWAR or rank-based weighted association rule-mining) to rank the rules using two novel rule-interestingness measures, viz., rank-based weighted condensed support (wcs) and weighted condensed confidence (wcc) measures to bypass the problem. These measures are basically depended on the rank of items (genes). Using the rank, we assign weight to each item. RANWAR generates much less number of frequent itemsets than the state-of-the-art association rule mining algorithms. Thus, it saves time of execution of the algorithm. We run RANWAR on gene expression and methylation datasets. The genes of the top rules are biologically validated by Gene Ontologies (GOs) and KEGG pathway analyses. Many top ranked rules extracted from RANWAR that hold poor ranks in traditional Apriori, are highly biologically significant to the related diseases. Finally, the top rules evolved from RANWAR, that are not in Apriori, are reported.
An Interval Type-2 Neural Fuzzy System for Online System Identification and Feature Elimination.
Lin, Chin-Teng; Pal, Nikhil R; Wu, Shang-Lin; Liu, Yu-Ting; Lin, Yang-Yin
2015-07-01
We propose an integrated mechanism for discarding derogatory features and extraction of fuzzy rules based on an interval type-2 neural fuzzy system (NFS)-in fact, it is a more general scheme that can discard bad features, irrelevant antecedent clauses, and even irrelevant rules. High-dimensional input variable and a large number of rules not only enhance the computational complexity of NFSs but also reduce their interpretability. Therefore, a mechanism for simultaneous extraction of fuzzy rules and reducing the impact of (or eliminating) the inferior features is necessary. The proposed approach, namely an interval type-2 Neural Fuzzy System for online System Identification and Feature Elimination (IT2NFS-SIFE), uses type-2 fuzzy sets to model uncertainties associated with information and data in designing the knowledge base. The consequent part of the IT2NFS-SIFE is of Takagi-Sugeno-Kang type with interval weights. The IT2NFS-SIFE possesses a self-evolving property that can automatically generate fuzzy rules. The poor features can be discarded through the concept of a membership modulator. The antecedent and modulator weights are learned using a gradient descent algorithm. The consequent part weights are tuned via the rule-ordered Kalman filter algorithm to enhance learning effectiveness. Simulation results show that IT2NFS-SIFE not only simplifies the system architecture by eliminating derogatory/irrelevant antecedent clauses, rules, and features but also maintains excellent performance.
Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra
2015-01-01
Microarray and beadchip are two most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining) to identify special type of rules and potential biomarkers using integrated approaches of statistical and binary inclusion-maximal biclustering techniques from the biological datasets. At first, a novel statistical strategy has been utilized to eliminate the insignificant/low-significant/redundant genes in such way that significance level must satisfy the data distribution property (viz., either normal distribution or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. Corresponding special type of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than the other rule mining algorithms as it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time, and can work on big dataset. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using David database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to know how much the evolved rules are able to describe accurately the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy, and other related factors with other rule-based classifiers. Statistical significance tests are also performed for verifying the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers is also starting with the same post-discretized data-matrix. Finally, we have also included the integrated analysis of gene expression and methylation for determining epigenetic effect (viz., effect of methylation) on gene expression level. PMID:25830807
Maulik, Ujjwal; Mallik, Saurav; Mukhopadhyay, Anirban; Bandyopadhyay, Sanghamitra
2015-01-01
Microarray and beadchip are two most efficient techniques for measuring gene expression and methylation data in bioinformatics. Biclustering deals with the simultaneous clustering of genes and samples. In this article, we propose a computational rule mining framework, StatBicRM (i.e., statistical biclustering-based rule mining) to identify special type of rules and potential biomarkers using integrated approaches of statistical and binary inclusion-maximal biclustering techniques from the biological datasets. At first, a novel statistical strategy has been utilized to eliminate the insignificant/low-significant/redundant genes in such way that significance level must satisfy the data distribution property (viz., either normal distribution or non-normal distribution). The data is then discretized and post-discretized, consecutively. Thereafter, the biclustering technique is applied to identify maximal frequent closed homogeneous itemsets. Corresponding special type of rules are then extracted from the selected itemsets. Our proposed rule mining method performs better than the other rule mining algorithms as it generates maximal frequent closed homogeneous itemsets instead of frequent itemsets. Thus, it saves elapsed time, and can work on big dataset. Pathway and Gene Ontology analyses are conducted on the genes of the evolved rules using David database. Frequency analysis of the genes appearing in the evolved rules is performed to determine potential biomarkers. Furthermore, we also classify the data to know how much the evolved rules are able to describe accurately the remaining test (unknown) data. Subsequently, we also compare the average classification accuracy, and other related factors with other rule-based classifiers. Statistical significance tests are also performed for verifying the statistical relevance of the comparative results. Here, each of the other rule mining methods or rule-based classifiers is also starting with the same post-discretized data-matrix. Finally, we have also included the integrated analysis of gene expression and methylation for determining epigenetic effect (viz., effect of methylation) on gene expression level.
Nguyen, Su; Zhang, Mengjie; Tan, Kay Chen
2017-09-01
Automated design of dispatching rules for production systems has been an interesting research topic over the last several years. Machine learning, especially genetic programming (GP), has been a powerful approach to dealing with this design problem. However, intensive computational requirements, accuracy and interpretability are still its limitations. This paper aims at developing a new surrogate assisted GP to help improving the quality of the evolved rules without significant computational costs. The experiments have verified the effectiveness and efficiency of the proposed algorithms as compared to those in the literature. Furthermore, new simplification and visualisation approaches have also been developed to improve the interpretability of the evolved rules. These approaches have shown great potentials and proved to be a critical part of the automated design system.
Hyper-heuristic Evolution of Dispatching Rules: A Comparison of Rule Representations.
Branke, Jürgen; Hildebrandt, Torsten; Scholz-Reiter, Bernd
2015-01-01
Dispatching rules are frequently used for real-time, online scheduling in complex manufacturing systems. Design of such rules is usually done by experts in a time consuming trial-and-error process. Recently, evolutionary algorithms have been proposed to automate the design process. There are several possibilities to represent rules for this hyper-heuristic search. Because the representation determines the search neighborhood and the complexity of the rules that can be evolved, a suitable choice of representation is key for a successful evolutionary algorithm. In this paper we empirically compare three different representations, both numeric and symbolic, for automated rule design: A linear combination of attributes, a representation based on artificial neural networks, and a tree representation. Using appropriate evolutionary algorithms (CMA-ES for the neural network and linear representations, genetic programming for the tree representation), we empirically investigate the suitability of each representation in a dynamic stochastic job shop scenario. We also examine the robustness of the evolved dispatching rules against variations in the underlying job shop scenario, and visualize what the rules do, in order to get an intuitive understanding of their inner workings. Results indicate that the tree representation using an improved version of genetic programming gives the best results if many candidate rules can be evaluated, closely followed by the neural network representation that already leads to good results for small to moderate computational budgets. The linear representation is found to be competitive only for extremely small computational budgets.
Foundations of Representation: Where Might Graphical Symbol Systems Come from?
ERIC Educational Resources Information Center
Garrod, Simon; Fay, Nicolas; Lee, John; Oberlander, Jon; MacLeod, Tracy
2007-01-01
It has been suggested that iconic graphical signs evolve into symbolic graphical signs through repeated usage. This article reports a series of interactive graphical communication experiments using a "pictionary" task to establish the conditions under which the evolution might occur. Experiment 1 rules out a simple repetition based account in…
Evolution of Collective Behaviour in an Artificial World Using Linguistic Fuzzy Rule-Based Systems
Lebar Bajec, Iztok
2017-01-01
Collective behaviour is a fascinating and easily observable phenomenon, attractive to a wide range of researchers. In biology, computational models have been extensively used to investigate various properties of collective behaviour, such as: transfer of information across the group, benefits of grouping (defence against predation, foraging), group decision-making process, and group behaviour types. The question ‘why,’ however remains largely unanswered. Here the interest goes into which pressures led to the evolution of such behaviour, and evolutionary computational models have already been used to test various biological hypotheses. Most of these models use genetic algorithms to tune the parameters of previously presented non-evolutionary models, but very few attempt to evolve collective behaviour from scratch. Of these last, the successful attempts display clumping or swarming behaviour. Empirical evidence suggests that in fish schools there exist three classes of behaviour; swarming, milling and polarized. In this paper we present a novel, artificial life-like evolutionary model, where individual agents are governed by linguistic fuzzy rule-based systems, which is capable of evolving all three classes of behaviour. PMID:28045964
Evolution of Collective Behaviour in an Artificial World Using Linguistic Fuzzy Rule-Based Systems.
Demšar, Jure; Lebar Bajec, Iztok
2017-01-01
Collective behaviour is a fascinating and easily observable phenomenon, attractive to a wide range of researchers. In biology, computational models have been extensively used to investigate various properties of collective behaviour, such as: transfer of information across the group, benefits of grouping (defence against predation, foraging), group decision-making process, and group behaviour types. The question 'why,' however remains largely unanswered. Here the interest goes into which pressures led to the evolution of such behaviour, and evolutionary computational models have already been used to test various biological hypotheses. Most of these models use genetic algorithms to tune the parameters of previously presented non-evolutionary models, but very few attempt to evolve collective behaviour from scratch. Of these last, the successful attempts display clumping or swarming behaviour. Empirical evidence suggests that in fish schools there exist three classes of behaviour; swarming, milling and polarized. In this paper we present a novel, artificial life-like evolutionary model, where individual agents are governed by linguistic fuzzy rule-based systems, which is capable of evolving all three classes of behaviour.
Community detection in complex networks by using membrane algorithm
NASA Astrophysics Data System (ADS)
Liu, Chuang; Fan, Linan; Liu, Zhou; Dai, Xiang; Xu, Jiamei; Chang, Baoren
Community detection in complex networks is a key problem of network analysis. In this paper, a new membrane algorithm is proposed to solve the community detection in complex networks. The proposed algorithm is based on membrane systems, which consists of objects, reaction rules, and a membrane structure. Each object represents a candidate partition of a complex network, and the quality of objects is evaluated according to network modularity. The reaction rules include evolutionary rules and communication rules. Evolutionary rules are responsible for improving the quality of objects, which employ the differential evolutionary algorithm to evolve objects. Communication rules implement the information exchanged among membranes. Finally, the proposed algorithm is evaluated on synthetic, real-world networks with real partitions known and the large-scaled networks with real partitions unknown. The experimental results indicate the superior performance of the proposed algorithm in comparison with other experimental algorithms.
NASA Astrophysics Data System (ADS)
Manteiga, M.; Carricajo, I.; Rodríguez, A.; Dafonte, C.; Arcay, B.
2009-02-01
Astrophysics is evolving toward a more rational use of costly observational data by intelligently exploiting the large terrestrial and spatial astronomical databases. In this paper, we present a study showing the suitability of an expert system to perform the classification of stellar spectra in the Morgan and Keenan (MK) system. Using the formalism of artificial intelligence for the development of such a system, we propose a rules' base that contains classification criteria and confidence grades, all integrated in an inference engine that emulates human reasoning by means of a hierarchical decision rules tree that also considers the uncertainty factors associated with rules. Our main objective is to illustrate the formulation and development of such a system for an astrophysical classification problem. An extensive spectral database of MK standard spectra has been collected and used as a reference to determine the spectral indexes that are suitable for classification in the MK system. It is shown that by considering 30 spectral indexes and associating them with uncertainty factors, we can find an accurate diagnose in MK types of a particular spectrum. The system was evaluated against the NOAO-INDO-US spectral catalog.
System and method for embedding emotion in logic systems
NASA Technical Reports Server (NTRS)
Curtis, Steven A. (Inventor)
2012-01-01
A system, method, and computer readable-media for creating a stable synthetic neural system. The method includes training an intellectual choice-driven synthetic neural system (SNS), training an emotional rule-driven SNS by generating emotions from rules, incorporating the rule-driven SNS into the choice-driven SNS through an evolvable interface, and balancing the emotional SNS and the intellectual SNS to achieve stability in a nontrivial autonomous environment with a Stability Algorithm for Neural Entities (SANE). Generating emotions from rules can include coding the rules into the rule-driven SNS in a self-consistent way. Training the emotional rule-driven SNS can occur during a training stage in parallel with training the choice-driven SNS. The training stage can include a self assessment loop which measures performance characteristics of the rule-driven SNS against core genetic code. The method uses a stability threshold to measure stability of the incorporated rule-driven SNS and choice-driven SNS using SANE.
Evolving Agents: Communication and Cognition
2005-06-01
systems [11] and the first Chomsky ideas concerning mechanisms of language grammar related to deep structure [12] encountered CC of rules. Model-based...Perennial (2000) 3. Jackendoff, R.: Foundations of Language: Brain, Meaning, Grammar , Evolution. Oxford University Press, New York, NY (2002) 4. Pinker, S... University Press, Princeton, NJ (1961) 11. Minsky, M.L.: Semantic Information Processing. The MIT Press, Cambridge, MA (1968) 12. Chomsky , N
Mallik, Saurav; Zhao, Zhongming
2017-12-28
For transcriptomic analysis, there are numerous microarray-based genomic data, especially those generated for cancer research. The typical analysis measures the difference between a cancer sample-group and a matched control group for each transcript or gene. Association rule mining is used to discover interesting item sets through rule-based methodology. Thus, it has advantages to find causal effect relationships between the transcripts. In this work, we introduce two new rule-based similarity measures-weighted rank-based Jaccard and Cosine measures-and then propose a novel computational framework to detect condensed gene co-expression modules ( C o n G E M s) through the association rule-based learning system and the weighted similarity scores. In practice, the list of evolved condensed markers that consists of both singular and complex markers in nature depends on the corresponding condensed gene sets in either antecedent or consequent of the rules of the resultant modules. In our evaluation, these markers could be supported by literature evidence, KEGG (Kyoto Encyclopedia of Genes and Genomes) pathway and Gene Ontology annotations. Specifically, we preliminarily identified differentially expressed genes using an empirical Bayes test. A recently developed algorithm-RANWAR-was then utilized to determine the association rules from these genes. Based on that, we computed the integrated similarity scores of these rule-based similarity measures between each rule-pair, and the resultant scores were used for clustering to identify the co-expressed rule-modules. We applied our method to a gene expression dataset for lung squamous cell carcinoma and a genome methylation dataset for uterine cervical carcinogenesis. Our proposed module discovery method produced better results than the traditional gene-module discovery measures. In summary, our proposed rule-based method is useful for exploring biomarker modules from transcriptomic data.
Tang, Jinjun; Zou, Yajie; Ash, John; Zhang, Shen; Liu, Fang; Wang, Yinhai
2016-01-01
Travel time is an important measurement used to evaluate the extent of congestion within road networks. This paper presents a new method to estimate the travel time based on an evolving fuzzy neural inference system. The input variables in the system are traffic flow data (volume, occupancy, and speed) collected from loop detectors located at points both upstream and downstream of a given link, and the output variable is the link travel time. A first order Takagi-Sugeno fuzzy rule set is used to complete the inference. For training the evolving fuzzy neural network (EFNN), two learning processes are proposed: (1) a K-means method is employed to partition input samples into different clusters, and a Gaussian fuzzy membership function is designed for each cluster to measure the membership degree of samples to the cluster centers. As the number of input samples increases, the cluster centers are modified and membership functions are also updated; (2) a weighted recursive least squares estimator is used to optimize the parameters of the linear functions in the Takagi-Sugeno type fuzzy rules. Testing datasets consisting of actual and simulated data are used to test the proposed method. Three common criteria including mean absolute error (MAE), root mean square error (RMSE), and mean absolute relative error (MARE) are utilized to evaluate the estimation performance. Estimation results demonstrate the accuracy and effectiveness of the EFNN method through comparison with existing methods including: multiple linear regression (MLR), instantaneous model (IM), linear model (LM), neural network (NN), and cumulative plots (CP).
Tang, Jinjun; Zou, Yajie; Ash, John; Zhang, Shen; Liu, Fang; Wang, Yinhai
2016-01-01
Travel time is an important measurement used to evaluate the extent of congestion within road networks. This paper presents a new method to estimate the travel time based on an evolving fuzzy neural inference system. The input variables in the system are traffic flow data (volume, occupancy, and speed) collected from loop detectors located at points both upstream and downstream of a given link, and the output variable is the link travel time. A first order Takagi-Sugeno fuzzy rule set is used to complete the inference. For training the evolving fuzzy neural network (EFNN), two learning processes are proposed: (1) a K-means method is employed to partition input samples into different clusters, and a Gaussian fuzzy membership function is designed for each cluster to measure the membership degree of samples to the cluster centers. As the number of input samples increases, the cluster centers are modified and membership functions are also updated; (2) a weighted recursive least squares estimator is used to optimize the parameters of the linear functions in the Takagi-Sugeno type fuzzy rules. Testing datasets consisting of actual and simulated data are used to test the proposed method. Three common criteria including mean absolute error (MAE), root mean square error (RMSE), and mean absolute relative error (MARE) are utilized to evaluate the estimation performance. Estimation results demonstrate the accuracy and effectiveness of the EFNN method through comparison with existing methods including: multiple linear regression (MLR), instantaneous model (IM), linear model (LM), neural network (NN), and cumulative plots (CP). PMID:26829639
Knowledge extraction from evolving spiking neural networks with rank order population coding.
Soltic, Snjezana; Kasabov, Nikola
2010-12-01
This paper demonstrates how knowledge can be extracted from evolving spiking neural networks with rank order population coding. Knowledge discovery is a very important feature of intelligent systems. Yet, a disproportionally small amount of research is centered on the issue of knowledge extraction from spiking neural networks which are considered to be the third generation of artificial neural networks. The lack of knowledge representation compatibility is becoming a major detriment to end users of these networks. We show that a high-level knowledge can be obtained from evolving spiking neural networks. More specifically, we propose a method for fuzzy rule extraction from an evolving spiking network with rank order population coding. The proposed method was used for knowledge discovery on two benchmark taste recognition problems where the knowledge learnt by an evolving spiking neural network was extracted in the form of zero-order Takagi-Sugeno fuzzy IF-THEN rules.
Genetic learning in rule-based and neural systems
NASA Technical Reports Server (NTRS)
Smith, Robert E.
1993-01-01
The design of neural networks and fuzzy systems can involve complex, nonlinear, and ill-conditioned optimization problems. Often, traditional optimization schemes are inadequate or inapplicable for such tasks. Genetic Algorithms (GA's) are a class of optimization procedures whose mechanics are based on those of natural genetics. Mathematical arguments show how GAs bring substantial computational leverage to search problems, without requiring the mathematical characteristics often necessary for traditional optimization schemes (e.g., modality, continuity, availability of derivative information, etc.). GA's have proven effective in a variety of search tasks that arise in neural networks and fuzzy systems. This presentation begins by introducing the mechanism and theoretical underpinnings of GA's. GA's are then related to a class of rule-based machine learning systems called learning classifier systems (LCS's). An LCS implements a low-level production-system that uses a GA as its primary rule discovery mechanism. This presentation illustrates how, despite its rule-based framework, an LCS can be thought of as a competitive neural network. Neural network simulator code for an LCS is presented. In this context, the GA is doing more than optimizing and objective function. It is searching for an ecology of hidden nodes with limited connectivity. The GA attempts to evolve this ecology such that effective neural network performance results. The GA is particularly well adapted to this task, given its naturally-inspired basis. The LCS/neural network analogy extends itself to other, more traditional neural networks. Conclusions to the presentation discuss the implications of using GA's in ecological search problems that arise in neural and fuzzy systems.
A Rules-Based Service for Suggesting Visualizations to Analyze Earth Science Phenomena.
NASA Astrophysics Data System (ADS)
Prabhu, A.; Zednik, S.; Fox, P. A.; Ramachandran, R.; Maskey, M.; Shie, C. L.; Shen, S.
2016-12-01
Current Earth Science Information Systems lack support for new or interdisciplinary researchers, who may be unfamiliar with the domain vocabulary or the breadth of relevant data available. We need to evolve the current information systems, to reduce the time required for data preparation, processing and analysis. This can be done by effectively salvaging the "dark" resources in Earth Science. We assert that Earth science metadata assets are dark resources, information resources that organizations collect, process, and store for regular business or operational activities but fail to utilize for other purposes. In order to effectively use these dark resources, especially for data processing and visualization, we need a combination of domain, data product and processing knowledge, i.e. a knowledge base from which specific data operations can be performed. In this presentation, we describe a semantic, rules based approach to provide i.e. a service to visualize Earth Science phenomena, based on the data variables extracted using the "dark" metadata resources. We use Jena rules to make assertions about compatibility between a phenomena and various visualizations based on multiple factors. We created separate orthogonal rulesets to map each of these factors to the various phenomena. Some of the factors we have considered include measurements, spatial resolution and time intervals. This approach enables easy additions and deletions based on newly obtained domain knowledge or phenomena related information and thus improving the accuracy of the rules service overall.
Evolving optimised decision rules for intrusion detection using particle swarm paradigm
NASA Astrophysics Data System (ADS)
Sivatha Sindhu, Siva S.; Geetha, S.; Kannan, A.
2012-12-01
The aim of this article is to construct a practical intrusion detection system (IDS) that properly analyses the statistics of network traffic pattern and classify them as normal or anomalous class. The objective of this article is to prove that the choice of effective network traffic features and a proficient machine-learning paradigm enhances the detection accuracy of IDS. In this article, a rule-based approach with a family of six decision tree classifiers, namely Decision Stump, C4.5, Naive Baye's Tree, Random Forest, Random Tree and Representative Tree model to perform the detection of anomalous network pattern is introduced. In particular, the proposed swarm optimisation-based approach selects instances that compose training set and optimised decision tree operate over this trained set producing classification rules with improved coverage, classification capability and generalisation ability. Experiment with the Knowledge Discovery and Data mining (KDD) data set which have information on traffic pattern, during normal and intrusive behaviour shows that the proposed algorithm produces optimised decision rules and outperforms other machine-learning algorithm.
SAMS--a systems architecture for developing intelligent health information systems.
Yılmaz, Özgün; Erdur, Rıza Cenk; Türksever, Mustafa
2013-12-01
In this paper, SAMS, a novel health information system architecture for developing intelligent health information systems is proposed and also some strategies for developing such systems are discussed. The systems fulfilling this architecture will be able to store electronic health records of the patients using OWL ontologies, share patient records among different hospitals and provide physicians expertise to assist them in making decisions. The system is intelligent because it is rule-based, makes use of rule-based reasoning and has the ability to learn and evolve itself. The learning capability is provided by extracting rules from previously given decisions by the physicians and then adding the extracted rules to the system. The proposed system is novel and original in all of these aspects. As a case study, a system is implemented conforming to SAMS architecture for use by dentists in the dental domain. The use of the developed system is described with a scenario. For evaluation, the developed dental information system will be used and tried by a group of dentists. The development of this system proves the applicability of SAMS architecture. By getting decision support from a system derived from this architecture, the cognitive gap between experienced and inexperienced physicians can be compensated. Thus, patient satisfaction can be achieved, inexperienced physicians are supported in decision making and the personnel can improve their knowledge. A physician can diagnose a case, which he/she has never diagnosed before, using this system. With the help of this system, it will be possible to store general domain knowledge in this system and the personnel's need to medical guideline documents will be reduced.
A PC based fault diagnosis expert system
NASA Technical Reports Server (NTRS)
Marsh, Christopher A.
1990-01-01
The Integrated Status Assessment (ISA) prototype expert system performs system level fault diagnosis using rules and models created by the user. The ISA evolved from concepts to a stand-alone demonstration prototype using OPS5 on a LISP Machine. The LISP based prototype was rewritten in C and the C Language Integrated Production System (CLIPS) to run on a Personal Computer (PC) and a graphics workstation. The ISA prototype has been used to demonstrate fault diagnosis functions of Space Station Freedom's Operation Management System (OMS). This paper describes the development of the ISA prototype from early concepts to the current PC/workstation version used today and describes future areas of development for the prototype.
Developing an Intelligent Computer-Aided Trainer
NASA Technical Reports Server (NTRS)
Hua, Grace
1990-01-01
The Payload-assist module Deploys/Intelligent Computer-Aided Training (PD/ICAT) system was developed as a prototype for intelligent tutoring systems with the intention of seeing PD/ICAT evolve and produce a general ICAT architecture and development environment that can be adapted by a wide variety of training tasks. The proposed architecture is composed of a user interface, a domain expert, a training session manager, a trainee model and a training scenario generator. The PD/ICAT prototype was developed in the LISP environment. Although it has been well received by its peers and users, it could not be delivered toe its end users for practical use because of specific hardware and software constraints. To facilitate delivery of PD/ICAT to its users and to prepare for a more widely accepted development and delivery environment for future ICAT applications, we have ported this training system to a UNIX workstation and adopted use of a conventional language, C, and a C-based rule-based language, CLIPS. A rapid conversion of the PD/ICAT expert system to CLIPS was possible because the knowledge was basically represented as a forward chaining rule base. The resulting CLIPS rule base has been tested successfully in other ICATs as well. Therefore, the porting effort has proven to be a positive step toward our ultimate goal of building a general purpose ICAT development environment.
Evolving fuzzy rules for relaxed-criteria negotiation.
Sim, Kwang Mong
2008-12-01
In the literature on automated negotiation, very few negotiation agents are designed with the flexibility to slightly relax their negotiation criteria to reach a consensus more rapidly and with more certainty. Furthermore, these relaxed-criteria negotiation agents were not equipped with the ability to enhance their performance by learning and evolving their relaxed-criteria negotiation rules. The impetus of this work is designing market-driven negotiation agents (MDAs) that not only have the flexibility of relaxing bargaining criteria using fuzzy rules, but can also evolve their structures by learning new relaxed-criteria fuzzy rules to improve their negotiation outcomes as they participate in negotiations in more e-markets. To this end, an evolutionary algorithm for adapting and evolving relaxed-criteria fuzzy rules was developed. Implementing the idea in a testbed, two kinds of experiments for evaluating and comparing EvEMDAs (MDAs with relaxed-criteria rules that are evolved using the evolutionary algorithm) and EMDAs (MDAs with relaxed-criteria rules that are manually constructed) were carried out through stochastic simulations. Empirical results show that: 1) EvEMDAs generally outperformed EMDAs in different types of e-markets and 2) the negotiation outcomes of EvEMDAs generally improved as they negotiated in more e-markets.
Evolving learning rules and emergence of cooperation in spatial prisoner's dilemma.
Moyano, Luis G; Sánchez, Angel
2009-07-07
In the evolutionary Prisoner's dilemma (PD) game, agents play with each other and update their strategies in every generation according to some microscopic dynamical rule. In its spatial version, agents do not play with every other but, instead, interact only with their neighbours, thus mimicking the existing of a social or contact network that defines who interacts with whom. In this work, we explore evolutionary, spatial PD systems consisting of two types of agents, each with a certain update (reproduction, learning) rule. We investigate two different scenarios: in the first case, update rules remain fixed for the entire evolution of the system; in the second case, agents update both strategy and update rule in every generation. We show that in a well-mixed population the evolutionary outcome is always full defection. We subsequently focus on two-strategy competition with nearest-neighbour interactions on the contact network and synchronised update of strategies. Our results show that, for an important range of the parameters of the game, the final state of the system is largely different from that arising from the usual setup of a single, fixed dynamical rule. Furthermore, the results are also very different if update rules are fixed or evolve with the strategies. In these respect, we have studied representative update rules, finding that some of them may become extinct while others prevail. We describe the new and rich variety of final outcomes that arise from this co-evolutionary dynamics. We include examples of other neighbourhoods and asynchronous updating that confirm the robustness of our conclusions. Our results pave the way to an evolutionary rationale for modelling social interactions through game theory with a preferred set of update rules.
Symmetry of interactions rules in incompletely connected random replicator ecosystems.
Kärenlampi, Petri P
2014-06-01
The evolution of an incompletely connected system of species with speciation and extinction is investigated in terms of random replicators. It is found that evolving random replicator systems with speciation do become large and complex, depending on speciation parameters. Antisymmetric interactions result in large systems, whereas systems with symmetric interactions remain small. A co-dominating feature is within-species interaction pressure: large within-species interaction increases species diversity. Average fitness evolves in all systems, however symmetry and connectivity evolve in small systems only. Newcomers get extinct almost immediately in symmetric systems. The distribution in species lifetimes is determined for antisymmetric systems. The replicator systems investigated do not show any sign of self-organized criticality. The generalized Lotka-Volterra system is shown to be a tedious way of implementing the replicator system.
Combining rules, background knowledge and change patterns to maintain semantic annotations.
Cardoso, Silvio Domingos; Chantal, Reynaud-Delaître; Da Silveira, Marcos; Pruski, Cédric
2017-01-01
Knowledge Organization Systems (KOS) play a key role in enriching biomedical information in order to make it machine-understandable and shareable. This is done by annotating medical documents, or more specifically, associating concept labels from KOS with pieces of digital information, e.g., images or texts. However, the dynamic nature of KOS may impact the annotations, thus creating a mismatch between the evolved concept and the associated information. To solve this problem, methods to maintain the quality of the annotations are required. In this paper, we define a framework based on rules, background knowledge and change patterns to drive the annotation adaption process. We evaluate experimentally the proposed approach in realistic cases-studies and demonstrate the overall performance of our approach in different KOS considering the precision, recall, F1-score and AUC value of the system.
Combining rules, background knowledge and change patterns to maintain semantic annotations
Cardoso, Silvio Domingos; Chantal, Reynaud-Delaître; Da Silveira, Marcos; Pruski, Cédric
2017-01-01
Knowledge Organization Systems (KOS) play a key role in enriching biomedical information in order to make it machine-understandable and shareable. This is done by annotating medical documents, or more specifically, associating concept labels from KOS with pieces of digital information, e.g., images or texts. However, the dynamic nature of KOS may impact the annotations, thus creating a mismatch between the evolved concept and the associated information. To solve this problem, methods to maintain the quality of the annotations are required. In this paper, we define a framework based on rules, background knowledge and change patterns to drive the annotation adaption process. We evaluate experimentally the proposed approach in realistic cases-studies and demonstrate the overall performance of our approach in different KOS considering the precision, recall, F1-score and AUC value of the system. PMID:29854115
TROUBLE 3: A fault diagnostic expert system for Space Station Freedom's power system
NASA Technical Reports Server (NTRS)
Manner, David B.
1990-01-01
Designing Space Station Freedom has given NASA many opportunities to develop expert systems that automate onboard operations of space based systems. One such development, TROUBLE 3, an expert system that was designed to automate the fault diagnostics of Space Station Freedom's electric power system is described. TROUBLE 3's design is complicated by the fact that Space Station Freedom's power system is evolving and changing. TROUBLE 3 has to be made flexible enough to handle changes with minimal changes to the program. Three types of expert systems were studied: rule-based, set-covering, and model-based. A set-covering approach was selected for TROUBLE 3 because if offered the needed flexibility that was missing from the other approaches. With this flexibility, TROUBLE 3 is not limited to Space Station Freedom applications, it can easily be adapted to handle any diagnostic system.
A fully-online Neuro-Fuzzy model for flow forecasting in basins with limited data
NASA Astrophysics Data System (ADS)
Ashrafi, Mohammad; Chua, Lloyd Hock Chye; Quek, Chai; Qin, Xiaosheng
2017-02-01
Current state-of-the-art online neuro fuzzy models (NFMs) such as DENFIS (Dynamic Evolving Neural-Fuzzy Inference System) have been used for runoff forecasting. Online NFMs adopt a local learning approach and are able to adapt to changes continuously. The DENFIS model however requires upper/lower bound for normalization and also the number of rules increases monotonically. This requirement makes the model unsuitable for use in basins with limited data, since a priori data is required. In order to address this and other drawbacks of current online models, the Generic Self-Evolving Takagi-Sugeno-Kang (GSETSK) is adopted in this study for forecast applications in basins with limited data. GSETSK is a fully-online NFM which updates its structure and parameters based on the most recent data. The model does not require the need for historical data and adopts clustering and rule pruning techniques to generate a compact and up-to-date rule-base. GSETSK was used in two forecast applications, rainfall-runoff (a catchment in Sweden) and river routing (Lower Mekong River) forecasts. Each of these two applications was studied under two scenarios: (i) there is no prior data, and (ii) only limited data is available (1 year for the Swedish catchment and 1 season for the Mekong River). For the Swedish Basin, GSETSK model results were compared to available results from a calibrated HBV (Hydrologiska Byråns Vattenbalansavdelning) model. For the Mekong River, GSETSK results were compared against the URBS (Unified River Basin Simulator) model. Both comparisons showed that results from GSETSK are comparable with the physically based models, which were calibrated with historical data. Thus, even though GSETSK was trained with a very limited dataset in comparison with HBV or URBS, similar results were achieved. Similarly, further comparisons between GSETSK with DENFIS and the RBF (Radial Basis Function) models highlighted further advantages of GSETSK as having a rule-base (compared to opaque RBF) which is more compact, up-to-date and more easily interpretable.
Systematic Analysis of the Decision Rules of Traditional Chinese Medicine
Bin-Rong, Ma; Xi-Yuan, Jiang; Su-Ming, Liso; Huai-ning, Zhu; Xiu-ru, Lin
1981-01-01
Chinese traditional medicine has evolved over many centuries, and has accumulated a body of observed relationships between symptoms, signs and prognoses, and the efficacy of alternative treatments and prescriptions. With the assistance of a computer-based clinical data base for recording the diagnostic and therapeutic practice of skilled practitioners of Chinese traditional medicine, a systematic program is being conducted to identify and define the clinical decision-making rules that underlie current practice.
Apex predator and the cyclic competition in a rock-paper-scissors game of three species
NASA Astrophysics Data System (ADS)
Souza-Filho, C. A.; Bazeia, D.; Ramos, J. G. G. S.
2017-06-01
This work deals with the effects of an apex predator on the cyclic competition among three distinct species that follow the rules of the rock-paper-scissors game. The investigation develops standard stochastic simulations but is motivated by a procedure which is explained in the work. We add the apex predator as the fourth species in a system that contains three species that evolve following the standard rules of migration, reproduction, and predation, and study how the system evolves in this new environment, in comparison with the case in the absence of the apex predator. The results show that the apex predator engenders the tendency to spread uniformly in the lattice, contributing to destroy the spiral patterns, keeping biodiversity but diminishing the average size of the clusters of the species that compete cyclically.
Lian, Xiao-Xiao; Guo, Xiao-Xia
2018-01-01
To investigate the herbal prescription rules of Professor Jiang Liangduo in the treatment of abdominal mass based on the traditional Chinese medicine inheritance support system software (TCMISS) of version 2.5, find out new herbal formulas for the treatment of abdominal mass, and then provide new reference to its traditional Chinese medicine therapy. By the method of retrospective study, one hundred and thirty-two outpatient prescriptions of Professor Jiang for the treatment of abdominal mass were collected to establish a typical database with TCMISS. Four properties, five tastes, channel tropism, frequency count, Chinese herbal prescriptions rules and the new prescriptions were analyzed so as to dig out the prescription rules. There were 57 herbs with a frequency>=15, and then 91 core combinations of 2-5 herbs were evolved and 9 new prescriptions were created. It was found out that these drugs mainly had the effects of liver nourishing and soothing, soft-moist and dredging-tonifying, supporting right and dispeling evil, cooperating with the method of calming the liver and resolving hard lump according to the actual situation. It reflected the thought of treatment based on syndrome differentiation in TCM, and provided a new reference for its clinical treatment and research. Copyright© by the Chinese Pharmaceutical Association.
A step-by-step introduction to rule-based design of synthetic genetic constructs using GenoCAD.
Wilson, Mandy L; Hertzberg, Russell; Adam, Laura; Peccoud, Jean
2011-01-01
GenoCAD is an open source web-based system that provides a streamlined, rule-driven process for designing genetic sequences. GenoCAD provides a graphical interface that allows users to design sequences consistent with formalized design strategies specific to a domain, organization, or project. Design strategies include limited sets of user-defined parts and rules indicating how these parts are to be combined in genetic constructs. In addition to reducing design time to minutes, GenoCAD improves the quality and reliability of the finished sequence by ensuring that the designs follow established rules of sequence construction. GenoCAD.org is a publicly available instance of GenoCAD that can be found at www.genocad.org. The source code and latest build are available from SourceForge to allow advanced users to install and customize GenoCAD for their unique needs. This chapter focuses primarily on how the GenoCAD tools can be used to organize genetic parts into customized personal libraries, then how these libraries can be used to design sequences. In addition, GenoCAD's parts management system and search capabilities are described in detail. Instructions are provided for installing a local instance of GenoCAD on a server. Some of the future enhancements of this rapidly evolving suite of applications are briefly described. Copyright © 2011 Elsevier Inc. All rights reserved.
The island rule: made to be broken?
Meiri, Shai; Cooper, Natalie; Purvis, Andy
2007-01-01
The island rule is a hypothesis whereby small mammals evolve larger size on islands while large insular mammals dwarf. The rule is believed to emanate from small mammals growing larger to control more resources and enhance metabolic efficiency, while large mammals evolve smaller size to reduce resource requirements and increase reproductive output. We show that there is no evidence for the existence of the island rule when phylogenetic comparative methods are applied to a large, high-quality dataset. Rather, there are just a few clade-specific patterns: carnivores; heteromyid rodents; and artiodactyls typically evolve smaller size on islands whereas murid rodents usually grow larger. The island rule is probably an artefact of comparing distantly related groups showing clade-specific responses to insularity. Instead of a rule, size evolution on islands is likely to be governed by the biotic and abiotic characteristics of different islands, the biology of the species in question and contingency. PMID:17986433
Agent-based modeling of the immune system: NetLogo, a promising framework.
Chiacchio, Ferdinando; Pennisi, Marzio; Russo, Giulia; Motta, Santo; Pappalardo, Francesco
2014-01-01
Several components that interact with each other to evolve a complex, and, in some cases, unexpected behavior, represents one of the main and fascinating features of the mammalian immune system. Agent-based modeling and cellular automata belong to a class of discrete mathematical approaches in which entities (agents) sense local information and undertake actions over time according to predefined rules. The strength of this approach is characterized by the appearance of a global behavior that emerges from interactions among agents. This behavior is unpredictable, as it does not follow linear rules. There are a lot of works that investigates the immune system with agent-based modeling and cellular automata. They have shown the ability to see clearly and intuitively into the nature of immunological processes. NetLogo is a multiagent programming language and modeling environment for simulating complex phenomena. It is designed for both research and education and is used across a wide range of disciplines and education levels. In this paper, we summarize NetLogo applications to immunology and, particularly, how this framework can help in the development and formulation of hypotheses that might drive further experimental investigations of disease mechanisms.
Selection Shapes Transcriptional Logic and Regulatory Specialization in Genetic Networks.
Fogelmark, Karl; Peterson, Carsten; Troein, Carl
2016-01-01
Living organisms need to regulate their gene expression in response to environmental signals and internal cues. This is a computational task where genes act as logic gates that connect to form transcriptional networks, which are shaped at all scales by evolution. Large-scale mutations such as gene duplications and deletions add and remove network components, whereas smaller mutations alter the connections between them. Selection determines what mutations are accepted, but its importance for shaping the resulting networks has been debated. To investigate the effects of selection in the shaping of transcriptional networks, we derive transcriptional logic from a combinatorially powerful yet tractable model of the binding between DNA and transcription factors. By evolving the resulting networks based on their ability to function as either a simple decision system or a circadian clock, we obtain information on the regulation and logic rules encoded in functional transcriptional networks. Comparisons are made between networks evolved for different functions, as well as with structurally equivalent but non-functional (neutrally evolved) networks, and predictions are validated against the transcriptional network of E. coli. We find that the logic rules governing gene expression depend on the function performed by the network. Unlike the decision systems, the circadian clocks show strong cooperative binding and negative regulation, which achieves tight temporal control of gene expression. Furthermore, we find that transcription factors act preferentially as either activators or repressors, both when binding multiple sites for a single target gene and globally in the transcriptional networks. This separation into positive and negative regulators requires gene duplications, which highlights the interplay between mutation and selection in shaping the transcriptional networks.
Biologic History and the Cardinal Rule of Life
NASA Astrophysics Data System (ADS)
Schopf, J. W.
2004-12-01
In broad perspective, the history of life is remarkably static -- once set, a system that has changed little over all of geological time. The basic chemistry of living systems (CHONSP, and the monomers and polymers they compose), the genetics and cellular structure of life, even the ecologic division of the biologic world into "eaters" (heterotrophs) and "eatees" (autotrophs), are innovations all dating from the Archean that have carried over to the present. Throughout Earth history, biology has followed the Cardinal Rule of Life -- avoid change, never evolve at all! Biology maintains the status quo, opportunistically responding only if conditions change. Life's credo might well be "if it ain't broken, don't fix it." Of course, biomolecules do get "broken," by mutations, but living systems have many biochemical repair mechanisms. Evolution is a result of small changes that slip through unfixed. We see the results of evolution in the fossil record only because of the vastness, the true enormity, of geological time. What events punctuated this static underpinning to produce the modern living world? Only three, each in its own way shaping the course of life's history. The earliest, photosynthesis, freed life from dependence on foodstuffs made by nonbiologic processes. The advent of the advanced form of this process, oxygenic ("green plant") photosynthesis -- also an Archean innovation -- pumped oxygen into the environment (markedly increasing energy yields), "rusted the Earth" (evidenced by banded iron-formations), and, by ˜2,300 Ma ago, led to establishment of an aerobic-anaerobic ecosystem like that today. Not surprisingly, given the Cardinal Rule of Life, the inventors of this innovation, microbial cyanobacteria, evolved little over billions of years. The second major innovation was sex. In the modern world, this reproductive process is exhibited only by nucleated (eukaryotic) cells, derived from non-sexual eukaryotic ancestors. Although eukaryotes date from ˜2,000 Ma ago, they first evolved slowly -- following the Cardinal Rule of Life -- until ˜1,000 Ma ago when sexual reproduction took over. This development markedly speeded the development of new species that could compete, and eventually dominate, in habitats previously owned by their non-sexual prokaryotic ancestors, as evidenced both in the fossil record and by molecular biology-based rRNA phylogenetic trees. The third innovation was cellular differentiation and multicelluarity. Although the "Cambrian Explosion" -- the great radiation of animal life during the Cambrian Period beginning ˜550 Ma ago -- is commonly viewed as reflecting this event, it seems more a continuum than a step-function change. Evolution speeded in the half-billion years between 1,000 Ma ago and the beginning of the Cambrian: phytoplankton gave rise to multicellular seaweeds by ˜850 Ma; and primitive protozoans, present as early as ˜950 Ma, had by ˜600 Ma given rise to soft-bodied multicelled animals. Soon thereafter, animals developed shelly protective armor -- marking the beginning of the Cambrian Period, and thus of the Phanerozoic Eon. The Phanerozoic history of life is familiar to all, from spore-producing to seed-producing to flowering plants, from animals without backbones to fish, land-dwelling vertebrates, then birds and mammals. Plants ("eatees") and animals ("eaters") co-evolved in sequence. Again, life followed the Cardinal Rule, changing little, then evolving rapidly, as new ecologic opportunities became available.
1994-07-01
incorporate the Bell-La Padula rules for implementing the DoD security policy. The policy from which we begin here is the organization’s operational...security policy, which assumes the Bell-La Padula model and assigns the required security variables to elements of the system. A way to ensure a
Nowak, Martin A.; Krakauer, David C.
1999-01-01
The emergence of language was a defining moment in the evolution of modern humans. It was an innovation that changed radically the character of human society. Here, we provide an approach to language evolution based on evolutionary game theory. We explore the ways in which protolanguages can evolve in a nonlinguistic society and how specific signals can become associated with specific objects. We assume that early in the evolution of language, errors in signaling and perception would be common. We model the probability of misunderstanding a signal and show that this limits the number of objects that can be described by a protolanguage. This “error limit” is not overcome by employing more sounds but by combining a small set of more easily distinguishable sounds into words. The process of “word formation” enables a language to encode an essentially unlimited number of objects. Next, we analyze how words can be combined into sentences and specify the conditions for the evolution of very simple grammatical rules. We argue that grammar originated as a simplified rule system that evolved by natural selection to reduce mistakes in communication. Our theory provides a systematic approach for thinking about the origin and evolution of human language. PMID:10393942
Selection Shapes Transcriptional Logic and Regulatory Specialization in Genetic Networks
Fogelmark, Karl; Peterson, Carsten; Troein, Carl
2016-01-01
Background Living organisms need to regulate their gene expression in response to environmental signals and internal cues. This is a computational task where genes act as logic gates that connect to form transcriptional networks, which are shaped at all scales by evolution. Large-scale mutations such as gene duplications and deletions add and remove network components, whereas smaller mutations alter the connections between them. Selection determines what mutations are accepted, but its importance for shaping the resulting networks has been debated. Methodology To investigate the effects of selection in the shaping of transcriptional networks, we derive transcriptional logic from a combinatorially powerful yet tractable model of the binding between DNA and transcription factors. By evolving the resulting networks based on their ability to function as either a simple decision system or a circadian clock, we obtain information on the regulation and logic rules encoded in functional transcriptional networks. Comparisons are made between networks evolved for different functions, as well as with structurally equivalent but non-functional (neutrally evolved) networks, and predictions are validated against the transcriptional network of E. coli. Principal Findings We find that the logic rules governing gene expression depend on the function performed by the network. Unlike the decision systems, the circadian clocks show strong cooperative binding and negative regulation, which achieves tight temporal control of gene expression. Furthermore, we find that transcription factors act preferentially as either activators or repressors, both when binding multiple sites for a single target gene and globally in the transcriptional networks. This separation into positive and negative regulators requires gene duplications, which highlights the interplay between mutation and selection in shaping the transcriptional networks. PMID:26927540
NASA Astrophysics Data System (ADS)
Vaucouleur, Sebastien
2011-02-01
We introduce code query by example for customisation of evolvable software products in general and of enterprise resource planning systems (ERPs) in particular. The concept is based on an initial empirical study on practices around ERP systems. We motivate our design choices based on those empirical results, and we show how the proposed solution helps with respect to the infamous upgrade problem: the conflict between the need for customisation and the need for upgrade of ERP systems. We further show how code query by example can be used as a form of lightweight static analysis, to detect automatically potential defects in large software products. Code query by example as a form of lightweight static analysis is particularly interesting in the context of ERP systems: it is often the case that programmers working in this field are not computer science specialists but more of domain experts. Hence, they require a simple language to express custom rules.
Actionable Capability for Social and Economic Systems (ACSES)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fernandez, Steven J; Brecke, Peter K; Carmichael, Theodore D
The foundation of the Actionable Capability for Social and Economic Systems (ACSES) project is a useful regional-scale social-simulation system. This report is organized into five chapters that describe insights that were gained concerning the five key feasibility questions pertaining to such a system: (1) Should such a simulation system exist, would the current state of data sets or collectible data sets be adequate to support such a system? (2) By comparing different agent-based simulation systems, is it feasible to compare simulation systems and select one appropriate for a given application with agents behaving according to modern social theory rather thanmore » ad hoc rule sets? (3) Provided that a selected simulation system for a region of interest could be constructed, can the simulation system be updated with new and changing conditions so that the universe of potential outcomes are constrained by events on the ground as they evolve? (4) As these results are constrained by evolving events on the ground, is it feasible to still generate surprise and emerging behavior to suggest outcomes from novel courses of action? (5) As these systems may for the first time require large numbers (hundreds of millions) of agents operating with complexities demanded of modern social theories, can results still be generated within actionable decision cycles?« less
The S(c)ensory Immune System Theory.
Veiga-Fernandes, Henrique; Freitas, António A
2017-10-01
Viewpoints on the immune system have evolved across different paradigms, including the clonal selection theory, the idiotypic network, and the danger and tolerance models. Herein, we propose that in multicellular organisms, where panoplies of cells from different germ layers interact and immune cells are constantly generated, the behavior of the immune system is defined by the rules governing cell survival, systems physiology and organismic homeostasis. Initially, these rules were imprinted at the single cell-protist level, but supervened modifications in the transition to multicellular organisms. This context determined the emergence of the 'sensory immune system', which operates in a s(c)ensor mode to ensure systems physiology, organismic homeostasis, and perpetuation of its replicating molecules. Copyright © 2017 Elsevier Ltd. All rights reserved.
Using economy of means to evolve transition rules within 2D cellular automata.
Ripps, David L
2010-01-01
Running a cellular automaton (CA) on a rectangular lattice is a time-honored method for studying artificial life on a digital computer. Commonly, the researcher wishes to investigate some specific or general mode of behavior, say, the ability of a coherent pattern of points to glide within the lattice, or to generate copies of itself. This technique has a problem: how to design the transitions table-the set of distinct rules that specify the next content of a cell from its current content and that of its near neighbors. Often the table is painstakingly designed manually, rule by rule. The problem is exacerbated by the potentially vast number of individual rules that need be specified to cover all combinations of center and neighbors when there are several symbols in the alphabet of the CA. In this article a method is presented to have the set of rules evolve automatically while running the CA. The transition table is initially empty, with rules being added as the need arises. A novel principle drives the evolution: maximum economy of means-maximizing the reuse of rules introduced on previous cycles. This method may not be a panacea applicable to all CA studies. Nevertheless, it is sufficiently potent to evolve sets of rules and associated patterns of points that glide (periodically regenerate themselves at another location) and to generate gliding "children" that then "mate" by collision.
Hripcsak, George
1997-01-01
Abstract An information system architecture defines the components of a system and the interfaces among the components. A good architecture is essential for creating an Integrated Advanced Information Management System (IAIMS) that works as an integrated whole yet is flexible enough to accommodate many users and roles, multiple applications, changing vendors, evolving user needs, and advancing technology. Modularity and layering promote flexibility by reducing the complexity of a system and by restricting the ways in which components may interact. Enterprise-wide mediation promotes integration by providing message routing, support for standards, dictionary-based code translation, a centralized conceptual data schema, business rule implementation, and consistent access to databases. Several IAIMS sites have adopted a client-server architecture, and some have adopted a three-tiered approach, separating user interface functions, application logic, and repositories. PMID:9067884
Mallik, Saurav; Bhadra, Tapas; Mukherji, Ayan; Mallik, Saurav; Bhadra, Tapas; Mukherji, Ayan; Mallik, Saurav; Bhadra, Tapas; Mukherji, Ayan
2018-04-01
Association rule mining is an important technique for identifying interesting relationships between gene pairs in a biological data set. Earlier methods basically work for a single biological data set, and, in maximum cases, a single minimum support cutoff can be applied globally, i.e., across all genesets/itemsets. To overcome this limitation, in this paper, we propose dynamic threshold-based FP-growth rule mining algorithm that integrates gene expression, methylation and protein-protein interaction profiles based on weighted shortest distance to find the novel associations among different pairs of genes in multi-view data sets. For this purpose, we introduce three new thresholds, namely, Distance-based Variable/Dynamic Supports (DVS), Distance-based Variable Confidences (DVC), and Distance-based Variable Lifts (DVL) for each rule by integrating co-expression, co-methylation, and protein-protein interactions existed in the multi-omics data set. We develop the proposed algorithm utilizing these three novel multiple threshold measures. In the proposed algorithm, the values of , , and are computed for each rule separately, and subsequently it is verified whether the support, confidence, and lift of each evolved rule are greater than or equal to the corresponding individual , , and values, respectively, or not. If all these three conditions for a rule are found to be true, the rule is treated as a resultant rule. One of the major advantages of the proposed method compared with other related state-of-the-art methods is that it considers both the quantitative and interactive significance among all pairwise genes belonging to each rule. Moreover, the proposed method generates fewer rules, takes less running time, and provides greater biological significance for the resultant top-ranking rules compared to previous methods.
Brain Dynamics in Predicting Driving Fatigue Using a Recurrent Self-Evolving Fuzzy Neural Network.
Liu, Yu-Ting; Lin, Yang-Yin; Wu, Shang-Lin; Chuang, Chun-Hsiang; Lin, Chin-Teng
2016-02-01
This paper proposes a generalized prediction system called a recurrent self-evolving fuzzy neural network (RSEFNN) that employs an on-line gradient descent learning rule to address the electroencephalography (EEG) regression problem in brain dynamics for driving fatigue. The cognitive states of drivers significantly affect driving safety; in particular, fatigue driving, or drowsy driving, endangers both the individual and the public. For this reason, the development of brain-computer interfaces (BCIs) that can identify drowsy driving states is a crucial and urgent topic of study. Many EEG-based BCIs have been developed as artificial auxiliary systems for use in various practical applications because of the benefits of measuring EEG signals. In the literature, the efficacy of EEG-based BCIs in recognition tasks has been limited by low resolutions. The system proposed in this paper represents the first attempt to use the recurrent fuzzy neural network (RFNN) architecture to increase adaptability in realistic EEG applications to overcome this bottleneck. This paper further analyzes brain dynamics in a simulated car driving task in a virtual-reality environment. The proposed RSEFNN model is evaluated using the generalized cross-subject approach, and the results indicate that the RSEFNN is superior to competing models regardless of the use of recurrent or nonrecurrent structures.
iRODS: A Distributed Data Management Cyberinfrastructure for Observatories
NASA Astrophysics Data System (ADS)
Rajasekar, A.; Moore, R.; Vernon, F.
2007-12-01
Large-scale and long-term preservation of both observational and synthesized data requires a system that virtualizes data management concepts. A methodology is needed that can work across long distances in space (distribution) and long-periods in time (preservation). The system needs to manage data stored on multiple types of storage systems including new systems that become available in the future. This concept is called infrastructure independence, and is typically implemented through virtualization mechanisms. Data grids are built upon concepts of data and trust virtualization. These concepts enable the management of collections of data that are distributed across multiple institutions, stored on multiple types of storage systems, and accessed by multiple types of clients. Data virtualization ensures that the name spaces used to identify files, users, and storage systems are persistent, even when files are migrated onto future technology. This is required to preserve authenticity, the link between the record and descriptive and provenance metadata. Trust virtualization ensures that access controls remain invariant as files are moved within the data grid. This is required to track the chain of custody of records over time. The Storage Resource Broker (http://www.sdsc.edu/srb) is one such data grid used in a wide variety of applications in earth and space sciences such as ROADNet (roadnet.ucsd.edu), SEEK (seek.ecoinformatics.org), GEON (www.geongrid.org) and NOAO (www.noao.edu). Recent extensions to data grids provide one more level of virtualization - policy or management virtualization. Management virtualization ensures that execution of management policies can be automated, and that rules can be created that verify assertions about the shared collections of data. When dealing with distributed large-scale data over long periods of time, the policies used to manage the data and provide assurances about the authenticity of the data become paramount. The integrated Rule-Oriented Data System (iRODS) (http://irods.sdsc.edu) provides the mechanisms needed to describe not only management policies, but also to track how the policies are applied and their execution results. The iRODS data grid maps management policies to rules that control the execution of the remote micro-services. As an example, a rule can be created that automatically creates a replica whenever a file is added to a specific collection, or extracts its metadata automatically and registers it in a searchable catalog. For the replication operation, the persistent state information consists of the replica location, the creation date, the owner, the replica size, etc. The mechanism used by iRODS for providing policy virtualization is based on well-defined functions, called micro-services, which are chained into alternative workflows using rules. A rule engine, based on the event-condition-action paradigm executes the rule-based workflows after an event. Rules can be deferred to a pre-determined time or executed on a periodic basis. As the data management policies evolve, the iRODS system can implement new rules, new micro-services, and new state information (metadata content) needed to manage the new policies. Each sub- collection can be managed using a different set of policies. The discussion of the concepts in rule-based policy virtualization and its application to long-term and large-scale data management for observatories such as ORION and NEON will be the basis of the paper.
Bankhead, Armand; Magnuson, Nancy S; Heckendorn, Robert B
2007-06-07
A computer simulation is used to model ductal carcinoma in situ, a form of non-invasive breast cancer. The simulation uses known histological morphology, cell types, and stochastic cell proliferation to evolve tumorous growth within a duct. The ductal simulation is based on a hybrid cellular automaton design using genetic rules to determine each cell's behavior. The genetic rules are a mutable abstraction that demonstrate genetic heterogeneity in a population. Our goal was to examine the role (if any) that recently discovered mammary stem cell hierarchies play in genetic heterogeneity, DCIS initiation and aggressiveness. Results show that simpler progenitor hierarchies result in greater genetic heterogeneity and evolve DCIS significantly faster. However, the more complex progenitor hierarchy structure was able to sustain the rapid reproduction of a cancer cell population for longer periods of time.
The Interagency: Evolving a Hamstrung and Broken System?
2013-05-23
foreign to the Somali culture.85 Somalis’ oral traditions extend back to prehistory , tying them to the rules of antiquity and the family of Muhammad...Center, 2007), 30. 86 Ibid, 29.; Prehistory is a common term that refers to the time before written history. 87Bolger, Savage Peace, 267. 26
Rule groupings: An approach towards verification of expert systems
NASA Technical Reports Server (NTRS)
Mehrotra, Mala
1991-01-01
Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hummel, K.E.
1987-12-01
Expert systems are artificial intelligence programs that solve problems requiring large amounts of heuristic knowledge, based on years of experience and tradition. Production systems are domain-independent tools that support the development of rule-based expert systems. This document describes a general purpose production system known as HERB. This system was developed to support the programming of expert systems using hierarchically structured rule bases. HERB encourages the partitioning of rules into multiple rule bases and supports the use of multiple conflict resolution strategies. Multiple rule bases can also be placed on a system stack and simultaneously searched during each interpreter cycle. Bothmore » backward and forward chaining rules are supported by HERB. The condition portion of each rule can contain both patterns, which are matched with facts in a data base, and LISP expressions, which are explicitly evaluated in the LISP environment. Properties of objects can also be stored in the HERB data base and referenced within the scope of each rule. This document serves both as an introduction to the principles of LISP-based production systems and as a user's manual for the HERB system. 6 refs., 17 figs.« less
Flexible Early Warning Systems with Workflows and Decision Tables
NASA Astrophysics Data System (ADS)
Riedel, F.; Chaves, F.; Zeiner, H.
2012-04-01
An essential part of early warning systems and systems for crisis management are decision support systems that facilitate communication and collaboration. Often official policies specify how different organizations collaborate and what information is communicated to whom. For early warning systems it is crucial that information is exchanged dynamically in a timely manner and all participants get exactly the information they need to fulfil their role in the crisis management process. Information technology obviously lends itself to automate parts of the process. We have experienced however that in current operational systems the information logistics processes are hard-coded, even though they are subject to change. In addition, systems are tailored to the policies and requirements of a certain organization and changes can require major software refactoring. We seek to develop a system that can be deployed and adapted to multiple organizations with different dynamic runtime policies. A major requirement for such a system is that changes can be applied locally without affecting larger parts of the system. In addition to the flexibility regarding changes in policies and processes, the system needs to be able to evolve; when new information sources become available, it should be possible to integrate and use these in the decision process. In general, this kind of flexibility comes with a significant increase in complexity. This implies that only IT professionals can maintain a system that can be reconfigured and adapted; end-users are unable to utilise the provided flexibility. In the business world similar problems arise and previous work suggested using business process management systems (BPMS) or workflow management systems (WfMS) to guide and automate early warning processes or crisis management plans. However, the usability and flexibility of current WfMS are limited, because current notations and user interfaces are still not suitable for end-users, and workflows are usually only suited for rigid processes. We show how improvements can be achieved by using decision tables and rule-based adaptive workflows. Decision tables have been shown to be an intuitive tool that can be used by domain experts to express rule sets that can be interpreted automatically at runtime. Adaptive workflows use a rule-based approach to increase the flexibility of workflows by providing mechanisms to adapt workflows based on context changes, human intervention and availability of services. The combination of workflows, decision tables and rule-based adaption creates a framework that opens up new possibilities for flexible and adaptable workflows, especially, for use in early warning and crisis management systems.
A Hyper-Heuristic Ensemble Method for Static Job-Shop Scheduling.
Hart, Emma; Sim, Kevin
2016-01-01
We describe a new hyper-heuristic method NELLI-GP for solving job-shop scheduling problems (JSSP) that evolves an ensemble of heuristics. The ensemble adopts a divide-and-conquer approach in which each heuristic solves a unique subset of the instance set considered. NELLI-GP extends an existing ensemble method called NELLI by introducing a novel heuristic generator that evolves heuristics composed of linear sequences of dispatching rules: each rule is represented using a tree structure and is itself evolved. Following a training period, the ensemble is shown to outperform both existing dispatching rules and a standard genetic programming algorithm on a large set of new test instances. In addition, it obtains superior results on a set of 210 benchmark problems from the literature when compared to two state-of-the-art hyper-heuristic approaches. Further analysis of the relationship between heuristics in the evolved ensemble and the instances each solves provides new insights into features that might describe similar instances.
An Autonomous Flight Safety System
NASA Technical Reports Server (NTRS)
Bull, James B.; Lanzi, Raymond J.
2007-01-01
The Autonomous Flight Safety System (AFSS) being developed by NASA s Goddard Space Flight Center s Wallops Flight Facility and Kennedy Space Center has completed two successful developmental flights and is preparing for a third. AFSS has been demonstrated to be a viable architecture for implementation of a completely vehicle based system capable of protecting life and property in event of an errant vehicle by terminating the flight or initiating other actions. It is capable of replacing current human-in-the-loop systems or acting in parallel with them. AFSS is configured prior to flight in accordance with a specific rule set agreed upon by the range safety authority and the user to protect the public and assure mission success. This paper discusses the motivation for the project, describes the method of development, and presents an overview of the evolving architecture and the current status.
Ultra-Structure database design methodology for managing systems biology data and analyses
Maier, Christopher W; Long, Jeffrey G; Hemminger, Bradley M; Giddings, Morgan C
2009-01-01
Background Modern, high-throughput biological experiments generate copious, heterogeneous, interconnected data sets. Research is dynamic, with frequently changing protocols, techniques, instruments, and file formats. Because of these factors, systems designed to manage and integrate modern biological data sets often end up as large, unwieldy databases that become difficult to maintain or evolve. The novel rule-based approach of the Ultra-Structure design methodology presents a potential solution to this problem. By representing both data and processes as formal rules within a database, an Ultra-Structure system constitutes a flexible framework that enables users to explicitly store domain knowledge in both a machine- and human-readable form. End users themselves can change the system's capabilities without programmer intervention, simply by altering database contents; no computer code or schemas need be modified. This provides flexibility in adapting to change, and allows integration of disparate, heterogenous data sets within a small core set of database tables, facilitating joint analysis and visualization without becoming unwieldy. Here, we examine the application of Ultra-Structure to our ongoing research program for the integration of large proteomic and genomic data sets (proteogenomic mapping). Results We transitioned our proteogenomic mapping information system from a traditional entity-relationship design to one based on Ultra-Structure. Our system integrates tandem mass spectrum data, genomic annotation sets, and spectrum/peptide mappings, all within a small, general framework implemented within a standard relational database system. General software procedures driven by user-modifiable rules can perform tasks such as logical deduction and location-based computations. The system is not tied specifically to proteogenomic research, but is rather designed to accommodate virtually any kind of biological research. Conclusion We find Ultra-Structure offers substantial benefits for biological information systems, the largest being the integration of diverse information sources into a common framework. This facilitates systems biology research by integrating data from disparate high-throughput techniques. It also enables us to readily incorporate new data types, sources, and domain knowledge with no change to the database structure or associated computer code. Ultra-Structure may be a significant step towards solving the hard problem of data management and integration in the systems biology era. PMID:19691849
Probabilistic Cellular Automata
Agapie, Alexandru; Giuclea, Marius
2014-01-01
Abstract Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case—connecting the probability of a configuration in the stationary distribution to its number of zero-one borders—the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata. PMID:24999557
Probabilistic cellular automata.
Agapie, Alexandru; Andreica, Anca; Giuclea, Marius
2014-09-01
Cellular automata are binary lattices used for modeling complex dynamical systems. The automaton evolves iteratively from one configuration to another, using some local transition rule based on the number of ones in the neighborhood of each cell. With respect to the number of cells allowed to change per iteration, we speak of either synchronous or asynchronous automata. If randomness is involved to some degree in the transition rule, we speak of probabilistic automata, otherwise they are called deterministic. With either type of cellular automaton we are dealing with, the main theoretical challenge stays the same: starting from an arbitrary initial configuration, predict (with highest accuracy) the end configuration. If the automaton is deterministic, the outcome simplifies to one of two configurations, all zeros or all ones. If the automaton is probabilistic, the whole process is modeled by a finite homogeneous Markov chain, and the outcome is the corresponding stationary distribution. Based on our previous results for the asynchronous case-connecting the probability of a configuration in the stationary distribution to its number of zero-one borders-the article offers both numerical and theoretical insight into the long-term behavior of synchronous cellular automata.
Concurrent approach for evolving compact decision rule sets
NASA Astrophysics Data System (ADS)
Marmelstein, Robert E.; Hammack, Lonnie P.; Lamont, Gary B.
1999-02-01
The induction of decision rules from data is important to many disciplines, including artificial intelligence and pattern recognition. To improve the state of the art in this area, we introduced the genetic rule and classifier construction environment (GRaCCE). It was previously shown that GRaCCE consistently evolved decision rule sets from data, which were significantly more compact than those produced by other methods (such as decision tree algorithms). The primary disadvantage of GRaCCe, however, is its relatively poor run-time execution performance. In this paper, a concurrent version of the GRaCCE architecture is introduced, which improves the efficiency of the original algorithm. A prototype of the algorithm is tested on an in- house parallel processor configuration and the results are discussed.
Rule groupings: A software engineering approach towards verification of expert systems
NASA Technical Reports Server (NTRS)
Mehrotra, Mala
1991-01-01
Currently, most expert system shells do not address software engineering issues for developing or maintaining expert systems. As a result, large expert systems tend to be incomprehensible, difficult to debug or modify and almost impossible to verify or validate. Partitioning rule based systems into rule groups which reflect the underlying subdomains of the problem should enhance the comprehensibility, maintainability, and reliability of expert system software. Attempts were made to semiautomatically structure a CLIPS rule base into groups of related rules that carry the same type of information. Different distance metrics that capture relevant information from the rules for grouping are discussed. Two clustering algorithms that partition the rule base into groups of related rules are given. Two independent evaluation criteria are developed to measure the effectiveness of the grouping strategies. Results of the experiment with three sample rule bases are presented.
Emergence of a Communication System: International Sign
NASA Astrophysics Data System (ADS)
Rosenstock, Rachel
International Sign (henceforth IS) is a communication system that is used widely in the international Deaf Community. The present study is one of the first to research extensively the origin of both the IS lexicon and grammatical structures. Findings demonstrate that IS is both influenced by naturally evolved sign languages used in grown deaf communities (henceforth SLs) and relies heavily on iconic, universal structures. This paper shows that IS continues to develop from a simplistic iconic system into a conventionalized system with increasingly complex rules.
Automated revision of CLIPS rule-bases
NASA Technical Reports Server (NTRS)
Murphy, Patrick M.; Pazzani, Michael J.
1994-01-01
This paper describes CLIPS-R, a theory revision system for the revision of CLIPS rule-bases. CLIPS-R may be used for a variety of knowledge-base revision tasks, such as refining a prototype system, adapting an existing system to slightly different operating conditions, or improving an operational system that makes occasional errors. We present a description of how CLIPS-R revises rule-bases, and an evaluation of the system on three rule-bases.
2007-04-01
Services and System Capabilities Enterprise Rules and Standards for Interoperability Navy AFArmy TRANS COM DFASDLA Ente prise Shared Services and System...Where commonality among components exists, there are also opportunities for identifying and leveraging shared services . A service-oriented architecture...and (3) shared services . The BMA federation strategy, according to these officials, is the first mission area federation strategy, and it is their
A Balanced Mixture of Antagonistic Pressures Promotes the Evolution of Parallel Movement
NASA Astrophysics Data System (ADS)
Demšar, Jure; Štrumbelj, Erik; Lebar Bajec, Iztok
2016-12-01
A common hypothesis about the origins of collective behaviour suggests that animals might live and move in groups to increase their chances of surviving predator attacks. This hypothesis is supported by several studies that use computational models to simulate natural evolution. These studies, however, either tune an ad-hoc model to ‘reproduce’ collective behaviour, or concentrate on a single type of predation pressure, or infer the emergence of collective behaviour from an increase in prey density. In nature, prey are often targeted by multiple predator species simultaneously and this might have played a pivotal role in the evolution of collective behaviour. We expand on previous research by using an evolutionary rule-based system to simulate the evolution of prey behaviour when prey are subject to multiple simultaneous predation pressures. We analyse the evolved behaviour via prey density, polarization, and angular momentum. Our results suggest that a mixture of antagonistic external pressures that simultaneously steer prey towards grouping and dispersing might be required for prey individuals to evolve dynamic parallel movement.
Kawakami, Tomoya; Fujita, Naotaka; Yoshihisa, Tomoki; Tsukamoto, Masahiko
2014-01-01
In recent years, sensors become popular and Home Energy Management System (HEMS) takes an important role in saving energy without decrease in QoL (Quality of Life). Currently, many rule-based HEMSs have been proposed and almost all of them assume "IF-THEN" rules. The Rete algorithm is a typical pattern matching algorithm for IF-THEN rules. Currently, we have proposed a rule-based Home Energy Management System (HEMS) using the Rete algorithm. In the proposed system, rules for managing energy are processed by smart taps in network, and the loads for processing rules and collecting data are distributed to smart taps. In addition, the number of processes and collecting data are reduced by processing rules based on the Rete algorithm. In this paper, we evaluated the proposed system by simulation. In the simulation environment, rules are processed by a smart tap that relates to the action part of each rule. In addition, we implemented the proposed system as HEMS using smart taps.
An Analysis on a Negotiation Model Based on Multiagent Systems with Symbiotic Learning and Evolution
NASA Astrophysics Data System (ADS)
Hossain, Md. Tofazzal
This study explores an evolutionary analysis on a negotiation model based on Masbiole (Multiagent Systems with Symbiotic Learning and Evolution) which has been proposed as a new methodology of Multiagent Systems (MAS) based on symbiosis in the ecosystem. In Masbiole, agents evolve in consideration of not only their own benefits and losses, but also the benefits and losses of opponent agents. To aid effective application of Masbiole, we develop a competitive negotiation model where rigorous and advanced intelligent decision-making mechanisms are required for agents to achieve solutions. A Negotiation Protocol is devised aiming at developing a set of rules for agents' behavior during evolution. Simulations use a newly developed evolutionary computing technique, called Genetic Network Programming (GNP) which has the directed graph-type gene structure that can develop and design the required intelligent mechanisms for agents. In a typical scenario, competitive negotiation solutions are reached by concessions that are usually predetermined in the conventional MAS. In this model, however, not only concession is determined automatically by symbiotic evolution (making the system intelligent, automated, and efficient) but the solution also achieves Pareto optimal automatically.
Changes and Issues in the Validation of Experience
ERIC Educational Resources Information Center
Triby, Emmanuel
2005-01-01
This article analyses the main changes in the rules for validating experience in France and of what they mean for society. It goes on to consider university validation practices. The way in which this system is evolving offers a chance to identify the issues involved for the economy and for society, with particular attention to the expected…
Program Costing with the CAMPUS Simulation Model. Project PRIME Report, Number 5.
ERIC Educational Resources Information Center
Cordes, David C.
The first section of this report on program costing with the CAMPUS simulation discusses the structuring process of Program Planning and Budgeting (PPB) systems, and emphasizes the ideas, rules, and principles for structuring resource data that have evolved during the 10 years of PPB existence. It also discusses the WICHE-PMS program…
NASA Astrophysics Data System (ADS)
Shen, Fuhui; Lian, Junhe; Münstermann, Sebastian
2018-05-01
Experimental and numerical investigations on the forming limit diagram (FLD) of a ferritic stainless steel were performed in this study. The FLD of this material was obtained by Nakajima tests. Both the Marciniak-Kuczynski (MK) model and the modified maximum force criterion (MMFC) were used for the theoretical prediction of the FLD. From the results of uniaxial tensile tests along different loading directions with respect to the rolling direction, strong anisotropic plastic behaviour was observed in the investigated steel. A recently proposed anisotropic evolving non-associated Hill48 (enHill48) plasticity model, which was developed from the conventional Hill48 model based on the non-associated flow rule with evolving anisotropic parameters, was adopted to describe the anisotropic hardening behaviour of the investigated material. In the previous study, the model was coupled with the MMFC for FLD prediction. In the current study, the enHill48 was further coupled with the MK model. By comparing the predicted forming limit curves with the experimental results, the influences of anisotropy in terms of flow rule and evolving features on the forming limit prediction were revealed and analysed. In addition, the forming limit predictive performances of the MK and the MMFC models in conjunction with the enHill48 plasticity model were compared and evaluated.
A fuzzy classifier system for process control
NASA Technical Reports Server (NTRS)
Karr, C. L.; Phillips, J. C.
1994-01-01
A fuzzy classifier system that discovers rules for controlling a mathematical model of a pH titration system was developed by researchers at the U.S. Bureau of Mines (USBM). Fuzzy classifier systems successfully combine the strengths of learning classifier systems and fuzzy logic controllers. Learning classifier systems resemble familiar production rule-based systems, but they represent their IF-THEN rules by strings of characters rather than in the traditional linguistic terms. Fuzzy logic is a tool that allows for the incorporation of abstract concepts into rule based-systems, thereby allowing the rules to resemble the familiar 'rules-of-thumb' commonly used by humans when solving difficult process control and reasoning problems. Like learning classifier systems, fuzzy classifier systems employ a genetic algorithm to explore and sample new rules for manipulating the problem environment. Like fuzzy logic controllers, fuzzy classifier systems encapsulate knowledge in the form of production rules. The results presented in this paper demonstrate the ability of fuzzy classifier systems to generate a fuzzy logic-based process control system.
Reducing the Conflict Factors Strategies in Question Answering System
NASA Astrophysics Data System (ADS)
Suwarningsih, W.; Purwarianti, A.; Supriana, I.
2017-03-01
A rule-based system is prone to conflict as new knowledge every time will emerge and indirectly must sign in to the knowledge base that is used by the system. A conflict occurred between the rules in the knowledge base can lead to the errors of reasoning or reasoning circulation. Therefore, when added, the new rules will lead to conflict with other rules, and the only rules that really can be added to the knowledge base. From these conditions, this paper aims to propose a conflict resolution strategy for a medical debriefing system by analyzing scenarios based upon the runtime to improve the efficiency and reliability of systems.
Revisiting Nuclear Thermal Propulsion for Human Mars Exploration
NASA Technical Reports Server (NTRS)
Percy, Thomas K.; Rodriguez, Mitchell
2017-01-01
Nuclear Thermal Propulsion (NTP) has long been considered as a viable in-space transportation alternative for delivering crew and cargo to the Martian system. While technology development work in nuclear propulsion has continued over the year, general interest in NTP propulsion applications has historically been tied directly to the ebb and flow of interest in sending humans to explore Mars. As far back as the 1960’s, plans for NTP-based human Mars exploration have been proposed and periodically revisited having most recently been considered as part of NASA Design Reference Architecture (DRA) 5.0. NASA has been investigating human Mars exploration strategies tied to its current Journey to Mars for the past few years however, NTP has only recently been added into the set of alternatives under consideration for in-space propulsion under the Mars Study Capability (MSC) team, formerly the Evolvable Mars Campaign (EMC) team. The original charter of the EMC was to find viable human Mars exploration approaches that relied heavily on technology investment work already underway, specifically related to the development of large Solar Electric Propulsion (SEP) systems. The EMC team baselined several departures from traditional Mars exploration ground rules to enable these types of architectures. These ground rule changes included lower energy conjunction class trajectories with corresponding longer flight times, aggregation of mission elements in cis-Lunar space rather than Low Earth Orbit (LEO) and, in some cases, the pre-deployment of Earth return propulsion systems to Mars. As the MSC team continues to refine the in-space transportation trades, an NTP-based architecture that takes advantage of some of these ground rule departures is being introduced.
Evolution of Bow-Tie Architectures in Biology
Friedlander, Tamar; Mayo, Avraham E.; Tlusty, Tsvi; Alon, Uri
2015-01-01
Bow-tie or hourglass structure is a common architectural feature found in many biological systems. A bow-tie in a multi-layered structure occurs when intermediate layers have much fewer components than the input and output layers. Examples include metabolism where a handful of building blocks mediate between multiple input nutrients and multiple output biomass components, and signaling networks where information from numerous receptor types passes through a small set of signaling pathways to regulate multiple output genes. Little is known, however, about how bow-tie architectures evolve. Here, we address the evolution of bow-tie architectures using simulations of multi-layered systems evolving to fulfill a given input-output goal. We find that bow-ties spontaneously evolve when the information in the evolutionary goal can be compressed. Mathematically speaking, bow-ties evolve when the rank of the input-output matrix describing the evolutionary goal is deficient. The maximal compression possible (the rank of the goal) determines the size of the narrowest part of the network—that is the bow-tie. A further requirement is that a process is active to reduce the number of links in the network, such as product-rule mutations, otherwise a non-bow-tie solution is found in the evolutionary simulations. This offers a mechanism to understand a common architectural principle of biological systems, and a way to quantitate the effective rank of the goals under which they evolved. PMID:25798588
NASA Technical Reports Server (NTRS)
Nieten, Joseph L.; Seraphine, Kathleen M.
1991-01-01
Procedural modeling systems, rule based modeling systems, and a method for converting a procedural model to a rule based model are described. Simulation models are used to represent real time engineering systems. A real time system can be represented by a set of equations or functions connected so that they perform in the same manner as the actual system. Most modeling system languages are based on FORTRAN or some other procedural language. Therefore, they must be enhanced with a reaction capability. Rule based systems are reactive by definition. Once the engineering system has been decomposed into a set of calculations using only basic algebraic unary operations, a knowledge network of calculations and functions can be constructed. The knowledge network required by a rule based system can be generated by a knowledge acquisition tool or a source level compiler. The compiler would take an existing model source file, a syntax template, and a symbol table and generate the knowledge network. Thus, existing procedural models can be translated and executed by a rule based system. Neural models can be provide the high capacity data manipulation required by the most complex real time models.
On Decision-Making Among Multiple Rule-Bases in Fuzzy Control Systems
NASA Technical Reports Server (NTRS)
Tunstel, Edward; Jamshidi, Mo
1997-01-01
Intelligent control of complex multi-variable systems can be a challenge for single fuzzy rule-based controllers. This class of problems cam often be managed with less difficulty by distributing intelligent decision-making amongst a collection of rule-bases. Such an approach requires that a mechanism be chosen to ensure goal-oriented interaction between the multiple rule-bases. In this paper, a hierarchical rule-based approach is described. Decision-making mechanisms based on generalized concepts from single-rule-based fuzzy control are described. Finally, the effects of different aggregation operators on multi-rule-base decision-making are examined in a navigation control problem for mobile robots.
NASA Astrophysics Data System (ADS)
Walker, M. J.
2016-12-01
Small unmanned aerial systems (sUAS, also known as drones) potentially provide researchers and managers with the capacity to enhance temporal and spatial resolution of data sets for natural resources science and management. sUAS have been used for many types of data collection and have a partial definition in mass of the aircraft, ranging from 0.5 to <55 lbs (0.2 to <24.9 kg). Aircraft within this range of mass can present a collision hazard to other aircraft. The Federal Aviation Administration (FAA) recently faced the challenge of removing regulatory barriers to sUAS application while minimizing risk in the national airspace. The regulatory and legal framework developed for using sUAS in natural resources science and management has evolved from a very conservative approach prior in the first decade of the 21st century. FAA's recently revised operating rules for sUAS, significantly changing pilot certification requirements and operating rules in the national airspace. The next 2-5 years will bring advances in sUAS applications for science and management, building upon the accomplishments of users who complied with the former regulatory environment. We review the current operating rules (49 CFR, part 107) that apply specifically to sUAS and discuss the implications for researchers and managers. While part 107 relaxed many restrictions, it is important to understand the regulatory framework currently in place that encourages development of applications for sUAS while adhering to the mandate that the national airspace be safe and secure. We consider potential applications for natural resources science and management in the context of the recently released operating rules, especially with respect to training requirements and protocols for use.
Challenges for Rule Systems on the Web
NASA Astrophysics Data System (ADS)
Hu, Yuh-Jong; Yeh, Ching-Long; Laun, Wolfgang
The RuleML Challenge started in 2007 with the objective of inspiring the issues of implementation for management, integration, interoperation and interchange of rules in an open distributed environment, such as the Web. Rules are usually classified as three types: deductive rules, normative rules, and reactive rules. The reactive rules are further classified as ECA rules and production rules. The study of combination rule and ontology is traced back to an earlier active rule system for relational and object-oriented (OO) databases. Recently, this issue has become one of the most important research problems in the Semantic Web. Once we consider a computer executable policy as a declarative set of rules and ontologies that guides the behavior of entities within a system, we have a flexible way to implement real world policies without rewriting the computer code, as we did before. Fortunately, we have de facto rule markup languages, such as RuleML or RIF to achieve the portability and interchange of rules for different rule systems. Otherwise, executing real-life rule-based applications on the Web is almost impossible. Several commercial or open source rule engines are available for the rule-based applications. However, we still need a standard rule language and benchmark for not only to compare the rule systems but also to measure the progress in the field. Finally, a number of real-life rule-based use cases will be investigated to demonstrate the applicability of current rule systems on the Web.
Data Clustering and Evolving Fuzzy Decision Tree for Data Base Classification Problems
NASA Astrophysics Data System (ADS)
Chang, Pei-Chann; Fan, Chin-Yuan; Wang, Yen-Wen
Data base classification suffers from two well known difficulties, i.e., the high dimensionality and non-stationary variations within the large historic data. This paper presents a hybrid classification model by integrating a case based reasoning technique, a Fuzzy Decision Tree (FDT), and Genetic Algorithms (GA) to construct a decision-making system for data classification in various data base applications. The model is major based on the idea that the historic data base can be transformed into a smaller case-base together with a group of fuzzy decision rules. As a result, the model can be more accurately respond to the current data under classifying from the inductions by these smaller cases based fuzzy decision trees. Hit rate is applied as a performance measure and the effectiveness of our proposed model is demonstrated by experimentally compared with other approaches on different data base classification applications. The average hit rate of our proposed model is the highest among others.
Revenue Sufficiency and Reliability in a Zero Marginal Cost Future
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frew, Bethany A.
Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about themore » suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.« less
Revenue Sufficiency and Reliability in a Zero Marginal Cost Future: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Frew, Bethany A.; Milligan, Michael; Brinkman, Greg
Features of existing wholesale electricity markets, such as administrative pricing rules and policy-based reliability standards, can distort market incentives from allowing generators sufficient opportunities to recover both fixed and variable costs. Moreover, these challenges can be amplified by other factors, including (1) inelastic demand resulting from a lack of price signal clarity, (2) low- or near-zero marginal cost generation, particularly arising from low natural gas fuel prices and variable generation (VG), such as wind and solar, and (3) the variability and uncertainty of this VG. As power systems begin to incorporate higher shares of VG, many questions arise about themore » suitability of the existing marginal-cost-based price formation, primarily within an energy-only market structure, to ensure the economic viability of resources that might be needed to provide system reliability. This article discusses these questions and provides a summary of completed and ongoing modelling-based work at the National Renewable Energy Laboratory to better understand the impacts of evolving power systems on reliability and revenue sufficiency.« less
Dynamic and adaptive policy models for coalition operations
NASA Astrophysics Data System (ADS)
Verma, Dinesh; Calo, Seraphin; Chakraborty, Supriyo; Bertino, Elisa; Williams, Chris; Tucker, Jeremy; Rivera, Brian; de Mel, Geeth R.
2017-05-01
It is envisioned that the success of future military operations depends on the better integration, organizationally and operationally, among allies, coalition members, inter-agency partners, and so forth. However, this leads to a challenging and complex environment where the heterogeneity and dynamism in the operating environment intertwines with the evolving situational factors that affect the decision-making life cycle of the war fighter. Therefore, the users in such environments need secure, accessible, and resilient information infrastructures where policy-based mechanisms adopt the behaviours of the systems to meet end user goals. By specifying and enforcing a policy based model and framework for operations and security which accommodates heterogeneous coalitions, high levels of agility can be enabled to allow rapid assembly and restructuring of system and information resources. However, current prevalent policy models (e.g., rule based event-condition-action model and its variants) are not sufficient to deal with the highly dynamic and plausibly non-deterministic nature of these environments. Therefore, to address the above challenges, in this paper, we present a new approach for policies which enables managed systems to take more autonomic decisions regarding their operations.
On the efficacy of cinema, or what the visual system did not evolve to do
NASA Technical Reports Server (NTRS)
Cutting, James E.
1989-01-01
Spatial displays, and a constraint that they do not place on the use of spatial instruments are discussed. Much of the work done in visual perception by psychologists and by computer scientists has concerned displays that show the motion of rigid objects. Typically, if one assumes that objects are rigid, one can then proceed to understand how the constant shape of the object can be perceived (or computed) as it moves through space. The author maintains that photographs and cinema are visual displays that are also powerful forms of art. Their efficacy, in part, stems from the fact that, although viewpoint is constrained when composing them, it is not nearly so constrained when viewing them. It is obvious, according to the author, that human visual systems did not evolve to watch movies or look at photographs. Thus, what photographs and movies present must be allowed in the rule-governed system under which vision evolved. Machine-vision algorithms, to be applicable to human vision, should show the same types of tolerance.
Rule-based programming paradigm: a formal basis for biological, chemical and physical computation.
Krishnamurthy, V; Krishnamurthy, E V
1999-03-01
A rule-based programming paradigm is described as a formal basis for biological, chemical and physical computations. In this paradigm, the computations are interpreted as the outcome arising out of interaction of elements in an object space. The interactions can create new elements (or same elements with modified attributes) or annihilate old elements according to specific rules. Since the interaction rules are inherently parallel, any number of actions can be performed cooperatively or competitively among the subsets of elements, so that the elements evolve toward an equilibrium or unstable or chaotic state. Such an evolution may retain certain invariant properties of the attributes of the elements. The object space resembles Gibbsian ensemble that corresponds to a distribution of points in the space of positions and momenta (called phase space). It permits the introduction of probabilities in rule applications. As each element of the ensemble changes over time, its phase point is carried into a new phase point. The evolution of this probability cloud in phase space corresponds to a distributed probabilistic computation. Thus, this paradigm can handle tor deterministic exact computation when the initial conditions are exactly specified and the trajectory of evolution is deterministic. Also, it can handle probabilistic mode of computation if we want to derive macroscopic or bulk properties of matter. We also explain how to support this rule-based paradigm using relational-database like query processing and transactions.
Towards a voxel-based geographic automata for the simulation of geospatial processes
NASA Astrophysics Data System (ADS)
Jjumba, Anthony; Dragićević, Suzana
2016-07-01
Many geographic processes evolve in a three dimensional space and time continuum. However, when they are represented with the aid of geographic information systems (GIS) or geosimulation models they are modelled in a framework of two-dimensional space with an added temporal component. The objective of this study is to propose the design and implementation of voxel-based automata as a methodological approach for representing spatial processes evolving in the four-dimensional (4D) space-time domain. Similar to geographic automata models which are developed to capture and forecast geospatial processes that change in a two-dimensional spatial framework using cells (raster geospatial data), voxel automata rely on the automata theory and use three-dimensional volumetric units (voxels). Transition rules have been developed to represent various spatial processes which range from the movement of an object in 3D to the diffusion of airborne particles and landslide simulation. In addition, the proposed 4D models demonstrate that complex processes can be readily reproduced from simple transition functions without complex methodological approaches. The voxel-based automata approach provides a unique basis to model geospatial processes in 4D for the purpose of improving representation, analysis and understanding their spatiotemporal dynamics. This study contributes to the advancement of the concepts and framework of 4D GIS.
NASA Astrophysics Data System (ADS)
Smith, R.; Kasprzyk, J. R.; Zagona, E. A.
2013-12-01
Population growth and climate change, combined with difficulties in building new infrastructure, motivate portfolio-based solutions to ensuring sufficient water supply. Powerful simulation models with graphical user interfaces (GUI) are often used to evaluate infrastructure portfolios; these GUI based models require manual modification of the system parameters, such as reservoir operation rules, water transfer schemes, or system capacities. Multiobjective evolutionary algorithm (MOEA) based optimization can be employed to balance multiple objectives and automatically suggest designs for infrastructure systems, but MOEA based decision support typically uses a fixed problem formulation (i.e., a single set of objectives, decisions, and constraints). This presentation suggests a dynamic framework for linking GUI-based infrastructure models with MOEA search. The framework begins with an initial formulation which is solved using a MOEA. Then, stakeholders can interact with candidate solutions, viewing their properties in the GUI model. This is followed by changes in the formulation which represent users' evolving understanding of exigent system properties. Our case study is built using RiverWare, an object-oriented, data-centered model that facilitates the representation of a diverse array of water resources systems. Results suggest that assumptions within the initial MOEA search are violated after investigating tradeoffs and reveal how formulations should be modified to better capture stakeholders' preferences.
Rule groupings in expert systems using nearest neighbour decision rules, and convex hulls
NASA Technical Reports Server (NTRS)
Anastasiadis, Stergios
1991-01-01
Expert System shells are lacking in many areas of software engineering. Large rule based systems are not semantically comprehensible, difficult to debug, and impossible to modify or validate. Partitioning a set of rules found in CLIPS (C Language Integrated Production System) into groups of rules which reflect the underlying semantic subdomains of the problem, will address adequately the concerns stated above. Techniques are introduced to structure a CLIPS rule base into groups of rules that inherently have common semantic information. The concepts involved are imported from the field of A.I., Pattern Recognition, and Statistical Inference. Techniques focus on the areas of feature selection, classification, and a criteria of how 'good' the classification technique is, based on Bayesian Decision Theory. A variety of distance metrics are discussed for measuring the 'closeness' of CLIPS rules and various Nearest Neighbor classification algorithms are described based on the above metric.
A logical model of cooperating rule-based systems
NASA Technical Reports Server (NTRS)
Bailin, Sidney C.; Moore, John M.; Hilberg, Robert H.; Murphy, Elizabeth D.; Bahder, Shari A.
1989-01-01
A model is developed to assist in the planning, specification, development, and verification of space information systems involving distributed rule-based systems. The model is based on an analysis of possible uses of rule-based systems in control centers. This analysis is summarized as a data-flow model for a hypothetical intelligent control center. From this data-flow model, the logical model of cooperating rule-based systems is extracted. This model consists of four layers of increasing capability: (1) communicating agents, (2) belief-sharing knowledge sources, (3) goal-sharing interest areas, and (4) task-sharing job roles.
NASA Astrophysics Data System (ADS)
García-Morales, Vladimir; Manzanares, José A.; Mafe, Salvador
2017-04-01
We present a weakly coupled map lattice model for patterning that explores the effects exerted by weakening the local dynamic rules on model biological and artificial networks composed of two-state building blocks (cells). To this end, we use two cellular automata models based on (i) a smooth majority rule (model I) and (ii) a set of rules similar to those of Conway's Game of Life (model II). The normal and abnormal cell states evolve according to local rules that are modulated by a parameter κ . This parameter quantifies the effective weakening of the prescribed rules due to the limited coupling of each cell to its neighborhood and can be experimentally controlled by appropriate external agents. The emergent spatiotemporal maps of single-cell states should be of significance for positional information processes as well as for intercellular communication in tumorigenesis, where the collective normalization of abnormal single-cell states by a predominantly normal neighborhood may be crucial.
García-Morales, Vladimir; Manzanares, José A; Mafe, Salvador
2017-04-01
We present a weakly coupled map lattice model for patterning that explores the effects exerted by weakening the local dynamic rules on model biological and artificial networks composed of two-state building blocks (cells). To this end, we use two cellular automata models based on (i) a smooth majority rule (model I) and (ii) a set of rules similar to those of Conway's Game of Life (model II). The normal and abnormal cell states evolve according to local rules that are modulated by a parameter κ. This parameter quantifies the effective weakening of the prescribed rules due to the limited coupling of each cell to its neighborhood and can be experimentally controlled by appropriate external agents. The emergent spatiotemporal maps of single-cell states should be of significance for positional information processes as well as for intercellular communication in tumorigenesis, where the collective normalization of abnormal single-cell states by a predominantly normal neighborhood may be crucial.
A neural network architecture for implementation of expert systems for real time monitoring
NASA Technical Reports Server (NTRS)
Ramamoorthy, P. A.
1991-01-01
Since neural networks have the advantages of massive parallelism and simple architecture, they are good tools for implementing real time expert systems. In a rule based expert system, the antecedents of rules are in the conjunctive or disjunctive form. We constructed a multilayer feedforward type network in which neurons represent AND or OR operations of rules. Further, we developed a translator which can automatically map a given rule base into the network. Also, we proposed a new and powerful yet flexible architecture that combines the advantages of both fuzzy expert systems and neural networks. This architecture uses the fuzzy logic concepts to separate input data domains into several smaller and overlapped regions. Rule-based expert systems for time critical applications using neural networks, the automated implementation of rule-based expert systems with neural nets, and fuzzy expert systems vs. neural nets are covered.
SIRE: A Simple Interactive Rule Editor for NICBES
NASA Technical Reports Server (NTRS)
Bykat, Alex
1988-01-01
To support evolution of domain expertise, and its representation in an expert system knowledge base, a user-friendly rule base editor is mandatory. The Nickel Cadmium Battery Expert System (NICBES), a prototype of an expert system for the Hubble Space Telescope power storage management system, does not provide such an editor. In the following, a description of a Simple Interactive Rule Base Editor (SIRE) for NICBES is described. The SIRE provides a consistent internal representation of the NICBES knowledge base. It supports knowledge presentation and provides a user-friendly and code language independent medium for rule addition and modification. The SIRE is integrated with NICBES via an interface module. This module provides translation of the internal representation to Prolog-type rules (Horn clauses), latter rule assertion, and a simple mechanism for rule selection for its Prolog inference engine.
Das, Saptarshi; Pan, Indranil; Das, Shantanu; Gupta, Amitava
2012-03-01
Genetic algorithm (GA) has been used in this study for a new approach of suboptimal model reduction in the Nyquist plane and optimal time domain tuning of proportional-integral-derivative (PID) and fractional-order (FO) PI(λ)D(μ) controllers. Simulation studies show that the new Nyquist-based model reduction technique outperforms the conventional H(2)-norm-based reduced parameter modeling technique. With the tuned controller parameters and reduced-order model parameter dataset, optimum tuning rules have been developed with a test-bench of higher-order processes via genetic programming (GP). The GP performs a symbolic regression on the reduced process parameters to evolve a tuning rule which provides the best analytical expression to map the data. The tuning rules are developed for a minimum time domain integral performance index described by a weighted sum of error index and controller effort. From the reported Pareto optimal front of the GP-based optimal rule extraction technique, a trade-off can be made between the complexity of the tuning formulae and the control performance. The efficacy of the single-gene and multi-gene GP-based tuning rules has been compared with the original GA-based control performance for the PID and PI(λ)D(μ) controllers, handling four different classes of representative higher-order processes. These rules are very useful for process control engineers, as they inherit the power of the GA-based tuning methodology, but can be easily calculated without the requirement for running the computationally intensive GA every time. Three-dimensional plots of the required variation in PID/fractional-order PID (FOPID) controller parameters with reduced process parameters have been shown as a guideline for the operator. Parametric robustness of the reported GP-based tuning rules has also been shown with credible simulation examples. Copyright © 2011 ISA. Published by Elsevier Ltd. All rights reserved.
Analysis, Simulation, and Verification of Knowledge-Based, Rule-Based, and Expert Systems
NASA Technical Reports Server (NTRS)
Hinchey, Mike; Rash, James; Erickson, John; Gracanin, Denis; Rouff, Chris
2010-01-01
Mathematically sound techniques are used to view a knowledge-based system (KBS) as a set of processes executing in parallel and being enabled in response to specific rules being fired. The set of processes can be manipulated, examined, analyzed, and used in a simulation. The tool that embodies this technology may warn developers of errors in their rules, but may also highlight rules (or sets of rules) in the system that are underspecified (or overspecified) and need to be corrected for the KBS to operate as intended. The rules embodied in a KBS specify the allowed situations, events, and/or results of the system they describe. In that sense, they provide a very abstract specification of a system. The system is implemented through the combination of the system specification together with an appropriate inference engine, independent of the algorithm used in that inference engine. Viewing the rule base as a major component of the specification, and choosing an appropriate specification notation to represent it, reveals how additional power can be derived from an approach to the knowledge-base system that involves analysis, simulation, and verification. This innovative approach requires no special knowledge of the rules, and allows a general approach where standardized analysis, verification, simulation, and model checking techniques can be applied to the KBS.
ERIC Educational Resources Information Center
Assié-Lumumba, N'Dri Thérèse
2016-01-01
This paper is a reflection that critically examines the dynamics of education and the struggle by African people for freedom, control of the mind, self-definition and the right to determine their own destiny from the start of colonial rule to the present. The primary methodological approach is historical structuralism, which stipulates that social…
1991-02-01
3 2.2 Hybrid Rule/Fact Schemas .............................................................. 3 3 THE LIMITATIONS OF RULE BASED KNOWLEDGE...or hybrid rule/fact schemas. 2 UNCLASSIFIED .WA UNCLASSIFIED ERL-0520-RR 2.1 Propositional Logic The simplest form of production-rules are based upon...requirements which may lead to poor system performance. 2.2 Hybrid Rule/Fact Schemas Hybrid rule/fact relationships (also known as Predicate Calculus ) have
Research on key technology of the verification system of steel rule based on vision measurement
NASA Astrophysics Data System (ADS)
Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun
2018-01-01
The steel rule plays an important role in quantity transmission. However, the traditional verification method of steel rule based on manual operation and reading brings about low precision and low efficiency. A machine vison based verification system of steel rule is designed referring to JJG1-1999-Verificaiton Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method of pixel equivalent and decontaminates the surface of steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision of verification regulation, but also improve the reliability and efficiency of the verification system.
Suzuki, Hideaki; Ono, Naoaki; Yuta, Kikuo
2003-01-01
In order for an artificial life (Alife) system to evolve complex creatures, an artificial environment prepared by a designer has to satisfy several conditions. To clarify this requirement, we first assume that an artificial environment implemented in the computational medium is composed of an information space in which elementary symbols move around and react with each other according to human-prepared elementary rules. As fundamental properties of these factors (space, symbols, transportation, and reaction), we present ten criteria from a comparison with the biochemical reaction space in the real world. Then, in the latter half of the article, we take several computational Alife systems one by one, and assess them in terms of the proposed criteria. The assessment can be used not only for improving previous Alife systems but also for devising new Alife models in which complex forms of artificial creatures can be expected to evolve.
Modeling the prediction of business intelligence system effectiveness.
Weng, Sung-Shun; Yang, Ming-Hsien; Koo, Tian-Lih; Hsiao, Pei-I
2016-01-01
Although business intelligence (BI) technologies are continually evolving, the capability to apply BI technologies has become an indispensable resource for enterprises running in today's complex, uncertain and dynamic business environment. This study performed pioneering work by constructing models and rules for the prediction of business intelligence system effectiveness (BISE) in relation to the implementation of BI solutions. For enterprises, effectively managing critical attributes that determine BISE to develop prediction models with a set of rules for self-evaluation of the effectiveness of BI solutions is necessary to improve BI implementation and ensure its success. The main study findings identified the critical prediction indicators of BISE that are important to forecasting BI performance and highlighted five classification and prediction rules of BISE derived from decision tree structures, as well as a refined regression prediction model with four critical prediction indicators constructed by logistic regression analysis that can enable enterprises to improve BISE while effectively managing BI solution implementation and catering to academics to whom theory is important.
NASA Technical Reports Server (NTRS)
Ramamoorthy, P. A.; Huang, Song; Govind, Girish
1991-01-01
In fault diagnosis, control and real-time monitoring, both timing and accuracy are critical for operators or machines to reach proper solutions or appropriate actions. Expert systems are becoming more popular in the manufacturing community for dealing with such problems. In recent years, neural networks have revived and their applications have spread to many areas of science and engineering. A method of using neural networks to implement rule-based expert systems for time-critical applications is discussed here. This method can convert a given rule-based system into a neural network with fixed weights and thresholds. The rules governing the translation are presented along with some examples. We also present the results of automated machine implementation of such networks from the given rule-base. This significantly simplifies the translation process to neural network expert systems from conventional rule-based systems. Results comparing the performance of the proposed approach based on neural networks vs. the classical approach are given. The possibility of very large scale integration (VLSI) realization of such neural network expert systems is also discussed.
Strategies for adding adaptive learning mechanisms to rule-based diagnostic expert systems
NASA Technical Reports Server (NTRS)
Stclair, D. C.; Sabharwal, C. L.; Bond, W. E.; Hacke, Keith
1988-01-01
Rule-based diagnostic expert systems can be used to perform many of the diagnostic chores necessary in today's complex space systems. These expert systems typically take a set of symptoms as input and produce diagnostic advice as output. The primary objective of such expert systems is to provide accurate and comprehensive advice which can be used to help return the space system in question to nominal operation. The development and maintenance of diagnostic expert systems is time and labor intensive since the services of both knowledge engineer(s) and domain expert(s) are required. The use of adaptive learning mechanisms to increment evaluate and refine rules promises to reduce both time and labor costs associated with such systems. This paper describes the basic adaptive learning mechanisms of strengthening, weakening, generalization, discrimination, and discovery. Next basic strategies are discussed for adding these learning mechanisms to rule-based diagnostic expert systems. These strategies support the incremental evaluation and refinement of rules in the knowledge base by comparing the set of advice given by the expert system (A) with the correct diagnosis (C). Techniques are described for selecting those rules in the in the knowledge base which should participate in adaptive learning. The strategies presented may be used with a wide variety of learning algorithms. Further, these strategies are applicable to a large number of rule-based diagnostic expert systems. They may be used to provide either immediate or deferred updating of the knowledge base.
Recommendation System Based On Association Rules For Distributed E-Learning Management Systems
NASA Astrophysics Data System (ADS)
Mihai, Gabroveanu
2015-09-01
Traditional Learning Management Systems are installed on a single server where learning materials and user data are kept. To increase its performance, the Learning Management System can be installed on multiple servers; learning materials and user data could be distributed across these servers obtaining a Distributed Learning Management System. In this paper is proposed the prototype of a recommendation system based on association rules for Distributed Learning Management System. Information from LMS databases is analyzed using distributed data mining algorithms in order to extract the association rules. Then the extracted rules are used as inference rules to provide personalized recommendations. The quality of provided recommendations is improved because the rules used to make the inferences are more accurate, since these rules aggregate knowledge from all e-Learning systems included in Distributed Learning Management System.
NASA Astrophysics Data System (ADS)
Serrano, Rafael; González, Luis Carlos; Martín, Francisco Jesús
2009-11-01
Under the project SENSOR-IA which has had financial funding from the Order of Incentives to the Regional Technology Centers of the Counsil of Innovation, Science and Enterprise of Andalusia, an architecture for the optimization of a machining process in real time through rule-based expert system has been developed. The architecture consists of an acquisition system and sensor data processing engine (SATD) from an expert system (SE) rule-based which communicates with the SATD. The SE has been designed as an inference engine with an algorithm for effective action, using a modus ponens rule model of goal-oriented rules.The pilot test demonstrated that it is possible to govern in real time the machining process based on rules contained in a SE. The tests have been done with approximated rules. Future work includes an exhaustive collection of data with different tool materials and geometries in a database to extract more precise rules.
An expert system for natural language processing
NASA Technical Reports Server (NTRS)
Hennessy, John F.
1988-01-01
A solution to the natural language processing problem that uses a rule based system, written in OPS5, to replace the traditional parsing method is proposed. The advantage to using a rule based system are explored. Specifically, the extensibility of a rule based solution is discussed as well as the value of maintaining rules that function independently. Finally, the power of using semantics to supplement the syntactic analysis of a sentence is considered.
NASA Astrophysics Data System (ADS)
Silva, Paulo
2018-05-01
In many societies, informality has been a relevant part of the construction of the urban fabric. This is valid along a city’s history and in recent urbanization processes. In the past, informality was in the origin of many of urban planning. Very soon urban planning adopted, as one of their main missions malfunctions in cities. Therefore, the need of formalization became one of the main reasons on the emergence, the control of informal processes. As an answer to informal individual solutions, urban planning responded with standardized rules and the urge of creating spaces fitting into pre-established rules instead of rules fitting into spaces. Urban planning as a discipline has gradually changed its path. The contrast between urbanization promoted under formal urban planning and informal urbanization is only one sign of the mismatch between urban planning actions and informal urbanization dynamics. Considering this tension between formal and informal dynamics, in some cases, planning rules and planning processes continue ignoring informal dynamics; in other cases, planning rules are designed to integrate informality “without losing its face” through “planning games” [1]; and a third and less explored way in which planning systems interact with informality and from that interaction learn how to improve (we consider it a process of enrichment) planning rules while they promote an upgrade of informal interventions [2]. This latter win-win situation in which both informal and formal systems benefit from their interaction is still rare: most of the time either only one side benefits or none benefit from the interaction. Nevertheless, there are signs that from this interaction co-dependent adaptation might occur with positive outcomes for the urban system – in which co-evolutionary dynamics can be traced. We propose to look at the way building rules have been designed in Europe in a context considered successful in the sense of dealing of informality – the one of Portugal. The country experienced a wave of informality associated with illegal urbanization since the 1960’s in the main urban areas. The process of interaction between informal and formal urban systems proved to be a success in statistic terms. Slum clearance reduced the existence of informal occupations to almost zero. Informal settlements involving land tenure have been dealt with in the last two decades with considerable positive impact in the urban fabric. Based on this, with this paper we will evaluate how informal and formal systems are impacting each other and changing along the time the shape of building and of planning rules. For this we will look at the planning tools created to formalize informal settlements in the Lisbon Metropolitan Area from the last forty years to see how urban and building rules were adapted to respond to the specific needs of informal settlements; how this adaptation moved from temporary and exceptional to permanent rules; finally, how were these new rules able to “contaminate” the general planning and building codes. We aim that these findings would help us to contribute to a “healthier” relation between formal and informal urban systems, not ignoring each other, not controlling each other but instead learning with each other. By achieving this, planning systems become more responsive; on the other hand, informal occupations can be upgraded without being destroyed with the contribution of the planning systems.
Instruction-matrix-based genetic programming.
Li, Gang; Wang, Jin Feng; Lee, Kin Hong; Leung, Kwong-Sak
2008-08-01
In genetic programming (GP), evolving tree nodes separately would reduce the huge solution space. However, tree nodes are highly interdependent with respect to their fitness. In this paper, we propose a new GP framework, namely, instruction-matrix (IM)-based GP (IMGP), to handle their interactions. IMGP maintains an IM to evolve tree nodes and subtrees separately. IMGP extracts program trees from an IM and updates the IM with the information of the extracted program trees. As the IM actually keeps most of the information of the schemata of GP and evolves the schemata directly, IMGP is effective and efficient. Our experimental results on benchmark problems have verified that IMGP is not only better than those of canonical GP in terms of the qualities of the solutions and the number of program evaluations, but they are also better than some of the related GP algorithms. IMGP can also be used to evolve programs for classification problems. The classifiers obtained have higher classification accuracies than four other GP classification algorithms on four benchmark classification problems. The testing errors are also comparable to or better than those obtained with well-known classifiers. Furthermore, an extended version, called condition matrix for rule learning, has been used successfully to handle multiclass classification problems.
Spatio-Temporal Pattern Mining on Trajectory Data Using Arm
NASA Astrophysics Data System (ADS)
Khoshahval, S.; Farnaghi, M.; Taleai, M.
2017-09-01
Preliminary mobile was considered to be a device to make human connections easier. But today the consumption of this device has been evolved to a platform for gaming, web surfing and GPS-enabled application capabilities. Embedding GPS in handheld devices, altered them to significant trajectory data gathering facilities. Raw GPS trajectory data is a series of points which contains hidden information. For revealing hidden information in traces, trajectory data analysis is needed. One of the most beneficial concealed information in trajectory data is user activity patterns. In each pattern, there are multiple stops and moves which identifies users visited places and tasks. This paper proposes an approach to discover user daily activity patterns from GPS trajectories using association rules. Finding user patterns needs extraction of user's visited places from stops and moves of GPS trajectories. In order to locate stops and moves, we have implemented a place recognition algorithm. After extraction of visited points an advanced association rule mining algorithm, called Apriori was used to extract user activity patterns. This study outlined that there are useful patterns in each trajectory that can be emerged from raw GPS data using association rule mining techniques in order to find out about multiple users' behaviour in a system and can be utilized in various location-based applications.
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.
1988-01-01
A complete listing is given of the expert system rules for the Entry phase of the Onboard Navigation (ONAV) Ground Based Expert Trainer System for aircraft/space shuttle navigation. These source listings appear in the same format as utilized and required by the C Language Integrated Production System (CLIPS) expert system shell which is the basis for the ONAV entry system. A schematic overview is given of how the rules are organized. These groups result from a partitioning of the rules according to the overall function which a given set of rules performs. This partitioning was established and maintained according to that established in the knowledge specification document. In addition, four other groups of rules are specified. The four groups (control flow, operator inputs, output management, and data tables) perform functions that affect all the other functional rule groups. As the name implies, control flow ensures that the rule groups are executed in the order required for proper operation; operator input rules control the introduction into the CLIPS fact base of various kinds of data required by the expert system; output management rules control the updating of the ONAV expert system user display screen during execution of the system; and data tables are static information utilized by many different rule sets gathered in one convenient place.
The Dynamics of Interacting Swarms
2018-04-04
Unlimited 16 Ira Schwartzt (202) 404-8359 Swarms are self-organized dynamical coupled agents which evolve from simple rules of communication. They are ...when delay is introduced to the communicating agents. One of our major findings is that interacting swarms are far less likely to flock cohesively if...they are coupled with delay. In addition, parameter ranges based on coupling strength, incidence angle of collision, and delay change dramatically
Autonomous Learning in Mobile Cognitive Machines
2017-11-25
2016). [18] Liu, Wei, et al. "Ssd: Single shot multibox detector." European conference on computer vision. Springer, Cham, 2016. [19] Vinyals, Oriol...the brain being evolved to support its mobility has been raised. In fact, as the project progressed, the researchers discovered that if one of the...deductive, relies on rule-based programming, and can solve complex problems, however, faces difficulties in learning and adaptability. The latter
Implementation of artificial intelligence rules in a data base management system
NASA Technical Reports Server (NTRS)
Feyock, S.
1986-01-01
The intelligent front end prototype was transformed into a RIM-integrated system. A RIM-based expert system was written which demonstrated the developed capability. The use of rules to produce extensibility of the intelligent front end, including the concept of demons and rule manipulation rules were investigated. Innovative approaches such as syntax programming were to be considered.
Techniques and implementation of the embedded rule-based expert system using Ada
NASA Technical Reports Server (NTRS)
Liberman, Eugene M.; Jones, Robert E.
1991-01-01
Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada with its portability, transportability, and maintainability lends itself well to today's complex programming environment. In addition, expert systems have also assured a growing role in providing human-like reasoning capability and expertise for computer systems. The integration of expert system technology with Ada programming language, specifically a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell is discussed. The NASA Lewis Research Center was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-base power expert system, in ART-Ada. Three components, the rule-based expert system, a graphics user interface, and communications software make up SMART-Ada (Systems fault Management with ART-Ada). The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to the straight Ada code when an application requires a rule-based or knowledge-based approach.
Cosmides, Leda; Barrett, H Clark; Tooby, John
2010-05-11
Blank-slate theories of human intelligence propose that reasoning is carried out by general-purpose operations applied uniformly across contents. An evolutionary approach implies a radically different model of human intelligence. The task demands of different adaptive problems select for functionally specialized problem-solving strategies, unleashing massive increases in problem-solving power for ancestrally recurrent adaptive problems. Because exchange can evolve only if cooperators can detect cheaters, we hypothesized that the human mind would be equipped with a neurocognitive system specialized for reasoning about social exchange. Whereas humans perform poorly when asked to detect violations of most conditional rules, we predicted and found a dramatic spike in performance when the rule specifies an exchange and violations correspond to cheating. According to critics, people's uncanny accuracy at detecting violations of social exchange rules does not reflect a cheater detection mechanism, but extends instead to all rules regulating when actions are permitted (deontic conditionals). Here we report experimental tests that falsify these theories by demonstrating that deontic rules as a class do not elicit the search for violations. We show that the cheater detection system functions with pinpoint accuracy, searching for violations of social exchange rules only when these are likely to reveal the presence of someone who intends to cheat. It does not search for violations of social exchange rules when these are accidental, when they do not benefit the violator, or when the situation would make cheating difficult.
Adaptive specializations, social exchange, and the evolution of human intelligence
Cosmides, Leda; Barrett, H. Clark; Tooby, John
2010-01-01
Blank-slate theories of human intelligence propose that reasoning is carried out by general-purpose operations applied uniformly across contents. An evolutionary approach implies a radically different model of human intelligence. The task demands of different adaptive problems select for functionally specialized problem-solving strategies, unleashing massive increases in problem-solving power for ancestrally recurrent adaptive problems. Because exchange can evolve only if cooperators can detect cheaters, we hypothesized that the human mind would be equipped with a neurocognitive system specialized for reasoning about social exchange. Whereas humans perform poorly when asked to detect violations of most conditional rules, we predicted and found a dramatic spike in performance when the rule specifies an exchange and violations correspond to cheating. According to critics, people's uncanny accuracy at detecting violations of social exchange rules does not reflect a cheater detection mechanism, but extends instead to all rules regulating when actions are permitted (deontic conditionals). Here we report experimental tests that falsify these theories by demonstrating that deontic rules as a class do not elicit the search for violations. We show that the cheater detection system functions with pinpoint accuracy, searching for violations of social exchange rules only when these are likely to reveal the presence of someone who intends to cheat. It does not search for violations of social exchange rules when these are accidental, when they do not benefit the violator, or when the situation would make cheating difficult. PMID:20445099
Advocates and critics for tactical behaviors in UGV navigation
NASA Astrophysics Data System (ADS)
Hussain, Talib S.; Vidaver, Gordon; Berliner, Jeffrey
2005-05-01
Critical to the development of unmanned ground vehicle platforms is the incorporation of adaptive tactical behaviors for the planning of high-level navigation and tactical actions. BBN Technologies recently completed a simulation-based project for the Army Research Lab (ARL) in which we applied an evolutionary computation approach to navigating through a terrain to capture flag objectives while faced with one or more mobile enemies. Our Advocates and Critics for Tactical Behaviors (ACTB) system evolves plans for the vehicle that control its movement goals (in the form of waypoints), and its future actions (e.g., pointing cameras). We apply domain-specific, state-dependent genetic operators called advocates that promote specific tactical behaviors (e.g., adapt a plan to stay closer to walls). We define the fitness function as a weighted sum of a number of independent, domain-specific, state-dependent evaluation components called critics. Critics reward plans based upon specific tactical criteria, such as minimizing risk of exposure or time to the flags. Additionally, the ACTB system provides the capability for a human commander to specify the "rules of engagement" under which the vehicle will operate. The rules of engagement determine the planning emphasis required under different tactical situations (e.g., discovery of an enemy), and provide a mechanism for automatically adapting the relative selection probabilities of the advocates, the weights of the critics, and the depth of planning in response to tactical events. The ACTB system demonstrated highly effective performance in a head-to-head testing event, held by ARL, against two competing tactical behavior systems.
A Rule-Based System Implementing a Method for Translating FOL Formulas into NL Sentences
NASA Astrophysics Data System (ADS)
Mpagouli, Aikaterini; Hatzilygeroudis, Ioannis
In this paper, we mainly present the implementation of a system that translates first order logic (FOL) formulas into natural language (NL) sentences. The motivation comes from an intelligent tutoring system teaching logic as a knowledge representation language, where it is used as a means for feedback to the students-users. FOL to NL conversion is achieved by using a rule-based approach, where we exploit the pattern matching capabilities of rules. So, the system consists of rule-based modules corresponding to the phases of our translation methodology. Facts are used in a lexicon providing lexical and grammatical information that helps in producing the NL sentences. The whole system is implemented in Jess, a java-implemented rule-based programming tool. Experimental results confirm the success of our choices.
Chen, Song-Lin; Chen, Cong; Zhu, Hui; Li, Jing; Pang, Yan
2016-01-01
Cancer-related anorexia syndrome (CACS) is one of the main causes for death at present as well as a syndrome seriously harming patients' quality of life, treatment effect and survival time. In current clinical researches, there are fewer reports about empirical traditional Chinese medicine(TCM) prescriptions and patent prescriptions treating CACS, and prescription rules are rarely analyzed in a systematic manner. As the hidden rules are not excavated, it is hard to have an innovative discovery and knowledge of clinical medication. In this paper, the grey screening method combined with the multivariate statistical method was used to build the ″CACS prescriptions database″. Based on the database, totally 359 prescriptions were selected, the frequency of herbs in prescription was determined, and commonly combined drugs were evolved into 4 new prescriptions for different syndromes. Prescriptions of TCM in treatment of CACS gave priority to benefiting qi for strengthening spleen, also laid emphasis on replenishing kidney essence, dispersing stagnated liver-qi and dispersing lung-qi. Moreover, interdependence and mutual promotion of yin and yang should be taken into account to reflect TCM's holism and theory for treatment based on syndrome differentiation. The grey screening method, as a valuable traditional Chinese medicine research-supporting method, can be used to subjectively and objectively analyze prescription rules; and the new prescriptions can provide reference for the clinical use of TCM for treating CACS and the drug development. Copyright© by the Chinese Pharmaceutical Association.
Role of Utility and Inference in the Evolution of Functional Information
Sharov, Alexei A.
2009-01-01
Functional information means an encoded network of functions in living organisms from molecular signaling pathways to an organism’s behavior. It is represented by two components: code and an interpretation system, which together form a self-sustaining semantic closure. Semantic closure allows some freedom between components because small variations of the code are still interpretable. The interpretation system consists of inference rules that control the correspondence between the code and the function (phenotype) and determines the shape of the fitness landscape. The utility factor operates at multiple time scales: short-term selection drives evolution towards higher survival and reproduction rate within a given fitness landscape, and long-term selection favors those fitness landscapes that support adaptability and lead to evolutionary expansion of certain lineages. Inference rules make short-term selection possible by shaping the fitness landscape and defining possible directions of evolution, but they are under control of the long-term selection of lineages. Communication normally occurs within a set of agents with compatible interpretation systems, which I call communication system. Functional information cannot be directly transferred between communication systems with incompatible inference rules. Each biological species is a genetic communication system that carries unique functional information together with inference rules that determine evolutionary directions and constraints. This view of the relation between utility and inference can resolve the conflict between realism/positivism and pragmatism. Realism overemphasizes the role of inference in evolution of human knowledge because it assumes that logic is embedded in reality. Pragmatism substitutes usefulness for truth and therefore ignores the advantage of inference. The proposed concept of evolutionary pragmatism rejects the idea that logic is embedded in reality; instead, inference rules are constructed within each communication system to represent reality and they evolve towards higher adaptability on a long time scale. PMID:20160960
Autonomous Flight Rules - A Concept for Self-Separation in U.S. Domestic Airspace
NASA Technical Reports Server (NTRS)
Wing, David J.; Cotton, William B.
2011-01-01
Autonomous Flight Rules (AFR) are proposed as a new set of operating regulations in which aircraft navigate on tracks of their choice while self-separating from traffic and weather. AFR would exist alongside Instrument and Visual Flight Rules (IFR and VFR) as one of three available flight options for any appropriately trained and qualified operator with the necessary certified equipment. Historically, ground-based separation services evolved by necessity as aircraft began operating in the clouds and were unable to see each other. Today, technologies for global navigation, airborne surveillance, and onboard computing enable the functions of traffic conflict management to be fully integrated with navigation procedures onboard the aircraft. By self-separating, aircraft can operate with more flexibility and fewer restrictions than are required when using ground-based separation. The AFR concept is described in detail and provides practical means by which self-separating aircraft could share the same airspace as IFR and VFR aircraft without disrupting the ongoing processes of Air Traffic Control.
Verification and Validation of KBS with Neural Network Components
NASA Technical Reports Server (NTRS)
Wen, Wu; Callahan, John
1996-01-01
Artificial Neural Network (ANN) play an important role in developing robust Knowledge Based Systems (KBS). The ANN based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike traditional KBS that depends on a rule database and a production engine, the ANN based system mimics the decisions of an expert without specifically formulating the if-than type of rules. In fact, the ANNs demonstrate their superiority when such if-then type of rules are hard to generate by human expert. Verification of traditional knowledge based system is based on the proof of consistency and completeness of the rule knowledge base and correctness of the production engine.These techniques, however, can not be directly applied to ANN based components.In this position paper, we propose a verification and validation procedure for KBS with ANN based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.
A simple rule for the evolution of cooperation on graphs and social networks.
Ohtsuki, Hisashi; Hauert, Christoph; Lieberman, Erez; Nowak, Martin A
2006-05-25
A fundamental aspect of all biological systems is cooperation. Cooperative interactions are required for many levels of biological organization ranging from single cells to groups of animals. Human society is based to a large extent on mechanisms that promote cooperation. It is well known that in unstructured populations, natural selection favours defectors over cooperators. There is much current interest, however, in studying evolutionary games in structured populations and on graphs. These efforts recognize the fact that who-meets-whom is not random, but determined by spatial relationships or social networks. Here we describe a surprisingly simple rule that is a good approximation for all graphs that we have analysed, including cycles, spatial lattices, random regular graphs, random graphs and scale-free networks: natural selection favours cooperation, if the benefit of the altruistic act, b, divided by the cost, c, exceeds the average number of neighbours, k, which means b/c > k. In this case, cooperation can evolve as a consequence of 'social viscosity' even in the absence of reputation effects or strategic complexity.
Motif formation and industry specific topologies in the Japanese business firm network
NASA Astrophysics Data System (ADS)
Maluck, Julian; Donner, Reik V.; Takayasu, Hideki; Takayasu, Misako
2017-05-01
Motifs and roles are basic quantities for the characterization of interactions among 3-node subsets in complex networks. In this work, we investigate how the distribution of 3-node motifs can be influenced by modifying the rules of an evolving network model while keeping the statistics of simpler network characteristics, such as the link density and the degree distribution, invariant. We exemplify this problem for the special case of the Japanese Business Firm Network, where a well-studied and relatively simple yet realistic evolving network model is available, and compare the resulting motif distribution in the real-world and simulated networks. To better approximate the motif distribution of the real-world network in the model, we introduce both subgraph dependent and global additional rules. We find that a specific rule that allows only for the merging process between nodes with similar link directionality patterns reduces the observed excess of densely connected motifs with bidirectional links. Our study improves the mechanistic understanding of motif formation in evolving network models to better describe the characteristic features of real-world networks with a scale-free topology.
Evolution of a designless nanoparticle network into reconfigurable Boolean logic
NASA Astrophysics Data System (ADS)
Bose, S. K.; Lawrence, C. P.; Liu, Z.; Makarenko, K. S.; van Damme, R. M. J.; Broersma, H. J.; van der Wiel, W. G.
2015-12-01
Natural computers exploit the emergent properties and massive parallelism of interconnected networks of locally active components. Evolution has resulted in systems that compute quickly and that use energy efficiently, utilizing whatever physical properties are exploitable. Man-made computers, on the other hand, are based on circuits of functional units that follow given design rules. Hence, potentially exploitable physical processes, such as capacitive crosstalk, to solve a problem are left out. Until now, designless nanoscale networks of inanimate matter that exhibit robust computational functionality had not been realized. Here we artificially evolve the electrical properties of a disordered nanomaterials system (by optimizing the values of control voltages using a genetic algorithm) to perform computational tasks reconfigurably. We exploit the rich behaviour that emerges from interconnected metal nanoparticles, which act as strongly nonlinear single-electron transistors, and find that this nanoscale architecture can be configured in situ into any Boolean logic gate. This universal, reconfigurable gate would require about ten transistors in a conventional circuit. Our system meets the criteria for the physical realization of (cellular) neural networks: universality (arbitrary Boolean functions), compactness, robustness and evolvability, which implies scalability to perform more advanced tasks. Our evolutionary approach works around device-to-device variations and the accompanying uncertainties in performance. Moreover, it bears a great potential for more energy-efficient computation, and for solving problems that are very hard to tackle in conventional architectures.
An Embedded Rule-Based Diagnostic Expert System in Ada
NASA Technical Reports Server (NTRS)
Jones, Robert E.; Liberman, Eugene M.
1992-01-01
Ada is becoming an increasingly popular programming language for large Government-funded software projects. Ada with it portability, transportability, and maintainability lends itself well to today's complex programming environment. In addition, expert systems have also assumed a growing role in providing human-like reasoning capability expertise for computer systems. The integration is discussed of expert system technology with Ada programming language, especially a rule-based expert system using an ART-Ada (Automated Reasoning Tool for Ada) system shell. NASA Lewis was chosen as a beta test site for ART-Ada. The test was conducted by implementing the existing Autonomous Power EXpert System (APEX), a Lisp-based power expert system, in ART-Ada. Three components, the rule-based expert systems, a graphics user interface, and communications software make up SMART-Ada (Systems fault Management with ART-Ada). The rules were written in the ART-Ada development environment and converted to Ada source code. The graphics interface was developed with the Transportable Application Environment (TAE) Plus, which generates Ada source code to control graphics images. SMART-Ada communicates with a remote host to obtain either simulated or real data. The Ada source code generated with ART-Ada, TAE Plus, and communications code was incorporated into an Ada expert system that reads the data from a power distribution test bed, applies the rule to determine a fault, if one exists, and graphically displays it on the screen. The main objective, to conduct a beta test on the ART-Ada rule-based expert system shell, was achieved. The system is operational. New Ada tools will assist in future successful projects. ART-Ada is one such tool and is a viable alternative to the straight Ada code when an application requires a rule-based or knowledge-based approach.
Coevolving memetic algorithms: a review and progress report.
Smith, Jim E
2007-02-01
Coevolving memetic algorithms are a family of metaheuristic search algorithms in which a rule-based representation of local search (LS) is coadapted alongside candidate solutions within a hybrid evolutionary system. Simple versions of these systems have been shown to outperform other nonadaptive memetic and evolutionary algorithms on a range of problems. This paper presents a rationale for such systems and places them in the context of other recent work on adaptive memetic algorithms. It then proposes a general structure within which a population of LS algorithms can be evolved in tandem with the solutions to which they are applied. Previous research started with a simple self-adaptive system before moving on to more complex models. Results showed that the algorithm was able to discover and exploit certain forms of structure and regularities within the problems. This "metalearning" of problem features provided a means of creating highly scalable algorithms. This work is briefly reviewed to highlight some of the important findings and behaviors exhibited. Based on this analysis, new results are then presented from systems with more flexible representations, which, again, show significant improvements. Finally, the current state of, and future directions for, research in this area is discussed.
Lim, I; Walkup, R K; Vannier, M W
1993-04-01
Quantitative evaluation of upper extremity impairment, a percentage rating most often determined using a rule based procedure, has been implemented on a personal computer using an artificial intelligence, rule-based expert system (AI system). In this study, the rules given in Chapter 3 of the AMA Guides to the Evaluation of Permanent Impairment (Third Edition) were used to develop such an AI system for the Apple Macintosh. The program applies the rules from the Guides in a consistent and systematic fashion. It is faster and less error-prone than the manual method, and the results have a higher degree of precision, since intermediate values are not truncated.
Klepiszewski, K; Schmitt, T G
2002-01-01
While conventional rule based, real time flow control of sewer systems is in common use, control systems based on fuzzy logic have been used only rarely, but successfully. The intention of this study is to compare a conventional rule based control of a combined sewer system with a fuzzy logic control by using hydrodynamic simulation. The objective of both control strategies is to reduce the combined sewer overflow volume by an optimization of the utilized storage capacities of four combined sewer overflow tanks. The control systems affect the outflow of four combined sewer overflow tanks depending on the water levels inside the structures. Both systems use an identical rule base. The developed control systems are tested and optimized for a single storm event which affects heterogeneously hydraulic load conditions and local discharge. Finally the efficiencies of the two different control systems are compared for two more storm events. The results indicate that the conventional rule based control and the fuzzy control similarly reach the objective of the control strategy. In spite of the higher expense to design the fuzzy control system its use provides no advantages in this case.
Rule-based topology system for spatial databases to validate complex geographic datasets
NASA Astrophysics Data System (ADS)
Martinez-Llario, J.; Coll, E.; Núñez-Andrés, M.; Femenia-Ribera, C.
2017-06-01
A rule-based topology software system providing a highly flexible and fast procedure to enforce integrity in spatial relationships among datasets is presented. This improved topology rule system is built over the spatial extension Jaspa. Both projects are open source, freely available software developed by the corresponding author of this paper. Currently, there is no spatial DBMS that implements a rule-based topology engine (considering that the topology rules are designed and performed in the spatial backend). If the topology rules are applied in the frontend (as in many GIS desktop programs), ArcGIS is the most advanced solution. The system presented in this paper has several major advantages over the ArcGIS approach: it can be extended with new topology rules, it has a much wider set of rules, and it can mix feature attributes with topology rules as filters. In addition, the topology rule system can work with various DBMSs, including PostgreSQL, H2 or Oracle, and the logic is performed in the spatial backend. The proposed topology system allows users to check the complex spatial relationships among features (from one or several spatial layers) that require some complex cartographic datasets, such as the data specifications proposed by INSPIRE in Europe and the Land Administration Domain Model (LADM) for Cadastral data.
A new hybrid case-based reasoning approach for medical diagnosis systems.
Sharaf-El-Deen, Dina A; Moawad, Ibrahim F; Khalifa, M E
2014-02-01
Case-Based Reasoning (CBR) has been applied in many different medical applications. Due to the complexities and the diversities of this domain, most medical CBR systems become hybrid. Besides, the case adaptation process in CBR is often a challenging issue as it is traditionally carried out manually by domain experts. In this paper, a new hybrid case-based reasoning approach for medical diagnosis systems is proposed to improve the accuracy of the retrieval-only CBR systems. The approach integrates case-based reasoning and rule-based reasoning, and also applies the adaptation process automatically by exploiting adaptation rules. Both adaptation rules and reasoning rules are generated from the case-base. After solving a new case, the case-base is expanded, and both adaptation and reasoning rules are updated. To evaluate the proposed approach, a prototype was implemented and experimented to diagnose breast cancer and thyroid diseases. The final results show that the proposed approach increases the diagnosing accuracy of the retrieval-only CBR systems, and provides a reliable accuracy comparing to the current breast cancer and thyroid diagnosis systems.
A self-learning rule base for command following in dynamical systems
NASA Technical Reports Server (NTRS)
Tsai, Wei K.; Lee, Hon-Mun; Parlos, Alexander
1992-01-01
In this paper, a self-learning Rule Base for command following in dynamical systems is presented. The learning is accomplished though reinforcement learning using an associative memory called SAM. The main advantage of SAM is that it is a function approximator with explicit storage of training samples. A learning algorithm patterned after the dynamic programming is proposed. Two artificially created, unstable dynamical systems are used for testing, and the Rule Base was used to generate a feedback control to improve the command following ability of the otherwise uncontrolled systems. The numerical results are very encouraging. The controlled systems exhibit a more stable behavior and a better capability to follow reference commands. The rules resulting from the reinforcement learning are explicitly stored and they can be modified or augmented by human experts. Due to overlapping storage scheme of SAM, the stored rules are similar to fuzzy rules.
Empirical Analysis and Refinement of Expert System Knowledge Bases
1988-08-31
refinement. Both a simulated case generation program, and a random rule basher were developed to enhance rule refinement experimentation. *Substantial...the second fiscal year 88 objective was fully met. Rule Refinement System Simulated Rule Basher Case Generator Stored Cases Expert System Knowledge...generated until the rule is satisfied. Cases may be randomly generated for a given rule or hypothesis. Rule Basher Given that one has a correct
NASA Technical Reports Server (NTRS)
Hruska, S. I.; Dalke, A.; Ferguson, J. J.; Lacher, R. C.
1991-01-01
Rule-based expert systems may be structurally and functionally mapped onto a special class of neural networks called expert networks. This mapping lends itself to adaptation of connectionist learning strategies for the expert networks. A parsing algorithm to translate C Language Integrated Production System (CLIPS) rules into a network of interconnected assertion and operation nodes has been developed. The translation of CLIPS rules to an expert network and back again is illustrated. Measures of uncertainty similar to those rules in MYCIN-like systems are introduced into the CLIPS system and techniques for combining and hiring nodes in the network based on rule-firing with these certainty factors in the expert system are presented. Several learning algorithms are under study which automate the process of attaching certainty factors to rules.
Feedback-induced phase transitions in active heterogeneous conductors.
Ocko, Samuel A; Mahadevan, L
2015-04-03
An active conducting medium is one where the resistance (conductance) of the medium is modified by the current (flow) and in turn modifies the flow, so that the classical linear laws relating current and resistance, e.g., Ohm's law or Darcy's law, are modified over time as the system itself evolves. We consider a minimal model for this feedback coupling in terms of two parameters that characterize the way in which addition or removal of matter follows a simple local (or nonlocal) feedback rule corresponding to either flow-seeking or flow-avoiding behavior. Using numerical simulations and a continuum mean field theory, we show that flow-avoiding feedback causes an initially uniform system to become strongly heterogeneous via a tunneling (channel-building) phase separation; flow-seeking feedback leads to an immuring (wall-building) phase separation. Our results provide a qualitative explanation for the patterning of active conducting media in natural systems, while suggesting ways to realize complex architectures using simple rules in engineered systems.
Modeling the stylized facts in finance through simple nonlinear adaptive systems
Hommes, Cars H.
2002-01-01
Recent work on adaptive systems for modeling financial markets is discussed. Financial markets are viewed as evolutionary systems between different, competing trading strategies. Agents are boundedly rational in the sense that they tend to follow strategies that have performed well, according to realized profits or accumulated wealth, in the recent past. Simple technical trading rules may survive evolutionary competition in a heterogeneous world where prices and beliefs co-evolve over time. Evolutionary models can explain important stylized facts, such as fat tails, clustered volatility, and long memory, of real financial series. PMID:12011401
Anisotropic invasion and its consequences in two-strategy evolutionary games on a square lattice
NASA Astrophysics Data System (ADS)
Szabó, György; Varga, Levente; Szabó, Mátyás
2016-11-01
We have studied invasion processes in two-strategy evolutionary games on a square lattice for imitation rule when the players interact with their nearest neighbors. Monte Carlo simulations are performed for systems where the pair interactions are composed of a unit strength coordination game when varying the strengths of the self-dependent and cross-dependent components at a fixed noise level. The visualization of strategy distributions has clearly indicated that circular homogeneous domains evolve into squares with an orientation dependent on the composition. This phenomenon is related to the anisotropy of invasion velocities along the interfaces separating the two homogeneous regions. The quantified invasion velocities indicate the existence of a parameter region in which the invasions are opposite for the horizontal (or vertical) and the tilted interfaces. In this parameter region faceted islands of both strategies shrink and the system evolves from a random initial state into the homogeneous state that first percolated.
Research on complex 3D tree modeling based on L-system
NASA Astrophysics Data System (ADS)
Gang, Chen; Bin, Chen; Yuming, Liu; Hui, Li
2018-03-01
L-system as a fractal iterative system could simulate complex geometric patterns. Based on the field observation data of trees and knowledge of forestry experts, this paper extracted modeling constraint rules and obtained an L-system rules set. Using the self-developed L-system modeling software the L-system rule set was parsed to generate complex tree 3d models.The results showed that the geometrical modeling method based on l-system could be used to describe the morphological structure of complex trees and generate 3D tree models.
RuleMonkey: software for stochastic simulation of rule-based models
2010-01-01
Background The system-level dynamics of many molecular interactions, particularly protein-protein interactions, can be conveniently represented using reaction rules, which can be specified using model-specification languages, such as the BioNetGen language (BNGL). A set of rules implicitly defines a (bio)chemical reaction network. The reaction network implied by a set of rules is often very large, and as a result, generation of the network implied by rules tends to be computationally expensive. Moreover, the cost of many commonly used methods for simulating network dynamics is a function of network size. Together these factors have limited application of the rule-based modeling approach. Recently, several methods for simulating rule-based models have been developed that avoid the expensive step of network generation. The cost of these "network-free" simulation methods is independent of the number of reactions implied by rules. Software implementing such methods is now needed for the simulation and analysis of rule-based models of biochemical systems. Results Here, we present a software tool called RuleMonkey, which implements a network-free method for simulation of rule-based models that is similar to Gillespie's method. The method is suitable for rule-based models that can be encoded in BNGL, including models with rules that have global application conditions, such as rules for intramolecular association reactions. In addition, the method is rejection free, unlike other network-free methods that introduce null events, i.e., steps in the simulation procedure that do not change the state of the reaction system being simulated. We verify that RuleMonkey produces correct simulation results, and we compare its performance against DYNSTOC, another BNGL-compliant tool for network-free simulation of rule-based models. We also compare RuleMonkey against problem-specific codes implementing network-free simulation methods. Conclusions RuleMonkey enables the simulation of rule-based models for which the underlying reaction networks are large. It is typically faster than DYNSTOC for benchmark problems that we have examined. RuleMonkey is freely available as a stand-alone application http://public.tgen.org/rulemonkey. It is also available as a simulation engine within GetBonNie, a web-based environment for building, analyzing and sharing rule-based models. PMID:20673321
76 FR 22633 - Retail Foreign Exchange Transactions
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-22
... margin. A national bank's relationship with a retail forex customer may evolve out of a prior... currency with retail customers. The proposed rule also describes various requirements with which national... CEA with a retail customer \\5\\ except pursuant to a rule or regulation of a Federal regulatory agency...
A Swarm Optimization approach for clinical knowledge mining.
Christopher, J Jabez; Nehemiah, H Khanna; Kannan, A
2015-10-01
Rule-based classification is a typical data mining task that is being used in several medical diagnosis and decision support systems. The rules stored in the rule base have an impact on classification efficiency. Rule sets that are extracted with data mining tools and techniques are optimized using heuristic or meta-heuristic approaches in order to improve the quality of the rule base. In this work, a meta-heuristic approach called Wind-driven Swarm Optimization (WSO) is used. The uniqueness of this work lies in the biological inspiration that underlies the algorithm. WSO uses Jval, a new metric, to evaluate the efficiency of a rule-based classifier. Rules are extracted from decision trees. WSO is used to obtain different permutations and combinations of rules whereby the optimal ruleset that satisfies the requirement of the developer is used for predicting the test data. The performance of various extensions of decision trees, namely, RIPPER, PART, FURIA and Decision Tables are analyzed. The efficiency of WSO is also compared with the traditional Particle Swarm Optimization. Experiments were carried out with six benchmark medical datasets. The traditional C4.5 algorithm yields 62.89% accuracy with 43 rules for liver disorders dataset where as WSO yields 64.60% with 19 rules. For Heart disease dataset, C4.5 is 68.64% accurate with 98 rules where as WSO is 77.8% accurate with 34 rules. The normalized standard deviation for accuracy of PSO and WSO are 0.5921 and 0.5846 respectively. WSO provides accurate and concise rulesets. PSO yields results similar to that of WSO but the novelty of WSO lies in its biological motivation and it is customization for rule base optimization. The trade-off between the prediction accuracy and the size of the rule base is optimized during the design and development of rule-based clinical decision support system. The efficiency of a decision support system relies on the content of the rule base and classification accuracy. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Nieten, Joseph L.; Burke, Roger
1993-03-01
The system diagnostic builder (SDB) is an automated knowledge acquisition tool using state- of-the-art artificial intelligence (AI) technologies. The SDB uses an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert (SME). Thus, data is captured from the subject system, classified by an expert, and used to drive the rule generation process. These rule-bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The rule-bases can be used in any knowledge based system which monitors or controls a physical system or simulation. The SDB has demonstrated the utility of using inductive machine learning technology to generate reliable knowledge bases. In fact, we have discovered that the knowledge captured by the SDB can be used in any number of applications. For example, the knowledge bases captured from the SMS can be used as black box simulations by intelligent computer aided training devices. We can also use the SDB to construct knowledge bases for the process control industry, such as chemical production, or oil and gas production. These knowledge bases can be used in automated advisory systems to ensure safety, productivity, and consistency.
Rule-guided human classification of Volunteered Geographic Information
NASA Astrophysics Data System (ADS)
Ali, Ahmed Loai; Falomir, Zoe; Schmid, Falko; Freksa, Christian
2017-05-01
During the last decade, web technologies and location sensing devices have evolved generating a form of crowdsourcing known as Volunteered Geographic Information (VGI). VGI acted as a platform of spatial data collection, in particular, when a group of public participants are involved in collaborative mapping activities: they work together to collect, share, and use information about geographic features. VGI exploits participants' local knowledge to produce rich data sources. However, the resulting data inherits problematic data classification. In VGI projects, the challenges of data classification are due to the following: (i) data is likely prone to subjective classification, (ii) remote contributions and flexible contribution mechanisms in most projects, and (iii) the uncertainty of spatial data and non-strict definitions of geographic features. These factors lead to various forms of problematic classification: inconsistent, incomplete, and imprecise data classification. This research addresses classification appropriateness. Whether the classification of an entity is appropriate or inappropriate is related to quantitative and/or qualitative observations. Small differences between observations may be not recognizable particularly for non-expert participants. Hence, in this paper, the problem is tackled by developing a rule-guided classification approach. This approach exploits data mining techniques of Association Classification (AC) to extract descriptive (qualitative) rules of specific geographic features. The rules are extracted based on the investigation of qualitative topological relations between target features and their context. Afterwards, the extracted rules are used to develop a recommendation system able to guide participants to the most appropriate classification. The approach proposes two scenarios to guide participants towards enhancing the quality of data classification. An empirical study is conducted to investigate the classification of grass-related features like forest, garden, park, and meadow. The findings of this study indicate the feasibility of the proposed approach.
NASA Astrophysics Data System (ADS)
Lian, Junhe; Shen, Fuhui; Liu, Wenqi; Münstermann, Sebastian
2018-05-01
The constitutive model development has been driven to a very accurate and fine-resolution description of the material behaviour responding to various environmental variable changes. The evolving features of the anisotropic behaviour during deformation, therefore, has drawn particular attention due to its possible impacts on the sheet metal forming industry. An evolving non-associated Hill48 (enHill48) model was recently proposed and applied to the forming limit prediction by coupling with the modified maximum force criterion. On the one hand, the study showed the significance to include the anisotropic evolution for accurate forming limit prediction. On the other hand, it also illustrated that the enHill48 model introduced an instability region that suddenly decreases the formability. Therefore, in this study, an alternative model that is based on the associated flow rule and provides similar anisotropic predictive capability is extended to chapter the evolving effects and further applied to the forming limit prediction. The final results are compared with experimental data as well as the results by enHill48 model.
An expert system to manage the operation of the Space Shuttle's fuel cell cryogenic reactant tanks
NASA Technical Reports Server (NTRS)
Murphey, Amy Y.
1990-01-01
This paper describes a rule-based expert system to manage the operation of the Space Shuttle's cryogenic fuel system. Rules are based on standard fuel tank operating procedures described in the EECOM Console Handbook. The problem of configuring the operation of the Space Shuttle's fuel tanks is well-bounded and well defined. Moreover, the solution of this problem can be encoded in a knowledge-based system. Therefore, a rule-based expert system is the appropriate paradigm. Furthermore, the expert system could be used in coordination with power system simulation software to design operating procedures for specific missions.
Research of Litchi Diseases Diagnosis Expertsystem Based on Rbr and Cbr
NASA Astrophysics Data System (ADS)
Xu, Bing; Liu, Liqun
To conquer the bottleneck problems existing in the traditional rule-based reasoning diseases diagnosis system, such as low reasoning efficiency and lack of flexibility, etc.. It researched the integrated case-based reasoning (CBR) and rule-based reasoning (RBR) technology, and put forward a litchi diseases diagnosis expert system (LDDES) with integrated reasoning method. The method use data mining and knowledge obtaining technology to establish knowledge base and case library. It adopt rules to instruct the retrieval and matching for CBR, and use association rule and decision trees algorithm to calculate case similarity.The experiment shows that the method can increase the system's flexibility and reasoning ability, and improve the accuracy of litchi diseases diagnosis.
Hotz, Christine S; Templeton, Steven J; Christopher, Mary M
2005-03-01
A rule-based expert system using CLIPS programming language was created to classify body cavity effusions as transudates, modified transudates, exudates, chylous, and hemorrhagic effusions. The diagnostic accuracy of the rule-based system was compared with that produced by 2 machine-learning methods: Rosetta, a rough sets algorithm and RIPPER, a rule-induction method. Results of 508 body cavity fluid analyses (canine, feline, equine) obtained from the University of California-Davis Veterinary Medical Teaching Hospital computerized patient database were used to test CLIPS and to test and train RIPPER and Rosetta. The CLIPS system, using 17 rules, achieved an accuracy of 93.5% compared with pathologist consensus diagnoses. Rosetta accurately classified 91% of effusions by using 5,479 rules. RIPPER achieved the greatest accuracy (95.5%) using only 10 rules. When the original rules of the CLIPS application were replaced with those of RIPPER, the accuracy rates were identical. These results suggest that both rule-based expert systems and machine-learning methods hold promise for the preliminary classification of body fluids in the clinical laboratory.
The “Common Rule” refers to the federal regulations that govern research involving human subjects. These regulations have been largely unchanged since 1981, while the research they cover has continued to evolve. After a 6-year rulemaking process, the Common Rule was ...
An integrated theory of the mind.
Anderson, John R; Bothell, Daniel; Byrne, Michael D; Douglass, Scott; Lebiere, Christian; Qin, Yulin
2004-10-01
Adaptive control of thought-rational (ACT-R; J. R. Anderson & C. Lebiere, 1998) has evolved into a theory that consists of multiple modules but also explains how these modules are integrated to produce coherent cognition. The perceptual-motor modules, the goal module, and the declarative memory module are presented as examples of specialized systems in ACT-R. These modules are associated with distinct cortical regions. These modules place chunks in buffers where they can be detected by a production system that responds to patterns of information in the buffers. At any point in time, a single production rule is selected to respond to the current pattern. Subsymbolic processes serve to guide the selection of rules to fire as well as the internal operations of some modules. Much of learning involves tuning of these subsymbolic processes. A number of simple and complex empirical examples are described to illustrate how these modules function singly and in concert. 2004 APA
Implementing a Commercial Rule Base as a Medication Order Safety Net
Reichley, Richard M.; Seaton, Terry L.; Resetar, Ervina; Micek, Scott T.; Scott, Karen L.; Fraser, Victoria J.; Dunagan, W. Claiborne; Bailey, Thomas C.
2005-01-01
A commercial rule base (Cerner Multum) was used to identify medication orders exceeding recommended dosage limits at five hospitals within BJC HealthCare, an integrated health care system. During initial testing, clinical pharmacists determined that there was an excessive number of nuisance and clinically insignificant alerts, with an overall alert rate of 9.2%. A method for customizing the commercial rule base was implemented to increase rule specificity for problematic rules. The system was subsequently deployed at two facilities and achieved alert rates of less than 1%. Pharmacists screened these alerts and contacted ordering physicians in 21% of cases. Physicians made therapeutic changes in response to 38% of alerts presented to them. By applying simple techniques to customize rules, commercial rule bases can be used to rapidly deploy a safety net to screen drug orders for excessive dosages, while preserving the rule architecture for later implementations of more finely tuned clinical decision support. PMID:15802481
A Robust Scalable Transportation System Concept
NASA Technical Reports Server (NTRS)
Hahn, Andrew; DeLaurentis, Daniel
2006-01-01
This report documents the 2005 Revolutionary System Concept for Aeronautics (RSCA) study entitled "A Robust, Scalable Transportation System Concept". The objective of the study was to generate, at a high-level of abstraction, characteristics of a new concept for the National Airspace System, or the new NAS, under which transportation goals such as increased throughput, delay reduction, and improved robustness could be realized. Since such an objective can be overwhelmingly complex if pursued at the lowest levels of detail, instead a System-of-Systems (SoS) approach was adopted to model alternative air transportation architectures at a high level. The SoS approach allows the consideration of not only the technical aspects of the NAS", but also incorporates policy, socio-economic, and alternative transportation system considerations into one architecture. While the representations of the individual systems are basic, the higher level approach allows for ways to optimize the SoS at the network level, determining the best topology (i.e. configuration of nodes and links). The final product (concept) is a set of rules of behavior and network structure that not only satisfies national transportation goals, but represents the high impact rules that accomplish those goals by getting the agents to "do the right thing" naturally. The novel combination of Agent Based Modeling and Network Theory provides the core analysis methodology in the System-of-Systems approach. Our method of approach is non-deterministic which means, fundamentally, it asks and answers different questions than deterministic models. The nondeterministic method is necessary primarily due to our marriage of human systems with technological ones in a partially unknown set of future worlds. Our goal is to understand and simulate how the SoS, human and technological components combined, evolve.
Healthcare information systems: data mining methods in the creation of a clinical recommender system
NASA Astrophysics Data System (ADS)
Duan, L.; Street, W. N.; Xu, E.
2011-05-01
Recommender systems have been extensively studied to present items, such as movies, music and books that are likely of interest to the user. Researchers have indicated that integrated medical information systems are becoming an essential part of the modern healthcare systems. Such systems have evolved to an integrated enterprise-wide system. In particular, such systems are considered as a type of enterprise information systems or ERP system addressing healthcare industry sector needs. As part of efforts, nursing care plan recommender systems can provide clinical decision support, nursing education, clinical quality control, and serve as a complement to existing practice guidelines. We propose to use correlations among nursing diagnoses, outcomes and interventions to create a recommender system for constructing nursing care plans. In the current study, we used nursing diagnosis data to develop the methodology. Our system utilises a prefix-tree structure common in itemset mining to construct a ranked list of suggested care plan items based on previously-entered items. Unlike common commercial systems, our system makes sequential recommendations based on user interaction, modifying a ranked list of suggested items at each step in care plan construction. We rank items based on traditional association-rule measures such as support and confidence, as well as a novel measure that anticipates which selections might improve the quality of future rankings. Since the multi-step nature of our recommendations presents problems for traditional evaluation measures, we also present a new evaluation method based on average ranking position and use it to test the effectiveness of different recommendation strategies.
Implementing PAT with Standards
NASA Astrophysics Data System (ADS)
Chandramohan, Laakshmana Sabari; Doolla, Suryanarayana; Khaparde, S. A.
2016-02-01
Perform Achieve Trade (PAT) is a market-based incentive mechanism to promote energy efficiency. The purpose of this work is to address the challenges inherent to inconsistent representation of business processes, and interoperability issues in PAT like cap-and-trade mechanisms especially when scaled. Studies by various agencies have highlighted that as the mechanism evolves including more industrial sectors and industries in its ambit, implementation will become more challenging. This paper analyses the major needs of PAT (namely tracking, monitoring, auditing & verifying energy-saving reports, and providing technical support & guidance to stakeholders); and how the aforesaid reasons affect them. Though current technologies can handle these challenges to an extent, standardization activities for implementation have been scanty for PAT and this work attempts to evolve them. The inconsistent modification of business processes, rules, and procedures across stakeholders, and interoperability among heterogeneous systems are addressed. This paper proposes the adoption of specifically two standards into PAT, namely Business Process Model and Notation for maintaining consistency in business process modelling, and Common Information Model (IEC 61970, 61968, 62325 combined) for information exchange. Detailed architecture and organization of these adoptions are reported. The work can be used by PAT implementing agencies, stakeholders, and standardization bodies.
Rule-based mechanisms of learning for intelligent adaptive flight control
NASA Technical Reports Server (NTRS)
Handelman, David A.; Stengel, Robert F.
1990-01-01
How certain aspects of human learning can be used to characterize learning in intelligent adaptive control systems is investigated. Reflexive and declarative memory and learning are described. It is shown that model-based systems-theoretic adaptive control methods exhibit attributes of reflexive learning, whereas the problem-solving capabilities of knowledge-based systems of artificial intelligence are naturally suited for implementing declarative learning. Issues related to learning in knowledge-based control systems are addressed, with particular attention given to rule-based systems. A mechanism for real-time rule-based knowledge acquisition is suggested, and utilization of this mechanism within the context of failure diagnosis for fault-tolerant flight control is demonstrated.
Evolution and inheritance of early embryonic patterning in D. simulans and D. sechellia
Lott, Susan E.; Ludwig, Michael Z.; Kreitman, Martin
2010-01-01
Pattern formation in Drosophila is a widely studied example of a robust developmental system. Such robust systems pose a challenge to adaptive evolution, as they mask variation which selection may otherwise act upon. Yet we find variation in the localization of expression domains (henceforth ‘stripe allometry’) in the pattern formation pathway. Specifically, we characterize differences in the gap genes giant and Kruppel, and the pair-rule gene even-skipped, which differ between the sibling species D. simulans and D. sechellia. In a double-backcross experiment, stripe allometry is consistent with maternal inheritance of stripe positioning and multiple genetic factors, with a distinct genetic basis from embryo length. Embryos produced by F1 and F2 backcross mothers exhibit novel spatial patterns of gene expression relative to the parental species, with no measurable increase in positional variance among individuals. Buffering of novel spatial patterns in the backcross genotypes suggests that robustness need not be disrupted in order for the trait to evolve, and perhaps the system is incapable of evolving to prevent the expression of all genetic variation. This limitation, and the ability of natural selection to act on minute genetic differences that are within the “margin of error” for the buffering mechanism, indicates that developmentally buffered traits can evolve without disruption of robustness PMID:21121913
Grouin, Cyril; Zweigenbaum, Pierre
2013-01-01
In this paper, we present a comparison of two approaches to automatically de-identify medical records written in French: a rule-based system and a machine-learning based system using a conditional random fields (CRF) formalism. Both systems have been designed to process nine identifiers in a corpus of medical records in cardiology. We performed two evaluations: first, on 62 documents in cardiology, and on 10 documents in foetopathology - produced by optical character recognition (OCR) - to evaluate the robustness of our systems. We achieved a 0.843 (rule-based) and 0.883 (machine-learning) exact match overall F-measure in cardiology. While the rule-based system allowed us to achieve good results on nominative (first and last names) and numerical data (dates, phone numbers, and zip codes), the machine-learning approach performed best on more complex categories (postal addresses, hospital names, medical devices, and towns). On the foetopathology corpus, although our systems have not been designed for this corpus and despite OCR character recognition errors, we obtained promising results: a 0.681 (rule-based) and 0.638 (machine-learning) exact-match overall F-measure. This demonstrates that existing tools can be applied to process new documents of lower quality.
NASA Astrophysics Data System (ADS)
Huang, Yin; Chen, Jianhua; Xiong, Shaojun
2009-07-01
Mobile-Learning (M-learning) makes many learners get the advantages of both traditional learning and E-learning. Currently, Web-based Mobile-Learning Systems have created many new ways and defined new relationships between educators and learners. Association rule mining is one of the most important fields in data mining and knowledge discovery in databases. Rules explosion is a serious problem which causes great concerns, as conventional mining algorithms often produce too many rules for decision makers to digest. Since Web-based Mobile-Learning System collects vast amounts of student profile data, data mining and knowledge discovery techniques can be applied to find interesting relationships between attributes of learners, assessments, the solution strategies adopted by learners and so on. Therefore ,this paper focus on a new data-mining algorithm, combined with the advantages of genetic algorithm and simulated annealing algorithm , called ARGSA(Association rules based on an improved Genetic Simulated Annealing Algorithm), to mine the association rules. This paper first takes advantage of the Parallel Genetic Algorithm and Simulated Algorithm designed specifically for discovering association rules. Moreover, the analysis and experiment are also made to show the proposed method is superior to the Apriori algorithm in this Mobile-Learning system.
TEES 2.2: Biomedical Event Extraction for Diverse Corpora
2015-01-01
Background The Turku Event Extraction System (TEES) is a text mining program developed for the extraction of events, complex biomedical relationships, from scientific literature. Based on a graph-generation approach, the system detects events with the use of a rich feature set built via dependency parsing. The TEES system has achieved record performance in several of the shared tasks of its domain, and continues to be used in a variety of biomedical text mining tasks. Results The TEES system was quickly adapted to the BioNLP'13 Shared Task in order to provide a public baseline for derived systems. An automated approach was developed for learning the underlying annotation rules of event type, allowing immediate adaptation to the various subtasks, and leading to a first place in four out of eight tasks. The system for the automated learning of annotation rules is further enhanced in this paper to the point of requiring no manual adaptation to any of the BioNLP'13 tasks. Further, the scikit-learn machine learning library is integrated into the system, bringing a wide variety of machine learning methods usable with TEES in addition to the default SVM. A scikit-learn ensemble method is also used to analyze the importances of the features in the TEES feature sets. Conclusions The TEES system was introduced for the BioNLP'09 Shared Task and has since then demonstrated good performance in several other shared tasks. By applying the current TEES 2.2 system to multiple corpora from these past shared tasks an overarching analysis of the most promising methods and possible pitfalls in the evolving field of biomedical event extraction are presented. PMID:26551925
TEES 2.2: Biomedical Event Extraction for Diverse Corpora.
Björne, Jari; Salakoski, Tapio
2015-01-01
The Turku Event Extraction System (TEES) is a text mining program developed for the extraction of events, complex biomedical relationships, from scientific literature. Based on a graph-generation approach, the system detects events with the use of a rich feature set built via dependency parsing. The TEES system has achieved record performance in several of the shared tasks of its domain, and continues to be used in a variety of biomedical text mining tasks. The TEES system was quickly adapted to the BioNLP'13 Shared Task in order to provide a public baseline for derived systems. An automated approach was developed for learning the underlying annotation rules of event type, allowing immediate adaptation to the various subtasks, and leading to a first place in four out of eight tasks. The system for the automated learning of annotation rules is further enhanced in this paper to the point of requiring no manual adaptation to any of the BioNLP'13 tasks. Further, the scikit-learn machine learning library is integrated into the system, bringing a wide variety of machine learning methods usable with TEES in addition to the default SVM. A scikit-learn ensemble method is also used to analyze the importances of the features in the TEES feature sets. The TEES system was introduced for the BioNLP'09 Shared Task and has since then demonstrated good performance in several other shared tasks. By applying the current TEES 2.2 system to multiple corpora from these past shared tasks an overarching analysis of the most promising methods and possible pitfalls in the evolving field of biomedical event extraction are presented.
Integrating policy-based management and SLA performance monitoring
NASA Astrophysics Data System (ADS)
Liu, Tzong-Jye; Lin, Chin-Yi; Chang, Shu-Hsin; Yen, Meng-Tzu
2001-10-01
Policy-based management system provides the configuration capability for the system administrators to focus on the requirements of customers. The service level agreement performance monitoring mechanism helps system administrators to verify the correctness of policies. However, it is difficult for a device to process the policies directly because the policies are the management concept. This paper proposes a mechanism to decompose a policy into rules that can be efficiently processed by a device. Thus, the device may process the rule and collect the performance statistics information efficiently; and the policy-based management system may collect these performance statistics information and report the service-level agreement performance monitoring information to the system administrator. The proposed policy-based management system achieves both the policy configuration and service-level agreement performance monitoring requirements. A policy consists of a condition part and an action part. The condition part is a Boolean expression of a source host IP group, a destination host IP group, etc. The action part is the parameters of services. We say that an address group is compact if it only consists of a range of IP address that can be denoted by a pair of IP address and corresponding IP mask. If the condition part of a policy only consists of the compact address group, we say that the policy is a rule. Since a device can efficiently process a compact address and a system administrator prefers to define a range of IP address, the policy-based management system has to translate policy into rules and supplements the gaps between policy and rules. The proposed policy-based management system builds the relationships between VPN and policies, policy and rules. Since the system administrator wants to monitor the system performance information of VPNs and policies, the proposed policy-based management system downloads the relationships among VPNs, policies and rules to the SNMP agents. The SNMP agents build the management information base (MIB) of all VPNs, policies and rules according to the relationships obtained from the management server. Thus, the proposed policy-based management system may get all performance monitoring information of VPNs and policies from agents. The proposed policy-based manager achieves two goals: a) provide a management environment for the system administrator to configure their network only considering the policy requirement issues and b) let the device have only to process the packet and then collect the required performance information. These two things make the proposed management system satisfy both the user and device requirements.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-13
... in the face of changing marketplace conditions, evolving consumer behavior, and technological...) 326-2984, Attorney, Division of Enforcement, Bureau of Consumer Protection, Federal Trade Commission... continuing need for the rule or guide as well as the rule's or guide's costs and benefits to consumers and...
Organizational Knowledge Transfer Using Ontologies and a Rule-Based System
NASA Astrophysics Data System (ADS)
Okabe, Masao; Yoshioka, Akiko; Kobayashi, Keido; Yamaguchi, Takahira
In recent automated and integrated manufacturing, so-called intelligence skill is becoming more and more important and its efficient transfer to next-generation engineers is one of the urgent issues. In this paper, we propose a new approach without costly OJT (on-the-job training), that is, combinational usage of a domain ontology, a rule ontology and a rule-based system. Intelligence skill can be decomposed into pieces of simple engineering rules. A rule ontology consists of these engineering rules as primitives and the semantic relations among them. A domain ontology consists of technical terms in the engineering rules and the semantic relations among them. A rule ontology helps novices get the total picture of the intelligence skill and a domain ontology helps them understand the exact meanings of the engineering rules. A rule-based system helps domain experts externalize their tacit intelligence skill to ontologies and also helps novices internalize them. As a case study, we applied our proposal to some actual job at a remote control and maintenance office of hydroelectric power stations in Tokyo Electric Power Co., Inc. We also did an evaluation experiment for this case study and the result supports our proposal.
Evolving virtual creatures and catapults.
Chaumont, Nicolas; Egli, Richard; Adami, Christoph
2007-01-01
We present a system that can evolve the morphology and the controller of virtual walking and block-throwing creatures (catapults) using a genetic algorithm. The system is based on Sims' work, implemented as a flexible platform with an off-the-shelf dynamics engine. Experiments aimed at evolving Sims-type walkers resulted in the emergence of various realistic gaits while using fairly simple objective functions. Due to the flexibility of the system, drastically different morphologies and functions evolved with only minor modifications to the system and objective function. For example, various throwing techniques evolved when selecting for catapults that propel a block as far as possible. Among the strategies and morphologies evolved, we find the drop-kick strategy, as well as the systematic invention of the principle behind the wheel, when allowing mutations to the projectile.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Woods, M. P.; Centre for Quantum Technologies, National University of Singapore; QuTech, Delft University of Technology, Lorentzweg 1, 2611 CJ Delft
2016-02-15
Instances of discrete quantum systems coupled to a continuum of oscillators are ubiquitous in physics. Often the continua are approximated by a discrete set of modes. We derive error bounds on expectation values of system observables that have been time evolved under such discretised Hamiltonians. These bounds take on the form of a function of time and the number of discrete modes, where the discrete modes are chosen according to Gauss quadrature rules. The derivation makes use of tools from the field of Lieb-Robinson bounds and the theory of orthonormal polynomials.
NASA Technical Reports Server (NTRS)
Sartori, Michael A.; Passino, Kevin M.; Antsaklis, Panos J.
1992-01-01
In rule-based AI planning, expert, and learning systems, it is often the case that the left-hand-sides of the rules must be repeatedly compared to the contents of some 'working memory'. The traditional approach to solve such a 'match phase problem' for production systems is to use the Rete Match Algorithm. Here, a new technique using a multilayer perceptron, a particular artificial neural network model, is presented to solve the match phase problem for rule-based AI systems. A syntax for premise formulas (i.e., the left-hand-sides of the rules) is defined, and working memory is specified. From this, it is shown how to construct a multilayer perceptron that finds all of the rules which can be executed for the current situation in working memory. The complexity of the constructed multilayer perceptron is derived in terms of the maximum number of nodes and the required number of layers. A method for reducing the number of layers to at most three is also presented.
ARROWSMITH-P: A prototype expert system for software engineering management
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Ramsey, Connie Loggia
1985-01-01
Although the field of software engineering is relatively new, it can benefit from the use of expert systems. Two prototype expert systems were developed to aid in software engineering management. Given the values for certain metrics, these systems will provide interpretations which explain any abnormal patterns of these values during the development of a software project. The two systems, which solve the same problem, were built using different methods, rule-based deduction and frame-based abduction. A comparison was done to see which method was better suited to the needs of this field. It was found that both systems performed moderately well, but the rule-based deduction system using simple rules provided more complete solutions than did the frame-based abduction system.
Web-based Weather Expert System (WES) for Space Shuttle Launch
NASA Technical Reports Server (NTRS)
Bardina, Jorge E.; Rajkumar, T.
2003-01-01
The Web-based Weather Expert System (WES) is a critical module of the Virtual Test Bed development to support 'go/no go' decisions for Space Shuttle operations in the Intelligent Launch and Range Operations program of NASA. The weather rules characterize certain aspects of the environment related to the launching or landing site, the time of the day or night, the pad or runway conditions, the mission durations, the runway equipment and landing type. Expert system rules are derived from weather contingency rules, which were developed over years by NASA. Backward chaining, a goal-directed inference method is adopted, because a particular consequence or goal clause is evaluated first, and then chained backward through the rules. Once a rule is satisfied or true, then that particular rule is fired and the decision is expressed. The expert system is continuously verifying the rules against the past one-hour weather conditions and the decisions are made. The normal procedure of operations requires a formal pre-launch weather briefing held on Launch minus 1 day, which is a specific weather briefing for all areas of Space Shuttle launch operations. In this paper, the Web-based Weather Expert System of the Intelligent Launch and range Operations program is presented.
An Expert-System Engine With Operative Probabilities
NASA Technical Reports Server (NTRS)
Orlando, N. E.; Palmer, M. T.; Wallace, R. S.
1986-01-01
Program enables proof-of-concepts tests of expert systems under development. AESOP is rule-based inference engine for expert system, which makes decisions about particular situation given user-supplied hypotheses, rules, and answers to questions drawn from rules. If knowledge base containing hypotheses and rules governing environment is available to AESOP, almost any situation within that environment resolved by answering questions asked by AESOP. Questions answered with YES, NO, MAYBE, DON'T KNOW, DON'T CARE, or with probability factor ranging from 0 to 10. AESOP written in Franz LISP for interactive execution.
NASA Astrophysics Data System (ADS)
Mabu, Shingo; Hirasawa, Kotaro; Furuzuki, Takayuki
Genetic Network Programming (GNP) is an evolutionary computation which represents its solutions using graph structures. Since GNP can create quite compact programs and has an implicit memory function, it has been clarified that GNP works well especially in dynamic environments. In addition, a study on creating trading rules on stock markets using GNP with Importance Index (GNP-IMX) has been done. IMX is a new element which is a criterion for decision making. In this paper, we combined GNP-IMX with Actor-Critic (GNP-IMX&AC) and create trading rules on stock markets. Evolution-based methods evolve their programs after enough period of time because they must calculate fitness values, however reinforcement learning can change programs during the period, therefore the trading rules can be created efficiently. In the simulation, the proposed method is trained using the stock prices of 10 brands in 2002 and 2003. Then the generalization ability is tested using the stock prices in 2004. The simulation results show that the proposed method can obtain larger profits than GNP-IMX without AC and Buy&Hold.
Distributed traffic signal control using fuzzy logic
NASA Technical Reports Server (NTRS)
Chiu, Stephen
1992-01-01
We present a distributed approach to traffic signal control, where the signal timing parameters at a given intersection are adjusted as functions of the local traffic condition and of the signal timing parameters at adjacent intersections. Thus, the signal timing parameters evolve dynamically using only local information to improve traffic flow. This distributed approach provides for a fault-tolerant, highly responsive traffic management system. The signal timing at an intersection is defined by three parameters: cycle time, phase split, and offset. We use fuzzy decision rules to adjust these three parameters based only on local information. The amount of change in the timing parameters during each cycle is limited to a small fraction of the current parameters to ensure smooth transition. We show the effectiveness of this method through simulation of the traffic flow in a network of controlled intersections.
Multiagent data warehousing and multiagent data mining for cerebrum/cerebellum modeling
NASA Astrophysics Data System (ADS)
Zhang, Wen-Ran
2002-03-01
An algorithm named Neighbor-Miner is outlined for multiagent data warehousing and multiagent data mining. The algorithm is defined in an evolving dynamic environment with autonomous or semiautonomous agents. Instead of mining frequent itemsets from customer transactions, the new algorithm discovers new agents and mining agent associations in first-order logic from agent attributes and actions. While the Apriori algorithm uses frequency as a priory threshold, the new algorithm uses agent similarity as priory knowledge. The concept of agent similarity leads to the notions of agent cuboid, orthogonal multiagent data warehousing (MADWH), and multiagent data mining (MADM). Based on agent similarities and action similarities, Neighbor-Miner is proposed and illustrated in a MADWH/MADM approach to cerebrum/cerebellum modeling. It is shown that (1) semiautonomous neurofuzzy agents can be identified for uniped locomotion and gymnastic training based on attribute relevance analysis; (2) new agents can be discovered and agent cuboids can be dynamically constructed in an orthogonal MADWH, which resembles an evolving cerebrum/cerebellum system; and (3) dynamic motion laws can be discovered as association rules in first order logic. Although examples in legged robot gymnastics are used to illustrate the basic ideas, the new approach is generally suitable for a broad category of data mining tasks where knowledge can be discovered collectively by a set of agents from a geographically or geometrically distributed but relevant environment, especially in scientific and engineering data environments.
NASA Astrophysics Data System (ADS)
Macian-Sorribes, Hector; Pulido-Velazquez, Manuel
2013-04-01
Water resources systems are operated, mostly, using a set of pre-defined rules not regarding, usually, to an optimal allocation in terms of water use or economic benefits, but to historical and institutional reasons. These operating policies are reproduced, commonly, as hedging rules, pack rules or zone-based operations, and simulation models can be used to test their performance under a wide range of hydrological and/or socio-economic hypothesis. Despite the high degree of acceptation and testing that these models have achieved, the actual operation of water resources systems hardly follows all the time the pre-defined rules with the consequent uncertainty on the system performance. Real-world reservoir operation is very complex, affected by input uncertainty (imprecision in forecast inflow, seepage and evaporation losses, etc.), filtered by the reservoir operator's experience and natural risk-aversion, while considering the different physical and legal/institutional constraints in order to meet the different demands and system requirements. The aim of this work is to expose a fuzzy logic approach to derive and assess the historical operation of a system. This framework uses a fuzzy rule-based system to reproduce pre-defined rules and also to match as close as possible the actual decisions made by managers. After built up, the fuzzy rule-based system can be integrated in a water resources management model, making possible to assess the system performance at the basin scale. The case study of the Mijares basin (eastern Spain) is used to illustrate the method. A reservoir operating curve regulates the two main reservoir releases (operated in a conjunctive way) with the purpose of guaranteeing a high realiability of supply to the traditional irrigation districts with higher priority (more senior demands that funded the reservoir construction). A fuzzy rule-based system has been created to reproduce the operating curve's performance, defining the system state (total water stored in the reservoirs) and the month of the year as inputs; and the demand deliveries as outputs. The developed simulation management model integrates the fuzzy-ruled system of the operation of the two main reservoirs of the basin with the corresponding mass balance equations, the physical or boundary conditions and the water allocation rules among the competing demands. Historical information on inflow time series is used as inputs to the model simulation, being trained and validated using historical information on reservoir storage level and flow in several streams of the Mijares river. This methodology provides a more flexible and close to real policies approach. The model is easy to develop and to understand due to its rule-based structure, which mimics the human way of thinking. This can improve cooperation and negotiation between managers, decision-makers and stakeholders. The approach can be also applied to analyze the historical operation of the reservoir (what we have called a reservoir operation "audit").
Designing boosting ensemble of relational fuzzy systems.
Scherer, Rafał
2010-10-01
A method frequently used in classification systems for improving classification accuracy is to combine outputs of several classifiers. Among various types of classifiers, fuzzy ones are tempting because of using intelligible fuzzy if-then rules. In the paper we build an AdaBoost ensemble of relational neuro-fuzzy classifiers. Relational fuzzy systems bond input and output fuzzy linguistic values by a binary relation; thus, fuzzy rules have additional, comparing to traditional fuzzy systems, weights - elements of a fuzzy relation matrix. Thanks to this the system is better adjustable to data during learning. In the paper an ensemble of relational fuzzy systems is proposed. The problem is that such an ensemble contains separate rule bases which cannot be directly merged. As systems are separate, we cannot treat fuzzy rules coming from different systems as rules from the same (single) system. In the paper, the problem is addressed by a novel design of fuzzy systems constituting the ensemble, resulting in normalization of individual rule bases during learning. The method described in the paper is tested on several known benchmarks and compared with other machine learning solutions from the literature.
Compartmental and Spatial Rule-Based Modeling with Virtual Cell.
Blinov, Michael L; Schaff, James C; Vasilescu, Dan; Moraru, Ion I; Bloom, Judy E; Loew, Leslie M
2017-10-03
In rule-based modeling, molecular interactions are systematically specified in the form of reaction rules that serve as generators of reactions. This provides a way to account for all the potential molecular complexes and interactions among multivalent or multistate molecules. Recently, we introduced rule-based modeling into the Virtual Cell (VCell) modeling framework, permitting graphical specification of rules and merger of networks generated automatically (using the BioNetGen modeling engine) with hand-specified reaction networks. VCell provides a number of ordinary differential equation and stochastic numerical solvers for single-compartment simulations of the kinetic systems derived from these networks, and agent-based network-free simulation of the rules. In this work, compartmental and spatial modeling of rule-based models has been implemented within VCell. To enable rule-based deterministic and stochastic spatial simulations and network-free agent-based compartmental simulations, the BioNetGen and NFSim engines were each modified to support compartments. In the new rule-based formalism, every reactant and product pattern and every reaction rule are assigned locations. We also introduce the rule-based concept of molecular anchors. This assures that any species that has a molecule anchored to a predefined compartment will remain in this compartment. Importantly, in addition to formulation of compartmental models, this now permits VCell users to seamlessly connect reaction networks derived from rules to explicit geometries to automatically generate a system of reaction-diffusion equations. These may then be simulated using either the VCell partial differential equations deterministic solvers or the Smoldyn stochastic simulator. Copyright © 2017 Biophysical Society. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Nieten, Joseph; Burke, Roger
1993-01-01
Consideration is given to the System Diagnostic Builder (SDB), an automated knowledge acquisition tool using state-of-the-art AI technologies. The SDB employs an inductive machine learning technique to generate rules from data sets that are classified by a subject matter expert. Thus, data are captured from the subject system, classified, and used to drive the rule generation process. These rule bases are used to represent the observable behavior of the subject system, and to represent knowledge about this system. The knowledge bases captured from the Shuttle Mission Simulator can be used as black box simulations by the Intelligent Computer Aided Training devices. The SDB can also be used to construct knowledge bases for the process control industry, such as chemical production or oil and gas production.
Creating an ontology driven rules base for an expert system for medical diagnosis.
Bertaud Gounot, Valérie; Donfack, Valéry; Lasbleiz, Jérémy; Bourde, Annabel; Duvauferrier, Régis
2011-01-01
Expert systems of the 1980s have failed on the difficulties of maintaining large rule bases. The current work proposes a method to achieve and maintain rule bases grounded on ontologies (like NCIT). The process described here for an expert system on plasma cell disorder encompasses extraction of a sub-ontology and automatic and comprehensive generation of production rules. The creation of rules is not based directly on classes, but on individuals (instances). Instances can be considered as prototypes of diseases formally defined by "destrictions" in the ontology. Thus, it is possible to use this process to make diagnoses of diseases. The perspectives of this work are considered: the process described with an ontology formalized in OWL1 can be extended by using an ontology in OWL2 and allow reasoning about numerical data in addition to symbolic data.
Intrusion Detection Systems with Live Knowledge System
2016-05-31
Ripple -down Rule (RDR) to maintain the knowledge from human experts with knowledge base generated by the Induct RDR, which is a machine-learning based RDR...propose novel approach that uses Ripple -down Rule (RDR) to maintain the knowledge from human experts with knowledge base generated by the Induct RDR...detection model by applying Induct RDR approach. The proposed induct RDR ( Ripple Down Rules) approach allows to acquire the phishing detection
A Data Stream Model For Runoff Simulation In A Changing Environment
NASA Astrophysics Data System (ADS)
Yang, Q.; Shao, J.; Zhang, H.; Wang, G.
2017-12-01
Runoff simulation is of great significance for water engineering design, water disaster control, water resources planning and management in a catchment or region. A large number of methods including concept-based process-driven models and statistic-based data-driven models, have been proposed and widely used in worldwide during past decades. Most existing models assume that the relationship among runoff and its impacting factors is stationary. However, in the changing environment (e.g., climate change, human disturbance), their relationship usually evolves over time. In this study, we propose a data stream model for runoff simulation in a changing environment. Specifically, the proposed model works in three steps: learning a rule set, expansion of a rule, and simulation. The first step is to initialize a rule set. When a new observation arrives, the model will check which rule covers it and then use the rule for simulation. Meanwhile, Page-Hinckley (PH) change detection test is used to monitor the online simulation error of each rule. If a change is detected, the corresponding rule is removed from the rule set. In the second step, for each rule, if it covers more than a given number of instance, the rule is expected to expand. In the third step, a simulation model of each leaf node is learnt with a perceptron without activation function, and is updated with adding a newly incoming observation. Taking Fuxi River catchment as a case study, we applied the model to simulate the monthly runoff in the catchment. Results show that abrupt change is detected in the year of 1997 by using the Page-Hinckley change detection test method, which is consistent with the historic record of flooding. In addition, the model achieves good simulation results with the RMSE of 13.326, and outperforms many established methods. The findings demonstrated that the proposed data stream model provides a promising way to simulate runoff in a changing environment.
Presenting Data: Can You Follow a Recipe?
ERIC Educational Resources Information Center
Drummond, Gordon B.; Tom, Brian D. M.
2011-01-01
In this article, the authors address the practicalities of how data should be presented, summarized, and interpreted. There are no exact rules; indeed there are valid concerns that exact rules may be inappropriate and too prescriptive. New procedures evolve, and new methods may be needed to deal with new types of data, just as people know that new…
Monitoring Contract Enforcement within Virtual Organizations
NASA Astrophysics Data System (ADS)
Squicciarini, Anna; Paci, Federica
Virtual Organizations (VOs) represent a new collaboration paradigm in which the participating entities pool resources, services, and information to achieve a common goal. VOs are often created on demand and dynamically evolve over time. An organization identifies a business opportunity and creates a VO to meet it. In this paper we develop a system for monitoring the sharing of resources in VO. Sharing rules are defined by a particular, common type of contract in which virtual organization members agree to make available some amount of specified resource over a given time period. The main component of the system is a monitoring tool for policy enforcement, called Security Controller (SC). VO members’ interactions are monitored in a decentralized manner in that each member has one associated SC which intercepts all the exchanged messages. We show that having SCs in VOs prevents from serious security breaches and guarantees VOs correct functioning without degrading the execution time of members’ interactions. We base our discussion on application scenarios and illustrate the SC prototype, along with some performance evaluation.
Expert system shell to reason on large amounts of data
NASA Technical Reports Server (NTRS)
Giuffrida, Gionanni
1994-01-01
The current data base management systems (DBMS's) do not provide a sophisticated environment to develop rule based expert systems applications. Some of the new DBMS's come with some sort of rule mechanism; these are active and deductive database systems. However, both of these are not featured enough to support full implementation based on rules. On the other hand, current expert system shells do not provide any link with external databases. That is, all the data are kept in the system working memory. Such working memory is maintained in main memory. For some applications the reduced size of the available working memory could represent a constraint for the development. Typically these are applications which require reasoning on huge amounts of data. All these data do not fit into the computer main memory. Moreover, in some cases these data can be already available in some database systems and continuously updated while the expert system is running. This paper proposes an architecture which employs knowledge discovering techniques to reduce the amount of data to be stored in the main memory; in this architecture a standard DBMS is coupled with a rule-based language. The data are stored into the DBMS. An interface between the two systems is responsible for inducing knowledge from the set of relations. Such induced knowledge is then transferred to the rule-based language working memory.
Automated rule-base creation via CLIPS-Induce
NASA Technical Reports Server (NTRS)
Murphy, Patrick M.
1994-01-01
Many CLIPS rule-bases contain one or more rule groups that perform classification. In this paper we describe CLIPS-Induce, an automated system for the creation of a CLIPS classification rule-base from a set of test cases. CLIPS-Induce consists of two components, a decision tree induction component and a CLIPS production extraction component. ID3, a popular decision tree induction algorithm, is used to induce a decision tree from the test cases. CLIPS production extraction is accomplished through a top-down traversal of the decision tree. Nodes of the tree are used to construct query rules, and branches of the tree are used to construct classification rules. The learned CLIPS productions may easily be incorporated into a large CLIPS system that perform tasks such as accessing a database or displaying information.
Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur
2012-01-01
This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for determining efficiently and rapidly the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions.
Chen, Yen-Lin; Chiang, Hsin-Han; Yu, Chao-Wei; Chiang, Chuan-Yen; Liu, Chuan-Ming; Wang, Jenq-Haur
2012-01-01
This study develops and integrates an efficient knowledge-based system and a component-based framework to design an intelligent and flexible home health care system. The proposed knowledge-based system integrates an efficient rule-based reasoning model and flexible knowledge rules for determining efficiently and rapidly the necessary physiological and medication treatment procedures based on software modules, video camera sensors, communication devices, and physiological sensor information. This knowledge-based system offers high flexibility for improving and extending the system further to meet the monitoring demands of new patient and caregiver health care by updating the knowledge rules in the inference mechanism. All of the proposed functional components in this study are reusable, configurable, and extensible for system developers. Based on the experimental results, the proposed intelligent homecare system demonstrates that it can accomplish the extensible, customizable, and configurable demands of the ubiquitous healthcare systems to meet the different demands of patients and caregivers under various rehabilitation and nursing conditions. PMID:23112650
NASA Astrophysics Data System (ADS)
Driandanu, Galih; Surarso, Bayu; Suryono
2018-02-01
A radio frequency identification (RFID) has obtained increasing attention with the emergence of various applications. This study aims to examine the implementation of rule based expert system supported by RFID technology into a monitoring information system of drug supply in a hospital. This research facilitates in monitoring the real time drug supply by using data sample from the hospital pharmacy. This system able to identify and count the number of drug and provide warning and report in real time. the conclusion is the rule based expert system and RFID technology can facilitate the performance in monitoring the drug supply quickly and precisely.
Fuzzylot: a novel self-organising fuzzy-neural rule-based pilot system for automated vehicles.
Pasquier, M; Quek, C; Toh, M
2001-10-01
This paper presents part of our research work concerned with the realisation of an Intelligent Vehicle and the technologies required for its routing, navigation, and control. An automated driver prototype has been developed using a self-organising fuzzy rule-based system (POPFNN-CRI(S)) to model and subsequently emulate human driving expertise. The ability of fuzzy logic to represent vague information using linguistic variables makes it a powerful tool to develop rule-based control systems when an exact working model is not available, as is the case of any vehicle-driving task. Designing a fuzzy system, however, is a complex endeavour, due to the need to define the variables and their associated fuzzy sets, and determine a suitable rule base. Many efforts have thus been devoted to automating this process, yielding the development of learning and optimisation techniques. One of them is the family of POP-FNNs, or Pseudo-Outer Product Fuzzy Neural Networks (TVR, AARS(S), AARS(NS), CRI, Yager). These generic self-organising neural networks developed at the Intelligent Systems Laboratory (ISL/NTU) are based on formal fuzzy mathematical theory and are able to objectively extract a fuzzy rule base from training data. In this application, a driving simulator has been developed, that integrates a detailed model of the car dynamics, complete with engine characteristics and environmental parameters, and an OpenGL-based 3D-simulation interface coupled with driving wheel and accelerator/ brake pedals. The simulator has been used on various road scenarios to record from a human pilot driving data consisting of steering and speed control actions associated to road features. Specifically, the POPFNN-CRI(S) system is used to cluster the data and extract a fuzzy rule base modelling the human driving behaviour. Finally, the effectiveness of the generated rule base has been validated using the simulator in autopilot mode.
Influences on infant speech processing: toward a new synthesis.
Werker, J F; Tees, R C
1999-01-01
To comprehend and produce language, we must be able to recognize the sound patterns of our language and the rules for how these sounds "map on" to meaning. Human infants are born with a remarkable array of perceptual sensitivities that allow them to detect the basic properties that are common to the world's languages. During the first year of life, these sensitivities undergo modification reflecting an exquisite tuning to just that phonological information that is needed to map sound to meaning in the native language. We review this transition from language-general to language-specific perceptual sensitivity that occurs during the first year of life and consider whether the changes propel the child into word learning. To account for the broad-based initial sensitivities and subsequent reorganizations, we offer an integrated transactional framework based on the notion of a specialized perceptual-motor system that has evolved to serve human speech, but which functions in concert with other developing abilities. In so doing, we highlight the links between infant speech perception, babbling, and word learning.
Design of impact-resistant boron/aluminum large fan blade
NASA Technical Reports Server (NTRS)
Salemme, C. T.; Yokel, S. A.
1978-01-01
The technical program was comprised of two technical tasks. Task 1 encompassed the preliminary boron/aluminum fan blade design effort. Two preliminary designs were evolved. An initial design consisted of 32 blades per stage and was based on material properties extracted from manufactured blades. A final design of 36 blades per stage was based on rule-of-mixture material properties. In Task 2, the selected preliminary blade design was refined via more sophisticated analytical tools. Detailed finite element stress analysis and aero performance analysis were carried out to determine blade material frequencies and directional stresses.
Chance vs. necessity in living systems: a false antinomy.
Buiatti, Marcello; Buiatti, Marco
2008-01-01
The concepts of order and randomness are crucial to understand 'living systems' structural and dynamical rules. In the history of biology, they lay behind the everlasting debate on the relative roles of chance and determinism in evolution. Jacques Monod [1970] built a theory where chance (randomness) and determinism (order) were considered as two complementary aspects of life. In the present paper, we will give an up to date version of the problem going beyond the dichotomy between chance and determinism. To this end, we will first see how the view on living systems has evolved from the mechanistic one of the 19th century to the one stemming from the most recent literature, where they emerge as complex systems continuously evolving through multiple interactions among their components and with the surrounding environment. We will then report on the ever increasing evidence of "friendly" co-existence in living beings between a number of "variability generators", fixed by evolution, and the "spontaneous order" derived from interactions between components. We will propose that the "disorder" generated is "benevolent" because it allows living systems to rapidly adapt to changes in the environment by continuously changing, while keeping their internal harmony.
NASA Astrophysics Data System (ADS)
Souza, Pricilla Camões Martins de; Schmitt, Renata da Silva; Stanton, Natasha
2017-09-01
The Ararauama Lagoon Fault System composes one of the most prominent set of lineaments of the SE Brazilian continental margin. It is located onshore in a key tectonic domain, where the basement inheritance rule is not followed. This fault system is characterized by ENE-WSW silicified tectonic breccias and cataclasites showing evidences of recurrent tectonic reactivations. Based on field work, microtectonic, kinematic and dynamic analysis, we reconstructed the paleostresses in the region and propose a sequence of three brittle deformational phases accountable for these reactivations: 1) NE-SW dextral transcurrence; 2) NNW-SSE dextral oblique extension that evolved to NNW-SSE "pure" extension; 3) ENE-WSW dextral oblique extension. These phases are reasonably correlated with the tectonic events responsible for the onset and evolution of the SE onshore rift basins, between the Neocretaceous and Holocene. However, based on petrographic studies and supported by regional geological correlations, we assume that the origin of this fault system is older, related to the Early Cretaceous South Atlantic rifting. This study provides significant information about one of the main structural trends of the SE Brazilian continental margin and the tectonic events that controlled its segmentation, since the Gondwana rifting, and compartmentalization of its onshore sedimentary deposits during the Cenozoic.
Evaluating the Effectiveness of Auditing Rules for Electronic Health Record Systems
Hedda, Monica; Malin, Bradley A.; Yan, Chao; Fabbri, Daniel
2017-01-01
Healthcare organizations (HCOs) often deploy rule-based auditing systems to detect insider threats to sensitive patient health information in electronic health record (EHR) systems. These rule-based systems define behavior deemed to be high-risk a priori (e.g., family member, co-worker access). While such rules seem logical, there has been little scientific investigation into the effectiveness of these auditing rules in identifying inappropriate behavior. Thus, in this paper, we introduce an approach to evaluate the effectiveness of individual high-risk rules and rank them according to their potential risk. We investigate the rate of high-risk access patterns and minimum rate of high-risk accesses that can be explained with appropriate clinical reasons in a large EHR system. An analysis of 8M accesses from one-week of data shows that specific high-risk flags occur more frequently than theoretically expected and the rate at which accesses can be explained away with five simple reasons is 16 - 43%. PMID:29854153
Evaluating the Effectiveness of Auditing Rules for Electronic Health Record Systems.
Hedda, Monica; Malin, Bradley A; Yan, Chao; Fabbri, Daniel
2017-01-01
Healthcare organizations (HCOs) often deploy rule-based auditing systems to detect insider threats to sensitive patient health information in electronic health record (EHR) systems. These rule-based systems define behavior deemed to be high-risk a priori (e.g., family member, co-worker access). While such rules seem logical, there has been little scientific investigation into the effectiveness of these auditing rules in identifying inappropriate behavior. Thus, in this paper, we introduce an approach to evaluate the effectiveness of individual high-risk rules and rank them according to their potential risk. We investigate the rate of high-risk access patterns and minimum rate of high-risk accesses that can be explained with appropriate clinical reasons in a large EHR system. An analysis of 8M accesses from one-week of data shows that specific high-risk flags occur more frequently than theoretically expected and the rate at which accesses can be explained away with five simple reasons is 16 - 43%.
RB-ARD: A proof of concept rule-based abort
NASA Technical Reports Server (NTRS)
Smith, Richard; Marinuzzi, John
1987-01-01
The Abort Region Determinator (ARD) is a console program in the space shuttle mission control center. During shuttle ascent, the Flight Dynamics Officer (FDO) uses the ARD to determine the possible abort modes and make abort calls for the crew. The goal of the Rule-based Abort region Determinator (RB/ARD) project was to test the concept of providing an onboard ARD for the shuttle or an automated ARD for the mission control center (MCC). A proof of concept rule-based system was developed on a LMI Lambda computer using PICON, a knowdedge-based system shell. Knowdedge derived from documented flight rules and ARD operation procedures was coded in PICON rules. These rules, in conjunction with modules of conventional code, enable the RB-ARD to carry out key parts of the ARD task. Current capabilities of the RB-ARD include: continuous updating of the available abort mode, recognition of a limited number of main engine faults and recommendation of safing actions. Safing actions recommended by the RB-ARD concern the Space Shuttle Main Engine (SSME) limit shutdown system and powerdown of the SSME Ac buses.
Evolution and inheritance of early embryonic patterning in Drosophila simulans and D. sechellia.
Lott, Susan E; Ludwig, Michael Z; Kreitman, Martin
2011-05-01
Pattern formation in Drosophila is a widely studied example of a robust developmental system. Such robust systems pose a challenge to adaptive evolution, as they mask variation that selection may otherwise act upon. Yet we find variation in the localization of expression domains (henceforth "stripe allometry") in the pattern formation pathway. Specifically, we characterize differences in the gap genes giant and Kruppel, and the pair-rule gene even-skipped, which differ between the sibling species Drosophila simulans and D. sechellia. In a double-backcross experiment, stripe allometry is consistent with maternal inheritance of stripe positioning and multiple genetic factors, with a distinct genetic basis from embryo length. Embryos produced by F1 and F2 backcross mothers exhibit novel spatial patterns of gene expression relative to the parental species, with no measurable increase in positional variance among individuals. Buffering of novel spatial patterns in the backcross genotypes suggests that robustness need not be disrupted in order for the trait to evolve, and perhaps the system is incapable of evolving to prevent the expression of all genetic variation. This limitation, and the ability of natural selection to act on minute genetic differences that are within the "margin of error" for the buffering mechanism, indicates that developmentally buffered traits can evolve without disruption of robustness. © 2010 The Author(s). Evolution© 2010 The Society for the Study of Evolution.
Expert systems for diagnostic purposes, prospected applications to the radar field
NASA Astrophysics Data System (ADS)
Filippi, Riccardo
Expert systems applied to fault diagnosis, particularly electrical circuit troubleshooting, are introduced. Diagnostic systems consisting of sequences of rules of the symptom-disease type (rule based system) and systems based upon a physical and functional description of the unit subjected to fault diagnosis are treated. Application of such systems to radar equipment troubleshooting, in particular to the transmitter, is discussed.
Chylek, Lily A.; Harris, Leonard A.; Tung, Chang-Shung; Faeder, James R.; Lopez, Carlos F.
2013-01-01
Rule-based modeling was developed to address the limitations of traditional approaches for modeling chemical kinetics in cell signaling systems. These systems consist of multiple interacting biomolecules (e.g., proteins), which themselves consist of multiple parts (e.g., domains, linear motifs, and sites of phosphorylation). Consequently, biomolecules that mediate information processing generally have the potential to interact in multiple ways, with the number of possible complexes and post-translational modification states tending to grow exponentially with the number of binary interactions considered. As a result, only large reaction networks capture all possible consequences of the molecular interactions that occur in a cell signaling system, which is problematic because traditional modeling approaches for chemical kinetics (e.g., ordinary differential equations) require explicit network specification. This problem is circumvented through representation of interactions in terms of local rules. With this approach, network specification is implicit and model specification is concise. Concise representation results in a coarse graining of chemical kinetics, which is introduced because all reactions implied by a rule inherit the rate law associated with that rule. Coarse graining can be appropriate if interactions are modular, and the coarseness of a model can be adjusted as needed. Rules can be specified using specialized model-specification languages, and recently developed tools designed for specification of rule-based models allow one to leverage powerful software engineering capabilities. A rule-based model comprises a set of rules, which can be processed by general-purpose simulation and analysis tools to achieve different objectives (e.g., to perform either a deterministic or stochastic simulation). PMID:24123887
The evolution of social learning rules: payoff-biased and frequency-dependent biased transmission.
Kendal, Jeremy; Giraldeau, Luc-Alain; Laland, Kevin
2009-09-21
Humans and other animals do not use social learning indiscriminately, rather, natural selection has favoured the evolution of social learning rules that make selective use of social learning to acquire relevant information in a changing environment. We present a gene-culture coevolutionary analysis of a small selection of such rules (unbiased social learning, payoff-biased social learning and frequency-dependent biased social learning, including conformism and anti-conformism) in a population of asocial learners where the environment is subject to a constant probability of change to a novel state. We define conditions under which each rule evolves to a genetically polymorphic equilibrium. We find that payoff-biased social learning may evolve under high levels of environmental variation if the fitness benefit associated with the acquired behaviour is either high or low but not of intermediate value. In contrast, both conformist and anti-conformist biases can become fixed when environment variation is low, whereupon the mean fitness in the population is higher than for a population of asocial learners. Our examination of the population dynamics reveals stable limit cycles under conformist and anti-conformist biases and some highly complex dynamics including chaos. Anti-conformists can out-compete conformists when conditions favour a low equilibrium frequency of the learned behaviour. We conclude that evolution, punctuated by the repeated successful invasion of different social learning rules, should continuously favour a reduction in the equilibrium frequency of asocial learning, and propose that, among competing social learning rules, the dominant rule will be the one that can persist with the lowest frequency of asocial learning.
A Meta-Analysis of Institutional Theories
1989-06-01
GPOUP SUBGROUP Institutional Theory , Isomorphism, Administrative Difterpntiation, Diffusion of Change, Rational, Unit Of Analysis 19 ABSTRACT (Continue on... institutional theory may lead to better decision making and evaluation criteria on the part of managers in the non-profit sector. C. SCOPE This paper... institutional theory : I) Organizations evolving in environments with elabora- ted institutional rules create structure that conform to those rules. 2
An algorithm for automated layout of process description maps drawn in SBGN.
Genc, Begum; Dogrusoz, Ugur
2016-01-01
Evolving technology has increased the focus on genomics. The combination of today's advanced techniques with decades of molecular biology research has yielded huge amounts of pathway data. A standard, named the Systems Biology Graphical Notation (SBGN), was recently introduced to allow scientists to represent biological pathways in an unambiguous, easy-to-understand and efficient manner. Although there are a number of automated layout algorithms for various types of biological networks, currently none specialize on process description (PD) maps as defined by SBGN. We propose a new automated layout algorithm for PD maps drawn in SBGN. Our algorithm is based on a force-directed automated layout algorithm called Compound Spring Embedder (CoSE). On top of the existing force scheme, additional heuristics employing new types of forces and movement rules are defined to address SBGN-specific rules. Our algorithm is the only automatic layout algorithm that properly addresses all SBGN rules for drawing PD maps, including placement of substrates and products of process nodes on opposite sides, compact tiling of members of molecular complexes and extensively making use of nested structures (compound nodes) to properly draw cellular locations and molecular complex structures. As demonstrated experimentally, the algorithm results in significant improvements over use of a generic layout algorithm such as CoSE in addressing SBGN rules on top of commonly accepted graph drawing criteria. An implementation of our algorithm in Java is available within ChiLay library (https://github.com/iVis-at-Bilkent/chilay). ugur@cs.bilkent.edu.tr or dogrusoz@cbio.mskcc.org Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
An algorithm for automated layout of process description maps drawn in SBGN
Genc, Begum; Dogrusoz, Ugur
2016-01-01
Motivation: Evolving technology has increased the focus on genomics. The combination of today’s advanced techniques with decades of molecular biology research has yielded huge amounts of pathway data. A standard, named the Systems Biology Graphical Notation (SBGN), was recently introduced to allow scientists to represent biological pathways in an unambiguous, easy-to-understand and efficient manner. Although there are a number of automated layout algorithms for various types of biological networks, currently none specialize on process description (PD) maps as defined by SBGN. Results: We propose a new automated layout algorithm for PD maps drawn in SBGN. Our algorithm is based on a force-directed automated layout algorithm called Compound Spring Embedder (CoSE). On top of the existing force scheme, additional heuristics employing new types of forces and movement rules are defined to address SBGN-specific rules. Our algorithm is the only automatic layout algorithm that properly addresses all SBGN rules for drawing PD maps, including placement of substrates and products of process nodes on opposite sides, compact tiling of members of molecular complexes and extensively making use of nested structures (compound nodes) to properly draw cellular locations and molecular complex structures. As demonstrated experimentally, the algorithm results in significant improvements over use of a generic layout algorithm such as CoSE in addressing SBGN rules on top of commonly accepted graph drawing criteria. Availability and implementation: An implementation of our algorithm in Java is available within ChiLay library (https://github.com/iVis-at-Bilkent/chilay). Contact: ugur@cs.bilkent.edu.tr or dogrusoz@cbio.mskcc.org Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26363029
Knowledge-based reasoning in the Paladin tactical decision generation system
NASA Technical Reports Server (NTRS)
Chappell, Alan R.
1993-01-01
A real-time tactical decision generation system for air combat engagements, Paladin, has been developed. A pilot's job in air combat includes tasks that are largely symbolic. These symbolic tasks are generally performed through the application of experience and training (i.e. knowledge) gathered over years of flying a fighter aircraft. Two such tasks, situation assessment and throttle control, are identified and broken out in Paladin to be handled by specialized knowledge based systems. Knowledge pertaining to these tasks is encoded into rule-bases to provide the foundation for decisions. Paladin uses a custom built inference engine and a partitioned rule-base structure to give these symbolic results in real-time. This paper provides an overview of knowledge-based reasoning systems as a subset of rule-based systems. The knowledge used by Paladin in generating results as well as the system design for real-time execution is discussed.
On the fusion of tuning parameters of fuzzy rules and neural network
NASA Astrophysics Data System (ADS)
Mamuda, Mamman; Sathasivam, Saratha
2017-08-01
Learning fuzzy rule-based system with neural network can lead to a precise valuable empathy of several problems. Fuzzy logic offers a simple way to reach at a definite conclusion based upon its vague, ambiguous, imprecise, noisy or missing input information. Conventional learning algorithm for tuning parameters of fuzzy rules using training input-output data usually end in a weak firing state, this certainly powers the fuzzy rule and makes it insecure for a multiple-input fuzzy system. In this paper, we introduce a new learning algorithm for tuning the parameters of the fuzzy rules alongside with radial basis function neural network (RBFNN) in training input-output data based on the gradient descent method. By the new learning algorithm, the problem of weak firing using the conventional method was addressed. We illustrated the efficiency of our new learning algorithm by means of numerical examples. MATLAB R2014(a) software was used in simulating our result The result shows that the new learning method has the best advantage of training the fuzzy rules without tempering with the fuzzy rule table which allowed a membership function of the rule to be used more than one time in the fuzzy rule base.
Analysis of Rules for Islamic Inheritance Law in Indonesia Using Hybrid Rule Based Learning
NASA Astrophysics Data System (ADS)
Khosyi'ah, S.; Irfan, M.; Maylawati, D. S.; Mukhlas, O. S.
2018-01-01
Along with the development of human civilization in Indonesia, the changes and reform of Islamic inheritance law so as to conform to the conditions and culture cannot be denied. The distribution of inheritance in Indonesia can be done automatically by storing the rule of Islamic inheritance law in the expert system. In this study, we analyze the knowledge of experts in Islamic inheritance in Indonesia and represent it in the form of rules using rule-based Forward Chaining (FC) and Davis-Putman-Logemann-Loveland (DPLL) algorithms. By hybridizing FC and DPLL algorithms, the rules of Islamic inheritance law in Indonesia are clearly defined and measured. The rules were conceptually validated by some experts in Islamic laws and informatics. The results revealed that generally all rules were ready for use in an expert system.
Dehghani Soufi, Mahsa; Samad-Soltani, Taha; Shams Vahdati, Samad; Rezaei-Hachesu, Peyman
2018-06-01
Fast and accurate patient triage for the response process is a critical first step in emergency situations. This process is often performed using a paper-based mode, which intensifies workload and difficulty, wastes time, and is at risk of human errors. This study aims to design and evaluate a decision support system (DSS) to determine the triage level. A combination of the Rule-Based Reasoning (RBR) and Fuzzy Logic Classifier (FLC) approaches were used to predict the triage level of patients according to the triage specialist's opinions and Emergency Severity Index (ESI) guidelines. RBR was applied for modeling the first to fourth decision points of the ESI algorithm. The data relating to vital signs were used as input variables and modeled using fuzzy logic. Narrative knowledge was converted to If-Then rules using XML. The extracted rules were then used to create the rule-based engine and predict the triage levels. Fourteen RBR and 27 fuzzy rules were extracted and used in the rule-based engine. The performance of the system was evaluated using three methods with real triage data. The accuracy of the clinical decision support systems (CDSSs; in the test data) was 99.44%. The evaluation of the error rate revealed that, when using the traditional method, 13.4% of the patients were miss-triaged, which is statically significant. The completeness of the documentation also improved from 76.72% to 98.5%. Designed system was effective in determining the triage level of patients and it proved helpful for nurses as they made decisions, generated nursing diagnoses based on triage guidelines. The hybrid approach can reduce triage misdiagnosis in a highly accurate manner and improve the triage outcomes. Copyright © 2018 Elsevier B.V. All rights reserved.
A rule-based expert system for chemical prioritization using effects-based chemical categories
A rule-based expert system (ES) was developed to predict chemical binding to the estrogen receptor (ER) patterned on the research approaches championed by Gilman Veith to whom this article and journal issue are dedicated. The ERES was built to be mechanistically-transparent and m...
Privacy as an enabler, not an impediment: building trust into health information exchange.
McGraw, Deven; Dempsey, James X; Harris, Leslie; Goldman, Janlori
2009-01-01
Building privacy and security protections into health information technology systems will bolster trust in such systems and promote their adoption. The privacy issue, too long seen as a barrier to electronic health information exchange, can be resolved through a comprehensive framework that implements core privacy principles, adopts trusted network design characteristics, and establishes oversight and accountability mechanisms. The public policy challenges of implementing this framework in a complex and evolving environment will require improvements to existing law, new rules for entities outside the traditional health care sector, a more nuanced approach to the role of consent, and stronger enforcement mechanisms.
CRISPR-based herd immunity can limit phage epidemics in bacterial populations
Geyrhofer, Lukas; Barton, Nicholas H
2018-01-01
Herd immunity, a process in which resistant individuals limit the spread of a pathogen among susceptible hosts has been extensively studied in eukaryotes. Even though bacteria have evolved multiple immune systems against their phage pathogens, herd immunity in bacteria remains unexplored. Here we experimentally demonstrate that herd immunity arises during phage epidemics in structured and unstructured Escherichia coli populations consisting of differing frequencies of susceptible and resistant cells harboring CRISPR immunity. In addition, we develop a mathematical model that quantifies how herd immunity is affected by spatial population structure, bacterial growth rate, and phage replication rate. Using our model we infer a general epidemiological rule describing the relative speed of an epidemic in partially resistant spatially structured populations. Our experimental and theoretical findings indicate that herd immunity may be important in bacterial communities, allowing for stable coexistence of bacteria and their phages and the maintenance of polymorphism in bacterial immunity. PMID:29521625
NASA Technical Reports Server (NTRS)
Ely, Jay J.
2005-01-01
Electromagnetic interference (EMI) promises to be an ever-evolving concern for flight electronic systems. This paper introduces EMI and identifies its impact upon civil aviation radio systems. New wireless services, like mobile phones, text messaging, email, web browsing, radio frequency identification (RFID), and mobile audio/video services are now being introduced into passenger airplanes. FCC and FAA rules governing the use of mobile phones and other portable electronic devices (PEDs) on board airplanes are presented along with a perspective of how these rules are now being rewritten to better facilitate in-flight wireless services. This paper provides a comprehensive overview of NASA cooperative research with the FAA, RTCA, airlines and universities to obtain laboratory radiated emission data for numerous PED types, aircraft radio frequency (RF) coupling measurements, estimated aircraft radio interference thresholds, and direct-effects EMI testing. These elements are combined together to provide high-confidence answers regarding the EMI potential of new wireless products being used on passenger airplanes. This paper presents a vision for harmonizing new wireless services with aeronautical radio services by detecting, assessing, controlling and mitigating the effects of EMI.
C-Language Integrated Production System, Version 5.1
NASA Technical Reports Server (NTRS)
Riley, Gary; Donnell, Brian; Ly, Huyen-Anh VU; Culbert, Chris; Savely, Robert T.; Mccoy, Daniel J.; Giarratano, Joseph
1992-01-01
CLIPS 5.1 provides cohesive software tool for handling wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming provides representation of knowledge by use of heuristics. Object-oriented programming enables modeling of complex systems as modular components. Procedural programming enables CLIPS to represent knowledge in ways similar to those allowed in such languages as C, Pascal, Ada, and LISP. Working with CLIPS 5.1, one can develop expert-system software by use of rule-based programming only, object-oriented programming only, procedural programming only, or combinations of the three.
A rule-based smart automated fertilization and irrigation systems
NASA Astrophysics Data System (ADS)
Yousif, Musab El-Rashid; Ghafar, Khairuddin; Zahari, Rahimi; Lim, Tiong Hoo
2018-04-01
Smart automation in industries has become very important as it can improve the reliability and efficiency of the systems. The use of smart technologies in agriculture have increased over the year to ensure and control the production of crop and address food security. However, it is important to use proper irrigation systems avoid water wastage and overfeeding of the plant. In this paper, a Smart Rule-based Automated Fertilization and Irrigation System is proposed and evaluated. We propose a rule based decision making algorithm to monitor and control the food supply to the plant and the soil quality. A build-in alert system is also used to update the farmer using a text message. The system is developed and evaluated using a real hardware.
Miller, P L; Frawley, S J; Sayward, F G; Yasnoff, W A; Duncan, L; Fleming, D W
1997-06-01
IMM/Serve is a computer program which implements the clinical guidelines for childhood immunization. IMM/Serve accepts as input a child's immunization history. It then indicates which vaccinations are due and which vaccinations should be scheduled next. The clinical guidelines for immunization are quite complex and are modified quite frequently. As a result, it is important that IMM/Serve's knowledge be represented in a format that facilitates the maintenance of that knowledge as the field evolves over time. To achieve this goal, IMM/Serve uses four representations for different parts of its knowledge base: (1) Immunization forecasting parameters that specify the minimum ages and wait-intervals for each dose are stored in tabular form. (2) The clinical logic that determines which set of forecasting parameters applies for a particular patient in each vaccine series is represented using if-then rules. (3) The temporal logic that combines dates, ages, and intervals to calculate recommended dates, is expressed procedurally. (4) The screening logic that checks each previous dose for validity is performed using a decision table that combines minimum ages and wait intervals with a small amount of clinical logic. A knowledge maintenance tool, IMM/Def, has been developed to help maintain the rule-based logic. The paper describes the design of IMM/Serve and the rationale and role of the different forms of knowledge used.
TRICARE revision to CHAMPUS DRG-based payment system, pricing of hospital claims. Final rule.
2014-05-21
This Final rule changes TRICARE's current regulatory provision for inpatient hospital claims priced under the DRG-based payment system. Claims are currently priced by using the rates and weights that are in effect on a beneficiary's date of admission. This Final rule changes that provision to price such claims by using the rates and weights that are in effect on a beneficiary's date of discharge.
Phenomenology-Based Inverse Scattering for Sensor Information Fusion
2006-09-15
abilities in the past. Rule -based systems and mathematics of logic implied significant similarities between the two: Thoughts, words, and phrases...all are logical statements. The situation has changed, in part due to the fact that logic- rule systems have not been sufficiently powerful to explain...references]. 3 Language mechanisms of our mind include abilities to acquire a large vocabulary, rules of grammar, and to use the finite set of
Baek, You Soon; Covey, Paul A; Petersen, Jennifer J; Chetelat, Roger T; McClure, Bruce; Bedinger, Patricia A
2015-02-01
Interspecific reproductive barriers (IRBs) act to ensure species integrity by preventing hybridization. Previous studies on interspecific crosses in the tomato clade have focused on the success of fruit and seed set. The SI × SC rule (SI species × SC species crosses are incompatible, but the reciprocal crosses are compatible) often applies to interspecific crosses. Because SI systems in the Solanaceae affect pollen tube growth, we focused on this process in a comprehensive study of interspecific crosses in the tomato clade to test whether the SI × SC rule was always followed. Pollen tube growth was assessed in reciprocal crosses between all 13 species of the tomato clade using fluorescence microscopy. In crosses between SC and SI species, pollen tube growth follows the SI × SC rule: interspecific pollen tube rejection occurs when SI species are pollinated by SC species, but in the reciprocal crosses (SC × SI), pollen tubes reach ovaries. However, pollen tube rejection occurred in some crosses between pairs of SC species, demonstrating that a fully functional SI system is not necessary for pollen tube rejection in interspecific crosses. Further, gradations in the strength of both pistil and pollen IRBs were revealed in interspecific crosses using SC populations of generally SI species. The SI × SC rule explains many of the compatibility relations in the tomato clade, but exceptions occur with more recently evolved SC species and accessions, revealing differences in strength of both pistil and pollen IRBs. © 2015 Botanical Society of America, Inc.
Personalization of Rule-based Web Services.
Choi, Okkyung; Han, Sang Yong
2008-04-04
Nowadays Web users have clearly expressed their wishes to receive personalized services directly. Personalization is the way to tailor services directly to the immediate requirements of the user. However, the current Web Services System does not provide any features supporting this such as consideration of personalization of services and intelligent matchmaking. In this research a flexible, personalized Rule-based Web Services System to address these problems and to enable efficient search, discovery and construction across general Web documents and Semantic Web documents in a Web Services System is proposed. This system utilizes matchmaking among service requesters', service providers' and users' preferences using a Rule-based Search Method, and subsequently ranks search results. A prototype of efficient Web Services search and construction for the suggested system is developed based on the current work.
NASA Astrophysics Data System (ADS)
Milic, Vladimir; Kasac, Josip; Novakovic, Branko
2015-10-01
This paper is concerned with ?-gain optimisation of input-affine nonlinear systems controlled by analytic fuzzy logic system. Unlike the conventional fuzzy-based strategies, the non-conventional analytic fuzzy control method does not require an explicit fuzzy rule base. As the first contribution of this paper, we prove, by using the Stone-Weierstrass theorem, that the proposed fuzzy system without rule base is universal approximator. The second contribution of this paper is an algorithm for solving a finite-horizon minimax problem for ?-gain optimisation. The proposed algorithm consists of recursive chain rule for first- and second-order derivatives, Newton's method, multi-step Adams method and automatic differentiation. Finally, the results of this paper are evaluated on a second-order nonlinear system.
Life insurance risk assessment using a fuzzy logic expert system
NASA Technical Reports Server (NTRS)
Carreno, Luis A.; Steel, Roy A.
1992-01-01
In this paper, we present a knowledge based system that combines fuzzy processing with rule-based processing to form an improved decision aid for evaluating risk for life insurance. This application illustrates the use of FuzzyCLIPS to build a knowledge based decision support system possessing fuzzy components to improve user interactions and KBS performance. The results employing FuzzyCLIPS are compared with the results obtained from the solution of the problem using traditional numerical equations. The design of the fuzzy solution consists of a CLIPS rule-based system for some factors combined with fuzzy logic rules for others. This paper describes the problem, proposes a solution, presents the results, and provides a sample output of the software product.
Translating expert system rules into Ada code with validation and verification
NASA Technical Reports Server (NTRS)
Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam
1991-01-01
The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system was demonstrated which detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module are discussed. This method is based upon the use of Evidence Flow Graphs which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint based description of the required performance for the system.
Office-based anesthesia: new frontiers, better outcomes, and emphasis on safety.
Desai, Meena S
2008-12-01
Office-based anesthesia has grown and continues to grow very rapidly in the ever-changing medical environment. The demand of patients, surgeons and the evolving economic environment has set off a dynamic growth explosion. This explosion has created aggressive and tumultuous enhancements, some of which have been adapted well and some of which have led to disastrous results. As we institute rules and regulations to govern this 'wild west' of anesthesia, the landscape is set with some new guidelines that continue to evolve.Practice recommendations have been outlined for fire safety especially on patient fires. Closed claim studies offer valuable recommendations for MAC claims in the office based setting. Anesthesia Patient Safety Foundation and the ASA have outlined valuable information regarding the nonsilencing of equipment alarms.New equipment enhancements have generated successful mobile general anesthesia platforms. Finally, as we forge ahead we must construct measurements of our safety and success as outcome parameters are developed. The review of recent literature and technological advances has provided some valuable lessons in the evolution of patient safety and office based technology for the surgical office-based environment. As this specialty grows, measures of its outcome parameters will allow a gauge of performance.
Debugging expert systems using a dynamically created hypertext network
NASA Technical Reports Server (NTRS)
Boyle, Craig D. B.; Schuette, John F.
1991-01-01
The labor intensive nature of expert system writing and debugging motivated this study. The hypothesis is that a hypertext based debugging tool is easier and faster than one traditional tool, the graphical execution trace. HESDE (Hypertext Expert System Debugging Environment) uses Hypertext nodes and links to represent the objects and their relationships created during the execution of a rule based expert system. HESDE operates transparently on top of the CLIPS (C Language Integrated Production System) rule based system environment and is used during the knowledge base debugging process. During the execution process HESDE builds an execution trace. Use of facts, rules, and their values are automatically stored in a Hypertext network for each execution cycle. After the execution process, the knowledge engineer may access the Hypertext network and browse the network created. The network may be viewed in terms of rules, facts, and values. An experiment was conducted to compare HESDE with a graphical debugging environment. Subjects were given representative tasks. For speed and accuracy, in eight of the eleven tasks given to subjects, HESDE was significantly better.
Exact Hybrid Particle/Population Simulation of Rule-Based Models of Biochemical Systems
Stover, Lori J.; Nair, Niketh S.; Faeder, James R.
2014-01-01
Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This “network-free” approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of “partial network expansion” into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach and a monetary cost analysis provides a practical measure of its utility. PMID:24699269
Exact hybrid particle/population simulation of rule-based models of biochemical systems.
Hogg, Justin S; Harris, Leonard A; Stover, Lori J; Nair, Niketh S; Faeder, James R
2014-04-01
Detailed modeling and simulation of biochemical systems is complicated by the problem of combinatorial complexity, an explosion in the number of species and reactions due to myriad protein-protein interactions and post-translational modifications. Rule-based modeling overcomes this problem by representing molecules as structured objects and encoding their interactions as pattern-based rules. This greatly simplifies the process of model specification, avoiding the tedious and error prone task of manually enumerating all species and reactions that can potentially exist in a system. From a simulation perspective, rule-based models can be expanded algorithmically into fully-enumerated reaction networks and simulated using a variety of network-based simulation methods, such as ordinary differential equations or Gillespie's algorithm, provided that the network is not exceedingly large. Alternatively, rule-based models can be simulated directly using particle-based kinetic Monte Carlo methods. This "network-free" approach produces exact stochastic trajectories with a computational cost that is independent of network size. However, memory and run time costs increase with the number of particles, limiting the size of system that can be feasibly simulated. Here, we present a hybrid particle/population simulation method that combines the best attributes of both the network-based and network-free approaches. The method takes as input a rule-based model and a user-specified subset of species to treat as population variables rather than as particles. The model is then transformed by a process of "partial network expansion" into a dynamically equivalent form that can be simulated using a population-adapted network-free simulator. The transformation method has been implemented within the open-source rule-based modeling platform BioNetGen, and resulting hybrid models can be simulated using the particle-based simulator NFsim. Performance tests show that significant memory savings can be achieved using the new approach and a monetary cost analysis provides a practical measure of its utility.
Monitoring Agents for Assisting NASA Engineers with Shuttle Ground Processing
NASA Technical Reports Server (NTRS)
Semmel, Glenn S.; Davis, Steven R.; Leucht, Kurt W.; Rowe, Danil A.; Smith, Kevin E.; Boeloeni, Ladislau
2005-01-01
The Spaceport Processing Systems Branch at NASA Kennedy Space Center has designed, developed, and deployed a rule-based agent to monitor the Space Shuttle's ground processing telemetry stream. The NASA Engineering Shuttle Telemetry Agent increases situational awareness for system and hardware engineers during ground processing of the Shuttle's subsystems. The agent provides autonomous monitoring of the telemetry stream and automatically alerts system engineers when user defined conditions are satisfied. Efficiency and safety are improved through increased automation. Sandia National Labs' Java Expert System Shell is employed as the agent's rule engine. The shell's predicate logic lends itself well to capturing the heuristics and specifying the engineering rules within this domain. The declarative paradigm of the rule-based agent yields a highly modular and scalable design spanning multiple subsystems of the Shuttle. Several hundred monitoring rules have been written thus far with corresponding notifications sent to Shuttle engineers. This chapter discusses the rule-based telemetry agent used for Space Shuttle ground processing. We present the problem domain along with design and development considerations such as information modeling, knowledge capture, and the deployment of the product. We also present ongoing work with other condition monitoring agents.
Proving Properties of Rule-Based Systems
1990-12-01
in these systems and enable us to use them with more confidence. Each system of rules is encoded as a set of axioms that define the system theory . The...operation of the rule language and information about the subject domain are also described in the system theory . Validation tasks, such as...the validity of the conjecture in the system theory , we have carried out the corresponding validation task. If the proof is restricted to be
Algorithm Optimally Orders Forward-Chaining Inference Rules
NASA Technical Reports Server (NTRS)
James, Mark
2008-01-01
People typically develop knowledge bases in a somewhat ad hoc manner by incrementally adding rules with no specific organization. This often results in a very inefficient execution of those rules since they are so often order sensitive. This is relevant to tasks like Deep Space Network in that it allows the knowledge base to be incrementally developed and have it automatically ordered for efficiency. Although data flow analysis was first developed for use in compilers for producing optimal code sequences, its usefulness is now recognized in many software systems including knowledge-based systems. However, this approach for exhaustively computing data-flow information cannot directly be applied to inference systems because of the ubiquitous execution of the rules. An algorithm is presented that efficiently performs a complete producer/consumer analysis for each antecedent and consequence clause in a knowledge base to optimally order the rules to minimize inference cycles. An algorithm was developed that optimally orders a knowledge base composed of forwarding chaining inference rules such that independent inference cycle executions are minimized, thus, resulting in significantly faster execution. This algorithm was integrated into the JPL tool Spacecraft Health Inference Engine (SHINE) for verification and it resulted in a significant reduction in inference cycles for what was previously considered an ordered knowledge base. For a knowledge base that is completely unordered, then the improvement is much greater.
CT Image Sequence Analysis for Object Recognition - A Rule-Based 3-D Computer Vision System
Dongping Zhu; Richard W. Conners; Daniel L. Schmoldt; Philip A. Araman
1991-01-01
Research is now underway to create a vision system for hardwood log inspection using a knowledge-based approach. In this paper, we present a rule-based, 3-D vision system for locating and identifying wood defects using topological, geometric, and statistical attributes. A number of different features can be derived from the 3-D input scenes. These features and evidence...
Diversity Driven Coexistence: Collective Stability in the Cyclic Competition of Three Species
NASA Astrophysics Data System (ADS)
Bassler, Kevin E.; Frey, Erwin; Zia, R. K. P.
2015-03-01
The basic physics of collective behavior are often difficult to quantify and understand, particularly when the system is driven out of equilibrium. Many complex systems are usefully described as complex networks, consisting of nodes and links. The nodes specify individual components of the system and the links describe their interactions. When both nodes and links change dynamically, or `co-evolve', as happens in many realistic systems, complex mathematical structures are encountered, posing challenges to our understanding. In this context, we introduce a minimal system of node and link degrees of freedom, co-evolving with stochastic rules. Specifically, we show that diversity of social temperament (intro- or extroversion) can produce collective stable coexistence when three species compete cyclically. It is well-known that when only extroverts exist in a stochastic rock-paper-scissors game, or in a conserved predator-prey, Lotka-Volterra system, extinction occurs at times of O(N), where N is the number of nodes. We find that when both introverts and extroverts exist, where introverts sever social interactions and extroverts create them, collective coexistence prevails in long-living, quasi-stationary states. Work supported by the NSF through Grants DMR-1206839 (KEB) and DMR-1244666 (RKPZ), and by the AFOSR and DARPA through Grant FA9550-12-1-0405 (KEB).
Li, Yang; Li, Guoqing; Wang, Zhenhao
2015-01-01
In order to overcome the problems of poor understandability of the pattern recognition-based transient stability assessment (PRTSA) methods, a new rule extraction method based on extreme learning machine (ELM) and an improved Ant-miner (IAM) algorithm is presented in this paper. First, the basic principles of ELM and Ant-miner algorithm are respectively introduced. Then, based on the selected optimal feature subset, an example sample set is generated by the trained ELM-based PRTSA model. And finally, a set of classification rules are obtained by IAM algorithm to replace the original ELM network. The novelty of this proposal is that transient stability rules are extracted from an example sample set generated by the trained ELM-based transient stability assessment model by using IAM algorithm. The effectiveness of the proposed method is shown by the application results on the New England 39-bus power system and a practical power system--the southern power system of Hebei province.
Wilhelmsson, Per K I; Mühlich, Cornelia; Ullrich, Kristian K
2017-01-01
Abstract Plant genomes encode many lineage-specific, unique transcription factors. Expansion of such gene families has been previously found to coincide with the evolution of morphological complexity, although comparative analyses have been hampered by severe sampling bias. Here, we make use of the recently increased availability of plant genomes. We have updated and expanded previous rule sets for domain-based classification of transcription associated proteins (TAPs), comprising transcription factors and transcriptional regulators. The genome-wide annotation of these protein families has been analyzed and made available via the novel TAPscan web interface. We find that many TAP families previously thought to be specific for land plants actually evolved in streptophyte (charophyte) algae; 26 out of 36 TAP family gains are inferred to have occurred in the common ancestor of the Streptophyta (uniting the land plants—Embryophyta—with their closest algal relatives). In contrast, expansions of TAP families were found to occur throughout streptophyte evolution. 17 out of 76 expansion events were found to be common to all land plants and thus probably evolved concomitant with the water-to-land-transition. PMID:29216360
Rule Systems for Runtime Verification: A Short Tutorial
NASA Astrophysics Data System (ADS)
Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex
In this tutorial, we introduce two rule-based systems for on and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification but still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple very user-friendly temporal logic. It was developed in Python, specifically for supporting testing of spacecraft flight software for NASA’s next 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
Conceptual model of knowledge base system
NASA Astrophysics Data System (ADS)
Naykhanova, L. V.; Naykhanova, I. V.
2018-05-01
In the article, the conceptual model of the knowledge based system by the type of the production system is provided. The production system is intended for automation of problems, which solution is rigidly conditioned by the legislation. A core component of the system is a knowledge base. The knowledge base consists of a facts set, a rules set, the cognitive map and ontology. The cognitive map is developed for implementation of a control strategy, ontology - the explanation mechanism. Knowledge representation about recognition of a situation in the form of rules allows describing knowledge of the pension legislation. This approach provides the flexibility, originality and scalability of the system. In the case of changing legislation, it is necessary to change the rules set. This means that the change of the legislation would not be a big problem. The main advantage of the system is that there is an opportunity to be adapted easily to changes of the legislation.
XBONE: a hybrid expert system for supporting diagnosis of bone diseases.
Hatzilygeroudis, I; Vassilakos, P J; Tsakalidis, A
1997-01-01
In this paper, XBONE, a hybrid medical expert system that supports diagnosis of bone diseases is presented. Diagnosis is based on various patient data and is performed in two stages. In the early stage, diagnosis is based on demographic and clinical data of the patient, whereas in the late stage it is mainly based on nuclear medicine image data. Knowledge is represented via an integrated formalism that combines production rules and the Adaline artificial neural unit. Each condition of a rule is assigned a number, called its significance factor, representing its significance in drawing the conclusion of the rule. This results in better representation, reduction of the knowledge base size and gives the system learning capabilities.
Feedbacks between Reservoir Operation and Floodplain Development
NASA Astrophysics Data System (ADS)
Wallington, K.; Cai, X.
2017-12-01
The increased connectedness of socioeconomic and natural systems warrants the study of them jointly as Coupled Natural-Human Systems (CNHS) (Liu et al., 2007). One such CNHS given significant attention in recent years has been the coupled sociological-hydrological system of floodplains. Di Baldassarre et al. (2015) developed a model coupling floodplain development and levee heightening, a flood control measure, which demonstrated the "levee effect" and "adaptation effect" seen in observations. Here, we adapt the concepts discussed by Di Baldassarre et al. (2015) and apply them to floodplains in which the primary flood control measure is reservoir storage, rather than levee construction, to study the role of feedbacks between reservoir operation and floodplain development. Specifically, we investigate the feedback between floodplain development and optimal management of trade-offs between flood water conservation and flood control. By coupling a socio-economic model based on that of Di Baldassarre et al. (2015) with a reservoir optimization model based on that discussed in Ding et al. (2017), we show that reservoir operation rules can co-evolve with floodplain development. Furthermore, we intend to demonstrate that the model results are consistent with real-world data for reservoir operating curves and floodplain development. This model will help explain why some reservoirs are currently operated for purposes which they were not originally intended and thus inform reservoir design and construction.
Sleep-dependent memory triage: Evolving generalization through selective processing
Stickgold, Robert; Walker, Matthew P.
2018-01-01
The brain does not retain all the information it encodes in a day. Much is forgotten, and of those memories retained, their subsequent “evolution” can follow any of a number of pathways. Emerging data makes clear that sleep is a compelling candidate for performing many of these operations. But how does the sleeping brain know which information to preserve and which to forget? What should sleep do with that information it chooses to keep? For information that is retained, sleep can integrate it into existing memory networks, look for common patterns and distill overarching rules, or simply stabilize and strengthen the memory exactly as it was learned. We suggest such “memory triage” lies at the heart of a sleep-dependent memory processing system that selects new information, in a discriminatory manner, and assimilates it into the brain’s vast armamentarium of evolving knowledge, helping guide each organism through its own, unique life. PMID:23354387
Siegel, J; Kirkland, D
1991-01-01
The Composite Health Care System (CHCS), a MUMPS-based hospital information system (HIS), has evolved from the Decentralized Hospital Computer Program (DHCP) installed within VA Hospitals. The authors explore the evolution of an ancillary-based system toward an integrated model with a look at its current state and possible future. The history and relationships between orders of different types tie specific patient-related data into a logical and temporal model. Diagrams demonstrate how the database structure has evolved to support clinical needs for integration. It is suggested that a fully integrated model is capable of meeting traditional HIS needs.
OFMspert: An architecture for an operator's associate that evolves to an intelligent tutor
NASA Technical Reports Server (NTRS)
Mitchell, Christine M.
1991-01-01
With the emergence of new technology for both human-computer interaction and knowledge-based systems, a range of opportunities exist which enhance the effectiveness and efficiency of controllers of high-risk engineering systems. The design of an architecture for an operator's associate is described. This associate is a stand-alone model-based system designed to interact with operators of complex dynamic systems, such as airplanes, manned space systems, and satellite ground control systems in ways comparable to that of a human assistant. The operator function model expert system (OFMspert) architecture and the design and empirical validation of OFMspert's understanding component are described. The design and validation of OFMspert's interactive and control components are also described. A description of current work in which OFMspert provides the foundation in the development of an intelligent tutor that evolves to an assistant, as operator expertise evolves from novice to expert, is provided.
A Data-Driven, Integrated Flare Model Based on Self-Organized Criticality
NASA Astrophysics Data System (ADS)
Dimitropoulou, M.; Isliker, H.; Vlahos, L.; Georgoulis, M.
2013-09-01
We interpret solar flares as events originating in solar active regions having reached the self-organized critical state, by alternatively using two versions of an "integrated flare model" - one static and one dynamic. In both versions the initial conditions are derived from observations aiming to investigate whether well-known scaling laws observed in the distribution functions of characteristic flare parameters are reproduced after the self-organized critical state has been reached. In the static model, we first apply a nonlinear force-free extrapolation that reconstructs the three-dimensional magnetic fields from two-dimensional vector magnetograms. We then locate magnetic discontinuities exceeding a threshold in the Laplacian of the magnetic field. These discontinuities are relaxed in local diffusion events, implemented in the form of cellular-automaton evolution rules. Subsequent loading and relaxation steps lead the system to self-organized criticality, after which the statistical properties of the simulated events are examined. In the dynamic version we deploy an enhanced driving mechanism, which utilizes the observed evolution of active regions, making use of sequential vector magnetograms. We first apply the static cellular automaton model to consecutive solar vector magnetograms until the self-organized critical state is reached. We then evolve the magnetic field inbetween these processed snapshots through spline interpolation, acting as a natural driver in the dynamic model. The identification of magnetically unstable sites as well as their relaxation follow the same rules as in the static model after each interpolation step. Subsequent interpolation/driving and relaxation steps cover all transitions until the end of the sequence. Physical requirements, such as the divergence-free condition for the magnetic field vector, are approximately satisfied in both versions of the model. We obtain robust power laws in the distribution functions of the modelled flaring events with scaling indices in good agreement with observations. We therefore conclude that well-known statistical properties of flares are reproduced after active regions reach self-organized criticality. The significant enhancement in both the static and the dynamic integrated flare models is that they initiate the simulation from observations, thus facilitating energy calculation in physical units. Especially in the dynamic version of the model, the driving of the system is based on observed, evolving vector magnetograms, allowing for the separation between MHD and kinetic timescales through the assignment of distinct MHD timestamps to each interpolation step.
Modern architectures for intelligent systems: reusable ontologies and problem-solving methods.
Musen, M. A.
1998-01-01
When interest in intelligent systems for clinical medicine soared in the 1970s, workers in medical informatics became particularly attracted to rule-based systems. Although many successful rule-based applications were constructed, development and maintenance of large rule bases remained quite problematic. In the 1980s, an entire industry dedicated to the marketing of tools for creating rule-based systems rose and fell, as workers in medical informatics began to appreciate deeply why knowledge acquisition and maintenance for such systems are difficult problems. During this time period, investigators began to explore alternative programming abstractions that could be used to develop intelligent systems. The notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) domain-independent problem-solving methods-standard algorithms for automating stereotypical tasks--and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper will highlight how intelligent systems for diverse tasks can be efficiently automated using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community. PMID:9929181
Modern architectures for intelligent systems: reusable ontologies and problem-solving methods.
Musen, M A
1998-01-01
When interest in intelligent systems for clinical medicine soared in the 1970s, workers in medical informatics became particularly attracted to rule-based systems. Although many successful rule-based applications were constructed, development and maintenance of large rule bases remained quite problematic. In the 1980s, an entire industry dedicated to the marketing of tools for creating rule-based systems rose and fell, as workers in medical informatics began to appreciate deeply why knowledge acquisition and maintenance for such systems are difficult problems. During this time period, investigators began to explore alternative programming abstractions that could be used to develop intelligent systems. The notions of "generic tasks" and of reusable problem-solving methods became extremely influential. By the 1990s, academic centers were experimenting with architectures for intelligent systems based on two classes of reusable components: (1) domain-independent problem-solving methods-standard algorithms for automating stereotypical tasks--and (2) domain ontologies that captured the essential concepts (and relationships among those concepts) in particular application areas. This paper will highlight how intelligent systems for diverse tasks can be efficiently automated using these kinds of building blocks. The creation of domain ontologies and problem-solving methods is the fundamental end product of basic research in medical informatics. Consequently, these concepts need more attention by our scientific community.
Genetic reinforcement learning through symbiotic evolution for fuzzy controller design.
Juang, C F; Lin, J Y; Lin, C T
2000-01-01
An efficient genetic reinforcement learning algorithm for designing fuzzy controllers is proposed in this paper. The genetic algorithm (GA) adopted in this paper is based upon symbiotic evolution which, when applied to fuzzy controller design, complements the local mapping property of a fuzzy rule. Using this Symbiotic-Evolution-based Fuzzy Controller (SEFC) design method, the number of control trials, as well as consumed CPU time, are considerably reduced when compared to traditional GA-based fuzzy controller design methods and other types of genetic reinforcement learning schemes. Moreover, unlike traditional fuzzy controllers, which partition the input space into a grid, SEFC partitions the input space in a flexible way, thus creating fewer fuzzy rules. In SEFC, different types of fuzzy rules whose consequent parts are singletons, fuzzy sets, or linear equations (TSK-type fuzzy rules) are allowed. Further, the free parameters (e.g., centers and widths of membership functions) and fuzzy rules are all tuned automatically. For the TSK-type fuzzy rule especially, which put the proposed learning algorithm in use, only the significant input variables are selected to participate in the consequent of a rule. The proposed SEFC design method has been applied to different simulated control problems, including the cart-pole balancing system, a magnetic levitation system, and a water bath temperature control system. The proposed SEFC has been verified to be efficient and superior from these control problems, and from comparisons with some traditional GA-based fuzzy systems.
48 CFR 6103.306 - Decisions [Rule 306].
Code of Federal Regulations, 2010 CFR
2010-10-01
... 48 Federal Acquisition Regulations System 7 2010-10-01 2010-10-01 false Decisions [Rule 306]. 6103.306 Section 6103.306 Federal Acquisition Regulations System CIVILIAN BOARD OF CONTRACT APPEALS, GENERAL SERVICES ADMINISTRATION TRANSPORTATION RATE CASES 6103.306 Decisions [Rule 306]. The judge will issue a written decision based upon the record,...
Complexities, Catastrophes and Cities: Emergency Dynamics in Varying Scenarios and Urban Topologies
NASA Astrophysics Data System (ADS)
Narzisi, Giuseppe; Mysore, Venkatesh; Byeon, Jeewoong; Mishra, Bud
Complex Systems are often characterized by agents capable of interacting with each other dynamically, often in non-linear and non-intuitive ways. Trying to characterize their dynamics often results in partial differential equations that are difficult, if not impossible, to solve. A large city or a city-state is an example of such an evolving and self-organizing complex environment that efficiently adapts to different and numerous incremental changes to its social, cultural and technological infrastructure [1]. One powerful technique for analyzing such complex systems is Agent-Based Modeling (ABM) [9], which has seen an increasing number of applications in social science, economics and also biology. The agent-based paradigm facilitates easier transfer of domain specific knowledge into a model. ABM provides a natural way to describe systems in which the overall dynamics can be described as the result of the behavior of populations of autonomous components: agents, with a fixed set of rules based on local information and possible central control. As part of the NYU Center for Catastrophe Preparedness and Response (CCPR1), we have been exploring how ABM can serve as a powerful simulation technique for analyzing large-scale urban disasters. The central problem in Disaster Management is that it is not immediately apparent whether the current emergency plans are robust against such sudden, rare and punctuated catastrophic events.
NASA Astrophysics Data System (ADS)
Lee, S.; Hamlet, A. F.; Burges, S. J.
2008-12-01
Climate change in the Western U.S. will bring systematic hydrologic changes affecting many water resources systems. Successful adaptation to these changes, which will be ongoing through the 21st century, will require the 'rebalancing' of competing system objectives such as water supply, flood control, hydropower production, and environmental services in response to hydrologic (and other) changes. Although fixed operating policies for the operation of reservoirs has been a traditional approach to water management in the 20th century, the rapid pace of projected climate shifts (~0.5 F per decade), and the prohibitive costs of recursive policy intervention to mitigate impacts, suggest that more sophisticated approaches will be needed to cope with climate change on a long term basis. The use of 'dynamic rule curves' is an approach that maintains some of the key characteristics of current water management practice (reservoir rule curves) while avoiding many of the fundamental drawbacks of traditional water resources management strategies in a non-stationary climate. In this approach, water resources systems are optimized for each operational period using ensemble streamflow and/or water demand forecasts. The ensemble of optimized reservoir storage traces are then analyzed to produce a set of unique reservoir rule curves for each operational period reflecting the current state of the system. The potential advantage of this approach is that hydrologic changes associated with climate change (such as systematically warmer temperatures) can be captured explicitly in operational hydrologic forecasts, which would in turn inform the optimized reservoir management solutions, creating water resources systems that are largely 'self tending' as the climate system evolves. Furthermore, as hydrologic forecasting systems improve (e.g. in response to improved ENSO forecasting or other scientific advances), so does the performance of reservoir operations. An example of the approach is given for flood control in the Columbia River basin.
Rule-based support system for multiple UMLS semantic type assignments
Geller, James; He, Zhe; Perl, Yehoshua; Morrey, C. Paul; Xu, Julia
2012-01-01
Background When new concepts are inserted into the UMLS, they are assigned one or several semantic types from the UMLS Semantic Network by the UMLS editors. However, not every combination of semantic types is permissible. It was observed that many concepts with rare combinations of semantic types have erroneous semantic type assignments or prohibited combinations of semantic types. The correction of such errors is resource-intensive. Objective We design a computational system to inform UMLS editors as to whether a specific combination of two, three, four, or five semantic types is permissible or prohibited or questionable. Methods We identify a set of inclusion and exclusion instructions in the UMLS Semantic Network documentation and derive corresponding rule-categories as well as rule-categories from the UMLS concept content. We then design an algorithm adviseEditor based on these rule-categories. The algorithm specifies rules for an editor how to proceed when considering a tuple (pair, triple, quadruple, quintuple) of semantic types to be assigned to a concept. Results Eight rule-categories were identified. A Web-based system was developed to implement the adviseEditor algorithm, which returns for an input combination of semantic types whether it is permitted, prohibited or (in a few cases) requires more research. The numbers of semantic type pairs assigned to each rule-category are reported. Interesting examples for each rule-category are illustrated. Cases of semantic type assignments that contradict rules are listed, including recently introduced ones. Conclusion The adviseEditor system implements explicit and implicit knowledge available in the UMLS in a system that informs UMLS editors about the permissibility of a desired combination of semantic types. Using adviseEditor might help accelerate the work of the UMLS editors and prevent erroneous semantic type assignments. PMID:23041716
Outsourcing critical financial system operations.
Cox, Nora; Pilbauer, Jan
2018-01-01
Payments Canada provides Canada's national payments systems and is responsible for the clearing and settlement infrastructure, processes and rules that underpin the exchange of billions of dollars each day through the Canadian economy. Strategic sourcing is a reality for this small organisation with a broad scope of national regulations and global standards to comply with. This paper outlines Payments Canada's approach to outsourcing its critical financial system operations, which centres on four key principles: strong relationship management; continuous learning, recording and reporting; evaluating the business landscape; and a commitment to evolving the organisation to greater resilience. This last point is covered in detail with an exploration of the organisation's resilience and security strategy as well as its risk appetite. As Payments Canada progresses to its future state, which includes modernising its core payment systems, underlying rules and standards, risk management for the industry as a whole will remain at the forefront of its collective mind. The expectation is that outsourcing will remain a fundamental element of its operating model in future, a strategy that will ensure the organisation can focus on its core business competencies and eliminate the need to develop and support in-house expertise in commodity areas.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pin, F.G.
Sensor-based operation of autonomous robots in unstructured and/or outdoor environments has revealed to be an extremely challenging problem, mainly because of the difficulties encountered when attempting to represent the many uncertainties which are always present in the real world. These uncertainties are primarily due to sensor imprecisions and unpredictability of the environment, i.e., lack of full knowledge of the environment characteristics and dynamics. An approach. which we have named the {open_quotes}Fuzzy Behaviorist Approach{close_quotes} (FBA) is proposed in an attempt to remedy some of these difficulties. This approach is based on the representation of the system`s uncertainties using Fuzzy Set Theory-basedmore » approximations and on the representation of the reasoning and control schemes as sets of elemental behaviors. Using the FBA, a formalism for rule base development and an automated generator of fuzzy rules have been developed. This automated system can automatically construct the set of membership functions corresponding to fuzzy behaviors. Once these have been expressed in qualitative terms by the user. The system also checks for completeness of the rule base and for non-redundancy of the rules (which has traditionally been a major hurdle in rule base development). Two major conceptual features, the suppression and inhibition mechanisms which allow to express a dominance between behaviors are discussed in detail. Some experimental results obtained with the automated fuzzy, rule generator applied to the domain of sensor-based navigation in aprion unknown environments. using one of our autonomous test-bed robots as well as a real car in outdoor environments, are then reviewed and discussed to illustrate the feasibility of large-scale automatic fuzzy rule generation using the {open_quotes}Fuzzy Behaviorist{close_quotes} concepts.« less
Sargis, Eric J.; Millien, Virginie; Woodman, Neal; Olson, Link E.
2018-01-01
There are a number of ecogeographical “rules” that describe patterns of geographical variation among organisms. The island rule predicts that populations of larger mammals on islands evolve smaller mean body size than their mainland counterparts, whereas smaller‐bodied mammals evolve larger size. Bergmann's rule predicts that populations of a species in colder climates (generally at higher latitudes) have larger mean body sizes than conspecifics in warmer climates (at lower latitudes). These two rules are rarely tested together and neither has been rigorously tested in treeshrews, a clade of small‐bodied mammals in their own order (Scandentia) broadly distributed in mainland Southeast Asia and on islands throughout much of the Sunda Shelf. The common treeshrew, Tupaia glis, is an excellent candidate for study and was used to test these two rules simultaneously for the first time in treeshrews. This species is distributed on the Malay Peninsula and several offshore islands east, west, and south of the mainland. Using craniodental dimensions as a proxy for body size, we investigated how island size, distance from the mainland, and maximum sea depth between the mainland and the islands relate to body size of 13 insular T. glis populations while also controlling for latitude and correlation among variables. We found a strong negative effect of latitude on body size in the common treeshrew, indicating the inverse of Bergmann's rule. We did not detect any overall difference in body size between the island and mainland populations. However, there was an effect of island area and maximum sea depth on body size among island populations. Although there is a strong latitudinal effect on body size, neither Bergmann's rule nor the island rule applies to the common treeshrew. The results of our analyses demonstrate the necessity of assessing multiple variables simultaneously in studies of ecogeographical rules.
A CLIPS-based expert system for the evaluation and selection of robots
NASA Technical Reports Server (NTRS)
Nour, Mohamed A.; Offodile, Felix O.; Madey, Gregory R.
1994-01-01
This paper describes the development of a prototype expert system for intelligent selection of robots for manufacturing operations. The paper first develops a comprehensive, three-stage process to model the robot selection problem. The decisions involved in this model easily lend themselves to an expert system application. A rule-based system, based on the selection model, is developed using the CLIPS expert system shell. Data about actual robots is used to test the performance of the prototype system. Further extensions to the rule-based system for data handling and interfacing capabilities are suggested.
The PDS4 Information Model and its Role in Agile Science Data Curation
NASA Astrophysics Data System (ADS)
Hughes, J. S.; Crichton, D.
2017-12-01
PDS4 is an information model-driven service architecture supporting the capture, management, distribution and integration of massive planetary science data captured in distributed data archives world-wide. The PDS4 Information Model (IM), the core element of the architecture, was developed using lessons learned from 20 years of archiving Planetary Science Data and best practices for information model development. The foundational principles were adopted from the Open Archival Information System (OAIS) Reference Model (ISO 14721), the Metadata Registry Specification (ISO/IEC 11179), and W3C XML (Extensible Markup Language) specifications. These provided respectively an object oriented model for archive information systems, a comprehensive schema for data dictionaries and hierarchical governance, and rules for rules for encoding documents electronically. The PDS4 Information model is unique in that it drives the PDS4 infrastructure by providing the representation of concepts and their relationships, constraints, rules, and operations; a sharable, stable, and organized set of information requirements; and machine parsable definitions that are suitable for configuring and generating code. This presentation will provide an over of the PDS4 Information Model and how it is being leveraged to develop and evolve the PDS4 infrastructure and enable agile curation of over 30 years of science data collected by the international Planetary Science community.
NASA Astrophysics Data System (ADS)
Lombardi, Ilaria; Console, Luca
In the paper we show how rule-based inference can be made more flexible by exploiting semantic information associated with the concepts involved in the rules. We introduce flexible forms of common sense reasoning in which whenever no rule applies to a given situation, the inference engine can fire rules that apply to more general or to similar situations. This can be obtained by defining new forms of match between rules and the facts in the working memory and new forms of conflict resolution. We claim that in this way we can overcome some of the brittleness problems that are common in rule-based systems.
Rule-Based Event Processing and Reaction Rules
NASA Astrophysics Data System (ADS)
Paschke, Adrian; Kozlenkov, Alexander
Reaction rules and event processing technologies play a key role in making business and IT / Internet infrastructures more agile and active. While event processing is concerned with detecting events from large event clouds or streams in almost real-time, reaction rules are concerned with the invocation of actions in response to events and actionable situations. They state the conditions under which actions must be taken. In the last decades various reaction rule and event processing approaches have been developed, which for the most part have been advanced separately. In this paper we survey reaction rule approaches and rule-based event processing systems and languages.
Spatio-Temporal Data Model for Integrating Evolving Nation-Level Datasets
NASA Astrophysics Data System (ADS)
Sorokine, A.; Stewart, R. N.
2017-10-01
Ability to easily combine the data from diverse sources in a single analytical workflow is one of the greatest promises of the Big Data technologies. However, such integration is often challenging as datasets originate from different vendors, governments, and research communities that results in multiple incompatibilities including data representations, formats, and semantics. Semantics differences are hardest to handle: different communities often use different attribute definitions and associate the records with different sets of evolving geographic entities. Analysis of global socioeconomic variables across multiple datasets over prolonged time is often complicated by the difference in how boundaries and histories of countries or other geographic entities are represented. Here we propose an event-based data model for depicting and tracking histories of evolving geographic units (countries, provinces, etc.) and their representations in disparate data. The model addresses the semantic challenge of preserving identity of geographic entities over time by defining criteria for the entity existence, a set of events that may affect its existence, and rules for mapping between different representations (datasets). Proposed model is used for maintaining an evolving compound database of global socioeconomic and environmental data harvested from multiple sources. Practical implementation of our model is demonstrated using PostgreSQL object-relational database with the use of temporal, geospatial, and NoSQL database extensions.
NASA Technical Reports Server (NTRS)
Thalman, Nancy E.; Sparn, Thomas P.
1990-01-01
SURE (Science User Resource Expert) is one of three components that compose the SURPASS (Science User Resource Planning and Scheduling System). This system is a planning and scheduling tool which supports distributed planning and scheduling, based on resource allocation and optimization. Currently SURE is being used within the SURPASS by the UARS (Upper Atmospheric Research Satellite) SOLSTICE instrument to build a daily science plan and activity schedule and in a prototyping effort with NASA GSFC to demonstrate distributed planning and scheduling for the SOLSTICE II instrument on the EOS platform. For the SOLSTICE application the SURE utilizes a rule-based system. Development of a rule-based program using Ada CLIPS as opposed to using conventional programming, allows for capture of the science planning and scheduling heuristics in rules and provides flexibility in inserting or removing rules as the scientific objectives and mission constraints change. The SURE system's role as a component in the SURPASS, the purpose of the SURE planning and scheduling tool, the SURE knowledge base, and the software architecture of the SURE component are described.
Dynamic Algorithms for Transition Matrix Generation
NASA Astrophysics Data System (ADS)
Yevick, David; Lee, Yong Hwan
The methods of [D. Yevick, Int. J. Mod. Phys. C, 1650041] for constructing transition matrices are applied to the two dimensional Ising model. Decreasing the system temperature during the acquisition of the matrix elements yields a reasonably precise specific heat curve for a 32x32 spin system for a limited number (50-100M) of realizations. If the system is instead evolved to first higher and then lower energies within a restricted interval that is steadily displaced in energy as the computation proceeds, a modification which permits backward displacements up to a certain lower bound for each forward step ensures acceptable accuracy. Additional constraints on the transition rule are also investigated. The Natural Sciences and Engineering Research Council of Canada (NSERC) and CIENA are acknowledged for financial support.
A diagnostic prototype of the potable water subsystem of the Space Station Freedom ECLSS
NASA Technical Reports Server (NTRS)
Lukefahr, Brenda D.; Rochowiak, Daniel M.; Benson, Brian L.; Rogers, John S.; Mckee, James W.
1989-01-01
In analyzing the baseline Environmental Control and Life Support System (ECLSS) command and control architecture, various processes are found which would be enhanced by the use of knowledge based system methods of implementation. The most suitable process for prototyping using rule based methods are documented, while domain knowledge resources and other practical considerations are examined. Requirements for a prototype rule based software system are documented. These requirements reflect Space Station Freedom ECLSS software and hardware development efforts, and knowledge based system requirements. A quick prototype knowledge based system environment is researched and developed.
Intelligent reservoir operation system based on evolving artificial neural networks
NASA Astrophysics Data System (ADS)
Chaves, Paulo; Chang, Fi-John
2008-06-01
We propose a novel intelligent reservoir operation system based on an evolving artificial neural network (ANN). Evolving means the parameters of the ANN model are identified by the GA evolutionary optimization technique. Accordingly, the ANN model should represent the operational strategies of reservoir operation. The main advantages of the Evolving ANN Intelligent System (ENNIS) are as follows: (i) only a small number of parameters to be optimized even for long optimization horizons, (ii) easy to handle multiple decision variables, and (iii) the straightforward combination of the operation model with other prediction models. The developed intelligent system was applied to the operation of the Shihmen Reservoir in North Taiwan, to investigate its applicability and practicability. The proposed method is first built to a simple formulation for the operation of the Shihmen Reservoir, with single objective and single decision. Its results were compared to those obtained by dynamic programming. The constructed network proved to be a good operational strategy. The method was then built and applied to the reservoir with multiple (five) decision variables. The results demonstrated that the developed evolving neural networks improved the operation performance of the reservoir when compared to its current operational strategy. The system was capable of successfully simultaneously handling various decision variables and provided reasonable and suitable decisions.
[Economics of health system transformation].
González Pier, Eduardo
2012-01-01
Health conditions in Mexico have evolved along with socioeconomic conditions. As a result, today's health system faces several problems characterized by four overlapping transitions: demand, expectations, funding and health resources. These transitions engender significant pressures on the system itself. Additionally, fragmentation of the health system creates disparities in access to services and generates problems in terms of efficiency and use of available resources. To address these complications and to improve equity in access and efficiency, thorough analysis is required in how the right to access health care should be established at a constitutional level without differentiating across population groups. This should be followed by careful discussion about what rules of health care financing should exist, which set of interventions ought to be covered and how services must be organized to meet the health needs of the population.
2016-11-14
This final rule with comment period revises the Medicare hospital outpatient prospective payment system (OPPS) and the Medicare ambulatory surgical center (ASC) payment system for CY 2017 to implement applicable statutory requirements and changes arising from our continuing experience with these systems. In this final rule with comment period, we describe the changes to the amounts and factors used to determine the payment rates for Medicare services paid under the OPPS and those paid under the ASC payment system. In addition, this final rule with comment period updates and refines the requirements for the Hospital Outpatient Quality Reporting (OQR) Program and the ASC Quality Reporting (ASCQR) Program. Further, in this final rule with comment period, we are making changes to tolerance thresholds for clinical outcomes for solid organ transplant programs; to Organ Procurement Organizations (OPOs) definitions, outcome measures, and organ transport documentation; and to the Medicare and Medicaid Electronic Health Record Incentive Programs. We also are removing the HCAHPS Pain Management dimension from the Hospital Value-Based Purchasing (VBP) Program. In addition, we are implementing section 603 of the Bipartisan Budget Act of 2015 relating to payment for certain items and services furnished by certain off-campus provider-based departments of a provider. In this document, we also are issuing an interim final rule with comment period to establish the Medicare Physician Fee Schedule payment rates for the nonexcepted items and services billed by a nonexcepted off-campus provider-based department of a hospital in accordance with the provisions of section 603.
An architecture for rule based system explanation
NASA Technical Reports Server (NTRS)
Fennel, T. R.; Johannes, James D.
1990-01-01
A system architecture is presented which incorporate both graphics and text into explanations provided by rule based expert systems. This architecture facilitates explanation of the knowledge base content, the control strategies employed by the system, and the conclusions made by the system. The suggested approach combines hypermedia and inference engine capabilities. Advantages include: closer integration of user interface, explanation system, and knowledge base; the ability to embed links to deeper knowledge underlying the compiled knowledge used in the knowledge base; and allowing for more direct control of explanation depth and duration by the user. User models are suggested to control the type, amount, and order of information presented.
Knowledge base rule partitioning design for CLIPS
NASA Technical Reports Server (NTRS)
Mainardi, Joseph D.; Szatkowski, G. P.
1990-01-01
This describes a knowledge base (KB) partitioning approach to solve the problem of real-time performance using the CLIPS AI shell when containing large numbers of rules and facts. This work is funded under the joint USAF/NASA Advanced Launch System (ALS) Program as applied research in expert systems to perform vehicle checkout for real-time controller and diagnostic monitoring tasks. The Expert System advanced development project (ADP-2302) main objective is to provide robust systems responding to new data frames of 0.1 to 1.0 second intervals. The intelligent system control must be performed within the specified real-time window, in order to meet the demands of the given application. Partitioning the KB reduces the complexity of the inferencing Rete net at any given time. This reduced complexity improves performance but without undo impacts during load and unload cycles. The second objective is to produce highly reliable intelligent systems. This requires simple and automated approaches to the KB verification & validation task. Partitioning the KB reduces rule interaction complexity overall. Reduced interaction simplifies the V&V testing necessary by focusing attention only on individual areas of interest. Many systems require a robustness that involves a large number of rules, most of which are mutually exclusive under different phases or conditions. The ideal solution is to control the knowledge base by loading rules that directly apply for that condition, while stripping out all rules and facts that are not used during that cycle. The practical approach is to cluster rules and facts into associated 'blocks'. A simple approach has been designed to control the addition and deletion of 'blocks' of rules and facts, while allowing real-time operations to run freely. Timing tests for real-time performance for specific machines under R/T operating systems have not been completed but are planned as part of the analysis process to validate the design.
Alternative Architectures for Distributed Work in the National Airspace System
NASA Technical Reports Server (NTRS)
Smith, Philip J.; Billings, Charles E.; Chapman, Roger; Obradovich, Heintz; McCoy, C. Elaine; Orasanu, Judith
2000-01-01
The architecture for the National Airspace System (NAS) in the United States has evolved over time to rely heavily on the distribution of tasks and control authority in order to keep cognitive complexity manageable for any one individual. This paper characterizes a number of different subsystems that have been recently incorporated in the NAS. The goal of this discussion is to begin to identify the critical parameters defining the differences among alternative architectures in terms of the locus of control and in terms of access to relevant data and knowledge. At an abstract level, this analysis can be described as an effort to describe alternative "rules of the game" for the NAS.
Hassanpour, Saeed; O'Connor, Martin J; Das, Amar K
2013-08-12
A variety of informatics approaches have been developed that use information retrieval, NLP and text-mining techniques to identify biomedical concepts and relations within scientific publications or their sentences. These approaches have not typically addressed the challenge of extracting more complex knowledge such as biomedical definitions. In our efforts to facilitate knowledge acquisition of rule-based definitions of autism phenotypes, we have developed a novel semantic-based text-mining approach that can automatically identify such definitions within text. Using an existing knowledge base of 156 autism phenotype definitions and an annotated corpus of 26 source articles containing such definitions, we evaluated and compared the average rank of correctly identified rule definition or corresponding rule template using both our semantic-based approach and a standard term-based approach. We examined three separate scenarios: (1) the snippet of text contained a definition already in the knowledge base; (2) the snippet contained an alternative definition for a concept in the knowledge base; and (3) the snippet contained a definition not in the knowledge base. Our semantic-based approach had a higher average rank than the term-based approach for each of the three scenarios (scenario 1: 3.8 vs. 5.0; scenario 2: 2.8 vs. 4.9; and scenario 3: 4.5 vs. 6.2), with each comparison significant at the p-value of 0.05 using the Wilcoxon signed-rank test. Our work shows that leveraging existing domain knowledge in the information extraction of biomedical definitions significantly improves the correct identification of such knowledge within sentences. Our method can thus help researchers rapidly acquire knowledge about biomedical definitions that are specified and evolving within an ever-growing corpus of scientific publications.
Program for Experimentation With Expert Systems
NASA Technical Reports Server (NTRS)
Engle, S. W.
1986-01-01
CERBERUS is forward-chaining, knowledge-based system program useful for experimentation with expert systems. Inference-engine mechanism performs deductions according to user-supplied rule set. Information stored in intermediate area, and user interrogated only when no applicable data found in storage. Each assertion posed by CERBERUS answered with certainty ranging from 0 to 100 percent. Rule processor stops investigating applicable rules when goal reaches certainty of 95 percent or higher. Capable of operating for wide variety of domains. Sample rule files included for animal identification, pixel classification in image processing, and rudimentary car repair for novice mechanic. User supplies set of end goals or actions. System complexity decided by user's rule file. CERBERUS written in FORTRAN 77.
Karayiannis, Nicolaos B; Mukherjee, Amit; Glover, John R; Ktonas, Periklis Y; Frost, James D; Hrachovy, Richard A; Mizrahi, Eli M
2006-04-01
This paper presents an approach to detect epileptic seizure segments in the neonatal electroencephalogram (EEG) by characterizing the spectral features of the EEG waveform using a rule-based algorithm cascaded with a neural network. A rule-based algorithm screens out short segments of pseudosinusoidal EEG patterns as epileptic based on features in the power spectrum. The output of the rule-based algorithm is used to train and compare the performance of conventional feedforward neural networks and quantum neural networks. The results indicate that the trained neural networks, cascaded with the rule-based algorithm, improved the performance of the rule-based algorithm acting by itself. The evaluation of the proposed cascaded scheme for the detection of pseudosinusoidal seizure segments reveals its potential as a building block of the automated seizure detection system under development.
Misirli, Goksel; Cavaliere, Matteo; Waites, William; Pocock, Matthew; Madsen, Curtis; Gilfellon, Owen; Honorato-Zimmer, Ricardo; Zuliani, Paolo; Danos, Vincent; Wipat, Anil
2016-03-15
Biological systems are complex and challenging to model and therefore model reuse is highly desirable. To promote model reuse, models should include both information about the specifics of simulations and the underlying biology in the form of metadata. The availability of computationally tractable metadata is especially important for the effective automated interpretation and processing of models. Metadata are typically represented as machine-readable annotations which enhance programmatic access to information about models. Rule-based languages have emerged as a modelling framework to represent the complexity of biological systems. Annotation approaches have been widely used for reaction-based formalisms such as SBML. However, rule-based languages still lack a rich annotation framework to add semantic information, such as machine-readable descriptions, to the components of a model. We present an annotation framework and guidelines for annotating rule-based models, encoded in the commonly used Kappa and BioNetGen languages. We adapt widely adopted annotation approaches to rule-based models. We initially propose a syntax to store machine-readable annotations and describe a mapping between rule-based modelling entities, such as agents and rules, and their annotations. We then describe an ontology to both annotate these models and capture the information contained therein, and demonstrate annotating these models using examples. Finally, we present a proof of concept tool for extracting annotations from a model that can be queried and analyzed in a uniform way. The uniform representation of the annotations can be used to facilitate the creation, analysis, reuse and visualization of rule-based models. Although examples are given, using specific implementations the proposed techniques can be applied to rule-based models in general. The annotation ontology for rule-based models can be found at http://purl.org/rbm/rbmo The krdf tool and associated executable examples are available at http://purl.org/rbm/rbmo/krdf anil.wipat@newcastle.ac.uk or vdanos@inf.ed.ac.uk. © The Author 2015. Published by Oxford University Press.
Neuronal avalanches and learning
NASA Astrophysics Data System (ADS)
de Arcangelis, Lucilla
2011-05-01
Networks of living neurons represent one of the most fascinating systems of biology. If the physical and chemical mechanisms at the basis of the functioning of a single neuron are quite well understood, the collective behaviour of a system of many neurons is an extremely intriguing subject. Crucial ingredient of this complex behaviour is the plasticity property of the network, namely the capacity to adapt and evolve depending on the level of activity. This plastic ability is believed, nowadays, to be at the basis of learning and memory in real brains. Spontaneous neuronal activity has recently shown features in common to other complex systems. Experimental data have, in fact, shown that electrical information propagates in a cortex slice via an avalanche mode. These avalanches are characterized by a power law distribution for the size and duration, features found in other problems in the context of the physics of complex systems and successful models have been developed to describe their behaviour. In this contribution we discuss a statistical mechanical model for the complex activity in a neuronal network. The model implements the main physiological properties of living neurons and is able to reproduce recent experimental results. Then, we discuss the learning abilities of this neuronal network. Learning occurs via plastic adaptation of synaptic strengths by a non-uniform negative feedback mechanism. The system is able to learn all the tested rules, in particular the exclusive OR (XOR) and a random rule with three inputs. The learning dynamics exhibits universal features as function of the strength of plastic adaptation. Any rule could be learned provided that the plastic adaptation is sufficiently slow.
One Giant Leap for Categorizers: One Small Step for Categorization Theory
Smith, J. David; Ell, Shawn W.
2015-01-01
We explore humans’ rule-based category learning using analytic approaches that highlight their psychological transitions during learning. These approaches confirm that humans show qualitatively sudden psychological transitions during rule learning. These transitions contribute to the theoretical literature contrasting single vs. multiple category-learning systems, because they seem to reveal a distinctive learning process of explicit rule discovery. A complete psychology of categorization must describe this learning process, too. Yet extensive formal-modeling analyses confirm that a wide range of current (gradient-descent) models cannot reproduce these transitions, including influential rule-based models (e.g., COVIS) and exemplar models (e.g., ALCOVE). It is an important theoretical conclusion that existing models cannot explain humans’ rule-based category learning. The problem these models have is the incremental algorithm by which learning is simulated. Humans descend no gradient in rule-based tasks. Very different formal-modeling systems will be required to explain humans’ psychology in these tasks. An important next step will be to build a new generation of models that can do so. PMID:26332587
NASA Astrophysics Data System (ADS)
Anugrah, Wirdah; Suryono; Suseno, Jatmiko Endro
2018-02-01
Management of water resources based on Geographic Information System can provide substantial benefits to water availability settings. Monitoring the potential water level is needed in the development sector, agriculture, energy and others. In this research is developed water resource information system using real-time Geographic Information System concept for monitoring the potential water level of web based area by applying rule based system method. GIS consists of hardware, software, and database. Based on the web-based GIS architecture, this study uses a set of computer that are connected to the network, run on the Apache web server and PHP programming language using MySQL database. The Ultrasound Wireless Sensor System is used as a water level data input. It also includes time and geographic location information. This GIS maps the five sensor locations. GIS is processed through a rule based system to determine the level of potential water level of the area. Water level monitoring information result can be displayed on thematic maps by overlaying more than one layer, and also generating information in the form of tables from the database, as well as graphs are based on the timing of events and the water level values.
Automatic Learning of Fine Operating Rules for Online Power System Security Control.
Sun, Hongbin; Zhao, Feng; Wang, Hao; Wang, Kang; Jiang, Weiyong; Guo, Qinglai; Zhang, Boming; Wehenkel, Louis
2016-08-01
Fine operating rules for security control and an automatic system for their online discovery were developed to adapt to the development of smart grids. The automatic system uses the real-time system state to determine critical flowgates, and then a continuation power flow-based security analysis is used to compute the initial transfer capability of critical flowgates. Next, the system applies the Monte Carlo simulations to expected short-term operating condition changes, feature selection, and a linear least squares fitting of the fine operating rules. The proposed system was validated both on an academic test system and on a provincial power system in China. The results indicated that the derived rules provide accuracy and good interpretability and are suitable for real-time power system security control. The use of high-performance computing systems enables these fine operating rules to be refreshed online every 15 min.
Object-oriented fault tree models applied to system diagnosis
NASA Technical Reports Server (NTRS)
Iverson, David L.; Patterson-Hine, F. A.
1990-01-01
When a diagnosis system is used in a dynamic environment, such as the distributed computer system planned for use on Space Station Freedom, it must execute quickly and its knowledge base must be easily updated. Representing system knowledge as object-oriented augmented fault trees provides both features. The diagnosis system described here is based on the failure cause identification process of the diagnostic system described by Narayanan and Viswanadham. Their system has been enhanced in this implementation by replacing the knowledge base of if-then rules with an object-oriented fault tree representation. This allows the system to perform its task much faster and facilitates dynamic updating of the knowledge base in a changing diagnosis environment. Accessing the information contained in the objects is more efficient than performing a lookup operation on an indexed rule base. Additionally, the object-oriented fault trees can be easily updated to represent current system status. This paper describes the fault tree representation, the diagnosis algorithm extensions, and an example application of this system. Comparisons are made between the object-oriented fault tree knowledge structure solution and one implementation of a rule-based solution. Plans for future work on this system are also discussed.
Aktipis, C. Athena
2011-01-01
The evolution of cooperation through partner choice mechanisms is often thought to involve relatively complex cognitive abilities. Using agent-based simulations I model a simple partner choice rule, the ‘Walk Away’ rule, where individuals stay in groups that provide higher returns (by virtue of having more cooperators), and ‘Walk Away’ from groups providing low returns. Implementing this conditional movement rule in a public goods game leads to a number of interesting findings: 1) cooperators have a selective advantage when thresholds are high, corresponding to low tolerance for defectors, 2) high thresholds lead to high initial rates of movement and low final rates of movement (after selection), and 3) as cooperation is selected, the population undergoes a spatial transition from high migration (and a many small and ephemeral groups) to low migration (and large and stable groups). These results suggest that the very simple ‘Walk Away’ rule of leaving uncooperative groups can favor the evolution of cooperation, and that cooperation can evolve in populations in which individuals are able to move in response to local social conditions. A diverse array of organisms are able to leave degraded physical or social environments. The ubiquitous nature of conditional movement suggests that ‘Walk Away’ dynamics may play an important role in the evolution of social behavior in both cognitively complex and cognitively simple organisms. PMID:21666771
Adaptive inferential sensors based on evolving fuzzy models.
Angelov, Plamen; Kordon, Arthur
2010-04-01
A new technique to the design and use of inferential sensors in the process industry is proposed in this paper, which is based on the recently introduced concept of evolving fuzzy models (EFMs). They address the challenge that the modern process industry faces today, namely, to develop such adaptive and self-calibrating online inferential sensors that reduce the maintenance costs while keeping the high precision and interpretability/transparency. The proposed new methodology makes possible inferential sensors to recalibrate automatically, which reduces significantly the life-cycle efforts for their maintenance. This is achieved by the adaptive and flexible open-structure EFM used. The novelty of this paper lies in the following: (1) the overall concept of inferential sensors with evolving and self-developing structure from the data streams; (2) the new methodology for online automatic selection of input variables that are most relevant for the prediction; (3) the technique to detect automatically a shift in the data pattern using the age of the clusters (and fuzzy rules); (4) the online standardization technique used by the learning procedure of the evolving model; and (5) the application of this innovative approach to several real-life industrial processes from the chemical industry (evolving inferential sensors, namely, eSensors, were used for predicting the chemical properties of different products in The Dow Chemical Company, Freeport, TX). It should be noted, however, that the methodology and conclusions of this paper are valid for the broader area of chemical and process industries in general. The results demonstrate that well-interpretable and with-simple-structure inferential sensors can automatically be designed from the data stream in real time, which predict various process variables of interest. The proposed approach can be used as a basis for the development of a new generation of adaptive and evolving inferential sensors that can address the challenges of the modern advanced process industry.
Predicting mining activity with parallel genetic algorithms
Talaie, S.; Leigh, R.; Louis, S.J.; Raines, G.L.; Beyer, H.G.; O'Reilly, U.M.; Banzhaf, Arnold D.; Blum, W.; Bonabeau, C.; Cantu-Paz, E.W.; ,; ,
2005-01-01
We explore several different techniques in our quest to improve the overall model performance of a genetic algorithm calibrated probabilistic cellular automata. We use the Kappa statistic to measure correlation between ground truth data and data predicted by the model. Within the genetic algorithm, we introduce a new evaluation function sensitive to spatial correctness and we explore the idea of evolving different rule parameters for different subregions of the land. We reduce the time required to run a simulation from 6 hours to 10 minutes by parallelizing the code and employing a 10-node cluster. Our empirical results suggest that using the spatially sensitive evaluation function does indeed improve the performance of the model and our preliminary results also show that evolving different rule parameters for different regions tends to improve overall model performance. Copyright 2005 ACM.
Automatic information extraction from unstructured mammography reports using distributed semantics.
Gupta, Anupama; Banerjee, Imon; Rubin, Daniel L
2018-02-01
To date, the methods developed for automated extraction of information from radiology reports are mainly rule-based or dictionary-based, and, therefore, require substantial manual effort to build these systems. Recent efforts to develop automated systems for entity detection have been undertaken, but little work has been done to automatically extract relations and their associated named entities in narrative radiology reports that have comparable accuracy to rule-based methods. Our goal is to extract relations in a unsupervised way from radiology reports without specifying prior domain knowledge. We propose a hybrid approach for information extraction that combines dependency-based parse tree with distributed semantics for generating structured information frames about particular findings/abnormalities from the free-text mammography reports. The proposed IE system obtains a F 1 -score of 0.94 in terms of completeness of the content in the information frames, which outperforms a state-of-the-art rule-based system in this domain by a significant margin. The proposed system can be leveraged in a variety of applications, such as decision support and information retrieval, and may also easily scale to other radiology domains, since there is no need to tune the system with hand-crafted information extraction rules. Copyright © 2018 Elsevier Inc. All rights reserved.
Modeling reliability measurement of interface on information system: Towards the forensic of rules
NASA Astrophysics Data System (ADS)
Nasution, M. K. M.; Sitompul, Darwin; Harahap, Marwan
2018-02-01
Today almost all machines depend on the software. As a software and hardware system depends also on the rules that are the procedures for its use. If the procedure or program can be reliably characterized by involving the concept of graph, logic, and probability, then regulatory strength can also be measured accordingly. Therefore, this paper initiates an enumeration model to measure the reliability of interfaces based on the case of information systems supported by the rules of use by the relevant agencies. An enumeration model is obtained based on software reliability calculation.
For Spacious Skies: Self-Separation with "Autonomous Flight Rules" in US Domestic Airspace
NASA Technical Reports Server (NTRS)
Wing, David J.; Cotton, William B.
2011-01-01
Autonomous Flight Rules (AFR) are proposed as a new set of operating regulations in which aircraft navigate on tracks of their choice while self-separating from traffic and weather. AFR would exist alongside Instrument and Visual Flight Rules (IFR and VFR) as one of three available flight options for any appropriately trained and qualified operator with the necessary certified equipment. Historically, ground-based separation services evolved by necessity as aircraft began operating in the clouds and were unable to see each other. Today, technologies for global precision navigation, emerging airborne surveillance, and onboard computing enable traffic conflict management to be fully integrated with navigation procedures onboard the aircraft. By self-separating, aircraft can operate with more flexibility and fewer flight restrictions than are required when using ground-based separation. The AFR concept proposes a practical means in which self-separating aircraft could share the same airspace as IFR and VFR aircraft without disrupting the ongoing processes of Air Traffic Control. The paper discusses the context and motivation for implementing self-separation in US domestic airspace. It presents a historical perspective on separation, the proposed way forward in AFR, the rationale behind mixed operations, and the expected benefits of AFR for the airspace user community.
Evolution of cellular automata with memory: The Density Classification Task.
Stone, Christopher; Bull, Larry
2009-08-01
The Density Classification Task is a well known test problem for two-state discrete dynamical systems. For many years researchers have used a variety of evolutionary computation approaches to evolve solutions to this problem. In this paper, we investigate the evolvability of solutions when the underlying Cellular Automaton is augmented with a type of memory based on the Least Mean Square algorithm. To obtain high performance solutions using a simple non-hybrid genetic algorithm, we design a novel representation based on the ternary representation used for Learning Classifier Systems. The new representation is found able to produce superior performance to the bit string traditionally used for representing Cellular automata. Moreover, memory is shown to improve evolvability of solutions and appropriate memory settings are able to be evolved as a component part of these solutions.
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1990-01-01
Although powerful computers have allowed complex physical and manmade hardware systems to be modeled successfully, we have encountered persistent problems with the reliability of computer models for systems involving human learning, human action, and human organizations. This is not a misfortune; unlike physical and manmade systems, human systems do not operate under a fixed set of laws. The rules governing the actions allowable in the system can be changed without warning at any moment, and can evolve over time. That the governing laws are inherently unpredictable raises serious questions about the reliability of models when applied to human situations. In these domains, computers are better used, not for prediction and planning, but for aiding humans. Examples are systems that help humans speculate about possible futures, offer advice about possible actions in a domain, systems that gather information from the networks, and systems that track and support work flows in organizations.
Object-Driven and Temporal Action Rules Mining
ERIC Educational Resources Information Center
Hajja, Ayman
2013-01-01
In this thesis, I present my complete research work in the field of action rules, more precisely object-driven and temporal action rules. The drive behind the introduction of object-driven and temporally based action rules is to bring forth an adapted approach to extract action rules from a subclass of systems that have a specific nature, in which…
Policy-Based Middleware for QoS Management and Signaling in the Evolved Packet System
NASA Astrophysics Data System (ADS)
Good, Richard; Gouveia, Fabricio; Magedanz, Thomas; Ventura, Neco
The 3GPP are currently finalizing their Evolved Packet System (EPS) with the Evolved Packet Core (EPC) central to this framework. The EPC is a simplified, flat, all IP-based architecture that supports mobility between heterogeneous access networks and incorporates an evolved QoS concept based on the 3GPP Policy Control and Charging (PCC) framework. The IP Multimedia Subsystem (IMS) is an IP service element within the EPS, introduced for the rapid provisioning of innovative multimedia services. The evolved PCC framework extends the scope of operation and defines new interactions - in particular the S9 reference point is introduced to facilitate inter-domain PCC communication. This paper proposes an enhancement to the IMS/PCC framework that uses SIP routing information to discover signaling and media paths. This mechanism uses standardized IMS/PCC operations and allows applications to effectively issue resource requests from their home domain enabling QoS-connectivity across multiple domains. Because the mechanism operates at the service control layer it does not require any significant transport layer modifications or the sharing of potentially sensitive internal topology information. The evolved PCC architecture and inter-domain route discovery mechanisms were implemented in an evaluation testbed and performed favorably without adversely effecting end user experience.
Discovering Fine-grained Sentiment in Suicide Notes
Wang, Wenbo; Chen, Lu; Tan, Ming; Wang, Shaojun; Sheth, Amit P.
2012-01-01
This paper presents our solution for the i2b2 sentiment classification challenge. Our hybrid system consists of machine learning and rule-based classifiers. For the machine learning classifier, we investigate a variety of lexical, syntactic and knowledge-based features, and show how much these features contribute to the performance of the classifier through experiments. For the rule-based classifier, we propose an algorithm to automatically extract effective syntactic and lexical patterns from training examples. The experimental results show that the rule-based classifier outperforms the baseline machine learning classifier using unigram features. By combining the machine learning classifier and the rule-based classifier, the hybrid system gains a better trade-off between precision and recall, and yields the highest micro-averaged F-measure (0.5038), which is better than the mean (0.4875) and median (0.5027) micro-average F-measures among all participating teams. PMID:22879770
NASA Astrophysics Data System (ADS)
Inkoom, J. N.; Nyarko, B. K.
2014-12-01
The integration of geographic information systems (GIS) and agent-based modelling (ABM) can be an efficient tool to improve spatial planning practices. This paper utilizes GIS and ABM approaches to simulate spatial growth patterns of settlement structures in Shama. A preliminary household survey on residential location decision-making choice served as the behavioural rule for household agents in the model. Physical environment properties of the model were extracted from a 2005 image implemented in NetLogo. The resulting growth pattern model was compared with empirical growth patterns to ascertain the model's accuracy. The paper establishes that the development of unplanned structures and its evolving structural pattern are a function of land price, proximity to economic centres, household economic status and location decision-making patterns. The application of the proposed model underlines its potential for integration into urban planning policies and practices, and for understanding residential decision-making processes in emerging cities in developing countries. Key Words: GIS; Agent-based modelling; Growth patterns; NetLogo; Location decision making; Computational Intelligence.
77 FR 55371 - System Safety Program
Federal Register 2010, 2011, 2012, 2013, 2014
2012-09-07
...-based rule and FRA seeks comments on all aspects of the proposed rule. An SSP would be implemented by a... SSP would be the risk-based hazard management program and risk-based hazard analysis. A properly implemented risk-based hazard management program and risk-based hazard analysis would identify the hazards and...
Revisiting Robustness and Evolvability: Evolution in Weighted Genotype Spaces
Partha, Raghavendran; Raman, Karthik
2014-01-01
Robustness and evolvability are highly intertwined properties of biological systems. The relationship between these properties determines how biological systems are able to withstand mutations and show variation in response to them. Computational studies have explored the relationship between these two properties using neutral networks of RNA sequences (genotype) and their secondary structures (phenotype) as a model system. However, these studies have assumed every mutation to a sequence to be equally likely; the differences in the likelihood of the occurrence of various mutations, and the consequence of probabilistic nature of the mutations in such a system have previously been ignored. Associating probabilities to mutations essentially results in the weighting of genotype space. We here perform a comparative analysis of weighted and unweighted neutral networks of RNA sequences, and subsequently explore the relationship between robustness and evolvability. We show that assuming an equal likelihood for all mutations (as in an unweighted network), underestimates robustness and overestimates evolvability of a system. In spite of discarding this assumption, we observe that a negative correlation between sequence (genotype) robustness and sequence evolvability persists, and also that structure (phenotype) robustness promotes structure evolvability, as observed in earlier studies using unweighted networks. We also study the effects of base composition bias on robustness and evolvability. Particularly, we explore the association between robustness and evolvability in a sequence space that is AU-rich – sequences with an AU content of 80% or higher, compared to a normal (unbiased) sequence space. We find that evolvability of both sequences and structures in an AU-rich space is lesser compared to the normal space, and robustness higher. We also observe that AU-rich populations evolving on neutral networks of phenotypes, can access less phenotypic variation compared to normal populations evolving on neutral networks. PMID:25390641
Rule-based navigation control design for autonomous flight
NASA Astrophysics Data System (ADS)
Contreras, Hugo; Bassi, Danilo
2008-04-01
This article depicts a navigation control system design that is based on a set of rules in order to follow a desired trajectory. The full control of the aircraft considered here comprises: a low level stability control loop, based on classic PID controller and the higher level navigation whose main job is to exercise lateral control (course) and altitude control, trying to follow a desired trajectory. The rules and PID gains were adjusted systematically according to the result of flight simulation. In spite of its simplicity, the rule-based navigation control proved to be robust, even with big perturbation, like crossing winds.
Evolving RBF neural networks for adaptive soft-sensor design.
Alexandridis, Alex
2013-12-01
This work presents an adaptive framework for building soft-sensors based on radial basis function (RBF) neural network models. The adaptive fuzzy means algorithm is utilized in order to evolve an RBF network, which approximates the unknown system based on input-output data from it. The methodology gradually builds the RBF network model, based on two separate levels of adaptation: On the first level, the structure of the hidden layer is modified by adding or deleting RBF centers, while on the second level, the synaptic weights are adjusted with the recursive least squares with exponential forgetting algorithm. The proposed approach is tested on two different systems, namely a simulated nonlinear DC Motor and a real industrial reactor. The results show that the produced soft-sensors can be successfully applied to model the two nonlinear systems. A comparison with two different adaptive modeling techniques, namely a dynamic evolving neural-fuzzy inference system (DENFIS) and neural networks trained with online backpropagation, highlights the advantages of the proposed methodology.
Schroeder, Julie; Guin, Cecile C; Pogue, Rene; Bordelon, Danna
2006-10-01
Providing an effective defense for individuals charged with capital crimes requires a diligent, thorough investigation by a mitigation specialist. However, research suggests that mitigation often plays a small role in the decision for life. Jurors often make sentencing decisions prematurely, basing those decisions on their personal reactions to the defendant (for example, fear, anger), their confusion about the rules of law, and their lack of understanding regarding their role and responsibilities. This article proposes an evidence-based conceptual model of the complicating problems surrounding mitigation practice and a focused discussion about how traditional social work mitigation strategies might be evolved to a set of best practices that more effectively ensure jurors' careful consideration of mitigation evidence.
Advances in the mechanism and understanding of site-selective noncanonical amino acid incorporation.
Antonczak, Alicja K; Morris, Josephine; Tippmann, Eric M
2011-08-01
There are many approaches to introduce non-native functionality into proteins either translationally or post-translationally. When a noncanonical amino acid (NAA) is incorporated translationally, the host organism's existing translational machinery is relied upon to insert the amino acid by the same well-established mechanisms used by the host to achieve high fidelity insertion of its canonical amino acids. Research into the in vivo incorporation of NAAs has typically concentrated on evolving or engineering aminoacyl tRNA synthetases (aaRSs); however, new studies have increasingly focused on other members of the translational apparatus, for example entire ribosomes, in attempts to increase the fidelity and efficiency of incorporation of ever more structurally diverse NAAs. As the biochemical methods of NAA systems increase in complexity, it is informative to ask whether the 'rules' for canonical translation (i.e. aaRSs, tRNA, ribosomes, elongation factors, amino acid uptake, and metabolism) hold for NAA systems, or whether new rules are warranted. Here, recent advances in introducing novel chemical functionality into proteins are highlighted. Copyright © 2011 Elsevier Ltd. All rights reserved.
Incremental Learning of Context Free Grammars by Parsing-Based Rule Generation and Rule Set Search
NASA Astrophysics Data System (ADS)
Nakamura, Katsuhiko; Hoshina, Akemi
This paper discusses recent improvements and extensions in Synapse system for inductive inference of context free grammars (CFGs) from sample strings. Synapse uses incremental learning, rule generation based on bottom-up parsing, and the search for rule sets. The form of production rules in the previous system is extended from Revised Chomsky Normal Form A→βγ to Extended Chomsky Normal Form, which also includes A→B, where each of β and γ is either a terminal or nonterminal symbol. From the result of bottom-up parsing, a rule generation mechanism synthesizes minimum production rules required for parsing positive samples. Instead of inductive CYK algorithm in the previous version of Synapse, the improved version uses a novel rule generation method, called ``bridging,'' which bridges the lacked part of the derivation tree for the positive string. The improved version also employs a novel search strategy, called serial search in addition to minimum rule set search. The synthesis of grammars by the serial search is faster than the minimum set search in most cases. On the other hand, the size of the generated CFGs is generally larger than that by the minimum set search, and the system can find no appropriate grammar for some CFL by the serial search. The paper shows experimental results of incremental learning of several fundamental CFGs and compares the methods of rule generation and search strategies.
Discontinuous categories affect information-integration but not rule-based category learning.
Maddox, W Todd; Filoteo, J Vincent; Lauritzen, J Scott; Connally, Emily; Hejl, Kelli D
2005-07-01
Three experiments were conducted that provide a direct examination of within-category discontinuity manipulations on the implicit, procedural-based learning and the explicit, hypothesis-testing systems proposed in F. G. Ashby, L. A. Alfonso-Reese, A. U. Turken, and E. M. Waldron's (1998) competition between verbal and implicit systems model. Discontinuous categories adversely affected information-integration but not rule-based category learning. Increasing the magnitude of the discontinuity did not lead to a significant decline in performance. The distance to the bound provides a reasonable description of the generalization profile associated with the hypothesis-testing system, whereas the distance to the bound plus the distance to the trained response region provides a reasonable description of the generalization profile associated with the procedural-based learning system. These results suggest that within-category discontinuity differentially impacts information-integration but not rule-based category learning and provides information regarding the detailed processing characteristics of each category learning system. ((c) 2005 APA, all rights reserved).
Automated process control for plasma etching
NASA Astrophysics Data System (ADS)
McGeown, Margaret; Arshak, Khalil I.; Murphy, Eamonn
1992-06-01
This paper discusses the development and implementation of a rule-based system which assists in providing automated process control for plasma etching. The heart of the system is to establish a correspondence between a particular data pattern -- sensor or data signals -- and one or more modes of failure, i.e., a data-driven monitoring approach. The objective of this rule based system, PLETCHSY, is to create a program combining statistical process control (SPC) and fault diagnosis to help control a manufacturing process which varies over time. This can be achieved by building a process control system (PCS) with the following characteristics. A facility to monitor the performance of the process by obtaining and analyzing the data relating to the appropriate process variables. Process sensor/status signals are input into an SPC module. If trends are present, the SPC module outputs the last seven control points, a pattern which is represented by either regression or scoring. The pattern is passed to the rule-based module. When the rule-based system recognizes a pattern, it starts the diagnostic process using the pattern. If the process is considered to be going out of control, advice is provided about actions which should be taken to bring the process back into control.
Evolutionary potential games on lattices
NASA Astrophysics Data System (ADS)
Szabó, György; Borsos, István
2016-04-01
Game theory provides a general mathematical background to study the effect of pair interactions and evolutionary rules on the macroscopic behavior of multi-player games where players with a finite number of strategies may represent a wide scale of biological objects, human individuals, or even their associations. In these systems the interactions are characterized by matrices that can be decomposed into elementary matrices (games) and classified into four types. The concept of decomposition helps the identification of potential games and also the evaluation of the potential that plays a crucial role in the determination of the preferred Nash equilibrium, and defines the Boltzmann distribution towards which these systems evolve for suitable types of dynamical rules. This survey draws parallel between the potential games and the kinetic Ising type models which are investigated for a wide scale of connectivity structures. We discuss briefly the applicability of the tools and concepts of statistical physics and thermodynamics. Additionally the general features of ordering phenomena, phase transitions and slow relaxations are outlined and applied to evolutionary games. The discussion extends to games with three or more strategies. Finally we discuss what happens when the system is weakly driven out of the "equilibrium state" by adding non-potential components representing games of cyclic dominance.
A novel procedure for the identification of chaos in complex biological systems
NASA Astrophysics Data System (ADS)
Bazeia, D.; Pereira, M. B. P. N.; Brito, A. V.; Oliveira, B. F. De; Ramos, J. G. G. S.
2017-03-01
We demonstrate the presence of chaos in stochastic simulations that are widely used to study biodiversity in nature. The investigation deals with a set of three distinct species that evolve according to the standard rules of mobility, reproduction and predation, with predation following the cyclic rules of the popular rock, paper and scissors game. The study uncovers the possibility to distinguish between time evolutions that start from slightly different initial states, guided by the Hamming distance which heuristically unveils the chaotic behavior. The finding opens up a quantitative approach that relates the correlation length to the average density of maxima of a typical species, and an ensemble of stochastic simulations is implemented to support the procedure. The main result of the work shows how a single and simple experimental realization that counts the density of maxima associated with the chaotic evolution of the species serves to infer its correlation length. We use the result to investigate others distinct complex systems, one dealing with a set of differential equations that can be used to model a diversity of natural and artificial chaotic systems, and another one, focusing on the ocean water level.
Resource Allocation Planning Helper (RALPH): Lessons learned
NASA Technical Reports Server (NTRS)
Durham, Ralph; Reilly, Norman B.; Springer, Joe B.
1990-01-01
The current task of Resource Allocation Process includes the planning and apportionment of JPL's Ground Data System composed of the Deep Space Network and Mission Control and Computing Center facilities. The addition of the data driven, rule based planning system, RALPH, has expanded the planning horizon from 8 weeks to 10 years and has resulted in large labor savings. Use of the system has also resulted in important improvements in science return through enhanced resource utilization. In addition, RALPH has been instrumental in supporting rapid turn around for an increased volume of special what if studies. The status of RALPH is briefly reviewed and important lessons learned from the creation of an highly functional design team are focused on through an evolutionary design and implementation period in which an AI shell was selected, prototyped, and ultimately abandoned, and through the fundamental changes to the very process that spawned the tool kit. Principal topics include proper integration of software tools within the planning environment, transition from prototype to delivered to delivered software, changes in the planning methodology as a result of evolving software capabilities and creation of the ability to develop and process generic requirements to allow planning flexibility.
Ermer, Elsa; Guerin, Scott A; Cosmides, Leda; Tooby, John; Miller, Michael B
2006-01-01
Baron-Cohen (1995) proposed that the theory of mind (ToM) inference system evolved to promote strategic social interaction. Social exchange--a form of co-operation for mutual benefit--involves strategic social interaction and requires ToM inferences about the contents of other individuals' mental states, especially their desires, goals, and intentions. There are behavioral and neuropsychological dissociations between reasoning about social exchange and reasoning about equivalent problems tapping other, more general content domains. It has therefore been proposed that social exchange behavior is regulated by social contract algorithms: a domain-specific inference system that is functionally specialized for reasoning about social exchange. We report an fMRI study using the Wason selection task that provides further support for this hypothesis. Precautionary rules share so many properties with social exchange rules--they are conditional, deontic, and involve subjective utilities--that most reasoning theories claim they are processed by the same neurocomputational machinery. Nevertheless, neuroimaging shows that reasoning about social exchange activates brain areas not activated by reasoning about precautionary rules, and vice versa. As predicted, neural correlates of ToM (anterior and posterior temporal cortex) were activated when subjects interpreted social exchange rules, but not precautionary rules (where ToM inferences are unnecessary). We argue that the interaction between ToM and social contract algorithms can be reciprocal: social contract algorithms requires ToM inferences, but their functional logic also allows ToM inferences to be made. By considering interactions between ToM in the narrower sense (belief-desire reasoning) and all the social inference systems that create the logic of human social interaction--ones that enable as well as use inferences about the content of mental states--a broader conception of ToM may emerge: a computational model embodying a Theory of Human Nature (ToHN).
Prediction of drug synergy in cancer using ensemble-based machine learning techniques
NASA Astrophysics Data System (ADS)
Singh, Harpreet; Rana, Prashant Singh; Singh, Urvinder
2018-04-01
Drug synergy prediction plays a significant role in the medical field for inhibiting specific cancer agents. It can be developed as a pre-processing tool for therapeutic successes. Examination of different drug-drug interaction can be done by drug synergy score. It needs efficient regression-based machine learning approaches to minimize the prediction errors. Numerous machine learning techniques such as neural networks, support vector machines, random forests, LASSO, Elastic Nets, etc., have been used in the past to realize requirement as mentioned above. However, these techniques individually do not provide significant accuracy in drug synergy score. Therefore, the primary objective of this paper is to design a neuro-fuzzy-based ensembling approach. To achieve this, nine well-known machine learning techniques have been implemented by considering the drug synergy data. Based on the accuracy of each model, four techniques with high accuracy are selected to develop ensemble-based machine learning model. These models are Random forest, Fuzzy Rules Using Genetic Cooperative-Competitive Learning method (GFS.GCCL), Adaptive-Network-Based Fuzzy Inference System (ANFIS) and Dynamic Evolving Neural-Fuzzy Inference System method (DENFIS). Ensembling is achieved by evaluating the biased weighted aggregation (i.e. adding more weights to the model with a higher prediction score) of predicted data by selected models. The proposed and existing machine learning techniques have been evaluated on drug synergy score data. The comparative analysis reveals that the proposed method outperforms others in terms of accuracy, root mean square error and coefficient of correlation.
NASA Astrophysics Data System (ADS)
Ferrando, N.; Gosálvez, M. A.; Cerdá, J.; Gadea, R.; Sato, K.
2011-03-01
Presently, dynamic surface-based models are required to contain increasingly larger numbers of points and to propagate them over longer time periods. For large numbers of surface points, the octree data structure can be used as a balance between low memory occupation and relatively rapid access to the stored data. For evolution rules that depend on neighborhood states, extended simulation periods can be obtained by using simplified atomistic propagation models, such as the Cellular Automata (CA). This method, however, has an intrinsic parallel updating nature and the corresponding simulations are highly inefficient when performed on classical Central Processing Units (CPUs), which are designed for the sequential execution of tasks. In this paper, a series of guidelines is presented for the efficient adaptation of octree-based, CA simulations of complex, evolving surfaces into massively parallel computing hardware. A Graphics Processing Unit (GPU) is used as a cost-efficient example of the parallel architectures. For the actual simulations, we consider the surface propagation during anisotropic wet chemical etching of silicon as a computationally challenging process with a wide-spread use in microengineering applications. A continuous CA model that is intrinsically parallel in nature is used for the time evolution. Our study strongly indicates that parallel computations of dynamically evolving surfaces simulated using CA methods are significantly benefited by the incorporation of octrees as support data structures, substantially decreasing the overall computational time and memory usage.
A Bankruptcy Problem Approach to Load-shedding in Multiagent-based Microgrid Operation
Kim, Hak-Man; Kinoshita, Tetsuo; Lim, Yujin; Kim, Tai-Hoon
2010-01-01
A microgrid is composed of distributed power generation systems (DGs), distributed energy storage devices (DSs), and loads. To maintain a specific frequency in the islanded mode as an important requirement, the control of DGs’ output and charge action of DSs are used in supply surplus conditions and load-shedding and discharge action of DSs are used in supply shortage conditions. Recently, multiagent systems for autonomous microgrid operation have been studied. Especially, load-shedding, which is intentional reduction of electricity use, is a critical problem in islanded microgrid operation based on the multiagent system. Therefore, effective schemes for load-shedding are required. Meanwhile, the bankruptcy problem deals with dividing short resources among multiple agents. In order to solve the bankruptcy problem, division rules, such as the constrained equal awards rule (CEA), the constrained equal losses rule (CEL), and the random arrival rule (RA), have been used. In this paper, we approach load-shedding as a bankruptcy problem. We compare load-shedding results by above-mentioned rules in islanded microgrid operation based on wireless sensor network (WSN) as the communication link for an agent’s interactions. PMID:22163386
A bankruptcy problem approach to load-shedding in multiagent-based microgrid operation.
Kim, Hak-Man; Kinoshita, Tetsuo; Lim, Yujin; Kim, Tai-Hoon
2010-01-01
A microgrid is composed of distributed power generation systems (DGs), distributed energy storage devices (DSs), and loads. To maintain a specific frequency in the islanded mode as an important requirement, the control of DGs' output and charge action of DSs are used in supply surplus conditions and load-shedding and discharge action of DSs are used in supply shortage conditions. Recently, multiagent systems for autonomous microgrid operation have been studied. Especially, load-shedding, which is intentional reduction of electricity use, is a critical problem in islanded microgrid operation based on the multiagent system. Therefore, effective schemes for load-shedding are required. Meanwhile, the bankruptcy problem deals with dividing short resources among multiple agents. In order to solve the bankruptcy problem, division rules, such as the constrained equal awards rule (CEA), the constrained equal losses rule (CEL), and the random arrival rule (RA), have been used. In this paper, we approach load-shedding as a bankruptcy problem. We compare load-shedding results by above-mentioned rules in islanded microgrid operation based on wireless sensor network (WSN) as the communication link for an agent's interactions.
Kashyap, Vipul; Morales, Alfredo; Hongsermeier, Tonya
2006-01-01
We present an approach and architecture for implementing scalable and maintainable clinical decision support at the Partners HealthCare System. The architecture integrates a business rules engine that executes declarative if-then rules stored in a rule-base referencing objects and methods in a business object model. The rules engine executes object methods by invoking services implemented on the clinical data repository. Specialized inferences that support classification of data and instances into classes are identified and an approach to implement these inferences using an OWL based ontology engine is presented. Alternative representations of these specialized inferences as if-then rules or OWL axioms are explored and their impact on the scalability and maintenance of the system is presented. Architectural alternatives for integration of clinical decision support functionality with the invoking application and the underlying clinical data repository; and their associated trade-offs are discussed and presented.
Yang, Jin; Hlavacek, William S.
2011-01-01
Rule-based models, which are typically formulated to represent cell signaling systems, can now be simulated via various network-free simulation methods. In a network-free method, reaction rates are calculated for rules that characterize molecular interactions, and these rule rates, which each correspond to the cumulative rate of all reactions implied by a rule, are used to perform a stochastic simulation of reaction kinetics. Network-free methods, which can be viewed as generalizations of Gillespie’s method, are so named because these methods do not require that a list of individual reactions implied by a set of rules be explicitly generated, which is a requirement of other methods for simulating rule-based models. This requirement is impractical for rule sets that imply large reaction networks (i.e., long lists of individual reactions), as reaction network generation is expensive. Here, we compare the network-free simulation methods implemented in RuleMonkey and NFsim, general-purpose software tools for simulating rule-based models encoded in the BioNetGen language. The method implemented in NFsim uses rejection sampling to correct overestimates of rule rates, which introduces null events (i.e., time steps that do not change the state of the system being simulated). The method implemented in RuleMonkey uses iterative updates to track rule rates exactly, which avoids null events. To ensure a fair comparison of the two methods, we developed implementations of the rejection and rejection-free methods specific to a particular class of kinetic models for multivalent ligand-receptor interactions. These implementations were written with the intention of making them as much alike as possible, minimizing the contribution of irrelevant coding differences to efficiency differences. Simulation results show that performance of the rejection method is equal to or better than that of the rejection-free method over wide parameter ranges. However, when parameter values are such that ligand-induced aggregation of receptors yields a large connected receptor cluster, the rejection-free method is more efficient. PMID:21832806
Budgeting in health care systems.
Maynard, A
1984-01-01
During the last decade there has been a recognition that all health care systems, public and private, are characterised by perverse incentives (especially moral hazard and third party pays) which generate inefficiency in the use of scarce economic resources. Inefficiency is unethical: doctors who use resources inefficiently deprive potential patients of care from which they could benefit. To eradicate unethical and inefficient practices two economic rules have to be followed: (i) no service should be provided if its total costs exceed its total benefits; (ii) if total benefits exceed total costs, the level of provision should be at that level at which the additional input cost (marginal cost) is equal to the additional benefits (marginal benefit). This efficiency test can be applied to health care systems, their component parts and the individuals (especially doctors) who control resource allocation within them. Unfortunately, all health care systems neither generate this relevant decision making data nor are they flexible enough to use it to affect health care decisions. There are two basic varieties of budgeting system: resource based and production targeted. The former generates obsession with cash limits and too little regard of the benefits, particularly at the margins, of alternative patterns of resource allocation. The latter generates undue attention to the production of processes of care and scant regard for costs, especially at the margins. Consequently, one set of budget rules may lead to cost containment regardless of benefits and the other set of budget rules may lead to output maximization regardless of costs. To close this circle of inefficiency it is necessary to evolve market-like structures. To do this a system of client group (defined broadly across all existing activities public and private) budgets is advocated with an identification of the budget holder who has the capacity to shift resources and seek out cost effective policies. Negotiated output targets with defined budgets and incentives for decision makers to economise in their use of resources are being incorporated into experiments in the health care systems of Western Europe and the United States. Undue optimism about the success of these experiments must be avoided because these problems have existed in the West and in the Soviet bloc for decades and efficient solutions are noticeable by their absence.
Expert system for web based collaborative CAE
NASA Astrophysics Data System (ADS)
Hou, Liang; Lin, Zusheng
2006-11-01
An expert system for web based collaborative CAE was developed based on knowledge engineering, relational database and commercial FEA (Finite element analysis) software. The architecture of the system was illustrated. In this system, the experts' experiences, theories and typical examples and other related knowledge, which will be used in the stage of pre-process in FEA, were categorized into analysis process and object knowledge. Then, the integrated knowledge model based on object-oriented method and rule based method was described. The integrated reasoning process based on CBR (case based reasoning) and rule based reasoning was presented. Finally, the analysis process of this expert system in web based CAE application was illustrated, and an analysis example of a machine tool's column was illustrated to prove the validity of the system.
NASA Technical Reports Server (NTRS)
Lafuse, Sharon A.
1991-01-01
The paper describes the Shuttle Leak Management Expert System (SLMES), a preprototype expert system developed to enable the ECLSS subsystem manager to analyze subsystem anomalies and to formulate flight procedures based on flight data. The SLMES combines the rule-based expert system technology with the traditional FORTRAN-based software into an integrated system. SLMES analyzes the data using rules, and, when it detects a problem that requires simulation, it sets up the input for the FORTRAN-based simulation program ARPCS2AT2, which predicts the cabin total pressure and composition as a function of time. The program simulates the pressure control system, the crew oxygen masks, the airlock repress/depress valves, and the leakage. When the simulation has completed, other SLMES rules are triggered to examine the results of simulation contrary to flight data and to suggest methods for correcting the problem. Results are then presented in form of graphs and tables.
2016-08-05
This final rule updates the payment rates used under the prospective payment system (PPS) for skilled nursing facilities (SNFs) for fiscal year (FY) 2017. In addition, it specifies a potentially preventable readmission measure for the Skilled Nursing Facility Value-Based Purchasing Program (SNF VBP), and implements requirements for that program, including performance standards, a scoring methodology, and a review and correction process for performance information to be made public, aimed at implementing value-based purchasing for SNFs. Additionally, this final rule includes additional polices and measures in the Skilled Nursing Facility Quality Reporting Program (SNF QRP). This final rule also responds to comments on the SNF Payment Models Research (PMR) project.
Model-based Systems Engineering: Creation and Implementation of Model Validation Rules for MOS 2.0
NASA Technical Reports Server (NTRS)
Schmidt, Conrad K.
2013-01-01
Model-based Systems Engineering (MBSE) is an emerging modeling application that is used to enhance the system development process. MBSE allows for the centralization of project and system information that would otherwise be stored in extraneous locations, yielding better communication, expedited document generation and increased knowledge capture. Based on MBSE concepts and the employment of the Systems Modeling Language (SysML), extremely large and complex systems can be modeled from conceptual design through all system lifecycles. The Operations Revitalization Initiative (OpsRev) seeks to leverage MBSE to modernize the aging Advanced Multi-Mission Operations Systems (AMMOS) into the Mission Operations System 2.0 (MOS 2.0). The MOS 2.0 will be delivered in a series of conceptual and design models and documents built using the modeling tool MagicDraw. To ensure model completeness and cohesiveness, it is imperative that the MOS 2.0 models adhere to the specifications, patterns and profiles of the Mission Service Architecture Framework, thus leading to the use of validation rules. This paper outlines the process by which validation rules are identified, designed, implemented and tested. Ultimately, these rules provide the ability to maintain model correctness and synchronization in a simple, quick and effective manner, thus allowing the continuation of project and system progress.
Automatic rule generation for high-level vision
NASA Technical Reports Server (NTRS)
Rhee, Frank Chung-Hoon; Krishnapuram, Raghu
1992-01-01
Many high-level vision systems use rule-based approaches to solving problems such as autonomous navigation and image understanding. The rules are usually elaborated by experts. However, this procedure may be rather tedious. In this paper, we propose a method to generate such rules automatically from training data. The proposed method is also capable of filtering out irrelevant features and criteria from the rules.
Spiking Neural P Systems with Communication on Request.
Pan, Linqiang; Păun, Gheorghe; Zhang, Gexiang; Neri, Ferrante
2017-12-01
Spiking Neural [Formula: see text] Systems are Neural System models characterized by the fact that each neuron mimics a biological cell and the communication between neurons is based on spikes. In the Spiking Neural [Formula: see text] systems investigated so far, the application of evolution rules depends on the contents of a neuron (checked by means of a regular expression). In these [Formula: see text] systems, a specified number of spikes are consumed and a specified number of spikes are produced, and then sent to each of the neurons linked by a synapse to the evolving neuron. [Formula: see text]In the present work, a novel communication strategy among neurons of Spiking Neural [Formula: see text] Systems is proposed. In the resulting models, called Spiking Neural [Formula: see text] Systems with Communication on Request, the spikes are requested from neighboring neurons, depending on the contents of the neuron (still checked by means of a regular expression). Unlike the traditional Spiking Neural [Formula: see text] systems, no spikes are consumed or created: the spikes are only moved along synapses and replicated (when two or more neurons request the contents of the same neuron). [Formula: see text]The Spiking Neural [Formula: see text] Systems with Communication on Request are proved to be computationally universal, that is, equivalent with Turing machines as long as two types of spikes are used. Following this work, further research questions are listed to be open problems.
Parallel inferencing method and apparatus for rule-based expert systems
NASA Technical Reports Server (NTRS)
Schwuttke, Ursula M. (Inventor); Moldovan, Dan (Inventor); Kuo, Steve (Inventor)
1993-01-01
The invention analyzes areas of conditions with an expert knowledge base of rules using plural separate nodes which fire respective rules of said knowledge base, each of said rules upon being fired altering certain of said conditions predicated upon the existence of other said conditions. The invention operates by constructing a P representation of all pairs of said rules which are input dependent or output dependent; constructing a C representation of all pairs of said rules which are communication dependent or input dependent; determining which of the rules are ready to fire by matching the predicate conditions of each rule with the conditions of said set; enabling said node means to simultaneously fire those of the rules ready to fire which are defined by said P representation as being free of input and output dependencies; and communicating from each node enabled by said enabling step the alteration of conditions by the corresponding rule to other nodes whose rules are defined by said C matrix means as being input or communication dependent upon the rule of said enabled node.
Engineering posttranslational proofreading to discriminate nonstandard amino acids
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kunjapur, Aditya M.; Stork, Devon A.; Kuru, Erkin
Accurate incorporation of nonstandard amino acids (nsAAs) is central for genetic code expansion to increase the chemical diversity of proteins. However, aminoacyl-tRNA synthetases are polyspecific and facilitate incorporation of multiple nsAAs. We investigated and repurposed a natural protein degradation pathway, the N-end rule pathway, to devise an innovative system for rapid assessment of the accuracy of nsAA incorporation. Using this tool to monitor incorporation of the nsAA biphenylalanine allowed the identification of tyrosyl-tRNA synthetase (TyrRS) variants with improved amino acid specificity. The evolved TyrRS variants enhanced our ability to contain unwanted proliferation of genetically modified organisms. In conclusion, this posttranslationalmore » proofreading system will aid the evolution of orthogonal translation systems for specific incorporation of diverse nsAAs.« less
Engineering posttranslational proofreading to discriminate nonstandard amino acids
Kunjapur, Aditya M.; Stork, Devon A.; Kuru, Erkin; ...
2018-01-04
Accurate incorporation of nonstandard amino acids (nsAAs) is central for genetic code expansion to increase the chemical diversity of proteins. However, aminoacyl-tRNA synthetases are polyspecific and facilitate incorporation of multiple nsAAs. We investigated and repurposed a natural protein degradation pathway, the N-end rule pathway, to devise an innovative system for rapid assessment of the accuracy of nsAA incorporation. Using this tool to monitor incorporation of the nsAA biphenylalanine allowed the identification of tyrosyl-tRNA synthetase (TyrRS) variants with improved amino acid specificity. The evolved TyrRS variants enhanced our ability to contain unwanted proliferation of genetically modified organisms. In conclusion, this posttranslationalmore » proofreading system will aid the evolution of orthogonal translation systems for specific incorporation of diverse nsAAs.« less
An XML-Based Manipulation and Query Language for Rule-Based Information
NASA Astrophysics Data System (ADS)
Mansour, Essam; Höpfner, Hagen
Rules are utilized to assist in the monitoring process that is required in activities, such as disease management and customer relationship management. These rules are specified according to the application best practices. Most of research efforts emphasize on the specification and execution of these rules. Few research efforts focus on managing these rules as one object that has a management life-cycle. This paper presents our manipulation and query language that is developed to facilitate the maintenance of this object during its life-cycle and to query the information contained in this object. This language is based on an XML-based model. Furthermore, we evaluate the model and language using a prototype system applied to a clinical case study.
Influence of dispatching rules on average production lead time for multi-stage production systems.
Hübl, Alexander; Jodlbauer, Herbert; Altendorfer, Klaus
2013-08-01
In this paper the influence of different dispatching rules on the average production lead time is investigated. Two theorems based on covariance between processing time and production lead time are formulated and proved theoretically. Theorem 1 links the average production lead time to the "processing time weighted production lead time" for the multi-stage production systems analytically. The influence of different dispatching rules on average lead time, which is well known from simulation and empirical studies, can be proved theoretically in Theorem 2 for a single stage production system. A simulation study is conducted to gain more insight into the influence of dispatching rules on average production lead time in a multi-stage production system. We find that the "processing time weighted average production lead time" for a multi-stage production system is not invariant of the applied dispatching rule and can be used as a dispatching rule independent indicator for single-stage production systems.
Stem cell transplantation as a dynamical system: are clinical outcomes deterministic?
Toor, Amir A; Kobulnicky, Jared D; Salman, Salman; Roberts, Catherine H; Jameson-Lee, Max; Meier, Jeremy; Scalora, Allison; Sheth, Nihar; Koparde, Vishal; Serrano, Myrna; Buck, Gregory A; Clark, William B; McCarty, John M; Chung, Harold M; Manjili, Masoud H; Sabo, Roy T; Neale, Michael C
2014-01-01
Outcomes in stem cell transplantation (SCT) are modeled using probability theory. However, the clinical course following SCT appears to demonstrate many characteristics of dynamical systems, especially when outcomes are considered in the context of immune reconstitution. Dynamical systems tend to evolve over time according to mathematically determined rules. Characteristically, the future states of the system are predicated on the states preceding them, and there is sensitivity to initial conditions. In SCT, the interaction between donor T cells and the recipient may be considered as such a system in which, graft source, conditioning, and early immunosuppression profoundly influence immune reconstitution over time. This eventually determines clinical outcomes, either the emergence of tolerance or the development of graft versus host disease. In this paper, parallels between SCT and dynamical systems are explored and a conceptual framework for developing mathematical models to understand disparate transplant outcomes is proposed.
Stem Cell Transplantation as a Dynamical System: Are Clinical Outcomes Deterministic?
Toor, Amir A.; Kobulnicky, Jared D.; Salman, Salman; Roberts, Catherine H.; Jameson-Lee, Max; Meier, Jeremy; Scalora, Allison; Sheth, Nihar; Koparde, Vishal; Serrano, Myrna; Buck, Gregory A.; Clark, William B.; McCarty, John M.; Chung, Harold M.; Manjili, Masoud H.; Sabo, Roy T.; Neale, Michael C.
2014-01-01
Outcomes in stem cell transplantation (SCT) are modeled using probability theory. However, the clinical course following SCT appears to demonstrate many characteristics of dynamical systems, especially when outcomes are considered in the context of immune reconstitution. Dynamical systems tend to evolve over time according to mathematically determined rules. Characteristically, the future states of the system are predicated on the states preceding them, and there is sensitivity to initial conditions. In SCT, the interaction between donor T cells and the recipient may be considered as such a system in which, graft source, conditioning, and early immunosuppression profoundly influence immune reconstitution over time. This eventually determines clinical outcomes, either the emergence of tolerance or the development of graft versus host disease. In this paper, parallels between SCT and dynamical systems are explored and a conceptual framework for developing mathematical models to understand disparate transplant outcomes is proposed. PMID:25520720
DOE Office of Scientific and Technical Information (OSTI.GOV)
Korsah, Kofi; Muhlheim, Michael David; Wood, Richard
The US Nuclear Regulatory Commission (NRC) is initiating a new rulemaking project to develop a digital system common-cause failure (CCF) rule. This rulemaking will review and modify or affirm the NRC's current digital system CCF policy as discussed in the Staff Requirements Memorandum to the Secretary of the Commission, Office of the NRC (SECY) 93-087, Policy, Technical, and Licensing Issues Pertaining to Evolutionary and Advanced Light Water Reactor (ALWR) Designs, and Branch Technical Position (BTP) 7-19, Guidance on Evaluation of Defense-in-Depth and Diversity in Digital Computer-Based Instrumentation and Control Systems, as well as Chapter 7, Instrumentation and Controls, in NRCmore » Regulatory Guide (NUREG)-0800, Standard Review Plan for Review of Safety Analysis Reports for Nuclear Power Plants (ML033580677). The Oak Ridge National Laboratory (ORNL) is providing technical support to the NRC staff on the CCF rulemaking, and this report is one of several providing the technical basis to inform NRC staff members. For the task described in this report, ORNL examined instrumentation and controls (I&C) technology implementations in nuclear power plants in the light of current CCF guidance. The intent was to assess whether the current position on CCF is adequate given the evolutions in digital safety system implementations and, if gaps in the guidance were found, to provide recommendations as to how these gaps could be closed.« less
An expert system design to diagnose cancer by using a new method reduced rule base.
Başçiftçi, Fatih; Avuçlu, Emre
2018-04-01
A Medical Expert System (MES) was developed which uses Reduced Rule Base to diagnose cancer risk according to the symptoms in an individual. A total of 13 symptoms were used. With the new MES, the reduced rules are controlled instead of all possibilities (2 13 = 8192 different possibilities occur). By controlling reduced rules, results are found more quickly. The method of two-level simplification of Boolean functions was used to obtain Reduced Rule Base. Thanks to the developed application with the number of dynamic inputs and outputs on different platforms, anyone can easily test their own cancer easily. More accurate results were obtained considering all the possibilities related to cancer. Thirteen different risk factors were determined to determine the type of cancer. The truth table produced in our study has 13 inputs and 4 outputs. The Boolean Function Minimization method is used to obtain less situations by simplifying logical functions. Diagnosis of cancer quickly thanks to control of the simplified 4 output functions. Diagnosis made with the 4 output values obtained using Reduced Rule Base was found to be quicker than diagnosis made by screening all 2 13 = 8192 possibilities. With the improved MES, more probabilities were added to the process and more accurate diagnostic results were obtained. As a result of the simplification process in breast and renal cancer diagnosis 100% diagnosis speed gain, in cervical cancer and lung cancer diagnosis rate gain of 99% was obtained. With Boolean function minimization, less number of rules is evaluated instead of evaluating a large number of rules. Reducing the number of rules allows the designed system to work more efficiently and to save time, and facilitates to transfer the rules to the designed Expert systems. Interfaces were developed in different software platforms to enable users to test the accuracy of the application. Any one is able to diagnose the cancer itself using determinative risk factors. Thereby likely to beat the cancer with early diagnosis. Copyright © 2018 Elsevier B.V. All rights reserved.
Integrating the ECG power-line interference removal methods with rule-based system.
Kumaravel, N; Senthil, A; Sridhar, K S; Nithiyanandam, N
1995-01-01
The power-line frequency interference in electrocardiographic signals is eliminated to enhance the signal characteristics for diagnosis. The power-line frequency normally varies +/- 1.5 Hz from its standard value of 50 Hz. In the present work, the performances of the linear FIR filter, Wave digital filter (WDF) and adaptive filter for the power-line frequency variations from 48.5 to 51.5 Hz in steps of 0.5 Hz are studied. The advantage of the LMS adaptive filter in the removal of power-line frequency interference even if the frequency of interference varies by +/- 1.5 Hz from its normal value of 50 Hz over other fixed frequency filters is very well justified. A novel method of integrating rule-based system approach with linear FIR filter and also with Wave digital filter are proposed. The performances of Rule-based FIR filter and Rule-based Wave digital filter are compared with the LMS adaptive filter.
Contingent movement and cooperation evolve under generalized reciprocity
Hamilton, Ian M; Taborsky, Michael
2005-01-01
How cooperation and altruism among non-relatives can persist in the face of cheating remains a key puzzle in evolutionary biology. Although mechanisms such as direct and indirect reciprocity and limited movement have been put forward to explain such cooperation, they cannot explain cooperation among unfamiliar, highly mobile individuals. Here we show that cooperation may be evolutionarily stable if decisions taken to cooperate and to change group membership are both dependent on anonymous social experience (generalized reciprocity). We find that a win–stay, lose–shift rule (where shifting is either moving away from the group or changing tactics within the group after receiving defection) evolves in evolutionary simulations when group leaving is moderately costly (i.e. the current payoff to being alone is low, but still higher than that in a mutually defecting group, and new groups are rarely encountered). This leads to the establishment of widespread cooperation in the population. If the costs of group leaving are reduced, a similar group-leaving rule evolves in association with cooperation in pairs and exploitation of larger anonymous groups. We emphasize that mechanisms of assortment within populations are often behavioural decisions and should not be considered independently of the evolution of cooperation. PMID:16191638
Evolution of Functional Diversification within Quasispecies
Colizzi, Enrico Sandro; Hogeweg, Paulien
2014-01-01
According to quasispecies theory, high mutation rates limit the amount of information genomes can store (Eigen’s Paradox), whereas genomes with higher degrees of neutrality may be selected even at the expenses of higher replication rates (the “survival of the flattest” effect). Introducing a complex genotype to phenotype map, such as RNA folding, epitomizes such effect because of the existence of neutral networks and their exploitation by evolution, affecting both population structure and genome composition. We reexamine these classical results in the light of an RNA-based system that can evolve its own ecology. Contrary to expectations, we find that quasispecies evolving at high mutation rates are steep and characterized by one master sequence. Importantly, the analysis of the system and the characterization of the evolved quasispecies reveal the emergence of functionalities as phenotypes of nonreplicating genotypes, whose presence is crucial for the overall viability and stability of the system. In other words, the master sequence codes for the information of the entire ecosystem, whereas the decoding happens, stochastically, through mutations. We show that this solution quickly outcompetes strategies based on genomes with a high degree of neutrality. In conclusion, individually coded but ecosystem-based diversity evolves and persists indefinitely close to the Information Threshold. PMID:25056399
Barratt, Martin D
2004-11-01
Relationships between the structure and properties of chemicals can be programmed into knowledge-based systems such as DEREK for Windows (DEREK is an acronym for "Deductive Estimation of Risk from Existing Knowledge"). The DEREK for Windows computer system contains a subset of over 60 rules describing chemical substructures (toxophores) responsible for skin sensitisation. As part of the European Phototox Project, the rule base was supplemented by a number of rules for the prospective identification of photoallergens, either by extension of the scope of existing rules or by the generation of new rules where a sound mechanistic rationale for the biological activity could be established. The scope of the rules for photoallergenicity was then further refined by assessment against a list of chemicals identified as photosensitisers by the Centro de Farmacovigilancia de la Comunidad Valenciana, Valencia, Spain. This paper contains an analysis of the mechanistic bases of activity for eight important groups of photoallergens and phototoxins, together with rules for the prospective identification of the photobiological activity of new or untested chemicals belonging to those classes. The mechanism of action of one additional chemical, nitrofurantoin, is well established; however, it was deemed inappropriate to write a rule on the basis of a single chemical structure.
NASA Astrophysics Data System (ADS)
Kulkarni, Malhar; Kulkarni, Irawati; Dangarikar, Chaitali; Bhattacharyya, Pushpak
Glosses and examples are the essential components of the computational lexical databases like, Wordnet. These two components of the lexical database can be used in building domain ontologies, semantic relations, phrase structure rules etc., and can help automatic or manual word sense disambiguation tasks. The present paper aims to highlight the importance of gloss in the process of WSD based on the experiences from building Sanskrit Wordnet. This paper presents a survey of Sanskrit Synonymy lexica, use of Navya-Nyāya terminology in developing a gloss and the kind of patterns evolved that are useful for the computational purpose of WSD with special reference to Sanskrit.
Butt, Muhammad Arif; Akram, Muhammad
2016-01-01
We present a new intuitionistic fuzzy rule-based decision-making system based on intuitionistic fuzzy sets for a process scheduler of a batch operating system. Our proposed intuitionistic fuzzy scheduling algorithm, inputs the nice value and burst time of all available processes in the ready queue, intuitionistically fuzzify the input values, triggers appropriate rules of our intuitionistic fuzzy inference engine and finally calculates the dynamic priority (dp) of all the processes in the ready queue. Once the dp of every process is calculated the ready queue is sorted in decreasing order of dp of every process. The process with maximum dp value is sent to the central processing unit for execution. Finally, we show complete working of our algorithm on two different data sets and give comparisons with some standard non-preemptive process schedulers.
A new evolutionary system for evolving artificial neural networks.
Yao, X; Liu, Y
1997-01-01
This paper presents a new evolutionary system, i.e., EPNet, for evolving artificial neural networks (ANNs). The evolutionary algorithm used in EPNet is based on Fogel's evolutionary programming (EP). Unlike most previous studies on evolving ANN's, this paper puts its emphasis on evolving ANN's behaviors. Five mutation operators proposed in EPNet reflect such an emphasis on evolving behaviors. Close behavioral links between parents and their offspring are maintained by various mutations, such as partial training and node splitting. EPNet evolves ANN's architectures and connection weights (including biases) simultaneously in order to reduce the noise in fitness evaluation. The parsimony of evolved ANN's is encouraged by preferring node/connection deletion to addition. EPNet has been tested on a number of benchmark problems in machine learning and ANNs, such as the parity problem, the medical diagnosis problems, the Australian credit card assessment problem, and the Mackey-Glass time series prediction problem. The experimental results show that EPNet can produce very compact ANNs with good generalization ability in comparison with other algorithms.
NASA Astrophysics Data System (ADS)
Brandon, R.; Page, S.; Varndell, J.
2012-06-01
This paper presents a novel application of Evidential Reasoning to Threat Assessment for critical infrastructure protection. A fusion algorithm based on the PCR5 Dezert-Smarandache fusion rule is proposed which fuses alerts generated by a vision-based behaviour analysis algorithm and a-priori watch-list intelligence data. The fusion algorithm produces a prioritised event list according to a user-defined set of event-type severity or priority weightings. Results generated from application of the algorithm to real data and Behaviour Analysis alerts captured at London's Heathrow Airport under the EU FP7 SAMURAI programme are presented. A web-based demonstrator system is also described which implements the fusion process in real-time. It is shown that this system significantly reduces the data deluge problem, and directs the user's attention to the most pertinent alerts, enhancing their Situational Awareness (SA). The end-user is also able to alter the perceived importance of different event types in real-time, allowing the system to adapt rapidly to changes in priorities as the situation evolves. One of the key challenges associated with fusing information deriving from intelligence data is the issue of Data Incest. Techniques for handling Data Incest within Evidential Reasoning frameworks are proposed, and comparisons are drawn with respect to Data Incest management techniques that are commonly employed within Bayesian fusion frameworks (e.g. Covariance Intersection). The challenges associated with simultaneously dealing with conflicting information and Data Incest in Evidential Reasoning frameworks are also discussed.
Systematic reconstruction of TRANSPATH data into Cell System Markup Language
Nagasaki, Masao; Saito, Ayumu; Li, Chen; Jeong, Euna; Miyano, Satoru
2008-01-01
Background Many biological repositories store information based on experimental study of the biological processes within a cell, such as protein-protein interactions, metabolic pathways, signal transduction pathways, or regulations of transcription factors and miRNA. Unfortunately, it is difficult to directly use such information when generating simulation-based models. Thus, modeling rules for encoding biological knowledge into system-dynamics-oriented standardized formats would be very useful for fully understanding cellular dynamics at the system level. Results We selected the TRANSPATH database, a manually curated high-quality pathway database, which provides a plentiful source of cellular events in humans, mice, and rats, collected from over 31,500 publications. In this work, we have developed 16 modeling rules based on hybrid functional Petri net with extension (HFPNe), which is suitable for graphical representing and simulating biological processes. In the modeling rules, each Petri net element is incorporated with Cell System Ontology to enable semantic interoperability of models. As a formal ontology for biological pathway modeling with dynamics, CSO also defines biological terminology and corresponding icons. By combining HFPNe with the CSO features, it is possible to make TRANSPATH data to simulation-based and semantically valid models. The results are encoded into a biological pathway format, Cell System Markup Language (CSML), which eases the exchange and integration of biological data and models. Conclusion By using the 16 modeling rules, 97% of the reactions in TRANSPATH are converted into simulation-based models represented in CSML. This reconstruction demonstrates that it is possible to use our rules to generate quantitative models from static pathway descriptions. PMID:18570683
Systematic reconstruction of TRANSPATH data into cell system markup language.
Nagasaki, Masao; Saito, Ayumu; Li, Chen; Jeong, Euna; Miyano, Satoru
2008-06-23
Many biological repositories store information based on experimental study of the biological processes within a cell, such as protein-protein interactions, metabolic pathways, signal transduction pathways, or regulations of transcription factors and miRNA. Unfortunately, it is difficult to directly use such information when generating simulation-based models. Thus, modeling rules for encoding biological knowledge into system-dynamics-oriented standardized formats would be very useful for fully understanding cellular dynamics at the system level. We selected the TRANSPATH database, a manually curated high-quality pathway database, which provides a plentiful source of cellular events in humans, mice, and rats, collected from over 31,500 publications. In this work, we have developed 16 modeling rules based on hybrid functional Petri net with extension (HFPNe), which is suitable for graphical representing and simulating biological processes. In the modeling rules, each Petri net element is incorporated with Cell System Ontology to enable semantic interoperability of models. As a formal ontology for biological pathway modeling with dynamics, CSO also defines biological terminology and corresponding icons. By combining HFPNe with the CSO features, it is possible to make TRANSPATH data to simulation-based and semantically valid models. The results are encoded into a biological pathway format, Cell System Markup Language (CSML), which eases the exchange and integration of biological data and models. By using the 16 modeling rules, 97% of the reactions in TRANSPATH are converted into simulation-based models represented in CSML. This reconstruction demonstrates that it is possible to use our rules to generate quantitative models from static pathway descriptions.
Lin, Chin-Teng; Wu, Rui-Cheng; Chang, Jyh-Yeong; Liang, Sheng-Fu
2004-02-01
In this paper, a new technique for the Chinese text-to-speech (TTS) system is proposed. Our major effort focuses on the prosodic information generation. New methodologies for constructing fuzzy rules in a prosodic model simulating human's pronouncing rules are developed. The proposed Recurrent Fuzzy Neural Network (RFNN) is a multilayer recurrent neural network (RNN) which integrates a Self-cOnstructing Neural Fuzzy Inference Network (SONFIN) into a recurrent connectionist structure. The RFNN can be functionally divided into two parts. The first part adopts the SONFIN as a prosodic model to explore the relationship between high-level linguistic features and prosodic information based on fuzzy inference rules. As compared to conventional neural networks, the SONFIN can always construct itself with an economic network size in high learning speed. The second part employs a five-layer network to generate all prosodic parameters by directly using the prosodic fuzzy rules inferred from the first part as well as other important features of syllables. The TTS system combined with the proposed method can behave not only sandhi rules but also the other prosodic phenomena existing in the traditional TTS systems. Moreover, the proposed scheme can even find out some new rules about prosodic phrase structure. The performance of the proposed RFNN-based prosodic model is verified by imbedding it into a Chinese TTS system with a Chinese monosyllable database based on the time-domain pitch synchronous overlap add (TD-PSOLA) method. Our experimental results show that the proposed RFNN can generate proper prosodic parameters including pitch means, pitch shapes, maximum energy levels, syllable duration, and pause duration. Some synthetic sounds are online available for demonstration.
Solutions to time variant problems of real-time expert systems
NASA Technical Reports Server (NTRS)
Yeh, Show-Way; Wu, Chuan-Lin; Hung, Chaw-Kwei
1988-01-01
Real-time expert systems for monitoring and control are driven by input data which changes with time. One of the subtle problems of this field is the propagation of time variant problems from rule to rule. This propagation problem is even complicated under a multiprogramming environment where the expert system may issue test commands to the system to get data and to access time consuming devices to retrieve data for concurrent reasoning. Two approaches are used to handle the flood of input data. Snapshots can be taken to freeze the system from time to time. The expert system treats the system as a stationary one and traces changes by comparing consecutive snapshots. In the other approach, when an input is available, the rules associated with it are evaluated. For both approaches, if the premise condition of a fired rule is changed to being false, the downstream rules should be deactivated. If the status change is due to disappearance of a transient problem, actions taken by the fired downstream rules which are no longer true may need to be undone. If a downstream rule is being evaluated, it should not be fired. Three mechanisms for solving this problem are discussed: tracing, backward checking, and censor setting. In the forward tracing mechanism, when the premise conditions of a fired rule become false, the premise conditions of downstream rules which have been fired or are being evaluated due to the firing of that rule are reevaluated. A tree with its root at the rule being deactivated is traversed. In the backward checking mechanism, when a rule is being fired, the expert system checks back on the premise conditions of the upstream rules that result in evaluation of the rule to see whether it should be fired. The root of the tree being traversed is the rule being fired. In the censor setting mechanism, when a rule is to be evaluated, a censor is constructed based on the premise conditions of the upstream rules and the censor is evaluated just before the rule is fired. Unlike the backward checking mechanism, this one does not search the upstream rules. This paper explores the details of implementation of the three mechanisms.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marchisio, Mario Andrea, E-mail: marchisio@hit.edu.cn
Published in 2008, Parts & Pools represents one of the first attempts to conceptualize the modular design of bacterial synthetic gene circuits with Standard Biological Parts (DNA segments) and Pools of molecules referred to as common signal carriers (e.g., RNA polymerases and ribosomes). The original framework for modeling bacterial components and designing prokaryotic circuits evolved over the last years and brought, first, to the development of an algorithm for the automatic design of Boolean gene circuits. This is a remarkable achievement since gene digital circuits have a broad range of applications that goes from biosensors for health and environment caremore » to computational devices. More recently, Parts & Pools was enabled to give a proper formal description of eukaryotic biological circuit components. This was possible by employing a rule-based modeling approach, a technique that permits a faithful calculation of all the species and reactions involved in complex systems such as eukaryotic cells and compartments. In this way, Parts & Pools is currently suitable for the visual and modular design of synthetic gene circuits in yeast and mammalian cells too.« less
McCoyd, Judith L M
2009-10-01
The sociology of emotion is rapidly evolving and has implications for medical settings. Advancing medical technologies create new contexts for decision-making and emotional reaction that are framed by "feeling rules." Feeling rules guide not only behavior, but also how one believes one should feel, thereby causing one to attempt to bring one's authentic feelings into line with perceived feeling rules. Using qualitative data, the theoretical existence of feeling rules in pregnancy and prenatal testing is confirmed. Further examination extends this analysis: at times of technological development feeling rules are often discrepant, leaving patients with unscripted emotion work. Data from a study of women who interrupted anomalous pregnancies indicate that feeling rules are unclear when competing feeling rules are operating during times of societal and technological change. Because much of this occurs below the level of consciousness, medical and psychological services providers need to be aware of potential discrepancies in feeling rules and assist patients in identifying the salient feeling rules. Patients' struggles ease when they can recognize the discrepancies and assess their implications for decision-making and emotional response. (c) 2009 APA, all rights reserved.
A ROSE-based OpenMP 3.0 Research Compiler Supporting Multiple Runtime Libraries
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liao, C; Quinlan, D; Panas, T
2010-01-25
OpenMP is a popular and evolving programming model for shared-memory platforms. It relies on compilers for optimal performance and to target modern hardware architectures. A variety of extensible and robust research compilers are key to OpenMP's sustainable success in the future. In this paper, we present our efforts to build an OpenMP 3.0 research compiler for C, C++, and Fortran; using the ROSE source-to-source compiler framework. Our goal is to support OpenMP research for ourselves and others. We have extended ROSE's internal representation to handle all of the OpenMP 3.0 constructs and facilitate their manipulation. Since OpenMP research is oftenmore » complicated by the tight coupling of the compiler translations and the runtime system, we present a set of rules to define a common OpenMP runtime library (XOMP) on top of multiple runtime libraries. These rules additionally define how to build a set of translations targeting XOMP. Our work demonstrates how to reuse OpenMP translations across different runtime libraries. This work simplifies OpenMP research by decoupling the problematic dependence between the compiler translations and the runtime libraries. We present an evaluation of our work by demonstrating an analysis tool for OpenMP correctness. We also show how XOMP can be defined using both GOMP and Omni and present comparative performance results against other OpenMP compilers.« less
A Logical Framework for Service Migration Based Survivability
2016-06-24
platforms; Service Migration Strategy Fuzzy Inference System Knowledge Base Fuzzy rules representing domain expert knowledge about implications of...service migration strategy. Our approach uses expert knowledge as linguistic reasoning rules and takes service programs damage assessment, service...programs complexity, and available network capability as input. The fuzzy inference system includes four components as shown in Figure 5: (1) a knowledge
Bau, Cho-Tsan; Huang, Chung-Yi
2014-01-01
Abstract Objective: To construct a clinical decision support system (CDSS) for undergoing surgery based on domain ontology and rules reasoning in the setting of hospitalized diabetic patients. Materials and Methods: The ontology was created with a modified ontology development method, including specification and conceptualization, formalization, implementation, and evaluation and maintenance. The Protégé–Web Ontology Language editor was used to implement the ontology. Embedded clinical knowledge was elicited to complement the domain ontology with formal concept analysis. The decision rules were translated into JENA format, which JENA can use to infer recommendations based on patient clinical situations. Results: The ontology includes 31 classes and 13 properties, plus 38 JENA rules that were built to generate recommendations. The evaluation studies confirmed the correctness of the ontology, acceptance of recommendations, satisfaction with the system, and usefulness of the ontology for glycemic management of diabetic patients undergoing surgery, especially for domain experts. Conclusions: The contribution of this research is to set up an evidence-based hybrid ontology and an evaluation method for CDSS. The system can help clinicians to achieve inpatient glycemic control in diabetic patients undergoing surgery while avoiding hypoglycemia. PMID:24730353
Bau, Cho-Tsan; Chen, Rung-Ching; Huang, Chung-Yi
2014-05-01
To construct a clinical decision support system (CDSS) for undergoing surgery based on domain ontology and rules reasoning in the setting of hospitalized diabetic patients. The ontology was created with a modified ontology development method, including specification and conceptualization, formalization, implementation, and evaluation and maintenance. The Protégé-Web Ontology Language editor was used to implement the ontology. Embedded clinical knowledge was elicited to complement the domain ontology with formal concept analysis. The decision rules were translated into JENA format, which JENA can use to infer recommendations based on patient clinical situations. The ontology includes 31 classes and 13 properties, plus 38 JENA rules that were built to generate recommendations. The evaluation studies confirmed the correctness of the ontology, acceptance of recommendations, satisfaction with the system, and usefulness of the ontology for glycemic management of diabetic patients undergoing surgery, especially for domain experts. The contribution of this research is to set up an evidence-based hybrid ontology and an evaluation method for CDSS. The system can help clinicians to achieve inpatient glycemic control in diabetic patients undergoing surgery while avoiding hypoglycemia.
Use of an expert system data analysis manager for space shuttle main engine test evaluation
NASA Technical Reports Server (NTRS)
Abernethy, Ken
1988-01-01
The ability to articulate, collect, and automate the application of the expertise needed for the analysis of space shuttle main engine (SSME) test data would be of great benefit to NASA liquid rocket engine experts. This paper describes a project whose goal is to build a rule-based expert system which incorporates such expertise. Experiential expertise, collected directly from the experts currently involved in SSME data analysis, is used to build a rule base to identify engine anomalies similar to those analyzed previously. Additionally, an alternate method of expertise capture is being explored. This method would generate rules inductively based on calculations made using a theoretical model of the SSME's operation. The latter rules would be capable of diagnosing anomalies which may not have appeared before, but whose effects can be predicted by the theoretical model.
Bennett, Kochise; Kowalewski, Markus; Mukamel, Shaul
2016-02-09
We present a hierarchy of Fermi golden rules (FGRs) that incorporate strongly coupled electronic/nuclear dynamics in time-resolved photoelectron spectroscopy (TRPES) signals at different levels of theory. Expansion in the joint electronic and nuclear eigenbasis yields the numerically most challenging exact FGR (eFGR). The quasistatic Fermi Golden Rule (qsFGR) neglects nuclear motion during the photoionization process but takes into account electronic coherences as well as populations initially present in the pumped matter as well as those generated internally by coupling between electronic surfaces. The standard semiclassical Fermi Golden Rule (scFGR) neglects the electronic coherences and the nuclear kinetic energy during the ionizing pulse altogether, yielding the classical Condon approximation. The coherence contributions depend on the phase-profile of the ionizing field, allowing coherent control of TRPES signals. The photoelectron spectrum from model systems is simulated using these three levels of theory. The eFGR and the qsFGR show temporal oscillations originating from the electronic or vibrational coherences generated as the nuclear wave packet traverses a conical intersection. These oscillations, which are missed by the scFGR, directly reveal the time-evolving splitting between electronic states of the neutral molecule in the curve-crossing regime.
López Chavira, Magali Alexander; Marcelín-Jiménez, Ricardo
2017-01-01
The study of complex networks has become an important subject over the last decades. It has been shown that these structures have special features, such as their diameter, or their average path length, which in turn are the explanation of some functional properties in a system such as its fault tolerance, its fragility before attacks, or the ability to support routing procedures. In the present work, we study some of the forces that help a network to evolve to the point where structural properties are settled. Although our work is mainly focused on the possibility of applying our ideas to Information and Communication Technologies systems, we consider that our results may contribute to understanding different scenarios where complex networks have become an important modeling tool. Using a discrete event simulator, we get each node to discover the shortcuts that may connect it with regions away from its local environment. Based on this partial knowledge, each node can rewire some of its links, which allows modifying the topology of the entire underlying graph to achieve new structural properties. We proposed a distributed rewiring model that creates networks with features similar to those found in complex networks. Although each node acts in a distributed way and seeking to reduce only the trajectories of its packets, we observed a decrease of diameter and an increase in clustering coefficient in the global structure compared to the initial graph. Furthermore, we can find different final structures depending on slight changes in the local rewiring rules.
Grimm, Lisa R; Maddox, W Todd
2013-11-01
Research has identified multiple category-learning systems with each being "tuned" for learning categories with different task demands and each governed by different neurobiological systems. Rule-based (RB) classification involves testing verbalizable rules for category membership while information-integration (II) classification requires the implicit learning of stimulus-response mappings. In the first study to directly test rule priming with RB and II category learning, we investigated the influence of the availability of information presented at the beginning of the task. Participants viewed lines that varied in length, orientation, and position on the screen, and were primed to focus on stimulus dimensions that were relevant or irrelevant to the correct classification rule. In Experiment 1, we used an RB category structure, and in Experiment 2, we used an II category structure. Accuracy and model-based analyses suggested that a focus on relevant dimensions improves RB task performance later in learning while a focus on an irrelevant dimension improves II task performance early in learning. © 2013.
CLIPS: A tool for the development and delivery of expert systems
NASA Technical Reports Server (NTRS)
Riley, Gary
1991-01-01
The C Language Integrated Production System (CLIPS) is a forward chaining rule-based language developed by the Software Technology Branch at the Johnson Space Center. CLIPS provides a complete environment for the construction of rule-based expert systems. CLIPS was designed specifically to provide high probability, low cost, and easy integration with external systems. Other key features of CLIPS include a powerful rule syntax, an interactive development environment, high performance, extensibility, a verification/validation tool, extensive documentation, and source code availability. The current release of CLIPS, version 4.3, is being used by over 2,500 users throughout the public and private community including: all NASA sites and branches of the military, numerous Federal bureaus, government contractors, 140 universities, and many companies.
NASA Astrophysics Data System (ADS)
Chang, Ya-Ting; Chang, Li-Chiu; Chang, Fi-John
2005-04-01
To bridge the gap between academic research and actual operation, we propose an intelligent control system for reservoir operation. The methodology includes two major processes, the knowledge acquired and implemented, and the inference system. In this study, a genetic algorithm (GA) and a fuzzy rule base (FRB) are used to extract knowledge based on the historical inflow data with a design objective function and on the operating rule curves respectively. The adaptive network-based fuzzy inference system (ANFIS) is then used to implement the knowledge, to create the fuzzy inference system, and then to estimate the optimal reservoir operation. To investigate its applicability and practicability, the Shihmen reservoir, Taiwan, is used as a case study. For the purpose of comparison, a simulation of the currently used M-5 operating rule curve is also performed. The results demonstrate that (1) the GA is an efficient way to search the optimal input-output patterns, (2) the FRB can extract the knowledge from the operating rule curves, and (3) the ANFIS models built on different types of knowledge can produce much better performance than the traditional M-5 curves in real-time reservoir operation. Moreover, we show that the model can be more intelligent for reservoir operation if more information (or knowledge) is involved.
Scale-free effect of substitution networks
NASA Astrophysics Data System (ADS)
Li, Ziyu; Yu, Zhouyu; Xi, Lifeng
2018-02-01
In this paper, we construct the growing networks in terms of substitution rule. Roughly speaking, we replace edges of different colors with different initial graphs. Then the evolving networks are constructed. We obtained the free-scale effect of our substitution networks.
NASA Astrophysics Data System (ADS)
Lewe, Jung-Ho
The National Transportation System (NTS) is undoubtedly a complex system-of-systems---a collection of diverse 'things' that evolve over time, organized at multiple levels, to achieve a range of possibly conflicting objectives, and never quite behaving as planned. The purpose of this research is to develop a virtual transportation architecture for the ultimate goal of formulating an integrated decision-making framework. The foundational endeavor begins with creating an abstraction of the NTS with the belief that a holistic frame of reference is required to properly study such a multi-disciplinary, trans-domain system. The culmination of the effort produces the Transportation Architecture Field (TAF) as a mental model of the NTS, in which the relationships between four basic entity groups are identified and articulated. This entity-centric abstraction framework underpins the construction of a virtual NTS couched in the form of an agent-based model. The transportation consumers and the service providers are identified as adaptive agents that apply a set of preprogrammed behavioral rules to achieve their respective goals. The transportation infrastructure and multitude of exogenous entities (disruptors and drivers) in the whole system can also be represented without resorting to an extremely complicated structure. The outcome is a flexible, scalable, computational model that allows for examination of numerous scenarios which involve the cascade of interrelated effects of aviation technology, infrastructure, and socioeconomic changes throughout the entire system.
NASA Technical Reports Server (NTRS)
Myers, Thomas T.; Mcruer, Duane T.
1988-01-01
The development of a comprehensive and electric methodology for conceptual and preliminary design of flight control systems is presented and illustrated. The methodology is focused on the design states starting with the layout of system requirements and ending when some viable competing system architectures (feedback control structures) are defined. The approach is centered on the human pilot and the aircraft as both the sources of, and the keys to the solution of, many flight control problems. The methodology relies heavily on computational procedures which are highly interactive with the design engineer. To maximize effectiveness, these techniques, as selected and modified to be used together in the methodology, form a cadre of computational tools specifically tailored for integrated flight control system preliminary design purposes. The FCX expert system as presently developed is only a limited prototype capable of supporting basic lateral-directional FCS design activities related to the design example used. FCX presently supports design of only one FCS architecture (yaw damper plus roll damper) and the rules are largely focused on Class IV (highly maneuverable) aircraft. Despite this limited scope, the major elements which appear necessary for application of knowledge-based software concepts to flight control design were assembled and thus FCX represents a prototype which can be tested, critiqued and evolved in an ongoing process of development.
Can 100Gb/s wavelengths be deployed using 10Gb/s engineering rules?
NASA Astrophysics Data System (ADS)
Saunders, Ross; Nicholl, Gary; Wollenweber, Kevin; Schmidt, Ted
2007-09-01
A key challenge set by carriers for 40Gb/s deployments was that the 40Gb/s wavelengths should be deployable over existing 10Gb/s DWDM systems, using 10Gb/s link engineering design rules. Typical 10Gb/s link engineering rules are: 1. Polarization Mode Dispersion (PMD) tolerance of 10ps (mean); 2. Chromatic Dispersion (CD) tolerance of +/-700ps/nm 3. Operation at 50GHz channel spacing, including transit through multiple cascaded [R]OADMs; 4. Optical reach up to 2,000km. By using a combination of advanced modulation formats and adaptive dispersion compensation (technologies rarely seen at 10Gb/s outside of the submarine systems space), vendors did respond to the challenge and broadly met this requirement. As we now start to explore feasible technologies for 100Gb/s optical transport, driven by 100GE port availability on core IP routers, the carrier challenge remains the same. 100Gb/s links should be deployable over existing 10Gb/s DWDM systems using 10Gb/s link engineering rules (as listed above). To meet this challenge, optical transport technology must evolve to yet another level of complexity/maturity in both modulation formats and adaptive compensation techniques. Many clues as to how this might be achieved can be gained by first studying sister telecommunications industries, e.g. satellite (QPSK, QAM, LDCP FEC codes), wireless (advanced DSP, MSK), HDTV (TCM), etc. The optical industry is not a pioneer of new ideas in modulation schemes and coding theory, we will always be followers. However, we do have the responsibility of developing the highest capacity "modems" on the planet to carry the core backbone traffic of the Internet. As such, the key to our success will be to analyze the pros and cons of advanced modulation/coding techniques and balance this with the practical limitations of high speed electronics processing speed and the challenges of real world optical layer impairments. This invited paper will present a view on what advanced technologies are likely candidates to support 100GE optical IP transport over existing 10Gb/s DWDM systems, using 10Gb/s link engineering rules.
An Evolvable Multi-Agent Approach to Space Operations Engineering
NASA Technical Reports Server (NTRS)
Mandutianu, Sanda; Stoica, Adrian
1999-01-01
A complex system of spacecraft and ground tracking stations, as well as a constellation of satellites or spacecraft, has to be able to reliably withstand sudden environment changes, resource fluctuations, dynamic resource configuration, limited communication bandwidth, etc., while maintaining the consistency of the system as a whole. It is not known in advance when a change in the environment might occur or when a particular exchange will happen. A higher degree of sophistication for the communication mechanisms between different parts of the system is required. The actual behavior has to be determined while the system is performing and the course of action can be decided at the individual level. Under such circumstances, the solution will highly benefit from increased on-board and on the ground adaptability and autonomy. An evolvable architecture based on intelligent agents that communicate and cooperate with each other can offer advantages in this direction. This paper presents an architecture of an evolvable agent-based system (software and software/hardware hybrids) as well as some plans for further implementation.
The Role of Age and Executive Function in Auditory Category Learning
Reetzke, Rachel; Maddox, W. Todd; Chandrasekaran, Bharath
2015-01-01
Auditory categorization is a natural and adaptive process that allows for the organization of high-dimensional, continuous acoustic information into discrete representations. Studies in the visual domain have identified a rule-based learning system that learns and reasons via a hypothesis-testing process that requires working memory and executive attention. The rule-based learning system in vision shows a protracted development, reflecting the influence of maturing prefrontal function on visual categorization. The aim of the current study is two-fold: (a) to examine the developmental trajectory of rule-based auditory category learning from childhood through adolescence, into early adulthood; and (b) to examine the extent to which individual differences in rule-based category learning relate to individual differences in executive function. Sixty participants with normal hearing, 20 children (age range, 7–12), 21 adolescents (age range, 13–19), and 19 young adults (age range, 20–23), learned to categorize novel dynamic ripple sounds using trial-by-trial feedback. The spectrotemporally modulated ripple sounds are considered the auditory equivalent of the well-studied Gabor patches in the visual domain. Results revealed that auditory categorization accuracy improved with age, with young adults outperforming children and adolescents. Computational modeling analyses indicated that the use of the task-optimal strategy (i.e. a conjunctive rule-based learning strategy) improved with age. Notably, individual differences in executive flexibility significantly predicted auditory category learning success. The current findings demonstrate a protracted development of rule-based auditory categorization. The results further suggest that executive flexibility coupled with perceptual processes play important roles in successful rule-based auditory category learning. PMID:26491987
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-10
...The Federal Deposit Insurance Corporation (FDIC) is adopting an interim final rule that revises its risk-based and leverage capital requirements for FDIC-supervised institutions. This interim final rule is substantially identical to a joint final rule issued by the Office of the Comptroller of the Currency (OCC) and the Board of Governors of the Federal Reserve System (Federal Reserve) (together, with the FDIC, the agencies). The interim final rule consolidates three separate notices of proposed rulemaking that the agencies jointly published in the Federal Register on August 30, 2012, with selected changes. The interim final rule implements a revised definition of regulatory capital, a new common equity tier 1 minimum capital requirement, a higher minimum tier 1 capital requirement, and, for FDIC-supervised institutions subject to the advanced approaches risk-based capital rules, a supplementary leverage ratio that incorporates a broader set of exposures in the denominator. The interim final rule incorporates these new requirements into the FDIC's prompt corrective action (PCA) framework. In addition, the interim final rule establishes limits on FDIC-supervised institutions' capital distributions and certain discretionary bonus payments if the FDIC-supervised institution does not hold a specified amount of common equity tier 1 capital in addition to the amount necessary to meet its minimum risk-based capital requirements. The interim final rule amends the methodologies for determining risk-weighted assets for all FDIC-supervised institutions. The interim final rule also adopts changes to the FDIC's regulatory capital requirements that meet the requirements of section 171 and section 939A of the Dodd-Frank Wall Street Reform and Consumer Protection Act. The interim final rule also codifies the FDIC's regulatory capital rules, which have previously resided in various appendices to their respective regulations, into a harmonized integrated regulatory framework. In addition, the FDIC is amending the market risk capital rule (market risk rule) to apply to state savings associations. The FDIC is issuing these revisions to its capital regulations as an interim final rule. The FDIC invites comments on the interaction of this rule with other proposed leverage ratio requirements applicable to large, systemically important banking organizations. This interim final rule otherwise contains regulatory text that is identical to the common rule text adopted as a final rule by the Federal Reserve and the OCC. This interim final rule enables the FDIC to proceed on a unified, expedited basis with the other federal banking agencies pending consideration of other issues. Specifically, the FDIC intends to evaluate this interim final rule in the context of the proposed well- capitalized and buffer levels of the supplementary leverage ratio applicable to large, systemically important banking organizations, as described in a separate Notice of Proposed Rulemaking (NPR) published in the Federal Register August 20, 2013. The FDIC is seeking commenters' views on the interaction of this interim final rule with the proposed rule regarding the supplementary leverage ratio for large, systemically important banking organizations.
The Epistemology of a Rule-Based Expert System: A Framework for Explanation.
1982-01-01
Hypothesis e.coli cryptococcus "concluded by" 3 Rule Rule543 Rule535 predicates" 4 Hypothesis meningitis bacterial steroids a3coholic "more general" 5...the hypothesis "e.coll Is causing meningitis" before " cryptococcus is causing meningitis" Is strategic. And recalling an earlier example
SCADA-based Operator Support System for Power Plant Equipment Fault Forecasting
NASA Astrophysics Data System (ADS)
Mayadevi, N.; Ushakumari, S. S.; Vinodchandra, S. S.
2014-12-01
Power plant equipment must be monitored closely to prevent failures from disrupting plant availability. Online monitoring technology integrated with hybrid forecasting techniques can be used to prevent plant equipment faults. A self learning rule-based expert system is proposed in this paper for fault forecasting in power plants controlled by supervisory control and data acquisition (SCADA) system. Self-learning utilizes associative data mining algorithms on the SCADA history database to form new rules that can dynamically update the knowledge base of the rule-based expert system. In this study, a number of popular associative learning algorithms are considered for rule formation. Data mining results show that the Tertius algorithm is best suited for developing a learning engine for power plants. For real-time monitoring of the plant condition, graphical models are constructed by K-means clustering. To build a time-series forecasting model, a multi layer preceptron (MLP) is used. Once created, the models are updated in the model library to provide an adaptive environment for the proposed system. Graphical user interface (GUI) illustrates the variation of all sensor values affecting a particular alarm/fault, as well as the step-by-step procedure for avoiding critical situations and consequent plant shutdown. The forecasting performance is evaluated by computing the mean absolute error and root mean square error of the predictions.
Genetic programming for evolving due-date assignment models in job shop environments.
Nguyen, Su; Zhang, Mengjie; Johnston, Mark; Tan, Kay Chen
2014-01-01
Due-date assignment plays an important role in scheduling systems and strongly influences the delivery performance of job shops. Because of the stochastic and dynamic nature of job shops, the development of general due-date assignment models (DDAMs) is complicated. In this study, two genetic programming (GP) methods are proposed to evolve DDAMs for job shop environments. The experimental results show that the evolved DDAMs can make more accurate estimates than other existing dynamic DDAMs with promising reusability. In addition, the evolved operation-based DDAMs show better performance than the evolved DDAMs employing aggregate information of jobs and machines.
Combined Economic and Hydrologic Modeling to Support Collaborative Decision Making Processes
NASA Astrophysics Data System (ADS)
Sheer, D. P.
2008-12-01
For more than a decade, the core concept of the author's efforts in support of collaborative decision making has been a combination of hydrologic simulation and multi-objective optimization. The modeling has generally been used to support collaborative decision making processes. The OASIS model developed by HydroLogics Inc. solves a multi-objective optimization at each time step using a mixed integer linear program (MILP). The MILP can be configured to include any user defined objective, including but not limited too economic objectives. For example, an estimated marginal value for water for crops and M&I use were included in the objective function to drive trades in a model of the lower Rio Grande. The formulation of the MILP, constraints and objectives, in any time step is conditional: it changes based on the value of state variables and dynamic external forcing functions, such as rainfall, hydrology, market prices, arrival of migratory fish, water temperature, etc. It therefore acts as a dynamic short term multi-objective economic optimization for each time step. MILP is capable of solving a general problem that includes a very realistic representation of the physical system characteristics in addition to the normal multi-objective optimization objectives and constraints included in economic models. In all of these models, the short term objective function is a surrogate for achieving long term multi-objective results. The long term performance for any alternative (especially including operating strategies) is evaluated by simulation. An operating rule is the combination of conditions, parameters, constraints and objectives used to determine the formulation of the short term optimization in each time step. Heuristic wrappers for the simulation program have been developed improve the parameters of an operating rule, and are initiating research on a wrapper that will allow us to employ a genetic algorithm to improve the form of the rule (conditions, constraints, and short term objectives) as well. In the models operating rules represent different models of human behavior, and the objective of the modeling is to find rules for human behavior that perform well in terms of long term human objectives. The conceptual model used to represent human behavior incorporates economic multi-objective optimization for surrogate objectives, and rules that set those objectives based on current conditions and accounting for uncertainty, at least implicitly. The author asserts that real world operating rules follow this form and have evolved because they have been perceived as successful in the past. Thus, the modeling efforts focus on human behavior in much the same way that economic models focus on human behavior. This paper illustrates the above concepts with real world examples.
Swarm autonomic agents with self-destruct capability
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Sterritt, Roy (Inventor)
2009-01-01
Systems, methods and apparatus are provided through which in some embodiments an autonomic entity manages a system by generating one or more stay alive signals based on the functioning status and operating state of the system. In some embodiments, an evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy. The evolvable neural interface receives and generates heartbeat monitor signals and pulse monitor signals that are used to generate a stay alive signal that is used to manage the operations of the synthetic neural system. In another embodiment an asynchronous Alice signal (Autonomic license) requiring valid credentials of an anonymous autonomous agent is initiated. An unsatisfactory Alice exchange may lead to self-destruction of the anonymous autonomous agent for self-protection.
Swarm autonomic agents with self-destruct capability
NASA Technical Reports Server (NTRS)
Hinchey, Michael G. (Inventor); Sterritt, Roy (Inventor)
2011-01-01
Systems, methods and apparatus are provided through which in some embodiments an autonomic entity manages a system by generating one or more stay alive signals based on the functioning status and operating state of the system. In some embodiments, an evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy. The evolvable neural interface receives and generates heartbeat monitor signals and pulse monitor signals that are used to generate a stay alive signal that is used to manage the operations of the synthetic neural system. In another embodiment an asynchronous Alice signal (Autonomic license) requiring valid credentials of an anonymous autonomous agent is initiated. An unsatisfactory Alice exchange may lead to self-destruction of the anonymous autonomous agent for self-protection.
Henry, Laurence; Craig, Adrian J. F. K.; Lemasson, Alban; Hausberger, Martine
2015-01-01
Turn-taking in conversation appears to be a common feature in various human cultures and this universality raises questions about its biological basis and evolutionary trajectory. Functional convergence is a widespread phenomenon in evolution, revealing sometimes striking functional similarities between very distant species even though the mechanisms involved may be different. Studies on mammals (including non-human primates) and bird species with different levels of social coordination reveal that temporal and structural regularities in vocal interactions may depend on the species' social structure. Here we test the hypothesis that turn-taking and associated rules of conversations may be an adaptive response to the requirements of social life, by testing the applicability of turn-taking rules to an animal model, the European starling. Birdsong has for many decades been considered as one of the best models of human language and starling songs have been well described in terms of vocal production and perception. Starlings do have vocal interactions where alternating patterns predominate. Observational and experimental data on vocal interactions reveal that (1) there are indeed clear temporal and structural regularities, (2) the temporal and structural patterning is influenced by the immediate social context, the general social situation, the individual history, and the internal state of the emitter. Comparison of phylogenetically close species of Sturnids reveals that the alternating pattern of vocal interactions varies greatly according to the species' social structure, suggesting that interactional regularities may have evolved together with social systems. These findings lead to solid bases of discussion on the evolution of communication rules in relation to social evolution. They will be discussed also in terms of processes, at the light of recent neurobiological findings. PMID:26441787
Systematic methods for knowledge acquisition and expert system development
NASA Technical Reports Server (NTRS)
Belkin, Brenda L.; Stengel, Robert F.
1991-01-01
Nine cooperating rule-based systems, collectively called AUTOCREW which were designed to automate functions and decisions associated with a combat aircraft's subsystems, are discussed. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base and to assess the cooperation between the rule bases. Simulation and comparative workload results for two mission scenarios are given. The scenarios are inbound surface-to-air-missile attack on the aircraft and pilot incapacitation. The methodology used to develop the AUTOCREW knowledge bases is summarized. Issues involved in designing the navigation sensor selection expert in AUTOCREW's NAVIGATOR knowledge base are discussed in detail. The performance of seven navigation systems aiding a medium-accuracy INS was investigated using Kalman filter covariance analyses. A navigation sensor management (NSM) expert system was formulated from covariance simulation data using the analysis of variance (ANOVA) method and the ID3 algorithm. ANOVA results show that statistically different position accuracies are obtained when different navaids are used, the number of navaids aiding the INS is varied, the aircraft's trajectory is varied, and the performance history is varied. The ID3 algorithm determines the NSM expert's classification rules in the form of decision trees. The performance of these decision trees was assessed on two arbitrary trajectories, and the results demonstrate that the NSM expert adapts to new situations and provides reasonable estimates of the expected hybrid performance.
Simulation Of Combat With An Expert System
NASA Technical Reports Server (NTRS)
Provenzano, J. P.
1989-01-01
Proposed expert system predicts outcomes of combat situations. Called "COBRA", combat outcome based on rules for attrition, system selects rules for mathematical modeling of losses and discrete events in combat according to previous experiences. Used with another software module known as the "Game". Game/COBRA software system, consisting of Game and COBRA modules, provides for both quantitative aspects and qualitative aspects in simulations of battles. COBRA intended for simulation of large-scale military exercises, concepts embodied in it have much broader applicability. In industrial research, knowledge-based system enables qualitative as well as quantitative simulations.
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1989-01-01
The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.
Adaptive WTA with an analog VLSI neuromorphic learning chip.
Häfliger, Philipp
2007-03-01
In this paper, we demonstrate how a particular spike-based learning rule (where exact temporal relations between input and output spikes of a spiking model neuron determine the changes of the synaptic weights) can be tuned to express rate-based classical Hebbian learning behavior (where the average input and output spike rates are sufficient to describe the synaptic changes). This shift in behavior is controlled by the input statistic and by a single time constant. The learning rule has been implemented in a neuromorphic very large scale integration (VLSI) chip as part of a neurally inspired spike signal image processing system. The latter is the result of the European Union research project Convolution AER Vision Architecture for Real-Time (CAVIAR). Since it is implemented as a spike-based learning rule (which is most convenient in the overall spike-based system), even if it is tuned to show rate behavior, no explicit long-term average signals are computed on the chip. We show the rule's rate-based Hebbian learning ability in a classification task in both simulation and chip experiment, first with artificial stimuli and then with sensor input from the CAVIAR system.
Combination Rules for Morse-Based van der Waals Force Fields.
Yang, Li; Sun, Lei; Deng, Wei-Qiao
2018-02-15
In traditional force fields (FFs), van der Waals interactions have been usually described by the Lennard-Jones potentials. Conventional combination rules for the parameters of van der Waals (VDW) cross-termed interactions were developed for the Lennard-Jones based FFs. Here, we report that the Morse potentials were a better function to describe VDW interactions calculated by highly precise quantum mechanics methods. A new set of combination rules was developed for Morse-based FFs, in which VDW interactions were described by Morse potentials. The new set of combination rules has been verified by comparing the second virial coefficients of 11 noble gas mixtures. For all of the mixed binaries considered in this work, the combination rules work very well and are superior to all three other existing sets of combination rules reported in the literature. We further used the Morse-based FF by using the combination rules to simulate the adsorption isotherms of CH 4 at 298 K in four covalent-organic frameworks (COFs). The overall agreement is great, which supports the further applications of this new set of combination rules in more realistic simulation systems.
Representation and the Rules of the Game: An Electoral Simulation
ERIC Educational Resources Information Center
Hoffman, Donna R.
2009-01-01
It is often a difficult proposition for introductory American government students to comprehend different electoral systems and how the rules of the game affect the representation that results. I have developed a simulation in which different proportional-based electoral systems are compared with a single-member plurality electoral system. In…
Knowledge-based computer systems for radiotherapy planning.
Kalet, I J; Paluszynski, W
1990-08-01
Radiation therapy is one of the first areas of clinical medicine to utilize computers in support of routine clinical decision making. The role of the computer has evolved from simple dose calculations to elaborate interactive graphic three-dimensional simulations. These simulations can combine external irradiation from megavoltage photons, electrons, and particle beams with interstitial and intracavitary sources. With the flexibility and power of modern radiotherapy equipment and the ability of computer programs that simulate anything the machinery can do, we now face a challenge to utilize this capability to design more effective radiation treatments. How can we manage the increased complexity of sophisticated treatment planning? A promising approach will be to use artificial intelligence techniques to systematize our present knowledge about design of treatment plans, and to provide a framework for developing new treatment strategies. Far from replacing the physician, physicist, or dosimetrist, artificial intelligence-based software tools can assist the treatment planning team in producing more powerful and effective treatment plans. Research in progress using knowledge-based (AI) programming in treatment planning already has indicated the usefulness of such concepts as rule-based reasoning, hierarchical organization of knowledge, and reasoning from prototypes. Problems to be solved include how to handle continuously varying parameters and how to evaluate plans in order to direct improvements.
Heat exchanger expert system logic
NASA Technical Reports Server (NTRS)
Cormier, R.
1988-01-01
The reduction is described of the operation and fault diagnostics of a Deep Space Network heat exchanger to a rule base by the application of propositional calculus to a set of logic statements. The value of this approach lies in the ease of converting the logic and subsequently implementing it on a computer as an expert system. The rule base was written in Process Intelligent Control software.
Applications of Machine Learning and Rule Induction,
1995-02-15
An important area of application for machine learning is in automating the acquisition of knowledge bases required for expert systems. In this paper...we review the major paradigms for machine learning , including neural networks, instance-based methods, genetic learning, rule induction, and analytic
An evolving model for the lodging-service network in a tourism destination
NASA Astrophysics Data System (ADS)
Hernández, Juan M.; González-Martel, Christian
2017-09-01
Tourism is a complex dynamic system including multiple actors which are related each other composing an evolving social network. This paper presents a growing model that explains how part of the supply components in a tourism system forms a social network. Specifically, the lodgings and services in a destination are the network nodes and a link between them appears if a representative tourist hosted in the lodging visits/consumes the service during his/her stay. The specific link between both categories are determined by a random and preferential attachment rule. The analytic results show that the long-term degree distribution of services follows a shifted power-law distribution. The numerical simulations show slight disagreements with the theoretical results in the case of the one-mode degree distribution of services, due to the low order of convergence to zero of X-motifs. The model predictions are compared with real data coming from a popular tourist destination in Gran Canaria, Spain, showing a good agreement between analytical and empirical data for the degree distribution of services. The theoretical model was validated assuming four type of perturbations in the real data.
NASA Astrophysics Data System (ADS)
Xu, Kuangyi; Li, Kun; Cong, Rui; Wang, Long
2017-02-01
In the framework of the evolutionary game theory, two fundamentally different mechanisms, the imitation process and the aspiration-driven dynamics, can be adopted by players to update their strategies. In the former case, individuals imitate the strategy of a more successful peer, while in the latter case individuals change their strategies based on a comparison of payoffs they collect in the game to their own aspiration levels. Here we explore how cooperation evolves for the coexistence of these two dynamics. Intriguingly, cooperation reaches its lowest level when a certain moderate fraction of individuals pick aspiration-level-driven rule while the others choose pairwise comparison rule. Furthermore, when individuals can adjust their update rules besides their strategies, either imitation dynamics or aspiration-driven dynamics will finally take over the entire population, and the stationary cooperation level is determined by the outcome of competition between these two dynamics. We find that appropriate synergetic effects and moderate aspiration level boost the fixation probability of aspiration-driven dynamics most effectively. Our work may be helpful in understanding the cooperative behavior induced by the coexistence of imitation dynamics and aspiration dynamics in the society.
Robust Strategy for Rocket Engine Health Monitoring
NASA Technical Reports Server (NTRS)
Santi, L. Michael
2001-01-01
Monitoring the health of rocket engine systems is essentially a two-phase process. The acquisition phase involves sensing physical conditions at selected locations, converting physical inputs to electrical signals, conditioning the signals as appropriate to establish scale or filter interference, and recording results in a form that is easy to interpret. The inference phase involves analysis of results from the acquisition phase, comparison of analysis results to established health measures, and assessment of health indications. A variety of analytical tools may be employed in the inference phase of health monitoring. These tools can be separated into three broad categories: statistical, rule based, and model based. Statistical methods can provide excellent comparative measures of engine operating health. They require well-characterized data from an ensemble of "typical" engines, or "golden" data from a specific test assumed to define the operating norm in order to establish reliable comparative measures. Statistical methods are generally suitable for real-time health monitoring because they do not deal with the physical complexities of engine operation. The utility of statistical methods in rocket engine health monitoring is hindered by practical limits on the quantity and quality of available data. This is due to the difficulty and high cost of data acquisition, the limited number of available test engines, and the problem of simulating flight conditions in ground test facilities. In addition, statistical methods incur a penalty for disregarding flow complexity and are therefore limited in their ability to define performance shift causality. Rule based methods infer the health state of the engine system based on comparison of individual measurements or combinations of measurements with defined health norms or rules. This does not mean that rule based methods are necessarily simple. Although binary yes-no health assessment can sometimes be established by relatively simple rules, the causality assignment needed for refined health monitoring often requires an exceptionally complex rule base involving complicated logical maps. Structuring the rule system to be clear and unambiguous can be difficult, and the expert input required to maintain a large logic network and associated rule base can be prohibitive.
Checking Flight Rules with TraceContract: Application of a Scala DSL for Trace Analysis
NASA Technical Reports Server (NTRS)
Barringer, Howard; Havelund, Klaus; Morris, Robert A.
2011-01-01
Typically during the design and development of a NASA space mission, rules and constraints are identified to help reduce reasons for failure during operations. These flight rules are usually captured in a set of indexed tables, containing rule descriptions, rationales for the rules, and other information. Flight rules can be part of manual operations procedures carried out by humans. However, they can also be automated, and either implemented as on-board monitors, or as ground based monitors that are part of a ground data system. In the case of automated flight rules, one considerable expense to be addressed for any mission is the extensive process by which system engineers express flight rules in prose, software developers translate these requirements into code, and then both experts verify that the resulting application is correct. This paper explores the potential benefits of using an internal Scala DSL for general trace analysis, named TRACECONTRACT, to write executable specifications of flight rules. TRACECONTRACT can generally be applied to analysis of for example log files or for monitoring executing systems online.
Origins of multicellular evolvability in snowflake yeast
Ratcliff, William C.; Fankhauser, Johnathon D.; Rogers, David W.; Greig, Duncan; Travisano, Michael
2015-01-01
Complex life has arisen through a series of ‘major transitions’ in which collectives of formerly autonomous individuals evolve into a single, integrated organism. A key step in this process is the origin of higher-level evolvability, but little is known about how higher-level entities originate and gain the capacity to evolve as an individual. Here we report a single mutation that not only creates a new level of biological organization, but also potentiates higher-level evolvability. Disrupting the transcription factor ACE2 in Saccharomyces cerevisiae prevents mother–daughter cell separation, generating multicellular ‘snowflake’ yeast. Snowflake yeast develop through deterministic rules that produce geometrically defined clusters that preclude genetic conflict and display a high broad-sense heritability for multicellular traits; as a result they are preadapted to multicellular adaptation. This work demonstrates that simple microevolutionary changes can have profound macroevolutionary consequences, and suggests that the formation of clonally developing clusters may often be the first step to multicellularity. PMID:25600558
2006-09-22
This final rule adopts the substance of the April 15, 2004 tentative interim amendment (TIA) 00-1 (101), Alcohol Based Hand Rub Solutions, an amendment to the 2000 edition of the Life Safety Code, published by the National Fire Protection Association (NFPA). This amendment allows certain health care facilities to place alcohol-based hand rub dispensers in egress corridors under specified conditions. This final rule also requires that nursing facilities at least install battery-operated single station smoke alarms in resident rooms and common areas if they are not fully sprinklered or they do not have system-based smoke detectors in those areas. Finally, this final rule confirms as final the provisions of the March 25, 2005 interim final rule with changes and responds to public comments on that rule.
An evolving systems-based methodology for healthcare planning.
Warwick, Jon; Bell, Gary
2007-01-01
Healthcare planning seems beset with problems at all hierarchical levels. These are caused by the 'soft' nature of many of the issues present in healthcare planning and the high levels of complexity inherent in healthcare services. There has, in recent years, been a move to utilize systems thinking ideas in an effort to gain a better understanding of the forces at work within the healthcare environment and these have had some success. This paper argues that systems-based methodologies can be further enhanced by metrication and modeling which assist in exploring the changed emergent behavior of a system resulting from management intervention. The paper describes the Holon Framework as an evolving systems-based approach that has been used to help clients understand complex systems (in the education domain) that would have application in the analysis of healthcare problems.
Symbolic rule-based classification of lung cancer stages from free-text pathology reports.
Nguyen, Anthony N; Lawley, Michael J; Hansen, David P; Bowman, Rayleen V; Clarke, Belinda E; Duhig, Edwina E; Colquist, Shoni
2010-01-01
To classify automatically lung tumor-node-metastases (TNM) cancer stages from free-text pathology reports using symbolic rule-based classification. By exploiting report substructure and the symbolic manipulation of systematized nomenclature of medicine-clinical terms (SNOMED CT) concepts in reports, statements in free text can be evaluated for relevance against factors relating to the staging guidelines. Post-coordinated SNOMED CT expressions based on templates were defined and populated by concepts in reports, and tested for subsumption by staging factors. The subsumption results were used to build logic according to the staging guidelines to calculate the TNM stage. The accuracy measure and confusion matrices were used to evaluate the TNM stages classified by the symbolic rule-based system. The system was evaluated against a database of multidisciplinary team staging decisions and a machine learning-based text classification system using support vector machines. Overall accuracy on a corpus of pathology reports for 718 lung cancer patients against a database of pathological TNM staging decisions were 72%, 78%, and 94% for T, N, and M staging, respectively. The system's performance was also comparable to support vector machine classification approaches. A system to classify lung TNM stages from free-text pathology reports was developed, and it was verified that the symbolic rule-based approach using SNOMED CT can be used for the extraction of key lung cancer characteristics from free-text reports. Future work will investigate the applicability of using the proposed methodology for extracting other cancer characteristics and types.
On the effects of adaptive reservoir operating rules in hydrological physically-based models
NASA Astrophysics Data System (ADS)
Giudici, Federico; Anghileri, Daniela; Castelletti, Andrea; Burlando, Paolo
2017-04-01
Recent years have seen a significant increase of the human influence on the natural systems both at the global and local scale. Accurately modeling the human component and its interaction with the natural environment is key to characterize the real system dynamics and anticipate future potential changes to the hydrological regimes. Modern distributed, physically-based hydrological models are able to describe hydrological processes with high level of detail and high spatiotemporal resolution. Yet, they lack in sophistication for the behavior component and human decisions are usually described by very simplistic rules, which might underperform in reproducing the catchment dynamics. In the case of water reservoir operators, these simplistic rules usually consist of target-level rule curves, which represent the average historical level trajectory. Whilst these rules can reasonably reproduce the average seasonal water volume shifts due to the reservoirs' operation, they cannot properly represent peculiar conditions, which influence the actual reservoirs' operation, e.g., variations in energy price or water demand, dry or wet meteorological conditions. Moreover, target-level rule curves are not suitable to explore the water system response to climate and socio economic changing contexts, because they assume a business-as-usual operation. In this work, we quantitatively assess how the inclusion of adaptive reservoirs' operating rules into physically-based hydrological models contribute to the proper representation of the hydrological regime at the catchment scale. In particular, we contrast target-level rule curves and detailed optimization-based behavioral models. We, first, perform the comparison on past observational records, showing that target-level rule curves underperform in representing the hydrological regime over multiple time scales (e.g., weekly, seasonal, inter-annual). Then, we compare how future hydrological changes are affected by the two modeling approaches by considering different future scenarios comprising climate change projections of precipitation and temperature and projections of electricity prices. We perform this comparative assessment on the real-world water system of Lake Como catchment in the Italian Alps, which is characterized by the massive presence of artificial hydropower reservoirs heavily altering the natural hydrological regime. The results show how different behavioral model approaches affect the system representation in terms of hydropower performance, reservoirs dynamics and hydrological regime under different future scenarios.
Integration of object-oriented knowledge representation with the CLIPS rule based system
NASA Technical Reports Server (NTRS)
Logie, David S.; Kamil, Hasan
1990-01-01
The paper describes a portion of the work aimed at developing an integrated, knowledge based environment for the development of engineering-oriented applications. An Object Representation Language (ORL) was implemented in C++ which is used to build and modify an object-oriented knowledge base. The ORL was designed in such a way so as to be easily integrated with other representation schemes that could effectively reason with the object base. Specifically, the integration of the ORL with the rule based system C Language Production Systems (CLIPS), developed at the NASA Johnson Space Center, will be discussed. The object-oriented knowledge representation provides a natural means of representing problem data as a collection of related objects. Objects are comprised of descriptive properties and interrelationships. The object-oriented model promotes efficient handling of the problem data by allowing knowledge to be encapsulated in objects. Data is inherited through an object network via the relationship links. Together, the two schemes complement each other in that the object-oriented approach efficiently handles problem data while the rule based knowledge is used to simulate the reasoning process. Alone, the object based knowledge is little more than an object-oriented data storage scheme; however, the CLIPS inference engine adds the mechanism to directly and automatically reason with that knowledge. In this hybrid scheme, the expert system dynamically queries for data and can modify the object base with complete access to all the functionality of the ORL from rules.
Absence of splash singularities for surface quasi-geostrophic sharp fronts and the Muskat problem.
Gancedo, Francisco; Strain, Robert M
2014-01-14
In this paper, for both the sharp front surface quasi-geostrophic equation and the Muskat problem, we rule out the "splash singularity" blow-up scenario; in other words, we prove that the contours evolving from either of these systems cannot intersect at a single point while the free boundary remains smooth. Splash singularities have been shown to hold for the free boundary incompressible Euler equation in the form of the water waves contour evolution problem. Our result confirms the numerical simulations in earlier work, in which it was shown that the curvature blows up because the contours collapse at a point. Here, we prove that maintaining control of the curvature will remove the possibility of pointwise interphase collapse. Another conclusion that we provide is a better understanding of earlier work in which squirt singularities are ruled out; in this case, a positive volume of fluid between the contours cannot be ejected in finite time.
Absence of splash singularities for surface quasi-geostrophic sharp fronts and the Muskat problem
Gancedo, Francisco; Strain, Robert M.
2014-01-01
In this paper, for both the sharp front surface quasi-geostrophic equation and the Muskat problem, we rule out the “splash singularity” blow-up scenario; in other words, we prove that the contours evolving from either of these systems cannot intersect at a single point while the free boundary remains smooth. Splash singularities have been shown to hold for the free boundary incompressible Euler equation in the form of the water waves contour evolution problem. Our result confirms the numerical simulations in earlier work, in which it was shown that the curvature blows up because the contours collapse at a point. Here, we prove that maintaining control of the curvature will remove the possibility of pointwise interphase collapse. Another conclusion that we provide is a better understanding of earlier work in which squirt singularities are ruled out; in this case, a positive volume of fluid between the contours cannot be ejected in finite time. PMID:24347645
Challenges and Insights in Using HIPAA Privacy Rule for Clinical Text Annotation.
Kayaalp, Mehmet; Browne, Allen C; Sagan, Pamela; McGee, Tyne; McDonald, Clement J
2015-01-01
The Privacy Rule of Health Insurance Portability and Accountability Act (HIPAA) requires that clinical documents be stripped of personally identifying information before they can be released to researchers and others. We have been manually annotating clinical text since 2008 in order to test and evaluate an algorithmic clinical text de-identification tool, NLM Scrubber, which we have been developing in parallel. Although HIPAA provides some guidance about what must be de-identified, translating those guidelines into practice is not as straightforward, especially when one deals with free text. As a result we have changed our manual annotation labels and methods six times. This paper explains why we have made those annotation choices, which have been evolved throughout seven years of practice on this field. The aim of this paper is to start a community discussion towards developing standards for clinical text annotation with the end goal of studying and comparing clinical text de-identification systems more accurately.
Knowledge-Based Motion Control of AN Intelligent Mobile Autonomous System
NASA Astrophysics Data System (ADS)
Isik, Can
An Intelligent Mobile Autonomous System (IMAS), which is equipped with vision and low level sensors to cope with unknown obstacles, is modeled as a hierarchy of path planning and motion control. This dissertation concentrates on the lower level of this hierarchy (Pilot) with a knowledge-based controller. The basis of a theory of knowledge-based controllers is established, using the example of the Pilot level motion control of IMAS. In this context, the knowledge-based controller with a linguistic world concept is shown to be adequate for the minimum time control of an autonomous mobile robot motion. The Pilot level motion control of IMAS is approached in the framework of production systems. The three major components of the knowledge-based control that are included here are the hierarchies of the database, the rule base and the rule evaluator. The database, which is the representation of the state of the world, is organized as a semantic network, using a concept of minimal admissible vocabulary. The hierarchy of rule base is derived from the analytical formulation of minimum-time control of IMAS motion. The procedure introduced for rule derivation, which is called analytical model verbalization, utilizes the concept of causalities to describe the system behavior. A realistic analytical system model is developed and the minimum-time motion control in an obstacle strewn environment is decomposed to a hierarchy of motion planning and control. The conditions for the validity of the hierarchical problem decomposition are established, and the consistency of operation is maintained by detecting the long term conflicting decisions of the levels of the hierarchy. The imprecision in the world description is modeled using the theory of fuzzy sets. The method developed for the choice of the rule that prescribes the minimum-time motion control among the redundant set of applicable rules is explained and the usage of fuzzy set operators is justified. Also included in the dissertation are the description of the computer simulation of Pilot within the hierarchy of IMAS control and the simulated experiments that demonstrate the theoretical work.
Evolving Systems and Adaptive Key Component Control
NASA Technical Reports Server (NTRS)
Frost, Susan A.; Balas, Mark J.
2009-01-01
We propose a new framework called Evolving Systems to describe the self-assembly, or autonomous assembly, of actively controlled dynamical subsystems into an Evolved System with a higher purpose. An introduction to Evolving Systems and exploration of the essential topics of the control and stability properties of Evolving Systems is provided. This chapter defines a framework for Evolving Systems, develops theory and control solutions for fundamental characteristics of Evolving Systems, and provides illustrative examples of Evolving Systems and their control with adaptive key component controllers.
Online Sensor Fault Detection Based on an Improved Strong Tracking Filter
Wang, Lijuan; Wu, Lifeng; Guan, Yong; Wang, Guohui
2015-01-01
We propose a method for online sensor fault detection that is based on the evolving Strong Tracking Filter (STCKF). The cubature rule is used to estimate states to improve the accuracy of making estimates in a nonlinear case. A residual is the difference in value between an estimated value and the true value. A residual will be regarded as a signal that includes fault information. The threshold is set at a reasonable level, and will be compared with residuals to determine whether or not the sensor is faulty. The proposed method requires only a nominal plant model and uses STCKF to estimate the original state vector. The effectiveness of the algorithm is verified by simulation on a drum-boiler model. PMID:25690553
NASA Technical Reports Server (NTRS)
Wu, Cathy; Taylor, Pam; Whitson, George; Smith, Cathy
1990-01-01
This paper describes the building of a corn disease diagnostic expert system using CLIPS, and the development of a neural expert system using the fact representation method of CLIPS for automated knowledge acquisition. The CLIPS corn expert system diagnoses 21 diseases from 52 symptoms and signs with certainty factors. CLIPS has several unique features. It allows the facts in rules to be broken down to object-attribute-value (OAV) triples, allows rule-grouping, and fires rules based on pattern-matching. These features combined with the chained inference engine result to a natural user query system and speedy execution. In order to develop a method for automated knowledge acquisition, an Artificial Neural Expert System (ANES) is developed by a direct mapping from the CLIPS system. The ANES corn expert system uses the same OAV triples in the CLIPS system for its facts. The LHS and RHS facts of the CLIPS rules are mapped into the input and output layers of the ANES, respectively; and the inference engine of the rules is imbedded in the hidden layer. The fact representation by OAC triples gives a natural grouping of the rules. These features allow the ANES system to automate rule-generation, and make it efficient to execute and easy to expand for a large and complex domain.
Implementation of a spike-based perceptron learning rule using TiO2-x memristors.
Mostafa, Hesham; Khiat, Ali; Serb, Alexander; Mayr, Christian G; Indiveri, Giacomo; Prodromakis, Themis
2015-01-01
Synaptic plasticity plays a crucial role in allowing neural networks to learn and adapt to various input environments. Neuromorphic systems need to implement plastic synapses to obtain basic "cognitive" capabilities such as learning. One promising and scalable approach for implementing neuromorphic synapses is to use nano-scale memristors as synaptic elements. In this paper we propose a hybrid CMOS-memristor system comprising CMOS neurons interconnected through TiO2-x memristors, and spike-based learning circuits that modulate the conductance of the memristive synapse elements according to a spike-based Perceptron plasticity rule. We highlight a number of advantages for using this spike-based plasticity rule as compared to other forms of spike timing dependent plasticity (STDP) rules. We provide experimental proof-of-concept results with two silicon neurons connected through a memristive synapse that show how the CMOS plasticity circuits can induce stable changes in memristor conductances, giving rise to increased synaptic strength after a potentiation episode and to decreased strength after a depression episode.
Simple methods of exploiting the underlying structure of rule-based systems
NASA Technical Reports Server (NTRS)
Hendler, James
1986-01-01
Much recent work in the field of expert systems research has aimed at exploiting the underlying structures of the rule base for reasons of analysis. Such techniques as Petri-nets and GAGs have been proposed as representational structures that will allow complete analysis. Much has been made of proving isomorphisms between the rule bases and the mechanisms, and in examining the theoretical power of this analysis. In this paper we describe some early work in a new system which has much simpler (and thus, one hopes, more easily achieved) aims and less formality. The technique being examined is a very simple one: OPS5 programs are analyzed in a purely syntactic way and a FSA description is generated. In this paper we describe the technique and some user interface tools which exploit this structure.
A hybrid intelligence approach to artifact recognition in digital publishing
NASA Astrophysics Data System (ADS)
Vega-Riveros, J. Fernando; Santos Villalobos, Hector J.
2006-02-01
The system presented integrates rule-based and case-based reasoning for artifact recognition in Digital Publishing. In Variable Data Printing (VDP) human proofing could result prohibitive since a job could contain millions of different instances that may contain two types of artifacts: 1) evident defects, like a text overflow or overlapping 2) style-dependent artifacts, subtle defects that show as inconsistencies with regard to the original job design. We designed a Knowledge-Based Artifact Recognition tool for document segmentation, layout understanding, artifact detection, and document design quality assessment. Document evaluation is constrained by reference to one instance of the VDP job proofed by a human expert against the remaining instances. Fundamental rules of document design are used in the rule-based component for document segmentation and layout understanding. Ambiguities in the design principles not covered by the rule-based system are analyzed by case-based reasoning, using the Nearest Neighbor Algorithm, where features from previous jobs are used to detect artifacts and inconsistencies within the document layout. We used a subset of XSL-FO and assembled a set of 44 document samples. The system detected all the job layout changes, while obtaining an overall average accuracy of 84.56%, with the highest accuracy of 92.82%, for overlapping and the lowest, 66.7%, for the lack-of-white-space.
Equations for Scoring Rules When Data Are Missing
NASA Technical Reports Server (NTRS)
James, Mark
2006-01-01
A document presents equations for scoring rules in a diagnostic and/or prognostic artificial-intelligence software system of the rule-based inference-engine type. The equations define a set of metrics that characterize the evaluation of a rule when data required for the antecedence clause(s) of the rule are missing. The metrics include a primary measure denoted the rule completeness metric (RCM) plus a number of subsidiary measures that contribute to the RCM. The RCM is derived from an analysis of a rule with respect to its truth and a measure of the completeness of its input data. The derivation is such that the truth value of an antecedent is independent of the measure of its completeness. The RCM can be used to compare the degree of completeness of two or more rules with respect to a given set of data. Hence, the RCM can be used as a guide to choosing among rules during the rule-selection phase of operation of the artificial-intelligence system..
A rule-based system for real-time analysis of control systems
NASA Astrophysics Data System (ADS)
Larson, Richard R.; Millard, D. Edward
1992-10-01
An approach to automate the real-time analysis of flight critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic of real-time data. This technique has been applied for ground verification and validation testing and flight testing monitoring where quick, real-time, safety-of-flight decisions can be very critical. In many cases post processing and manual analysis of flight system data are not required. The processing is described of real-time data for analysis along with the output format which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.
Model-Based Anomaly Detection for a Transparent Optical Transmission System
NASA Astrophysics Data System (ADS)
Bengtsson, Thomas; Salamon, Todd; Ho, Tin Kam; White, Christopher A.
In this chapter, we present an approach for anomaly detection at the physical layer of networks where detailed knowledge about the devices and their operations is available. The approach combines physics-based process models with observational data models to characterize the uncertainties and derive the alarm decision rules. We formulate and apply three different methods based on this approach for a well-defined problem in optical network monitoring that features many typical challenges for this methodology. Specifically, we address the problem of monitoring optically transparent transmission systems that use dynamically controlled Raman amplification systems. We use models of amplifier physics together with statistical estimation to derive alarm decision rules and use these rules to automatically discriminate between measurement errors, anomalous losses, and pump failures. Our approach has led to an efficient tool for systematically detecting anomalies in the system behavior of a deployed network, where pro-active measures to address such anomalies are key to preventing unnecessary disturbances to the system's continuous operation.
A rule-based system for real-time analysis of control systems
NASA Technical Reports Server (NTRS)
Larson, Richard R.; Millard, D. Edward
1992-01-01
An approach to automate the real-time analysis of flight critical health monitoring and system status is being developed and evaluated at the NASA Dryden Flight Research Facility. A software package was developed in-house and installed as part of the extended aircraft interrogation and display system. This design features a knowledge-base structure in the form of rules to formulate interpretation and decision logic of real-time data. This technique has been applied for ground verification and validation testing and flight testing monitoring where quick, real-time, safety-of-flight decisions can be very critical. In many cases post processing and manual analysis of flight system data are not required. The processing is described of real-time data for analysis along with the output format which features a message stack display. The development, construction, and testing of the rule-driven knowledge base, along with an application using the X-31A flight test program, are presented.
META II Complex Systems Design and Analysis (CODA)
2011-08-01
37 3.8.7 Variables, Parameters and Constraints ............................................................. 37 3.8.8 Objective...18 Figure 7: Inputs, States, Outputs and Parameters of System Requirements Specifications ......... 19...Design Rule Based on Device Parameter ....................................................... 57 Figure 35: AEE Device Design Rules (excerpt
NASA Astrophysics Data System (ADS)
Imada, Keita; Nakamura, Katsuhiko
This paper describes recent improvements to Synapse system for incremental learning of general context-free grammars (CFGs) and definite clause grammars (DCGs) from positive and negative sample strings. An important feature of our approach is incremental learning, which is realized by a rule generation mechanism called “bridging” based on bottom-up parsing for positive samples and the search for rule sets. The sizes of rule sets and the computation time depend on the search strategies. In addition to the global search for synthesizing minimal rule sets and serial search, another method for synthesizing semi-optimum rule sets, we incorporate beam search to the system for synthesizing semi-minimal rule sets. The paper shows several experimental results on learning CFGs and DCGs, and we analyze the sizes of rule sets and the computation time.
Jabez Christopher, J; Khanna Nehemiah, H; Kannan, A
2015-10-01
Allergic Rhinitis is a universal common disease, especially in populated cities and urban areas. Diagnosis and treatment of Allergic Rhinitis will improve the quality of life of allergic patients. Though skin tests remain the gold standard test for diagnosis of allergic disorders, clinical experts are required for accurate interpretation of test outcomes. This work presents a clinical decision support system (CDSS) to assist junior clinicians in the diagnosis of Allergic Rhinitis. Intradermal Skin tests were performed on patients who had plausible allergic symptoms. Based on patient׳s history, 40 clinically relevant allergens were tested. 872 patients who had allergic symptoms were considered for this study. The rule based classification approach and the clinical test results were used to develop and validate the CDSS. Clinical relevance of the CDSS was compared with the Score for Allergic Rhinitis (SFAR). Tests were conducted for junior clinicians to assess their diagnostic capability in the absence of an expert. The class based Association rule generation approach provides a concise set of rules that is further validated by clinical experts. The interpretations of the experts are considered as the gold standard. The CDSS diagnoses the presence or absence of rhinitis with an accuracy of 88.31%. The allergy specialist and the junior clinicians prefer the rule based approach for its comprehendible knowledge model. The Clinical Decision Support Systems with rule based classification approach assists junior doctors and clinicians in the diagnosis of Allergic Rhinitis to make reliable decisions based on the reports of intradermal skin tests. Copyright © 2015 Elsevier Ltd. All rights reserved.
ERIC Educational Resources Information Center
Floryan, Mark
2013-01-01
This dissertation presents a novel effort to develop ITS technologies that adapt by observing student behavior. In particular, we define an evolving expert knowledge base (EEKB) that structures a domain's information as a set of nodes and the relationships that exist between those nodes. The structure of this model is not the particularly novel…
Heddam, Salim
2014-01-01
In this study, we present application of an artificial intelligence (AI) technique model called dynamic evolving neural-fuzzy inference system (DENFIS) based on an evolving clustering method (ECM), for modelling dissolved oxygen concentration in a river. To demonstrate the forecasting capability of DENFIS, a one year period from 1 January 2009 to 30 December 2009, of hourly experimental water quality data collected by the United States Geological Survey (USGS Station No: 420853121505500) station at Klamath River at Miller Island Boat Ramp, OR, USA, were used for model development. Two DENFIS-based models are presented and compared. The two DENFIS systems are: (1) offline-based system named DENFIS-OF, and (2) online-based system, named DENFIS-ON. The input variables used for the two models are water pH, temperature, specific conductance, and sensor depth. The performances of the models are evaluated using root mean square errors (RMSE), mean absolute error (MAE), Willmott index of agreement (d) and correlation coefficient (CC) statistics. The lowest root mean square error and highest correlation coefficient values were obtained with the DENFIS-ON method. The results obtained with DENFIS models are compared with linear (multiple linear regression, MLR) and nonlinear (multi-layer perceptron neural networks, MLPNN) methods. This study demonstrates that DENFIS-ON investigated herein outperforms all the proposed techniques for DO modelling.
Style-independent document labeling: design and performance evaluation
NASA Astrophysics Data System (ADS)
Mao, Song; Kim, Jong Woo; Thoma, George R.
2003-12-01
The Medical Article Records System or MARS has been developed at the U.S. National Library of Medicine (NLM) for automated data entry of bibliographical information from medical journals into MEDLINE, the premier bibliographic citation database at NLM. Currently, a rule-based algorithm (called ZoneCzar) is used for labeling important bibliographical fields (title, author, affiliation, and abstract) on medical journal article page images. While rules have been created for medical journals with regular layout types, new rules have to be manually created for any input journals with arbitrary or new layout types. Therefore, it is of interest to label any journal articles independent of their layout styles. In this paper, we first describe a system (called ZoneMatch) for automated generation of crucial geometric and non-geometric features of important bibliographical fields based on string-matching and clustering techniques. The rule based algorithm is then modified to use these features to perform style-independent labeling. We then describe a performance evaluation method for quantitatively evaluating our algorithm and characterizing its error distributions. Experimental results show that the labeling performance of the rule-based algorithm is significantly improved when the generated features are used.
Investigating Mesoscale Convective Systems and their Predictability Using Machine Learning
NASA Astrophysics Data System (ADS)
Daher, H.; Duffy, D.; Bowen, M. K.
2016-12-01
A mesoscale convective system (MCS) is a thunderstorm region that lasts several hours long and forms near weather fronts and can often develop into tornadoes. Here we seek to answer the question of whether these tornadoes are "predictable" by looking for a defining characteristic(s) separating MCSs that evolve into tornadoes versus those that do not. Using NASA's Modern Era Retrospective-analysis for Research and Applications 2 reanalysis data (M2R12K), we apply several state of the art machine learning techniques to investigate this question. The spatial region examined in this experiment is Tornado Alley in the United States over the peak tornado months. A database containing select variables from M2R12K is created using PostgreSQL. This database is then analyzed using machine learning methods such as Symbolic Aggregate approXimation (SAX) and DBSCAN (an unsupervised density-based data clustering algorithm). The incentive behind using these methods is to mathematically define a MCS so that association rule mining techniques can be used to uncover some sort of signal or teleconnection that will help us forecast which MCSs will result in tornadoes and therefore give society more time to prepare and in turn reduce casualties and destruction.
The load shedding advisor: An example of a crisis-response expert system
NASA Technical Reports Server (NTRS)
Bollinger, Terry B.; Lightner, Eric; Laverty, John; Ambrose, Edward
1987-01-01
A Prolog-based prototype expert system is described that was implemented by the Network Operations Branch of the NASA Goddard Space Flight Center. The purpose of the prototype was to test whether a small, inexpensive computer system could be used to host a load shedding advisor, a system which would monitor major physical environment parameters in a computer facility, then recommend appropriate operator reponses whenever a serious condition was detected. The resulting prototype performed significantly to efficiency gains achieved by replacing a purely rule-based design methodology with a hybrid approach that combined procedural, entity-relationship, and rule-based methods.
An on-line expert system for diagnosing environmentally induced spacecraft anomalies using CLIPS
NASA Technical Reports Server (NTRS)
Lauriente, Michael; Rolincik, Mark; Koons, Harry C; Gorney, David
1993-01-01
A new rule-based, expert system for diagnosing spacecraft anomalies is under development. The knowledge base consists of over two-hundred rules and provide links to historical and environmental databases. Environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. The use of heuristics frees the user from searching through large amounts of irrelevant information (varying degrees of confidence in an answer) or 'unknown' to any question. The expert system not only provides scientists with needed risk analysis and confidence estimates not available in standard numerical models or databases, but it is also an effective learning tool. In addition, the architecture of the expert system allows easy additions to the knowledge base and the database. For example, new frames concerning orbital debris and ionospheric scintillation are being considered. The system currently runs on a MicroVAX and uses the C Language Integrated Production System (CLIPS).
A fuzzy hill-climbing algorithm for the development of a compact associative classifier
NASA Astrophysics Data System (ADS)
Mitra, Soumyaroop; Lam, Sarah S.
2012-02-01
Classification, a data mining technique, has widespread applications including medical diagnosis, targeted marketing, and others. Knowledge discovery from databases in the form of association rules is one of the important data mining tasks. An integrated approach, classification based on association rules, has drawn the attention of the data mining community over the last decade. While attention has been mainly focused on increasing classifier accuracies, not much efforts have been devoted towards building interpretable and less complex models. This paper discusses the development of a compact associative classification model using a hill-climbing approach and fuzzy sets. The proposed methodology builds the rule-base by selecting rules which contribute towards increasing training accuracy, thus balancing classification accuracy with the number of classification association rules. The results indicated that the proposed associative classification model can achieve competitive accuracies on benchmark datasets with continuous attributes and lend better interpretability, when compared with other rule-based systems.
Lack of cross-scale linkages reduces robustness of community-based fisheries management.
Cudney-Bueno, Richard; Basurto, Xavier
2009-07-16
Community-based management and the establishment of marine reserves have been advocated worldwide as means to overcome overexploitation of fisheries. Yet, researchers and managers are divided regarding the effectiveness of these measures. The "tragedy of the commons" model is often accepted as a universal paradigm, which assumes that unless managed by the State or privatized, common-pool resources are inevitably overexploited due to conflicts between the self-interest of individuals and the goals of a group as a whole. Under this paradigm, the emergence and maintenance of effective community-based efforts that include cooperative risky decisions as the establishment of marine reserves could not occur. In this paper, we question these assumptions and show that outcomes of commons dilemmas can be complex and scale-dependent. We studied the evolution and effectiveness of a community-based management effort to establish, monitor, and enforce a marine reserve network in the Gulf of California, Mexico. Our findings build on social and ecological research before (1997-2001), during (2002) and after (2003-2004) the establishment of marine reserves, which included participant observation in >100 fishing trips and meetings, interviews, as well as fishery dependent and independent monitoring. We found that locally crafted and enforced harvesting rules led to a rapid increase in resource abundance. Nevertheless, news about this increase spread quickly at a regional scale, resulting in poaching from outsiders and a subsequent rapid cascading effect on fishing resources and locally-designed rule compliance. We show that cooperation for management of common-pool fisheries, in which marine reserves form a core component of the system, can emerge, evolve rapidly, and be effective at a local scale even in recently organized fisheries. Stakeholder participation in monitoring, where there is a rapid feedback of the systems response, can play a key role in reinforcing cooperation. However, without cross-scale linkages with higher levels of governance, increase of local fishery stocks may attract outsiders who, if not restricted, will overharvest and threaten local governance. Fishers and fishing communities require incentives to maintain their management efforts. Rewarding local effective management with formal cross-scale governance recognition and support can generate these incentives.
Lack of Cross-Scale Linkages Reduces Robustness of Community-Based Fisheries Management
Cudney-Bueno, Richard; Basurto, Xavier
2009-01-01
Community-based management and the establishment of marine reserves have been advocated worldwide as means to overcome overexploitation of fisheries. Yet, researchers and managers are divided regarding the effectiveness of these measures. The “tragedy of the commons” model is often accepted as a universal paradigm, which assumes that unless managed by the State or privatized, common-pool resources are inevitably overexploited due to conflicts between the self-interest of individuals and the goals of a group as a whole. Under this paradigm, the emergence and maintenance of effective community-based efforts that include cooperative risky decisions as the establishment of marine reserves could not occur. In this paper, we question these assumptions and show that outcomes of commons dilemmas can be complex and scale-dependent. We studied the evolution and effectiveness of a community-based management effort to establish, monitor, and enforce a marine reserve network in the Gulf of California, Mexico. Our findings build on social and ecological research before (1997–2001), during (2002) and after (2003–2004) the establishment of marine reserves, which included participant observation in >100 fishing trips and meetings, interviews, as well as fishery dependent and independent monitoring. We found that locally crafted and enforced harvesting rules led to a rapid increase in resource abundance. Nevertheless, news about this increase spread quickly at a regional scale, resulting in poaching from outsiders and a subsequent rapid cascading effect on fishing resources and locally-designed rule compliance. We show that cooperation for management of common-pool fisheries, in which marine reserves form a core component of the system, can emerge, evolve rapidly, and be effective at a local scale even in recently organized fisheries. Stakeholder participation in monitoring, where there is a rapid feedback of the systems response, can play a key role in reinforcing cooperation. However, without cross-scale linkages with higher levels of governance, increase of local fishery stocks may attract outsiders who, if not restricted, will overharvest and threaten local governance. Fishers and fishing communities require incentives to maintain their management efforts. Rewarding local effective management with formal cross-scale governance recognition and support can generate these incentives. PMID:19606210
Evolvable social agents for bacterial systems modeling.
Paton, Ray; Gregory, Richard; Vlachos, Costas; Saunders, Jon; Wu, Henry
2004-09-01
We present two approaches to the individual-based modeling (IbM) of bacterial ecologies and evolution using computational tools. The IbM approach is introduced, and its important complementary role to biosystems modeling is discussed. A fine-grained model of bacterial evolution is then presented that is based on networks of interactivity between computational objects representing genes and proteins. This is followed by a coarser grained agent-based model, which is designed to explore the evolvability of adaptive behavioral strategies in artificial bacteria represented by learning classifier systems. The structure and implementation of the two proposed individual-based bacterial models are discussed, and some results from simulation experiments are presented, illustrating their adaptive properties.
Evidence flow graph methods for validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant
1988-01-01
This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could effect the output from a set of rules.
NASA Technical Reports Server (NTRS)
Starks, Scott; Abdel-Hafeez, Saleh; Usevitch, Bryan
1997-01-01
This paper discusses the implementation of a fuzzy logic system using an ASICs design approach. The approach is based upon combining the inherent advantages of symmetric triangular membership functions and fuzzy singleton sets to obtain a novel structure for fuzzy logic system application development. The resulting structure utilizes a fuzzy static RAM to store the rule-base and the end-points of the triangular membership functions. This provides advantages over other approaches in which all sampled values of membership functions for all universes must be stored. The fuzzy coprocessor structure implements the fuzzification and defuzzification processes through a two-stage parallel pipeline architecture which is capable of executing complex fuzzy computations in less than 0.55us with an accuracy of more than 95%, thus making it suitable for a wide range of applications. Using the approach presented in this paper, a fuzzy logic rule-base can be directly downloaded via a host processor to an onchip rule-base memory with a size of 64 words. The fuzzy coprocessor's design supports up to 49 rules for seven fuzzy membership functions associated with each of the chip's two input variables. This feature allows designers to create fuzzy logic systems without the need for additional on-board memory. Finally, the paper reports on simulation studies that were conducted for several adaptive filter applications using the least mean squared adaptive algorithm for adjusting the knowledge rule-base.
Three CLIPS-based expert systems for solving engineering problems
NASA Technical Reports Server (NTRS)
Parkinson, W. J.; Luger, G. F.; Bretz, R. E.
1990-01-01
We have written three expert systems, using the CLIPS PC-based expert system shell. These three expert systems are rule based and are relatively small, with the largest containing slightly less than 200 rules. The first expert system is an expert assistant that was written to help users of the ASPEN computer code choose the proper thermodynamic package to use with their particular vapor-liquid equilibrium problem. The second expert system was designed to help petroleum engineers choose the proper enhanced oil recovery method to be used with a given reservoir. The effectiveness of each technique is highly dependent upon the reservoir conditions. The third expert system is a combination consultant and control system. This system was designed specifically for silicon carbide whisker growth. Silicon carbide whiskers are an extremely strong product used to make ceramic and metal composites. The manufacture of whiskers is a very complicated process. which to date. has defied a good mathematical model. The process was run by experts who had gained their expertise by trial and error. A system of rules was devised by these experts both for procedure setup and for the process control. In this paper we discuss the three problem areas of the design, development and evaluation of the CLIPS-based programs.
On Inference Rules of Logic-Based Information Retrieval Systems.
ERIC Educational Resources Information Center
Chen, Patrick Shicheng
1994-01-01
Discussion of relevance and the needs of the users in information retrieval focuses on a deductive object-oriented approach and suggests eight inference rules for the deduction. Highlights include characteristics of a deductive object-oriented system, database and data modeling language, implementation, and user interface. (Contains 24…
Knowledge-based approach to video content classification
NASA Astrophysics Data System (ADS)
Chen, Yu; Wong, Edward K.
2001-01-01
A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic contents, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand-sides of rules contain high level and low level features, while the right-hand-sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather, reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method for combining evidences, and to handle the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, and it demonstrated the validity of the proposed approach.
Knowledge-based approach to video content classification
NASA Astrophysics Data System (ADS)
Chen, Yu; Wong, Edward K.
2000-12-01
A framework for video content classification using a knowledge-based approach is herein proposed. This approach is motivated by the fact that videos are rich in semantic contents, which can best be interpreted and analyzed by human experts. We demonstrate the concept by implementing a prototype video classification system using the rule-based programming language CLIPS 6.05. Knowledge for video classification is encoded as a set of rules in the rule base. The left-hand-sides of rules contain high level and low level features, while the right-hand-sides of rules contain intermediate results or conclusions. Our current implementation includes features computed from motion, color, and text extracted from video frames. Our current rule set allows us to classify input video into one of five classes: news, weather, reporting, commercial, basketball and football. We use MYCIN's inexact reasoning method for combining evidences, and to handle the uncertainties in the features and in the classification results. We obtained good results in a preliminary experiment, and it demonstrated the validity of the proposed approach.
Federal Register 2010, 2011, 2012, 2013, 2014
2011-04-04
...) of that Rule. This tie-breaker resolves price disputes based on minimizing order imbalances. In other... Opening Cross, the system will choose that price which minimizes the order imbalance remaining if the... opening cross. NASDAQ initially adopted the imbalance-based tie-breaker based upon its successful use in...
Modelling of bio-morphodynamics in braided rivers: applications to the Waitaki river (New Zealand)
NASA Astrophysics Data System (ADS)
Stecca, G.; Zolezzi, G.; Hicks, M.; Measures, R.; Bertoldi, W.
2016-12-01
The planform shape of rivers results from the complex interaction between flow, sediment transport and vegetation processes, and can evolve in time following a change in these controls. The braided planform of the lower Waitaki (New Zealand), for instance, is endangered by the action of artificially-introduced alien vegetation, which spread after the reduction in magnitude of floods following hydropower dam construction. These processes, by favouring the flow concentration into the main channel, would likely promote a shift towards single thread morphology if vegetation was not artificially removed within a central fairway. The purpose of this work is to address the future evolution of these river systems under different management scenarios through two-dimensional numerical modelling. The construction of a suitable model represents a task in itself, since a modelling framework coupling all the relevant processes is not straightforwardly available at present. Our starting point is the GIAMT2D numerical model, solving two-dimensional flow and bedload transport in wet/dry domains, and recently modified by the inclusion of a rule-based bank erosion model. We further develop this model by adding a vegetation module, which accounts in a simplified manner for time-evolving biomass density, and tweaks the local flow roughness, critical shear stress for sediment transport and bank erodibility accordingly. We plan to apply the model to address the decadal-scale evolution of one reach in the Waitaki river, comparing different management scenarios for vegetation control.
Integration of a knowledge-based system and a clinical documentation system via a data dictionary.
Eich, H P; Ohmann, C; Keim, E; Lang, K
1997-01-01
This paper describes the design and realisation of a knowledge-based system and a clinical documentation system linked via a data dictionary. The software was developed as a shell with object oriented methods and C++ for IBM-compatible PC's and WINDOWS 3.1/95. The data dictionary covers terminology and document objects with relations to external classifications. It controls the terminology in the documentation program with form-based entry of clinical documents and in the knowledge-based system with scores and rules. The software was applied to the clinical field of acute abdominal pain by implementing a data dictionary with 580 terminology objects, 501 document objects, and 2136 links; a documentation module with 8 clinical documents and a knowledge-based system with 10 scores and 7 sets of rules.
Hierarchical graphs for better annotations of rule-based models of biochemical systems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hu, Bin; Hlavacek, William
2009-01-01
In the graph-based formalism of the BioNetGen language (BNGL), graphs are used to represent molecules, with a colored vertex representing a component of a molecule, a vertex label representing the internal state of a component, and an edge representing a bond between components. Components of a molecule share the same color. Furthermore, graph-rewriting rules are used to represent molecular interactions, with a rule that specifies addition (removal) of an edge representing a class of association (dissociation) reactions and with a rule that specifies a change of vertex label representing a class of reactions that affect the internal state of amore » molecular component. A set of rules comprises a mathematical/computational model that can be used to determine, through various means, the system-level dynamics of molecular interactions in a biochemical system. Here, for purposes of model annotation, we propose an extension of BNGL that involves the use of hierarchical graphs to represent (1) relationships among components and subcomponents of molecules and (2) relationships among classes of reactions defined by rules. We illustrate how hierarchical graphs can be used to naturally document the structural organization of the functional components and subcomponents of two proteins: the protein tyrosine kinase Lck and the T cell receptor (TCR)/CD3 complex. Likewise, we illustrate how hierarchical graphs can be used to document the similarity of two related rules for kinase-catalyzed phosphorylation of a protein substrate. We also demonstrate how a hierarchical graph representing a protein can be encoded in an XML-based format.« less
Hierarchical graphs for rule-based modeling of biochemical systems
2011-01-01
Background In rule-based modeling, graphs are used to represent molecules: a colored vertex represents a component of a molecule, a vertex attribute represents the internal state of a component, and an edge represents a bond between components. Components of a molecule share the same color. Furthermore, graph-rewriting rules are used to represent molecular interactions. A rule that specifies addition (removal) of an edge represents a class of association (dissociation) reactions, and a rule that specifies a change of a vertex attribute represents a class of reactions that affect the internal state of a molecular component. A set of rules comprises an executable model that can be used to determine, through various means, the system-level dynamics of molecular interactions in a biochemical system. Results For purposes of model annotation, we propose the use of hierarchical graphs to represent structural relationships among components and subcomponents of molecules. We illustrate how hierarchical graphs can be used to naturally document the structural organization of the functional components and subcomponents of two proteins: the protein tyrosine kinase Lck and the T cell receptor (TCR) complex. We also show that computational methods developed for regular graphs can be applied to hierarchical graphs. In particular, we describe a generalization of Nauty, a graph isomorphism and canonical labeling algorithm. The generalized version of the Nauty procedure, which we call HNauty, can be used to assign canonical labels to hierarchical graphs or more generally to graphs with multiple edge types. The difference between the Nauty and HNauty procedures is minor, but for completeness, we provide an explanation of the entire HNauty algorithm. Conclusions Hierarchical graphs provide more intuitive formal representations of proteins and other structured molecules with multiple functional components than do the regular graphs of current languages for specifying rule-based models, such as the BioNetGen language (BNGL). Thus, the proposed use of hierarchical graphs should promote clarity and better understanding of rule-based models. PMID:21288338
Driven fragmentation of granular gases.
Cruz Hidalgo, Raúl; Pagonabarraga, Ignacio
2008-06-01
The dynamics of homogeneously heated granular gases which fragment due to particle collisions is analyzed. We introduce a kinetic model which accounts for correlations induced at the grain collisions and analyze both the kinetics and relevant distribution functions these systems develop. The work combines analytical and numerical studies based on direct simulation Monte Carlo calculations. A broad family of fragmentation probabilities is considered, and its implications for the system kinetics are discussed. We show that generically these driven materials evolve asymptotically into a dynamical scaling regime. If the fragmentation probability tends to a constant, the grain number diverges at a finite time, leading to a shattering singularity. If the fragmentation probability vanishes, then the number of grains grows monotonously as a power law. We consider different homogeneous thermostats and show that the kinetics of these systems depends weakly on both the grain inelasticity and driving. We observe that fragmentation plays a relevant role in the shape of the velocity distribution of the particles. When the fragmentation is driven by local stochastic events, the long velocity tail is essentially exponential independently of the heating frequency and the breaking rule. However, for a Lowe-Andersen thermostat, numerical evidence strongly supports the conjecture that the scaled velocity distribution follows a generalized exponential behavior f(c) approximately exp(-cn) , with n approximately 1.2 , regarding less the fragmentation mechanisms.
The Comet Cometh: Evolving Developmental Systems.
Jaeger, Johannes; Laubichler, Manfred; Callebaut, Werner
In a recent opinion piece, Denis Duboule has claimed that the increasing shift towards systems biology is driving evolutionary and developmental biology apart, and that a true reunification of these two disciplines within the framework of evolutionary developmental biology (EvoDevo) may easily take another 100 years. He identifies methodological, epistemological, and social differences as causes for this supposed separation. Our article provides a contrasting view. We argue that Duboule's prediction is based on a one-sided understanding of systems biology as a science that is only interested in functional, not evolutionary, aspects of biological processes. Instead, we propose a research program for an evolutionary systems biology, which is based on local exploration of the configuration space in evolving developmental systems. We call this approach-which is based on reverse engineering, simulation, and mathematical analysis-the natural history of configuration space. We discuss a number of illustrative examples that demonstrate the past success of local exploration, as opposed to global mapping, in different biological contexts. We argue that this pragmatic mode of inquiry can be extended and applied to the mathematical analysis of the developmental repertoire and evolutionary potential of evolving developmental mechanisms and that evolutionary systems biology so conceived provides a pragmatic epistemological framework for the EvoDevo synthesis.
NASA Astrophysics Data System (ADS)
Villaver, E.; Niedzielski, A.; Wolszczan, A.; Nowak, G.; Kowalik, K.; Adamów, M.; Maciejewski, G.; Deka-Szymankiewicz, B.; Maldonado, J.
2017-10-01
Context. Evolved stars with planets are crucial to understanding the dependency of the planet formation mechanism on the mass and metallicity of the parent star and to studying star-planet interactions. Aims: We present two evolved stars (HD 103485 and BD+03 2562) from the Tracking Advanced PlAnetary Systems (TAPAS) with HARPS-N project devoted to RV precision measurements of identified candidates within the PennState - Toruń Centre for Astronomy Planet Search. Methods: The paper is based on precise radial velocity (RV) measurements. For HD 103485 we collected 57 epochs over 3317 days with the Hobby-Eberly Telescope (HET) and its high-resolution spectrograph and 18 ultra-precise HARPS-N data over 919 days. For BD+03 2562 we collected 46 epochs of HET data over 3380 days and 19 epochs of HARPS-N data over 919 days. Results: We present the analysis of the data and the search for correlations between the RV signal and stellar activity, stellar rotation, and photometric variability. Based on the available data, we interpret the RV variations measured in both stars as Keplerian motion. Both stars have masses close to Solar (1.11 M⊙ HD 103485 and 1.14 M⊙ BD+03 2562), very low metallicities ([Fe/H] = - 0.50 and - 0.71 for HD 103485 and BD+03 2562), and both have Jupiter planetary mass companions (m2sini = 7 and 6.4 MJ for HD 103485 and BD+03 2562 resp.) in close to terrestrial orbits (1.4 au HD 103485 and 1.3 au BD+03 2562) with moderate eccentricities (e = 0.34 and 0.2 for HD 103485 and BD+03 2562). However, we cannot totally rule-out the possibility that the signal in the case of HD 103485 is due to rotational modulation of active regions. Conclusions: Based on the current data, we conclude that BD+03 2562 has a bona fide planetary companion while for HD 103485 we cannot totally exclude the possibility that the best explanation for the RV signal modulations is not the existence of a planet but stellar activity. If the interpretation remains that both stars have planetary companions, they represent systems orbiting very evolved stars with very low metallicities, a challenge to the conditions required for the formation of massive giant gas planets. Based on observations obtained with the Hobby-Eberly Telescope, which is a joint project of the University of Texas at Austin, the Pennsylvania State University, Stanford University, Ludwig-Maximilians-Universität München, and Georg-August-Universität Göttingen.Based on observations made with the Italian Telescopio Nazionale Galileo (TNG) operated on the island of La Palma by the Fundación Galileo Galilei of the INAF (Istituto Nazionale di Astrofisica) at the Spanish Observatorio del Roque de los Muchachos of the Instituto de Astrofísica de Canarias.
Engineering monitoring expert system's developer
NASA Technical Reports Server (NTRS)
Lo, Ching F.
1991-01-01
This research project is designed to apply artificial intelligence technology including expert systems, dynamic interface of neural networks, and hypertext to construct an expert system developer. The developer environment is specifically suited to building expert systems which monitor the performance of ground support equipment for propulsion systems and testing facilities. The expert system developer, through the use of a graphics interface and a rule network, will be transparent to the user during rule constructing and data scanning of the knowledge base. The project will result in a software system that allows its user to build specific monitoring type expert systems which monitor various equipments used for propulsion systems or ground testing facilities and accrues system performance information in a dynamic knowledge base.
Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications.
Patou, François; AlZahra'a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E
2016-09-03
The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods.
Evolvable Smartphone-Based Platforms for Point-of-Care In-Vitro Diagnostics Applications
Patou, François; AlZahra’a Alatraktchi, Fatima; Kjægaard, Claus; Dimaki, Maria; Madsen, Jan; Svendsen, Winnie E.
2016-01-01
The association of smart mobile devices and lab-on-chip technologies offers unprecedented opportunities for the emergence of direct-to-consumer in vitro medical diagnostics applications. Despite their clear transformative potential, obstacles remain to the large-scale disruption and long-lasting success of these systems in the consumer market. For instance, the increasing level of complexity of instrumented lab-on-chip devices, coupled to the sporadic nature of point-of-care testing, threatens the viability of a business model mainly relying on disposable/consumable lab-on-chips. We argued recently that system evolvability, defined as the design characteristic that facilitates more manageable transitions between system generations via the modification of an inherited design, can help remedy these limitations. In this paper, we discuss how platform-based design can constitute a formal entry point to the design and implementation of evolvable smart device/lab-on-chip systems. We present both a hardware/software design framework and the implementation details of a platform prototype enabling at this stage the interfacing of several lab-on-chip variants relying on current- or impedance-based biosensors. Our findings suggest that several change-enabling mechanisms implemented in the higher abstraction software layers of the system can promote evolvability, together with the design of change-absorbing hardware/software interfaces. Our platform architecture is based on a mobile software application programming interface coupled to a modular hardware accessory. It allows the specification of lab-on-chip operation and post-analytic functions at the mobile software layer. We demonstrate its potential by operating a simple lab-on-chip to carry out the detection of dopamine using various electroanalytical methods. PMID:27598208
Warren, Carol
This paper concerns resource governance in a remote Balinese coastal community, which faces severe environmental challenges due to overexploitation and habitat destruction. It explores some of the issues raised in 'social capital' debates regarding leadership and public participation toward sustainable natural resource governance. Given the strength of Balinese customary law and the high degree of participation required in the ritual-social domain, Bali represents a model context for examining these issues. Through a case study of destructive resource exploitation and evolving rules-in-use, this paper analyses the ambiguous role of 'bonding' social capital and the complexities of negotiating collective action on environmental problems where conflicting interests and dense social ties make local action difficult. The paper finds that a more complex appreciation of vertical (authority) and horizontal (solidarity) relationships between leaders and ordinary villagers is required, and that a more nuanced institutional bricolage and exploratory scenario approach to analysis of evolving rules in use would enhance associated policy interventions.
Guillon, Myrtille; Mace, Ruth
2016-01-01
The classification of kin into structured groups is a diverse phenomenon which is ubiquitous in human culture. For populations which are organized into large agropastoral groupings of sedentary residence but not governed within the context of a centralised state, such as our study sample of 83 historical Bantu-speaking groups of sub-Saharan Africa, cultural kinship norms guide all aspects of everyday life and social organization. Such rules operate in part through the use of differing terminological referential systems of familial organization. Although the cross-cultural study of kinship terminology was foundational in Anthropology, few modern studies have made use of statistical advances to further our sparse understanding of the structuring and diversification of terminological systems of kinship over time. In this study we use Bayesian Markov Chain Monte Carlo methods of phylogenetic comparison to investigate the evolution of Bantu kinship terminology and reconstruct the ancestral state and diversification of cousin terminology in this family of sub-Saharan ethnolinguistic groups. Using a phylogenetic tree of Bantu languages, we then test the prominent hypothesis that structured variation in systems of cousin terminology has co-evolved alongside adaptive change in patterns of descent organization, as well as rules of residence. We find limited support for this hypothesis, and argue that the shaping of systems of kinship terminology is a multifactorial process, concluding with possible avenues of future research. PMID:27008364
Thermal Control Technologies for Complex Spacecraft
NASA Technical Reports Server (NTRS)
Swanson, Theodore D.
2004-01-01
Thermal control is a generic need for all spacecraft. In response to ever more demanding science and exploration requirements, spacecraft are becoming ever more complex, and hence their thermal control systems must evolve. This paper briefly discusses the process of technology development, the state-of-the-art in thermal control, recent experiences with on-orbit two-phase systems, and the emerging thermal control technologies to meet these evolving needs. Some "lessons learned" based on experience with on-orbit systems are also presented.
Retrosynthetic Reaction Prediction Using Neural Sequence-to-Sequence Models
2017-01-01
We describe a fully data driven model that learns to perform a retrosynthetic reaction prediction task, which is treated as a sequence-to-sequence mapping problem. The end-to-end trained model has an encoder–decoder architecture that consists of two recurrent neural networks, which has previously shown great success in solving other sequence-to-sequence prediction tasks such as machine translation. The model is trained on 50,000 experimental reaction examples from the United States patent literature, which span 10 broad reaction types that are commonly used by medicinal chemists. We find that our model performs comparably with a rule-based expert system baseline model, and also overcomes certain limitations associated with rule-based expert systems and with any machine learning approach that contains a rule-based expert system component. Our model provides an important first step toward solving the challenging problem of computational retrosynthetic analysis. PMID:29104927
Rule Based System for Medicine Inventory Control Using Radio Frequency Identification (RFID)
NASA Astrophysics Data System (ADS)
Nugraha, Joanna Ardhyanti Mita; Suryono; Suseno, dan Jatmiko Endro
2018-02-01
Rule based system is very efficient to ensure stock of drug to remain available by utilizing Radio Frequency Identification (RFID) as input means automatically. This method can ensure the stock of drugs to remain available by analyzing the needs of drug users. The research data was the amount of drug usage in hospital for 1 year. The data was processed by using ABC classification to determine the drug with fast, medium and slow movement. In each classification result, rule based algorithm was given for determination of safety stock and Reorder Point (ROP). This research yielded safety stock and ROP values that vary depending on the class of each drug. Validation is done by comparing the calculation of safety stock and reorder point both manually and by system, then, it was found that the mean deviation value at safety stock was 0,03 and and ROP was 0,08.
Evolving from Course-Centric to Learning-Centric: Portfolios, Wikis, and Social Learning
ERIC Educational Resources Information Center
Everhart, Deborah
2006-01-01
Teaching and learning strategies for using course management systems have evolved from basic "fill in the blank" models to interactive designs that encourage multi-formatted individual contributions and collaborative forms of learning. In keeping with the participatory development of online resources, web-based courses are shifting from…
Impact of Operating Rules on Planning Capacity Expansion of Urban Water Supply Systems
NASA Astrophysics Data System (ADS)
de Neufville, R.; Galelli, S.; Tian, X.
2017-12-01
This study addresses the impact of operating rules on capacity planning of urban water supply systems. The continuous growth of metropolitan areas represents a major challenge for water utilities, which often rely on industrial water supply (e.g., desalination, reclaimed water) to complement natural resources (e.g., reservoirs). These additional sources increase the reliability of supply, equipping operators with additional means to hedge against droughts. How do their rules for using industrial water supply impact the performance of water supply system? How might it affect long-term plans for capacity expansion? Possibly significantly, as demonstrated by the analysis of the operations and planning of a water supply system inspired by Singapore. Our analysis explores the system dynamics under multiple inflow and management scenarios to understand the extent to which alternative operating rules for the use of industrial water supply affect system performance. Results first show that these operating rules can have significant impact on the variability in system performance (e.g., reliability, energy use) comparable to that of hydro-climatological conditions. Further analyses of several capacity expansion exercises—based on our original hydrological and management scenarios—show that operating rules significantly affect the timing and magnitude of critical decisions, such as the construction of new desalination plants. These results have two implications: Capacity expansion analysis should consider the effect of a priori uncertainty about operating rules; and operators should consider how their flexibility in operating rules can affect their perceived need for capacity.
A personalized health-monitoring system for elderly by combining rules and case-based reasoning.
Ahmed, Mobyen Uddin
2015-01-01
Health-monitoring system for elderly in home environment is a promising solution to provide efficient medical services that increasingly interest by the researchers within this area. It is often more challenging when the system is self-served and functioning as personalized provision. This paper proposed a personalized self-served health-monitoring system for elderly in home environment by combining general rules with a case-based reasoning approach. Here, the system generates feedback, recommendation and alarm in a personalized manner based on elderly's medical information and health parameters such as blood pressure, blood glucose, weight, activity, pulse, etc. A set of general rules has used to classify individual health parameters. The case-based reasoning approach is used to combine all different health parameters, which generates an overall classification of health condition. According to the evaluation result considering 323 cases and k=2 i.e., top 2 most similar retrieved cases, the sensitivity, specificity and overall accuracy are achieved as 90%, 97% and 96% respectively. The preliminary result of the system is acceptable since the feedback; recommendation and alarm messages are personalized and differ from the general messages. Thus, this approach could be possibly adapted for other situations in personalized elderly monitoring.
The study on dynamic cadastral coding rules based on kinship relationship
NASA Astrophysics Data System (ADS)
Xu, Huan; Liu, Nan; Liu, Renyi; Lu, Jingfeng
2007-06-01
Cadastral coding rules are an important supplement to the existing national and local standard specifications for building cadastral database. After analyzing the course of cadastral change, especially the parcel change with the method of object-oriented analysis, a set of dynamic cadastral coding rules based on kinship relationship corresponding to the cadastral change is put forward and a coding format composed of street code, block code, father parcel code, child parcel code and grandchild parcel code is worked out within the county administrative area. The coding rule has been applied to the development of an urban cadastral information system called "ReGIS", which is not only able to figure out the cadastral code automatically according to both the type of parcel change and the coding rules, but also capable of checking out whether the code is spatiotemporally unique before the parcel is stored in the database. The system has been used in several cities of Zhejiang Province and got a favorable response. This verifies the feasibility and effectiveness of the coding rules to some extent.
Correcting groove error in gratings ruled on a 500-mm ruling engine using interferometric control.
Mi, Xiaotao; Yu, Haili; Yu, Hongzhu; Zhang, Shanwen; Li, Xiaotian; Yao, Xuefeng; Qi, Xiangdong; Bayinhedhig; Wan, Qiuhua
2017-07-20
Groove error is one of the most important factors affecting grating quality and spectral performance. To reduce groove error, we propose a new ruling-tool carriage system based on aerostatic guideways. We design a new blank carriage system with double piezoelectric actuators. We also propose a completely closed-loop servo-control system with a new optical measurement system that can control the position of the diamond relative to the blank. To evaluate our proposed methods, we produced several gratings, including an echelle grating with 79 grooves/mm, a grating with 768 grooves/mm, and a high-density grating with 6000 grooves/mm. The results show that our methods effectively reduce groove error in ruled gratings.
Optimal pattern distributions in Rete-based production systems
NASA Technical Reports Server (NTRS)
Scott, Stephen L.
1994-01-01
Since its introduction into the AI community in the early 1980's, the Rete algorithm has been widely used. This algorithm has formed the basis for many AI tools, including NASA's CLIPS. One drawback of Rete-based implementation, however, is that the network structures used internally by the Rete algorithm make it sensitive to the arrangement of individual patterns within rules. Thus while rules may be more or less arbitrarily placed within source files, the distribution of individual patterns within these rules can significantly affect the overall system performance. Some heuristics have been proposed to optimize pattern placement, however, these suggestions can be conflicting. This paper describes a systematic effort to measure the effect of pattern distribution on production system performance. An overview of the Rete algorithm is presented to provide context. A description of the methods used to explore the pattern ordering problem area are presented, using internal production system metrics such as the number of partial matches, and coarse-grained operating system data such as memory usage and time. The results of this study should be of interest to those developing and optimizing software for Rete-based production systems.
Patterson, Olga V; Forbush, Tyler B; Saini, Sameer D; Moser, Stephanie E; DuVall, Scott L
2015-01-01
In order to measure the level of utilization of colonoscopy procedures, identifying the primary indication for the procedure is required. Colonoscopies may be utilized not only for screening, but also for diagnostic or therapeutic purposes. To determine whether a colonoscopy was performed for screening, we created a natural language processing system to identify colonoscopy reports in the electronic medical record system and extract indications for the procedure. A rule-based model and three machine-learning models were created using 2,000 manually annotated clinical notes of patients cared for in the Department of Veterans Affairs. Performance of the models was measured and compared. Analysis of the models on a test set of 1,000 documents indicates that the rule-based system performance stays fairly constant as evaluated on training and testing sets. However, the machine learning model without feature selection showed significant decrease in performance. Therefore, rule-based classification system appears to be more robust than a machine-learning system in cases when no feature selection is performed.
Greek, Ray; Hansen, Lawrence A
2013-11-01
We surveyed the scientific literature regarding amyotrophic lateral sclerosis, the SOD1 mouse model, complex adaptive systems, evolution, drug development, animal models, and philosophy of science in an attempt to analyze the SOD1 mouse model of amyotrophic lateral sclerosis in the context of evolved complex adaptive systems. Humans and animals are examples of evolved complex adaptive systems. It is difficult to predict the outcome from perturbations to such systems because of the characteristics of complex systems. Modeling even one complex adaptive system in order to predict outcomes from perturbations is difficult. Predicting outcomes to one evolved complex adaptive system based on outcomes from a second, especially when the perturbation occurs at higher levels of organization, is even more problematic. Using animal models to predict human outcomes to perturbations such as disease and drugs should have a very low predictive value. We present empirical evidence confirming this and suggest a theory to explain this phenomenon. We analyze the SOD1 mouse model of amyotrophic lateral sclerosis in order to illustrate this position. Copyright © 2013 The Authors. Published by Elsevier Ltd.. All rights reserved.
Tan, W Katherine; Hassanpour, Saeed; Heagerty, Patrick J; Rundell, Sean D; Suri, Pradeep; Huhdanpaa, Hannu T; James, Kathryn; Carrell, David S; Langlotz, Curtis P; Organ, Nancy L; Meier, Eric N; Sherman, Karen J; Kallmes, David F; Luetmer, Patrick H; Griffith, Brent; Nerenz, David R; Jarvik, Jeffrey G
2018-03-28
To evaluate a natural language processing (NLP) system built with open-source tools for identification of lumbar spine imaging findings related to low back pain on magnetic resonance and x-ray radiology reports from four health systems. We used a limited data set (de-identified except for dates) sampled from lumbar spine imaging reports of a prospectively assembled cohort of adults. From N = 178,333 reports, we randomly selected N = 871 to form a reference-standard dataset, consisting of N = 413 x-ray reports and N = 458 MR reports. Using standardized criteria, four spine experts annotated the presence of 26 findings, where 71 reports were annotated by all four experts and 800 were each annotated by two experts. We calculated inter-rater agreement and finding prevalence from annotated data. We randomly split the annotated data into development (80%) and testing (20%) sets. We developed an NLP system from both rule-based and machine-learned models. We validated the system using accuracy metrics such as sensitivity, specificity, and area under the receiver operating characteristic curve (AUC). The multirater annotated dataset achieved inter-rater agreement of Cohen's kappa > 0.60 (substantial agreement) for 25 of 26 findings, with finding prevalence ranging from 3% to 89%. In the testing sample, rule-based and machine-learned predictions both had comparable average specificity (0.97 and 0.95, respectively). The machine-learned approach had a higher average sensitivity (0.94, compared to 0.83 for rules-based), and a higher overall AUC (0.98, compared to 0.90 for rules-based). Our NLP system performed well in identifying the 26 lumbar spine findings, as benchmarked by reference-standard annotation by medical experts. Machine-learned models provided substantial gains in model sensitivity with slight loss of specificity, and overall higher AUC. Copyright © 2018 The Association of University Radiologists. All rights reserved.
CNN-based ranking for biomedical entity normalization.
Li, Haodi; Chen, Qingcai; Tang, Buzhou; Wang, Xiaolong; Xu, Hua; Wang, Baohua; Huang, Dong
2017-10-03
Most state-of-the-art biomedical entity normalization systems, such as rule-based systems, merely rely on morphological information of entity mentions, but rarely consider their semantic information. In this paper, we introduce a novel convolutional neural network (CNN) architecture that regards biomedical entity normalization as a ranking problem and benefits from semantic information of biomedical entities. The CNN-based ranking method first generates candidates using handcrafted rules, and then ranks the candidates according to their semantic information modeled by CNN as well as their morphological information. Experiments on two benchmark datasets for biomedical entity normalization show that our proposed CNN-based ranking method outperforms traditional rule-based method with state-of-the-art performance. We propose a CNN architecture that regards biomedical entity normalization as a ranking problem. Comparison results show that semantic information is beneficial to biomedical entity normalization and can be well combined with morphological information in our CNN architecture for further improvement.
Building a common pipeline for rule-based document classification.
Patterson, Olga V; Ginter, Thomas; DuVall, Scott L
2013-01-01
Instance-based classification of clinical text is a widely used natural language processing task employed as a step for patient classification, document retrieval, or information extraction. Rule-based approaches rely on concept identification and context analysis in order to determine the appropriate class. We propose a five-step process that enables even small research teams to develop simple but powerful rule-based NLP systems by taking advantage of a common UIMA AS based pipeline for classification. Our proposed methodology coupled with the general-purpose solution provides researchers with access to the data locked in clinical text in cases of limited human resources and compact timelines.
2010-01-01
Background Clinical practice guidelines give recommendations about what to do in various medical situations, including therapeutical recommendations for drug prescription. An effective way to computerize these recommendations is to design critiquing decision support systems, i.e. systems that criticize the physician's prescription when it does not conform to the guidelines. These systems are commonly based on a list of "if conditions then criticism" rules. However, writing these rules from the guidelines is not a trivial task. The objective of this article is to propose methods that (1) simplify the implementation of guidelines' therapeutical recommendations in critiquing systems by automatically translating structured therapeutical recommendations into a list of "if conditions then criticize" rules, and (2) can generate an appropriate textual label to explain to the physician why his/her prescription is not recommended. Methods We worked on the therapeutic recommendations in five clinical practice guidelines concerning chronic diseases related to the management of cardiovascular risk. We evaluated the system using a test base of more than 2000 cases. Results Algorithms for automatically translating therapeutical recommendations into "if conditions then criticize" rules are presented. Eight generic recommendations are also proposed; they are guideline-independent, and can be used as default behaviour for handling various situations that are usually implicit in the guidelines, such as decreasing the dose of a poorly tolerated drug. Finally, we provide models and methods for generating a human-readable textual critique. The system was successfully evaluated on the test base. Conclusion We show that it is possible to criticize physicians' prescriptions starting from a structured clinical guideline, and to provide clear explanations. We are now planning a randomized clinical trial to evaluate the impact of the system on practices. PMID:20509903
Granular Flow Graph, Adaptive Rule Generation and Tracking.
Pal, Sankar Kumar; Chakraborty, Debarati Bhunia
2017-12-01
A new method of adaptive rule generation in granular computing framework is described based on rough rule base and granular flow graph, and applied for video tracking. In the process, several new concepts and operations are introduced, and methodologies formulated with superior performance. The flow graph enables in defining an intelligent technique for rule base adaptation where its characteristics in mapping the relevance of attributes and rules in decision-making system are exploited. Two new features, namely, expected flow graph and mutual dependency between flow graphs are defined to make the flow graph applicable in the tasks of both training and validation. All these techniques are performed in neighborhood granular level. A way of forming spatio-temporal 3-D granules of arbitrary shape and size is introduced. The rough flow graph-based adaptive granular rule-based system, thus produced for unsupervised video tracking, is capable of handling the uncertainties and incompleteness in frames, able to overcome the incompleteness in information that arises without initial manual interactions and in providing superior performance and gaining in computation time. The cases of partial overlapping and detecting the unpredictable changes are handled efficiently. It is shown that the neighborhood granulation provides a balanced tradeoff between speed and accuracy as compared to pixel level computation. The quantitative indices used for evaluating the performance of tracking do not require any information on ground truth as in the other methods. Superiority of the algorithm to nonadaptive and other recent ones is demonstrated extensively.
NASA Astrophysics Data System (ADS)
Aljuboori, Ahmed S.; Coenen, Frans; Nsaif, Mohammed; Parsons, David J.
2018-05-01
Case-Based Reasoning (CBR) plays a major role in expert system research. However, a critical problem can be met when a CBR system retrieves incorrect cases. Class Association Rules (CARs) have been utilized to offer a potential solution in a previous work. The aim of this paper was to perform further validation of Case-Based Reasoning using a Classification based on Association Rules (CBRAR) to enhance the performance of Similarity Based Retrieval (SBR). The CBRAR strategy uses a classed frequent pattern tree algorithm (FP-CAR) in order to disambiguate wrongly retrieved cases in CBR. The research reported in this paper makes contributions to both fields of CBR and Association Rules Mining (ARM) in that full target cases can be extracted from the FP-CAR algorithm without invoking P-trees and union operations. The dataset used in this paper provided more efficient results when the SBR retrieves unrelated answers. The accuracy of the proposed CBRAR system outperforms the results obtained by existing CBR tools such as Jcolibri and FreeCBR.
Systematic methods for knowledge acquisition and expert system development
NASA Technical Reports Server (NTRS)
Belkin, Brenda L.; Stengel, Robert F.
1991-01-01
Nine cooperating rule-based systems, collectively called AUTOCREW, were designed to automate functions and decisions associated with a combat aircraft's subsystem. The organization of tasks within each system is described; performance metrics were developed to evaluate the workload of each rule base, and to assess the cooperation between the rule-bases. Each AUTOCREW subsystem is composed of several expert systems that perform specific tasks. AUTOCREW's NAVIGATOR was analyzed in detail to understand the difficulties involved in designing the system and to identify tools and methodologies that ease development. The NAVIGATOR determines optimal navigation strategies from a set of available sensors. A Navigation Sensor Management (NSM) expert system was systematically designed from Kalman filter covariance data; four ground-based, a satellite-based, and two on-board INS-aiding sensors were modeled and simulated to aid an INS. The NSM Expert was developed using the Analysis of Variance (ANOVA) and the ID3 algorithm. Navigation strategy selection is based on an RSS position error decision metric, which is computed from the covariance data. Results show that the NSM Expert predicts position error correctly between 45 and 100 percent of the time for a specified navaid configuration and aircraft trajectory. The NSM Expert adapts to new situations, and provides reasonable estimates of hybrid performance. The systematic nature of the ANOVA/ID3 method makes it broadly applicable to expert system design when experimental or simulation data is available.
A Novel BA Complex Network Model on Color Template Matching
Han, Risheng; Yue, Guangxue; Ding, Hui
2014-01-01
A novel BA complex network model of color space is proposed based on two fundamental rules of BA scale-free network model: growth and preferential attachment. The scale-free characteristic of color space is discovered by analyzing evolving process of template's color distribution. And then the template's BA complex network model can be used to select important color pixels which have much larger effects than other color pixels in matching process. The proposed BA complex network model of color space can be easily integrated into many traditional template matching algorithms, such as SSD based matching and SAD based matching. Experiments show the performance of color template matching results can be improved based on the proposed algorithm. To the best of our knowledge, this is the first study about how to model the color space of images using a proper complex network model and apply the complex network model to template matching. PMID:25243235
A novel BA complex network model on color template matching.
Han, Risheng; Shen, Shigen; Yue, Guangxue; Ding, Hui
2014-01-01
A novel BA complex network model of color space is proposed based on two fundamental rules of BA scale-free network model: growth and preferential attachment. The scale-free characteristic of color space is discovered by analyzing evolving process of template's color distribution. And then the template's BA complex network model can be used to select important color pixels which have much larger effects than other color pixels in matching process. The proposed BA complex network model of color space can be easily integrated into many traditional template matching algorithms, such as SSD based matching and SAD based matching. Experiments show the performance of color template matching results can be improved based on the proposed algorithm. To the best of our knowledge, this is the first study about how to model the color space of images using a proper complex network model and apply the complex network model to template matching.
A Methodology for Multiple Rule System Integration and Resolution Within a Singular Knowledge Base
NASA Technical Reports Server (NTRS)
Kautzmann, Frank N., III
1988-01-01
Expert Systems which support knowledge representation by qualitative modeling techniques experience problems, when called upon to support integrated views embodying description and explanation, especially when other factors such as multiple causality, competing rule model resolution, and multiple uses of knowledge representation are included. A series of prototypes are being developed to demonstrate the feasibility of automating the process of systems engineering, design and configuration, and diagnosis and fault management. A study involves not only a generic knowledge representation; it must also support multiple views at varying levels of description and interaction between physical elements, systems, and subsystems. Moreover, it will involve models of description and explanation for each level. This multiple model feature requires the development of control methods between rule systems and heuristics on a meta-level for each expert system involved in an integrated and larger class of expert system. The broadest possible category of interacting expert systems is described along with a general methodology for the knowledge representation and control of mutually exclusive rule systems.
75 FR 76393 - Notice of Request for a New Information Collection (Public Health Information System)
Federal Register 2010, 2011, 2012, 2013, 2014
2010-12-08
... intention to request a new information collection concerning its Web-based Public Health Information System... Building, Washington, DC 20250; (202) 720-0345. SUPPLEMENTARY INFORMATION: Title: Public Health Information... documents other than rules #0;or proposed rules that are applicable to the public. Notices of hearings #0...
The Virtual Factory Teaching System (VFTS): Project Review and Results.
ERIC Educational Resources Information Center
Kazlauskas, E. J.; Boyd, E. F., III; Dessouky, M. M.
This paper presents a review of the Virtual Factory Teaching (VFTS) project, a Web-based, multimedia collaborative learning network. The system allows students, working alone or in teams, to build factories, forecast demand for products, plan production, establish release rules for new work into the factory, and set scheduling rules for…
Derivation of optimal joint operating rules for multi-purpose multi-reservoir water-supply system
NASA Astrophysics Data System (ADS)
Tan, Qiao-feng; Wang, Xu; Wang, Hao; Wang, Chao; Lei, Xiao-hui; Xiong, Yi-song; Zhang, Wei
2017-08-01
The derivation of joint operating policy is a challenging task for a multi-purpose multi-reservoir system. This study proposed an aggregation-decomposition model to guide the joint operation of multi-purpose multi-reservoir system, including: (1) an aggregated model based on the improved hedging rule to ensure the long-term water-supply operating benefit; (2) a decomposed model to allocate the limited release to individual reservoirs for the purpose of maximizing the total profit of the facing period; and (3) a double-layer simulation-based optimization model to obtain the optimal time-varying hedging rules using the non-dominated sorting genetic algorithm II, whose objectives were to minimize maximum water deficit and maximize water supply reliability. The water-supply system of Li River in Guangxi Province, China, was selected for the case study. The results show that the operating policy proposed in this study is better than conventional operating rules and aggregated standard operating policy for both water supply and hydropower generation due to the use of hedging mechanism and effective coordination among multiple objectives.
From data mining rules to medical logical modules and medical advices.
Gomoi, Valentin; Vida, Mihaela; Robu, Raul; Stoicu-Tivadar, Vasile; Bernad, Elena; Lupşe, Oana
2013-01-01
Using data mining in collaboration with Clinical Decision Support Systems adds new knowledge as support for medical diagnosis. The current work presents a tool which translates data mining rules supporting generation of medical advices to Arden Syntax formalism. The developed system was tested with data related to 2326 births that took place in 2010 at the Bega Obstetrics - Gynaecology Hospital, Timişoara. Based on processing these data, 14 medical rules regarding the Apgar score were generated and then translated in Arden Syntax language.
Automation Improves Schedule Quality and Increases Scheduling Efficiency for Residents.
Perelstein, Elizabeth; Rose, Ariella; Hong, Young-Chae; Cohn, Amy; Long, Micah T
2016-02-01
Medical resident scheduling is difficult due to multiple rules, competing educational goals, and ever-evolving graduate medical education requirements. Despite this, schedules are typically created manually, consuming hours of work, producing schedules of varying quality, and yielding negative consequences for resident morale and learning. To determine whether computerized decision support can improve the construction of residency schedules, saving time and improving schedule quality. The Optimized Residency Scheduling Assistant was designed by a team from the University of Michigan Department of Industrial and Operations Engineering. It was implemented in the C.S. Mott Children's Hospital Pediatric Emergency Department in the 2012-2013 academic year. The 4 metrics of schedule quality that were compared between the 2010-2011 and 2012-2013 academic years were the incidence of challenging shift transitions, the incidence of shifts following continuity clinics, the total shift inequity, and the night shift inequity. All scheduling rules were successfully incorporated. Average schedule creation time fell from 22 to 28 hours to 4 to 6 hours per month, and 3 of 4 metrics of schedule quality significantly improved. For the implementation year, the incidence of challenging shift transitions decreased from 83 to 14 (P < .01); the incidence of postclinic shifts decreased from 72 to 32 (P < .01); and the SD of night shifts dropped by 55.6% (P < .01). This automated shift scheduling system improves the current manual scheduling process, reducing time spent and improving schedule quality. Embracing such automated tools can benefit residency programs with shift-based scheduling needs.
Integration of perception and reasoning in fast neural modules
NASA Technical Reports Server (NTRS)
Fritz, David G.
1989-01-01
Artificial neural systems promise to integrate symbolic and sub-symbolic processing to achieve real time control of physical systems. Two potential alternatives exist. In one, neural nets can be used to front-end expert systems. The expert systems, in turn, are developed with varying degrees of parallelism, including their implementation in neural nets. In the other, rule-based reasoning and sensor data can be integrated within a single hybrid neural system. The hybrid system reacts as a unit to provide decisions (problem solutions) based on the simultaneous evaluation of data and rules. Discussed here is a model hybrid system based on the fuzzy cognitive map (FCM). The operation of the model is illustrated with the control of a hypothetical satellite that intelligently alters its attitude in space in response to an intersecting micrometeorite shower.
Empirical and theoretical analysis of complex systems
NASA Astrophysics Data System (ADS)
Zhao, Guannan
This thesis is an interdisciplinary work under the heading of complexity science which focuses on an arguably common "hard" problem across physics, finance and biology [1], to quantify and mimic the macroscopic "emergent phenomenon" in large-scale systems consisting of many interacting "particles" governed by microscopic rules. In contrast to traditional statistical physics, we are interested in systems whose dynamics are subject to feedback, evolution, adaption, openness, etc. Global financial markets, like the stock market and currency market, are ideal candidate systems for such a complexity study: there exists a vast amount of accurate data, which is the aggregate output of many autonomous agents continuously competing with each other. We started by examining the ultrafast "mini flash crash (MFC)" events in the US stock market. An abrupt system-wide composition transition from a mixed human machine phase to a new all-machine phase is uncovered, and a novel theory developed to explain this observation. Then in the study of FX market, we found an unexpected variation in the synchronicity of price changes in different market subsections as a function of the overall trading activity. Several survival models have been tested in analyzing the distribution of waiting times to the next price change. In the region of long waiting-times, the distribution for each currency pair exhibits a power law with exponent in the vicinity of 3.5. By contrast, for short waiting times only, the market activity can be mimicked by the fluctuations emerging from a finite resource competition model containing multiple agents with limited rationality (so called El Farol Model). Switching to the biomedical domain, we present a minimal mathematical model built around a co-evolving resource network and cell population, yielding good agreement with primary tumors in mice experiment and with clinical metastasis data. In the quest to understand contagion phenomena in systems where social group structures evolve on a similar timescale to individual level transmission, we investigated the process of transmission through a model population comprising of social groups which follow simple dynamical rules for growth and break-up, and the profiles produced bear a striking resemblance to empirical data obtained from social, financial and biological systems. Finally, for better implementation of a widely accepted power law test algorithm, we have developed a fast testing procedure using parallel computation.
Giant number fluctuations in self-propelled particles without alignment
NASA Astrophysics Data System (ADS)
Fily, Yaouen; Henkes, Silke; Marchetti, M. Cristina
2012-02-01
Giant number fluctuations are a ubiquitous property of active systems. They were predicted using a generic continuum description of active nematics, and have been observed in simulations of Vicsek-type models and in experiments on vibrated granular layers and swimming bacteria. In all of these systems, there is an alignment interaction among the self-propelled units, either imposed as a rule, or arising from hydrodynamic or other medium-mediated couplings. Here we report numerical evidence of giant number fluctuations in a minimal model of self-propelled disks in two dimensions in the absence of any alignment mechanism. The direction of self-propulsion evolves via rotational diffusion and the particles interact solely via a finite range repulsive soft potential. It can be shown that in this system self propulsion is equivalent to a non Markovian noise whose correlation time is controlled by the amplitude of the orientational noise.
Emergence of structural patterns out of synchronization in networks with competitive interactions
NASA Astrophysics Data System (ADS)
Assenza, Salvatore; Gutiérrez, Ricardo; Gómez-Gardeñes, Jesús; Latora, Vito; Boccaletti, Stefano
2011-09-01
Synchronization is a collective phenomenon occurring in systems of interacting units, and is ubiquitous in nature, society and technology. Recent studies have enlightened the important role played by the interaction topology on the emergence of synchronized states. However, most of these studies neglect that real world systems change their interaction patterns in time. Here, we analyze synchronization features in networks in which structural and dynamical features co-evolve. The feedback of the node dynamics on the interaction pattern is ruled by the competition of two mechanisms: homophily (reinforcing those interactions with other correlated units in the graph) and homeostasis (preserving the value of the input strength received by each unit). The competition between these two adaptive principles leads to the emergence of key structural properties observed in real world networks, such as modular and scale-free structures, together with a striking enhancement of local synchronization in systems with no global order.
A proposed computer diagnostic system for malignant melanoma (CDSMM).
Shao, S; Grams, R R
1994-04-01
This paper describes a computer diagnostic system for malignant melanoma. The diagnostic system is a rule base system based on image analyses and works under the PC windows environment. It consists of seven modules: I/O module, Patient/Clinic database, image processing module, classification module, rule base module and system control module. In the system, the image analyses are automatically carried out, and database management is efficient and fast. Both final clinic results and immediate results from various modules such as measured features, feature pictures and history records of the disease lesion can be presented on screen or printed out from each corresponding module or from the I/O module. The system can also work as a doctor's office-based tool to aid dermatologists with details not perceivable by the human eye. Since the system operates on a general purpose PC, it can be made portable if the I/O module is disconnected.
Building Better Decision-Support by Using Knowledge Discovery.
ERIC Educational Resources Information Center
Jurisica, Igor
2000-01-01
Discusses knowledge-based decision-support systems that use artificial intelligence approaches. Addresses the issue of how to create an effective case-based reasoning system for complex and evolving domains, focusing on automated methods for system optimization and domain knowledge evolution that can supplement knowledge acquired from domain…
Genetic Programming for Automatic Hydrological Modelling
NASA Astrophysics Data System (ADS)
Chadalawada, Jayashree; Babovic, Vladan
2017-04-01
One of the recent challenges for the hydrologic research community is the need for the development of coupled systems that involves the integration of hydrologic, atmospheric and socio-economic relationships. This poses a requirement for novel modelling frameworks that can accurately represent complex systems, given, the limited understanding of underlying processes, increasing volume of data and high levels of uncertainity. Each of the existing hydrological models vary in terms of conceptualization and process representation and is the best suited to capture the environmental dynamics of a particular hydrological system. Data driven approaches can be used in the integration of alternative process hypotheses in order to achieve a unified theory at catchment scale. The key steps in the implementation of integrated modelling framework that is influenced by prior understanding and data, include, choice of the technique for the induction of knowledge from data, identification of alternative structural hypotheses, definition of rules, constraints for meaningful, intelligent combination of model component hypotheses and definition of evaluation metrics. This study aims at defining a Genetic Programming based modelling framework that test different conceptual model constructs based on wide range of objective functions and evolves accurate and parsimonious models that capture dominant hydrological processes at catchment scale. In this paper, GP initializes the evolutionary process using the modelling decisions inspired from the Superflex framework [Fenicia et al., 2011] and automatically combines them into model structures that are scrutinized against observed data using statistical, hydrological and flow duration curve based performance metrics. The collaboration between data driven and physical, conceptual modelling paradigms improves the ability to model and manage hydrologic systems. Fenicia, F., D. Kavetski, and H. H. Savenije (2011), Elements of a flexible approach for conceptual hydrological modeling: 1. Motivation and theoretical development, Water Resources Research, 47(11).
Systematics of strength function sum rules
DOE Office of Scientific and Technical Information (OSTI.GOV)
Johnson, Calvin W.
2015-08-28
Sum rules provide useful insights into transition strength functions and are often expressed as expectation values of an operator. In this letter I demonstrate that non-energy-weighted transition sum rules have strong secular dependences on the energy of the initial state. Such non-trivial systematics have consequences: the simplification suggested by the generalized Brink–Axel hypothesis, for example, does not hold for most cases, though it weakly holds in at least some cases for electric dipole transitions. Furthermore, I show the systematics can be understood through spectral distribution theory, calculated via traces of operators and of products of operators. Seen through this lens,more » violation of the generalized Brink–Axel hypothesis is unsurprising: one expectssum rules to evolve with excitation energy. Moreover, to lowest order the slope of the secular evolution can be traced to a component of the Hamiltonian being positive (repulsive) or negative (attractive).« less
Improved Personalized Recommendation Based on Causal Association Rule and Collaborative Filtering
ERIC Educational Resources Information Center
Lei, Wu; Qing, Fang; Zhou, Jin
2016-01-01
There are usually limited user evaluation of resources on a recommender system, which caused an extremely sparse user rating matrix, and this greatly reduce the accuracy of personalized recommendation, especially for new users or new items. This paper presents a recommendation method based on rating prediction using causal association rules.…
Federal Register 2010, 2011, 2012, 2013, 2014
2011-02-25
... Change Regarding Providing Participants With a New Optional Settlement Web Interface February 22, 2011... Rule Change The proposed rule change will establish a new browser-based interface, the ``Settlement Web... Browser System (``PBS'').\\4\\ Based on request from its Participants, DTC has created a more user-friendly...
NASA Astrophysics Data System (ADS)
Hu, Yao; Quinn, Christopher J.; Cai, Ximing; Garfinkle, Noah W.
2017-11-01
For agent-based modeling, the major challenges in deriving agents' behavioral rules arise from agents' bounded rationality and data scarcity. This study proposes a "gray box" approach to address the challenge by incorporating expert domain knowledge (i.e., human intelligence) with machine learning techniques (i.e., machine intelligence). Specifically, we propose using directed information graph (DIG), boosted regression trees (BRT), and domain knowledge to infer causal factors and identify behavioral rules from data. A case study is conducted to investigate farmers' pumping behavior in the Midwest, U.S.A. Results show that four factors identified by the DIG algorithm- corn price, underlying groundwater level, monthly mean temperature and precipitation- have main causal influences on agents' decisions on monthly groundwater irrigation depth. The agent-based model is then developed based on the behavioral rules represented by three DIGs and modeled by BRTs, and coupled with a physically-based groundwater model to investigate the impacts of agents' pumping behavior on the underlying groundwater system in the context of coupled human and environmental systems.
Haldane’s Rule Is Linked to Extraordinary Sex Ratios and Sperm Length in Stalk-Eyed Flies
Wilkinson, Gerald S.; Christianson, Sarah J.; Brand, Cara L.; Ru, George; Shell, Wyatt
2014-01-01
We use three allopatric populations of the stalk-eyed fly Teleopsis dalmanni from Southeast Asia to test two predictions made by the sex chromosome drive hypothesis for Haldane’s rule. The first is that modifiers that suppress or enhance drive should evolve rapidly and independently in isolated populations. The second is that drive loci or modifiers should also cause sterility in hybrid males. We tested these predictions by assaying the fertility of 2066 males derived from backcross experiments involving two pairs of populations and found that the proportion of mated males that fail to produce any offspring ranged from 38 to 60% among crosses with some males producing strongly female-biased or male-biased sex ratios. After genotyping each male at 25–28 genetic markers we found quantitative trait loci (QTL) that jointly influence male sterility, sperm length, and biased progeny sex ratios in each pair of populations, but almost no shared QTL between population crosses. We also discovered that the extant XSR chromosome has no effect on sex ratio or sterility in these backcross males. Whether shared QTL are caused by linkage or pleiotropy requires additional study. Nevertheless, these results indicate the presence of a “cryptic” drive system that is currently masked by suppressing elements that are associated with sterility and sperm length within but not between populations and, therefore, must have evolved since the populations became isolated, i.e., in <100,000 years. We discuss how genes that influence sperm length may contribute to hybrid sterility. PMID:25164880
Vereecken, Nicolas J; Wilson, Carol A; Hötling, Susann; Schulz, Stefan; Banketov, Sergey A; Mardulyn, Patrick
2012-12-07
Pollination by sexual deception is arguably one of the most unusual liaisons linking plants and insects, and perhaps the most illustrative example of extreme floral specialization in angiosperms. While considerable progress has been made in understanding the floral traits involved in sexual deception, less is known about how this remarkable mimicry system might have arisen, the role of pre-adaptations in promoting its evolution and its extent as a pollination mechanism outside the few groups of plants (primarily orchids) where it has been described to date. In the Euro-Mediterranean region, pollination by sexual deception is traditionally considered to be the hallmark of the orchid genus Ophrys. Here, we introduce two new cases outside of Ophrys, in plant groups dominated by generalized, shelter-mimicking species. On the basis of phylogenetic reconstructions of ancestral pollination strategies, we provide evidence for independent and bidirectional evolutionary transitions between generalized (shelter mimicry) and specialized (sexual deception) pollination strategies in three groups of flowering plants, and suggest that pseudocopulation has evolved from pre-adaptations (floral colours, shapes and odour bouquets) that selectively attract male pollinators through shelter mimicry. These findings, along with comparative analyses of floral traits (colours and scents), shed light on particular phenotypic changes that might have fuelled the parallel evolution of these extraordinary pollination strategies. Collectively, our results provide the first substantive insights into how pollination sexual deception might have evolved in the Euro-Mediterranean region, and demonstrate that even the most extreme cases of pollinator specialization can reverse to more generalized interactions, breaking 'Cope's rule of specialization'.
Vereecken, Nicolas J.; Wilson, Carol A.; Hötling, Susann; Schulz, Stefan; Banketov, Sergey A.; Mardulyn, Patrick
2012-01-01
Pollination by sexual deception is arguably one of the most unusual liaisons linking plants and insects, and perhaps the most illustrative example of extreme floral specialization in angiosperms. While considerable progress has been made in understanding the floral traits involved in sexual deception, less is known about how this remarkable mimicry system might have arisen, the role of pre-adaptations in promoting its evolution and its extent as a pollination mechanism outside the few groups of plants (primarily orchids) where it has been described to date. In the Euro-Mediterranean region, pollination by sexual deception is traditionally considered to be the hallmark of the orchid genus Ophrys. Here, we introduce two new cases outside of Ophrys, in plant groups dominated by generalized, shelter-mimicking species. On the basis of phylogenetic reconstructions of ancestral pollination strategies, we provide evidence for independent and bidirectional evolutionary transitions between generalized (shelter mimicry) and specialized (sexual deception) pollination strategies in three groups of flowering plants, and suggest that pseudocopulation has evolved from pre-adaptations (floral colours, shapes and odour bouquets) that selectively attract male pollinators through shelter mimicry. These findings, along with comparative analyses of floral traits (colours and scents), shed light on particular phenotypic changes that might have fuelled the parallel evolution of these extraordinary pollination strategies. Collectively, our results provide the first substantive insights into how pollination sexual deception might have evolved in the Euro-Mediterranean region, and demonstrate that even the most extreme cases of pollinator specialization can reverse to more generalized interactions, breaking ‘Cope's rule of specialization’. PMID:23055065
77 FR 38729 - Alternate Tonnage Threshold for Oil Spill Response Vessels
Federal Register 2010, 2011, 2012, 2013, 2014
2012-06-29
...The Coast Guard is establishing an alternate size threshold based on the measurement system established under the International Convention on Tonnage Measurement of Ships, 1969, for oil spill response vessels, which are properly certificated under 46 CFR chapter I, subchapter L. The present size threshold of 500 gross register tons is based on the U.S. regulatory measurement system. This final rule provides an alternative for owners and operators of offshore supply vessels that may result in an increase in oil spill response capacity and capability. This final rule adopts, without change, the interim rule amending 46 CFR part 126 published in the Federal Register on Monday, December 12, 2011.
NASA Technical Reports Server (NTRS)
Heymans, Bart C.; Onema, Joel P.; Kuti, Joseph O.
1991-01-01
A rule based knowledge system was developed in CLIPS (C Language Integrated Production System) for identifying Opuntia species in the family Cactaceae, which contains approx. 1500 different species. This botanist expert tool system is capable of identifying selected Opuntia plants from the family level down to the species level when given some basic characteristics of the plants. Many plants are becoming of increasing importance because of their nutrition and human health potential, especially in the treatment of diabetes mellitus. The expert tool system described can be extremely useful in an unequivocal identification of many useful Opuntia species.
Evolvable synthetic neural system
NASA Technical Reports Server (NTRS)
Curtis, Steven A. (Inventor)
2009-01-01
An evolvable synthetic neural system includes an evolvable neural interface operably coupled to at least one neural basis function. Each neural basis function includes an evolvable neural interface operably coupled to a heuristic neural system to perform high-level functions and an autonomic neural system to perform low-level functions. In some embodiments, the evolvable synthetic neural system is operably coupled to one or more evolvable synthetic neural systems in a hierarchy.
Alternative Theoretical Bases for the Study of Human Communication: The Rules Perspective.
ERIC Educational Resources Information Center
Cushman, Donald P.
Three potentially useful perspectives for the scientific development of human communication theory are the law model, the systems approach, and the rules paradigm. It is the purpose of this paper to indicate the utility of the rules perspective. For the purposes of this analysis, human communication is viewed as the successful transfer of symbolic…
Probing the young circumplanetary environment of Beta Pic b during transit egress
NASA Astrophysics Data System (ADS)
Wang, Jason
2017-08-01
Among the thousands of known exoplanets, Beta Pic b is the only directly imaged exoplanet with a nearly edge-on orbit. We show that the latest astrometric measurements rule out a transit by the planet at 10-sigma significance, but we are certain that the Hill sphere of the planet will transit. With a period of 22 years and no other system like it, this Hill sphere transit provides a rare opportunity to study the evolving circumplanetary environment of a young and well-characterized exoplanet. To compliment GO-14621, our Cycle 25 proposal to monitor the ingress of the Hill sphere, we propose a modest HST program to photometrically search for signatures of the planet's large scale circumplanetary material during the egress of the Hill sphere transit. The existence of such material is plausible given that Beta Pic's young age is similar to that of the ring-bearing J1407b system. Combined with GO-14621 and less-precise but dedicated ground-based monitoring, these observations will give us a comprehensive set of observations about this young circumplanetary environment. Given the sparse observational data of circumplanetary environments, non-detections will also be valuable for constraining the timescales relevant to circumplanetary material and moon formation. If photometric variations are detected with HST, these results would yield empirical information concerning the dynamics of the system and the evolution of planetary systems as a whole.
Fusion of classifiers for REIS-based detection of suspicious breast lesions
NASA Astrophysics Data System (ADS)
Lederman, Dror; Wang, Xingwei; Zheng, Bin; Sumkin, Jules H.; Tublin, Mitchell; Gur, David
2011-03-01
After developing a multi-probe resonance-frequency electrical impedance spectroscopy (REIS) system aimed at detecting women with breast abnormalities that may indicate a developing breast cancer, we have been conducting a prospective clinical study to explore the feasibility of applying this REIS system to classify younger women (< 50 years old) into two groups of "higher-than-average risk" and "average risk" of having or developing breast cancer. The system comprises one central probe placed in contact with the nipple, and six additional probes uniformly distributed along an outside circle to be placed in contact with six points on the outer breast skin surface. In this preliminary study, we selected an initial set of 174 examinations on participants that have completed REIS examinations and have clinical status verification. Among these, 66 examinations were recommended for biopsy due to findings of a highly suspicious breast lesion ("positives"), and 108 were determined as negative during imaging based procedures ("negatives"). A set of REIS-based features, extracted using a mirror-matched approach, was computed and fed into five machine learning classifiers. A genetic algorithm was used to select an optimal subset of features for each of the five classifiers. Three fusion rules, namely sum rule, weighted sum rule and weighted median rule, were used to combine the results of the classifiers. Performance evaluation was performed using a leave-one-case-out cross-validation method. The results indicated that REIS may provide a new technology to identify younger women with higher than average risk of having or developing breast cancer. Furthermore, it was shown that fusion rule, such as a weighted median fusion rule and a weighted sum fusion rule may improve performance as compared with the highest performing single classifier.
C-Language Integrated Production System, Version 6.0
NASA Technical Reports Server (NTRS)
Riley, Gary; Donnell, Brian; Ly, Huyen-Anh Bebe; Ortiz, Chris
1995-01-01
C Language Integrated Production System (CLIPS) computer programs are specifically intended to model human expertise or other knowledge. CLIPS is designed to enable research on, and development and delivery of, artificial intelligence on conventional computers. CLIPS 6.0 provides cohesive software tool for handling wide variety of knowledge with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming: representation of knowledge as heuristics - essentially, rules of thumb that specify set of actions performed in given situation. Object-oriented programming: modeling of complex systems comprised of modular components easily reused to model other systems or create new components. Procedural-programming: representation of knowledge in ways similar to those of such languages as C, Pascal, Ada, and LISP. Version of CLIPS 6.0 for IBM PC-compatible computers requires DOS v3.3 or later and/or Windows 3.1 or later.
Multiple systems of category learning.
Smith, Edward E; Grossman, Murray
2008-01-01
We review neuropsychological and neuroimaging evidence for the existence of three qualitatively different categorization systems. These categorization systems are themselves based on three distinct memory systems: working memory (WM), explicit long-term memory (explicit LTM), and implicit long-term memory (implicit LTM). We first contrast categorization based on WM with that based on explicit LTM, where the former typically involves applying rules to a test item and the latter involves determining the similarity between stored exemplars or prototypes and a test item. Neuroimaging studies show differences between brain activity in normal participants as a function of whether they are instructed to categorize novel test items by rule or by similarity to known category members. Rule instructions typically lead to more activation in frontal or parietal areas, associated with WM and selective attention, whereas similarity instructions may activate parietal areas associated with the integration of perceptual features. Studies with neurological patients in the same paradigms provide converging evidence, e.g., patients with Alzheimer's disease, who have damage in prefrontal regions, are more impaired with rule than similarity instructions. Our second contrast is between categorization based on explicit LTM with that based on implicit LTM. Neuropsychological studies with patients with medial-temporal lobe damage show that patients are impaired on tasks requiring explicit LTM, but perform relatively normally on an implicit categorization task. Neuroimaging studies provide converging evidence: whereas explicit categorization is mediated by activation in numerous frontal and parietal areas, implicit categorization is mediated by a deactivation in posterior cortex.
NASA Technical Reports Server (NTRS)
Hadipriono, Fabian C.; Diaz, Carlos F.; Merritt, Earl S.
1989-01-01
The research project results in a powerful yet user friendly CROPCAST expert system for use by a client to determine the crop yield production of a certain crop field. The study is based on the facts that heuristic assessment and decision making in agriculture are significant and dominate much of agribusiness. Transfer of the expert knowledge concerning remote sensing based crop yield production into a specific expert system is the key program in this study. A knowledge base consisting of a root frame, CROP-YIELD-FORECAST, and four subframes, namely, SATELLITE, PLANT-PHYSIOLOGY, GROUND, and MODEL were developed to accommodate the production rules obtained from the domain expert. The expert system shell Personal Consultant Plus version 4.0. was used for this purpose. An external geographic program was integrated to the system. This project is the first part of a completely built expert system. The study reveals that much effort was given to the development of the rules. Such effort is inevitable if workable, efficient, and accurate rules are desired. Furthermore, abundant help statements and graphics were included. Internal and external display routines add to the visual capability of the system. The work results in a useful tool for the client for making decisions on crop yield production.
Simple heuristics and rules of thumb: where psychologists and behavioural biologists might meet.
Hutchinson, John M C; Gigerenzer, Gerd
2005-05-31
The Centre for Adaptive Behaviour and Cognition (ABC) has hypothesised that much human decision-making can be described by simple algorithmic process models (heuristics). This paper explains this approach and relates it to research in biology on rules of thumb, which we also review. As an example of a simple heuristic, consider the lexicographic strategy of Take The Best for choosing between two alternatives: cues are searched in turn until one discriminates, then search stops and all other cues are ignored. Heuristics consist of building blocks, and building blocks exploit evolved or learned abilities such as recognition memory; it is the complexity of these abilities that allows the heuristics to be simple. Simple heuristics have an advantage in making decisions fast and with little information, and in avoiding overfitting. Furthermore, humans are observed to use simple heuristics. Simulations show that the statistical structures of different environments affect which heuristics perform better, a relationship referred to as ecological rationality. We contrast ecological rationality with the stronger claim of adaptation. Rules of thumb from biology provide clearer examples of adaptation because animals can be studied in the environments in which they evolved. The range of examples is also much more diverse. To investigate them, biologists have sometimes used similar simulation techniques to ABC, but many examples depend on empirically driven approaches. ABC's theoretical framework can be useful in connecting some of these examples, particularly the scattered literature on how information from different cues is integrated. Optimality modelling is usually used to explain less detailed aspects of behaviour but might more often be redirected to investigate rules of thumb.
State Identification of Hoisting Motors Based on Association Rules for Quayside Container Crane
NASA Astrophysics Data System (ADS)
Li, Q. Z.; Gang, T.; Pan, H. Y.; Xiong, H.
2017-07-01
Quay container crane hoisting motor is a complex system, and the characteristics of long-term evolution and change of running status of there is a rule, and use it. Through association rules analysis, this paper introduced the similarity in association rules, and quay container crane hoisting motor status identification. Finally validated by an example, some rules change amplitude is small, regular monitoring, not easy to find, but it is precisely because of these small changes led to mechanical failure. Therefore, using the association rules change in monitoring the motor status has the very strong practical significance.
An Intelligent computer-aided tutoring system for diagnosing anomalies of spacecraft in operation
NASA Technical Reports Server (NTRS)
Rolincik, Mark; Lauriente, Michael; Koons, Harry C.; Gorney, David
1993-01-01
A new rule-based, expert system for diagnosing spacecraft anomalies is under development. The knowledge base consists of over two-hundred (200) rules and provides links to historical and environmental databases. Environmental causes considered are bulk charging, single event upsets (SEU), surface charging, and total radiation dose. The system's driver translates forward chaining rules into a backward chaining sequence, prompting the user for information pertinent to the causes considered. When the user selects the novice mode, the system automatically gives detailed explanations and descriptions of terms and reasoning as the session progresses, in a sense teaching the user. As such it is an effective tutoring tool. The use of heuristics frees the user from searching through large amounts of irrelevant information and allows the user to input partial information (varying degrees of confidence in an answer) or 'unknown' to any question. The system is available on-line and uses C Language Integrated Production System (CLIPS), an expert shell developed by the NASA Johnson Space Center AI Laboratory in Houston.
Liu, Zengjian; Tang, Buzhou; Wang, Xiaolong; Chen, Qingcai; Li, Haodi; Bu, Junzhao; Jiang, Jingzhi; Deng, Qiwen; Zhu, Suisong
2016-01-01
Time is an important aspect of information and is very useful for information utilization. The goal of this study was to analyze the challenges of temporal expression (TE) extraction and normalization in Chinese clinical notes by assessing the performance of a rule-based system developed by us on a manually annotated corpus (including 1,778 clinical notes of 281 hospitalized patients). In order to develop system conveniently, we divided TEs into three categories: direct, indirect and uncertain TEs, and designed different rules for each category of them. Evaluation on the independent test set shows that our system achieves an F-score of93.40% on TE extraction, and an accuracy of 92.58% on TE normalization under "exact-match" criterion. Compared with HeidelTime for Chinese newswire text, our system is much better, indicating that it is necessary to develop a specific TE extraction and normalization system for Chinese clinical notes because of domain difference.
NASA Astrophysics Data System (ADS)
Xu, Y.; Seshadri, P.; Amin, V.; Heim, N. A.; Payne, J.
2013-12-01
Over time, organisms have adapted to changing environments by evolving to be larger or smaller. Scientists have described body-size trends using two generalized theories. Bergmann's rule states that body size is inversely related to temperature, and Cope's rule establishes an increase over time. Cope's rule has been hypothesized as a temporal manifestation of Bergmann's rule, as the temperature of the Earth has consistently decreased over time and mean body size has increased. However, during times of constant temperature increase, Bergmann's rule and Cope's rule predict opposite effects on body size. Our goal was to clarify this relationship using both accessible proxies of historic temperature - atmospheric CO2 levels and paleo-latitude. We measured ostracod lengths throughout the Paleozoic and Mesozoic eras (using the Catalogue of Ostracoda) and utilized ostracod latitudinal information from the Paleobiology Database. By closely studying body-size trends during four time periods of constant CO2 increase across spectrums of time and latitude, we were able to compare the effects of Cope's and Bergmann's rule. The correlation, p-values, and slopes of each of our graphs showed that there is no clear relationship between body size and each of these rules in times of temperature increase, both latitudinally and temporally. Therefore, both Cope's and Bergmann's rule act on marine ostracods and no rule is dominant, though our results more strongly disprove the latitudinal variation in ostracod size.
Molnets: An Artificial Chemistry Based on Neural Networks
NASA Technical Reports Server (NTRS)
Colombano, Silvano; Luk, Johnny; Segovia-Juarez, Jose L.; Lohn, Jason; Clancy, Daniel (Technical Monitor)
2002-01-01
The fundamental problem in the evolution of matter is to understand how structure-function relationships are formed and increase in complexity from the molecular level all the way to a genetic system. We have created a system where structure-function relationships arise naturally and without the need of ad hoc function assignments to given structures. The idea was inspired by neural networks, where the structure of the net embodies specific computational properties. In this system networks interact with other networks to create connections between the inputs of one net and the outputs of another. The newly created net then recomputes its own synaptic weights, based on anti-hebbian rules. As a result some connections may be cut, and multiple nets can emerge as products of a 'reaction'. The idea is to study emergent reaction behaviors, based on simple rules that constitute a pseudophysics of the system. These simple rules are parameterized to produce behaviors that emulate chemical reactions. We find that these simple rules show a gradual increase in the size and complexity of molecules. We have been building a virtual artificial chemistry laboratory for discovering interesting reactions and for testing further ideas on the evolution of primitive molecules. Some of these ideas include the potential effect of membranes and selective diffusion according to molecular size.
Usefulness of Neuro-Fuzzy Models' Application for Tobacco Control
NASA Astrophysics Data System (ADS)
Petrovic-Lazarevic, Sonja; Zhang, Jian Ying
2007-12-01
The paper presents neuro-fuzzy models' application appropriate for tobacco control: the fuzzy control model, Adaptive Network Based Fuzzy Inference System, Evolving Fuzzy Neural Network models, and EVOlving POLicies. We propose further the use of Fuzzy Casual Networks to help tobacco control decision makers develop policies and measure their impact on social regulation.
NASA Astrophysics Data System (ADS)
Wang, Zhen; Yu, Chao; Cui, Guang-Hai; Li, Ya-Peng; Li, Ming-Chu
2016-02-01
The spatial Iterated Prisoner's Dilemma game has been widely studied in order to explain the evolution of cooperation. Considering the large strategy space size and infinite interaction times, it is unrealistic to adopt the common imitate-best updating rule, which assumes that the human players have much stronger abilities to recognize their neighbors' strategies than they do in the one-shot game. In this paper, a novel localized extremal dynamic system is proposed, in which each player only needs to recognize the payoff of his neighbors and changes his strategy randomly when he receives the lowest payoff in his neighborhood. The evolution of cooperation is here explored under this updating rule for neighborhoods of different sizes, which are characterized by their corresponding radiuses r. The results show that when r = 1, the system is trapped in a checkerboard-like state, where half of the players consistently use AllD-like strategies and the other half constantly change their strategies. When r = 2, the system first enters an AllD-like state, from which it escapes, and finally evolves to a TFT-like state. When r is larger, the system locks in a situation with similar low average fitness as r = 1. The number of active players and the ability to form clusters jointly distinguish the evolutionary processes for different values of r from each other. The current findings further provide some insight into the evolution of cooperation and collective behavior in biological and social systems.
Design a Fuzzy Rule-based Expert System to Aid Earlier Diagnosis of Gastric Cancer.
Safdari, Reza; Arpanahi, Hadi Kazemi; Langarizadeh, Mostafa; Ghazisaiedi, Marjan; Dargahi, Hossein; Zendehdel, Kazem
2018-01-01
Screening and health check-up programs are most important sanitary priorities, that should be undertaken to control dangerous diseases such as gastric cancer that affected by different factors. More than 50% of gastric cancer diagnoses are made during the advanced stage. Currently, there is no systematic approach for early diagnosis of gastric cancer. to develop a fuzzy expert system that can identify gastric cancer risk levels in individuals. This system was implemented in MATLAB software, Mamdani inference technique applied to simulate reasoning of experts in the field, a total of 67 fuzzy rules extracted as a rule-base based on medical expert's opinion. 50 case scenarios were used to evaluate the system, the information of case reports is given to the system to find risk level of each case report then obtained results were compared with expert's diagnosis. Results revealed that sensitivity was 92.1% and the specificity was 83.1%. The results show that is possible to develop a system that can identify High risk individuals for gastric cancer. The system can lead to earlier diagnosis, this may facilitate early treatment and reduce gastric cancer mortality rate.
Modeling for (physical) biologists: an introduction to the rule-based approach
Chylek, Lily A; Harris, Leonard A; Faeder, James R; Hlavacek, William S
2015-01-01
Models that capture the chemical kinetics of cellular regulatory networks can be specified in terms of rules for biomolecular interactions. A rule defines a generalized reaction, meaning a reaction that permits multiple reactants, each capable of participating in a characteristic transformation and each possessing certain, specified properties, which may be local, such as the state of a particular site or domain of a protein. In other words, a rule defines a transformation and the properties that reactants must possess to participate in the transformation. A rule also provides a rate law. A rule-based approach to modeling enables consideration of mechanistic details at the level of functional sites of biomolecules and provides a facile and visual means for constructing computational models, which can be analyzed to study how system-level behaviors emerge from component interactions. PMID:26178138
Reliability and performance evaluation of systems containing embedded rule-based expert systems
NASA Technical Reports Server (NTRS)
Beaton, Robert M.; Adams, Milton B.; Harrison, James V. A.
1989-01-01
A method for evaluating the reliability of real-time systems containing embedded rule-based expert systems is proposed and investigated. It is a three stage technique that addresses the impact of knowledge-base uncertainties on the performance of expert systems. In the first stage, a Markov reliability model of the system is developed which identifies the key performance parameters of the expert system. In the second stage, the evaluation method is used to determine the values of the expert system's key performance parameters. The performance parameters can be evaluated directly by using a probabilistic model of uncertainties in the knowledge-base or by using sensitivity analyses. In the third and final state, the performance parameters of the expert system are combined with performance parameters for other system components and subsystems to evaluate the reliability and performance of the complete system. The evaluation method is demonstrated in the context of a simple expert system used to supervise the performances of an FDI algorithm associated with an aircraft longitudinal flight-control system.
The evolving ecology of risk for hospitalized dialysis patients.
Sandroni, Stephen
2009-01-01
Despite an increased focus on patient safety, changes in resident work rules and contemporary hospital culture often combine to create an environment of potential hazard for the hospitalized dialysis patient. Clinical scenarios are presented to illustrate some of these risks, and suggestions are offered for the protection of patients.
Use of aerobic spores as a surrogate for cryptosporidium oocysts in drinking water and supplies
USDA-ARS?s Scientific Manuscript database
Waterborne illnesses are a growing concern among health agencies worldwide and regulatory efforts to prevent microbial contamination of water supplies are constantly evolving to stay ahead of the threat. The United States Environmental Protection Agency has established several rules to combat the co...
A Work Revolution in U.S. Industry.
ERIC Educational Resources Information Center
Business Week, 1983
1983-01-01
Changes in work rules are moving the workplace away from rigid labor practices created by labor/management. A more flexible structure is evolving that can adapt to new technology and provide new products at competitive cost. Discusses the movement and the impact of international competition/deregulation on the trend. (JN)
Emergent 1d Ising Behavior in AN Elementary Cellular Automaton Model
NASA Astrophysics Data System (ADS)
Kassebaum, Paul G.; Iannacchione, Germano S.
The fundamental nature of an evolving one-dimensional (1D) Ising model is investigated with an elementary cellular automaton (CA) simulation. The emergent CA simulation employs an ensemble of cells in one spatial dimension, each cell capable of two microstates interacting with simple nearest-neighbor rules and incorporating an external field. The behavior of the CA model provides insight into the dynamics of coupled two-state systems not expressible by exact analytical solutions. For instance, state progression graphs show the causal dynamics of a system through time in relation to the system's entropy. Unique graphical analysis techniques are introduced through difference patterns, diffusion patterns, and state progression graphs of the 1D ensemble visualizing the evolution. All analyses are consistent with the known behavior of the 1D Ising system. The CA simulation and new pattern recognition techniques are scalable (in both dimension, complexity, and size) and have many potential applications such as complex design of materials, control of agent systems, and evolutionary mechanism design.
Oishi, Takuya; Uraguchi, Kohji; Abramov, Alexei V; Masuda, Ryuichi
2010-12-01
In order to clarify the morphological differences between two subspecies of the red fox (Vulpes vulpes) on the Japanese Islands and test the validity of Bergmann's rule, we examined geographical variations in 25 cranial and 24 dental characters in V. v. schrencki from Hokkaido and V. v. japonica from the other main islands of Japan (Honshu, Shikoku, and Kyushu). Many skull measurements, including the male greatest length, condylobasal length, and the length of upper and lower tooth rows, were significantly larger for V. v. japonica than for V. v. schrencki, whereas most tooth measurements, especially the length of molars and premolars, in V. v. schrencki were larger than those in V. v. japonica. Although the two subspecies were morphologically well-differentiated from each other, the results did not support that they have evolved following Bergmann's rule of adaptation to cold climates. Based on consideration of the relatively large differences of their tooth sizes, which are not easily influenced by food abundance, and previous genetic research on the different migration histories of the two subspecies, the morphological differences detected in the present study may have resulted not only from the present ecological differences between the two subspecies, but also from the difference of migration history and evolutionary constraints.
Automated diagnosis of coronary artery disease based on data mining and fuzzy modeling.
Tsipouras, Markos G; Exarchos, Themis P; Fotiadis, Dimitrios I; Kotsia, Anna P; Vakalis, Konstantinos V; Naka, Katerina K; Michalis, Lampros K
2008-07-01
A fuzzy rule-based decision support system (DSS) is presented for the diagnosis of coronary artery disease (CAD). The system is automatically generated from an initial annotated dataset, using a four stage methodology: 1) induction of a decision tree from the data; 2) extraction of a set of rules from the decision tree, in disjunctive normal form and formulation of a crisp model; 3) transformation of the crisp set of rules into a fuzzy model; and 4) optimization of the parameters of the fuzzy model. The dataset used for the DSS generation and evaluation consists of 199 subjects, each one characterized by 19 features, including demographic and history data, as well as laboratory examinations. Tenfold cross validation is employed, and the average sensitivity and specificity obtained is 62% and 54%, respectively, using the set of rules extracted from the decision tree (first and second stages), while the average sensitivity and specificity increase to 80% and 65%, respectively, when the fuzzification and optimization stages are used. The system offers several advantages since it is automatically generated, it provides CAD diagnosis based on easily and noninvasively acquired features, and is able to provide interpretation for the decisions made.
Jonnalagadda, Siddhartha Reddy; Li, Dingcheng; Sohn, Sunghwan; Wu, Stephen Tze-Inn; Wagholikar, Kavishwar; Torii, Manabu; Liu, Hongfang
2012-01-01
This paper describes the coreference resolution system submitted by Mayo Clinic for the 2011 i2b2/VA/Cincinnati shared task Track 1C. The goal of the task was to construct a system that links the markables corresponding to the same entity. The task organizers provided progress notes and discharge summaries that were annotated with the markables of treatment, problem, test, person, and pronoun. We used a multi-pass sieve algorithm that applies deterministic rules in the order of preciseness and simultaneously gathers information about the entities in the documents. Our system, MedCoref, also uses a state-of-the-art machine learning framework as an alternative to the final, rule-based pronoun resolution sieve. The best system that uses a multi-pass sieve has an overall score of 0.836 (average of B(3), MUC, Blanc, and CEAF F score) for the training set and 0.843 for the test set. A supervised machine learning system that typically uses a single function to find coreferents cannot accommodate irregularities encountered in data especially given the insufficient number of examples. On the other hand, a completely deterministic system could lead to a decrease in recall (sensitivity) when the rules are not exhaustive. The sieve-based framework allows one to combine reliable machine learning components with rules designed by experts. Using relatively simple rules, part-of-speech information, and semantic type properties, an effective coreference resolution system could be designed. The source code of the system described is available at https://sourceforge.net/projects/ohnlp/files/MedCoref.
2015-11-05
This final rule will update Home Health Prospective Payment System (HH PPS) rates, including the national, standardized 60-day episode payment rates, the national per-visit rates, and the non-routine medical supply (NRS) conversion factor under the Medicare prospective payment system for home health agencies (HHAs), effective for episodes ending on or after January 1, 2016. As required by the Affordable Care Act, this rule implements the 3rd year of the 4-year phase-in of the rebasing adjustments to the HH PPS payment rates. This rule updates the HH PPS case-mix weights using the most current, complete data available at the time of rulemaking and provides a clarification regarding the use of the "initial encounter'' seventh character applicable to certain ICD-10-CM code categories. This final rule will also finalize reductions to the national, standardized 60-day episode payment rate in CY 2016, CY 2017, and CY 2018 of 0.97 percent in each year to account for estimated case-mix growth unrelated to increases in patient acuity (nominal case-mix growth) between CY 2012 and CY 2014. In addition, this rule implements a HH value-based purchasing (HHVBP) model, beginning January 1, 2016, in which all Medicare-certified HHAs in selected states will be required to participate. Finally, this rule finalizes minor changes to the home health quality reporting program and minor technical regulations text changes.
NASA Astrophysics Data System (ADS)
Macian-Sorribes, Hector; Pulido-Velazquez, Manuel
2016-04-01
This contribution presents a methodology for defining optimal seasonal operating rules in multireservoir systems coupling expert criteria and stochastic optimization. Both sources of information are combined using fuzzy logic. The structure of the operating rules is defined based on expert criteria, via a joint expert-technician framework consisting in a series of meetings, workshops and surveys carried out between reservoir managers and modelers. As a result, the decision-making process used by managers can be assessed and expressed using fuzzy logic: fuzzy rule-based systems are employed to represent the operating rules and fuzzy regression procedures are used for forecasting future inflows. Once done that, a stochastic optimization algorithm can be used to define optimal decisions and transform them into fuzzy rules. Finally, the optimal fuzzy rules and the inflow prediction scheme are combined into a Decision Support System for making seasonal forecasts and simulate the effect of different alternatives in response to the initial system state and the foreseen inflows. The approach presented has been applied to the Jucar River Basin (Spain). Reservoir managers explained how the system is operated, taking into account the reservoirs' states at the beginning of the irrigation season and the inflows previewed during that season. According to the information given by them, the Jucar River Basin operating policies were expressed via two fuzzy rule-based (FRB) systems that estimate the amount of water to be allocated to the users and how the reservoir storages should be balanced to guarantee those deliveries. A stochastic optimization model using Stochastic Dual Dynamic Programming (SDDP) was developed to define optimal decisions, which are transformed into optimal operating rules embedding them into the two FRBs previously created. As a benchmark, historical records are used to develop alternative operating rules. A fuzzy linear regression procedure was employed to foresee future inflows depending on present and past hydrological and meteorological variables actually used by the reservoir managers to define likely inflow scenarios. A Decision Support System (DSS) was created coupling the FRB systems and the inflow prediction scheme in order to give the user a set of possible optimal releases in response to the reservoir states at the beginning of the irrigation season and the fuzzy inflow projections made using hydrological and meteorological information. The results show that the optimal DSS created using the FRB operating policies are able to increase the amount of water allocated to the users in 20 to 50 Mm3 per irrigation season with respect to the current policies. Consequently, the mechanism used to define optimal operating rules and transform them into a DSS is able to increase the water deliveries in the Jucar River Basin, combining expert criteria and optimization algorithms in an efficient way. This study has been partially supported by the IMPADAPT project (CGL2013-48424-C2-1-R) with Spanish MINECO (Ministerio de Economía y Competitividad) and FEDER funds. It also has received funding from the European Union's Horizon 2020 research and innovation programme under the IMPREX project (grant agreement no: 641.811).
Autonomous Flight Safety System
NASA Technical Reports Server (NTRS)
Simpson, James
2010-01-01
The Autonomous Flight Safety System (AFSS) is an independent self-contained subsystem mounted onboard a launch vehicle. AFSS has been developed by and is owned by the US Government. Autonomously makes flight termination/destruct decisions using configurable software-based rules implemented on redundant flight processors using data from redundant GPS/IMU navigation sensors. AFSS implements rules determined by the appropriate Range Safety officials.
Secure steganographic communication algorithm based on self-organizing patterns.
Saunoriene, Loreta; Ragulskis, Minvydas
2011-11-01
A secure steganographic communication algorithm based on patterns evolving in a Beddington-de Angelis-type predator-prey model with self- and cross-diffusion is proposed in this paper. Small perturbations of initial states of the system around the state of equilibrium result in the evolution of self-organizing patterns. Small differences between initial perturbations result in slight differences also in the evolving patterns. It is shown that the generation of interpretable target patterns cannot be considered as a secure mean of communication because contours of the secret image can be retrieved from the cover image using statistical techniques if only it represents small perturbations of the initial states of the system. An alternative approach when the cover image represents the self-organizing pattern that has evolved from initial states perturbed using the dot-skeleton representation of the secret image can be considered as a safe visual communication technique protecting both the secret image and communicating parties.
Forward-Chaining Versus A Graph Approach As The Inference Engine In Expert Systems
NASA Astrophysics Data System (ADS)
Neapolitan, Richard E.
1986-03-01
Rule-based expert systems are those in which a certain number of IF-THEN rules are assumed to be true. Based on the verity of some assertions, the rules deduce as many new conclusions as possible. A standard technique used to make these deductions is forward-chaining. In forward-chaining, the program or 'inference engine' cycles through the rules. At each rule, the premises for the rule are checked against the current true assertions. If all the premises are found, the conclusion is added to the list of true assertions. At that point it is necessary to start over at the first rule, since the new conclusion may be a premise in a rule already checked. Therefore, each time a new conclusion is deduced it is necessary to start the rule checking procedure over. This process continues until no new conclusions are added and the end of the list of rules is reached. The above process, although quite costly in terms of CPU cycles due to the necessity of repeatedly starting the process over, is necessary if the rules contain 'pattern variables'. An example of such a rule is, 'IF X IS A BACTERIA, THEN X CAN BE TREATED WITH ANTIBIOTICS'. Since the rule can lead to conclusions for many values of X, it is necessary to check each premise in the rule against every true assertion producing an association list to be used in the checking of the next premise. However, if the rule does not contain variable data, as is the case in many current expert systems, then a rule can lead to only one conclusion. In this case, the rules can be stored in a graph, and the true assertions in an assertion list. The assertion list is traversed only once; at each assertion a premise is triggered in all the rules which have that assertion as a premise. When all premises for a rule trigger, the rule's conclusion is added to the END of the list of assertions. It must be added at the end so that it will eventually be used to make further deductions. In the current paper, the two methods are described in detail, the relative advantages of each is discussed, and a benchmark comparing the CPU cycles consumed by each is included. It is also shown that, in the case of reasoning under uncertainty, it is possible to properly combine the certainties derived from rules arguing for the same conclusion when the graph approach is used.
2011-11-30
This final rule with comment period revises the Medicare hospital outpatient prospective payment system (OPPS) for CY 2012 to implement applicable statutory requirements and changes arising from our continuing experience with this system. In this final rule with comment period, we describe the changes to the amounts and factors used to determine the payment rates for Medicare hospital outpatient services paid under the OPPS. In addition, this final rule with comment period updates the revised Medicare ambulatory surgical center (ASC) payment system to implement applicable statutory requirements and changes arising from our continuing experience with this system. In this final rule with comment period, we set forth the relative payment weights and payment amounts for services furnished in ASCs, specific HCPCS codes to which these changes apply, and other ratesetting information for the CY 2012 ASC payment system. We are revising the requirements for the Hospital Outpatient Quality Reporting (OQR) Program, adding new requirements for ASC Quality Reporting System, and making additional changes to provisions of the Hospital Inpatient Value-Based Purchasing (VBP) Program. We also are allowing eligible hospitals and CAHs participating in the Medicare Electronic Health Record (EHR) Incentive Program to meet the clinical quality measure reporting requirement of the EHR Incentive Program for payment year 2012 by participating in the 2012 Medicare EHR Incentive Program Electronic Reporting Pilot. Finally, we are making changes to the rules governing the whole hospital and rural provider exceptions to the physician self-referral prohibition for expansion of facility capacity and changes to provider agreement regulations on patient notification requirements.
Using cellular automata to generate image representation for biological sequences.
Xiao, X; Shao, S; Ding, Y; Huang, Z; Chen, X; Chou, K-C
2005-02-01
A novel approach to visualize biological sequences is developed based on cellular automata (Wolfram, S. Nature 1984, 311, 419-424), a set of discrete dynamical systems in which space and time are discrete. By transforming the symbolic sequence codes into the digital codes, and using some optimal space-time evolvement rules of cellular automata, a biological sequence can be represented by a unique image, the so-called cellular automata image. Many important features, which are originally hidden in a long and complicated biological sequence, can be clearly revealed thru its cellular automata image. With biological sequences entering into databanks rapidly increasing in the post-genomic era, it is anticipated that the cellular automata image will become a very useful vehicle for investigation into their key features, identification of their function, as well as revelation of their "fingerprint". It is anticipated that by using the concept of the pseudo amino acid composition (Chou, K.C. Proteins: Structure, Function, and Genetics, 2001, 43, 246-255), the cellular automata image approach can also be used to improve the quality of predicting protein attributes, such as structural class and subcellular location.