Science.gov

Sample records for adaptive neural-fuzzy inference

  1. Artificial frame filling using adaptive neural fuzzy inference system for particle image velocimetry dataset

    NASA Astrophysics Data System (ADS)

    Akdemir, Bayram; Doğan, Sercan; Aksoy, Muharrem H.; Canli, Eyüp; Özgören, Muammer

    2015-03-01

    Liquid behaviors are very important in many areas, especially in mechanical engineering, and high-speed cameras are a common way to observe and study them. The camera traces dust or colored markers travelling in the liquid and takes as many pictures per second as possible, and every image carries a large amount of data owing to its resolution. For fast liquid velocities, it is not easy to evaluate the flow or to produce a fluent frame sequence from the captured images. Artificial intelligence is widely used in science to solve nonlinear problems, and the adaptive neural fuzzy inference system is a common artificial intelligence technique in the literature. Any particle travelling in a liquid has a two-dimensional velocity and its derivatives. In this study, an Adaptive Neural Fuzzy Inference System was used offline to create an artificial frame between each previous and subsequent real frame, using velocities and vorticities to estimate a crossing-point vector between the previous and subsequent points; these virtual frames are inserted among the real frames in order to improve image continuity. This makes the images much more understandable at chaotic or high-vorticity points. After the adaptive neural fuzzy inference system is applied, the image dataset doubles in size and alternates between virtual and real frames. The obtained success is evaluated using R2 testing and mean squared error; R2, which measures statistical similarity, reached 0.82, 0.81, 0.85 and 0.8 for the velocity components and their derivatives, respectively.
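
    As a rough illustration of the evaluation step described above, the sketch below (Python, assuming NumPy) interleaves model-predicted "virtual" frames between real frames and scores a prediction against a held-out real frame with R2 and mean squared error. The averaging stand-in `predict_mid` and the array sizes are placeholders for the trained neuro-fuzzy predictor, not the authors' implementation.

```python
import numpy as np

def r2_score(y_true, y_pred):
    """Coefficient of determination between flattened fields."""
    y_true, y_pred = y_true.ravel(), y_pred.ravel()
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    return 1.0 - ss_res / ss_tot

def mse(y_true, y_pred):
    return np.mean((y_true - y_pred) ** 2)

def interleave_frames(real_frames, predict_mid):
    """Insert one predicted (virtual) frame between each pair of real frames,
    roughly doubling the sequence length as described in the abstract."""
    out = []
    for prev, nxt in zip(real_frames[:-1], real_frames[1:]):
        out.append(prev)                    # real frame
        out.append(predict_mid(prev, nxt))  # virtual frame from the model
    out.append(real_frames[-1])
    return out

# Toy stand-in for the trained neuro-fuzzy predictor: here, simple averaging.
predict_mid = lambda a, b: 0.5 * (a + b)

rng = np.random.default_rng(0)
real = [rng.normal(size=(32, 32)) for _ in range(5)]   # e.g. u-velocity fields
seq = interleave_frames(real, predict_mid)
print(len(real), "->", len(seq))                        # 5 -> 9 frames

# Leave-one-out style check: predict a known middle frame and score it.
pred = predict_mid(real[0], real[2])
print("R2 :", round(r2_score(real[1], pred), 3))
print("MSE:", round(mse(real[1], pred), 3))
```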

  2. An adaptive neural fuzzy filter and its applications.

    PubMed

    Lin, C T; Juang, C F

    1997-01-01

    A new kind of nonlinear adaptive filter, the adaptive neural fuzzy filter (ANFF), based upon a neural network's learning ability and fuzzy if-then rule structure, is proposed in this paper. The ANFF is inherently a feedforward multilayered connectionist network which can learn by itself according to numerical training data or expert knowledge represented by fuzzy if-then rules. The adaptation here includes the construction of fuzzy if-then rules (structure learning), and the tuning of the free parameters of membership functions (parameter learning). In the structure learning phase, fuzzy rules are found based on the matching of input-output clusters. In the parameter learning phase, a backpropagation-like adaptation algorithm is developed to minimize the output error. There are no hidden nodes (i.e., no membership functions and fuzzy rules) initially, and both the structure learning and parameter learning are performed concurrently as the adaptation proceeds. However, if some linguistic information about the design of the filter is available, such knowledge can be put into the ANFF to form an initial structure with hidden nodes. Two major advantages of the ANFF can thus be seen: 1) a priori knowledge can be incorporated into the ANFF which makes the fusion of numerical data and linguistic information in the filter possible; and 2) no predetermination, like the number of hidden nodes, must be given, since the ANFF can find its optimal structure and parameters automatically.

  3. Urban land use and land cover classification using the neural-fuzzy inference approach with Formosat-2 data

    NASA Astrophysics Data System (ADS)

    Chen, Ho-Wen; Chang, Ni-Bin; Yu, Ruey-Fang; Huang, Yi-Wen

    2009-10-01

    This paper presents a neural-fuzzy inference approach to identify land use and land cover (LULC) patterns in large urban areas using the 8-meter resolution multi-spectral images collected by the Formosat-2 satellite. Texture and feature analyses support the retrieval of fuzzy rules in the context of data mining to discern the embedded LULC patterns via a neural-fuzzy inference approach. The case study for Taichung City in central Taiwan shows the application potential based on five LULC classes. With the aid of integrated fuzzy rules and a neural network model, the optimal weights associated with these achievable rules can be determined with phenomenological and theoretical implications. Through appropriate model training and validation stages with respect to a ground-truth data set, research findings clearly indicate that the proposed remote sensing technique can structure an improved screening and sequencing procedure when selecting rules for LULC classification. Unlike approaches that rely on broad spectral bands for category separation, this method is not limited to reliably separating only a few (4-5) classes. This normalized difference vegetation index (NDVI)-based data mining technique has shown potential for LULC pattern recognition in different regions, and is not restricted to this sensor, location or date.
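
    The abstract refers to an NDVI-based data mining technique; for context, a minimal NDVI computation is sketched below. The band values and the vegetation threshold mentioned in the comment are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def ndvi(nir, red, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red + eps)

# Toy 8 m multispectral patch; reflectance ranges here are illustrative only.
rng = np.random.default_rng(1)
red_band = rng.uniform(0.05, 0.3, size=(4, 4))
nir_band = rng.uniform(0.2, 0.6, size=(4, 4))

v = ndvi(nir_band, red_band)
print(v.round(2))
# A crisp threshold such as v > 0.3 could flag vegetated pixels; the paper
# instead feeds such indices into fuzzy rules mined from texture/feature analysis.
```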

  4. Hybrid artificial intelligence approach based on neural fuzzy inference model and metaheuristic optimization for flood susceptibility modeling in a high-frequency tropical cyclone area using GIS

    NASA Astrophysics Data System (ADS)

    Tien Bui, Dieu; Pradhan, Biswajeet; Nampak, Haleh; Bui, Quang-Thanh; Tran, Quynh-An; Nguyen, Quoc-Phi

    2016-09-01

    This paper proposes a new artificial intelligence approach based on a neural fuzzy inference system and metaheuristic optimization for flood susceptibility modeling, namely MONF. In the new approach, the neural fuzzy inference system was used to create an initial flood susceptibility model, and the model was then optimized using two metaheuristic algorithms, Evolutionary Genetic and Particle Swarm Optimization. A high-frequency tropical cyclone area of the Tuong Duong district in Central Vietnam was used as a case study. First, a GIS database for the study area was constructed. The database, which includes 76 historical flood-inundated areas and ten flood-influencing factors, was used to develop and validate the proposed model. Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the Receiver Operating Characteristic (ROC) curve, and the area under the ROC curve (AUC) were used to assess the model performance and its prediction capability. Experimental results showed that the proposed model has high performance on both the training (RMSE = 0.306, MAE = 0.094, AUC = 0.962) and validation datasets (RMSE = 0.362, MAE = 0.130, AUC = 0.911). The usability of the proposed model was evaluated by comparison with state-of-the-art benchmark soft computing techniques such as the J48 Decision Tree, Random Forest, Multi-layer Perceptron Neural Network, Support Vector Machine, and Adaptive Neuro Fuzzy Inference System. The results show that the proposed MONF model outperforms these benchmark models; we conclude that the MONF model is a new alternative tool that should be used in flood susceptibility mapping. The results of this study are useful for planners and decision makers for the sustainable management of flood-prone areas.
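
    To illustrate the metaheuristic step named above, here is a small, generic global-best Particle Swarm Optimization routine minimizing an RMSE objective over a toy parameter vector. The logistic "susceptibility model", the factor count, and all hyperparameters are assumptions made for illustration; the paper's MONF structure is not reproduced.

```python
import numpy as np

def rmse(params, X, y, model):
    return np.sqrt(np.mean((model(X, params) - y) ** 2))

def pso_minimize(obj, dim, n_particles=30, iters=200, w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain global-best particle swarm optimization over a parameter vector."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-1, 1, size=(n_particles, dim))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_val = np.array([obj(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = pos + vel
        vals = np.array([obj(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = pbest[pbest_val.argmin()].copy()
    return g, pbest_val.min()

# Toy susceptibility model: logistic combination of influencing factors.
model = lambda X, p: 1.0 / (1.0 + np.exp(-(X @ p)))
rng = np.random.default_rng(2)
X = rng.normal(size=(200, 10))           # ten flood-influencing factors
true_p = rng.normal(size=10)
y = model(X, true_p)

best, best_rmse = pso_minimize(lambda p: rmse(p, X, y, model), dim=10)
print("best training RMSE found:", round(best_rmse, 4))
```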

  5. HyFIS: adaptive neuro-fuzzy inference systems and their application to nonlinear dynamical systems.

    PubMed

    Kim, J; Kasabov, N

    1999-11-01

    This paper proposes an adaptive neuro-fuzzy system, HyFIS (Hybrid neural Fuzzy Inference System), for building and optimising fuzzy models. The proposed model introduces the learning power of neural networks to fuzzy logic systems and provides linguistic meaning to the connectionist architectures. Heuristic fuzzy logic rules and input-output fuzzy membership functions can be optimally tuned from training examples by a hybrid learning scheme comprising two phases: a rule-generation phase from data, and a rule-tuning phase using an error backpropagation learning scheme for a neural fuzzy system. To illustrate the performance and applicability of the proposed neuro-fuzzy hybrid model, extensive simulation studies of nonlinear complex dynamic systems are carried out. The proposed method can be applied to on-line incremental adaptive learning for the prediction and control of nonlinear dynamical systems. Two benchmark case studies are used to demonstrate that the proposed HyFIS system is a superior neuro-fuzzy modelling technique.

  6. A modified dynamic evolving neural-fuzzy approach to modeling customer satisfaction for affective design.

    PubMed

    Kwong, C K; Fung, K Y; Jiang, Huimin; Chan, K Y; Siu, Kin Wai Michael

    2013-01-01

    Affective design is an important aspect of product development to achieve a competitive edge in the marketplace. A neural-fuzzy network approach has recently been attempted to model customer satisfaction for affective design, and it has proved effective in dealing with the fuzziness and non-linearity of the modeling as well as in generating explicit customer satisfaction models. However, such an approach to modeling customer satisfaction has two limitations. First, it is not suitable for modeling problems that involve a large number of inputs. Second, it cannot adapt to new data sets, given that its structure is fixed once it has been developed. In this paper, a modified dynamic evolving neural-fuzzy approach is proposed to address the above-mentioned limitations. A case study on the affective design of mobile phones was conducted to illustrate the effectiveness of the proposed methodology. Validation tests were conducted and the test results indicated that: (1) the conventional Adaptive Neuro-Fuzzy Inference System (ANFIS) failed to run due to a large number of inputs; (2) the proposed dynamic neural-fuzzy model outperforms the subtractive clustering-based ANFIS model and the fuzzy c-means clustering-based ANFIS model in terms of modeling accuracy and computational effort.

  7. A Modified Dynamic Evolving Neural-Fuzzy Approach to Modeling Customer Satisfaction for Affective Design

    PubMed Central

    Kwong, C. K.; Fung, K. Y.; Jiang, Huimin; Chan, K. Y.

    2013-01-01

    Affective design is an important aspect of product development to achieve a competitive edge in the marketplace. A neural-fuzzy network approach has recently been attempted to model customer satisfaction for affective design, and it has proved effective in dealing with the fuzziness and non-linearity of the modeling as well as in generating explicit customer satisfaction models. However, such an approach to modeling customer satisfaction has two limitations. First, it is not suitable for modeling problems that involve a large number of inputs. Second, it cannot adapt to new data sets, given that its structure is fixed once it has been developed. In this paper, a modified dynamic evolving neural-fuzzy approach is proposed to address the above-mentioned limitations. A case study on the affective design of mobile phones was conducted to illustrate the effectiveness of the proposed methodology. Validation tests were conducted and the test results indicated that: (1) the conventional Adaptive Neuro-Fuzzy Inference System (ANFIS) failed to run due to a large number of inputs; (2) the proposed dynamic neural-fuzzy model outperforms the subtractive clustering-based ANFIS model and the fuzzy c-means clustering-based ANFIS model in terms of modeling accuracy and computational effort. PMID:24385884

  8. Modelling hourly dissolved oxygen concentration (DO) using dynamic evolving neural-fuzzy inference system (DENFIS)-based approach: case study of Klamath River at Miller Island Boat Ramp, OR, USA.

    PubMed

    Heddam, Salim

    2014-01-01

    In this study, we present the application of an artificial intelligence (AI) technique called the dynamic evolving neural-fuzzy inference system (DENFIS), based on an evolving clustering method (ECM), for modelling dissolved oxygen concentration in a river. To demonstrate the forecasting capability of DENFIS, a one-year period of hourly experimental water quality data, from 1 January 2009 to 30 December 2009, collected at the United States Geological Survey station (USGS Station No. 420853121505500) on the Klamath River at Miller Island Boat Ramp, OR, USA, was used for model development. Two DENFIS-based models are presented and compared: (1) an offline-based system, named DENFIS-OF, and (2) an online-based system, named DENFIS-ON. The input variables used for the two models are water pH, temperature, specific conductance, and sensor depth. The performances of the models are evaluated using root mean square error (RMSE), mean absolute error (MAE), Willmott index of agreement (d) and correlation coefficient (CC) statistics. The lowest root mean square error and highest correlation coefficient values were obtained with the DENFIS-ON method. The results obtained with the DENFIS models are compared with linear (multiple linear regression, MLR) and nonlinear (multi-layer perceptron neural network, MLPNN) methods. This study demonstrates that the DENFIS-ON model investigated herein outperforms all of the other techniques considered for DO modelling.
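
    As a simplified sketch of the evolving clustering idea behind DENFIS, the snippet below adds a new cluster (and hence a potential rule) whenever a sample falls outside a distance threshold of every existing centre, and otherwise nudges the nearest centre toward the sample. The threshold, learning rate, and synthetic water-quality regimes are assumptions; the actual ECM tracks cluster radii and is more involved.

```python
import numpy as np

def evolving_clusters(X, dthr=1.5, lr=0.1):
    """One-pass clustering: create a new cluster when no existing centre is
    within `dthr` of the sample, otherwise pull the nearest centre toward it."""
    centres = [X[0].astype(float)]
    for x in X[1:]:
        d = [np.linalg.norm(x - c) for c in centres]
        j = int(np.argmin(d))
        if d[j] > dthr:
            centres.append(x.astype(float))      # evolve: add a cluster/rule
        else:
            centres[j] += lr * (x - centres[j])  # adapt the nearest centre
    return np.array(centres)

# Inputs as in the paper: pH, temperature, specific conductance, sensor depth
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(loc, 0.2, size=(50, 4))
               for loc in (0.0, 2.0, 4.0)])      # three synthetic regimes
rng.shuffle(X)

centres = evolving_clusters(X)
print("clusters found:", len(centres))
```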

  9. Multimodel inference and adaptive management

    USGS Publications Warehouse

    Rehme, S.E.; Powell, L.A.; Allen, C.R.

    2011-01-01

    Ecology is an inherently complex science coping with correlated variables, nonlinear interactions and multiple scales of pattern and process, making it difficult for experiments to result in clear, strong inference. Natural resource managers, policy makers, and stakeholders rely on science to provide timely and accurate management recommendations. However, the time necessary to untangle the complexities of interactions within ecosystems is often far greater than the time available to make management decisions. One method of coping with this problem is multimodel inference. Multimodel inference assesses uncertainty by calculating likelihoods among multiple competing hypotheses, but multimodel inference results are often equivocal. Despite this, there may be pressure for ecologists to provide management recommendations regardless of the strength of their study’s inference. We reviewed papers in the Journal of Wildlife Management (JWM) and the journal Conservation Biology (CB) to quantify the prevalence of multimodel inference approaches, the resulting inference (weak versus strong), and how authors dealt with the uncertainty. Thirty-eight percent and 14%, respectively, of articles in the JWM and CB used multimodel inference approaches. Strong inference was rarely observed, with only 7% of JWM and 20% of CB articles resulting in strong inference. We found the majority of weak inference papers in both journals (59%) gave specific management recommendations. Model selection uncertainty was ignored in most recommendations for management. We suggest that adaptive management is an ideal method to resolve uncertainty when research results in weak inference.
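
    Multimodel inference as described here weighs competing hypotheses by their relative likelihoods; a standard way to express that is with Akaike weights, sketched below under the assumption that AIC scores for the candidate models are already available (the values shown are hypothetical).

```python
import numpy as np

def akaike_weights(aic):
    """Relative likelihoods of competing models from their AIC scores."""
    aic = np.asarray(aic, dtype=float)
    delta = aic - aic.min()            # AIC differences from the best model
    rel = np.exp(-0.5 * delta)
    return rel / rel.sum()

# Hypothetical AIC values for four competing hypotheses/models
aic = [1002.3, 1003.1, 1004.0, 1011.7]
w = akaike_weights(aic)
print(w.round(3))
# If no single weight dominates (e.g. all < ~0.9), the inference is weak and,
# as the authors argue, adaptive management is one way to resolve the
# remaining model-selection uncertainty.
```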

  10. ARPOP: an appetitive reward-based pseudo-outer-product neural fuzzy inference system inspired from the operant conditioning of feeding behavior in Aplysia.

    PubMed

    Cheu, Eng Yeow; Quek, Chai; Ng, See Kiong

    2012-02-01

    Appetitive operant conditioning in Aplysia for feeding behavior via the electrical stimulation of the esophageal nerve contingently reinforces each spontaneous bite during the feeding process. This results in the acquisition of operant memory by the contingently reinforced animals. Analysis of the cellular and molecular mechanisms of the feeding motor circuitry revealed that activity-dependent neuronal modulation occurs at the interneurons that mediate feeding behaviors. This provides evidence that interneurons are possible loci of plasticity and constitute another mechanism for memory storage in addition to memory storage attributed to activity-dependent synaptic plasticity. In this paper, an associative ambiguity correction-based neuro-fuzzy network, called appetitive reward-based pseudo-outer-product-compositional rule of inference [ARPOP-CRI(S)], is trained based on an appetitive reward-based learning algorithm which is biologically inspired by the appetitive operant conditioning of the feeding behavior in Aplysia. A variant of the Hebbian learning rule called Hebbian concomitant learning is proposed as the building block in the neuro-fuzzy network learning algorithm. The proposed algorithm possesses the distinguishing features of the sequential learning algorithm. In addition, the proposed ARPOP-CRI(S) neuro-fuzzy system encodes fuzzy knowledge in the form of linguistic rules that satisfies the semantic criteria for low-level fuzzy model interpretability. ARPOP-CRI(S) is evaluated and compared against other modeling techniques using benchmark time-series datasets. Experimental results are encouraging and show that ARPOP-CRI(S) is a viable modeling technique for time-variant problem domains.

  11. Automated adaptive inference of phenomenological dynamical models

    NASA Astrophysics Data System (ADS)

    Daniels, Bryan C.; Nemenman, Ilya

    2015-08-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved.

  12. Automated adaptive inference of phenomenological dynamical models

    PubMed Central

    Daniels, Bryan C.; Nemenman, Ilya

    2015-01-01

    Dynamics of complex systems is often driven by large and intricate networks of microscopic interactions, whose sheer size obfuscates understanding. With limited experimental data, many parameters of such dynamics are unknown, and thus detailed, mechanistic models risk overfitting and making faulty predictions. At the other extreme, simple ad hoc models often miss defining features of the underlying systems. Here we develop an approach that instead constructs phenomenological, coarse-grained models of network dynamics that automatically adapt their complexity to the available data. Such adaptive models produce accurate predictions even when microscopic details are unknown. The approach is computationally tractable, even for a relatively large number of dynamical variables. Using simulated data, it correctly infers the phase space structure for planetary motion, avoids overfitting in a biological signalling system and produces accurate predictions for yeast glycolysis with tens of data points and over half of the interacting species unobserved. PMID:26293508

  13. Statistical Inference for Data Adaptive Target Parameters.

    PubMed

    Hubbard, Alan E; Kherad-Pajouh, Sara; van der Laan, Mark J

    2016-05-01

    Consider one observes n i.i.d. copies of a random variable with a probability distribution that is known to be an element of a particular statistical model. In order to define our statistical target we partition the sample into V equal-size sub-samples, and use this partitioning to define V splits into an estimation sample (one of the V subsamples) and a corresponding complementary parameter-generating sample. For each of the V parameter-generating samples, we apply an algorithm that maps the sample to a statistical target parameter. We define our sample-split data adaptive statistical target parameter as the average of these V sample-specific target parameters. We present an estimator (and corresponding central limit theorem) of this type of data adaptive target parameter. This general methodology for generating data adaptive target parameters is demonstrated with a number of practical examples that highlight new opportunities for statistical learning from data. This new framework provides a rigorous statistical methodology for both exploratory and confirmatory analysis within the same data. Given that more research is becoming "data-driven", the theory developed within this paper provides a new impetus for a greater involvement of statistical inference in problems that are being increasingly addressed by clever, yet ad hoc pattern finding methods. To suggest such potential, and to verify the predictions of the theory, extensive simulation studies, along with a data analysis based on adaptively determined intervention rules, are shown and give insight into how to structure such an approach. The results show that the data adaptive target parameter approach provides a general framework and resulting methodology for data-driven science.
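
    A minimal sketch of the sample-splitting scheme described above: each parameter-generating sample (V-1 folds) defines the target data-adaptively, the held-out estimation fold estimates it, and the V estimates are averaged. The "pick the most correlated covariate, report its OLS slope" algorithm and the fold-spread standard error are illustrative assumptions; the paper's CLT-based inference is not reproduced here.

```python
import numpy as np

def data_adaptive_slope(X, y, V=5, seed=0):
    """Average over V splits of a data-adaptively defined target parameter."""
    n = len(y)
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(n), V)
    estimates = []
    for v in range(V):
        est_idx = folds[v]                                   # estimation sample
        gen_idx = np.concatenate([folds[u] for u in range(V) if u != v])
        # Step 1: data-adaptive definition of the target on the generating sample
        corrs = [abs(np.corrcoef(X[gen_idx, j], y[gen_idx])[0, 1])
                 for j in range(X.shape[1])]
        j_star = int(np.argmax(corrs))
        # Step 2: estimate that parameter on the independent estimation sample
        x = X[est_idx, j_star]
        slope = np.cov(x, y[est_idx])[0, 1] / np.var(x, ddof=1)
        estimates.append(slope)
    # Spread across folds is reported descriptively only; the paper derives a
    # proper CLT-based standard error for this kind of target parameter.
    return np.mean(estimates), np.std(estimates, ddof=1) / np.sqrt(V)

rng = np.random.default_rng(4)
X = rng.normal(size=(500, 8))
y = 1.5 * X[:, 2] + rng.normal(size=500)       # covariate 2 truly matters
est, se = data_adaptive_slope(X, y)
print(round(est, 3), "+/-", round(se, 3))
```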

  14. Inference in Adaptive Regression via the Kac-Rice Formula

    DTIC Science & Technology

    2014-05-15

    Inference in Adaptive Regression via the Kac-Rice Formula. Jonathan Taylor, Joshua Loftus, Ryan J. Tibshirani, Department of Statistics, Stanford. ...general adaptive regression setting. Our approach uses the Kac-Rice formula (as described in Adler & Taylor 2007) applied to the problem of maximizing a...

  15. Active Inference, homeostatic regulation and adaptive behavioural control.

    PubMed

    Pezzulo, Giovanni; Rigoli, Francesco; Friston, Karl

    2015-11-01

    We review a theory of homeostatic regulation and adaptive behavioural control within the Active Inference framework. Our aim is to connect two research streams that are usually considered independently; namely, Active Inference and associative learning theories of animal behaviour. The former uses a probabilistic (Bayesian) formulation of perception and action, while the latter calls on multiple (Pavlovian, habitual, goal-directed) processes for homeostatic and behavioural control. We offer a synthesis of these classical processes and cast them as successive hierarchical contextualisations of sensorimotor constructs, using the generative models that underpin Active Inference. This dissolves any apparent mechanistic distinction between the optimization processes that mediate classical control or learning. Furthermore, we generalize the scope of Active Inference by emphasizing interoceptive inference and homeostatic regulation. The ensuing homeostatic (or allostatic) perspective provides an intuitive explanation for how priors act as drives or goals to enslave action, and emphasises the embodied nature of inference.

  16. Active Inference, homeostatic regulation and adaptive behavioural control

    PubMed Central

    Pezzulo, Giovanni; Rigoli, Francesco; Friston, Karl

    2015-01-01

    We review a theory of homeostatic regulation and adaptive behavioural control within the Active Inference framework. Our aim is to connect two research streams that are usually considered independently; namely, Active Inference and associative learning theories of animal behaviour. The former uses a probabilistic (Bayesian) formulation of perception and action, while the latter calls on multiple (Pavlovian, habitual, goal-directed) processes for homeostatic and behavioural control. We offer a synthesis of these classical processes and cast them as successive hierarchical contextualisations of sensorimotor constructs, using the generative models that underpin Active Inference. This dissolves any apparent mechanistic distinction between the optimization processes that mediate classical control or learning. Furthermore, we generalize the scope of Active Inference by emphasizing interoceptive inference and homeostatic regulation. The ensuing homeostatic (or allostatic) perspective provides an intuitive explanation for how priors act as drives or goals to enslave action, and emphasises the embodied nature of inference. PMID:26365173

  17. Adaptive inference for distinguishing credible from incredible patterns in nature

    USGS Publications Warehouse

    Holling, Crawford S.; Allen, C.R.

    2002-01-01

    Strong inference is a powerful and rapid tool that can be used to identify and explain patterns in molecular biology, cell biology, and physiology. It is effective where causes are single and separable and where discrimination between pairwise alternative hypotheses can be determined experimentally by a simple yes or no answer. But causes in ecological systems are multiple and overlapping and are not entirely separable. Frequently, competing hypotheses cannot be distinguished by a single unambiguous test, but only by a suite of tests of different kinds, that produce a body of evidence to support one line of argument and not others. We call this process "adaptive inference". Instead of pitting each member of a pair of hypotheses against each other, adaptive inference relies on the exuberant invention of multiple, competing hypotheses, after which carefully structured comparative data are used to explore the logical consequences of each. Herein we present an example that demonstrates the attributes of adaptive inference that have developed out of a 30-year study of the resilience of ecosystems.

  18. GenSo-EWS: a novel neural-fuzzy based early warning system for predicting bank failures.

    PubMed

    Tung, W L; Quek, C; Cheng, P

    2004-05-01

    Bank failure prediction is an important issue for the regulators of the banking industries. The collapse and failure of a bank could trigger an adverse financial repercussion and generate negative impacts such as a massive bail-out cost for the failing bank and loss of confidence from the investors and depositors. Very often, bank failures are due to financial distress. Hence, it is desirable to have an early warning system (EWS) that identifies potential bank failure or high-risk banks through the traits of financial distress. Various traditional statistical models have been employed to study bank failures [J Finance 1 (1975) 21; J Banking Finance 1 (1977) 249; J Banking Finance 10 (1986) 511; J Banking Finance 19 (1995) 1073]. However, these models do not have the capability to identify the characteristics of financial distress and thus function as black boxes. This paper proposes the use of a new neural fuzzy system [Foundations of neuro-fuzzy systems, 1997], namely the Generic Self-organising Fuzzy Neural Network (GenSoFNN) [IEEE Trans Neural Networks 13 (2002c) 1075] based on the compositional rule of inference (CRI) [Commun ACM 37 (1975) 77], as an alternative to predict banking failure. The CRI-based GenSoFNN neural fuzzy network, henceforth denoted as GenSoFNN-CRI(S), functions as an EWS and is able to identify the inherent traits of financial distress based on financial covariates (features) derived from publicly available financial statements. The interaction between the selected features is captured in the form of highly intuitive IF-THEN fuzzy rules. Such easily comprehensible rules provide insights into the possible characteristics of financial distress and form the knowledge base for a highly desired EWS that aids bank regulation. The performance of the GenSoFNN-CRI(S) network is subsequently benchmarked against that of Cox's proportional hazards model [J Banking Finance 10 (1986) 511; J Banking Finance 19 (1995) 1073], the multi

  19. A neural fuzzy controller learning by fuzzy error propagation

    NASA Technical Reports Server (NTRS)

    Nauck, Detlef; Kruse, Rudolf

    1992-01-01

    In this paper, we describe a procedure to integrate techniques for the adaptation of membership functions in a linguistic-variable-based fuzzy control environment by using neural network learning principles. This is an extension to our work. We solve this problem by defining a fuzzy error that is propagated back through the architecture of our fuzzy controller. According to this fuzzy error and the strength of its antecedent, each fuzzy rule determines its amount of error. Depending on the current state of the controlled system and the control action derived from the conclusion, each rule tunes the membership functions of its antecedent and its conclusion. In this way we obtain an unsupervised learning technique that enables a fuzzy controller to adapt to a control task knowing only the global state and the fuzzy error.

  20. Adaptive cluster expansion for inferring boltzmann machines with noisy data.

    PubMed

    Cocco, S; Monasson, R

    2011-03-04

    We introduce a procedure to infer the interactions among a set of binary variables, based on their sampled frequencies and pairwise correlations. The algorithm builds the clusters of variables contributing most to the entropy of the inferred Ising model and rejects the small contributions due to the sampling noise. Our procedure successfully recovers benchmark Ising models even at criticality and in the low temperature phase, and is applied to neurobiological data.
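
    The algorithm's inputs are the sampled frequencies and pairwise correlations; the sketch below computes those from 0/1 data and shows the closed-form coupling for an isolated pair of binary variables (a two-variable cluster contribution). This is only a building block, not the full adaptive cluster expansion.

```python
import numpy as np

def pair_statistics(S):
    """Sampled frequencies p_i and pairwise frequencies p_ij from 0/1 data
    (rows = samples, columns = variables)."""
    p = S.mean(axis=0)
    pij = (S.T @ S) / len(S)
    return p, pij

def pair_coupling(S, i, j, eps=1e-9):
    """Ising coupling for an isolated pair of 0/1 variables: the log odds
    ratio J = log(p11*p00 / (p10*p01))."""
    si, sj = S[:, i], S[:, j]
    p11 = np.mean(si * sj) + eps
    p10 = np.mean(si * (1 - sj)) + eps
    p01 = np.mean((1 - si) * sj) + eps
    p00 = np.mean((1 - si) * (1 - sj)) + eps
    return np.log(p11 * p00 / (p10 * p01))

rng = np.random.default_rng(5)
S = (rng.random((5000, 4)) < 0.4).astype(int)
S[:, 1] = np.where(rng.random(5000) < 0.8, S[:, 0], S[:, 1])  # couple vars 0 and 1
p, pij = pair_statistics(S)
print("frequencies:", p.round(2))
print("J(0,1) =", round(pair_coupling(S, 0, 1), 2),
      " J(0,2) =", round(pair_coupling(S, 0, 2), 2))
```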

  21. Adaptive Cluster Expansion for Inferring Boltzmann Machines with Noisy Data

    NASA Astrophysics Data System (ADS)

    Cocco, S.; Monasson, R.

    2011-03-01

    We introduce a procedure to infer the interactions among a set of binary variables, based on their sampled frequencies and pairwise correlations. The algorithm builds the clusters of variables contributing most to the entropy of the inferred Ising model and rejects the small contributions due to the sampling noise. Our procedure successfully recovers benchmark Ising models even at criticality and in the low temperature phase, and is applied to neurobiological data.

  22. An Interval Type-2 Neural Fuzzy System for Online System Identification and Feature Elimination.

    PubMed

    Lin, Chin-Teng; Pal, Nikhil R; Wu, Shang-Lin; Liu, Yu-Ting; Lin, Yang-Yin

    2015-07-01

    We propose an integrated mechanism for discarding derogatory features and extracting fuzzy rules based on an interval type-2 neural fuzzy system (NFS); in fact, it is a more general scheme that can discard bad features, irrelevant antecedent clauses, and even irrelevant rules. High-dimensional input variables and a large number of rules not only increase the computational complexity of NFSs but also reduce their interpretability. Therefore, a mechanism for the simultaneous extraction of fuzzy rules and reduction of the impact of (or elimination of) inferior features is necessary. The proposed approach, namely an interval type-2 Neural Fuzzy System for online System Identification and Feature Elimination (IT2NFS-SIFE), uses type-2 fuzzy sets to model uncertainties associated with information and data in designing the knowledge base. The consequent part of the IT2NFS-SIFE is of Takagi-Sugeno-Kang type with interval weights. The IT2NFS-SIFE possesses a self-evolving property that can automatically generate fuzzy rules. Poor features can be discarded through the concept of a membership modulator. The antecedent and modulator weights are learned using a gradient descent algorithm. The consequent part weights are tuned via the rule-ordered Kalman filter algorithm to enhance learning effectiveness. Simulation results show that IT2NFS-SIFE not only simplifies the system architecture by eliminating derogatory/irrelevant antecedent clauses, rules, and features but also maintains excellent performance.

  23. Some challenges with statistical inference in adaptive designs.

    PubMed

    Hung, H M James; Wang, Sue-Jane; Yang, Peiling

    2014-01-01

    Adaptive designs have generated a great deal of attention in clinical trial communities. The literature contains many statistical methods to deal with the added statistical uncertainties concerning the adaptations. Increasingly encountered in regulatory applications are adaptive statistical information designs that allow modification of the sample size or related statistical information, and adaptive selection designs that allow selection of doses or patient populations during the course of a clinical trial. For adaptive statistical information designs, a few statistical testing methods are mathematically equivalent, as a number of articles have stipulated, but arguably there are large differences in their practical ramifications. We pinpoint some undesirable features of these methods in this work. For adaptive selection designs, selection based on biomarker data for testing the correlated clinical endpoints may increase statistical uncertainty in terms of type I error probability, and most importantly the increased statistical uncertainty may be impossible to assess.

  24. Action adaptation during natural unfolding social scenes influences action recognition and inferences made about actor beliefs.

    PubMed

    Keefe, Bruce D; Wincenciak, Joanna; Jellema, Tjeerd; Ward, James W; Barraclough, Nick E

    2016-07-01

    When observing another individual's actions, we can both recognize their actions and infer their beliefs concerning the physical and social environment. The extent to which visual adaptation influences action recognition and conceptually later stages of processing involved in deriving the belief state of the actor remains unknown. To explore this we used virtual reality (life-size photorealistic actors presented in stereoscopic three dimensions) to see how visual adaptation influences the perception of individuals in naturally unfolding social scenes at increasingly higher levels of action understanding. We presented scenes in which one actor picked up boxes (of varying number and weight), after which a second actor picked up a single box. Adaptation to the first actor's behavior systematically changed perception of the second actor. Aftereffects increased with the duration of the first actor's behavior, declined exponentially over time, and were independent of view direction. Inferences about the second actor's expectation of box weight were also distorted by adaptation to the first actor. Distortions in action recognition and actor expectations did not, however, extend across different actions, indicating that adaptation is not acting at an action-independent abstract level but rather at an action-dependent level. We conclude that although adaptation influences more complex inferences about belief states of individuals, this is likely to be a result of adaptation at an earlier action recognition stage rather than adaptation operating at a higher, more abstract level in mentalizing or simulation systems.

  25. Outcome-adaptive lasso: Variable selection for causal inference.

    PubMed

    Shortreed, Susan M; Ertefaie, Ashkan

    2017-03-08

    Methodological advancements, including propensity score methods, have resulted in improved unbiased estimation of treatment effects from observational data. Traditionally, a "throw in the kitchen sink" approach has been used to select covariates for inclusion in the propensity score, but recent work shows that including unnecessary covariates can impact both the bias and statistical efficiency of propensity score estimators. In particular, the inclusion of covariates that impact exposure but not the outcome can inflate standard errors without improving bias, while the inclusion of covariates associated with the outcome but unrelated to exposure can improve precision. We propose the outcome-adaptive lasso for selecting appropriate covariates for inclusion in propensity score models, to account for confounding bias while maintaining statistical efficiency. The proposed approach can perform variable selection in the presence of a large number of spurious covariates, that is, covariates unrelated to outcome or exposure. We present theoretical and simulation results indicating that the outcome-adaptive lasso selects the propensity score model that includes all true confounders and predictors of outcome, while excluding other covariates. We illustrate covariate selection using the outcome-adaptive lasso, including comparison to alternative approaches, using simulated data and a survey of patients using opioid therapy to manage chronic pain.
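
    A heavily simplified sketch of the idea: penalty weights for an L1-penalized propensity model are taken from a preliminary outcome regression, so covariates that do not predict the outcome are penalized away. The data-generating setup, the weighting exponent, the feature-rescaling trick, and the regularization constant are all assumptions for illustration; this is not the authors' estimator or their tuning procedure.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, LogisticRegression

rng = np.random.default_rng(6)
n, p = 2000, 10
X = rng.normal(size=(n, p))
# X0, X1: confounders; X2: outcome-only predictor; X3: exposure-only; rest: noise
A = rng.binomial(1, 1 / (1 + np.exp(-(0.8 * X[:, 0] + 0.8 * X[:, 1] + 1.0 * X[:, 3]))))
Y = 2.0 * A + 1.0 * X[:, 0] + 1.0 * X[:, 1] + 1.0 * X[:, 2] + rng.normal(size=n)

# Step 1: preliminary outcome regression gives per-covariate coefficients.
beta_out = LinearRegression().fit(np.column_stack([A, X]), Y).coef_[1:]

# Step 2: adaptive-lasso weights w_j = |beta_j|^(-gamma); covariates that do
# not predict the outcome get heavily penalized in the propensity model.
gamma = 2.0
w = (np.abs(beta_out) + 1e-8) ** (-gamma)

# Step 3: weighted L1 propensity model via the usual rescaling trick:
# fit a plain L1 logistic regression on X_j / w_j, then rescale back.
Xs = X / w
ps_model = LogisticRegression(penalty="l1", solver="liblinear", C=1.0).fit(Xs, A)
coef = ps_model.coef_.ravel() / w
# Expectation (not guaranteed for every seed): confounders X0, X1 retained;
# exposure-only X3 and the noise covariates shrunk to (near) zero.
print(np.round(coef, 2))
```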

  26. INTEGRATING EVOLUTIONARY AND FUNCTIONAL APPROACHES TO INFER ADAPTATION AT SPECIFIC LOCI

    PubMed Central

    Storz, Jay F.; Wheat, Christopher W.

    2010-01-01

    Inferences about adaptation at specific loci are often exclusively based on the static analysis of DNA sequence variation. Ideally, population-genetic evidence for positive selection serves as a stepping-off point for experimental studies to elucidate the functional significance of the putatively adaptive variation. We argue that inferences about adaptation at specific loci are best achieved by integrating the indirect, retrospective insights provided by population-genetic analyses with the more direct, mechanistic insights provided by functional experiments. Integrative studies of adaptive genetic variation may sometimes be motivated by experimental insights into molecular function, which then provide the impetus to perform population genetic tests to evaluate whether the functional variation is of adaptive significance. In other cases, studies may be initiated by genome scans of DNA variation to identify candidate loci for recent adaptation. Results of such analyses can then motivate experimental efforts to test whether the identified candidate loci do in fact contribute to functional variation in some fitness-related phenotype. Functional studies can provide corroborative evidence for positive selection at particular loci, and can potentially reveal specific molecular mechanisms of adaptation. PMID:20500215

  27. Specificity and timescales of cortical adaptation as inferences about natural movie statistics

    PubMed Central

    Snow, Michoel; Coen-Cagli, Ruben; Schwartz, Odelia

    2016-01-01

    Adaptation is a phenomenological umbrella term under which a variety of temporal contextual effects are grouped. Previous models have shown that some aspects of visual adaptation reflect optimal processing of dynamic visual inputs, suggesting that adaptation should be tuned to the properties of natural visual inputs. However, the link between natural dynamic inputs and adaptation is poorly understood. Here, we extend a previously developed Bayesian modeling framework for spatial contextual effects to the temporal domain. The model learns temporal statistical regularities of natural movies and links these statistics to adaptation in primary visual cortex via divisive normalization, a ubiquitous neural computation. In particular, the model divisively normalizes the present visual input by the past visual inputs only to the degree that these are inferred to be statistically dependent. We show that this flexible form of normalization reproduces classical findings on how brief adaptation affects neuronal selectivity. Furthermore, prior knowledge acquired by the Bayesian model from natural movies can be modified by prolonged exposure to novel visual stimuli. We show that this updating can explain classical results on contrast adaptation. We also simulate the recent finding that adaptation maintains population homeostasis, namely, a balanced level of activity across a population of neurons with different orientation preferences. Consistent with previous disparate observations, our work further clarifies the influence of stimulus-specific and neuronal-specific normalization signals in adaptation. PMID:27699416
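
    A toy rendering of the computation named in the abstract: the current drive is divisively normalized by past drives only to the extent that past and present are inferred to be statistically dependent. The functional form and the numbers are illustrative assumptions, not the paper's fitted model.

```python
import numpy as np

def normalized_response(drive_now, drives_past, dependence, sigma=1.0):
    """Divisive normalization of the current drive by past drives, scaled by
    how statistically dependent past and present are inferred to be (0..1)."""
    pool = dependence * np.mean(drives_past)
    return drive_now / (sigma + pool)

drive_now = 4.0
drives_past = np.array([3.5, 4.2, 3.9])        # responses to the adaptor

# Adaptor inferred to be dependent on the test stimulus -> strong suppression
print(round(normalized_response(drive_now, drives_past, dependence=1.0), 2))
# Adaptor inferred to be independent -> little adaptation
print(round(normalized_response(drive_now, drives_past, dependence=0.1), 2))
```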

  28. Review of Medical Image Classification using the Adaptive Neuro-Fuzzy Inference System

    PubMed Central

    Hosseini, Monireh Sheikh; Zekri, Maryam

    2012-01-01

    Image classification is an issue that utilizes image processing, pattern recognition and classification methods. Automatic medical image classification is a progressive area in image classification, and it is expected to be more developed in the future. Because of this fact, automatic diagnosis can assist pathologists by providing second opinions and reducing their workload. This paper reviews the application of the adaptive neuro-fuzzy inference system (ANFIS) as a classifier in medical image classification during the past 16 years. ANFIS is a fuzzy inference system (FIS) implemented in the framework of an adaptive fuzzy neural network. It combines the explicit knowledge representation of an FIS with the learning power of artificial neural networks. The objective of ANFIS is to integrate the best features of fuzzy systems and neural networks. A brief comparison with other classifiers, main advantages and drawbacks of this classifier are investigated. PMID:23493054
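
    For readers unfamiliar with the structure that ANFIS tunes, below is a minimal first-order Sugeno (Takagi-Sugeno-Kang) forward pass with Gaussian memberships, product firing strengths, and a normalized weighted sum of linear consequents. The rule parameters are made up for illustration, and no training (the part ANFIS adds) is shown.

```python
import numpy as np

def gauss(x, c, s):
    return np.exp(-0.5 * ((x - c) / s) ** 2)

def sugeno_forward(x, rules):
    """First-order Sugeno inference: each rule has per-input Gaussian
    memberships (antecedent) and linear consequent parameters p, r, so that
    output_k = p . x + r. The crisp output is the firing-strength-weighted
    average of the rule outputs."""
    firings, outputs = [], []
    for rule in rules:
        mu = np.prod([gauss(xi, c, s) for xi, (c, s) in zip(x, rule["mf"])])
        firings.append(mu)
        outputs.append(np.dot(rule["p"], x) + rule["r"])
    firings = np.array(firings)
    return np.dot(firings, outputs) / (firings.sum() + 1e-12)

# Two toy rules over two inputs (e.g. two image features)
rules = [
    {"mf": [(0.0, 1.0), (0.0, 1.0)], "p": np.array([0.5, -0.2]), "r": 0.1},
    {"mf": [(3.0, 1.0), (3.0, 1.0)], "p": np.array([-0.1, 0.4]), "r": 0.9},
]
print(round(sugeno_forward(np.array([0.2, -0.1]), rules), 3))  # dominated by rule 1
print(round(sugeno_forward(np.array([2.8, 3.1]), rules), 3))   # dominated by rule 2
```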

  29. Review of Medical Image Classification using the Adaptive Neuro-Fuzzy Inference System.

    PubMed

    Hosseini, Monireh Sheikh; Zekri, Maryam

    2012-01-01

    Image classification is an issue that utilizes image processing, pattern recognition and classification methods. Automatic medical image classification is a progressive area in image classification, and it is expected to be more developed in the future. Because of this fact, automatic diagnosis can assist pathologists by providing second opinions and reducing their workload. This paper reviews the application of the adaptive neuro-fuzzy inference system (ANFIS) as a classifier in medical image classification during the past 16 years. ANFIS is a fuzzy inference system (FIS) implemented in the framework of an adaptive fuzzy neural network. It combines the explicit knowledge representation of an FIS with the learning power of artificial neural networks. The objective of ANFIS is to integrate the best features of fuzzy systems and neural networks. A brief comparison with other classifiers, main advantages and drawbacks of this classifier are investigated.

  30. Adaptability and phenotypic stability of common bean genotypes through Bayesian inference.

    PubMed

    Corrêa, A M; Teodoro, P E; Gonçalves, M C; Barroso, L M A; Nascimento, M; Santos, A; Torres, F E

    2016-04-27

    This study used Bayesian inference to investigate the genotype x environment interaction in common bean grown in Mato Grosso do Sul State, and it also evaluated the efficiency of using informative and minimally informative a priori distributions. Six trials were conducted in randomized blocks, and the grain yield of 13 common bean genotypes was assessed. To represent the minimally informative a priori distributions, a probability distribution with high variance was used, and a meta-analysis concept was adopted to represent the informative a priori distributions. Bayes factors were used to conduct comparisons between the a priori distributions. The Bayesian inference was effective for the selection of upright common bean genotypes with high adaptability and phenotypic stability using the Eberhart and Russell method. Bayes factors indicated that the use of informative a priori distributions provided more accurate results than minimally informative a priori distributions. According to Bayesian inference, the EMGOPA-201, BAMBUÍ, CNF 4999, CNF 4129 A 54, and CNFv 8025 genotypes had specific adaptability to favorable environments, while the IAPAR 14 and IAC CARIOCA ETE genotypes had specific adaptability to unfavorable environments.
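
    The genotypes are assessed with the Eberhart and Russell method; a minimal, non-Bayesian sketch of its two quantities is given below: the regression slope of each genotype's yield on an environmental index (adaptability) and the variance of deviations from that regression (stability). The simulated yields are illustrative, and the paper's Bayesian layer and prior comparisons are not reproduced.

```python
import numpy as np

def eberhart_russell(Y):
    """Y[i, j] = mean yield of genotype i in environment j.
    Returns per-genotype regression slope b_i (adaptability) and the
    variance of deviations from the regression (stability)."""
    env_index = Y.mean(axis=0) - Y.mean()           # environmental index I_j
    b, s2d = [], []
    for yi in Y:
        slope = np.cov(env_index, yi)[0, 1] / np.var(env_index, ddof=1)
        intercept = yi.mean() - slope * env_index.mean()
        resid = yi - (intercept + slope * env_index)
        b.append(slope)
        s2d.append(np.var(resid, ddof=1))
    return np.array(b), np.array(s2d)

rng = np.random.default_rng(7)
n_gen, n_env = 13, 6                                # 13 genotypes, 6 trials, as in the study
base = rng.normal(2.5, 0.3, size=(n_gen, 1))        # illustrative mean yields
env = rng.normal(0.0, 0.5, size=(1, n_env))
sens = rng.uniform(0.5, 1.5, size=(n_gen, 1))       # genotype-specific responsiveness
Y = base + sens * env + rng.normal(0, 0.1, size=(n_gen, n_env))

b, s2d = eberhart_russell(Y)
# b > 1 suggests specific adaptability to favourable environments, b < 1 to
# unfavourable ones; small s2d indicates phenotypic stability.
print(np.round(b, 2))
```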

  31. Perspectives of probabilistic inferences: Reinforcement learning and an adaptive network compared.

    PubMed

    Rieskamp, Jörg

    2006-11-01

    The assumption that people possess a strategy repertoire for inferences has been raised repeatedly. The strategy selection learning theory specifies how people select strategies from this repertoire. The theory assumes that individuals select strategies proportional to their subjective expectations of how well the strategies solve particular problems; such expectations are assumed to be updated by reinforcement learning. The theory is compared with an adaptive network model that assumes people make inferences by integrating information according to a connectionist network. The network's weights are modified by error correction learning. The theories were tested against each other in 2 experimental studies. Study 1 showed that people substantially improved their inferences through feedback, which was appropriately predicted by the strategy selection learning theory. Study 2 examined a dynamic environment in which the strategies' performances changed. In this situation a quick adaptation to the new situation was not observed; rather, individuals got stuck on the strategy they had successfully applied previously. This "inertia effect" was most strongly predicted by the strategy selection learning theory.

  32. Statistical inference for response adaptive randomization procedures with adjusted optimal allocation proportions.

    PubMed

    Zhu, Hongjian

    2016-12-12

    Seamless phase II/III clinical trials have attracted increasing attention recently. They mainly use Bayesian response adaptive randomization (RAR) designs. There has been little research into seamless clinical trials using frequentist RAR designs because of the difficulty in performing valid statistical inference following this procedure. The well-designed frequentist RAR designs can target theoretically optimal allocation proportions, and they have explicit asymptotic results. In this paper, we study the asymptotic properties of frequentist RAR designs with adjusted target allocation proportions, and investigate statistical inference for this procedure. The properties of the proposed design provide an important theoretical foundation for advanced seamless clinical trials. Our numerical studies demonstrate that the design is ethical and efficient.

  33. Classification of diabetes maculopathy images using data-adaptive neuro-fuzzy inference classifier.

    PubMed

    Ibrahim, Sulaimon; Chowriappa, Pradeep; Dua, Sumeet; Acharya, U Rajendra; Noronha, Kevin; Bhandary, Sulatha; Mugasa, Hatwib

    2015-12-01

    Prolonged diabetes retinopathy leads to diabetes maculopathy, which causes gradual and irreversible loss of vision. It is important for physicians to have a decision system that detects the early symptoms of the disease. This can be achieved by building a classification model using machine learning algorithms. Fuzzy logic classifiers group data elements with a degree of membership in multiple classes by defining membership functions for each attribute. Various methods have been proposed to determine the partitioning of membership functions in a fuzzy logic inference system. A clustering method partitions the membership functions by grouping data that have high similarity into clusters, while an equalized universe method partitions data into predefined equal clusters. The distribution of each attribute determines its partitioning as fine or coarse. A simple grid partitioning partitions each attribute equally and is therefore not effective in handling varying distribution amongst the attributes. A data-adaptive method uses a data frequency-driven approach to partition each attribute based on the distribution of data in that attribute. A data-adaptive neuro-fuzzy inference system creates corresponding rules for both finely distributed and coarsely distributed attributes. This method produced more useful rules and a more effective classification system. We obtained an overall accuracy of 98.55%.
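
    To make the contrast concrete, the snippet below compares an equalized-universe partition (equally spaced membership centres) with a data-frequency-driven partition (quantile-based centres) for a skewed attribute. The quantile rule and the synthetic distribution are assumptions used only to illustrate the idea of data-adaptive partitioning.

```python
import numpy as np

def grid_centres(x, k):
    """Equalized-universe partition: k equally spaced membership centres."""
    return np.linspace(x.min(), x.max(), k)

def data_adaptive_centres(x, k):
    """Frequency-driven partition: centres at equally spaced quantiles, so
    densely populated regions of the attribute get finer membership functions."""
    return np.quantile(x, np.linspace(0.05, 0.95, k))

rng = np.random.default_rng(8)
x = rng.lognormal(mean=0.0, sigma=0.8, size=1000)   # a skewed (coarsely distributed) attribute

print("grid     :", grid_centres(x, 5).round(2))
print("adaptive :", data_adaptive_centres(x, 5).round(2))
# The adaptive centres crowd where the data live, which is the behaviour the
# data-adaptive neuro-fuzzy classifier exploits to generate more useful rules.
```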

  34. Design and Inference for the Intent to Treat Principle using Adaptive Treatment Strategies and Sequential Randomization

    PubMed Central

    Dawson, Ree; Lavori, Philip W.

    2015-01-01

    Nonadherence to assigned treatment jeopardizes the power and interpretability of intent-to-treat comparisons from clinical trial data and continues to be an issue for effectiveness studies, despite their pragmatic emphasis. We posit that new approaches to design need to complement developments in methods for causal inference to address nonadherence, in both experimental and practice settings. This paper considers the conventional study design for psychiatric research and other medical contexts, in which subjects are randomized to treatments that are fixed throughout the trial and presents an alternative that converts the fixed treatments into an adaptive intervention that reflects best practice. The key element is the introduction of an adaptive decision point midway into the study to address a patient's reluctance to remain on treatment before completing a full-length trial of medication. The clinical uncertainty about the appropriate adaptation prompts a second randomization at the new decision point to evaluate relevant options. Additionally, the standard ‘all-or-none’ principal stratification (PS) framework is applied to the first stage of the design to address treatment discontinuation that occurs too early for a mid-trial adaptation. Drawing upon the adaptive intervention features, we develop assumptions to identify the PS causal estimand and introduce restrictions on outcome distributions to simplify Expectation-Maximization calculations. We evaluate the performance of the PS setup, with particular attention to the role played by a binary covariate. The results emphasize the importance of collecting covariate data for use in design and analysis. We consider the generality of our approach beyond the setting of psychiatric research. PMID:25581413

  35. Inference for Optimal Dynamic Treatment Regimes using an Adaptive m-out-of-n Bootstrap Scheme

    PubMed Central

    Chakraborty, Bibhas; Laber, Eric B.; Zhao, Yingqi

    2013-01-01

    A dynamic treatment regime consists of a set of decision rules that dictate how to individualize treatment to patients based on available treatment and covariate history. A common method for estimating an optimal dynamic treatment regime from data is Q-learning, which involves nonsmooth operations on the data. This nonsmoothness causes standard asymptotic approaches for inference, like the bootstrap or Taylor series arguments, to break down if applied without correction. Here, we consider the m-out-of-n bootstrap for constructing confidence intervals for the parameters indexing the optimal dynamic regime. We propose an adaptive choice of m and show that it produces asymptotically correct confidence sets under fixed alternatives. Furthermore, the proposed method has the advantage of being conceptually and computationally much simpler than competing methods possessing this same theoretical property. We provide an extensive simulation study to compare the proposed method with currently available inference procedures. The results suggest that the proposed method delivers nominal coverage while being less conservative than alternatives. The proposed methods are implemented in the qLearn R-package and have been made available on the Comprehensive R-Archive Network (http://cran.r-project.org/). Analysis of the Sequenced Treatment Alternatives to Relieve Depression (STAR*D) study is used as an illustrative example. PMID:23845276
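
    A basic sketch of the m-out-of-n bootstrap for a nonsmooth statistic is shown below. A fixed sub-sample size is used for illustration, whereas the paper's contribution is an adaptive, data-driven choice of m within the Q-learning setting, which is not reproduced here.

```python
import numpy as np

def m_out_of_n_ci(x, stat, m, B=2000, alpha=0.05, seed=0):
    """Basic m-out-of-n bootstrap confidence interval for stat(x).
    Resamples of size m (< n) are drawn with replacement; the quantiles of
    sqrt(m)*(stat* - stat_hat) are rescaled by sqrt(n) to form the interval."""
    rng = np.random.default_rng(seed)
    n = len(x)
    theta = stat(x)
    roots = np.array([np.sqrt(m) * (stat(rng.choice(x, size=m, replace=True)) - theta)
                      for _ in range(B)])
    lo, hi = np.quantile(roots, [alpha / 2, 1 - alpha / 2])
    return theta - hi / np.sqrt(n), theta - lo / np.sqrt(n)

# Nonsmooth example: max(mean, 0), a toy analogue of the max operations in Q-learning
rng = np.random.default_rng(9)
x = rng.normal(loc=0.0, scale=1.0, size=400)
stat = lambda s: max(np.mean(s), 0.0)

n = len(x)
m = int(n ** 0.8)          # fixed sub-sample size here; the paper chooses m adaptively
print(m_out_of_n_ci(x, stat, m))
```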

  36. Adaptive Path Selection for Link Loss Inference in Network Tomography Applications

    PubMed Central

    Qiao, Yan; Jiao, Jun; Rao, Yuan; Ma, Huimin

    2016-01-01

    In this study, we address the problem of selecting the optimal end-to-end paths for link loss inference in order to improve the performance of network tomography applications, which infer the link loss rates from the path loss rates. Measuring the path loss rates using end-to-end probing packets may incur additional traffic overheads for networks, so it is important to select the minimum path set carefully while maximizing their performance. The usual approach is to select the maximum independent paths from the candidates simultaneously, while the other paths can be replaced by linear combinations of them. However, this approach ignores the fact that many paths always exist that do not lose any packets, and thus it is easy to determine that all of the links of these paths also have 0 loss rates. Not considering these good paths will inevitably lead to inefficiency and high probing costs. Thus, we propose an adaptive path selection method that selects paths sequentially based on the loss rates of previously selected paths. We also propose a theorem as well as a graph construction and decomposition approach to efficiently find the most valuable path during each round of selection. Our new method significantly outperforms the classical path selection method based on simulations in terms of the probing cost, number of accurate links determined, and the running speed. PMID:27701447

  37. On an adaptive preconditioned Crank-Nicolson MCMC algorithm for infinite dimensional Bayesian inference

    NASA Astrophysics Data System (ADS)

    Hu, Zixi; Yao, Zhewei; Li, Jinglai

    2017-03-01

    Many scientific and engineering problems require performing Bayesian inference for unknowns of infinite dimension. In such problems, many standard Markov Chain Monte Carlo (MCMC) algorithms become arbitrarily slow under mesh refinement, which is referred to as being dimension dependent. To this end, a family of dimension-independent MCMC algorithms, known as the preconditioned Crank-Nicolson (pCN) methods, was proposed to sample the infinite-dimensional parameters. In this work we develop an adaptive version of the pCN algorithm, where the covariance operator of the proposal distribution is adjusted based on the sampling history to improve the simulation efficiency. We show that the proposed algorithm satisfies an important ergodicity condition under some mild assumptions. Finally we provide numerical examples to demonstrate the performance of the proposed method.
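
    The sketch below implements the standard (non-adaptive) pCN step in a small finite-dimensional surrogate: the proposal leaves the Gaussian prior invariant, so the acceptance ratio depends only on the likelihood potential. The toy forward map, prior covariance, and step size beta are assumptions; the paper's adaptive covariance adjustment is not included.

```python
import numpy as np

def pcn_mcmc(phi, C_sqrt, n_steps=5000, beta=0.2, seed=0):
    """Preconditioned Crank-Nicolson sampler for a posterior proportional to
    exp(-phi(u)) * N(0, C) with C = C_sqrt @ C_sqrt.T. The proposal
    v = sqrt(1-beta^2) u + beta xi, xi ~ N(0, C), preserves the prior, so the
    acceptance probability is min(1, exp(phi(u) - phi(v)))."""
    rng = np.random.default_rng(seed)
    d = C_sqrt.shape[0]
    u = C_sqrt @ rng.normal(size=d)
    samples, n_acc = [], 0
    for _ in range(n_steps):
        xi = C_sqrt @ rng.normal(size=d)
        v = np.sqrt(1 - beta ** 2) * u + beta * xi
        if np.log(rng.random()) < phi(u) - phi(v):
            u, n_acc = v, n_acc + 1
        samples.append(u.copy())
    return np.array(samples), n_acc / n_steps

# Toy inference: Gaussian prior N(0, C), likelihood potential from one noisy datum
C_sqrt = np.diag([1.0, 0.5, 0.25])
y_obs, noise = 1.0, 0.3
phi = lambda u: 0.5 * ((u.sum() - y_obs) / noise) ** 2   # misfit of forward map u -> sum(u)

samples, acc = pcn_mcmc(phi, C_sqrt)
print("acceptance rate:", round(acc, 2))
print("posterior mean :", samples[1000:].mean(axis=0).round(2))
```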

  38. Evolving RBF neural networks for adaptive soft-sensor design.

    PubMed

    Alexandridis, Alex

    2013-12-01

    This work presents an adaptive framework for building soft-sensors based on radial basis function (RBF) neural network models. The adaptive fuzzy means algorithm is utilized in order to evolve an RBF network, which approximates the unknown system based on input-output data from it. The methodology gradually builds the RBF network model, based on two separate levels of adaptation: On the first level, the structure of the hidden layer is modified by adding or deleting RBF centers, while on the second level, the synaptic weights are adjusted with the recursive least squares with exponential forgetting algorithm. The proposed approach is tested on two different systems, namely a simulated nonlinear DC Motor and a real industrial reactor. The results show that the produced soft-sensors can be successfully applied to model the two nonlinear systems. A comparison with two different adaptive modeling techniques, namely a dynamic evolving neural-fuzzy inference system (DENFIS) and neural networks trained with online backpropagation, highlights the advantages of the proposed methodology.
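
    The second adaptation level described above adjusts the synaptic (output-layer) weights with recursive least squares with exponential forgetting; a minimal version of that update for a fixed set of Gaussian RBF centres is sketched below. The centres, widths, forgetting factor, and drifting toy process are assumptions, and the fuzzy-means centre add/delete logic of the first adaptation level is not reproduced.

```python
import numpy as np

def rbf_features(x, centres, width=1.0):
    """Gaussian RBF activations (plus a bias term) for a 1-D input."""
    phi = np.exp(-((x - centres) ** 2) / (2 * width ** 2))
    return np.append(phi, 1.0)

class RLSForgetting:
    """Recursive least squares with exponential forgetting factor lam."""
    def __init__(self, dim, lam=0.98, delta=100.0):
        self.w = np.zeros(dim)
        self.P = delta * np.eye(dim)
        self.lam = lam

    def update(self, phi, y):
        Pphi = self.P @ phi
        k = Pphi / (self.lam + phi @ Pphi)          # gain vector
        self.w += k * (y - phi @ self.w)            # weight correction
        self.P = (self.P - np.outer(k, Pphi)) / self.lam
        return phi @ self.w

centres = np.linspace(-3, 3, 7)
rls = RLSForgetting(dim=len(centres) + 1)

rng = np.random.default_rng(10)
for t in range(2000):
    x = rng.uniform(-3, 3)
    drift = 0.0 if t < 1000 else 0.5                # the process changes mid-stream
    y = np.sin(x) + drift + 0.05 * rng.normal()
    y_hat = rls.update(rbf_features(x, centres), y)

print("tracking error after drift:", round(abs(y_hat - y), 3))
```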

  39. Hydrological time series modeling: A comparison between adaptive neuro-fuzzy, neural network and autoregressive techniques

    NASA Astrophysics Data System (ADS)

    Lohani, A. K.; Kumar, Rakesh; Singh, R. D.

    2012-06-01

    Time series modeling is necessary for the planning and management of reservoirs. More recently, soft computing techniques have been used in hydrological modeling and forecasting. In this study, the potential of artificial neural networks and neuro-fuzzy systems in monthly reservoir inflow forecasting is examined by developing and comparing monthly reservoir inflow prediction models based on autoregressive (AR) models, artificial neural networks (ANNs) and the adaptive neural-based fuzzy inference system (ANFIS). To account for the effect of monthly periodicity in the flow data, cyclic terms are also included in the ANN and ANFIS models. Working with time series flow data of the Sutlej River at Bhakra Dam, India, several ANN and adaptive neuro-fuzzy models are trained with different input vectors. To evaluate the performance of the selected ANN and adaptive neural fuzzy inference system (ANFIS) models, a comparison is made with the autoregressive (AR) models. The ANFIS model trained with an input data vector including previous inflows and cyclic terms of monthly periodicity shows a significant improvement in forecast accuracy in comparison with the ANFIS models trained with input vectors considering only previous inflows. In all cases ANFIS gives more accurate forecasts than the AR and ANN models. The proposed ANFIS model coupled with the cyclic terms is shown to provide a better representation of monthly inflows for the planning and operation of the reservoir.
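
    One common way to encode the "cyclic terms of monthly periodicity" mentioned above is a sine/cosine pair on the month index; the sketch below builds input vectors of lagged inflows plus such terms. The lag count, the synthetic inflow series, and the sine/cosine encoding itself are assumptions about the representation, not details taken from the paper.

```python
import numpy as np

def build_inputs(inflow, n_lags=2):
    """Input vectors of previous monthly inflows plus cyclic terms
    sin(2*pi*m/12), cos(2*pi*m/12) encoding the month of the target value."""
    X, y = [], []
    for t in range(n_lags, len(inflow)):
        month = t % 12
        cyc = [np.sin(2 * np.pi * month / 12), np.cos(2 * np.pi * month / 12)]
        X.append(list(inflow[t - n_lags:t]) + cyc)
        y.append(inflow[t])
    return np.array(X), np.array(y)

rng = np.random.default_rng(11)
months = np.arange(240)                               # 20 years of monthly data
inflow = 100 + 40 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 5, size=240)

X, y = build_inputs(inflow)
print(X.shape)        # (238, 4): two lagged inflows + sin/cos month terms
# X would be fed to the ANFIS/ANN models; the cyclic columns let the model
# separate seasonal effects from persistence in the lagged inflows.
```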

  20. Bayesian inference for an adaptive Ordered Probit model: an application to Brain Computer Interfacing.

    PubMed

    Yoon, Ji Won; Roberts, Stephen J; Dyson, Mathew; Gan, John Q

    2011-09-01

    This paper proposes an algorithm for adaptive, sequential classification in systems with unknown labeling errors, focusing on the biomedical application of Brain Computer Interfacing (BCI). The method is shown to be robust in the presence of label and sensor noise. We focus on the inference and prediction of target labels under a nonlinear and non-Gaussian model. In order to handle missing or erroneous labeling, we model observed labels as a noisy observation of a latent label set with multiple classes (≥ 2). Whilst this paper focuses on the method's application to BCI systems, the algorithm has the potential to be applied to many application domains in which sequential missing labels are to be imputed in the presence of uncertainty. This dynamic classification algorithm combines an Ordered Probit model and an Extended Kalman Filter (EKF). The EKF estimates the parameters of the Ordered Probit model sequentially with time. We test the performance of the classification approach by processing synthetic datasets and real experimental EEG signals with multiple classes (2, 3 and 4 labels) for a Brain Computer Interfacing (BCI) experiment.
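
    For orientation, the observation model underlying an Ordered Probit classifier can be sketched as below; the EKF parameter tracking described in the abstract is not shown, and the function is a generic illustration rather than the authors' code.

```python
import numpy as np
from scipy.stats import norm

def ordered_probit_probs(eta, cutpoints):
    """Class probabilities of an Ordered Probit model.

    eta       : latent linear predictor (scalar or 1-D array).
    cutpoints : increasing thresholds c_1 < ... < c_{K-1}; with K classes the
                probability of class k is Phi(c_k - eta) - Phi(c_{k-1} - eta),
                using c_0 = -inf and c_K = +inf.
    """
    c = np.concatenate(([-np.inf], np.asarray(cutpoints, float), [np.inf]))
    eta = np.atleast_1d(eta)[..., None]
    cdf = norm.cdf(c - eta)
    return np.diff(cdf, axis=-1)   # one row of K class probabilities per eta
```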

  1. Racing to learn: statistical inference and learning in a single spiking neuron with adaptive kernels

    PubMed Central

    Afshar, Saeed; George, Libin; Tapson, Jonathan; van Schaik, André; Hamilton, Tara J.

    2014-01-01

    This paper describes the Synapto-dendritic Kernel Adapting Neuron (SKAN), a simple spiking neuron model that performs statistical inference and unsupervised learning of spatiotemporal spike patterns. SKAN is the first proposed neuron model to investigate the effects of dynamic synapto-dendritic kernels and demonstrate their computational power even at the single neuron scale. The rule-set defining the neuron is simple: there are no complex mathematical operations such as normalization, exponentiation or even multiplication. The functionalities of SKAN emerge from the real-time interaction of simple additive and binary processes. Like a biological neuron, SKAN is robust to signal and parameter noise, and can utilize both in its operations. At the network scale neurons are locked in a race with each other with the fastest neuron to spike effectively “hiding” its learnt pattern from its neighbors. The robustness to noise, high speed, and simple building blocks not only make SKAN an interesting neuron model in computational neuroscience, but also make it ideal for implementation in digital and analog neuromorphic systems which is demonstrated through an implementation in a Field Programmable Gate Array (FPGA). Matlab, Python, and Verilog implementations of SKAN are available at: http://www.uws.edu.au/bioelectronics_neuroscience/bens/reproducible_research. PMID:25505378

  2. Adaptive neuro-fuzzy inference system for real-time monitoring of integrated-constructed wetlands.

    PubMed

    Dzakpasu, Mawuli; Scholz, Miklas; McCarthy, Valerie; Jordan, Siobhán; Sani, Abdulkadir

    2015-01-01

    Monitoring large-scale treatment wetlands is costly and time-consuming, but required by regulators. Some analytical results are available only after 5 days or even longer. Thus, adaptive neuro-fuzzy inference system (ANFIS) models were developed to predict the effluent concentrations of 5-day biochemical oxygen demand (BOD5) and NH4-N from a full-scale integrated constructed wetland (ICW) treating domestic wastewater. The ANFIS models were developed and validated with a 4-year data set from the ICW system. Cost-effective variables that are quicker and easier to measure were selected as possible predictors based on the strength of their correlation with the outputs. A self-organizing neural network was applied to extract the most relevant input variables from all the possible input variables. Fuzzy subtractive clustering was used to identify the architecture of the ANFIS models and to optimize the fuzzy rules, overall improving network performance. According to the findings, ANFIS could predict the effluent quality variation quite well. Effluent BOD5 and NH4-N concentrations were predicted relatively accurately by other effluent water quality parameters, which can be measured within a few hours. The simulated effluent BOD5 and NH4-N concentrations fitted the measured concentrations well, which was also supported by a relatively low mean squared error. Thus, ANFIS can be useful for real-time monitoring and control of ICW systems.

  3. Multiple Adaptive Neuro-Fuzzy Inference System with Automatic Features Extraction Algorithm for Cervical Cancer Recognition

    PubMed Central

    Subhi Al-batah, Mohammad; Mat Isa, Nor Ashidi; Klaib, Mohammad Fadel; Al-Betar, Mohammed Azmi

    2014-01-01

    To date, cancer of the uterine cervix is still a leading cause of cancer-related deaths in women worldwide. The current methods (i.e., Pap smear and liquid-based cytology (LBC)) to screen for cervical cancer are time-consuming and dependent on the skill of the cytopathologist and thus are rather subjective. Therefore, this paper presents an intelligent computer vision system to assist pathologists in overcoming these problems and, consequently, produce more accurate results. The developed system consists of two stages. In the first stage, the automatic features extraction (AFE) algorithm is performed. In the second stage, a neuro-fuzzy model called multiple adaptive neuro-fuzzy inference system (MANFIS) is proposed for the recognition process. The MANFIS contains a set of ANFIS models which are arranged in parallel combination to produce a model with a multi-input-multi-output structure. The system is capable of classifying cervical cell images into three groups, namely, normal, low-grade squamous intraepithelial lesion (LSIL) and high-grade squamous intraepithelial lesion (HSIL). The experimental results prove the capability of the AFE algorithm to be as effective as manual extraction by human experts, while the proposed MANFIS produces a good classification performance with 94.2% accuracy. PMID:24707316
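
    The parallel multi-input-multi-output arrangement described for MANFIS can be illustrated generically as below; since no standard ANFIS library is assumed here, a Ridge regressor stands in for each single-output ANFIS member, and all names are placeholders.

```python
import numpy as np

class ParallelMultiOutput:
    """Generic multi-input-multi-output wrapper that trains one single-output
    model per output, mirroring the parallel arrangement described for MANFIS.
    `base_factory` builds a fresh single-output model with fit/predict."""
    def __init__(self, base_factory, n_outputs):
        self.models = [base_factory() for _ in range(n_outputs)]

    def fit(self, X, Y):
        for j, model in enumerate(self.models):
            model.fit(X, Y[:, j])
        return self

    def predict(self, X):
        return np.column_stack([m.predict(X) for m in self.models])

# Usage sketch with a stand-in regressor (an actual ANFIS would replace it).
if __name__ == "__main__":
    from sklearn.linear_model import Ridge
    X = np.random.rand(50, 4)
    Y = np.random.rand(50, 3)          # e.g. scores for normal / LSIL / HSIL
    manfis_like = ParallelMultiOutput(lambda: Ridge(alpha=1.0), n_outputs=3)
    manfis_like.fit(X, Y)
    print(manfis_like.predict(X[:2]).shape)   # (2, 3)
```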

  4. Intelligent Modeling Combining Adaptive Neuro Fuzzy Inference System and Genetic Algorithm for Optimizing Welding Process Parameters

    NASA Astrophysics Data System (ADS)

    Gowtham, K. N.; Vasudevan, M.; Maduraimuthu, V.; Jayakumar, T.

    2011-04-01

    Modified 9Cr-1Mo ferritic steel is used as a structural material for steam generator components of power plants. Generally, tungsten inert gas (TIG) welding is preferred for welding of these steels, in which the depth of penetration achievable during autogenous welding is limited. Therefore, activated flux TIG (A-TIG) welding, a novel welding technique, has been developed in-house to increase the depth of penetration. In modified 9Cr-1Mo steel joints produced by the A-TIG welding process, weld bead width, depth of penetration, and heat-affected zone (HAZ) width play an important role in determining the mechanical properties as well as the performance of the weld joints during service. To obtain the desired weld bead geometry and HAZ width, it becomes important to set the welding process parameters. In this work, an adaptive neuro-fuzzy inference system is used to develop independent models correlating the welding process parameters like current, voltage, and torch speed with weld bead shape parameters like depth of penetration, bead width, and HAZ width. Then a genetic algorithm is employed to determine the optimum A-TIG welding process parameters to obtain the desired weld bead shape parameters and HAZ width.
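
    A hedged sketch of the second stage, a genetic algorithm searching the process-parameter space over a trained surrogate, is shown below; the surrogate `anfis_predict`, the parameter bounds and the target bead geometry are all hypothetical placeholders, not values from the study.

```python
import numpy as np

def simple_ga(fitness, bounds, pop_size=40, gens=100, mut_sigma=0.05, rng=None):
    """Minimal real-coded genetic algorithm, used here only to illustrate
    searching the process-parameter space over a trained surrogate."""
    rng = rng or np.random.default_rng(1)
    lo, hi = np.array(bounds, dtype=float).T
    pop = rng.uniform(lo, hi, size=(pop_size, len(bounds)))
    for _ in range(gens):
        fit = np.array([fitness(ind) for ind in pop])
        # binary tournament selection
        idx = rng.integers(0, pop_size, size=(pop_size, 2))
        winners = np.where(fit[idx[:, 0]] > fit[idx[:, 1]], idx[:, 0], idx[:, 1])
        parents = pop[winners]
        # arithmetic crossover between consecutive parents
        alpha = rng.uniform(size=(pop_size, 1))
        children = alpha * parents + (1.0 - alpha) * np.roll(parents, 1, axis=0)
        # Gaussian mutation scaled to the box, then clipped to stay feasible
        children += rng.normal(0.0, mut_sigma, children.shape) * (hi - lo)
        pop = np.clip(children, lo, hi)
    fit = np.array([fitness(ind) for ind in pop])
    return pop[np.argmax(fit)]

# Hypothetical surrogate standing in for the trained ANFIS models that map
# (current, voltage, torch speed) to (depth of penetration, bead width, HAZ width).
def anfis_predict(p):
    return np.array([0.02 * p[0], 0.5 * p[1], 0.01 * p[2]])

target = np.array([2.5, 6.0, 1.2])   # desired depth, bead width, HAZ width (mm)
best = simple_ga(lambda p: -np.sum((anfis_predict(p) - target) ** 2),
                 bounds=[(80.0, 200.0), (8.0, 14.0), (60.0, 140.0)])
print("suggested (current, voltage, torch speed):", best)
```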

  5. Racing to learn: statistical inference and learning in a single spiking neuron with adaptive kernels.

    PubMed

    Afshar, Saeed; George, Libin; Tapson, Jonathan; van Schaik, André; Hamilton, Tara J

    2014-01-01

    This paper describes the Synapto-dendritic Kernel Adapting Neuron (SKAN), a simple spiking neuron model that performs statistical inference and unsupervised learning of spatiotemporal spike patterns. SKAN is the first proposed neuron model to investigate the effects of dynamic synapto-dendritic kernels and demonstrate their computational power even at the single neuron scale. The rule-set defining the neuron is simple: there are no complex mathematical operations such as normalization, exponentiation or even multiplication. The functionalities of SKAN emerge from the real-time interaction of simple additive and binary processes. Like a biological neuron, SKAN is robust to signal and parameter noise, and can utilize both in its operations. At the network scale neurons are locked in a race with each other with the fastest neuron to spike effectively "hiding" its learnt pattern from its neighbors. The robustness to noise, high speed, and simple building blocks not only make SKAN an interesting neuron model in computational neuroscience, but also make it ideal for implementation in digital and analog neuromorphic systems which is demonstrated through an implementation in a Field Programmable Gate Array (FPGA). Matlab, Python, and Verilog implementations of SKAN are available at: http://www.uws.edu.au/bioelectronics_neuroscience/bens/reproducible_research.

  6. Classifying work rate from heart rate measurements using an adaptive neuro-fuzzy inference system.

    PubMed

    Kolus, Ahmet; Imbeau, Daniel; Dubé, Philippe-Antoine; Dubeau, Denise

    2016-05-01

    In a new approach based on adaptive neuro-fuzzy inference systems (ANFIS), field heart rate (HR) measurements were used to classify work rate into four categories: very light, light, moderate, and heavy. Inter-participant variability (physiological and physical differences) was considered. Twenty-eight participants performed Meyer and Flenghi's step-test and a maximal treadmill test, during which heart rate and oxygen consumption (VO2) were measured. Results indicated that heart rate monitoring (HR, HRmax, and HRrest) and body weight are significant variables for classifying work rate. The ANFIS classifier showed superior sensitivity, specificity, and accuracy compared to current practice using established work rate categories based on percent heart rate reserve (%HRR). The ANFIS classifier showed an overall 29.6% difference in classification accuracy and a good balance between sensitivity (90.7%) and specificity (95.2%) on average. With its ease of implementation and variable measurement, the ANFIS classifier shows potential for widespread use by practitioners for work rate assessment.
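
    The %HRR baseline mentioned above can be computed as sketched below; the category cut points in this sketch are illustrative placeholders, not the thresholds used in the study.

```python
def percent_hrr(hr, hr_rest, hr_max):
    """Percent heart rate reserve (%HRR)."""
    return 100.0 * (hr - hr_rest) / (hr_max - hr_rest)

def classify_work_rate(hr, hr_rest, hr_max, cuts=(30.0, 40.0, 60.0)):
    """Rule-based baseline: map %HRR to one of four work-rate categories.
    The cut points are illustrative placeholders only."""
    p = percent_hrr(hr, hr_rest, hr_max)
    labels = ("very light", "light", "moderate", "heavy")
    for cut, label in zip(cuts, labels):
        if p < cut:
            return label
    return labels[-1]

print(classify_work_rate(hr=110, hr_rest=60, hr_max=190))  # falls in 'light'
```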

  7. Proteomics-inferred genome typing (PIGT) demonstrates inter-population recombination as a strategy for environmental adaptation

    SciTech Connect

    Denef, Vincent; Verberkmoes, Nathan C; Shah, Manesh B; Abraham, Paul E; Lefsrud, Mark G; Hettich, Robert (Bob) L; Banfield, Jillian F.

    2009-01-01

    Analyses of ecological and evolutionary processes that shape microbial consortia are facilitated by comprehensive studies of ecosystems with low species richness. In the current study we evaluated the role of recombination in altering the fitness of chemoautotrophic bacteria in their natural environment. Proteomics-inferred genome typing (PIGT) was used to determine the genomic make-up of Leptospirillum group II populations in 27 biofilms sampled from six locations in the Richmond Mine acid mine drainage system (Iron Mountain, CA) over a four-year period. We observed six distinct genotypes that are recombinants comprised of segments from two parental genotypes. Community genomic analyses revealed additional low abundance recombinant variants. The dominance of some genotypes despite a larger available genome pool, and patterns of spatiotemporal distribution within the ecosystem, indicate selection for distinct recombinants. Genes involved in motility, signal transduction and transport were overrepresented in the tens to hundreds of kilobase recombinant blocks, whereas core metabolic functions were significantly underrepresented. Our findings demonstrate the power of PIGT and reveal that recombination is a mechanism for fine-scale adaptation in this system.

  8. Limitations of a morphological criterion of adaptive inference in the fossil record.

    PubMed

    Ravosa, Matthew J; Menegaz, Rachel A; Scott, Jeremiah E; Daegling, David J; McAbee, Kevin R

    2016-11-01

    Experimental analyses directly inform how an anatomical feature or complex functions during an organism's lifetime, which serves to increase the efficacy of comparative studies of living and fossil taxa. In the mammalian skull, food material properties and feeding behaviour have a pronounced influence on the development of the masticatory apparatus. Diet-related variation in loading magnitude and frequency induce a cascade of changes at the gross, tissue, cellular, protein and genetic levels, with such modelling and remodelling maintaining the integrity of oral structures vis-à-vis routine masticatory stresses. Ongoing integrative research using rabbit and rat models of long-term masticatory plasticity offers unique insight into the limitations of functional interpretations of fossilised remains. Given the general restriction of the palaeontological record to bony elements, we argue that failure to account for the disparity in the hierarchical network of responses of hard versus soft tissues may overestimate the magnitude of the adaptive divergence that is inferred from phenotypic differences. Second, we note that the developmental onset and duration of a loading stimulus associated with a given feeding behaviour can impart large effects on patterns of intraspecific variation that can mirror differences observed among taxa. Indeed, plasticity data are relevant to understanding evolutionary transformations because rabbits raised on different diets exhibit levels of morphological disparity comparable to those found between closely related primate species that vary in diet. Lastly, pronounced variation in joint form, and even joint function, can also characterise adult conspecifics that differ solely in age. In sum, our analyses emphasise the importance of a multi-site and hierarchical approach to understanding determinants of morphological variation, one which incorporates critical data on performance.

  9. Adaptive evolution of chloroplast genome structure inferred using a parametric bootstrap approach

    PubMed Central

    Cui, Liying; Leebens-Mack, Jim; Wang, Li-San; Tang, Jijun; Rymarquis, Linda; Stern, David B; dePamphilis, Claude W

    2006-01-01

    Background Genome rearrangements influence gene order and the configuration of gene clusters in all genomes. Most land plant chloroplast DNAs (cpDNAs) share a highly conserved gene content and, with notable exceptions, a largely co-linear gene order. Conserved gene orders may reflect a slow intrinsic rate of neutral chromosomal rearrangements, or selective constraint. It is unknown to what extent observed changes in gene order are random or adaptive. We investigate the influence of natural selection on gene order in association with an increased rate of chromosomal rearrangement. We use a novel parametric bootstrap approach to test if directional selection is responsible for the clustering of functionally related genes observed in the highly rearranged chloroplast genome of the unicellular green alga Chlamydomonas reinhardtii, relative to ancestral chloroplast genomes. Results Ancestral gene orders were inferred and then subjected to simulated rearrangement events under the random breakage model with varying ratios of inversions and transpositions. We found that adjacent chloroplast genes in C. reinhardtii were located on the same strand much more frequently than in simulated genomes generated under a random rearrangement process (increased sidedness; p < 0.0001). In addition, functionally related genes were found to be more clustered than those evolved under random rearrangements (p < 0.0001). We report evidence of co-transcription of neighboring genes, which may be responsible for the observed gene clusters in C. reinhardtii cpDNA. Conclusion Simulations and experimental evidence suggest that both selective maintenance and directional selection for gene clusters are determinants of chloroplast gene order. PMID:16469102

  10. qPR: An adaptive partial-report procedure based on Bayesian inference

    PubMed Central

    Baek, Jongsoo; Lesmes, Luis Andres; Lu, Zhong-Lin

    2016-01-01

    Iconic memory is best assessed with the partial report procedure in which an array of letters appears briefly on the screen and a poststimulus cue directs the observer to report the identity of the cued letter(s). Typically, 6–8 cue delays or 600–800 trials are tested to measure the iconic memory decay function. Here we develop a quick partial report, or qPR, procedure based on a Bayesian adaptive framework to estimate the iconic memory decay function with much reduced testing time. The iconic memory decay function is characterized by an exponential function and a joint probability distribution of its three parameters. Starting with a prior of the parameters, the method selects the stimulus to maximize the expected information gain in the next test trial. It then updates the posterior probability distribution of the parameters based on the observer's response using Bayesian inference. The procedure is reiterated until either the total number of trials or the precision of the parameter estimates reaches a certain criterion. Simulation studies showed that only 100 trials were necessary to reach an average absolute bias of 0.026 and a precision of 0.070 (both in terms of probability correct). A psychophysical validation experiment showed that estimates of the iconic memory decay function obtained with 100 qPR trials exhibited good precision (the half width of the 68.2% credible interval = 0.055) and excellent agreement with those obtained with 1,600 trials of the conventional method of constant stimuli procedure (RMSE = 0.063). Quick partial-report relieves the data collection burden in characterizing iconic memory and makes it possible to assess iconic memory in clinical populations. PMID:27580045
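
    A much-simplified sketch of the Bayesian adaptive loop, reduced to a binary correct/incorrect response per trial and illustrative parameter grids, is given below; the actual qPR procedure uses the full partial-report design, parameterization and priors described in the paper, so everything named here is an assumption.

```python
import numpy as np

def decay(t, p0, pinf, tau):
    """Assumed exponential form of the iconic-memory decay function:
    probability correct as a function of cue delay t (seconds)."""
    return pinf + (p0 - pinf) * np.exp(-t / tau)

class QPRSketch:
    """Grid-based sketch of a Bayesian adaptive loop in the spirit of qPR:
    maintain a posterior over (p0, pinf, tau) and pick the cue delay with the
    largest expected information gain. Grids and ranges are illustrative."""
    def __init__(self, delays):
        p0s = np.linspace(0.6, 1.0, 9)
        pinfs = np.linspace(0.05, 0.5, 10)
        taus = np.linspace(0.05, 1.0, 20)
        self.grid = np.array([(a, b, c) for a in p0s for b in pinfs for c in taus])
        self.log_post = np.zeros(len(self.grid))          # flat prior
        self.delays = np.asarray(delays, dtype=float)
        # predicted P(correct) for every parameter combination and delay
        self.pc = decay(self.delays[None, :], *self.grid.T[:, :, None])

    def _posterior(self):
        w = np.exp(self.log_post - self.log_post.max())
        return w / w.sum()

    def next_delay(self):
        post = self._posterior()
        p_marg = post @ self.pc                           # marginal P(correct | delay)
        h = lambda p: -(p * np.log(p + 1e-12) + (1 - p) * np.log(1 - p + 1e-12))
        eig = h(p_marg) - post @ h(self.pc)               # expected information gain
        return int(np.argmax(eig))

    def update(self, delay_idx, correct):
        p = self.pc[:, delay_idx]
        self.log_post += np.log(p if correct else 1.0 - p)

# Usage sketch: pick a delay, run a trial, feed the response back in.
qpr = QPRSketch(delays=[0.0, 0.1, 0.2, 0.4, 0.8, 1.6])
d = qpr.next_delay()
qpr.update(d, correct=True)
```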

  11. Application of Non-Kolmogorovian Probability and Quantum Adaptive Dynamics to Unconscious Inference in Visual Perception Process

    NASA Astrophysics Data System (ADS)

    Accardi, Luigi; Khrennikov, Andrei; Ohya, Masanori; Tanaka, Yoshiharu; Yamato, Ichiro

    2016-07-01

    Recently, a novel quantum information formalism, quantum adaptive dynamics, was developed and applied to modelling information processing by bio-systems, covering phenomena from molecular biology (glucose-lactose metabolism in E. coli bacteria, epigenetic evolution) to cognition and psychology. From the foundational point of view, quantum adaptive dynamics describes the mutual adaptation of the information states of two interacting systems (physical or biological), as well as the adaptation of co-observations performed by the systems. In this paper we apply this formalism to model unconscious inference: the process of transition from sensation to perception. The paper combines theory and experiment. Statistical data collected in an experimental study on recognition of a particular ambiguous figure, the Schröder stairs, support the viability of the quantum(-like) model of unconscious inference, including the modelling of biases generated by rotation contexts. From the probabilistic point of view, we study (for concrete experimental data) the problem of contextuality of probability, i.e. its dependence on experimental contexts. Mathematically, contextuality leads to non-Kolmogorovness: probability distributions generated by various rotation contexts cannot be treated in the Kolmogorovian framework. At the same time they can be embedded in a “big Kolmogorov space” as conditional probabilities. However, such a Kolmogorov space has too complex a structure, and the operational quantum formalism in the form of quantum adaptive dynamics simplifies the modelling substantially.

  12. An application of adaptive neuro-fuzzy inference system to landslide susceptibility mapping (Klang valley, Malaysia)

    NASA Astrophysics Data System (ADS)

    Sezer, Ebru; Pradhan, Biswajeet; Gokceoglu, Candan

    2010-05-01

    Landslides are one of the recurrent natural hazard problems throughout most of Malaysia. Recently, the Klang Valley area of Selangor state has faced numerous landslide and mudflow events, and much damage occurred in these areas. However, only little effort has been made to assess or predict these events, which resulted in serious damage. Through scientific analyses of these landslides, one can assess and predict landslide-susceptible areas, and even the events as such, and thus reduce landslide damage through proper preparation and/or mitigation. For this reason, the purpose of the present paper is to produce landslide susceptibility maps of a part of the Klang Valley area in Malaysia by employing the results of adaptive neuro-fuzzy inference system (ANFIS) analyses. Landslide locations in the study area were identified by interpreting aerial photographs and satellite images, supported by extensive field surveys. Landsat TM satellite imagery was used to map the vegetation index. Maps of topography, lineaments and NDVI were constructed from the spatial datasets. Seven landslide conditioning factors, namely altitude, slope angle, plan curvature, distance from drainage, soil type, distance from faults and NDVI, were extracted from the spatial database. These factors were analyzed using an ANFIS to construct the landslide susceptibility maps. During the model development work, a total of 5 landslide susceptibility models were obtained from the ANFIS results. For verification, the results of the analyses were then compared with the field-verified landslide locations. Additionally, the ROC curves for all landslide susceptibility models were drawn and the area under the curve (AUC) values were calculated. Landslide locations were used to validate the results of the landslide susceptibility map, and the verification results showed 98% accuracy for model 5, which employed all of the parameters produced in the present study as landslide conditioning factors. The validation results showed sufficient
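
    The AUC-based verification mentioned above is typically computed as in the sketch below, shown here on synthetic data with hypothetical variable names rather than the study's actual maps.

```python
import numpy as np
from sklearn.metrics import roc_curve, auc

# Hypothetical arrays: `susceptibility` holds ANFIS susceptibility scores for a
# set of map cells, and `observed` marks field-verified landslide cells (1)
# versus landslide-free cells (0).
rng = np.random.default_rng(42)
observed = rng.integers(0, 2, size=500)
susceptibility = np.clip(observed * 0.6 + rng.normal(0.3, 0.2, size=500), 0, 1)

fpr, tpr, thresholds = roc_curve(observed, susceptibility)
print("area under the ROC curve:", auc(fpr, tpr))
```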

  13. Prediction analysis and comparison between agriculture and mining stocks in Indonesia by using adaptive neuro-fuzzy inference system (ANFIS)

    NASA Astrophysics Data System (ADS)

    Mahandrio, Irsantyo; Budi, Andriantama; Liong, The Houw; Purqon, Acep

    2015-09-01

    The growing patterns in the agricultural and mining sectors are interesting, particularly in a developing country such as Indonesia. Here, we investigate the local characteristics of stocks in the agriculture and mining sectors, represented by two leading companies and two common companies in these sectors. We analyze the predictions using the Adaptive Neuro Fuzzy Inference System (ANFIS). The type of Fuzzy Inference System (FIS) is the Sugeno type with a Generalized Bell (Gbell) membership function. Our results show that ANFIS is a proper method for predicting the stock market, with RMSE of 0.14% for AALI and 0.093% for SGRO, representing the agriculture sector, and 0.073% for ANTM and 0.1107% for MDCO, representing the mining sector.

  14. The Development of Adaptive Decision Making: Recognition-Based Inference in Children and Adolescents

    ERIC Educational Resources Information Center

    Horn, Sebastian S.; Ruggeri, Azzurra; Pachur, Thorsten

    2016-01-01

    Judgments about objects in the world are often based on probabilistic information (or cues). A frugal judgment strategy that utilizes memory (i.e., the ability to discriminate between known and unknown objects) as a cue for inference is the recognition heuristic (RH). The usefulness of the RH depends on the structure of the environment,…

  15. Perspectives of Probabilistic Inferences: Reinforcement Learning and an Adaptive Network Compared

    ERIC Educational Resources Information Center

    Rieskamp, Jorg

    2006-01-01

    The assumption that people possess a strategy repertoire for inferences has been raised repeatedly. The strategy selection learning theory specifies how people select strategies from this repertoire. The theory assumes that individuals select strategies proportional to their subjective expectations of how well the strategies solve particular…

  16. Poisson-Based Inference for Perturbation Models in Adaptive Spelling Training

    ERIC Educational Resources Information Center

    Baschera, Gian-Marco; Gross, Markus

    2010-01-01

    We present an inference algorithm for perturbation models based on Poisson regression. The algorithm is designed to handle unclassified input with multiple errors described by independent mal-rules. This knowledge representation provides an intelligent tutoring system with local and global information about a student, such as error classification…
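
    Although the abstract is truncated, the underlying Poisson regression of error counts on attempt-level features can be sketched generically as below, using synthetic data and scikit-learn's PoissonRegressor as a stand-in for the authors' inference algorithm; all feature names are assumptions.

```python
import numpy as np
from sklearn.linear_model import PoissonRegressor

# Hypothetical data: each row describes one spelling attempt (e.g. word length,
# whether a particular mal-rule could apply), and y counts the observed errors.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 3))
y = rng.poisson(lam=np.exp(0.3 * X[:, 0] - 0.2 * X[:, 1] + 0.5))

model = PoissonRegressor(alpha=1e-3).fit(X, y)
print(model.coef_, model.intercept_)
print(model.predict(X[:5]))         # expected error counts for new attempts
```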

  17. Adaptive neuro-fuzzy inference system to improve the power quality of a split shaft microturbine power generation system

    NASA Astrophysics Data System (ADS)

    Oğuz, Yüksel; Üstün, Seydi Vakkas; Yabanova, İsmail; Yumurtaci, Mehmet; Güney, İrfan

    2012-01-01

    This article presents the design of an adaptive neuro-fuzzy inference system (ANFIS) for turbine speed control, with the purpose of improving the power quality of the power production system of a split shaft microturbine. To improve the operating performance of the microturbine power generation system (MTPGS) and to obtain the electrical output magnitudes in the desired quality and value (terminal voltage, operating frequency, power drawn by the consumer and production power), a controller based on an adaptive neuro-fuzzy inference system was designed. The MTPGS consists of the microturbine speed controller, a split shaft microturbine, a cylindrical pole synchronous generator, an excitation circuit and a voltage regulator. Modeling of the dynamic behavior of the synchronous generator driven by the split shaft turbine was carried out using Matlab/Simulink and its SimPowerSystems toolbox. The simulation results show that, with the microturbine speed control performed by ANFIS, when the MTPGS is operated under various loading situations the terminal voltage and frequency of the system settle to the desired operating values in a very short time without significant oscillation, and electrical production power of the desired quality can be obtained.

  18. Modeling and Simulation of An Adaptive Neuro-Fuzzy Inference System (ANFIS) for Mobile Learning

    ERIC Educational Resources Information Center

    Al-Hmouz, A.; Shen, Jun; Al-Hmouz, R.; Yan, Jun

    2012-01-01

    With recent advances in mobile learning (m-learning), it is becoming possible for learning activities to occur everywhere. The learner model presented in our earlier work was partitioned into smaller elements in the form of learner profiles, which collectively represent the entire learning process. This paper presents an Adaptive Neuro-Fuzzy…

  19. Adaptive thresholding for reliable topological inference in single subject fMRI analysis

    PubMed Central

    Gorgolewski, Krzysztof J.; Storkey, Amos J.; Bastin, Mark E.; Pernet, Cyril R.

    2012-01-01

    Single subject fMRI has proved to be a useful tool for mapping functional areas in clinical procedures such as tumor resection. Using fMRI data, clinicians assess the risk, plan and execute such procedures based on thresholded statistical maps. However, because current thresholding methods were developed mainly in the context of cognitive neuroscience group studies, most single subject fMRI maps are thresholded manually to satisfy specific criteria related to single subject analyses. Here, we propose a new adaptive thresholding method which combines Gamma-Gaussian mixture modeling with topological thresholding to improve cluster delineation. In a series of simulations we show that by adapting to the signal and noise properties, the new method performs well not only in terms of the total number of errors but also in terms of the trade-off between false negative and positive cluster error rates. Similarly, simulations show that adaptive thresholding performs better than fixed thresholding in terms of over- and underestimation of the true activation border (i.e., higher spatial accuracy). Finally, through simulations and a motor test–retest study on 10 volunteer subjects, we show that adaptive thresholding improves reliability, mainly by accounting for the global signal variance. This in turn increases the likelihood that the true activation pattern can be determined, offering an automatic yet flexible way to threshold single subject fMRI maps. PMID:22936908

  20. Evidence for Adaptation to the Tibetan Plateau Inferred from Tibetan Loach Transcriptomes.

    PubMed

    Wang, Ying; Yang, Liandong; Zhou, Kun; Zhang, Yanping; Song, Zhaobin; He, Shunping

    2015-10-09

    Triplophysa fishes are the primary component of the fish fauna on the Tibetan Plateau and are well adapted to the high-altitude environment. Despite the importance of Triplophysa fishes on the plateau, the genetic mechanisms of the adaptations of these fishes to this high-altitude environment remain poorly understood. In this study, we generated the transcriptome sequences for three Triplophysa fishes, that is, Triplophysa siluroides, Triplophysa scleroptera, and Triplophysa dalaica, and used these and the previously available transcriptome and genome sequences from fishes living at low altitudes to identify potential genetic mechanisms for the high-altitude adaptations in Triplophysa fishes. An analysis of 2,269 orthologous genes among cave fish (Astyanax mexicanus), zebrafish (Danio rerio), large-scale loach (Paramisgurnus dabryanus), and Triplophysa fishes revealed that each of the terminal branches of the Triplophysa fishes had a significantly higher ratio of nonsynonymous to synonymous substitutions than that of the branches of the fishes from low altitudes, which provided consistent evidence for genome-wide rapid evolution in the Triplophysa genus. Many of the GO (Gene Ontology) categories associated with energy metabolism and hypoxia response exhibited accelerated evolution in the Triplophysa fishes compared with the large-scale loach. The genes that exhibited signs of positive selection and rapid evolution in the Triplophysa fishes were also significantly enriched in energy metabolism and hypoxia response categories. Our analysis identified widespread Triplophysa-specific nonsynonymous mutations in the fast evolving genes and positively selected genes. Moreover, we detected significant evidence of positive selection in the HIF (hypoxia-inducible factor)-1A and HIF-2B genes in Triplophysa fishes and found that the Triplophysa-specific nonsynonymous mutations in the HIF-1A and HIF-2B genes were associated with functional changes. Overall, our study provides

  1. Fuzzy logic and adaptive neuro-fuzzy inference system for characterization of contaminant exposure through selected biomarkers in African catfish.

    PubMed

    Karami, Ali; Keiter, Steffen; Hollert, Henner; Courtenay, Simon C

    2013-03-01

    This study represents a first attempt at applying a fuzzy inference system (FIS) and an adaptive neuro-fuzzy inference system (ANFIS) to the field of aquatic biomonitoring for classification of the dosage and time of benzo[a]pyrene (BaP) injection through selected biomarkers in African catfish (Clarias gariepinus). Fish were injected either intramuscularly (i.m.) or intraperitoneally (i.p.) with BaP. Hepatic glutathione S-transferase (GST) activities, relative visceral fat weights (LSI), and four biliary fluorescent aromatic compounds (FACs) concentrations were used as the inputs in the modeling study. Contradictory rules in FIS and ANFIS models appeared after conversion of bioassay results into human language (rule-based system). A "data trimming" approach was proposed to eliminate the conflicts prior to fuzzification. However, the model produced was relevant only to relatively low exposures to BaP, especially through the i.m. route of exposure. Furthermore, sensitivity analysis was unable to raise the classification rate to an acceptable level. In conclusion, FIS and ANFIS models have limited applications in the field of fish biomarker studies.

  2. Simulation of SiGe:C HBTs using neural network and adaptive neuro-fuzzy inference system for RF applications

    NASA Astrophysics Data System (ADS)

    Karimi, Gholamreza; Banitalebi, Roza; Babaei Sedaghat, Sedigheh

    2013-07-01

    In this article, the small-signal equivalent circuit model of SiGe:C heterojunction bipolar transistors (HBTs) has been directly extracted from S-parameter data. Moreover, we present a new modelling approach using ANFIS (adaptive neuro-fuzzy inference system), which in general has a high degree of accuracy, simplicity and novelty (an independent approach). The measured and model-calculated data show excellent agreement, with less than 1.68 × 10⁻⁵% discrepancy in the frequency range above 300 GHz, over a wide range of bias points for ANFIS. The results show that the ANFIS model is better than an ANN (artificial neural network) for redeveloping the model and increasing the input parameters.

  3. Analysis prediction of Indonesian banks (BCA, BNI, MANDIRI) using adaptive neuro-fuzzy inference system (ANFIS) and investment strategies

    NASA Astrophysics Data System (ADS)

    Trianto, Andriantama Budi; Hadi, I. M.; Liong, The Houw; Purqon, Acep

    2015-09-01

    Indonesia's economic development is growing well. This has an effect on investment in banks and the stock market. In this study, we perform prediction for three blue-chip Indonesian banks, i.e., BCA, BNI, and MANDIRI, by using the Adaptive Neuro-Fuzzy Inference System (ANFIS) method with Takagi-Sugeno rules and the Generalized Bell (Gbell) membership function. Our results show that ANFIS performs good prediction, with RMSE of 27 for BCA, 5.29 for BNI, and 13.41 for MANDIRI. Furthermore, we develop an active strategy to gain more benefit and compare the passive strategy with the active strategy. Our results show that the passive strategy gains 13 million rupiah, while the active strategy gains 47 million rupiah in one year. The active investment strategy thus yields a significantly larger benefit than the passive one.

  4. Adaptive Neuro-Fuzzy Inference System for Classification of Background EEG Signals from ESES Patients and Controls

    PubMed Central

    Yang, Zhixian; Wang, Yinghua; Ouyang, Gaoxiang

    2014-01-01

    Background electroencephalography (EEG), recorded with scalp electrodes, in children with electrical status epilepticus during slow-wave sleep (ESES) syndrome and control subjects has been analyzed. We considered 10 ESES patients, all right-handed and aged 3–9 years. The 10 control individuals had the same characteristics as the ESES patients but presented a normal EEG. Recordings were undertaken in the awake and relaxed state with eyes open. The complexity of the background EEG was evaluated using permutation entropy (PE) and sample entropy (SampEn) in combination with an ANOVA test. The entropy measures of the EEG are significantly different between the ESES patients and the normal control subjects. Then, a classification framework based on entropy measures and an adaptive neuro-fuzzy inference system (ANFIS) classifier is proposed to distinguish ESES and normal EEG signals. The results are promising and a classification accuracy of about 89% is achieved. PMID:24790547
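
    The permutation entropy feature used above can be computed as in the sketch below (Bandt-Pompe ordinal patterns); the sample entropy measure and the ANFIS classifier are not shown, and the default order and delay are illustrative.

```python
import numpy as np
from itertools import permutations
from math import factorial, log

def permutation_entropy(x, order=3, delay=1, normalize=True):
    """Permutation entropy of a 1-D signal based on ordinal patterns."""
    x = np.asarray(x, dtype=float)
    n_patterns = len(x) - (order - 1) * delay
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(n_patterns):
        window = x[i:i + order * delay:delay]
        counts[tuple(np.argsort(window))] += 1       # ordinal pattern of the window
    freqs = np.array([c for c in counts.values() if c > 0], dtype=float)
    probs = freqs / freqs.sum()
    pe = -np.sum(probs * np.log(probs))
    return pe / log(factorial(order)) if normalize else pe
```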

  5. An adaptive sparse-grid high-order stochastic collocation method for Bayesian inference in groundwater reactive transport modeling

    SciTech Connect

    Zhang, Guannan; Webster, Clayton G; Gunzburger, Max D

    2012-09-01

    Although Bayesian analysis has become vital to the quantification of prediction uncertainty in groundwater modeling, its application has been hindered due to the computational cost associated with numerous model executions needed for exploring the posterior probability density function (PPDF) of model parameters. This is particularly the case when the PPDF is estimated using Markov Chain Monte Carlo (MCMC) sampling. In this study, we develop a new approach that improves computational efficiency of Bayesian inference by constructing a surrogate system based on an adaptive sparse-grid high-order stochastic collocation (aSG-hSC) method. Unlike previous works using first-order hierarchical basis, we utilize a compactly supported higher-order hierarchical basis to construct the surrogate system, resulting in a significant reduction in the number of computational simulations required. In addition, we use hierarchical surplus as an error indicator to determine adaptive sparse grids. This allows local refinement in the uncertain domain and/or anisotropic detection with respect to the random model parameters, which further improves computational efficiency. Finally, we incorporate a global optimization technique and propose an iterative algorithm for building the surrogate system for the PPDF with multiple significant modes. Once the surrogate system is determined, the PPDF can be evaluated by sampling the surrogate system directly with very little computational cost. The developed method is evaluated first using a simple analytical density function with multiple modes and then using two synthetic groundwater reactive transport models. The groundwater models represent different levels of complexity; the first example involves coupled linear reactions and the second example simulates nonlinear uranium surface complexation. The results show that the aSG-hSC is an effective and efficient tool for Bayesian inference in groundwater modeling in comparison with conventional

  6. An adaptive Bayesian inference algorithm to estimate the parameters of a hazardous atmospheric release

    NASA Astrophysics Data System (ADS)

    Rajaona, Harizo; Septier, François; Armand, Patrick; Delignon, Yves; Olry, Christophe; Albergel, Armand; Moussafir, Jacques

    2015-12-01

    In the eventuality of an accidental or intentional atmospheric release, the reconstruction of the source term using measurements from a set of sensors is an important and challenging inverse problem. A rapid and accurate estimation of the source allows faster and more efficient action for first-response teams, in addition to providing better damage assessment. This paper presents a Bayesian probabilistic approach to estimate the location and the temporal emission profile of a pointwise source. The release rate is evaluated analytically by using a Gaussian assumption on its prior distribution, and is enhanced with a positivity constraint to improve the estimation. The source location is obtained by means of an advanced iterative Monte-Carlo technique called Adaptive Multiple Importance Sampling (AMIS), which uses a recycling process at each iteration to accelerate its convergence. The proposed methodology is tested using synthetic and real concentration data in the framework of the Fusion Field Trials 2007 (FFT-07) experiment. The quality of the obtained results is comparable to those coming from the Markov Chain Monte Carlo (MCMC) algorithm, a popular Bayesian method used for source estimation. Moreover, the adaptive processing of the AMIS provides a better sampling efficiency by reusing all the generated samples.

  7. Simultaneous learning and filtering without delusions: a Bayes-optimal combination of Predictive Inference and Adaptive Filtering.

    PubMed

    Kneissler, Jan; Drugowitsch, Jan; Friston, Karl; Butz, Martin V

    2015-01-01

    Predictive coding appears to be one of the fundamental working principles of brain processing. Amongst other aspects, brains often predict the sensory consequences of their own actions. Predictive coding resembles Kalman filtering, where incoming sensory information is filtered to produce prediction errors for subsequent adaptation and learning. However, to generate prediction errors given motor commands, a suitable temporal forward model is required to generate predictions. While in engineering applications, it is usually assumed that this forward model is known, the brain has to learn it. When filtering sensory input and learning from the residual signal in parallel, a fundamental problem arises: the system can enter a delusional loop when filtering the sensory information using an overly trusted forward model. In this case, learning stalls before accurate convergence because uncertainty about the forward model is not properly accommodated. We present a Bayes-optimal solution to this generic and pernicious problem for the case of linear forward models, which we call Predictive Inference and Adaptive Filtering (PIAF). PIAF filters incoming sensory information and learns the forward model simultaneously. We show that PIAF is formally related to Kalman filtering and to the Recursive Least Squares linear approximation method, but combines these procedures in a Bayes optimal fashion. Numerical evaluations confirm that the delusional loop is precluded and that the learning of the forward model is more than 10-times faster when compared to a naive combination of Kalman filtering and Recursive Least Squares.

  8. Motion adaptive vertical handoff in cellular/WLAN heterogeneous wireless network.

    PubMed

    Li, Limin; Ma, Lin; Xu, Yubin; Fu, Yunhai

    2014-01-01

    In heterogeneous wireless networks, vertical handoff plays an important role in guaranteeing quality of service and the overall performance of the network. Conventional vertical handoff trigger schemes are mostly developed from horizontal handoff in homogeneous cellular networks. Basically, they can be summarized as hysteresis-based and dwelling-timer-based algorithms, which are reliable in avoiding unnecessary handoffs caused by terminals dwelling at the edge of WLAN coverage. However, the coverage of a WLAN is much smaller than that of a cellular network, while the motion types of terminals can be various in a typical outdoor scenario. As a result, traditional algorithms are less effective in avoiding unnecessary handoffs triggered by vehicle-borne terminals with various speeds. Besides that, hysteresis and dwelling-timer thresholds usually need to be modified to suit different channel environments. To solve this problem, a vertical handoff algorithm based on Q-learning is proposed in this paper. Q-learning provides the decider with a self-adaptive ability for handling the terminals' handoff requests with different motion types and channel conditions. Meanwhile, a Neural Fuzzy Inference System (NFIS) is embedded to retain a continuous perception of the state space. Simulation results verify that the proposed algorithm can achieve a lower unnecessary handoff probability compared with the other two conventional algorithms.
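
    The tabular Q-learning core of such a handoff decider can be sketched as below; the state encoding, reward and hyperparameters are illustrative assumptions, and the NFIS used in the paper for continuous state perception is not shown.

```python
import numpy as np

class QLearningHandoff:
    """Tabular Q-learning sketch for a vertical handoff decider: states index
    discretized (signal level, terminal speed) pairs, and the two actions are
    'stay' and 'hand off'. The encoding and rewards are illustrative."""
    def __init__(self, n_states, n_actions=2, alpha=0.1, gamma=0.9, eps=0.1):
        self.Q = np.zeros((n_states, n_actions))
        self.alpha, self.gamma, self.eps = alpha, gamma, eps
        self.rng = np.random.default_rng(0)

    def act(self, state):
        if self.rng.uniform() < self.eps:                 # explore
            return int(self.rng.integers(self.Q.shape[1]))
        return int(np.argmax(self.Q[state]))              # exploit

    def learn(self, state, action, reward, next_state):
        td_target = reward + self.gamma * self.Q[next_state].max()
        self.Q[state, action] += self.alpha * (td_target - self.Q[state, action])
```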

  9. Motion Adaptive Vertical Handoff in Cellular/WLAN Heterogeneous Wireless Network

    PubMed Central

    Ma, Lin; Xu, Yubin; Fu, Yunhai

    2014-01-01

    In heterogeneous wireless networks, vertical handoff plays an important role in guaranteeing quality of service and the overall performance of the network. Conventional vertical handoff trigger schemes are mostly developed from horizontal handoff in homogeneous cellular networks. Basically, they can be summarized as hysteresis-based and dwelling-timer-based algorithms, which are reliable in avoiding unnecessary handoffs caused by terminals dwelling at the edge of WLAN coverage. However, the coverage of a WLAN is much smaller than that of a cellular network, while the motion types of terminals can be various in a typical outdoor scenario. As a result, traditional algorithms are less effective in avoiding unnecessary handoffs triggered by vehicle-borne terminals with various speeds. Besides that, hysteresis and dwelling-timer thresholds usually need to be modified to suit different channel environments. To solve this problem, a vertical handoff algorithm based on Q-learning is proposed in this paper. Q-learning provides the decider with a self-adaptive ability for handling the terminals' handoff requests with different motion types and channel conditions. Meanwhile, a Neural Fuzzy Inference System (NFIS) is embedded to retain a continuous perception of the state space. Simulation results verify that the proposed algorithm can achieve a lower unnecessary handoff probability compared with the other two conventional algorithms. PMID:24741347

  10. Adaptive evolution in the Arabidopsis MADS-box gene family inferred from its complete resolved phylogeny

    PubMed Central

    Martínez-Castilla, León Patricio; Alvarez-Buylla, Elena R.

    2003-01-01

    Gene duplication is a substrate of evolution. However, the relative importance of positive selection versus relaxation of constraints in the functional divergence of gene copies is still under debate. Plant MADS-box genes encode transcriptional regulators key in various aspects of development and have undergone extensive duplications to form a large family. We recovered 104 MADS sequences from the Arabidopsis genome. Bayesian phylogenetic trees recover type II lineage as a monophyletic group and resolve a branching sequence of monophyletic groups within this lineage. The type I lineage is comprised of several divergent groups. However, contrasting gene structure and patterns of chromosomal distribution between type I and II sequences suggest that they had different evolutionary histories and support the placement of the root of the gene family between these two groups. Site-specific and site-branch analyses of positive Darwinian selection (PDS) suggest that different selection regimes could have affected the evolution of these lineages. We found evidence for PDS along the branch leading to flowering time genes that have a direct impact on plant fitness. Sites with high probabilities of having been under PDS were found in the MADS and K domains, suggesting that these played important roles in the acquisition of novel functions during MADS-box diversification. Detected sites are targets for further experimental analyses. We argue that adaptive changes in MADS-domain protein sequences have been important for their functional divergence, suggesting that changes within coding regions of transcriptional regulators have influenced phenotypic evolution of plants. PMID:14597714

  11. Diverse Aquatic Adaptations in Nothosaurus spp. (Sauropterygia)—Inferences from Humeral Histology and Microanatomy

    PubMed Central

    Klein, Nicole; Sander, P. Martin; Krahl, Anna; Scheyer, Torsten M.; Houssaye, Alexandra

    2016-01-01

    , and possibly sexual dimorphism. Humeral microanatomy documents the diversification of nothosaur species into different environments to avoid intraclade competition as well as competition with other marine reptiles. Nothosaur microanatomy indicates that knowledge of processes involved in secondary aquatic adaptation and their interaction are more complex than previously believed. PMID:27391607

  12. Natural Selection, Adaptive Topographies and the Problem of Statistical Inference: The Moraba scurra Controversy Under the Microscope.

    PubMed

    Grodwohl, Jean-Baptiste

    2016-08-01

    This paper gives a detailed narrative of a controversial empirical research in postwar population genetics, the analysis of the cytological polymorphisms of an Australian grasshopper, Moraba scurra. This research intertwined key technical developments in three research areas during the 1950s and 1960s: it involved Dobzhansky's empirical research program on cytological polymorphisms, the mathematical theory of natural selection in two-locus systems, and the building of reliable estimates of natural selection in the wild. In the mid-1950s the cytologist Michael White discovered an interesting case of epistasis in populations of Moraba scurra. These observations received a wide diffusion when theoretical population geneticist Richard Lewontin represented White's data on adaptive topographies. These topographies connected the information on the genetic structure of these grasshopper populations with the formal framework of theoretical population genetics. As such, they appeared at the time as the most successful application of two-locus models of natural selection to an empirical study system. However, this connection generated paradoxical results: in the landscapes, all grasshopper populations were located on a ridge (an unstable equilibrium) while they were expected to reach a peak. This puzzling result fueled years of research and triggered a controversy attracting contributors from Australia, the United States and the United Kingdom. While the original problem seemed, at first, purely empirical, the subsequent controversy affected the main mathematical tools used in the study of two-gene systems under natural selection. Adaptive topographies and their underlying mathematical structure, Wright's mean fitness equations, were submitted to close scrutiny. Suspicion eventually shifted to the statistical machinery used in data analysis, reflecting the crucial role of statistical inference in applied population genetics. In the 1950s and 1960s, population geneticists were

  13. Parametric 3D Atmospheric Reconstruction in Highly Variable Terrain with Recycled Monte Carlo Paths and an Adapted Bayesian Inference Engine

    NASA Technical Reports Server (NTRS)

    Langmore, Ian; Davis, Anthony B.; Bal, Guillaume; Marzouk, Youssef M.

    2012-01-01

    We describe a method for accelerating a 3D Monte Carlo forward radiative transfer model to the point where it can be used in a new kind of Bayesian retrieval framework. The remote sensing challenge is to detect and quantify a chemical effluent of a known absorbing gas produced by an industrial facility in a deep valley. The available data is a single low-resolution noisy image of the scene in the near IR at an absorbing wavelength for the gas of interest. The detected sunlight has been multiply reflected by the variable terrain and/or scattered by an aerosol that is assumed partially known and partially unknown. We thus introduce a new class of remote sensing algorithms best described as "multi-pixel" techniques that necessarily call for a 3D radiative transfer model (but demonstrated here in 2D); they can be added to conventional ones that exploit typically multi- or hyper-spectral data, sometimes with multi-angle capability, with or without information about polarization. The novel Bayesian inference methodology uses adaptively, with efficiency in mind, the fact that a Monte Carlo forward model has a known and controllable uncertainty depending on the number of sun-to-detector paths used.

  14. Adaptive Neuro-Fuzzy Inference System Applied QSAR with Quantum Chemical Descriptors for Predicting Radical Scavenging Activities of Carotenoids.

    PubMed

    Jhin, Changho; Hwang, Keum Taek

    2015-01-01

    One of the physiological characteristics of carotenoids is their radical scavenging activity. In this study, the relationship between radical scavenging activities and quantum chemical descriptors of carotenoids was determined. Adaptive neuro-fuzzy inference system (ANFIS) applied quantitative structure-activity relationship models (QSAR) were also developed for predicting and comparing radical scavenging activities of carotenoids. Semi-empirical PM6 and PM7 quantum chemical calculations were done by MOPAC. Ionisation energies of neutral and monovalent cationic carotenoids and the product of chemical potentials of neutral and monovalent cationic carotenoids were significantly correlated with the radical scavenging activities, and consequently these descriptors were used as independent variables for the QSAR study. The ANFIS applied QSAR models were developed with two triangular-shaped input membership functions made for each of the independent variables and optimised by a backpropagation method. High prediction efficiencies were achieved by the ANFIS applied QSAR. The R-square values of the developed QSAR models with the variables calculated by PM6 and PM7 methods were 0.921 and 0.902, respectively. The results of this study demonstrated reliabilities of the selected quantum chemical descriptors and the significance of QSAR models.

  15. Modeling Pb (II) adsorption from aqueous solution by ostrich bone ash using adaptive neural-based fuzzy inference system.

    PubMed

    Amiri, Mohammad J; Abedi-Koupai, Jahangir; Eslamian, Sayed S; Mousavi, Sayed F; Hasheminejad, Hasti

    2013-01-01

    To evaluate the performance of an Adaptive Neural-Based Fuzzy Inference System (ANFIS) model in estimating the efficiency of Pb (II) ion removal from aqueous solution by ostrich bone ash, a batch experiment was conducted. Five operational parameters, including adsorbent dosage (Cs), initial concentration of Pb (II) ions (Co), initial pH, temperature (T) and contact time (t), were taken as the input data and the adsorption efficiency (AE) of bone ash as the output. Based on the 31 different structures, 5 ANFIS models were tested against the measured adsorption efficiency to assess the accuracy of each model. The results showed that ANFIS5, which used all input parameters, was the most accurate (RMSE = 2.65 and R2 = 0.95) and ANFIS1, which used only the contact time input, was the worst (RMSE = 14.56 and R2 = 0.46). In ranking the models, ANFIS4, ANFIS3 and ANFIS2 ranked second, third and fourth, respectively. The sensitivity analysis revealed that the estimated AE is most sensitive to the contact time, followed by pH, initial concentration of Pb (II) ions, adsorbent dosage, and temperature. The results showed that all ANFIS models overestimated the AE. In general, this study confirmed the capabilities of the ANFIS model as an effective tool for estimation of AE.

  16. Prediction of matching condition for a microstrip subsystem using artificial neural network and adaptive neuro-fuzzy inference system

    NASA Astrophysics Data System (ADS)

    Salehi, Mohammad Reza; Noori, Leila; Abiri, Ebrahim

    2016-11-01

    In this paper, a subsystem consisting of a microstrip bandpass filter and a microstrip low noise amplifier (LNA) is designed for WLAN applications. The proposed filter has a small implementation area (49 mm²), small insertion loss (0.08 dB) and wide fractional bandwidth (FBW) (61%). To design the proposed LNA, compact microstrip cells, a field effect transistor, and only a lumped capacitor are used. It has a low supply voltage and a low return loss (-40 dB) at the operating frequency. The matching condition of the proposed subsystem is predicted using subsystem analysis, an artificial neural network (ANN) and an adaptive neuro-fuzzy inference system (ANFIS). To design the proposed filter, the transmission matrix of the proposed resonator is obtained and analysed. The performance of the proposed ANN and ANFIS models is tested against the numerical data using four performance measures, namely the correlation coefficient (CC), the mean absolute error (MAE), the average percentage error (APE) and the root mean square error (RMSE). The obtained results show that these models are in good agreement with the numerical data, and a small error between the predicted values and the numerical solution is obtained.
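
    The four performance measures named above can be computed as in the sketch below; the APE definition used here (mean absolute percentage error) is one common convention and may differ from the authors' exact formula.

```python
import numpy as np

def performance_measures(y_true, y_pred):
    """CC, MAE, APE and RMSE between model predictions and reference data.
    APE assumes y_true contains no zeros."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_pred - y_true
    cc = np.corrcoef(y_true, y_pred)[0, 1]            # correlation coefficient
    mae = np.mean(np.abs(err))                        # mean absolute error
    ape = 100.0 * np.mean(np.abs(err / y_true))       # average percentage error
    rmse = np.sqrt(np.mean(err ** 2))                 # root mean square error
    return {"CC": cc, "MAE": mae, "APE": ape, "RMSE": rmse}
```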

  17. Application of adaptive neuro-fuzzy inference system and cuckoo optimization algorithm for analyzing electro chemical machining process

    NASA Astrophysics Data System (ADS)

    Teimouri, Reza; Sohrabpoor, Hamed

    2013-12-01

    The electrochemical machining (ECM) process is increasing in importance due to some of the specific advantages that can be exploited during the machining operation. The process offers several special privileges such as a higher machining rate, better accuracy and control, and a wider range of materials that can be machined. The contribution of many predominant parameters in the process makes its prediction and the selection of optimal values complex, especially when the process is programmed for machining hard materials. In the present work, in order to investigate the effects of electrolyte concentration, electrolyte flow rate, applied voltage and feed rate on material removal rate (MRR) and surface roughness (SR), adaptive neuro-fuzzy inference systems (ANFIS) have been used to create predictive models based on experimental observations. Then the ANFIS 3D surfaces have been plotted to analyze the effects of the process parameters on MRR and SR. Finally, the cuckoo optimization algorithm (COA) was used to select solutions in which the process reaches maximum material removal rate and minimum surface roughness simultaneously. Results indicated that the ANFIS technique has superiority in modeling MRR and SR with high prediction accuracy. Also, results obtained by applying the COA have been compared with those derived from confirmatory experiments, which validates the applicability and suitability of the proposed techniques in enhancing the performance of the ECM process.

  18. Prediction of Tensile Strength of Friction Stir Weld Joints with Adaptive Neuro-Fuzzy Inference System (ANFIS) and Neural Network

    NASA Technical Reports Server (NTRS)

    Dewan, Mohammad W.; Huggett, Daniel J.; Liao, T. Warren; Wahab, Muhammad A.; Okeil, Ayman M.

    2015-01-01

    Friction-stir-welding (FSW) is a solid-state joining process where joint properties are dependent on welding process parameters. In the current study, three critical process parameters, namely spindle speed, plunge force, and welding speed, are considered key factors in the determination of the ultimate tensile strength (UTS) of welded aluminum alloy joints. A total of 73 weld schedules were welded and tensile properties were subsequently obtained experimentally. It is observed that all three process parameters have a direct influence on the UTS of the welded joints. Utilizing the experimental data, an optimized adaptive neuro-fuzzy inference system (ANFIS) model has been developed to predict the UTS of FSW joints. A total of 1200 models were developed on a MATLAB platform by varying the number of membership functions (MFs), the type of MFs, and the combination of four input variables (the three process parameters and EFI, an empirical force index derived from them). For comparison, optimized artificial neural network (ANN) models were also developed to predict UTS from the FSW process parameters. By comparing the ANFIS and ANN predicted results, it was found that the optimized ANFIS models provide better results than ANN. This newly developed best ANFIS model could be utilized for prediction of the UTS of FSW joints.
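
    A short sketch of how such a model grid can be enumerated is given below; the membership-function counts, MF types, and input subsets are illustrative assumptions, and build_and_score is a hypothetical stand-in for training and validating one ANFIS (or ANN) candidate.

      from itertools import combinations, product

      inputs = ["spindle_speed", "plunge_force", "welding_speed", "EFI"]   # candidate predictors
      n_mfs_options = [2, 3, 4]                     # membership functions per input (assumed)
      mf_types = ["trimf", "gaussmf", "gbellmf"]    # membership-function shapes (assumed)

      # Every non-empty combination of the four input variables.
      input_subsets = [c for r in range(1, len(inputs) + 1) for c in combinations(inputs, r)]

      configs = list(product(input_subsets, n_mfs_options, mf_types))
      print(f"{len(configs)} candidate configurations")

      def build_and_score(subset, n_mfs, mf_type):
          """Hypothetical placeholder: train one candidate model, return a validation error."""
          raise NotImplementedError

      # for subset, n_mfs, mf_type in configs:
      #     error = build_and_score(subset, n_mfs, mf_type)   # keep the best-scoring model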

  19. Adaptive Neuro-Fuzzy Inference System Applied QSAR with Quantum Chemical Descriptors for Predicting Radical Scavenging Activities of Carotenoids

    PubMed Central

    Jhin, Changho; Hwang, Keum Taek

    2015-01-01

    One of the physiological characteristics of carotenoids is their radical scavenging activity. In this study, the relationship between the radical scavenging activities and the quantum chemical descriptors of carotenoids was determined. Adaptive neuro-fuzzy inference system (ANFIS) applied quantitative structure-activity relationship (QSAR) models were also developed for predicting and comparing the radical scavenging activities of carotenoids. Semi-empirical PM6 and PM7 quantum chemical calculations were performed with MOPAC. The ionisation energies of neutral and monovalent cationic carotenoids and the product of the chemical potentials of neutral and monovalent cationic carotenoids were significantly correlated with the radical scavenging activities, and consequently these descriptors were used as independent variables for the QSAR study. The ANFIS applied QSAR models were developed with two triangular-shaped input membership functions for each of the independent variables and optimised by a backpropagation method. High prediction efficiencies were achieved by the ANFIS applied QSAR. The R-square values of the developed QSAR models with the variables calculated by the PM6 and PM7 methods were 0.921 and 0.902, respectively. The results of this study demonstrate the reliability of the selected quantum chemical descriptors and the significance of the QSAR models. PMID:26474167
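
    For reference, a triangular membership function of the kind used for the ANFIS inputs can be written as a small Python function; the breakpoints below are arbitrary illustrative values, not the study's fitted parameters.

      import numpy as np

      def trimf(x, a, b, c):
          """Triangular membership function with feet at a and c and peak at b."""
          x = np.asarray(x, dtype=float)
          left = (x - a) / (b - a) if b != a else np.ones_like(x)
          right = (c - x) / (c - b) if c != b else np.ones_like(x)
          return np.clip(np.minimum(left, right), 0.0, 1.0)

      # Two overlapping membership functions ("low" and "high") over a normalised descriptor.
      x = np.linspace(0.0, 1.0, 11)
      print(trimf(x, 0.0, 0.0, 1.0))   # degree of membership in "low"
      print(trimf(x, 0.0, 1.0, 1.0))   # degree of membership in "high"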

  20. Neuro-fuzzy controller of low head hydropower plants using adaptive-network based fuzzy inference system

    SciTech Connect

    Djukanovic, M.B.; Calovic, M.S.; Vesovic, B.V.; Sobajic, D.J.

    1997-12-01

    This paper presents an attempt at nonlinear, multivariable control of low-head hydropower plants by using an adaptive-network based fuzzy inference system (ANFIS). The new design technique enhances fuzzy controllers with a self-learning capability for achieving prescribed control objectives in a near-optimal manner. The controller has the flexibility to accept more sensory information, with the main goal of improving the generator unit transients by adjusting the exciter input, the wicket gate and the runner blade positions. The developed ANFIS controller, whose control signals are adjusted by using incomplete on-line measurements, can offer better damping of generator oscillations over a wide range of operating conditions than conventional controllers. Digital simulations of a hydropower plant equipped with a low-head Kaplan turbine are performed, and comparisons of conventional excitation-governor control, state-feedback optimal control and ANFIS-based output feedback control are presented. To demonstrate the effectiveness of the proposed control scheme and the robustness of the acquired neuro-fuzzy controller, the controller has been implemented on a complex high-order non-linear hydrogenerator model.

  1. Prediction of Radical Scavenging Activities of Anthocyanins Applying Adaptive Neuro-Fuzzy Inference System (ANFIS) with Quantum Chemical Descriptors

    PubMed Central

    Jhin, Changho; Hwang, Keum Taek

    2014-01-01

    Radical scavenging activity of anthocyanins is well known, but only a few studies have been conducted using a quantum chemical approach. The adaptive neuro-fuzzy inference system (ANFIS) is an effective technique for solving problems with uncertainty. The purpose of this study was to construct and evaluate quantitative structure-activity relationship (QSAR) models for predicting the radical scavenging activities of anthocyanins with good prediction efficiency. ANFIS-applied QSAR models were developed by using quantum chemical descriptors of anthocyanins calculated by semi-empirical PM6 and PM7 methods. The electron affinity (A) and electronegativity (χ) of the flavylium cation, and the ionization potential (I) of the quinoidal base, were significantly correlated with the radical scavenging activities of the anthocyanins. These descriptors were used as independent variables for the QSAR models. ANFIS models with two triangular-shaped input fuzzy functions for each independent variable were constructed and optimized by 100 learning epochs. The constructed models using descriptors calculated by the PM6 and PM7 methods both had good prediction efficiency, with Q-square of 0.82 and 0.86, respectively. PMID:25153627

  2. Seasonal rainfall forecasting by adaptive network-based fuzzy inference system (ANFIS) using large scale climate signals

    NASA Astrophysics Data System (ADS)

    Mekanik, F.; Imteaz, M. A.; Talei, A.

    2016-05-01

    Accurate seasonal rainfall forecasting is an important step in the development of reliable runoff forecast models. The large-scale climate modes affecting rainfall in Australia have recently been proven useful in rainfall prediction problems. In this study, adaptive network-based fuzzy inference system (ANFIS) models are developed for the first time for southeast Australia in order to forecast spring rainfall. The models are applied in east, central and west Victoria as case studies. Large-scale climate signals comprising the El Nino Southern Oscillation (ENSO), the Indian Ocean Dipole (IOD) and the Inter-decadal Pacific Oscillation (IPO) are selected as rainfall predictors. Eight models are developed based on single climate modes (ENSO, IOD, and IPO) and combined climate modes (ENSO-IPO and ENSO-IOD). Root Mean Square Error (RMSE), Mean Absolute Error (MAE), the Pearson correlation coefficient (r) and the root mean square error in probability (RMSEP) skill score are used to evaluate the performance of the proposed models. The predictions demonstrate that ANFIS models based on the individual IOD index perform better in terms of RMSE, MAE and r than models based on individual ENSO indices. It is further found that IPO is not an effective predictor for the region and that the combined ENSO-IOD and ENSO-IPO predictors did not improve the predictions. In order to evaluate the effectiveness of the proposed models, a comparison is conducted between the ANFIS models and the conventional Artificial Neural Network (ANN), the Predictive Ocean Atmosphere Model for Australia (POAMA) and climatology forecasts. POAMA is the official dynamic model used by the Australian Bureau of Meteorology. The ANFIS predictions show superior performance for most of the region compared to the ANN and climatology forecasts. POAMA performs better with regard to RMSE and MAE in east and part of central Victoria; however, compared to ANFIS it shows weaker results in west Victoria in terms of prediction errors and the RMSEP skill score.

  3. The new physician as unwitting quantum mechanic: is adapting Dirac's inference system best practice for personalized medicine, genomics, and proteomics?

    PubMed

    Robson, Barry

    2007-08-01

    What is the Best Practice for automated inference in Medical Decision Support for personalized medicine? A known system already exists as Dirac's inference system from quantum mechanics (QM), using bra-kets and bras where A and B are states, events, or measurements representing, say, clinical and biomedical rules. Dirac's system should theoretically be the universal best practice for all inference, though QM is notorious for sometimes leading to bizarre conclusions that appear not to be applicable to the macroscopic world of everyday human experience and medical practice. It is argued here that this apparent difficulty vanishes if QM is assigned one new multiplication function @, which conserves conditionality appropriately, making QM applicable to classical inference including a quantitative form of the predicate calculus. An alternative interpretation with the same consequences is that every i = √(-1) in Dirac's QM is replaced by h, an entity distinct from 1 and i and arguably a hidden root of 1 such that h^2 = 1. With that exception, this paper is thus primarily a review of the application of Dirac's system, by application of linear algebra in the complex domain, to help manipulate information about associations and ontology in complicated data. Any combined bra-ket can be shown to be composed only of the sum of QM-like bra and ket weights c(), times an exponential function of Fano's mutual information measure I(A; B) about the association between A and B, that is, an association rule from data mining. With the weights and the Fano measure re-expressed as expectations on finite data using Riemann's Incomplete (i.e., Generalized) Zeta Functions, actual counts of observations for real-world sparse data can be readily utilized. Finally, the paper compares identical character, distinguishability of states, events or measurements, correlation, mutual information, and orthogonal character, important issues in data mining.
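
    A minimal sketch of the Fano mutual information measure I(A; B) mentioned above, estimated from simple co-occurrence counts, is shown below. The toy contingency counts are invented for illustration, and the estimate deliberately ignores the zeta-function corrections for sparse data discussed in the abstract.

      import math

      def fano_mutual_information(n_ab, n_a, n_b, n_total):
          """Pointwise (Fano) mutual information I(A;B) = log( P(A,B) / (P(A) * P(B)) )."""
          p_ab = n_ab / n_total
          p_a = n_a / n_total
          p_b = n_b / n_total
          return math.log(p_ab / (p_a * p_b))

      # Toy counts: A and B co-occur in 30 of 200 records, A occurs in 50 records, B in 80.
      print(fano_mutual_information(n_ab=30, n_a=50, n_b=80, n_total=200))  # > 0 means positive association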

  4. Realities of weather extremes on daily life in urban India - How quantified impacts infer sensible adaptation options

    NASA Astrophysics Data System (ADS)

    Reckien, D.

    2012-12-01

    Emerging and developing economies are currently undergoing one of the profoundest socio-spatial transitions in their history, with strong urbanization and weather extremes bringing about changes in the economy, forms of living and living conditions, but also increasing risks and altered social divides. The impacts of heat waves and strong rain events are therefore perceived differently among urban residents. Addressing the social differences in climate change impacts and expanding targeted adaptation options have emerged as urgent policy priorities, particularly for developing and emerging economies. This paper discusses the perceived impacts of weather-related extreme events on different social groups in New Delhi and Hyderabad, India. Using network statistics and scenario analysis on Fuzzy Cognitive Maps (FCMs) as part of a vulnerability analysis, the investigation provides quantitative and qualitative measures to compare impacts and adaptation strategies for different social groups. Impacts of rain events are stronger than those of heat in both cities and affect the lower income classes in particular. Interestingly, the scenario analysis (comparing altered networks in which the alteration represents a possible adaptation measure) shows that investments in the water infrastructure would be most meaningful and more effective than investments in, e.g., the traffic infrastructure, despite the stronger burden from traffic disruptions and the resulting concentration of planning and policy on traffic ease and investments. The method of Fuzzy Cognitive Mapping offers a link between perception and modeling, and the possibility to aggregate and analyze the views of a large number of stakeholders. Our research has shown that planners and politicians often know about many of the problems, but are often overwhelmed by the problems in their respective cities and look for a prioritization of adaptation options. FCM meets this need and identifies priority adaptation options.

  5. A Novel Technique for Maximum Power Point Tracking of a Photovoltaic Based on Sensing of Array Current Using Adaptive Neuro-Fuzzy Inference System (ANFIS)

    NASA Astrophysics Data System (ADS)

    El-Zoghby, Helmy M.; Bendary, Ahmed F.

    2016-10-01

    Maximum Power Point Tracking (MPPT) is now a widely used method for increasing photovoltaic (PV) efficiency. Conventional MPPT methods have many problems concerning accuracy, flexibility and efficiency. The MPP depends on the PV temperature and solar irradiation, which vary randomly. In this paper an artificial-intelligence-based controller is presented through the implementation of an Adaptive Neuro-Fuzzy Inference System (ANFIS) to obtain maximum power from the PV array. The ANFIS inputs are the temperature and cell current, and the output is the optimal voltage at maximum power. During operation the trained ANFIS senses the PV current using a suitable sensor and also senses the temperature to determine the optimal operating voltage that corresponds to the current at the MPP. This voltage is used to control the boost converter duty cycle. The MATLAB simulation results show the effectiveness of the ANFIS with sensing of the PV current in obtaining MPPT from the PV array.
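
    As an illustrative sketch only (not the paper's controller), the snippet below shows how an ANFIS-predicted optimum PV voltage could be mapped to a boost-converter duty cycle using the ideal relation V_out = V_in / (1 - D). The predictor is a hypothetical stand-in for the trained ANFIS, and the DC-link voltage and clamping limits are assumed values.

      def duty_cycle_from_vmpp(v_mpp, v_dc_link):
          """Ideal boost-converter relation: v_dc_link = v_mpp / (1 - D)  =>  D = 1 - v_mpp / v_dc_link."""
          d = 1.0 - v_mpp / v_dc_link
          return min(max(d, 0.0), 0.95)   # clamp to an assumed safe operating range

      def predict_vmpp(pv_current, cell_temperature):
          """Hypothetical stand-in for the trained ANFIS mapping (current, temperature) to V_mpp."""
          return 17.0 - 0.07 * (cell_temperature - 25.0)   # toy linear surrogate, not the real model

      v_mpp = predict_vmpp(pv_current=5.2, cell_temperature=40.0)
      print(duty_cycle_from_vmpp(v_mpp, v_dc_link=48.0))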

  6. Reliable Detection of Loci Responsible for Local Adaptation: Inference of a Null Model through Trimming the Distribution of F(ST).

    PubMed

    Whitlock, Michael C; Lotterhos, Katie E

    2015-10-01

    Loci responsible for local adaptation are likely to have more genetic differentiation among populations than neutral loci. However, neutral loci can vary widely in their amount of genetic differentiation, even over the same geographic range. Unfortunately, the distribution of differentiation--as measured by an index such as F(ST)--depends on the details of the demographic history of the populations in question, even without spatially heterogeneous selection. Many methods designed to detect F(ST) outliers assume a specific model of demographic history, which can result in extremely high false positive rates for detecting loci under selection. We develop a new method that infers the distribution of F(ST) for loci unlikely to be strongly affected by spatially diversifying selection, using data on a large set of loci with unknown selective properties. Compared to previous methods, this approach, called OutFLANK, has much lower false positive rates and comparable power, as shown by simulation.

  7. Comparison of adaptive neuro-fuzzy inference system (ANFIS) and Gaussian processes for machine learning (GPML) algorithms for the prediction of skin temperature in lower limb prostheses.

    PubMed

    Mathur, Neha; Glesk, Ivan; Buis, Arjan

    2016-10-01

    Monitoring of the interface temperature at skin level in lower-limb prostheses is notoriously complicated. This is due to the flexible nature of the interface liners used, which impedes the required consistent positioning of the temperature sensors during donning and doffing. Predicting the in-socket residual limb temperature by monitoring the temperature between socket and liner, rather than between skin and liner, could be an important step in alleviating complaints about increased temperature and perspiration in prosthetic sockets. In this work, we propose to implement an adaptive neuro-fuzzy inference system (ANFIS) to predict the in-socket residual limb temperature. ANFIS belongs to the family of fused neuro-fuzzy systems, in which the fuzzy system is incorporated into a framework that is adaptive in nature. The proposed method is compared to our earlier work using Gaussian processes for machine learning. By comparing the predicted and actual data, the results indicate that both modeling techniques have comparable performance metrics and can be efficiently used for non-invasive temperature monitoring.

  8. Ecological Inference

    NASA Astrophysics Data System (ADS)

    King, Gary; Rosen, Ori; Tanner, Martin A.

    2004-09-01

    This collection of essays brings together a diverse group of scholars to survey the latest strategies for solving ecological inference problems in various fields. The last half-decade has witnessed an explosion of research in ecological inference--the process of trying to infer individual behavior from aggregate data. Although uncertainties and information lost in aggregation make ecological inference one of the most problematic types of research to rely on, these inferences are required in many academic fields, as well as by legislatures and the Courts in redistricting, by business in marketing research, and by governments in policy analysis.

  9. Application of adaptive neuro-fuzzy inference system techniques and artificial neural networks to predict solid oxide fuel cell performance in residential microgeneration installation

    NASA Astrophysics Data System (ADS)

    Entchev, Evgueniy; Yang, Libing

    This study applies adaptive neuro-fuzzy inference system (ANFIS) techniques and an artificial neural network (ANN) to predict solid oxide fuel cell (SOFC) performance while supplying both heat and power to a residence. A microgeneration 5 kW el SOFC system was installed at the Canadian Centre for Housing Technology (CCHT), integrated with the existing mechanical systems and connected in parallel to the grid. SOFC performance data were collected during the winter heating season and used for training of both the ANN and ANFIS models. The ANN model was built on a backpropagation algorithm, while for the ANFIS model a combination of the least-squares method and the backpropagation gradient descent method was developed and applied. Both models were trained with experimental data and used to predict selected SOFC performance parameters such as fuel cell stack current, stack voltage, etc. The study revealed that both the ANN and ANFIS models' predictions agreed well with a variety of experimental data sets representing steady-state, start-up and shut-down operations of the SOFC system. The initial data set was subjected to a detailed sensitivity analysis, and statistically insignificant parameters were excluded from the training set. As a result, a significant reduction of computational time was achieved without affecting the models' accuracy. The study showed that adaptive models can be applied with confidence during the design process and for performance optimization of existing and newly developed solid oxide fuel cell systems. It demonstrated that by using ANN and ANFIS techniques the SOFC microgeneration system's performance could be modelled with minimum time demand and a high degree of accuracy.
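
    A minimal sketch of the least-squares half of such a hybrid learning step is shown below for a first-order Sugeno-type fuzzy model with fixed Gaussian premise parameters: the normalised rule firing strengths weight a linear regression whose consequent coefficients are solved in closed form. The two-rule structure and all numbers are illustrative assumptions, not the SOFC model; in a full hybrid scheme this step would alternate with gradient-descent updates of the membership-function centres and widths.

      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(0.0, 1.0, size=(50, 1))                  # single input, 50 training samples
      y = 2.0 * x[:, 0] + 1.0 + 0.05 * rng.normal(size=50)     # toy target

      # Fixed premise part: two Gaussian membership functions ("low", "high").
      centers, sigma = np.array([0.2, 0.8]), 0.3
      w = np.exp(-0.5 * ((x - centers) / sigma) ** 2)          # rule firing strengths, shape (50, 2)
      w_norm = w / w.sum(axis=1, keepdims=True)                # normalised firing strengths

      # Consequent part: each rule i contributes w_i * (p_i * x + r_i), linear in [p1, p2, r1, r2].
      A = np.hstack([w_norm * x, w_norm])                      # columns: w1*x, w2*x, w1, w2
      theta, *_ = np.linalg.lstsq(A, y, rcond=None)            # closed-form least-squares consequents
      print("consequent parameters [p1, p2, r1, r2]:", theta)
      print("training RMSE:", float(np.sqrt(np.mean((A @ theta - y) ** 2))))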

  10. Use of an adaptive neuro-fuzzy inference system to obtain the correspondence among balance, gait, and depression for Parkinson's disease

    NASA Astrophysics Data System (ADS)

    Woo, Youngkeun; Lee, Juwon; Hwang, Sujin; Hong, Cheol Pyo

    2013-03-01

    The purpose of this study was to investigate the associations between gait performance, postural stability, and depression in patients with Parkinson's disease (PD) by using an adaptive neuro-fuzzy inference system (ANFIS). Twenty-two idiopathic PD patients were assessed during outpatient physical therapy by using three clinical tests: the Berg balance scale (BBS), Dynamic gait index (DGI), and Geriatric depression scale (GDS). Scores were determined from clinical observation and patient interviews, and associations among gait performance, postural stability, and depression in this PD population were evaluated. The DGI showed significant positive correlation with the BBS scores, and negative correlation with the GDS score. We assessed the relationship between the BBS score and the DGI results by using a multiple regression analysis. In this case, the GDS score was not significantly associated with the DGI, but the BBS and DGI results were. Strikingly, the ANFIS-estimated value of the DGI, based on the BBS and the GDS scores, significantly correlated with the walking ability determined by using the DGI in patients with Parkinson's disease. These findings suggest that the ANFIS techniques effectively reflect and explain the multidirectional phenomena or conditions of gait performance in patients with PD.

  11. An exploratory investigation of an adaptive neuro fuzzy inference system (ANFIS) for estimating hydrometeors from TRMM/TMI in synergy with TRMM/PR

    NASA Astrophysics Data System (ADS)

    Islam, Tanvir; Srivastava, Prashant K.; Rico-Ramirez, Miguel A.; Dai, Qiang; Han, Dawei; Gupta, Manika

    2014-08-01

    The authors have investigated an adaptive neuro-fuzzy inference system (ANFIS) for the estimation of hydrometeors from the TRMM microwave imager (TMI). The proposed algorithm, named the Hydro-Rain algorithm, is developed in synergy with hydrometeor information observed by the TRMM precipitation radar (PR). The method retrieves rain rates by exploiting the synergistic relations between the TMI and PR observations in two steps. First, the fundamental hydrometeor parameters, liquid water path (LWP) and ice water path (IWP), are estimated from the TMI brightness temperatures. Next, the rain rates are estimated from the retrieved hydrometeor parameters (LWP and IWP). A comparison of the hydrometeor retrievals by the Hydro-Rain algorithm is made with the TRMM PR 2A25 and GPROF 2A12 algorithms. The results reveal that the Hydro-Rain algorithm has good skill in estimating the hydrometeor paths LWP and IWP, as well as the surface rain rate. An examination of the Hydro-Rain algorithm is also conducted on a super typhoon case, in which Hydro-Rain has shown very good performance in reproducing the typhoon field. Nevertheless, the passive-microwave-based estimates of hydrometeors appear to suffer in high rain rate regimes, and as the rain rate increases, the discrepancies with the hydrometeor estimates tend to increase as well.

  12. Using adaptive neuro-fuzzy inference system technique for crosstalk correction in simultaneous 99mTc/201Tl SPECT imaging: A Monte Carlo simulation study

    NASA Astrophysics Data System (ADS)

    Heidary, Saeed; Setayeshi, Saeed

    2015-01-01

    This work presents a Monte Carlo simulation-based study that uses two adaptive neuro-fuzzy inference systems (ANFIS) for crosstalk compensation of simultaneous 99mTc/201Tl dual-radioisotope SPECT imaging. We have compared two neuro-fuzzy systems based on fuzzy c-means (FCM) and subtractive (SUB) clustering. Our approach incorporates image acquisition in eight energy windows from 28 keV to 156 keV and the two main photopeaks of 201Tl (77 keV ± 10%) and 99mTc (140 keV ± 10%). The Geant4 Application for Tomographic Emission (GATE) is used as the Monte Carlo simulator for three cylindrical phantoms and a NURBS-based cardiac torso (NCAT) phantom study. Three separate acquisitions, comprising two single-isotope acquisitions and one dual-isotope acquisition, were performed in this study. Crosstalk- and scatter-corrected projections are reconstructed by an iterative ordered-subsets expectation maximization (OSEM) algorithm which models the non-uniform attenuation in the projection/back-projection. The ANFIS-FCM/SUB structures are tuned to create three to sixteen fuzzy rules for modeling the photon crosstalk of the two radioisotopes. Applying seven to nine fuzzy rules leads to the best overall improvement in contrast and bias. It is found that ANFIS-FCM outperforms ANFIS-SUB owing to its speed and more accurate results.

  13. A prediction model of ammonia emission from a fattening pig room based on the indoor concentration using adaptive neuro fuzzy inference system.

    PubMed

    Xie, Qiuju; Ni, Ji-Qin; Su, Zhongbin

    2017-03-05

    Ammonia (NH3) is considered one of the significant contributors to poor indoor air quality and odorous gas emission from swine houses because of its negative impact on the health of the pigs and workers and on the local environment. Prediction models could provide a reasonable way for the pig industry and environmental regulators to determine environmental control strategies and an effective method to evaluate air quality. The adaptive neuro-fuzzy inference system (ANFIS) mimics human approximate reasoning to handle ambiguous and nonlinear problems that are difficult to treat with conventional mathematics. Five kinds of membership functions were used to build a well-fitted ANFIS prediction model. It was shown that the prediction model with the "Gbell" membership function had the best capability among the five kinds of membership functions, and that it performed best compared with a backpropagation (BP) neural network model and a multiple linear regression model (MLRM) in both wintertime and summertime: the smallest values of mean square error (MSE), mean absolute percentage error (MAPE) and standard deviation (SD) were 0.002 and 0.0047, 31.1599 and 23.6816, and 0.0564 and 0.0802, respectively, and the largest coefficients of determination (R(2)) were 0.6351 and 0.6483, respectively. The ANFIS prediction model could serve as a beneficial strategy for environment control systems whose input parameters are highly fluctuating, complex, and nonlinearly related.
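
    The generalised bell ("Gbell") membership function referred to above has the standard form 1 / (1 + |(x - c)/a|^(2b)). A small Python sketch with arbitrary parameters follows; the concentration scale and parameter values are invented for illustration.

      import numpy as np

      def gbellmf(x, a, b, c):
          """Generalised bell membership function: width a, slope b, centre c."""
          return 1.0 / (1.0 + np.abs((np.asarray(x, float) - c) / a) ** (2 * b))

      # Membership of an indoor NH3 concentration (arbitrary units) in a "medium" fuzzy set.
      x = np.linspace(0.0, 20.0, 9)
      print(gbellmf(x, a=4.0, b=2.0, c=10.0))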

  14. On-line self-learning time forward voltage prognosis for lithium-ion batteries using adaptive neuro-fuzzy inference system

    NASA Astrophysics Data System (ADS)

    Fleischer, Christian; Waag, Wladislaw; Bai, Ziou; Sauer, Dirk Uwe

    2013-12-01

    The battery management system (BMS) of a battery-electric road vehicle must ensure optimal operation of the electrochemical storage system to guarantee durability and reliability. In particular, the BMS must provide precise information about the battery's state of functionality, i.e. how much dis-/charging power the battery can accept at its current state and condition, while at the same time preventing it from operating outside its safe operating area. These critical limits have to be calculated in a predictive manner and serve as a significant input factor for the supervising vehicle energy management (VEM). The VEM must provide enough power to the vehicle's drivetrain for certain tasks and especially in critical driving situations. Therefore, this paper describes a new approach which can be used for state-of-available-power estimation with respect to lowest/highest cell voltage prediction using an adaptive neuro-fuzzy inference system (ANFIS). The estimated voltage for a given time frame in the future is directly compared with the actual voltage, verifying the effectiveness and accuracy of the approach with a relative voltage prediction error of less than 1%. Moreover, the real-time operating capability of the proposed algorithm was verified on a battery test bench while running on a real-time system performing voltage prediction.

  15. An adaptive neuro fuzzy inference system controlled space vector pulse width modulation based HVDC light transmission system under AC fault conditions

    NASA Astrophysics Data System (ADS)

    Ajay Kumar, M.; Srikanth, N. V.

    2014-03-01

    In HVDC Light transmission systems, converter control is one of the major fields of present-day research. In this paper, a fuzzy logic controller is utilized for controlling both converters of the space vector pulse width modulation (SVPWM) based HVDC Light transmission system. Due to the complexity of rule-base formation, an intelligent controller known as the adaptive neuro-fuzzy inference system (ANFIS) controller is also introduced in this paper. The proposed ANFIS controller changes the PI gains automatically for different operating conditions. A hybrid learning method which combines and exploits the best features of both the backpropagation algorithm and the least-squares estimation method is used to train the 5-layer ANFIS controller. The performance of the proposed ANFIS controller is compared and validated against the fuzzy logic controller and also against a fixed-gain conventional PI controller. The simulations are carried out in the MATLAB/SIMULINK environment. The results reveal that the proposed ANFIS controller reduces power fluctuations at both converters. It also improves the dynamic performance of the test power system effectively when tested for various AC fault conditions.

  16. Genetic algorithm-artificial neural network and adaptive neuro-fuzzy inference system modeling of antibacterial activity of annatto dye on Salmonella enteritidis.

    PubMed

    Yolmeh, Mahmoud; Habibi Najafi, Mohammad B; Salehi, Fakhreddin

    2014-01-01

    Annatto is commonly used as a coloring agent in the food industry and has antimicrobial and antioxidant properties. In this study, genetic algorithm-artificial neural network (GA-ANN) and adaptive neuro-fuzzy inference system (ANFIS) models were used to predict the effect of annatto dye on Salmonella enteritidis in mayonnaise. The GA-ANN and ANFIS were fed with 3 inputs of annatto dye concentration (0, 0.1, 0.2 and 0.4%), storage temperature (4 and 25°C) and storage time (1-20 days) for prediction of the S. enteritidis population. Both models were trained with experimental data. The results showed that the annatto dye was able to reduce the S. enteritidis population and that its effect was stronger at 25°C than at 4°C. The developed GA-ANN, which included 8 hidden neurons, could predict the S. enteritidis population with a correlation coefficient of 0.999. The overall agreement between the ANFIS predictions and the experimental data was also very good (r=0.998). Sensitivity analysis results showed that storage temperature was the most sensitive factor for prediction of the S. enteritidis population.

  17. Adaptive Neuro-Fuzzy Inference system analysis on adsorption studies of Reactive Red 198 from aqueous solution by SBA-15/CTAB composite

    NASA Astrophysics Data System (ADS)

    Aghajani, Khadijeh; Tayebi, Habib-Allah

    2017-01-01

    In this study, the mesoporous material SBA-15 was synthesized and its surface was then modified with the surfactant cetyltrimethylammonium bromide (CTAB). The obtained adsorbent was then used to remove Reactive Red 198 (RR 198) from aqueous solution. Transmission electron microscopy (TEM), Fourier transform infrared spectroscopy (FTIR), thermogravimetric analysis (TGA), X-ray diffraction (XRD), and BET analysis were utilized to examine the structural characteristics of the obtained adsorbent. Parameters affecting the removal of RR 198, such as pH, the amount of adsorbent, and contact time, were investigated at various temperatures and were also optimized. The obtained optimized conditions are as follows: pH = 2, time = 60 min and adsorbent dose = 1 g/l. Moreover, a predictive model based on ANFIS for predicting the adsorption amount from the input variables is presented. The presented model can be used for predicting the adsorption rate based on the input variables, which include temperature, pH, time, dosage, and concentration. The small error between the actual and approximated outputs confirms the high accuracy of the proposed model in the prediction process. This results in cost reduction, because prediction can be done without resorting to costly experimental efforts. Keywords: SBA-15, CTAB, Reactive Red 198, adsorption study, Adaptive Neuro-Fuzzy Inference System (ANFIS).

  18. Adaptation.

    PubMed

    Broom, Donald M

    2006-01-01

    The term adaptation is used in biology in three different ways. It may refer to changes which occur at the cell and organ level, or at the individual level, or at the level of gene action and evolutionary processes. Adaptation by cells, especially nerve cells helps in: communication within the body, the distinguishing of stimuli, the avoidance of overload and the conservation of energy. The time course and complexity of these mechanisms varies. Adaptive characters of organisms, including adaptive behaviours, increase fitness so this adaptation is evolutionary. The major part of this paper concerns adaptation by individuals and its relationships to welfare. In complex animals, feed forward control is widely used. Individuals predict problems and adapt by acting before the environmental effect is substantial. Much of adaptation involves brain control and animals have a set of needs, located in the brain and acting largely via motivational mechanisms, to regulate life. Needs may be for resources but are also for actions and stimuli which are part of the mechanism which has evolved to obtain the resources. Hence pigs do not just need food but need to be able to carry out actions like rooting in earth or manipulating materials which are part of foraging behaviour. The welfare of an individual is its state as regards its attempts to cope with its environment. This state includes various adaptive mechanisms including feelings and those which cope with disease. The part of welfare which is concerned with coping with pathology is health. Disease, which implies some significant effect of pathology, always results in poor welfare. Welfare varies over a range from very good, when adaptation is effective and there are feelings of pleasure or contentment, to very poor. A key point concerning the concept of individual adaptation in relation to welfare is that welfare may be good or poor while adaptation is occurring. Some adaptation is very easy and energetically cheap and

  19. Adaptive neuro-fuzzy inference system (ANFIS) to predict CI engine parameters fueled with nano-particles additive to diesel fuel

    NASA Astrophysics Data System (ADS)

    Ghanbari, M.; Najafi, G.; Ghobadian, B.; Mamat, R.; Noor, M. M.; Moosavian, A.

    2015-12-01

    This paper studies the use of an adaptive neuro-fuzzy inference system (ANFIS) to predict the performance parameters and exhaust emissions of a diesel engine operating on nano-diesel blended fuels. In order to predict the engine parameters, the whole set of experimental data was randomly divided into training and testing data. For ANFIS modelling, the Gaussian curve membership function (gaussmf) and 200 training epochs (iterations) were found to be the optimum choices for the training process. The results demonstrate that ANFIS is capable of predicting the diesel engine performance and emissions. In the experimental step, carbon nanotubes (CNT) (40, 80 and 120 ppm) and silver nanoparticles (40, 80 and 120 ppm) were prepared and added as additives to the diesel fuel. A six-cylinder, four-stroke diesel engine was fuelled with these new blended fuels and operated at different engine speeds. The experimental test results indicated that adding nanoparticles to diesel fuel increased the diesel engine power and torque output. For nano-diesel, it was found that the brake specific fuel consumption (bsfc) decreased compared to the neat diesel fuel. The results showed that with an increase of nanoparticle concentration (from 40 ppm to 120 ppm) in the diesel fuel, CO2 emission increased. CO emission with the nanoparticle-blended diesel fuels was significantly lower compared to pure diesel fuel. UHC emission decreased with the silver nano-diesel blended fuel, while it increased with the fuels containing CNT nanoparticles. The trend of NOx emission was the inverse of the UHC emission: with the addition of nanoparticles to the blended fuels, NOx increased compared to the neat diesel fuel. The tests revealed that silver and CNT nanoparticles can be used as additives in diesel fuel to improve combustion of the fuel and reduce the exhaust emissions significantly.

  20. Estimation of Flow Duration Curve for Ungauged Catchments using Adaptive Neuro-Fuzzy Inference System and Map Correlation Method: A Case Study from Turkey

    NASA Astrophysics Data System (ADS)

    Kentel, E.; Dogulu, N.

    2015-12-01

    In Turkey, the experience and data required for hydrological model setup are limited and very often not available. Moreover, there are many ungauged catchments with planned projects aimed at the utilization of water resources, including development of the existing hydropower potential. This situation makes runoff prediction at data-scarce and ungauged locations where small hydropower plants, reservoirs, etc. are planned an increasingly significant challenge and concern in the country. Flow duration curves have many practical applications in hydrology and integrated water resources management. Estimation of the flow duration curve (FDC) at ungauged locations is essential, particularly for hydropower feasibility studies and the selection of installed capacities. In this study, we test and compare the performance of two methods for estimating FDCs in the Western Black Sea catchment, Turkey: (i) the FDC based on Map Correlation Method (MCM) flow estimates. MCM is a recently proposed method (Archfield and Vogel, 2010) which uses geospatial information to estimate flow; flow measurements of stream gauging stations near the ungauged location are the only data requirement, which makes MCM very attractive for flow estimation in Turkey. (ii) The FDC based on the Adaptive Neuro-Fuzzy Inference System (ANFIS), a data-driven method used to relate the FDC to a number of variables representing catchment and climate characteristics; its ease of implementation makes it very useful for practical purposes. Both methods use easily collectable data and are computationally efficient. Comparison of the results is based on two different measures: the root mean squared error (RMSE) and the Nash-Sutcliffe Efficiency (NSE) value. Ref: Archfield, S. A., and R. M. Vogel (2010), Map correlation method: Selection of a reference streamgage to estimate daily streamflow at ungaged catchments, Water Resour. Res., 46, W10513, doi:10.1029/2009WR008481.
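
    A short sketch of the two comparison measures, assuming paired observed and estimated flow-duration values, is given below; the sample arrays are invented for illustration and do not come from the study.

      import numpy as np

      def rmse(obs, sim):
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return float(np.sqrt(np.mean((obs - sim) ** 2)))

      def nse(obs, sim):
          """Nash-Sutcliffe Efficiency: 1 - sum((obs - sim)^2) / sum((obs - mean(obs))^2)."""
          obs, sim = np.asarray(obs, float), np.asarray(sim, float)
          return float(1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2))

      # Hypothetical flows (m3/s) read off an observed FDC and two estimated FDCs.
      observed = np.array([45.0, 30.0, 22.0, 15.0, 9.0, 4.0])
      mcm_fdc = np.array([43.0, 31.5, 20.0, 14.0, 10.0, 5.0])
      anfis_fdc = np.array([40.0, 33.0, 25.0, 12.0, 8.0, 6.0])

      for name, est in [("MCM", mcm_fdc), ("ANFIS", anfis_fdc)]:
          print(f"{name}: RMSE={rmse(observed, est):.2f}, NSE={nse(observed, est):.3f}")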

  1. Adaptive neuro-fuzzy inference system model for adsorption of 1,3,4-thiadiazole-2,5-dithiol onto gold nanoparticles-activated carbon.

    PubMed

    Ghaedi, M; Hosaininia, R; Ghaedi, A M; Vafaei, A; Taghizadeh, F

    2014-10-15

    In this research, a novel adsorbent, gold nanoparticles loaded on activated carbon (Au-NP-AC), was synthesized using ultrasound energy as a low-cost route. Subsequently, this novel material was characterized and identified by different techniques such as scanning electron microscopy (SEM), Brunauer-Emmett-Teller (BET) analysis and transmission electron microscopy (TEM). Unique properties such as a high BET surface area (>1229.55 m(2)/g), a low pore size (<22.46 Å), an average particle size lower than 48.8 Å, a high number of reactive atoms and the presence of various functional groups make efficient removal of 1,3,4-thiadiazole-2,5-dithiol (TDDT) possible. The influence of variables, including the amount of adsorbent, initial pollutant concentration and contact time, on the removal percentage was investigated and optimized. The optimum parameters for adsorption of 1,3,4-thiadiazole-2,5-dithiol onto the gold nanoparticle-activated carbon were 0.02 g adsorbent mass, 10 mg L(-1) initial 1,3,4-thiadiazole-2,5-dithiol concentration, 30 min contact time and pH 7. Adaptive neuro-fuzzy inference system (ANFIS) and multiple linear regression (MLR) models have been applied for prediction of the removal of 1,3,4-thiadiazole-2,5-dithiol using gold nanoparticle-activated carbon (Au-NP-AC) in a batch study. The input data included adsorbent dosage (g), contact time (min) and pollutant concentration (mg/l). The coefficient of determination (R(2)) and mean squared error (MSE) for the training data set of the optimal ANFIS model were found to be 0.9951 and 0.00017, respectively. These results show that the ANFIS model is capable of predicting the adsorption of 1,3,4-thiadiazole-2,5-dithiol using Au-NP-AC with high accuracy in an easy, rapid and cost-effective way.

  2. Adaptive neuro-fuzzy inference system model for adsorption of 1,3,4-thiadiazole-2,5-dithiol onto gold nanoparticles-activated carbon

    NASA Astrophysics Data System (ADS)

    Ghaedi, M.; Hosaininia, R.; Ghaedi, A. M.; Vafaei, A.; Taghizadeh, F.

    2014-10-01

    In this research, a novel adsorbent, gold nanoparticles loaded on activated carbon (Au-NP-AC), was synthesized using ultrasound energy as a low-cost route. Subsequently, this novel material was characterized and identified by different techniques such as scanning electron microscopy (SEM), Brunauer-Emmett-Teller (BET) analysis and transmission electron microscopy (TEM). Unique properties such as a high BET surface area (>1229.55 m2/g), a low pore size (<22.46 Å), an average particle size lower than 48.8 Å, a high number of reactive atoms and the presence of various functional groups make efficient removal of 1,3,4-thiadiazole-2,5-dithiol (TDDT) possible. The influence of variables, including the amount of adsorbent, initial pollutant concentration and contact time, on the removal percentage was investigated and optimized. The optimum parameters for adsorption of 1,3,4-thiadiazole-2,5-dithiol onto the gold nanoparticle-activated carbon were 0.02 g adsorbent mass, 10 mg L-1 initial 1,3,4-thiadiazole-2,5-dithiol concentration, 30 min contact time and pH 7. Adaptive neuro-fuzzy inference system (ANFIS) and multiple linear regression (MLR) models have been applied for prediction of the removal of 1,3,4-thiadiazole-2,5-dithiol using gold nanoparticle-activated carbon (Au-NP-AC) in a batch study. The input data included adsorbent dosage (g), contact time (min) and pollutant concentration (mg/l). The coefficient of determination (R2) and mean squared error (MSE) for the training data set of the optimal ANFIS model were found to be 0.9951 and 0.00017, respectively. These results show that the ANFIS model is capable of predicting the adsorption of 1,3,4-thiadiazole-2,5-dithiol using Au-NP-AC with high accuracy in an easy, rapid and cost-effective way.

  3. Adapt

    NASA Astrophysics Data System (ADS)

    Bargatze, L. F.

    2015-12-01

    Active Data Archive Product Tracking (ADAPT) is a collection of software routines that permits one to generate XML metadata files to describe and register data products in support of the NASA Heliophysics Virtual Observatory VxO effort. ADAPT is also a philosophy. The ADAPT concept is to use any and all available metadata associated with scientific data to produce XML metadata descriptions in a consistent, uniform, and organized fashion to provide blanket access to the full complement of data stored on a targeted data server. In this poster, we present an application of ADAPT to describe all of the data products that are stored in the Common Data Format (CDF) and served out by the CDAWEB and SPDF data servers hosted at the NASA Goddard Space Flight Center. These data servers are the primary repositories for NASA Heliophysics data. For this purpose, the ADAPT routines have been used to generate data resource descriptions by using an XML schema named Space Physics Archive, Search, and Extract (SPASE). SPASE is the designated standard for documenting Heliophysics data products, as adopted by the Heliophysics Data and Model Consortium. The set of SPASE XML resource descriptions produced by ADAPT includes high-level descriptions of numerical data products, display data products, or catalogs and also includes low-level "Granule" descriptions. A SPASE Granule is effectively a universal access metadata resource; a Granule associates an individual data file (e.g. a CDF file) with a "parent" high-level data resource description, assigns a resource identifier to the file, and lists the corresponding access URL(s). The CDAWEB and SPDF file systems were queried to provide the input required by the ADAPT software to create an initial set of SPASE metadata resource descriptions. Then, the CDAWEB and SPDF data repositories were queried subsequently on a nightly basis and the CDF file lists were checked for any changes such as the occurrence of new, modified, or deleted

  4. Perceptual inference.

    PubMed

    Aggelopoulos, Nikolaos C

    2015-08-01

    Perceptual inference refers to the ability to infer sensory stimuli from predictions that result from internal neural representations built through prior experience. Methods of Bayesian statistical inference and decision theory model cognition adequately by using error sensing either in guiding action or in "generative" models that predict the sensory information. In this framework, perception can be seen as a process qualitatively distinct from sensation, a process of information evaluation using previously acquired and stored representations (memories) that is guided by sensory feedback. The stored representations can be utilised as internal models of sensory stimuli enabling long term associations, for example in operant conditioning. Evidence for perceptual inference is contributed by such phenomena as the cortical co-localisation of object perception with object memory, the response invariance in the responses of some neurons to variations in the stimulus, as well as from situations in which perception can be dissociated from sensation. In the context of perceptual inference, sensory areas of the cerebral cortex that have been facilitated by a priming signal may be regarded as comparators in a closed feedback loop, similar to the better known motor reflexes in the sensorimotor system. The adult cerebral cortex can be regarded as similar to a servomechanism, in using sensory feedback to correct internal models, producing predictions of the outside world on the basis of past experience.

  5. Statistical Inference

    NASA Astrophysics Data System (ADS)

    Khan, Shahjahan

    Often scientific information on various data generating processes is presented in the form of numerical and categorical data. Except for some very rare occasions, such data generally represent a small part of the population, or selected outcomes of a data generating process. Although valuable and useful information is lurking in the array of scientific data, it is generally unavailable to the users. Appropriate statistical methods are essential to reveal the hidden "jewels" in the mess of the raw data. Exploratory data analysis methods are used to uncover such valuable characteristics of the observed data. Statistical inference provides techniques to make valid conclusions about the unknown characteristics or parameters of the population from which scientifically drawn sample data are selected. Usually, statistical inference includes estimation of population parameters as well as performing tests of hypotheses on the parameters. However, prediction of future responses and determining the prediction distributions are also part of statistical inference. Both the Classical or Frequentist and the Bayesian approaches are used in statistical inference. The commonly used Classical approach is based on the sample data alone. In contrast, the increasingly popular Bayesian approach uses a prior distribution on the parameters along with the sample data to make inferences. Non-parametric and robust methods are also used in situations where commonly used model assumptions are unsupported. In this chapter, we cover the philosophical and methodological aspects of both the Classical and Bayesian approaches. Moreover, some aspects of predictive inference are also included. In the absence of any evidence to support assumptions regarding the distribution of the underlying population, or if the variable is measured only on an ordinal scale, non-parametric methods are used. Robust methods are employed to avoid any significant changes in the results due to deviations from the model

  7. Adaptive fuzzy system for 3-D vision

    NASA Technical Reports Server (NTRS)

    Mitra, Sunanda

    1993-01-01

    An adaptive fuzzy system using the concept of the Adaptive Resonance Theory (ART) type neural network architecture and incorporating fuzzy c-means (FCM) system equations for reclassification of cluster centers was developed. The Adaptive Fuzzy Leader Clustering (AFLC) architecture is a hybrid neural-fuzzy system which learns on-line in a stable and efficient manner. The system uses a control structure similar to that found in the Adaptive Resonance Theory (ART-1) network to identify the cluster centers initially. The initial classification of an input takes place in a two stage process; a simple competitive stage and a distance metric comparison stage. The cluster prototypes are then incrementally updated by relocating the centroid positions from Fuzzy c-Means (FCM) system equations for the centroids and the membership values. The operational characteristics of AFLC and the critical parameters involved in its operation are discussed. The performance of the AFLC algorithm is presented through application of the algorithm to the Anderson Iris data, and laser-luminescent fingerprint image data. The AFLC algorithm successfully classifies features extracted from real data, discrete or continuous, indicating the potential strength of this new clustering algorithm in analyzing complex data sets. The hybrid neuro-fuzzy AFLC algorithm will enhance analysis of a number of difficult recognition and control problems involved with Tethered Satellite Systems and on-orbit space shuttle attitude controller.
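
    A minimal sketch of the fuzzy c-means update equations used for the membership and centroid recomputation is shown below (plain FCM only, without the ART-style control structure of AFLC); the data points and parameter values are random illustrative choices.

      import numpy as np

      def fcm(X, n_clusters=2, m=2.0, n_iter=50, seed=0):
          """Plain fuzzy c-means: alternate membership and centroid updates."""
          rng = np.random.default_rng(seed)
          U = rng.random((X.shape[0], n_clusters))
          U /= U.sum(axis=1, keepdims=True)                       # memberships sum to 1 per sample
          for _ in range(n_iter):
              Um = U ** m
              centroids = (Um.T @ X) / Um.sum(axis=0)[:, None]    # weighted centroid update
              d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
              U = 1.0 / (d ** (2.0 / (m - 1.0)))                  # inverse-distance memberships
              U /= U.sum(axis=1, keepdims=True)
          return centroids, U

      # Two well-separated synthetic clusters in 2-D.
      X = np.vstack([np.random.default_rng(1).normal(0.0, 0.3, (20, 2)),
                     np.random.default_rng(2).normal(3.0, 0.3, (20, 2))])
      centroids, memberships = fcm(X)
      print(centroids)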

  8. Estimating patient specific uncertainty parameters for adaptive treatment re-planning in proton therapy using in vivo range measurements and Bayesian inference: application to setup and stopping power errors

    NASA Astrophysics Data System (ADS)

    Labarbe, Rudi; Janssens, Guillaume; Sterpin, Edmond

    2016-09-01

    In proton therapy, quantification of the proton range uncertainty is important to achieve dose distribution compliance. The promising accuracy of prompt gamma imaging (PGI) suggests the development of a mathematical framework that uses the range measurements to convert population-based estimates of uncertainties into patient-specific estimates for the purpose of plan adaptation. We present here such a framework using Bayesian inference. The sources of uncertainty were modeled by three parameters: the setup bias m, the random setup precision r and the water equivalent path length bias u. The evolution of the expectation values E(m), E(r) and E(u) during the treatment was simulated. The expectation values converged towards the true simulation parameters after 5 and 10 fractions for E(m) and E(u), respectively. E(r) settled on a constant value slightly lower than the true value after 10 fractions. In conclusion, the simulation showed that there is enough information in the frequency distribution of the range errors measured by PGI to estimate the expectation values and the confidence intervals of the model parameters by Bayesian inference. The updated model parameters were used to compute patient-specific lateral and local distal margins for adaptive re-planning.
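
    As an illustrative sketch only (not the paper's actual model), a conjugate normal update shows how repeated per-fraction range-error measurements could sharpen the expectation value of a setup bias m; the prior width, measurement noise, and simulated "true" bias are assumed numbers.

      import numpy as np

      rng = np.random.default_rng(0)
      true_bias_mm, meas_sigma_mm = 2.0, 3.0        # assumed truth and per-fraction measurement noise
      post_mean, post_var = 0.0, 5.0 ** 2           # population-based prior on the setup bias m (mm)

      for fraction in range(1, 11):
          r = rng.normal(true_bias_mm, meas_sigma_mm)               # range error measured this fraction
          # Conjugate normal-normal update of the bias posterior.
          new_var = 1.0 / (1.0 / post_var + 1.0 / meas_sigma_mm ** 2)
          post_mean = new_var * (post_mean / post_var + r / meas_sigma_mm ** 2)
          post_var = new_var
          print(f"fraction {fraction:2d}: E(m) = {post_mean:5.2f} mm, sd = {post_var ** 0.5:4.2f} mm")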

  9. The association forecasting of 13 variants within seven asthma susceptibility genes on 3 serum IgE groups in Taiwanese population by integrating of adaptive neuro-fuzzy inference system (ANFIS) and classification analysis methods.

    PubMed

    Wang, Cheng-Hang; Liu, Baw-Jhiune; Wu, Lawrence Shih-Hsin

    2012-02-01

    Asthma is one of the most common chronic diseases in children. It is caused by complicated coactions between various genetic factors and environmental allergens. This study aims to integrate an adaptive neuro-fuzzy inference system (ANFIS) and classification analysis methods for forecasting the association of asthma susceptibility genes with 3 serum IgE groups. The ANFIS model was trained and tested with data sets obtained from 425 asthmatic subjects and 483 non-asthma subjects from the Taiwanese population. We assessed 13 single-nucleotide polymorphisms (SNPs) in seven well-known asthma susceptibility genes. First, the proposed ANFIS model learned to reduce the input features from the 13 SNPs; second, classification was used to assign the serum IgE groups from the simulated SNP results. The performance of the ANFIS model, the classification accuracies and the results confirmed that the integration of ANFIS and classification analysis has potential in association discovery.

  10. Sites Inferred by Metabolic Background Assertion Labeling (SIMBAL): adapting the Partial Phylogenetic Profiling algorithm to scan sequences for signatures that predict protein function

    PubMed Central

    2010-01-01

    Background Comparative genomics methods such as phylogenetic profiling can mine powerful inferences from inherently noisy biological data sets. We introduce Sites Inferred by Metabolic Background Assertion Labeling (SIMBAL), a method that applies the Partial Phylogenetic Profiling (PPP) approach locally within a protein sequence to discover short sequence signatures associated with functional sites. The approach is based on the basic scoring mechanism employed by PPP, namely the use of binomial distribution statistics to optimize sequence similarity cutoffs during searches of partitioned training sets. Results Here we illustrate and validate the ability of the SIMBAL method to find functionally relevant short sequence signatures by application to two well-characterized protein families. In the first example, we partitioned a family of ABC permeases using a metabolic background property (urea utilization). Thus, the TRUE set for this family comprised members whose genome of origin encoded a urea utilization system. By moving a sliding window across the sequence of a permease, and searching each subsequence in turn against the full set of partitioned proteins, the method found which local sequence signatures best correlated with the urea utilization trait. Mapping of SIMBAL "hot spots" onto crystal structures of homologous permeases reveals that the significant sites are gating determinants on the cytosolic face rather than, say, docking sites for the substrate-binding protein on the extracellular face. In the second example, we partitioned a protein methyltransferase family using gene proximity as a criterion. In this case, the TRUE set comprised those methyltransferases encoded near the gene for the substrate RF-1. SIMBAL identifies sequence regions that map onto the substrate-binding interface while ignoring regions involved in the methyltransferase reaction mechanism in general. Neither method for training set construction requires any prior experimental
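
    The sketch below illustrates the sliding-window idea combined with binomial statistics, using a trivial shared-k-mer count in place of a real sequence search; the sequences, labels, window length, and top-hit cut-off are all invented for illustration, and this is not the SIMBAL implementation.

      from math import comb

      def kmers(seq, k=3):
          return {seq[i:i + k] for i in range(len(seq) - k + 1)}

      def binom_sf(k, n, p):
          """P(X >= k) for X ~ Binomial(n, p)."""
          return sum(comb(n, i) * p ** i * (1 - p) ** (n - i) for i in range(k, n + 1))

      # Hypothetical partitioned training set: (sequence, genome_has_trait).
      training = [("MKTAYIAKQR", True), ("MKTAHIAKQL", True), ("GDSVVAKQRE", True),
                  ("PLLWNNARTE", False), ("QQGFEDSVNN", False), ("TTRLPLLWNA", False)]
      p_true = sum(lbl for _, lbl in training) / len(training)

      query = "MKTAYIAKQRGDSV"
      window = 6
      for start in range(len(query) - window + 1):
          sub = query[start:start + window]
          # Rank training sequences by a crude similarity (shared 3-mers) and take the top hits.
          ranked = sorted(training, key=lambda t: len(kmers(sub) & kmers(t[0])), reverse=True)
          top = ranked[:3]
          k_true = sum(lbl for _, lbl in top)
          score = binom_sf(k_true, len(top), p_true)   # how surprising is this many TRUE hits?
          print(f"window {start:2d} ({sub}): {k_true}/3 TRUE hits, P = {score:.3f}")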

  11. Adaptive fuzzy leader clustering of complex data sets in pattern recognition

    NASA Technical Reports Server (NTRS)

    Newton, Scott C.; Pemmaraju, Surya; Mitra, Sunanda

    1992-01-01

    A modular, unsupervised neural network architecture for clustering and classification of complex data sets is presented. The adaptive fuzzy leader clustering (AFLC) architecture is a hybrid neural-fuzzy system that learns on-line in a stable and efficient manner. The initial classification is performed in two stages: a simple competitive stage and a distance metric comparison stage. The cluster prototypes are then incrementally updated by relocating the centroid positions according to the fuzzy C-means equations for the centroids and the membership values. The AFLC algorithm is applied to the Anderson Iris data and laser-luminescent fingerprint image data. It is concluded that the AFLC algorithm successfully classifies features extracted from real data, discrete or continuous.
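
    For reference, a minimal fuzzy C-means update loop, i.e., the centroid and membership equations that the update stage above relies on; X is a hypothetical (n_samples, n_features) data matrix and the fuzzifier m = 2 is the usual default.

```python
# Minimal fuzzy C-means: alternate centroid and membership updates.
import numpy as np

def fcm(X, n_clusters=3, m=2.0, n_iter=100, seed=0):
    rng = np.random.default_rng(seed)
    U = rng.dirichlet(np.ones(n_clusters), size=len(X))          # initial fuzzy memberships
    for _ in range(n_iter):
        Um = U ** m
        centroids = Um.T @ X / Um.sum(axis=0)[:, None]           # membership-weighted centroids
        d = np.linalg.norm(X[:, None, :] - centroids[None, :, :], axis=2) + 1e-12
        U = 1.0 / (d ** (2.0 / (m - 1.0)))                       # inverse-distance memberships
        U /= U.sum(axis=1, keepdims=True)                        # normalize each sample's memberships
    return centroids, U
```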

  12. Prediction of settled water turbidity and optimal coagulant dosage in drinking water treatment plant using a hybrid model of k-means clustering and adaptive neuro-fuzzy inference system

    NASA Astrophysics Data System (ADS)

    Kim, Chan Moon; Parnichkun, Manukid

    2017-02-01

    Coagulation is an important process in drinking water treatment to attain acceptable treated water quality. However, the determination of coagulant dosage is still a challenging task for operators, because coagulation is a nonlinear and complicated process. Feedback control to achieve the desired treated water quality is difficult due to the lengthy process time. In this research, a hybrid of k-means clustering and adaptive neuro-fuzzy inference system (k-means-ANFIS) is proposed for settled water turbidity prediction and optimal coagulant dosage determination using full-scale historical data. To build a model that adapts well to the different process states arising from the influent water, raw water quality data are classified into four clusters according to their properties by a k-means clustering technique. The sub-models are developed individually on the basis of each clustered data set. Results reveal that the sub-models constructed by the hybrid k-means-ANFIS perform better than both a single ANFIS model and seasonal models built with an artificial neural network (ANN). The final model, consisting of the sub-models, shows more accurate and consistent predictions than a single ANFIS model and a single ANN model on all five evaluation indices. Therefore, the hybrid k-means-ANFIS model can be employed as a robust tool for managing both treated water quality and production costs simultaneously.
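
    A compact sketch of the cluster-then-model idea under stated assumptions: hypothetical arrays X_raw (raw-water quality features) and y (settled-water turbidity), with a gradient-boosting regressor standing in for the ANFIS sub-models.

```python
# Cluster the raw-water states, then fit one sub-model per cluster.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.ensemble import GradientBoostingRegressor

def fit_cluster_models(X_raw, y, n_clusters=4, seed=0):
    km = KMeans(n_clusters=n_clusters, random_state=seed, n_init=10).fit(X_raw)
    models = {}
    for c in range(n_clusters):
        mask = km.labels_ == c                                    # samples belonging to this process state
        models[c] = GradientBoostingRegressor(random_state=seed).fit(X_raw[mask], y[mask])
    return km, models

def predict(km, models, X_new):
    labels = km.predict(X_new)                                    # route each sample to its sub-model
    return np.array([models[c].predict(x.reshape(1, -1))[0] for c, x in zip(labels, X_new)])
```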

  13. Constraints on the conformation of the cytoplasmic face of dark-adapted and light-excited rhodopsin inferred from antirhodopsin antibody imprints

    PubMed Central

    Bailey, Brian W.; Mumey, Brendan; Hargrave, Paul A.; Arendt, Anatol; Ernst, Oliver P.; Hofmann, Klaus Peter; Callis, Patrik R.; Burritt, James B.; Jesaitis, Algirdas J.; Dratz, Edward A.

    2003-01-01

    Rhodopsin is the best-understood member of the large G protein–coupled receptor (GPCR) superfamily. The G-protein amplification cascade is triggered by poorly understood light-induced conformational changes in rhodopsin that are homologous to changes caused by agonists in other GPCRs. We have applied the "antibody imprint" method to light-activated rhodopsin in native membranes by using nine monoclonal antibodies (mAbs) against aqueous faces of rhodopsin. Epitopes recognized by these mAbs were found by selection from random peptide libraries displayed on phage. A new computer algorithm, FINDMAP, was used to map the epitopes to discontinuous segments of rhodopsin that are distant in the primary sequence but are in close spatial proximity in the structure. The proximity of a segment of the N-terminal and the loop between helices VI and VIII found by FINDMAP is consistent with the X-ray structure of the dark-adapted rhodopsin. Epitopes to the cytoplasmic face segregated into two classes with different predicted spatial proximities of protein segments that correlate with different preferences of the antibodies for stabilizing the metarhodopsin I or metarhodopsin II conformations of light-excited rhodopsin. Epitopes of antibodies that stabilize metarhodopsin II indicate conformational changes from dark-adapted rhodopsin, including rearrangements of the C-terminal tail and altered exposure of the cytoplasmic end of helix VI, a portion of the C-3 loop, and helix VIII. As additional antibodies are subjected to antibody imprinting, this approach should provide increasingly detailed information on the conformation of light-excited rhodopsin and be applicable to structural studies of other challenging protein targets. PMID:14573859

  14. Divergent evolution and molecular adaptation in the Drosophila odorant-binding protein family: inferences from sequence variation at the OS-E and OS-F genes

    PubMed Central

    2008-01-01

    Background The Drosophila Odorant-Binding Protein (Obp) genes constitute a multigene family with moderate gene number variation across species. The OS-E and OS-F genes are the two phylogenetically closest members of this family in the D. melanogaster genome. In this species, these genes are arranged in the same genomic cluster and likely arose by tandem gene duplication, the major mechanism proposed for the origin of new members in this olfactory-system family. Results We have analyzed the genomic cluster encompassing the OS-E and OS-F genes (Obp83 genomic region) to determine the role of functional divergence and molecular adaptation in the evolution of Obp family size. We compared nucleotide and amino acid variation across 18 Drosophila and 4 mosquito species applying a phylogenetic-based maximum likelihood approach complemented with information on the OBP three-dimensional structure and function. We show that, although the OS-E and OS-F genes are currently subject to similar and strong selective constraints, they likely underwent divergent evolution. Positive selection was likely involved in the functional diversification of new copies in the early stages after the gene duplication event; moreover, it might have shaped nucleotide variation of the OS-E gene concomitantly with the loss of functionally related members. In addition, molecular adaptation likely affecting the functional OBP conformational changes was supported by the analysis of the evolution of physicochemical properties of the OS-E protein and the location of the putative positively selected amino acids on the OBP three-dimensional structure. Conclusion Our results support that positive selection was likely involved in the functional differentiation of new copies of the OBP multigene family in the early stages after their birth by gene duplication; likewise, it might shape variation of some members of the family concomitantly with the loss of functionally related genes. Thus, the stochastic gene gain

  15. Inferring Horizontal Gene Transfer

    PubMed Central

    Lassalle, Florent; Dessimoz, Christophe

    2015-01-01

    Horizontal or Lateral Gene Transfer (HGT or LGT) is the transmission of portions of genomic DNA between organisms through a process decoupled from vertical inheritance. In the presence of HGT events, different fragments of the genome are the result of different evolutionary histories. This can therefore complicate the investigations of evolutionary relatedness of lineages and species. Also, as HGT can bring into genomes radically different genotypes from distant lineages, or even new genes bearing new functions, it is a major source of phenotypic innovation and a mechanism of niche adaptation. For example, of particular relevance to human health is the lateral transfer of antibiotic resistance and pathogenicity determinants, leading to the emergence of pathogenic lineages [1]. Computational identification of HGT events relies upon the investigation of sequence composition or evolutionary history of genes. Sequence composition-based ("parametric") methods search for deviations from the genomic average, whereas evolutionary history-based ("phylogenetic") approaches identify genes whose evolutionary history significantly differs from that of the host species. The evaluation and benchmarking of HGT inference methods typically rely upon simulated genomes, for which the true history is known. On real data, different methods tend to infer different HGT events, and as a result it can be difficult to ascertain all but simple and clear-cut HGT events. PMID:26020646

  16. Adaptive neuro-fuzzy inference system multi-objective optimization using the genetic algorithm/singular value decomposition method for modelling the discharge coefficient in rectangular sharp-crested side weirs

    NASA Astrophysics Data System (ADS)

    Khoshbin, Fatemeh; Bonakdari, Hossein; Hamed Ashraf Talesh, Seyed; Ebtehaj, Isa; Zaji, Amir Hossein; Azimi, Hamed

    2016-06-01

    In the present article, the adaptive neuro-fuzzy inference system (ANFIS) is employed to model the discharge coefficient in rectangular sharp-crested side weirs. The genetic algorithm (GA) is used for the optimum selection of membership functions, while the singular value decomposition (SVD) method helps in computing the linear parameters of the ANFIS consequent (output) part (GA/SVD-ANFIS). To conduct a sensitivity analysis, the effect of each dimensionless parameter on discharge coefficient prediction is examined in five different models built from the above-mentioned dimensionless parameters. Two different sets of experimental data are utilized to examine the models and obtain the best model. The study results indicate that the model designed through GA/SVD-ANFIS predicts the discharge coefficient with a good level of accuracy (mean absolute percentage error = 3.362 and root mean square error = 0.027). Moreover, comparing this method with existing equations and the multi-layer perceptron-artificial neural network (MLP-ANN) indicates that the GA/SVD-ANFIS method has superior performance in simulating the discharge coefficient of side weirs.
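
    One way to see the role of SVD here: with the membership functions (and hence the normalized rule firing strengths) fixed, the linear consequent parameters of a first-order Sugeno-type fuzzy system reduce to a linear least-squares problem, which can be solved via SVD. The sketch below is a generic illustration under these assumptions, not the authors' code.

```python
# Solve the ANFIS-style linear consequent parameters by SVD-based least squares.
import numpy as np

def consequent_params(W, X, y):
    """W: (n_samples, n_rules) normalized firing strengths,
       X: (n_samples, n_inputs) inputs, y: (n_samples,) targets."""
    n, r = W.shape
    X1 = np.hstack([X, np.ones((n, 1))])                  # append a bias column
    # design matrix: each rule i contributes w_i * [x, 1]
    A = np.hstack([W[:, [i]] * X1 for i in range(r)])
    theta, *_ = np.linalg.lstsq(A, y, rcond=None)         # lstsq solves via SVD internally
    return theta.reshape(r, X.shape[1] + 1)               # per-rule linear coefficients
```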

  17. Adaptive Importance Sampling for Control and Inference

    NASA Astrophysics Data System (ADS)

    Kappen, H. J.; Ruiz, H. C.

    2016-03-01

    Path integral (PI) control problems are a restricted class of non-linear control problems that can be solved formally as a Feynman-Kac PI and can be estimated using Monte Carlo sampling. In this contribution we review PI control theory in the finite horizon case. We subsequently focus on the problem of how to compute and represent control solutions. We review the most commonly used methods in robotics and control. Within the PI theory, the question of how to compute becomes the question of importance sampling. Efficient importance samplers are state feedback controllers, and their use requires an efficient representation. Learning and representing effective state-feedback controllers for non-linear stochastic control problems is a very challenging, and largely unsolved, problem. We show how to learn and represent such controllers using ideas from the cross entropy method. We derive a gradient descent method that allows one to learn feedback controllers using an arbitrary parametrisation. We refer to this method as the path integral cross entropy method or PICE. We illustrate this method for some simple examples. The PI control methods can be used to estimate the posterior distribution in latent state models. In neuroscience these problems arise when estimating connectivity from neural recording data using EM. We demonstrate the PI control method as an accurate alternative to particle filtering.
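
    As a generic illustration of the importance-sampling step (not the PICE algorithm itself), a self-normalized importance-sampling estimator of an expectation under a target density p using draws from a proposal q; the function arguments are hypothetical callables.

```python
# Self-normalized importance sampling: estimate E_p[f(x)] from proposal draws.
import numpy as np

def importance_estimate(f, log_p, sample_q, log_q, n=10_000, seed=0):
    rng = np.random.default_rng(seed)
    xs = sample_q(rng, n)                        # draws from the proposal q
    logw = log_p(xs) - log_q(xs)                 # log importance weights
    w = np.exp(logw - logw.max())                # stabilized, unnormalized weights
    w /= w.sum()
    ess = 1.0 / np.sum(w ** 2)                   # effective sample size: quality of the sampler
    return np.sum(w * f(xs)), ess
```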

  18. An Adaptive Network-based Fuzzy Inference System for the detection of thermal and TEC anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake of 11 August 2012

    NASA Astrophysics Data System (ADS)

    Akhoondzadeh, M.

    2013-09-01

    Anomaly detection is extremely important for forecasting the date, location and magnitude of an impending earthquake. In this paper, an Adaptive Network-based Fuzzy Inference System (ANFIS) is proposed to detect the thermal and Total Electron Content (TEC) anomalies around the time of the Varzeghan, Iran, (Mw = 6.4) earthquake that struck NW Iran on 11 August 2012. ANFIS is a well-known hybrid neuro-fuzzy network for modeling nonlinear complex systems. In this study, the thermal and TEC anomalies detected using the proposed method are also compared with the anomalies obtained by applying classical and intelligent methods, including the Interquartile, Auto-Regressive Integrated Moving Average (ARIMA), Artificial Neural Network (ANN) and Support Vector Machine (SVM) methods. The dataset, comprising Aqua-MODIS Land Surface Temperature (LST) night-time snapshot images and Global Ionospheric Maps (GIM), spans 62 days. If the difference between the value predicted by the ANFIS method and the observed value exceeds the pre-defined threshold, then, in the absence of non-seismic effective parameters, the observed precursor value can be regarded as a precursory anomaly. For the two precursors, LST and TEC, the ANFIS method shows very good agreement with the other implemented classical and intelligent methods, indicating that ANFIS is capable of detecting earthquake anomalies. The applied methods detected anomalous occurrences 1 and 2 days before the earthquake. The detection of the thermal and TEC anomalies derives its credibility from the overall efficiencies and potentialities of the five integrated methods.
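
    A minimal sketch of the thresholding step described above: flag a day as a precursory anomaly when the model's prediction error exceeds a pre-defined threshold, here taken (as an assumption) to be k standard deviations of the residuals.

```python
# Flag observations whose prediction error exceeds a k-sigma threshold.
import numpy as np

def flag_anomalies(observed, predicted, k=2.0):
    residuals = np.asarray(observed) - np.asarray(predicted)
    sigma = residuals.std()
    return np.abs(residuals) > k * sigma          # boolean mask of anomalous days
```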

  19. Long-range forecast of all India summer monsoon rainfall using adaptive neuro-fuzzy inference system: skill comparison with CFSv2 model simulation and real-time forecast for the year 2015

    NASA Astrophysics Data System (ADS)

    Chaudhuri, S.; Das, D.; Goswami, S.; Das, S. K.

    2016-11-01

    All India summer monsoon rainfall (AISMR) characteristics play a vital role in policy planning and the national economy of the country. In view of the significant impact of the monsoon system on regional as well as global climate systems, accurate prediction of summer monsoon rainfall has become a challenge. The objective of this study is to develop an adaptive neuro-fuzzy inference system (ANFIS) for long-range forecasting of AISMR. NCEP/NCAR reanalysis data of temperature and of zonal and meridional wind at different pressure levels have been taken to construct the input matrix of ANFIS. The membership of the input parameters for AISMR as high, medium or low is estimated with a trapezoidal membership function. The fuzzified standardized input parameters and the de-fuzzified target output are trained with artificial neural network models. The ANFIS forecast of AISMR is compared with a non-hybrid multi-layer perceptron model (MLP), a radial basis function network (RBFN) and a multiple linear regression (MLR) model. The forecast error analyses of the models reveal that ANFIS provides the best forecast of AISMR, with a minimum prediction error of 0.076, whereas the errors with the MLP, RBFN and MLR models are 0.22, 0.18 and 0.73 respectively. During validation against observations, ANFIS outperforms the comparative models. The performance of the ANFIS model is verified through different statistical skill scores, which also confirm its aptitude for forecasting AISMR. The forecast skill of ANFIS is also observed to be better than that of the Climate Forecast System version 2. The real-time forecast with ANFIS indicates the possibility of a deficit AISMR (65-75 cm) in the year 2015.
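
    For concreteness, a trapezoidal membership function of the kind used to fuzzify the standardized predictors into low/medium/high classes; the breakpoints in the usage line are purely illustrative.

```python
# Trapezoidal membership function mu(x; a, b, c, d) for fuzzifying standardized inputs.
import numpy as np

def trapmf(x, a, b, c, d):
    x = np.asarray(x, dtype=float)
    rise = np.clip((x - a) / (b - a), 0.0, 1.0) if b > a else (x >= a).astype(float)
    fall = np.clip((d - x) / (d - c), 0.0, 1.0) if d > c else (x <= d).astype(float)
    return np.minimum(rise, fall)                 # 1 on the plateau [b, c], tapering to 0 outside [a, d]

# e.g., membership of standardized wind anomalies in a hypothetical "medium" class
mu_medium = trapmf([-0.3, 0.1, 0.8], a=-0.5, b=-0.1, c=0.1, d=0.5)
```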

  20. Multiple Instance Fuzzy Inference

    DTIC Science & Technology

    2015-12-02

    A novel fuzzy learning framework that employs fuzzy inference to solve the problem of multiple instance learning (MIL) is presented. The framework introduces a ... or learned from data. In multiple instance problems, the training data is ambiguously labeled. Instances are grouped into bags, labels of bags are

  1. Bayesian Cosmological inference beyond statistical isotropy

    NASA Astrophysics Data System (ADS)

    Souradeep, Tarun; Das, Santanu; Wandelt, Benjamin

    2016-10-01

    With the advent of rich data sets, the computational challenge of inference in cosmology has been addressed with stochastic sampling methods. First, I review the widely used MCMC approach for inferring cosmological parameters and present an improved adaptive implementation, SCoPE, developed by our group. Next, I present a general method for Bayesian inference of the underlying covariance structure of random fields on a sphere. We employ the Bipolar Spherical Harmonic (BipoSH) representation of general covariance structure on the sphere. We illustrate the efficacy of the method with a principled approach to assess violation of statistical isotropy (SI) in the sky maps of Cosmic Microwave Background (CMB) fluctuations. The general, principled approach to Bayesian inference of the covariance structure in a random field on a sphere presented here has huge potential for application to many other aspects of cosmology and astronomy, as well as to more distant areas of research such as geosciences and climate modelling.

  2. Exploring Beginning Inference with Novice Grade 7 Students

    ERIC Educational Resources Information Center

    Watson, Jane M.

    2008-01-01

    This study documented efforts to facilitate ideas of beginning inference in novice grade 7 students. A design experiment allowed modified teaching opportunities in light of observation of components of a framework adapted from that developed by Pfannkuch for teaching informal inference with box plots. Box plots were replaced by hat plots, a…

  3. Inference in `poor` languages

    SciTech Connect

    Petrov, S.

    1996-10-01

    Languages with a solvable implication problem but without complete and consistent systems of inference rules (`poor` languages) are considered. The problem of the existence of a finite complete and consistent inference rule system for a `poor` language is stated independently of the language or rule syntax. Several properties of the problem are proved. An application of the results to the language of join dependencies is given.

  4. The Bayes Inference Engine

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.

    1996-04-01

    The authors are developing a computer application, called the Bayes Inference Engine, to provide the means to make inferences about models of physical reality within a Bayesian framework. The construction of complex nonlinear models is achieved by a fully object-oriented design. The models are represented by a data-flow diagram that may be manipulated by the analyst through a graphical programming environment. Maximum a posteriori solutions are achieved using a general, gradient-based optimization algorithm. The application incorporates a new technique of estimating and visualizing the uncertainties in specific aspects of the model.

  5. Reinforcement learning or active inference?

    PubMed

    Friston, Karl J; Daunizeau, Jean; Kiebel, Stefan J

    2009-07-29

    This paper questions the need for reinforcement learning or control theory when optimising behaviour. We show that it is fairly simple to teach an agent complicated and adaptive behaviours using a free-energy formulation of perception. In this formulation, agents adjust their internal states and sampling of the environment to minimize their free-energy. Such agents learn causal structure in the environment and sample it in an adaptive and self-supervised fashion. This results in behavioural policies that reproduce those optimised by reinforcement learning and dynamic programming. Critically, we do not need to invoke the notion of reward, value or utility. We illustrate these points by solving a benchmark problem in dynamic programming; namely the mountain-car problem, using active perception or inference under the free-energy principle. The ensuing proof-of-concept may be important because the free-energy formulation furnishes a unified account of both action and perception and may speak to a reappraisal of the role of dopamine in the brain.

  6. Reinforcement Learning or Active Inference?

    PubMed Central

    Friston, Karl J.; Daunizeau, Jean; Kiebel, Stefan J.

    2009-01-01

    This paper questions the need for reinforcement learning or control theory when optimising behaviour. We show that it is fairly simple to teach an agent complicated and adaptive behaviours using a free-energy formulation of perception. In this formulation, agents adjust their internal states and sampling of the environment to minimize their free-energy. Such agents learn causal structure in the environment and sample it in an adaptive and self-supervised fashion. This results in behavioural policies that reproduce those optimised by reinforcement learning and dynamic programming. Critically, we do not need to invoke the notion of reward, value or utility. We illustrate these points by solving a benchmark problem in dynamic programming; namely the mountain-car problem, using active perception or inference under the free-energy principle. The ensuing proof-of-concept may be important because the free-energy formulation furnishes a unified account of both action and perception and may speak to a reappraisal of the role of dopamine in the brain. PMID:19641614

  7. Inference as Prediction

    ERIC Educational Resources Information Center

    Watson, Jane

    2007-01-01

    Inference, or decision making, is seen in curriculum documents as the final step in a statistical investigation. For a formal statistical enquiry this may be associated with sophisticated tests involving probability distributions. For young students without the mathematical background to perform such tests, it is still possible to draw informal…

  8. Optical Inference Machines

    DTIC Science & Technology

    1988-06-27

    Keywords: optical artificial intelligence; optical inference engines; optical logic; optical information processing. ... common. They arise in areas such as expert systems and other artificial intelligence systems. In recent years, the computer science language PROLOG has ... optical processors should in principle be well suited for artificial intelligence applications. In recent years, symbolic logic processing ...

  9. Active inference and learning.

    PubMed

    Friston, Karl; FitzGerald, Thomas; Rigoli, Francesco; Schwartenbeck, Philipp; O'Doherty, John; Pezzulo, Giovanni

    2016-09-01

    This paper offers an active inference account of choice behaviour and learning. It focuses on the distinction between goal-directed and habitual behaviour and how they contextualise each other. We show that habits emerge naturally (and autodidactically) from sequential policy optimisation when agents are equipped with state-action policies. In active inference, behaviour has explorative (epistemic) and exploitative (pragmatic) aspects that are sensitive to ambiguity and risk respectively, where epistemic (ambiguity-resolving) behaviour enables pragmatic (reward-seeking) behaviour and the subsequent emergence of habits. Although goal-directed and habitual policies are usually associated with model-based and model-free schemes, we find the more important distinction is between belief-free and belief-based schemes. The underlying (variational) belief updating provides a comprehensive (if metaphorical) process theory for several phenomena, including the transfer of dopamine responses, reversal learning, habit formation and devaluation. Finally, we show that active inference reduces to a classical (Bellman) scheme, in the absence of ambiguity.

  10. Inverse Ising inference with correlated samples

    NASA Astrophysics Data System (ADS)

    Obermayer, Benedikt; Levine, Erel

    2014-12-01

    Correlations between two variables of a high-dimensional system can be indicative of an underlying interaction, but can also result from indirect effects. Inverse Ising inference is a method to distinguish one from the other. Essentially, the parameters of the least constrained statistical model are learned from the observed correlations such that direct interactions can be separated from indirect correlations. Among many other applications, this approach has been helpful for protein structure prediction, because residues which interact in the 3D structure often show correlated substitutions in a multiple sequence alignment. In this context, samples used for inference are not independent but share an evolutionary history on a phylogenetic tree. Here, we discuss the effects of correlations between samples on global inference. Such correlations could arise due to phylogeny but also via other slow dynamical processes. We present a simple analytical model to address the resulting inference biases, and develop an exact method accounting for background correlations in alignment data by combining phylogenetic modeling with an adaptive cluster expansion algorithm. We find that popular reweighting schemes are only marginally effective at removing phylogenetic bias, suggest a rescaling strategy that yields better results, and provide evidence that our conclusions carry over to the frequently used mean-field approach to the inverse Ising problem.
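
    A naive mean-field sketch of the inverse Ising step: couplings estimated as minus the inverse of the connected correlation matrix computed from (optionally reweighted) spin samples. This illustrates the basic inference only, not the adaptive cluster expansion or the phylogenetic correction discussed above.

```python
# Naive mean-field inverse Ising: J ~ -C^{-1} (off-diagonal), with optional sample weights.
import numpy as np

def mean_field_couplings(S, w=None):
    """S: (n_samples, n_spins) array of +/-1 spins; w: optional per-sample weights."""
    S = np.asarray(S, dtype=float)
    if w is None:
        w = np.ones(len(S))
    w = w / w.sum()
    m = w @ S                                          # weighted magnetizations
    C = (S * w[:, None]).T @ S - np.outer(m, m)        # connected correlation matrix
    J = -np.linalg.inv(C + 1e-6 * np.eye(C.shape[1]))  # regularized inversion
    np.fill_diagonal(J, 0.0)                           # no self-couplings
    return J
```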

  11. Visual Inference Programming

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin; Timucin, Dogan; Rabbette, Maura; Curry, Charles; Allan, Mark; Lvov, Nikolay; Clanton, Sam; Pilewskie, Peter

    2002-01-01

    The goal of visual inference programming is to develop a software framework data analysis and to provide machine learning algorithms for inter-active data exploration and visualization. The topics include: 1) Intelligent Data Understanding (IDU) framework; 2) Challenge problems; 3) What's new here; 4) Framework features; 5) Wiring diagram; 6) Generated script; 7) Results of script; 8) Initial algorithms; 9) Independent Component Analysis for instrument diagnosis; 10) Output sensory mapping virtual joystick; 11) Output sensory mapping typing; 12) Closed-loop feedback mu-rhythm control; 13) Closed-loop training; 14) Data sources; and 15) Algorithms. This paper is in viewgraph form.

  12. Circular inferences in schizophrenia.

    PubMed

    Jardri, Renaud; Denève, Sophie

    2013-11-01

    A considerable number of recent experimental and computational studies suggest that subtle impairments of excitatory to inhibitory balance or regulation are involved in many neurological and psychiatric conditions. The current paper aims to relate, specifically and quantitatively, excitatory to inhibitory imbalance with psychotic symptoms in schizophrenia. Considering that the brain constructs hierarchical causal models of the external world, we show that the failure to maintain the excitatory to inhibitory balance results in hallucinations as well as in the formation and subsequent consolidation of delusional beliefs. Indeed, the consequence of excitatory to inhibitory imbalance in a hierarchical neural network is equated to a pathological form of causal inference called 'circular belief propagation'. In circular belief propagation, bottom-up sensory information and top-down predictions are reverberated, i.e. prior beliefs are misinterpreted as sensory observations and vice versa. As a result, these predictions are counted multiple times. Circular inference explains the emergence of erroneous percepts, the patient's overconfidence when facing probabilistic choices, the learning of 'unshakable' causal relationships between unrelated events and a paradoxical immunity to perceptual illusions, which are all known to be associated with schizophrenia.

  13. Moment inference from tomograms

    USGS Publications Warehouse

    Day-Lewis, F. D.; Chen, Y.; Singha, K.

    2007-01-01

    Time-lapse geophysical tomography can provide valuable qualitative insights into hydrologic transport phenomena associated with aquifer dynamics, tracer experiments, and engineered remediation. Increasingly, tomograms are used to infer the spatial and/or temporal moments of solute plumes; these moments provide quantitative information about transport processes (e.g., advection, dispersion, and rate-limited mass transfer) and controlling parameters (e.g., permeability, dispersivity, and rate coefficients). The reliability of moments calculated from tomograms is, however, poorly understood because classic approaches to image appraisal (e.g., the model resolution matrix) are not directly applicable to moment inference. Here, we present a semi-analytical approach to construct a moment resolution matrix based on (1) the classic model resolution matrix and (2) image reconstruction from orthogonal moments. Numerical results for radar and electrical-resistivity imaging of solute plumes demonstrate that moment values calculated from tomograms depend strongly on plume location within the tomogram, survey geometry, regularization criteria, and measurement error. Copyright 2007 by the American Geophysical Union.
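
    A short sketch of the moment calculation itself: zeroth, first, and second central spatial moments of a plume from a 2-D tomogram, assuming a hypothetical image array img of shape (len(ys), len(xs)).

```python
# Spatial moments of an imaged plume: total mass, centre of mass, and spread.
import numpy as np

def plume_moments(img, xs, ys):
    X, Y = np.meshgrid(xs, ys, indexing="xy")          # coordinate grids matching img's layout
    m0 = img.sum()                                     # zeroth moment (total imaged mass)
    xc, yc = (img * X).sum() / m0, (img * Y).sum() / m0    # first moments (centre of mass)
    sxx = (img * (X - xc) ** 2).sum() / m0             # second central moment in x (spread)
    syy = (img * (Y - yc) ** 2).sum() / m0             # second central moment in y (spread)
    return m0, (xc, yc), (sxx, syy)
```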

  14. Conditional statistical inference with multistage testing designs.

    PubMed

    Zwitser, Robert J; Maris, Gunter

    2015-03-01

    In this paper it is demonstrated how statistical inference from multistage test designs can be made based on the conditional likelihood. Special attention is given to parameter estimation, as well as the evaluation of model fit. Two reasons are provided why the fit of simple measurement models is expected to be better in adaptive designs, compared to linear designs: more parameters are available for the same number of observations; and undesirable response behavior, like slipping and guessing, might be avoided owing to a better match between item difficulty and examinee proficiency. The results are illustrated with simulated data, as well as with real data.

  15. On the Inference of Functional Circadian Networks Using Granger Causality.

    PubMed

    Pourzanjani, Arya; Herzog, Erik D; Petzold, Linda R

    2015-01-01

    Being able to infer one-way direct connections in an oscillatory network such as the suprachiasmatic nucleus (SCN) of the mammalian brain using time series data is difficult but crucial to understanding network dynamics. Although techniques have been developed for inferring networks from time series data, there have been no attempts to adapt these techniques to infer directional connections in oscillatory time series while accurately distinguishing between direct and indirect connections. In this paper an adaptation of Granger Causality, called Adaptive Frequency Granger Causality (AFGC), is proposed that allows for inference of circadian networks and oscillatory networks in general. Additionally, an extension of this method, called LASSO AFGC, is proposed to infer networks with large numbers of cells. The method was validated using simulated data from several different networks. For the smaller networks the method was able to identify all one-way direct connections without identifying connections that were not present. For larger networks of up to twenty cells the method shows excellent performance in identifying true and false connections; this is quantified by an area under the curve (AUC) of 96.88%. We note that this method, like other Granger Causality-based methods, is based on the detection of high-frequency signals propagating between cell traces. Thus it requires a relatively high sampling rate and a network that can propagate high-frequency signals.
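
    For orientation, a bare-bones time-domain Granger-causality score (not the AFGC or LASSO AFGC extensions): does adding lagged values of x reduce the residual variance of an autoregressive model of y?

```python
# Pairwise Granger-causality score via restricted vs. full autoregressive fits.
import numpy as np

def granger_score(x, y, p=2):
    """Log ratio of residual sums of squares; > 0 suggests x helps predict y."""
    n = len(y)
    Y = y[p:]
    lags_y = np.column_stack([y[p - k:n - k] for k in range(1, p + 1)])   # y's own past
    lags_x = np.column_stack([x[p - k:n - k] for k in range(1, p + 1)])   # x's past
    ones = np.ones((n - p, 1))
    A_restricted = np.hstack([ones, lags_y])
    A_full = np.hstack([ones, lags_y, lags_x])
    rss = lambda A: np.sum((Y - A @ np.linalg.lstsq(A, Y, rcond=None)[0]) ** 2)
    return np.log(rss(A_restricted) / rss(A_full))
```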

  16. BIE: Bayesian Inference Engine

    NASA Astrophysics Data System (ADS)

    Weinberg, Martin D.

    2013-12-01

    The Bayesian Inference Engine (BIE) is an object-oriented library of tools written in C++ designed explicitly to enable Bayesian update and model comparison for astronomical problems. To facilitate "what if" exploration, BIE provides a command line interface (written with Bison and Flex) to run input scripts. The output of the code is a simulation of the Bayesian posterior distribution, from which summary statistics (e.g., moments, confidence intervals, and so forth) can be determined. All of these quantities are fundamentally integrals, and the Markov Chain approach produces variates θ distributed according to P(θ|D), so moments are trivially obtained by summing over the ensemble of variates.
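
    As the description notes, once MCMC variates θ ~ P(θ|D) are in hand, posterior summaries reduce to sums over the ensemble; a minimal illustration:

```python
# Posterior summaries from an array of MCMC draws (1-D or (n_draws, n_params)).
import numpy as np

def summarize(theta):
    mean = np.mean(theta, axis=0)                        # posterior mean
    var = np.var(theta, axis=0)                          # posterior variance
    lo, hi = np.percentile(theta, [2.5, 97.5], axis=0)   # 95% credible interval
    return mean, var, (lo, hi)
```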

  17. Bayesian inference in geomagnetism

    NASA Technical Reports Server (NTRS)

    Backus, George E.

    1988-01-01

    The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the earth core-mantle boundary from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.

  18. Bayes factors and multimodel inference

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.; Thomson, David L.; Cooch, Evan G.; Conroy, Michael J.

    2009-01-01

    Multimodel inference has two main themes: model selection, and model averaging. Model averaging is a means of making inference conditional on a model set, rather than on a selected model, allowing formal recognition of the uncertainty associated with model choice. The Bayesian paradigm provides a natural framework for model averaging, and provides a context for evaluation of the commonly used AIC weights. We review Bayesian multimodel inference, noting the importance of Bayes factors. Noting the sensitivity of Bayes factors to the choice of priors on parameters, we define and propose nonpreferential priors as offering a reasonable standard for objective multimodel inference.

  19. Children's and Adults' Ability to Build Online Emotional Inferences during Comprehension of Audiovisual and Auditory Texts

    ERIC Educational Resources Information Center

    Diergarten, Anna Katharina; Nieding, Gerhild

    2015-01-01

    Two studies examined inferences drawn about the protagonist's emotional state in movies (Study 1) or audiobooks (Study 2). Children aged 5, 8, and 10 years old and adults took part. Participants saw or heard 20 movie scenes or sections of audiobooks taken or adapted from the TV show Lassie. An online measure of emotional inference was designed…

  20. Improving Inferences from Multiple Methods.

    ERIC Educational Resources Information Center

    Shotland, R. Lance; Mark, Melvin M.

    1987-01-01

    Multiple evaluation methods (MEMs) can cause an inferential challenge, although there are strategies to strengthen inferences. Practical and theoretical issues involved in the use by social scientists of MEMs, three potential problems in drawing inferences from MEMs, and short- and long-term strategies for alleviating these problems are outlined.…

  1. Causal Inference and Developmental Psychology

    ERIC Educational Resources Information Center

    Foster, E. Michael

    2010-01-01

    Causal inference is of central importance to developmental psychology. Many key questions in the field revolve around improving the lives of children and their families. These include identifying risk factors that if manipulated in some way would foster child development. Such a task inherently involves causal inference: One wants to know whether…

  2. Causal Inference in Retrospective Studies.

    ERIC Educational Resources Information Center

    Holland, Paul W.; Rubin, Donald B.

    1988-01-01

    The problem of drawing causal inferences from retrospective case-controlled studies is considered. A model for causal inference in prospective studies is applied to retrospective studies. Limitations of case-controlled studies are formulated concerning relevant parameters that can be estimated in such studies. A coffee-drinking/myocardial…

  3. Social Inference Through Technology

    NASA Astrophysics Data System (ADS)

    Oulasvirta, Antti

    Awareness cues are computer-mediated, real-time indicators of people’s undertakings, whereabouts, and intentions. Already in the mid-1970s, UNIX users could use commands such as “finger” and “talk” to find out who was online and to chat. The small icons in instant messaging (IM) applications that indicate coconversants’ presence in the discussion space are the successors of “finger” output. Similar indicators can be found in online communities, media-sharing services, Internet relay chat (IRC), and location-based messaging applications. But presence and availability indicators are only the tip of the iceberg. Technological progress has enabled richer, more accurate, and more intimate indicators. For example, there are mobile services that allow friends to query and follow each other’s locations. Remote monitoring systems developed for health care allow relatives and doctors to assess the wellbeing of homebound patients (see, e.g., Tang and Venables 2000). But users also utilize cues that have not been deliberately designed for this purpose. For example, online gamers pay attention to other characters’ behavior to infer what the other players are like “in real life.” There is a common denominator underlying these examples: shared activities rely on the technology’s representation of the remote person. The other human being is not physically present but present only through a narrow technological channel.

  4. Graphical Models and Computerized Adaptive Testing.

    ERIC Educational Resources Information Center

    Almond, Russell G.; Mislevy, Robert J.

    1999-01-01

    Considers computerized adaptive testing from the perspective of graphical modeling (GM). GM provides methods for making inferences about multifaceted skills and knowledge and for extracting data from complex performances. Provides examples from language-proficiency assessment. (SLD)

  5. Bayesian Inference of Galaxy Morphology

    NASA Astrophysics Data System (ADS)

    Yoon, Ilsang; Weinberg, M.; Katz, N.

    2011-01-01

    Reliable inference on galaxy morphology from quantitative analysis of ensemble galaxy images is a challenging but essential ingredient in studying galaxy formation and evolution, utilizing current and forthcoming large scale surveys. To put the galaxy image decomposition problem in the broader context of statistical inference and derive rigorous statistical confidence levels for the inference, I developed a novel galaxy image decomposition tool, GALPHAT (GALaxy PHotometric ATtributes), that exploits recent developments in Bayesian computation to provide full posterior probability distributions and reliable confidence intervals for all parameters. I will highlight the significant improvements in galaxy image decomposition using GALPHAT over conventional model-fitting algorithms, and introduce the potential of GALPHAT to infer the statistical distribution of galaxy morphological structures, using ensemble posteriors of galaxy morphological parameters from the entire galaxy population under study.

  6. Flexible retrieval: When true inferences produce false memories.

    PubMed

    Carpenter, Alexis C; Schacter, Daniel L

    2017-03-01

    Episodic memory involves flexible retrieval processes that allow us to link together distinct episodes, make novel inferences across overlapping events, and recombine elements of past experiences when imagining future events. However, the same flexible retrieval and recombination processes that underpin these adaptive functions may also leave memory prone to error or distortion, such as source misattributions in which details of one event are mistakenly attributed to another related event. To determine whether the same recombination-related retrieval mechanism supports both successful inference and source memory errors, we developed a modified version of an associative inference paradigm in which participants encoded everyday scenes comprised of people, objects, and other contextual details. These scenes contained overlapping elements (AB, BC) that could later be linked to support novel inferential retrieval regarding elements that had not appeared together previously (AC). Our critical experimental manipulation concerned whether contextual details were probed before or after the associative inference test, thereby allowing us to assess whether (a) false memories increased for successful versus unsuccessful inferences, and (b) any such effects were specific to after compared with before participants received the inference test. In each of 4 experiments that used variants of this paradigm, participants were more susceptible to false memories for contextual details after successful than unsuccessful inferential retrieval, but only when contextual details were probed after the associative inference test. These results suggest that the retrieval-mediated recombination mechanism that underlies associative inference also contributes to source misattributions that result from combining elements of distinct episodes.

  7. Statistical Inference in Graphical Models

    DTIC Science & Technology

    2008-06-17

    Probabilistic Network Library (PNL). While not fully mature, PNL does provide the most commonly-used algorithms for inference and learning with the efficiency ... of C++, and also offers interfaces for calling the library from MATLAB and R. Notably, both BNT and PNL provide learning and inference algorithms ... mature and has been used for research purposes for several years, it is written in MATLAB and thus is not suitable to be used in real-time settings. PNL

  8. Statistical Inference: The Big Picture.

    PubMed

    Kass, Robert E

    2011-02-01

    Statistics has moved beyond the frequentist-Bayesian controversies of the past. Where does this leave our ability to interpret results? I suggest that a philosophy compatible with statistical practice, labelled here statistical pragmatism, serves as a foundation for inference. Statistical pragmatism is inclusive and emphasizes the assumptions that connect statistical models with observed data. I argue that introductory courses often mis-characterize the process of statistical inference and I propose an alternative "big picture" depiction.

  9. Bayesian Inference: with ecological applications

    USGS Publications Warehouse

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.

  10. Inferring the Why in Images

    DTIC Science & Technology

    2014-01-01

    To our knowledge, this challenging problem has not yet been extensively explored in computer vision. We present a novel learning-based framework that uses high-level visual recognition to automatically infer why people are performing actions in images, by learning from visual data and written language.

  11. Active inference, communication and hermeneutics

    PubMed Central

    Friston, Karl J.; Frith, Christopher D.

    2015-01-01

    Hermeneutics refers to interpretation and translation of text (typically ancient scriptures) but also applies to verbal and non-verbal communication. In a psychological setting it nicely frames the problem of inferring the intended content of a communication. In this paper, we offer a solution to the problem of neural hermeneutics based upon active inference. In active inference, action fulfils predictions about how we will behave (e.g., predicting we will speak). Crucially, these predictions can be used to predict both self and others – during speaking and listening respectively. Active inference mandates the suppression of prediction errors by updating an internal model that generates predictions – both at fast timescales (through perceptual inference) and slower timescales (through perceptual learning). If two agents adopt the same model, then – in principle – they can predict each other and minimise their mutual prediction errors. Heuristically, this ensures they are singing from the same hymn sheet. This paper builds upon recent work on active inference and communication to illustrate perceptual learning using simulated birdsongs. Our focus here is the neural hermeneutics implicit in learning, where communication facilitates long-term changes in generative models that are trying to predict each other. In other words, communication induces perceptual learning and enables others to (literally) change our minds and vice versa. PMID:25957007

  12. Active inference, communication and hermeneutics.

    PubMed

    Friston, Karl J; Frith, Christopher D

    2015-07-01

    Hermeneutics refers to interpretation and translation of text (typically ancient scriptures) but also applies to verbal and non-verbal communication. In a psychological setting it nicely frames the problem of inferring the intended content of a communication. In this paper, we offer a solution to the problem of neural hermeneutics based upon active inference. In active inference, action fulfils predictions about how we will behave (e.g., predicting we will speak). Crucially, these predictions can be used to predict both self and others--during speaking and listening respectively. Active inference mandates the suppression of prediction errors by updating an internal model that generates predictions--both at fast timescales (through perceptual inference) and slower timescales (through perceptual learning). If two agents adopt the same model, then--in principle--they can predict each other and minimise their mutual prediction errors. Heuristically, this ensures they are singing from the same hymn sheet. This paper builds upon recent work on active inference and communication to illustrate perceptual learning using simulated birdsongs. Our focus here is the neural hermeneutics implicit in learning, where communication facilitates long-term changes in generative models that are trying to predict each other. In other words, communication induces perceptual learning and enables others to (literally) change our minds and vice versa.

  13. Causal inference and developmental psychology.

    PubMed

    Foster, E Michael

    2010-11-01

    Causal inference is of central importance to developmental psychology. Many key questions in the field revolve around improving the lives of children and their families. These include identifying risk factors that if manipulated in some way would foster child development. Such a task inherently involves causal inference: One wants to know whether the risk factor actually causes outcomes. Random assignment is not possible in many instances, and for that reason, psychologists must rely on observational studies. Such studies identify associations, and causal interpretation of such associations requires additional assumptions. Research in developmental psychology generally has relied on various forms of linear regression, but this methodology has limitations for causal inference. Fortunately, methodological developments in various fields are providing new tools for causal inference-tools that rely on more plausible assumptions. This article describes the limitations of regression for causal inference and describes how new tools might offer better causal inference. This discussion highlights the importance of properly identifying covariates to include (and exclude) from the analysis. This discussion considers the directed acyclic graph for use in accomplishing this task. With the proper covariates having been chosen, many of the available methods rely on the assumption of "ignorability." The article discusses the meaning of ignorability and considers alternatives to this assumption, such as instrumental variables estimation. Finally, the article considers the use of the tools discussed in the context of a specific research question, the effect of family structure on child development.

  14. Optimal inference with suboptimal models: Addiction and active Bayesian inference

    PubMed Central

    Schwartenbeck, Philipp; FitzGerald, Thomas H.B.; Mathys, Christoph; Dolan, Ray; Wurst, Friedrich; Kronbichler, Martin; Friston, Karl

    2015-01-01

    When casting behaviour as active (Bayesian) inference, optimal inference is defined with respect to an agent’s beliefs – based on its generative model of the world. This contrasts with normative accounts of choice behaviour, in which optimal actions are considered in relation to the true structure of the environment – as opposed to the agent’s beliefs about worldly states (or the task). This distinction shifts an understanding of suboptimal or pathological behaviour away from aberrant inference as such, to understanding the prior beliefs of a subject that cause them to behave less ‘optimally’ than our prior beliefs suggest they should behave. Put simply, suboptimal or pathological behaviour does not speak against understanding behaviour in terms of (Bayes optimal) inference, but rather calls for a more refined understanding of the subject’s generative model upon which their (optimal) Bayesian inference is based. Here, we discuss this fundamental distinction and its implications for understanding optimality, bounded rationality and pathological (choice) behaviour. We illustrate our argument using addictive choice behaviour in a recently described ‘limited offer’ task. Our simulations of pathological choices and addictive behaviour also generate some clear hypotheses, which we hope to pursue in ongoing empirical work. PMID:25561321

  15. Design-based and model-based inference in surveys of freshwater mollusks

    USGS Publications Warehouse

    Dorazio, R.M.

    1999-01-01

    Well-known concepts in statistical inference and sampling theory are used to develop recommendations for planning and analyzing the results of quantitative surveys of freshwater mollusks. Two methods of inference commonly used in survey sampling (design-based and model-based) are described and illustrated using examples relevant in surveys of freshwater mollusks. The particular objectives of a survey and the type of information observed in each unit of sampling can be used to help select the sampling design and the method of inference. For example, the mean density of a sparsely distributed population of mollusks can be estimated with higher precision by using model-based inference or by using design-based inference with adaptive cluster sampling than by using design-based inference with conventional sampling. More experience with quantitative surveys of natural assemblages of freshwater mollusks is needed to determine the actual benefits of different sampling designs and inferential procedures.

  16. cosmoabc: Likelihood-free inference for cosmology

    NASA Astrophysics Data System (ADS)

    Ishida, Emille E. O.; Vitenti, Sandro D. P.; Penna-Lima, Mariana; Trindade, Arlindo M.; Cisewski, Jessi; de Souza, Rafael; Cameron, Ewan; Busti, Vinicius C.

    2015-05-01

    Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogs. cosmoabc is a Python Approximate Bayesian Computation (ABC) sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code can be coupled to an external simulator to allow incorporation of arbitrary distance and prior functions. When coupled with the numcosmo library, it has been used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy clusters number counts without computing the likelihood function.
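
    A generic ABC rejection sketch for orientation (deliberately not the cosmoabc API, which builds Population Monte Carlo and adaptive importance sampling on top of this basic idea): keep prior draws whose simulated summaries land within epsilon of the observed ones. All callables are hypothetical.

```python
# Basic ABC rejection sampling: accept parameters whose mock data resemble the observations.
import numpy as np

def abc_rejection(observed_summary, prior_sample, simulate, distance,
                  n_draws=100_000, epsilon=0.1, seed=0):
    rng = np.random.default_rng(seed)
    accepted = []
    for _ in range(n_draws):
        theta = prior_sample(rng)                       # draw parameters from the prior
        sim = simulate(theta, rng)                      # forward-simulate a mock catalogue
        if distance(sim, observed_summary) < epsilon:   # compare mock and observed summaries
            accepted.append(theta)
    return np.array(accepted)                           # approximate posterior draws
```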

  17. Psychotic Experiences and Overhasty Inferences Are Related to Maladaptive Learning.

    PubMed

    Stuke, Heiner; Stuke, Hannes; Weilnhammer, Veith Andreas; Schmack, Katharina

    2017-01-01

    Theoretical accounts suggest that an alteration in the brain's learning mechanisms might lead to overhasty inferences, resulting in psychotic symptoms. Here, we sought to elucidate the suggested link between maladaptive learning and psychosis. Ninety-eight healthy individuals with varying degrees of delusional ideation and hallucinatory experiences performed a probabilistic reasoning task that allowed us to quantify overhasty inferences. Replicating previous results, we found a relationship between psychotic experiences and overhasty inferences during probabilistic reasoning. Computational modelling revealed that the behavioral data was best explained by a novel computational learning model that formalizes the adaptiveness of learning by a non-linear distortion of prediction error processing, where an increased non-linearity implies a growing resilience against learning from surprising and thus unreliable information (large prediction errors). Most importantly, a decreased adaptiveness of learning predicted delusional ideation and hallucinatory experiences. Our current findings provide a formal description of the computational mechanisms underlying overhasty inferences, thereby empirically substantiating theories that link psychosis to maladaptive learning.

  18. Psychotic Experiences and Overhasty Inferences Are Related to Maladaptive Learning

    PubMed Central

    Stuke, Heiner; Stuke, Hannes; Weilnhammer, Veith Andreas; Schmack, Katharina

    2017-01-01

    Theoretical accounts suggest that an alteration in the brain’s learning mechanisms might lead to overhasty inferences, resulting in psychotic symptoms. Here, we sought to elucidate the suggested link between maladaptive learning and psychosis. Ninety-eight healthy individuals with varying degrees of delusional ideation and hallucinatory experiences performed a probabilistic reasoning task that allowed us to quantify overhasty inferences. Replicating previous results, we found a relationship between psychotic experiences and overhasty inferences during probabilistic reasoning. Computational modelling revealed that the behavioral data was best explained by a novel computational learning model that formalizes the adaptiveness of learning by a non-linear distortion of prediction error processing, where an increased non-linearity implies a growing resilience against learning from surprising and thus unreliable information (large prediction errors). Most importantly, a decreased adaptiveness of learning predicted delusional ideation and hallucinatory experiences. Our current findings provide a formal description of the computational mechanisms underlying overhasty inferences, thereby empirically substantiating theories that link psychosis to maladaptive learning. PMID:28107344

  19. Statistical inference and string theory

    NASA Astrophysics Data System (ADS)

    Heckman, Jonathan J.

    2015-09-01

    In this paper, we expose some surprising connections between string theory and statistical inference. We consider a large collective of agents sweeping out a family of nearby statistical models for an M-dimensional manifold of statistical fitting parameters. When the agents making nearby inferences align along a d-dimensional grid, we find that the pooled probability that the collective reaches a correct inference is the partition function of a nonlinear sigma model in d dimensions. Stability under perturbations to the original inference scheme requires the agents of the collective to distribute along two dimensions. Conformal invariance of the sigma model corresponds to the condition of a stable inference scheme, directly leading to the Einstein field equations for classical gravity. By summing over all possible arrangements of the agents in the collective, we reach a string theory. We also use this perspective to quantify how much an observer can hope to learn about the internal geometry of a superstring compactification. Finally, we present some brief speculative remarks on applications to the AdS/CFT correspondence and Lorentzian signature space-times.

  20. Computerized Adaptive Mastery Tests as Expert Systems.

    ERIC Educational Resources Information Center

    Frick, Theodore W.

    1992-01-01

    Discussion of expert systems and computerized adaptive tests describes two versions of EXSPRT, a new approach that combines uncertain inference in expert systems with sequential probability ratio test (SPRT) stopping rules. Results of two studies comparing EXSPRT to adaptive mastery testing based on item response theory and SPRT approaches are…
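
    The SPRT stopping rule referenced above can be illustrated with a minimal sketch for a mastery decision from right/wrong item responses; the mastery/non-mastery proportions and error rates below are assumed for illustration and are not taken from EXSPRT:

      # Minimal sequential probability ratio test (SPRT) stopping rule for a
      # mastery decision from right/wrong item responses (illustrative values).
      import math

      def sprt(responses, p_master=0.85, p_nonmaster=0.55, alpha=0.05, beta=0.05):
          upper = math.log((1 - beta) / alpha)      # cross this: accept "master"
          lower = math.log(beta / (1 - alpha))      # cross this: accept "non-master"
          llr = 0.0
          for i, correct in enumerate(responses, start=1):
              p1 = p_master if correct else 1 - p_master
              p0 = p_nonmaster if correct else 1 - p_nonmaster
              llr += math.log(p1 / p0)
              if llr >= upper:
                  return "master", i
              if llr <= lower:
                  return "non-master", i
          return "undecided", len(responses)

      print(sprt([1, 1, 0, 1, 1, 1, 1, 1]))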

  1. Reinforcement and inference in cross-situational word learning

    PubMed Central

    Tilles, Paulo F. C.; Fontanari, José F.

    2013-01-01

    Cross-situational word learning is based on the notion that a learner can determine the referent of a word by finding something in common across many observed uses of that word. Here we propose an adaptive learning algorithm that contains a parameter that controls the strength of the reinforcement applied to associations between concurrent words and referents, and a parameter that regulates inference, which includes built-in biases, such as mutual exclusivity, and information of past learning events. By adjusting these parameters so that the model predictions agree with data from representative experiments on cross-situational word learning, we were able to explain the learning strategies adopted by the participants of those experiments in terms of a trade-off between reinforcement and inference. These strategies can vary wildly depending on the conditions of the experiments. For instance, for fast mapping experiments (i.e., the correct referent could, in principle, be inferred in a single observation) inference is prevalent, whereas for segregated contextual diversity experiments (i.e., the referents are separated in groups and are exhibited with members of their groups only) reinforcement is predominant. Other experiments are explained with more balanced doses of reinforcement and inference. PMID:24312030
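
    A toy sketch of a cross-situational learner with one parameter scaling reinforcement of co-occurring word-referent pairs and one scaling an exclusivity-style inference bias; the update rule and parameter names are illustrative stand-ins, not the authors' algorithm:

      # Illustrative cross-situational learner: chi scales reinforcement of
      # co-occurring word-referent pairs, while the inference term (a mutual
      # exclusivity bias) discounts referents already strongly linked elsewhere.
      import numpy as np

      def learn(trials, n_words, n_refs, chi=0.6, inference=0.3):
          A = np.ones((n_words, n_refs))            # association strengths
          for words, refs in trials:                # each trial: words and referents shown together
              for w in words:
                  for r in refs:
                      known = A[:, r].max()
                      # reinforce, discounted if r already "belongs" to some word
                      A[w, r] += chi * (1.0 - inference * (known - 1.0))
          return A / A.sum(axis=1, keepdims=True)   # row-normalise to word->referent probabilities

      trials = [([0, 1], [0, 1]), ([0, 2], [0, 2]), ([1, 2], [1, 2])]
      print(learn(trials, n_words=3, n_refs=3).round(2))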

  2. Intracranial EEG correlates of implicit relational inference within the hippocampus.

    PubMed

    Reber, T P; Do Lam, A T A; Axmacher, N; Elger, C E; Helmstaedter, C; Henke, K; Fell, J

    2016-01-01

    Drawing inferences from past experiences enables adaptive behavior in future situations. Inference has been shown to depend on hippocampal processes. Usually, inference is considered a deliberate and effortful mental act which happens during retrieval, and requires the focus of our awareness. Recent fMRI studies hint at the possibility that some forms of hippocampus-dependent inference can also occur during encoding and possibly also outside of awareness. Here, we sought to further explore the feasibility of hippocampal implicit inference, and specifically address the temporal evolution of implicit inference using intracranial EEG. Presurgical epilepsy patients with hippocampal depth electrodes viewed a sequence of word pairs, and judged the semantic fit between two words in each pair. Some of the word pairs entailed a common word (e.g., "winter-red," "red-cat") such that an indirect relation was established in following word pairs (e.g., "winter-cat"). The behavioral results suggested that drawing inference implicitly from past experience is feasible because indirect relations seemed to foster "fit" judgments while the absence of indirect relations fostered "do not fit" judgments, even though the participants were unaware of the indirect relations. An event-related potential (ERP) difference emerging 400 ms post-stimulus was evident in the hippocampus during encoding, suggesting that indirect relations were already established automatically during encoding of the overlapping word pairs. Further ERP differences emerged later post-stimulus (1,500 ms), were modulated by the participants' responses and were evident during encoding and test. Furthermore, response-locked ERP effects were evident at test. These ERP effects could hence be a correlate of the interaction of implicit memory with decision-making. Together, the data map out a time-course in which the hippocampus automatically integrates memories from discrete but related episodes to implicitly influence future

  3. Locative inferences in medical texts.

    PubMed

    Mayer, P S; Bailey, G H; Mayer, R J; Hillis, A; Dvoracek, J E

    1987-06-01

    Medical research relies on epidemiological studies conducted on a large set of clinical records that have been collected from physicians recording individual patient observations. These clinical records are recorded for the purpose of individual care of the patient with little consideration for their use by a biostatistician interested in studying a disease over a large population. Natural language processing of clinical records for epidemiological studies must deal with temporal, locative, and conceptual issues. This makes text understanding and data extraction of clinical records an excellent area for applied research. While much has been done in making temporal or conceptual inferences in medical texts, parallel work in locative inferences has not been done. This paper examines the locative inferences as well as the integration of temporal, locative, and conceptual issues in the clinical record understanding domain by presenting an application that utilizes two key concepts in its parsing strategy--a knowledge-based parsing strategy and a minimal lexicon.

  4. How Forgetting Aids Heuristic Inference

    ERIC Educational Resources Information Center

    Schooler, Lael J.; Hertwig, Ralph

    2005-01-01

    Some theorists, ranging from W. James (1890) to contemporary psychologists, have argued that forgetting is the key to proper functioning of memory. The authors elaborate on the notion of beneficial forgetting by proposing that loss of information aids inference heuristics that exploit mnemonic information. To this end, the authors bring together 2…

  5. Science Shorts: Observation versus Inference

    ERIC Educational Resources Information Center

    Leager, Craig R.

    2008-01-01

    When you observe something, how do you know for sure what you are seeing, feeling, smelling, or hearing? Asking students to think critically about their encounters with the natural world will help to strengthen their understanding and application of the science-process skills of observation and inference. In the following lesson, students make…

  6. The mechanisms of temporal inference

    NASA Technical Reports Server (NTRS)

    Fox, B. R.; Green, S. R.

    1987-01-01

    The properties of a temporal language are determined by its constituent elements: the temporal objects which it can represent, the attributes of those objects, the relationships between them, the axioms which define the default relationships, and the rules which define the statements that can be formulated. The methods of inference which can be applied to a temporal language are derived in part from a small number of axioms which define the meaning of equality and order and how those relationships can be propagated. More complex inferences involve detailed analysis of the stated relationships. Perhaps the most challenging area of temporal inference is reasoning over disjunctive temporal constraints. Simple forms of disjunction do not sufficiently increase the expressive power of a language while unrestricted use of disjunction makes the analysis NP-hard. In many cases a set of disjunctive constraints can be converted to disjunctive normal form and familiar methods of inference can be applied to the conjunctive sub-expressions. This process itself is NP-hard but it is made more tractable by careful expansion of a tree-structured search space.
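
    One familiar method for the conjunctive sub-expressions mentioned above treats point constraints as difference bounds and checks consistency by negative-cycle detection; a minimal sketch, assuming constraints of the form t_j - t_i <= c:

      # Consistency check for one conjunctive set of temporal difference constraints
      # (t_j - t_i <= c) via Bellman-Ford negative-cycle detection; a disjunctive
      # formula in DNF is satisfiable if any of its conjuncts passes this check.
      def consistent(n_points, constraints):
          dist = [0.0] * n_points                   # implicit source connected to every point
          for _ in range(n_points):
              changed = False
              for i, j, c in constraints:           # edge i -> j with weight c encodes t_j - t_i <= c
                  if dist[i] + c < dist[j]:
                      dist[j] = dist[i] + c
                      changed = True
              if not changed:
                  return True
          return False                              # still relaxing after n passes: negative cycle

      # t1 - t0 <= 5, t2 - t1 <= 3, t0 - t2 <= -9 is inconsistent (5 + 3 < 9)
      print(consistent(3, [(0, 1, 5), (1, 2, 3), (2, 0, -9)]))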

  7. Statistical inference and Aristotle's Rhetoric.

    PubMed

    Macdonald, Ranald R

    2004-11-01

    Formal logic operates in a closed system where all the information relevant to any conclusion is present, whereas this is not the case when one reasons about events and states of the world. Pollard and Richardson drew attention to the fact that the reasoning behind statistical tests does not lead to logically justifiable conclusions. In this paper statistical inferences are defended not by logic but by the standards of everyday reasoning. Aristotle invented formal logic, but argued that people mostly get at the truth with the aid of enthymemes--incomplete syllogisms which include arguing from examples, analogies and signs. It is proposed that statistical tests work in the same way--in that they are based on examples, invoke the analogy of a model and use the size of the effect under test as a sign that the chance hypothesis is unlikely. Of existing theories of statistical inference only a weak version of Fisher's takes this into account. Aristotle anticipated Fisher by producing an argument of the form that there were too many cases in which an outcome went in a particular direction for that direction to be plausibly attributed to chance. We can therefore conclude that Aristotle would have approved of statistical inference and there is a good reason for calling this form of statistical inference classical.

  8. Word Learning as Bayesian Inference

    ERIC Educational Resources Information Center

    Xu, Fei; Tenenbaum, Joshua B.

    2007-01-01

    The authors present a Bayesian framework for understanding how adults and children learn the meanings of words. The theory explains how learners can generalize meaningfully from just one or a few positive examples of a novel word's referents, by making rational inductive inferences that integrate prior knowledge about plausible word meanings with…

  9. Starfish: Robust spectroscopic inference tools

    NASA Astrophysics Data System (ADS)

    Czekala, Ian; Andrews, Sean M.; Mandel, Kaisey S.; Hogg, David W.; Green, Gregory M.

    2015-05-01

    Starfish is a set of tools used for spectroscopic inference. It robustly determines stellar parameters using high resolution spectral models and uses Markov Chain Monte Carlo (MCMC) to explore the full posterior probability distribution of the stellar parameters. Additional potential applications include other types of spectra, such as unresolved stellar clusters or supernovae spectra.

  10. Improving Explanatory Inferences from Assessments

    ERIC Educational Resources Information Center

    Diakow, Ronli Phyllis

    2013-01-01

    This dissertation comprises three papers that propose, discuss, and illustrate models to make improved inferences about research questions regarding student achievement in education. Addressing the types of questions common in educational research today requires three different "extensions" to traditional educational assessment: (1)…

  11. Perceptual Inference and Autistic Traits

    ERIC Educational Resources Information Center

    Skewes, Joshua C; Jegindø, Else-Marie; Gebauer, Line

    2015-01-01

    Autistic people are better at perceiving details. Major theories explain this in terms of bottom-up sensory mechanisms or in terms of top-down cognitive biases. Recently, it has become possible to link these theories within a common framework. This framework assumes that perception is implicit neural inference, combining sensory evidence with…

  12. Towards General Algorithms for Grammatical Inference

    NASA Astrophysics Data System (ADS)

    Clark, Alexander

    Many algorithms for grammatical inference can be viewed as instances of a more general algorithm which maintains a set of primitive elements, which distributionally define sets of strings, and a set of features or tests that constrain various inference rules. Using this general framework, which we cast as a process of logical inference, we re-analyse Angluin's famous lstar algorithm and several recent algorithms for the inference of context-free grammars and multiple context-free grammars. Finally, to illustrate the advantages of this approach, we extend it to the inference of functional transductions from positive data only, and we present a new algorithm for the inference of finite state transducers.

  13. Adaptive fuzzy neural network control design via a T-S fuzzy model for a robot manipulator including actuator dynamics.

    PubMed

    Wai, Rong-Jong; Yang, Zhi-Wei

    2008-10-01

    This paper focuses on the development of adaptive fuzzy neural network control (AFNNC), including indirect and direct frameworks for an n-link robot manipulator, to achieve high-precision position tracking. In general, it is difficult to adopt a model-based design to achieve this control objective due to the uncertainties in practical applications, such as friction forces, external disturbances, and parameter variations. In order to cope with this problem, an indirect AFNNC (IAFNNC) scheme and a direct AFNNC (DAFNNC) strategy are investigated without the requirement of prior system information. In these model-free control topologies, a continuous-time Takagi-Sugeno (T-S) dynamic fuzzy model with online learning ability is constructed to represent the system dynamics of an n-link robot manipulator. In the IAFNNC, an FNN estimator is designed to tune the nonlinear dynamic function vector in fuzzy local models, and then, the estimative vector is used to indirectly develop a stable IAFNNC law. In the DAFNNC, an FNN controller is directly designed to imitate a predetermined model-based stabilizing control law, and then, the stable control performance can be achieved by only using joint position information. All the IAFNNC and DAFNNC laws and the corresponding adaptive tuning algorithms for FNN weights are established in the sense of Lyapunov stability analyses to ensure the stable control performance. Numerical simulations and experimental results of a two-link robot manipulator actuated by dc servomotors are given to verify the effectiveness and robustness of the proposed methodologies. In addition, the superiority of the proposed control schemes is indicated in comparison with proportional-differential control, fuzzy-model-based control, T-S-type FNN control, and robust neural fuzzy network control systems.
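
    For readers unfamiliar with the T-S formalism, a minimal Takagi-Sugeno inference step is sketched below: local linear consequents blended by normalised Gaussian firing strengths. The rules, centres and gains are illustrative and are not the paper's controller:

      # Minimal Takagi-Sugeno fuzzy model: local linear models blended by
      # normalised Gaussian rule firing strengths (illustrative rules/parameters).
      import numpy as np

      centers = np.array([-1.0, 0.0, 1.0])          # rule premise centres on the scheduling variable
      sigma = 0.6
      local_gain = np.array([[2.0, 0.5], [1.0, 0.2], [0.5, 0.1]])   # rule i: y_i = a_i*x1 + b_i*x2

      def ts_output(z, x):
          w = np.exp(-0.5 * ((z - centers) / sigma) ** 2)    # firing strengths
          w = w / w.sum()                                    # normalise
          local_outputs = local_gain @ x                     # each rule's linear consequent
          return float(w @ local_outputs)                    # weighted blend

      print(ts_output(z=0.3, x=np.array([1.0, 2.0])))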

  14. Synaptic and nonsynaptic plasticity approximating probabilistic inference

    PubMed Central

    Tully, Philip J.; Hennig, Matthias H.; Lansner, Anders

    2014-01-01

    Learning and memory operations in neural circuits are believed to involve molecular cascades of synaptic and nonsynaptic changes that lead to a diverse repertoire of dynamical phenomena at higher levels of processing. Hebbian and homeostatic plasticity, neuromodulation, and intrinsic excitability all conspire to form and maintain memories. But it is still unclear how these seemingly redundant mechanisms could jointly orchestrate learning in a more unified system. To this end, a Hebbian learning rule for spiking neurons inspired by Bayesian statistics is proposed. In this model, synaptic weights and intrinsic currents are adapted on-line upon arrival of single spikes, which initiate a cascade of temporally interacting memory traces that locally estimate probabilities associated with relative neuronal activation levels. Trace dynamics enable synaptic learning to readily demonstrate a spike-timing dependence, stably return to a set-point over long time scales, and remain competitive despite this stability. Beyond unsupervised learning, linking the traces with an external plasticity-modulating signal enables spike-based reinforcement learning. At the postsynaptic neuron, the traces are represented by an activity-dependent ion channel that is shown to regulate the input received by a postsynaptic cell and generate intrinsic graded persistent firing levels. We show how spike-based Hebbian-Bayesian learning can be performed in a simulated inference task using integrate-and-fire (IAF) neurons that are Poisson-firing and background-driven, similar to the preferred regime of cortical neurons. Our results support the view that neurons can represent information in the form of probability distributions, and that probabilistic inference could be a functional by-product of coupled synaptic and nonsynaptic mechanisms operating over several timescales. The model provides a biophysical realization of Bayesian computation by reconciling several observed neural phenomena whose

  15. Statistical learning and selective inference

    PubMed Central

    Taylor, Jonathan; Tibshirani, Robert J.

    2015-01-01

    We describe the problem of “selective inference.” This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have “cherry-picked”—searched for the strongest associations—means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis. PMID:26100887

  16. Causal inference based on counterfactuals

    PubMed Central

    Höfler, M

    2005-01-01

    Background The counterfactual or potential outcome model has become increasingly standard for causal inference in epidemiological and medical studies. Discussion This paper provides an overview on the counterfactual and related approaches. A variety of conceptual as well as practical issues when estimating causal effects are reviewed. These include causal interactions, imperfect experiments, adjustment for confounding, time-varying exposures, competing risks and the probability of causation. It is argued that the counterfactual model of causal effects captures the main aspects of causality in health sciences and relates to many statistical procedures. Summary Counterfactuals are the basis of causal inference in medicine and epidemiology. Nevertheless, the estimation of counterfactual differences pose several difficulties, primarily in observational studies. These problems, however, reflect fundamental barriers only when learning from observations, and this does not invalidate the counterfactual concept. PMID:16159397

  17. Statistical learning and selective inference.

    PubMed

    Taylor, Jonathan; Tibshirani, Robert J

    2015-06-23

    We describe the problem of "selective inference." This addresses the following challenge: Having mined a set of data to find potential associations, how do we properly assess the strength of these associations? The fact that we have "cherry-picked"--searched for the strongest associations--means that we must set a higher bar for declaring significant the associations that we see. This challenge becomes more important in the era of big data and complex statistical modeling. The cherry tree (dataset) can be very large and the tools for cherry picking (statistical learning methods) are now very sophisticated. We describe some recent new developments in selective inference and illustrate their use in forward stepwise regression, the lasso, and principal components analysis.

  18. Inferring Centrality from Network Snapshots

    NASA Astrophysics Data System (ADS)

    Shao, Haibin; Mesbahi, Mehran; Li, Dewei; Xi, Yugeng

    2017-01-01

    The topology and dynamics of a complex network shape its functionality. However, the topologies of many large-scale networks are either unavailable or incomplete. Without the explicit knowledge of network topology, we show how the data generated from the network dynamics can be utilised to infer the tempo centrality, which is proposed to quantify the influence of nodes in a consensus network. We show that the tempo centrality can be used to construct an accurate estimate of both the propagation rate of influence exerted on consensus networks and the Kirchhoff index of the underlying graph. Moreover, the tempo centrality also encodes the disturbance rejection of nodes in a consensus network. Our findings provide an approach to infer the performance of a consensus network from its temporal data.

  19. Network Plasticity as Bayesian Inference

    PubMed Central

    Legenstein, Robert; Maass, Wolfgang

    2015-01-01

    General results from statistical learning theory suggest to understand not only brain computations, but also brain plasticity as probabilistic inference. But a model for that has been missing. We propose that inherently stochastic features of synaptic plasticity and spine motility enable cortical networks of neurons to carry out probabilistic inference by sampling from a posterior distribution of network configurations. This model provides a viable alternative to existing models that propose convergence of parameters to maximum likelihood values. It explains how priors on weight distributions and connection probabilities can be merged optimally with learned experience, how cortical networks can generalize learned information so well to novel experiences, and how they can compensate continuously for unforeseen disturbances of the network. The resulting new theory of network plasticity explains from a functional perspective a number of experimental data on stochastic aspects of synaptic plasticity that previously appeared to be quite puzzling. PMID:26545099

  20. Bayesian Inference on Proportional Elections

    PubMed Central

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
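
    A simplified sketch of the Monte Carlo approach: draw vote shares from a Dirichlet posterior and allocate seats with a largest-averages (D'Hondt-style) rule. The poll counts are hypothetical, and the full Brazilian procedure adds an electoral-quotient step not modelled here:

      # Simplified Monte Carlo sketch: draw vote shares from a Dirichlet "poll"
      # posterior, allocate seats with a D'Hondt-style largest-averages rule, and
      # estimate the probability that each party gets at least one seat.
      import numpy as np

      def dhondt(votes, n_seats):
          seats = np.zeros(len(votes), dtype=int)
          for _ in range(n_seats):
              seats[np.argmax(votes / (seats + 1))] += 1
          return seats

      rng = np.random.default_rng(0)
      poll_counts = np.array([420, 310, 150, 80, 40])        # hypothetical poll
      n_sims, n_seats = 5000, 10
      at_least_one = np.zeros(len(poll_counts))
      for _ in range(n_sims):
          shares = rng.dirichlet(poll_counts + 1)            # posterior draw of vote shares
          at_least_one += dhondt(shares, n_seats) > 0
      print((at_least_one / n_sims).round(3))                # probability of representation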

  1. System Support for Forensic Inference

    NASA Astrophysics Data System (ADS)

    Gehani, Ashish; Kirchner, Florent; Shankar, Natarajan

    Digital evidence is playing an increasingly important role in prosecuting crimes. The reasons are manifold: financially lucrative targets are now connected online, systems are so complex that vulnerabilities abound and strong digital identities are being adopted, making audit trails more useful. If the discoveries of forensic analysts are to hold up to scrutiny in court, they must meet the standard for scientific evidence. Software systems are currently developed without consideration of this fact. This paper argues for the development of a formal framework for constructing “digital artifacts” that can serve as proxies for physical evidence; a system so imbued would facilitate sound digital forensic inference. A case study involving a filesystem augmentation that provides transparent support for forensic inference is described.

  2. Inferring Centrality from Network Snapshots

    PubMed Central

    Shao, Haibin; Mesbahi, Mehran; Li, Dewei; Xi, Yugeng

    2017-01-01

    The topology and dynamics of a complex network shape its functionality. However, the topologies of many large-scale networks are either unavailable or incomplete. Without the explicit knowledge of network topology, we show how the data generated from the network dynamics can be utilised to infer the tempo centrality, which is proposed to quantify the influence of nodes in a consensus network. We show that the tempo centrality can be used to construct an accurate estimate of both the propagation rate of influence exerted on consensus networks and the Kirchhoff index of the underlying graph. Moreover, the tempo centrality also encodes the disturbance rejection of nodes in a consensus network. Our findings provide an approach to infer the performance of a consensus network from its temporal data. PMID:28098166

  3. Bayesian inference for agreement measures.

    PubMed

    Vidal, Ignacio; de Castro, Mário

    2016-08-25

    The agreement of different measurement methods is an important issue in several disciplines like, for example, Medicine, Metrology, and Engineering. In this article, some agreement measures, common in the literature, were analyzed from a Bayesian point of view. Posterior inferences for such agreement measures were obtained based on well-known Bayesian inference procedures for the bivariate normal distribution. As a consequence, a general, simple, and effective method is presented, which does not require Markov Chain Monte Carlo methods and can be applied considering a great variety of prior distributions. Illustratively, the method was exemplified using five objective priors for the bivariate normal distribution. A tool for assessing the adequacy of the model is discussed. Results from a simulation study and an application to a real dataset are also reported.
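
    As a sketch under one choice of prior (Jeffreys) and one agreement measure (Lin's concordance correlation coefficient), posterior draws of the bivariate normal parameters can be converted directly into a posterior for the agreement measure; the data below are simulated:

      # Sketch: posterior of Lin's concordance correlation coefficient for two
      # measurement methods, assuming bivariate normal data and a Jeffreys prior
      # (Sigma | data ~ inverse-Wishart(n-1, S), mu | Sigma, data ~ N(xbar, Sigma/n)).
      import numpy as np
      from scipy.stats import invwishart

      rng = np.random.default_rng(1)
      x = rng.multivariate_normal([10.0, 10.4], [[4.0, 3.2], [3.2, 4.5]], size=60)
      n, xbar = len(x), x.mean(axis=0)
      S = (x - xbar).T @ (x - xbar)

      ccc_draws = []
      for _ in range(2000):
          Sigma = invwishart.rvs(df=n - 1, scale=S)
          mu = rng.multivariate_normal(xbar, Sigma / n)
          ccc = 2 * Sigma[0, 1] / (Sigma[0, 0] + Sigma[1, 1] + (mu[0] - mu[1]) ** 2)
          ccc_draws.append(ccc)
      print(np.percentile(ccc_draws, [2.5, 50, 97.5]).round(3))   # posterior summary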

  4. Efficient Inference of Parsimonious Phenomenological Models of Cellular Dynamics Using S-Systems and Alternating Regression

    PubMed Central

    Daniels, Bryan C.; Nemenman, Ilya

    2015-01-01

    The nonlinearity of dynamics in systems biology makes it hard to infer them from experimental data. Simple linear models are computationally efficient, but cannot incorporate these important nonlinearities. An adaptive method based on the S-system formalism, which is a sensible representation of nonlinear mass-action kinetics typically found in cellular dynamics, maintains the efficiency of linear regression. We combine this approach with adaptive model selection to obtain efficient and parsimonious representations of cellular dynamics. The approach is tested by inferring the dynamics of yeast glycolysis from simulated data. With little computing time, it produces dynamical models with high predictive power and with structural complexity adapted to the difficulty of the inference problem. PMID:25806510
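
    The S-system form writes each rate as a difference of two power-law terms; a toy simulation sketch follows, with made-up parameters rather than the inferred yeast glycolysis model:

      # Minimal S-system simulation: dx_i/dt = alpha_i * prod_j x_j**g_ij
      #                                        - beta_i  * prod_j x_j**h_ij
      # (toy parameters; the paper infers such exponents and rates from time series).
      import numpy as np

      alpha = np.array([1.2, 0.8])
      beta = np.array([0.9, 1.1])
      G = np.array([[0.0, -0.5], [0.6, 0.0]])       # production exponents g_ij
      H = np.array([[0.7, 0.0], [0.0, 0.5]])        # degradation exponents h_ij

      def s_system_rhs(x):
          prod_g = np.prod(x ** G, axis=1)
          prod_h = np.prod(x ** H, axis=1)
          return alpha * prod_g - beta * prod_h

      x, dt = np.array([1.0, 0.5]), 0.01
      for _ in range(1000):                          # simple forward-Euler integration
          x = x + dt * s_system_rhs(x)
      print(x.round(3))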

  5. Inference of reversible tree languages.

    PubMed

    López, Damián; Sempere, José M; García, Pedro

    2004-08-01

    In this paper, we study the notion of k-reversibility and k-testability when regular tree languages are involved. We present an inference algorithm for learning a k-testable tree language that runs in polynomial time with respect to the size of the sample used. We also study the tree language classes in relation to other well known ones, and some properties of these languages are proven.

  6. Fast, Flexible, Rational Inductive Inference

    DTIC Science & Technology

    2013-08-23

    learning phonetic categories – the sounds that make up speech – learning the words that those sounds appear in provides sufficiently strong constraints...first to be able to infer realistic phonetic categories directly from simulated speech data. Objective 2.2: Forming feature-based representations...lexicon in phonetic category acquisition. Psychological Review. Griffiths, T. L., Austerweil, J. L., & Berthiaume, V. G. (2012). Comparing the

  7. Cortical circuits for perceptual inference.

    PubMed

    Friston, Karl; Kiebel, Stefan

    2009-10-01

    This paper assumes that cortical circuits have evolved to enable inference about the causes of sensory input received by the brain. This provides a principled specification of what neural circuits have to achieve. Here, we attempt to address how the brain makes inferences by casting inference as an optimisation problem. We look at how the ensuing recognition dynamics could be supported by directed connections and message-passing among neuronal populations, given our knowledge of intrinsic and extrinsic neuronal connections. We assume that the brain models the world as a dynamic system, which imposes causal structure on the sensorium. Perception is equated with the optimisation or inversion of this internal model, to explain sensory input. Given a model of how sensory data are generated, we use a generic variational approach to model inversion to furnish equations that prescribe recognition; i.e., the dynamics of neuronal activity that represents the causes of sensory input. Here, we focus on a model whose hierarchical and dynamical structure enables simulated brains to recognise and predict sequences of sensory states. We first review these models and their inversion under a variational free-energy formulation. We then show that the brain has the necessary infrastructure to implement this inversion and present stimulations using synthetic birds that generate and recognise birdsongs.
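
    A cartoon of the recognition dynamics described above, for a one-level generative model y = g(x) + noise with a Gaussian prior on the hidden cause x; gradient descent on the free energy here is a stand-in for the paper's hierarchical, dynamical scheme:

      # Toy "recognition dynamics": gradient descent on precision-weighted
      # prediction errors under a one-level generative model y = g(x) + noise,
      # x ~ N(prior_mean, prior_var). A cartoon of variational model inversion.
      import numpy as np

      def g(x):                      # generative mapping from hidden cause to data
          return np.tanh(x)

      y = 0.6                        # observed sensory datum
      prior_mean, prior_var, sensory_var = 0.0, 1.0, 0.1
      x = prior_mean                 # current estimate of the hidden cause
      for _ in range(200):
          eps_y = (y - g(x)) / sensory_var              # sensory prediction error
          eps_x = (x - prior_mean) / prior_var          # prior prediction error
          dFdx = -eps_y * (1 - np.tanh(x) ** 2) + eps_x # gradient of free energy w.r.t. x
          x -= 0.05 * dFdx
      print(round(float(x), 3))      # posterior estimate balancing data and prior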

  8. An introduction to causal inference.

    PubMed

    Pearl, Judea

    2010-02-26

    This paper summarizes recent advances in causal inference and underscores the paradigmatic shifts that must be undertaken in moving from traditional statistical analysis to causal analysis of multivariate data. Special emphasis is placed on the assumptions that underlie all causal inferences, the languages used in formulating those assumptions, the conditional nature of all causal and counterfactual claims, and the methods that have been developed for the assessment of such claims. These advances are illustrated using a general theory of causation based on the Structural Causal Model (SCM) described in Pearl (2000a), which subsumes and unifies other approaches to causation, and provides a coherent mathematical foundation for the analysis of causes and counterfactuals. In particular, the paper surveys the development of mathematical tools for inferring (from a combination of data and assumptions) answers to three types of causal queries: those about (1) the effects of potential interventions, (2) probabilities of counterfactuals, and (3) direct and indirect effects (also known as "mediation"). Finally, the paper defines the formal and conceptual relationships between the structural and potential-outcome frameworks and presents tools for a symbiotic analysis that uses the strong features of both. The tools are demonstrated in the analyses of mediation, causes of effects, and probabilities of causation.

  9. Children's and adults' evaluation of the certainty of deductive inferences, inductive inferences, and guesses.

    PubMed

    Pillow, Bradford H

    2002-01-01

    Two experiments investigated kindergarten through fourth-grade children's and adults' (N = 128) ability to (1) evaluate the certainty of deductive inferences, inductive inferences, and guesses; and (2) explain the origins of inferential knowledge. When judging their own cognitive state, children in first grade and older rated deductive inferences as more certain than guesses; but when judging another person's knowledge, children did not distinguish valid inferences from invalid inferences and guesses until fourth grade. By third grade, children differentiated their own deductive inferences from inductive inferences and guesses, but only adults both differentiated deductive inferences from inductive inferences and differentiated inductive inferences from guesses. Children's recognition of their own inferences may contribute to the development of knowledge about cognitive processes, scientific reasoning, and a constructivist epistemology.

  10. CA1 subfield contributions to memory integration and inference.

    PubMed

    Schlichting, Margaret L; Zeithamova, Dagmar; Preston, Alison R

    2014-10-01

    The ability to combine information acquired at different times to make novel inferences is a powerful function of episodic memory. One perspective suggests that by retrieving related knowledge during new experiences, existing memories can be linked to the new, overlapping information as it is encoded. The resulting memory traces would thus incorporate content across event boundaries, representing important relationships among items encountered during separate experiences. While prior work suggests that the hippocampus is involved in linking memories experienced at different times, the involvement of specific subfields in this process remains unknown. Using both univariate and multivariate analyses of high-resolution functional magnetic resonance imaging (fMRI) data, we localized this specialized encoding mechanism to human CA1 . Specifically, right CA1 responses during encoding of events that overlapped with prior experience predicted subsequent success on a test requiring inferences about the relationships among events. Furthermore, we employed neural pattern similarity analysis to show that patterns of activation evoked during overlapping event encoding were later reinstated in CA1 during successful inference. The reinstatement of CA1 patterns during inference was specific to those trials that were performed quickly and accurately, consistent with the notion that linking memories during learning facilitates novel judgments. These analyses provide converging evidence that CA1 plays a unique role in encoding overlapping events and highlight the dynamic interactions between hippocampal-mediated encoding and retrieval processes. More broadly, our data reflect the adaptive nature of episodic memories, in which representations are derived across events in anticipation of future judgments.

  11. Visual Adaptation

    PubMed Central

    Webster, Michael A.

    2015-01-01

    Sensory systems continuously mold themselves to the widely varying contexts in which they must operate. Studies of these adaptations have played a long and central role in vision science. In part this is because the specific adaptations remain a powerful tool for dissecting vision, by exposing the mechanisms that are adapting. That is, “if it adapts, it's there.” Many insights about vision have come from using adaptation in this way, as a method. A second important trend has been the realization that the processes of adaptation are themselves essential to how vision works, and thus are likely to operate at all levels. That is, “if it's there, it adapts.” This has focused interest on the mechanisms of adaptation as the target rather than the probe. Together both approaches have led to an emerging insight of adaptation as a fundamental and ubiquitous coding strategy impacting all aspects of how we see. PMID:26858985

  12. Neuronal integration of dynamic sources: Bayesian learning and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Siegelmann, Hava T.; Holzman, Lars E.

    2010-09-01

    One of the brain's most basic functions is integrating sensory data from diverse sources. This ability causes us to question whether the neural system is computationally capable of intelligently integrating data, not only when sources have known, fixed relative dependencies but also when it must determine such relative weightings based on dynamic conditions, and then use these learned weightings to accurately infer information about the world. We suggest that the brain is, in fact, fully capable of computing this parallel task in a single network and describe a neural inspired circuit with this property. Our implementation suggests the possibility that evidence learning requires a more complex organization of the network than was previously assumed, where neurons have different specialties, whose emergence brings the desired adaptivity seen in human online inference.

  13. The NIRS Analysis Package: noise reduction and statistical inference.

    PubMed

    Fekete, Tomer; Rubin, Denis; Carlson, Joshua M; Mujica-Parodi, Lilianne R

    2011-01-01

    Near infrared spectroscopy (NIRS) is a non-invasive optical imaging technique that can be used to measure cortical hemodynamic responses to specific stimuli or tasks. While analyses of NIRS data are normally adapted from established fMRI techniques, there are nevertheless substantial differences between the two modalities. Here, we investigate the impact of NIRS-specific noise; e.g., systemic (physiological), motion-related artifacts, and serial autocorrelations, upon the validity of statistical inference within the framework of the general linear model. We present a comprehensive framework for noise reduction and statistical inference, which is custom-tailored to the noise characteristics of NIRS. These methods have been implemented in a public domain Matlab toolbox, the NIRS Analysis Package (NAP). Finally, we validate NAP using both simulated and actual data, showing marked improvement in the detection power and reliability of NIRS.

  14. Adaptive Management

    EPA Science Inventory

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive managem...

  15. Bell's theorem, inference, and quantum transactions

    NASA Astrophysics Data System (ADS)

    Garrett, A. J. M.

    1990-04-01

    Bell's theorem is expounded as an analysis in Bayesian inference. Assuming the result of a spin measurement on a particle is governed by a causal variable internal (hidden, “local”) to the particle, one learns about it by making a spin measurement; thence about the internal variable of a second particle correlated with the first; and from there predicts the probabilistic result of spin measurements on the second particle. Such predictions are violated by experiment: locality/causality fails. The statistical nature of the observations rules out signalling; acausal, superluminal, or otherwise. Quantum mechanics is irrelevant to this reasoning, although its correct predictions of experiment imply that it has a nonlocal/acausal interpretation. Cramer's new transactional interpretation, which incorporates this feature by adapting the Wheeler-Feynman idea of advanced and retarded processes to the quantum laws, is advocated. It leads to an invaluable way of envisaging quantum processes. The usual paradoxes melt before this, and one, the “delayed choice” experiment, is chosen for detailed inspection. Nonlocality implies practical difficulties in influencing hidden variables, which provides a very plausible explanation for why they have not yet been found; from this standpoint, Bell's theorem reinforces arguments in favor of hidden variables.

  16. Impact of nonignorable coarsening on Bayesian inference.

    PubMed

    Zhang, Jiameng; Heitjan, Daniel F

    2007-10-01

    The coarse data model of Heitjan and Rubin (1991) generalizes the missing data model of Rubin (1976) to cover other forms of incompleteness such as censoring and grouping. The model has 2 components: an ideal data model describing the distribution of the quantity of interest and a coarsening mechanism that describes a distribution over degrees of coarsening given the ideal data. The coarsening mechanism is said to be nonignorable when the degree of coarsening depends on an incompletely observed ideal outcome, in which case failure to properly account for it can spoil inferences. A theme in recent research is to measure sensitivity to nonignorability by evaluating the effect of a small departure from ignorability on the maximum likelihood estimate (MLE) of a parameter of the ideal data model. One such construct is the "index of local sensitivity to nonignorability" (ISNI) (Troxel and others, 2004), which is the derivative of the MLE with respect to a nonignorability parameter evaluated at the ignorable model. In this paper, we adapt ISNI to Bayesian modeling by instead defining it as the derivative of the posterior expectation. We propose the application of ISNI as a first step in judging the robustness of a Bayesian analysis to nonignorable coarsening. We derive formulas for a range of models and apply the method to evaluate sensitivity to nonignorable coarsening in 2 real data examples, one involving missing CD4 counts in an HIV trial and the other involving potentially informatively censored relapse times in a leukemia trial.
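
    In editorial notation (the symbols here are not the authors'), the Bayesian ISNI described above is the derivative of the posterior expectation with respect to the nonignorability parameter, evaluated at the ignorable model:

      \mathrm{ISNI} \;=\; \left. \frac{\partial}{\partial \gamma}\, \mathrm{E}\big[\theta \mid y_{\mathrm{obs}}, \gamma\big] \right|_{\gamma = 0}

    where \gamma = 0 corresponds to ignorable coarsening; the original likelihood-based ISNI replaces the posterior expectation with the maximum likelihood estimate.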

  17. Statistical inference for inverse problems

    NASA Astrophysics Data System (ADS)

    Bissantz, Nicolai; Holzmann, Hajo

    2008-06-01

    In this paper we study statistical inference for certain inverse problems. We go beyond mere estimation purposes and review and develop the construction of confidence intervals and confidence bands in some inverse problems, including deconvolution and the backward heat equation. Further, we discuss the construction of certain hypothesis tests, in particular concerning the number of local maxima of the unknown function. The methods are illustrated in a case study, where we analyze the distribution of heliocentric escape velocities of galaxies in the Centaurus galaxy cluster, and provide statistical evidence for its bimodality.

  18. sick: The Spectroscopic Inference Crank

    NASA Astrophysics Data System (ADS)

    Casey, Andrew R.

    2016-03-01

    There exists an inordinate amount of spectral data in both public and private astronomical archives that remain severely under-utilized. The lack of reliable open-source tools for analyzing large volumes of spectra contributes to this situation, which is poised to worsen as large surveys successively release orders of magnitude more spectra. In this article I introduce sick, the spectroscopic inference crank, a flexible and fast Bayesian tool for inferring astrophysical parameters from spectra. sick is agnostic to the wavelength coverage, resolving power, or general data format, allowing any user to easily construct a generative model for their data, regardless of its source. sick can be used to provide a nearest-neighbor estimate of model parameters, a numerically optimized point estimate, or full Markov Chain Monte Carlo sampling of the posterior probability distributions. This generality empowers any astronomer to capitalize on the plethora of published synthetic and observed spectra, and make precise inferences for a host of astrophysical (and nuisance) quantities. Model intensities can be reliably approximated from existing grids of synthetic or observed spectra using linear multi-dimensional interpolation, or a Cannon-based model. Additional phenomena that transform the data (e.g., redshift, rotational broadening, continuum, spectral resolution) are incorporated as free parameters and can be marginalized away. Outlier pixels (e.g., cosmic rays or poorly modeled regimes) can be treated with a Gaussian mixture model, and a noise model is included to account for systematically underestimated variance. Combining these phenomena into a scalar-justified, quantitative model permits precise inferences with credible uncertainties on noisy data. I describe the common model features, the implementation details, and the default behavior, which is balanced to be suitable for most astronomical applications. Using a forward model on low-resolution, high signal

  19. Universum Inference and Corpus Homogeneity

    NASA Astrophysics Data System (ADS)

    Vogel, Carl; Lynch, Gerard; Janssen, Jerom

    Universum Inference is re-interpreted for assessment of corpus homogeneity in computational stylometry. Recent stylometric research quantifies strength of characterization within dramatic works by assessing the homogeneity of corpora associated with dramatic personas. A methodological advance is suggested to mitigate the potential for the assessment of homogeneity to be achieved by chance. Baseline comparison analysis is constructed for contributions to debates by nonfictional participants: the corpus analyzed consists of transcripts of US Presidential and Vice-Presidential debates from the 2000 election cycle. The corpus is also analyzed in translation to Italian, Spanish and Portuguese. Adding randomized categories makes assessments of homogeneity more conservative.

  20. Bayesian inference for OPC modeling

    NASA Astrophysics Data System (ADS)

    Burbine, Andrew; Sturtevant, John; Fryer, David; Smith, Bruce W.

    2016-03-01

    The use of optical proximity correction (OPC) demands increasingly accurate models of the photolithographic process. Model building and inference techniques in the data science community have seen great strides in the past two decades which make better use of available information. This paper aims to demonstrate the predictive power of Bayesian inference as a method for parameter selection in lithographic models by quantifying the uncertainty associated with model inputs and wafer data. Specifically, the method combines the model builder's prior information about each modelling assumption with the maximization of each observation's likelihood as a Student's t-distributed random variable. Through the use of a Markov chain Monte Carlo (MCMC) algorithm, a model's parameter space is explored to find the most credible parameter values. During parameter exploration, the parameters' posterior distributions are generated by applying Bayes' rule, using a likelihood function and the a priori knowledge supplied. The MCMC algorithm used, an affine invariant ensemble sampler (AIES), is implemented by initializing many walkers which semiindependently explore the space. The convergence of these walkers to global maxima of the likelihood volume determine the parameter values' highest density intervals (HDI) to reveal champion models. We show that this method of parameter selection provides insights into the data that traditional methods do not and outline continued experiments to vet the method.
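
    A sketch of this workflow using emcee, a widely used implementation of the affine invariant ensemble sampler; the "lithographic model" below is a placeholder linear model, and the Student's-t scale, flat prior box and walker settings are assumptions for illustration only:

      # Sketch of the described workflow with an affine-invariant ensemble sampler:
      # Student's-t likelihood per observation, flat prior box, many walkers
      # exploring the posterior. Model and data below are placeholders.
      import numpy as np
      import emcee
      from scipy.stats import t as student_t

      x_obs = np.linspace(0, 1, 25)
      y_obs = 1.8 * x_obs + 0.3 + np.random.default_rng(2).normal(0, 0.05, 25)

      def model(params, x):
          slope, intercept = params
          return slope * x + intercept

      def log_prob(params):
          if np.any(np.abs(params) > 10):            # flat prior box (prior information)
              return -np.inf
          resid = y_obs - model(params, x_obs)
          return student_t.logpdf(resid, df=3, scale=0.05).sum()

      nwalkers, ndim = 16, 2
      p0 = np.random.default_rng(3).normal([1.5, 0.0], 0.1, size=(nwalkers, ndim))
      sampler = emcee.EnsembleSampler(nwalkers, ndim, log_prob)
      sampler.run_mcmc(p0, 2000, progress=False)
      samples = sampler.get_chain(discard=500, flat=True)
      print(samples.mean(axis=0).round(3))           # posterior means; HDIs come from the chain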

  1. Bayesian inference for radio observations

    NASA Astrophysics Data System (ADS)

    Lochner, Michelle; Natarajan, Iniyan; Zwart, Jonathan T. L.; Smirnov, Oleg; Bassett, Bruce A.; Oozeer, Nadeem; Kunz, Martin

    2015-06-01

    New telescopes like the Square Kilometre Array (SKA) will push into a new sensitivity regime and expose systematics, such as direction-dependent effects, that could previously be ignored. Current methods for handling such systematics rely on alternating best estimates of instrumental calibration and models of the underlying sky, which can lead to inadequate uncertainty estimates and biased results because any correlations between parameters are ignored. These deconvolution algorithms produce a single image that is assumed to be a true representation of the sky, when in fact it is just one realization of an infinite ensemble of images compatible with the noise in the data. In contrast, here we report a Bayesian formalism that simultaneously infers both systematics and science. Our technique, Bayesian Inference for Radio Observations (BIRO), determines all parameters directly from the raw data, bypassing image-making entirely, by sampling from the joint posterior probability distribution. This enables it to derive both correlations and accurate uncertainties, making use of the flexible software MEQTREES to model the sky and telescope simultaneously. We demonstrate BIRO with two simulated sets of Westerbork Synthesis Radio Telescope data sets. In the first, we perform joint estimates of 103 scientific (flux densities of sources) and instrumental (pointing errors, beamwidth and noise) parameters. In the second example, we perform source separation with BIRO. Using the Bayesian evidence, we can accurately select between a single point source, two point sources and an extended Gaussian source, allowing for `super-resolution' on scales much smaller than the synthesized beam.

  2. Quantum Inference on Bayesian Networks

    NASA Astrophysics Data System (ADS)

    Yoder, Theodore; Low, Guang Hao; Chuang, Isaac

    2014-03-01

    Because quantum physics is naturally probabilistic, it seems reasonable to expect physical systems to describe probabilities and their evolution in a natural fashion. Here, we use quantum computation to speed up sampling from a graphical probability model, the Bayesian network. A specialization of this sampling problem is approximate Bayesian inference, where the distribution on query variables is sampled given the values e of evidence variables. Inference is a key part of modern machine learning and artificial intelligence tasks, but is known to be NP-hard. Classically, a single unbiased sample is obtained from a Bayesian network on n variables with at most m parents per node in time O(nm P(e)^-1), depending critically on P(e), the probability the evidence might occur in the first place. However, by implementing a quantum version of rejection sampling, we obtain a square-root speedup, taking O(n 2^m P(e)^-1/2) time per sample. The speedup is the result of amplitude amplification, which is proving to be broadly applicable in sampling and machine learning tasks. In particular, we provide an explicit and efficient circuit construction that implements the algorithm without the need for oracle access.
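
    For contrast with the quantum algorithm, a classical rejection sampler on a toy two-node network shows why the cost per accepted sample scales like 1/P(e); the network and probabilities below are illustrative:

      # Classical rejection sampling on a toy Bayesian network (Rain -> WetGrass):
      # samples are discarded unless they match the evidence, so the cost per
      # accepted sample grows like 1/P(e), the term the quantum version improves.
      import random

      def sample_network():
          rain = random.random() < 0.2
          wet = random.random() < (0.9 if rain else 0.1)
          return rain, wet

      def rejection_sample(evidence_wet=True, n_accepted=2000):
          accepted, attempts = [], 0
          while len(accepted) < n_accepted:
              attempts += 1
              rain, wet = sample_network()
              if wet == evidence_wet:
                  accepted.append(rain)
          p_rain_given_wet = sum(accepted) / len(accepted)
          return p_rain_given_wet, attempts / n_accepted   # second value is roughly 1/P(e)

      print(rejection_sample())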

  3. Dopamine, Affordance and Active Inference

    PubMed Central

    Friston, Karl J.; Shiner, Tamara; FitzGerald, Thomas; Galea, Joseph M.; Adams, Rick; Brown, Harriet; Dolan, Raymond J.; Moran, Rosalyn; Stephan, Klaas Enno; Bestmann, Sven

    2012-01-01

    The role of dopamine in behaviour and decision-making is often cast in terms of reinforcement learning and optimal decision theory. Here, we present an alternative view that frames the physiology of dopamine in terms of Bayes-optimal behaviour. In this account, dopamine controls the precision or salience of (external or internal) cues that engender action. In other words, dopamine balances bottom-up sensory information and top-down prior beliefs when making hierarchical inferences (predictions) about cues that have affordance. In this paper, we focus on the consequences of changing tonic levels of dopamine firing using simulations of cued sequential movements. Crucially, the predictions driving movements are based upon a hierarchical generative model that infers the context in which movements are made. This means that we can confuse agents by changing the context (order) in which cues are presented. These simulations provide a (Bayes-optimal) model of contextual uncertainty and set switching that can be quantified in terms of behavioural and electrophysiological responses. Furthermore, one can simulate dopaminergic lesions (by changing the precision of prediction errors) to produce pathological behaviours that are reminiscent of those seen in neurological disorders such as Parkinson's disease. We use these simulations to demonstrate how a single functional role for dopamine at the synaptic level can manifest in different ways at the behavioural level. PMID:22241972

  4. Adapting Tests for Use in Multiple Languages and Cultures. Laboratory of Psychometric and Evaluative Research Report.

    ERIC Educational Resources Information Center

    Hambleton, Ronald K.; Patsula, Liane

    Whatever the purpose of test adaptation, questions arise concerning the validity of inferences from such adapted tests. This paper considers several advantages and disadvantages of adapting tests from one language and culture to another. The paper also reviews several sources of error or invalidity associated with adapting tests and suggests ways…

  5. Inferences of demography and selection in an African population of Drosophila melanogaster.

    PubMed

    Singh, Nadia D; Jensen, Jeffrey D; Clark, Andrew G; Aquadro, Charles F

    2013-01-01

    It remains a central problem in population genetics to infer the past action of natural selection, and these inferences pose a challenge because demographic events will also substantially affect patterns of polymorphism and divergence. Thus it is imperative to explicitly model the underlying demographic history of the population whenever making inferences about natural selection. In light of the considerable interest in adaptation in African populations of Drosophila melanogaster, which are considered ancestral to the species, we generated a large polymorphism data set representing 2.1 Mb from each of 20 individuals from a Ugandan population of D. melanogaster. In contrast to previous inferences of a simple population expansion in eastern Africa, our demographic modeling of this ancestral population reveals a strong signature of a population bottleneck followed by population expansion, which has significant implications for future demographic modeling of derived populations of this species. Taking this more complex underlying demographic history into account, we also estimate a mean X-linked region-wide rate of adaptation of 6 × 10^-11/site/generation and a mean selection coefficient of beneficial mutations of 0.0009. These inferences regarding the rate and strength of selection are largely consistent with most other estimates from D. melanogaster and indicate a relatively high rate of adaptation driven by weakly beneficial mutations.

  6. Spontaneous Trait Inferences on Social Media

    PubMed Central

    Utz, Sonja

    2016-01-01

    The present research investigates whether spontaneous trait inferences occur under conditions characteristic of social media and networking sites: nonextreme, ostensibly self-generated content, simultaneous presentation of multiple cues, and self-paced browsing. We used an established measure of trait inferences (false recognition paradigm) and a direct assessment of impressions. Without being asked to do so, participants spontaneously formed impressions of people whose status updates they saw. Our results suggest that trait inferences occurred from nonextreme self-generated content, which is commonly found in social media updates (Experiment 1) and when nine status updates from different people were presented in parallel (Experiment 2). Although inferences did occur during free browsing, the results suggest that participants did not necessarily associate the traits with the corresponding status update authors (Experiment 3). Overall, the findings suggest that spontaneous trait inferences occur on social media. We discuss implications for online communication and research on spontaneous trait inferences. PMID:28123646

  7. Inferring echolocation in ancient bats.

    PubMed

    Simmons, Nancy B; Seymour, Kevin L; Habersetzer, Jörg; Gunnell, Gregg F

    2010-08-19

    Laryngeal echolocation, used by most living bats to form images of their surroundings and to detect and capture flying prey, is considered to be a key innovation for the evolutionary success of bats, and palaeontologists have long sought osteological correlates of echolocation that can be used to infer the behaviour of fossil bats. Veselka et al. argued that the most reliable trait indicating echolocation capabilities in bats is an articulation between the stylohyal bone (part of the hyoid apparatus that supports the throat and larynx) and the tympanic bone, which forms the floor of the middle ear. They examined the oldest and most primitive known bat, Onychonycteris finneyi (early Eocene, USA), and argued that it showed evidence of this stylohyal-tympanic articulation, from which they concluded that O. finneyi may have been capable of echolocation. We disagree with their interpretation of key fossil data and instead argue that O. finneyi was probably not an echolocating bat.

  8. Motion Inference During +Gz Acceleration

    DTIC Science & Technology

    2006-09-01

    AFRL-HW-WP-TP-2006-0091. Motion Inference During +Gz Acceleration. Lloyd D. Tripp Jr., Richard A. McKinley, Robert L. Esken; Air Force Research Laboratory. Program element 62202F; project 7184; task 03.

  9. Inferred properties of stellar granulation

    SciTech Connect

    Gray, D.F.; Toner, C.G.

    1985-06-01

    Apparent characteristics of stellar granulation in F and G main-sequence stars are inferred directly from observed spectral-line asymmetries and from comparisons of numerical simulations with the observations: (1) the apparent granulation velocity increases with effective temperature, (2) the dispersion of granule velocities about their mean velocity of rise increases with the apparent granulation velocity, (3) the mean velocity of rise of granules must be less than the total line broadening, (4) the apparent velocity difference between granules and dark lanes corresponds to the granulation velocity deduced from stellar line bisectors, (5) the dark lanes show velocities of fall approximately twice as large as the granule rise velocities, (6) the light contributed to the stellar flux by the granules is four to ten times more than the light from the dark lanes. Stellar rotation is predicted to produce distortions in the line bisectors which may give information on the absolute velocity displacements of the line bisectors. 37 references.

  10. Synaptic Computation Underlying Probabilistic Inference

    PubMed Central

    Soltani, Alireza; Wang, Xiao-Jing

    2010-01-01

    In this paper we propose that synapses may be the workhorse of neuronal computations that underlie probabilistic reasoning. We built a neural circuit model for probabilistic inference when information provided by different sensory cues needs to be integrated, and the predictive powers of individual cues about an outcome are deduced through experience. We found that bounded synapses naturally compute, through reward-dependent plasticity, the posterior probability that a choice alternative is correct given that a cue is presented. Furthermore, a decision circuit endowed with such synapses makes choices based on the summated log posterior odds and performs near-optimal cue combination. The model is validated by reproducing salient observations of, and provides insights into, a monkey experiment using a categorization task. Our model thus suggests a biophysical instantiation of the Bayesian decision rule, while predicting important deviations from it similar to ‘base-rate neglect’ observed in human studies when alternatives have unequal priors. PMID:20010823
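
    For intuition only, the cue-combination rule mentioned above (summing log posterior odds across cues) can be sketched in a few lines; the per-cue posterior values below are invented, not fitted to the monkey data.

```python
import math

# Hypothetical P(alternative A is correct | cue i present), one value per cue.
posterior_given_cue = [0.7, 0.6, 0.55]

# Under equal priors and conditionally independent cues, optimal combination
# sums the log posterior odds contributed by each cue.
log_odds = sum(math.log(p / (1.0 - p)) for p in posterior_given_cue)
p_combined = 1.0 / (1.0 + math.exp(-log_odds))

print(f"summed log posterior odds = {log_odds:.3f}")
print(f"combined P(A correct)     = {p_combined:.3f}")
```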

  11. Adaptive SPECT

    PubMed Central

    Barrett, Harrison H.; Furenlid, Lars R.; Freed, Melanie; Hesterman, Jacob Y.; Kupinski, Matthew A.; Clarkson, Eric; Whitaker, Meredith K.

    2008-01-01

    Adaptive imaging systems alter their data-acquisition configuration or protocol in response to the image information received. An adaptive pinhole single-photon emission computed tomography (SPECT) system might acquire an initial scout image to obtain preliminary information about the radiotracer distribution and then adjust the configuration or sizes of the pinholes, the magnifications, or the projection angles in order to improve performance. This paper briefly describes two small-animal SPECT systems that allow this flexibility and then presents a framework for evaluating adaptive systems in general, and adaptive SPECT systems in particular. The evaluation is in terms of the performance of linear observers on detection or estimation tasks. Expressions are derived for the ideal linear (Hotelling) observer and the ideal linear (Wiener) estimator with adaptive imaging. Detailed expressions for the performance figures of merit are given, and possible adaptation rules are discussed. PMID:18541485
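
    As a rough illustration of the linear-observer figure of merit discussed above, the sketch below computes a Hotelling template and its detection SNR from simulated training images; the data and dimensions are arbitrary stand-ins, not SPECT projections.

```python
import numpy as np

rng = np.random.default_rng(0)
n_pix, n_train = 64, 500

signal = np.zeros(n_pix)
signal[30:34] = 1.0                                          # hypothetical signal profile
absent = rng.normal(10.0, 1.0, size=(n_train, n_pix))        # signal-absent training images
present = rng.normal(10.0, 1.0, size=(n_train, n_pix)) + signal  # signal-present images

delta_gbar = present.mean(axis=0) - absent.mean(axis=0)      # mean difference image
S = 0.5 * (np.cov(absent.T) + np.cov(present.T))             # average class covariance
w = np.linalg.solve(S, delta_gbar)                           # Hotelling observer template

snr2 = float(delta_gbar @ w)                                 # Hotelling SNR^2 figure of merit
print(f"Hotelling SNR = {np.sqrt(snr2):.2f}")
```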

  12. Generic comparison of protein inference engines.

    PubMed

    Claassen, Manfred; Reiter, Lukas; Hengartner, Michael O; Buhmann, Joachim M; Aebersold, Ruedi

    2012-04-01

    Protein identifications, instead of peptide-spectrum matches, constitute the biologically relevant result of shotgun proteomics studies. How to appropriately infer and report protein identifications has triggered a still ongoing debate. This debate has so far suffered from the lack of appropriate performance measures that allow us to objectively assess protein inference approaches. This study describes an intuitive, generic and yet formal performance measure and demonstrates how it enables experimentalists to select an optimal protein inference strategy for a given collection of fragment ion spectra. We applied the performance measure to systematically explore the benefit of excluding possibly unreliable protein identifications, such as single-hit wonders. Therefore, we defined a family of protein inference engines by extending a simple inference engine by thousands of pruning variants, each excluding a different specified set of possibly unreliable identifications. We benchmarked these protein inference engines on several data sets representing different proteomes and mass spectrometry platforms. Optimally performing inference engines retained all high confidence spectral evidence, without posterior exclusion of any type of protein identifications. Despite the diversity of studied data sets consistently supporting this rule, other data sets might behave differently. In order to ensure maximal reliable proteome coverage for data sets arising in other studies we advocate abstaining from rigid protein inference rules, such as exclusion of single-hit wonders, and instead consider several protein inference approaches and assess these with respect to the presented performance measure in the specific application context.

  13. Experimental assessment of static and dynamic algorithms for gene regulation inference from time series expression data.

    PubMed

    Lopes, Miguel; Bontempi, Gianluca

    2013-01-01

    Accurate inference of causal gene regulatory networks from gene expression data is an open bioinformatics challenge. Gene interactions are dynamical processes and consequently we can expect that the effect of any regulation action occurs after a certain temporal lag. However, such lag is unknown a priori and temporal aspects require specific inference algorithms. In this paper we aim to assess the impact of taking into consideration temporal aspects on the final accuracy of the inference procedure. In particular we will compare the accuracy of static algorithms, where no dynamic aspect is considered, to that of fixed lag and adaptive lag algorithms in three inference tasks from microarray expression data. Experimental results show that network inference algorithms that take dynamics into account perform consistently better than static ones, once the considered lags are properly chosen. However, no individual algorithm stands out in all three inference tasks, and the challenging nature of network inference tasks is evidenced, as a large number of the assessed algorithms do not perform better than random.

  14. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity

    PubMed Central

    Pecevski, Dejan

    2016-01-01

    Abstract Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference. PMID:27419214

  15. Learning Probabilistic Inference through Spike-Timing-Dependent Plasticity.

    PubMed

    Pecevski, Dejan; Maass, Wolfgang

    2016-01-01

    Numerous experimental data show that the brain is able to extract information from complex, uncertain, and often ambiguous experiences. Furthermore, it can use such learnt information for decision making through probabilistic inference. Several models have been proposed that aim at explaining how probabilistic inference could be performed by networks of neurons in the brain. We propose here a model that can also explain how such neural network could acquire the necessary information for that from examples. We show that spike-timing-dependent plasticity in combination with intrinsic plasticity generates in ensembles of pyramidal cells with lateral inhibition a fundamental building block for that: probabilistic associations between neurons that represent through their firing current values of random variables. Furthermore, by combining such adaptive network motifs in a recursive manner the resulting network is enabled to extract statistical information from complex input streams, and to build an internal model for the distribution p* that generates the examples it receives. This holds even if p* contains higher-order moments. The analysis of this learning process is supported by a rigorous theoretical foundation. Furthermore, we show that the network can use the learnt internal model immediately for prediction, decision making, and other types of probabilistic inference.

  16. Role of Utility and Inference in the Evolution of Functional Information

    PubMed Central

    Sharov, Alexei A.

    2009-01-01

    Functional information means an encoded network of functions in living organisms from molecular signaling pathways to an organism’s behavior. It is represented by two components: code and an interpretation system, which together form a self-sustaining semantic closure. Semantic closure allows some freedom between components because small variations of the code are still interpretable. The interpretation system consists of inference rules that control the correspondence between the code and the function (phenotype) and determines the shape of the fitness landscape. The utility factor operates at multiple time scales: short-term selection drives evolution towards higher survival and reproduction rate within a given fitness landscape, and long-term selection favors those fitness landscapes that support adaptability and lead to evolutionary expansion of certain lineages. Inference rules make short-term selection possible by shaping the fitness landscape and defining possible directions of evolution, but they are under control of the long-term selection of lineages. Communication normally occurs within a set of agents with compatible interpretation systems, which I call communication system. Functional information cannot be directly transferred between communication systems with incompatible inference rules. Each biological species is a genetic communication system that carries unique functional information together with inference rules that determine evolutionary directions and constraints. This view of the relation between utility and inference can resolve the conflict between realism/positivism and pragmatism. Realism overemphasizes the role of inference in evolution of human knowledge because it assumes that logic is embedded in reality. Pragmatism substitutes usefulness for truth and therefore ignores the advantage of inference. The proposed concept of evolutionary pragmatism rejects the idea that logic is embedded in reality; instead, inference rules are

  17. Protein inference: A protein quantification perspective.

    PubMed

    He, Zengyou; Huang, Ting; Liu, Xiaoqing; Zhu, Peijun; Teng, Ben; Deng, Shengchun

    2016-08-01

    In mass spectrometry-based shotgun proteomics, protein quantification and protein identification are two major computational problems. To quantify the protein abundance, a list of proteins must first be inferred from the raw data. Then the relative or absolute protein abundance is estimated with quantification methods, such as spectral counting. Until now, most researchers have been dealing with these two processes separately. In fact, the protein inference problem can be regarded as a special protein quantification problem in the sense that truly present proteins are those proteins whose abundance values are not zero. Some recently published papers have conceptually discussed this possibility. However, there is still a lack of rigorous experimental studies to test this hypothesis. In this paper, we investigate the feasibility of using protein quantification methods to solve the protein inference problem. Protein inference methods aim to determine whether each candidate protein is present in the sample or not. Protein quantification methods estimate the abundance value of each inferred protein. Naturally, the abundance value of an absent protein should be zero. Thus, we argue that the protein inference problem can be viewed as a special protein quantification problem in which one protein is considered to be present if its abundance is not zero. Based on this idea, our paper tries to use three simple protein quantification methods to solve the protein inference problem effectively. The experimental results on six data sets show that these three methods are competitive with previous protein inference algorithms. This demonstrates that it is plausible to model the protein inference problem as a special protein quantification task, which opens the door to devising more effective protein inference algorithms from a quantification perspective. The source codes of our methods are available at: http://code.google.com/p/protein-inference/.
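
    A minimal sketch of the idea argued for above: treat inference as quantification by spectral counting and call a protein present when its estimated abundance is nonzero. The peptide-to-protein map and counts are invented, and shared counts are simply split evenly, which is only one of many possible conventions.

```python
from collections import defaultdict

# peptide -> (spectral count, proteins the peptide maps to); all values are made up
peptide_evidence = {
    "PEPTIDEA": (5, ["P1"]),
    "PEPTIDEB": (2, ["P1", "P2"]),   # a shared (degenerate) peptide
    "PEPTIDEC": (0, ["P3"]),         # no spectra observed
}

abundance = defaultdict(float)
for count, proteins in peptide_evidence.values():
    for protein in proteins:
        abundance[protein] += count / len(proteins)   # split shared counts evenly

present = sorted(p for p, a in abundance.items() if a > 0)
print(dict(abundance))   # {'P1': 6.0, 'P2': 1.0, 'P3': 0.0}
print(present)           # ['P1', 'P2']
```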

  18. Evaluating Content Alignment in Computerized Adaptive Testing

    ERIC Educational Resources Information Center

    Wise, Steven L.; Kingsbury, G. Gage; Webb, Norman L.

    2015-01-01

    The alignment between a test and the content domain it measures represents key evidence for the validation of test score inferences. Although procedures have been developed for evaluating the content alignment of linear tests, these procedures are not readily applicable to computerized adaptive tests (CATs), which require large item pools and do…

  19. Climate adaptation

    NASA Astrophysics Data System (ADS)

    Kinzig, Ann P.

    2015-03-01

    This paper is intended as a brief introduction to climate adaptation in a conference devoted otherwise to the physics of sustainable energy. Whereas mitigation involves measures to reduce the probability of a potential event, such as climate change, adaptation refers to actions that lessen the impact of climate change. Mitigation and adaptation differ in other ways as well. Adaptation does not necessarily have to be implemented immediately to be effective; it only needs to be in place before the threat arrives. Also, adaptation does not necessarily require global, coordinated action; many effective adaptation actions can be local. Some urban communities, because of land-use change and the urban heat-island effect, currently face changes similar to some expected under climate change, such as changes in water availability, heat-related morbidity, or changes in disease patterns. Concern over those impacts might motivate the implementation of measures that would also help in climate adaptation, despite skepticism among some policy makers about anthropogenic global warming. Studies of ancient civilizations in the southwestern US lend some insight into factors that may or may not be important to successful adaptation.

  20. A Comparison of Two Student Instructional Rating Forms Utilizing High-Inference Versus Moderate Inference Items.

    ERIC Educational Resources Information Center

    Wilson, Pamela W.

    Two types of items used in student evaluations of college teaching were compared: high-inference items, which require considerable inferring from what is seen or heard in the classroom to labelling of teacher behavior; and moderate-inference items, such as "teacher listens carefully." Two instruments were administered to random halves of…

  1. Forward and Backward Inference in Spatial Cognition

    PubMed Central

    Penny, Will D.; Zeidman, Peter; Burgess, Neil

    2013-01-01

    This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of ‘lower-level’ computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus. PMID:24348230
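
    The forward-inference step described above, combining a path-integration prediction with sensory input, can be illustrated with a one-step Bayes-filter update on a toy corridor of five locations; the transition and observation numbers are invented for the example.

```python
import numpy as np

n_states = 5
belief = np.full(n_states, 1.0 / n_states)       # prior belief over locations

# Path integration says "I moved one cell to the right", with some slippage.
T = np.zeros((n_states, n_states))
for s in range(n_states):
    T[s, min(s + 1, n_states - 1)] += 0.8        # intended move
    T[s, s] += 0.2                               # stayed put

likelihood = np.array([0.05, 0.05, 0.70, 0.15, 0.05])   # sensory evidence per location

predicted = belief @ T                           # forward prediction from path integration
posterior = predicted * likelihood               # combine with sensory input
posterior /= posterior.sum()
print(np.round(posterior, 3))
```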

  2. Application of Transformations in Parametric Inference

    ERIC Educational Resources Information Center

    Brownstein, Naomi; Pensky, Marianna

    2008-01-01

    The objective of the present paper is to provide a simple approach to statistical inference using the method of transformations of variables. We demonstrate performance of this powerful tool on examples of constructions of various estimation procedures, hypothesis testing, Bayes analysis and statistical inference for the stress-strength systems.…

  3. Scalar Inferences in Autism Spectrum Disorders

    ERIC Educational Resources Information Center

    Chevallier, Coralie; Wilson, Deirdre; Happe, Francesca; Noveck, Ira

    2010-01-01

    On being told "John or Mary will come", one might infer that "not both" of them will come. Yet the semantics of "or" is compatible with a situation where both John and Mary come. Inferences of this type, which enrich the semantics of "or" from an "inclusive" to an "exclusive" interpretation, have been extensively studied in linguistic pragmatics.…

  4. The Reasoning behind Informal Statistical Inference

    ERIC Educational Resources Information Center

    Makar, Katie; Bakker, Arthur; Ben-Zvi, Dani

    2011-01-01

    Informal statistical inference (ISI) has been a frequent focus of recent research in statistics education. Considering the role that context plays in developing ISI calls into question the need to be more explicit about the reasoning that underpins ISI. This paper uses educational literature on informal statistical inference and philosophical…

  5. Local and Global Thinking in Statistical Inference

    ERIC Educational Resources Information Center

    Pratt, Dave; Johnston-Wilder, Peter; Ainley, Janet; Mason, John

    2008-01-01

    In this reflective paper, we explore students' local and global thinking about informal statistical inference through our observations of 10- to 11-year-olds, challenged to infer the unknown configuration of a virtual die, but able to use the die to generate as much data as they felt necessary. We report how they tended to focus on local changes…

  6. Forward and backward inference in spatial cognition.

    PubMed

    Penny, Will D; Zeidman, Peter; Burgess, Neil

    2013-01-01

    This paper shows that the various computations underlying spatial cognition can be implemented using statistical inference in a single probabilistic model. Inference is implemented using a common set of 'lower-level' computations involving forward and backward inference over time. For example, to estimate where you are in a known environment, forward inference is used to optimally combine location estimates from path integration with those from sensory input. To decide which way to turn to reach a goal, forward inference is used to compute the likelihood of reaching that goal under each option. To work out which environment you are in, forward inference is used to compute the likelihood of sensory observations under the different hypotheses. For reaching sensory goals that require a chaining together of decisions, forward inference can be used to compute a state trajectory that will lead to that goal, and backward inference to refine the route and estimate control signals that produce the required trajectory. We propose that these computations are reflected in recent findings of pattern replay in the mammalian brain. Specifically, that theta sequences reflect decision making, theta flickering reflects model selection, and remote replay reflects route and motor planning. We also propose a mapping of the above computational processes onto lateral and medial entorhinal cortex and hippocampus.

  7. Inferring Learners' Knowledge from Their Actions

    ERIC Educational Resources Information Center

    Rafferty, Anna N.; LaMar, Michelle M.; Griffiths, Thomas L.

    2015-01-01

    Watching another person take actions to complete a goal and making inferences about that person's knowledge is a relatively natural task for people. This ability can be especially important in educational settings, where the inferences can be used for assessment, diagnosing misconceptions, and providing informative feedback. In this paper, we…

  8. Symbolic transfer entropy: inferring directionality in biosignals.

    PubMed

    Staniek, Matthäus; Lehnertz, Klaus

    2009-12-01

    Inferring directional interactions from biosignals is of crucial importance to improve understanding of dynamical interdependences underlying various physiological and pathophysiological conditions. We here present symbolic transfer entropy as a robust measure to infer the direction of interactions between multidimensional dynamical systems. We demonstrate its performance in quantifying driver-responder relationships in a network of coupled nonlinear oscillators and in the human epileptic brain.
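
    A simplified sketch of the measure described above: map each series to ordinal-pattern symbols and estimate a transfer entropy on the symbol sequences. The embedding dimension, test signals and plug-in probability estimates are illustrative choices, not the authors' exact estimator.

```python
import math
import random
from collections import Counter

def symbolize(x, m=3):
    """Map each length-m window of x to its ordinal pattern (rank order of values)."""
    return [tuple(sorted(range(m), key=lambda k: x[i + k])) for i in range(len(x) - m + 1)]

def symbolic_transfer_entropy(x, y, m=3):
    """Plug-in estimate of T(Y -> X) in bits, computed on ordinal symbols."""
    sx, sy = symbolize(x, m), symbolize(y, m)
    n = min(len(sx), len(sy)) - 1
    joint = Counter((sx[t + 1], sx[t], sy[t]) for t in range(n))
    pair_xx = Counter((sx[t + 1], sx[t]) for t in range(n))
    pair_xy = Counter((sx[t], sy[t]) for t in range(n))
    marg_x = Counter(sx[t] for t in range(n))
    te = 0.0
    for (x1, x0, y0), c in joint.items():
        p_joint = c / n
        p_x1_given_x0_y0 = c / pair_xy[(x0, y0)]
        p_x1_given_x0 = pair_xx[(x1, x0)] / marg_x[x0]
        te += p_joint * math.log2(p_x1_given_x0_y0 / p_x1_given_x0)
    return te

# Driver y, responder x (a delayed, noisy copy of y): T(Y->X) should exceed T(X->Y).
random.seed(1)
y = [random.gauss(0, 1) for _ in range(3000)]
x = [0.0, 0.0] + [y[t - 2] + 0.3 * random.gauss(0, 1) for t in range(2, 3000)]
print("T(Y->X) =", round(symbolic_transfer_entropy(x, y), 3))
print("T(X->Y) =", round(symbolic_transfer_entropy(y, x), 3))
```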

  9. Predictive Inferences are Represented as Hypothetical Facts

    ERIC Educational Resources Information Center

    Campion, Nicolas

    2004-01-01

    Three experiments examined the processing of predictive and deductive inferences elicited by narrative texts. In Experiment 1, lexical decision responses indicated that these inferences were activated during reading. In Experiment 2, sentences expressing that an event had ''maybe'' taken place were shown to be appropriate in verifying predictive…

  10. Causal Inferences during Text Comprehension and Production.

    ERIC Educational Resources Information Center

    Kemper, Susan

    As comprehension failure results whenever readers are unable to infer missing causal connections, recent comprehension research has focused both on assessing the inferential complexity of texts and on investigating students' developing ability to infer causal relationships. Studies have demonstrated that texts rely on four types of causal…

  11. Measuring the Inference Load of a Text.

    ERIC Educational Resources Information Center

    Kemper, Susan

    1983-01-01

    A new approach to measuring readability is proposed based on the analysis of texts as causally connected chains of actions, physical states, and mental states. Using the inference load formula reflecting the difficulty readers have in inferring causal connections, the difficulty of texts can be adjusted for readers differing in skill or knowledge.…

  12. Causal inference in economics and marketing.

    PubMed

    Varian, Hal R

    2016-07-05

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual, a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference.
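
    In the spirit of the passage above, the toy simulation below fits an outcome model on untreated units, predicts the counterfactual outcome for treated units, and takes the average difference as the effect estimate; the data-generating numbers (true effect 2.0) are invented.

```python
import numpy as np

rng = np.random.default_rng(42)
n, true_effect = 2000, 2.0
x = rng.normal(size=(n, 3))                          # observed covariates
treated = rng.random(n) < 0.5                        # randomized treatment for simplicity
y = x @ np.array([1.0, -0.5, 0.3]) + true_effect * treated + rng.normal(0.0, 0.5, n)

# Fit an ordinary-least-squares outcome model on the untreated units only.
Xc = np.column_stack([np.ones((~treated).sum()), x[~treated]])
beta, *_ = np.linalg.lstsq(Xc, y[~treated], rcond=None)

# Predict the counterfactual (untreated) outcome for the treated units.
Xt = np.column_stack([np.ones(treated.sum()), x[treated]])
y_counterfactual = Xt @ beta

att = np.mean(y[treated] - y_counterfactual)         # average effect on the treated
print(f"estimated effect = {att:.2f} (true {true_effect})")
```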

  13. The Impact of Disablers on Predictive Inference

    ERIC Educational Resources Information Center

    Cummins, Denise Dellarosa

    2014-01-01

    People consider alternative causes when deciding whether a cause is responsible for an effect (diagnostic inference) but appear to neglect them when deciding whether an effect will occur (predictive inference). Five experiments were conducted to test a 2-part explanation of this phenomenon: namely, (a) that people interpret standard predictive…

  14. Genetic Network Inference Using Hierarchical Structure

    PubMed Central

    Kimura, Shuhei; Tokuhisa, Masato; Okada-Hatakeyama, Mariko

    2016-01-01

    Many methods for inferring genetic networks have been proposed, but the regulations they infer often include false-positives. Several researchers have attempted to reduce these erroneous regulations by proposing the use of a priori knowledge about the properties of genetic networks such as their sparseness, scale-free structure, and so on. This study focuses on another piece of a priori knowledge, namely, that biochemical networks exhibit hierarchical structures. Based on this idea, we propose an inference approach that uses the hierarchical structure in a target genetic network. To obtain a reasonable hierarchical structure, the first step of the proposed approach is to infer multiple genetic networks from the observed gene expression data. We take this step using an existing method that combines a genetic network inference method with a bootstrap method. The next step is to extract a hierarchical structure from the inferred networks that is consistent with most of the networks. Third, we use the hierarchical structure obtained to assign confidence values to all candidate regulations. Numerical experiments are also performed to demonstrate the effectiveness of using the hierarchical structure in the genetic network inference. The improvement accomplished by the use of the hierarchical structure is small. However, the hierarchical structure could be used to improve the performances of many existing inference methods. PMID:26941653

  15. Saturn's ionosphere - Inferred electron densities

    NASA Astrophysics Data System (ADS)

    Kaiser, M. L.; Desch, M. D.; Connerney, J. E. P.

    1984-04-01

    During the two Voyager encounters with Saturn, radio bursts were detected which appear to have originated from atmospheric lightning storms. Although these bursts generally extended over frequencies from as low as 100 kHz to the upper detection limit of the instrument, 40 MHz, they often exhibited a sharp but variable low frequency cutoff below which bursts were not detected. We interpret the variable low-frequency extent of these bursts to be due to the reflection of the radio waves as they propagate through an ionosphere which varies with local time. We obtain estimates of electron densities at a variety of latitude and local time locations. These compare well with the dawn and dusk densities measured by the Pioneer 11 Voyager Radio Science investigations, and with model predictions for dayside densities. However, we infer a two-order-of-magnitude diurnal variation of electron density, which had not been anticipated by theoretical models of Saturn's ionosphere, and an equally dramatic extinction of ionospheric electron density by Saturn's rings. Previously announced in STAR as N84-17102
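
    The inference step described above can be made concrete with the standard plasma-frequency relation, f_p ~ 8.98 kHz x sqrt(n_e / cm^-3): a burst cut off below f_p has passed through plasma at least that dense. The cutoff values below are arbitrary examples, not the Voyager measurements.

```python
def electron_density_cm3(cutoff_hz):
    """Electron density (cm^-3) implied by a low-frequency cutoff at the plasma frequency."""
    return (cutoff_hz / 8980.0) ** 2

for cutoff_khz in (100.0, 1000.0):
    n_e = electron_density_cm3(cutoff_khz * 1e3)
    print(f"cutoff {cutoff_khz:6.0f} kHz  ->  n_e ~ {n_e:.1e} cm^-3")
# A factor-of-10 shift in cutoff frequency corresponds to a factor-of-100 change in density.
```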

  16. Active Inference: A Process Theory.

    PubMed

    Friston, Karl; FitzGerald, Thomas; Rigoli, Francesco; Schwartenbeck, Philipp; Pezzulo, Giovanni

    2017-01-01

    This article describes a process theory based on active inference and belief propagation. Starting from the premise that all neuronal processing (and action selection) can be explained by maximizing Bayesian model evidence, or minimizing variational free energy, we ask whether neuronal responses can be described as a gradient descent on variational free energy. Using a standard (Markov decision process) generative model, we derive the neuronal dynamics implicit in this description and reproduce a remarkable range of well-characterized neuronal phenomena. These include repetition suppression, mismatch negativity, violation responses, place-cell activity, phase precession, theta sequences, theta-gamma coupling, evidence accumulation, race-to-bound dynamics, and transfer of dopamine responses. Furthermore, the (approximately Bayes' optimal) behavior prescribed by these dynamics has a degree of face validity, providing a formal explanation for reward seeking, context learning, and epistemic foraging. Technically, the fact that a gradient descent appears to be a valid description of neuronal activity means that variational free energy is a Lyapunov function for neuronal dynamics, which therefore conform to Hamilton's principle of least action.

  17. Saturn's ionosphere: Inferred electron densities

    NASA Technical Reports Server (NTRS)

    Kaiser, M. L.; Desch, M. D.; Connerney, J. E. P.

    1983-01-01

    During the two Voyager encounters with Saturn, radio bursts were detected which appear to have originated from atmospheric lightning storms. Although these bursts generally extended over frequencies from as low as 100 kHz to the upper detection limit of the instrument, 40 MHz, they often exhibited a sharp but variable low frequency cutoff below which bursts were not detected. We interpret the variable low-frequency extent of these bursts to be due to the reflection of the radio waves as they propagate through an ionosphere which varies with local time. We obtain estimates of electron densities at a variety of latitude and local time locations. These compare well with the dawn and dusk densities measured by the Pioneer 11 Voyager Radio Science investigations, and with model predictions for dayside densities. However, we infer a two-order-of-magnitude diurnal variation of electron density, which had not been anticipated by theoretical models of Saturn's ionosphere, and an equally dramatic extinction of ionospheric electron density by Saturn's rings.

  18. Redshift data and statistical inference

    NASA Technical Reports Server (NTRS)

    Newman, William I.; Haynes, Martha P.; Terzian, Yervant

    1994-01-01

    Frequency histograms and the 'power spectrum analysis' (PSA) method, the latter developed by Yu & Peebles (1969), have been widely employed as techniques for establishing the existence of periodicities. We provide a formal analysis of these two classes of methods, including controlled numerical experiments, to better understand their proper use and application. In particular, we note that typical published applications of frequency histograms commonly employ far greater numbers of class intervals or bins than is advisable by statistical theory, sometimes giving rise to the appearance of spurious patterns. The PSA method generates a sequence of random numbers from observational data which, it is claimed, is exponentially distributed with unit mean and variance, essentially independent of the distribution of the original data. We show that the derived random process is nonstationary and produces a small but systematic bias in the usual estimate of the mean and variance. Although the derived variable may be reasonably described by an exponential distribution, the tail of the distribution is far removed from that of an exponential, thereby rendering statistical inference and confidence testing based on the tail of the distribution completely unreliable. Finally, we examine a number of astronomical examples wherein these methods have been used, giving rise to widespread acceptance of statistically unconfirmed conclusions.

  19. Causal Inference in Public Health

    PubMed Central

    Glass, Thomas A.; Goodman, Steven N.; Hernán, Miguel A.; Samet, Jonathan M.

    2014-01-01

    Causal inference has a central role in public health; the determination that an association is causal indicates the possibility for intervention. We review and comment on the long-used guidelines for interpreting evidence as supporting a causal association and contrast them with the potential outcomes framework that encourages thinking in terms of causes that are interventions. We argue that in public health this framework is more suitable, providing an estimate of an action’s consequences rather than the less precise notion of a risk factor’s causal effect. A variety of modern statistical methods adopt this approach. When an intervention cannot be specified, causal relations can still exist, but how to intervene to change the outcome will be unclear. In application, the often-complex structure of causal processes needs to be acknowledged and appropriate data collected to study them. These newer approaches need to be brought to bear on the increasingly complex public health challenges of our globalized world. PMID:23297653

  20. Adaptations and Access to Assessment of Common Core Content

    ERIC Educational Resources Information Center

    Kettler, Ryan J.

    2015-01-01

    This chapter introduces theory that undergirds the role of testing adaptations in assessment, provides examples of item modifications and testing accommodations, reviews research relevant to each, and introduces a new paradigm that incorporates opportunity to learn (OTL), academic enablers, testing adaptations, and inferences that can be made from…

  1. Three Roads Diverged? Routes To Phylogeographic Inference

    PubMed Central

    Bloomquist, Erik W.; Lemey, Philippe

    2010-01-01

    Phylogeographic methods enable inference of the geographical history of genetic lineages. Recent examples successfully explore the patterns of human migration and the origins and spread of viral pandemics. Nevertheless, longstanding disagreement exists over the use and validity of certain phylogeographic inference methodologies. In this paper, we highlight three distinct frameworks for phylogeographic inference to give a taste of this disagreement. Each of the three approaches presents a different viewpoint on phylogeography, most fundamentally how we view the relationship between the inferred history of the sample and the history of the population the sample is embedded in. Satisfactory resolution of this relationship between history of the tree and history of the population remains a challenge for all but the most trivial models of phylogeographic processes. Intriguingly, we believe that some recent methods that entirely side-step inference about the history of the population will eventually help the field toward this goal. PMID:20863591

  2. Inference-based constraint satisfaction supports explanation

    SciTech Connect

    Sqalli, M.H.; Freuder, E.C.

    1996-12-31

    Constraint satisfaction problems are typically solved using search, augmented by general purpose consistency inference methods. This paper proposes a paradigm shift in which inference is used as the primary problem solving method, and attention is focused on special purpose, domain specific inference methods. While we expect this approach to have computational advantages, we emphasize here the advantages of a solution method that is more congenial to human thought processes. Specifically we use inference-based constraint satisfaction to support explanations of the problem solving behavior that are considerably more meaningful than a trace of a search process would be. Logic puzzles are used as a case study. Inference-based constraint satisfaction proves surprisingly powerful and easily extensible in this domain. Problems drawn from commercial logic puzzle booklets are used for evaluation. Explanations are produced that compare well with the explanations provided by these booklets.

  3. Efficient fuzzy Bayesian inference algorithms for incorporating expert knowledge in parameter estimation

    NASA Astrophysics Data System (ADS)

    Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad

    2016-05-01

    Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision essentially embedded in expert provided information. In order to solve this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference' which is the result of integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features which make it an attractive approach for incorporating expert knowledge in the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert provided information, (2) it allows one to model both uncertainty and imprecision distinguishably, and (3) it presents a framework for fusing expert provided information regarding the various inputs of the Bayesian inference algorithm. However, an important obstacle in employing fuzzy Bayesian inference in groundwater numerical modeling applications is the computational burden, as the required number of numerical model simulations often becomes prohibitively large and computationally infeasible. In this paper, a novel approach of accelerating the fuzzy Bayesian inference algorithm is proposed which is based on using approximate posterior distributions derived from surrogate modeling, as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. Then the proposed approach is applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf. An expert

  4. Causal inference in obesity research.

    PubMed

    Franks, P W; Atabaki-Pasdar, N

    2017-03-01

    Obesity is a risk factor for a plethora of severe morbidities and premature death. Most supporting evidence comes from observational studies that are prone to chance, bias and confounding. Even data on the protective effects of weight loss from randomized controlled trials will be susceptible to confounding and bias if treatment assignment cannot be masked, which is usually the case with lifestyle and surgical interventions. Thus, whilst obesity is widely considered the major modifiable risk factor for many chronic diseases, its causes and consequences are often difficult to determine. Addressing this is important, as the prevention and treatment of any disease requires that interventions focus on causal risk factors. Disease prediction, although not dependent on knowing the causes, is nevertheless enhanced by such knowledge. Here, we provide an overview of some of the barriers to causal inference in obesity research and discuss analytical approaches, such as Mendelian randomization, that can help to overcome these obstacles. In a systematic review of the literature in this field, we found: (i) probable causal relationships between adiposity and bone health/disease, cancers (colorectal, lung and kidney cancers), cardiometabolic traits (blood pressure, fasting insulin, inflammatory markers and lipids), uric acid concentrations, coronary heart disease and venous thrombosis (in the presence of pulmonary embolism), (ii) possible causal relationships between adiposity and gray matter volume, depression and common mental disorders, oesophageal cancer, macroalbuminuria, end-stage renal disease, diabetic kidney disease, nuclear cataract and gall stone disease, and (iii) no evidence for causal relationships between adiposity and Alzheimer's disease, pancreatic cancer, venous thrombosis (in the absence of pulmonary embolism), liver function and periodontitis.
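
    As a toy illustration of the Mendelian randomization approach mentioned above, the simulation below recovers a causal effect with the single-instrument Wald ratio even though a confounder distorts the naive association; all numbers are simulated, not study estimates.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 50_000
g = rng.binomial(2, 0.3, n).astype(float)           # genotype, the instrument
u = rng.normal(size=n)                              # unobserved confounder
adiposity = 0.3 * g + u + rng.normal(size=n)        # exposure
outcome = 0.5 * adiposity + u + rng.normal(size=n)  # true causal effect = 0.5

beta_gx = np.polyfit(g, adiposity, 1)[0]            # gene -> exposure slope
beta_gy = np.polyfit(g, outcome, 1)[0]              # gene -> outcome slope
naive = np.polyfit(adiposity, outcome, 1)[0]        # confounded observational slope

print(f"naive (confounded) slope : {naive:.2f}")
print(f"Wald ratio MR estimate   : {beta_gy / beta_gx:.2f} (true 0.5)")
```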

  5. Inferring Mantle From Basalt Composition

    NASA Astrophysics Data System (ADS)

    Stracke, A.

    2014-12-01

    Isotope ratios in oceanic basalts, first reported by Gast and co-workers 50 years ago, are unique tracers of mantle composition, because they are expected to mirror the composition of their mantle sources. While the latter is certainly true for homogeneous sources, the plethora of studies over the last 50 years have shown that mantle sources are isotopically heterogeneous on different length scales. Isotopic differences exist between basalts from different ocean basins, volcanoes of individual ocean islands, lava flows of a single volcano, and even in μm sized melt inclusions in a single mineral grain. Diffusion, which acts to homogenize isotopic heterogeneity over Gyr timescales, limits the length scale of isotopic heterogeneity in the mantle to anywhere between several mm and 10s of meters. Melting regions, however, are typically several 100 km wide and up to 100 km deep. The scale of melting is thus generally orders of magnitude larger than the scale of isotopic heterogeneity. How partial melts mix during melting, melt transport, and melt storage then inevitably influences how isotopic heterogeneity is conveyed from source to melt. The isotopic composition of oceanic basalts hence provides an integrated signal of isotopically diverse melts. Recent mixing models and observed isotopic differences between source (abyssal peridotites) and melts (MORB) show that the range of isotopic heterogeneity of erupted melts need NOT directly reflect that of their source(s), nor need observed isotopic endmembers in source and melts be congruent. Many geochemical models, however, implicitly assume equivalence of source and melt composition. In particular, inferring spatial patterns of isotopic heterogeneity in the mantle from those observed in erupted melts, or linking isotopic diversity to geophysical structures in the mantle, requires a more profound understanding of the extent to which erupted melts represent the isotopic composition of their mantle sources.

  6. Toothbrush Adaptations.

    ERIC Educational Resources Information Center

    Exceptional Parent, 1987

    1987-01-01

    Suggestions are presented for helping disabled individuals learn to use or adapt toothbrushes for proper dental care. A directory lists dental health instructional materials available from various organizations. (CB)

  7. Inferring genetic networks from microarray data.

    SciTech Connect

    May, Elebeoba Eni; Davidson, George S.; Martin, Shawn Bryan; Werner-Washburne, Margaret C.; Faulon, Jean-Loup Michel

    2004-06-01

    In theory, it should be possible to infer realistic genetic networks from time series microarray data. In practice, however, network discovery has proved problematic. The three major challenges are: (1) inferring the network; (2) estimating the stability of the inferred network; and (3) making the network visually accessible to the user. Here we describe a method, tested on publicly available time series microarray data, which addresses these concerns. The inference of genetic networks from genome-wide experimental data is an important biological problem which has received much attention. Approaches to this problem have typically included application of clustering algorithms [6]; the use of Boolean networks [12, 1, 10]; the use of Bayesian networks [8, 11]; and the use of continuous models [21, 14, 19]. Overviews of the problem and general approaches to network inference can be found in [4, 3]. Our approach to network inference is similar to earlier methods in that we use both clustering and Boolean network inference. However, we have attempted to extend the process to better serve the end-user, the biologist. In particular, we have incorporated a system to assess the reliability of our network, and we have developed tools which allow interactive visualization of the proposed network.

  8. Statistical Physics of High Dimensional Inference

    NASA Astrophysics Data System (ADS)

    Advani, Madhu; Ganguli, Surya

    To model modern large-scale datasets, we need efficient algorithms to infer a set of P unknown model parameters from N noisy measurements. What are fundamental limits on the accuracy of parameter inference, given limited measurements, signal-to-noise ratios, prior information, and computational tractability requirements? How can we combine prior information with measurements to achieve these limits? Classical statistics gives incisive answers to these questions as the measurement density α =N/P --> ∞ . However, modern high-dimensional inference problems, in fields ranging from bio-informatics to economics, occur at finite α. We formulate and analyze high-dimensional inference analytically by applying the replica and cavity methods of statistical physics where data serves as quenched disorder and inferred parameters play the role of thermal degrees of freedom. Our analysis reveals that widely cherished Bayesian inference algorithms such as maximum likelihood and maximum a posteriori are suboptimal in the modern setting, and yields new tractable, optimal algorithms to replace them as well as novel bounds on the achievable accuracy of a large class of high-dimensional inference algorithms. Thanks to Stanford Graduate Fellowship and Mind Brain Computation IGERT grant for support.

  9. On Bayesian Inductive Inference & Predictive Estimation

    NASA Technical Reports Server (NTRS)

    Cheeseman, Peter; Stutz, John; Smelyanskiy, Vadim

    2004-01-01

    We investigate Bayesian inference and the Principle of Maximum Entropy (PME) as methods for doing inference under uncertainty. This investigation is primarily through concrete examples that have been previously investigated in the literature. We find that it is possible to do Bayesian inference and PME inference using the same information, despite claims to the contrary, but that the results are not directly comparable. This is because Bayesian inference yields a probability density function (pdf) over the unknown model parameters, whereas PME yields point estimates. If mean estimates are extracted from the Bayesian pdfs, the resulting parameter estimates can differ radically from the PME values and also from the Maximum Likelihood values. We conclude that these differences are due to the Bayesian inference not assuming anything beyond the given prior probabilities and the data, whereas PME implicitly assumes that the given constraints are the only constraints that are operating. Since this assumption can be wrong, PME values may have to be revised when subsequent data shows evidence for more constraints. The entropy concentration previously "proved" by E. T. Jaynes is shown to be in error. Further, we show that PME is a generalized form of independence assumption, and so can be a very powerful method of inference when the variables being investigated are largely independent of each other.

  10. Linguistic Markers of Inference Generation While Reading.

    PubMed

    Clinton, Virginia; Carlson, Sarah E; Seipel, Ben

    2016-06-01

    Words can be informative linguistic markers of psychological constructs. The purpose of this study is to examine associations between word use and the process of making meaningful connections to a text while reading (i.e., inference generation). To achieve this purpose, think-aloud data from third-fifth grade students ([Formula: see text]) reading narrative texts were hand-coded for inferences. These data were also processed with a computer text analysis tool, Linguistic Inquiry and Word Count, for percentages of word use in the following categories: cognitive mechanism words, nonfluencies, and nine types of function words. Findings indicate that cognitive mechanisms were an independent, positive predictor of connections to background knowledge (i.e., elaborative inference generation) and nonfluencies were an independent, negative predictor of connections within the text (i.e., bridging inference generation). Function words did not provide unique variance towards predicting inference generation. These findings are discussed in the context of a cognitive reflection model and the differences between bridging and elaborative inference generation. In addition, potential practical implications for intelligent tutoring systems and computer-based methods of inference identification are presented.

  11. Visual representation of statistical information improves diagnostic inferences in doctors and their patients.

    PubMed

    Garcia-Retamero, Rocio; Hoffrage, Ulrich

    2013-04-01

    Doctors and patients have difficulty inferring the predictive value of a medical test from information about the prevalence of a disease and the sensitivity and false-positive rate of the test. Previous research has established that communicating such information in a format the human mind is adapted to, namely natural frequencies, as compared to probabilities, boosts accuracy of diagnostic inferences. In a study, we investigated to what extent these inferences can be improved, beyond the effect of natural frequencies, by providing visual aids. Participants were 81 doctors and 81 patients who made diagnostic inferences about three medical tests on the basis of information about prevalence of a disease, and the sensitivity and false-positive rate of the tests. Half of the participants received the information in natural frequencies, while the other half received the information in probabilities. Half of the participants only received numerical information, while the other half additionally received a visual aid representing the numerical information. In addition, participants completed a numeracy scale. Our study showed three important findings: (1) doctors and patients made more accurate inferences when information was communicated in natural frequencies as compared to probabilities; (2) visual aids boosted accuracy even when the information was provided in natural frequencies; and (3) doctors were more accurate in their diagnostic inferences than patients, though differences in accuracy disappeared when differences in numerical skills were controlled for. Our findings have important implications for medical practice as they suggest suitable ways to communicate quantitative medical data.
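
    To make the natural-frequency format concrete, here is a small worked example of the kind of inference studied above; the prevalence, sensitivity and false-positive rate are illustrative values, not the ones used with the participants.

```python
population = 1000
prevalence, sensitivity, false_positive_rate = 0.01, 0.80, 0.096

sick = population * prevalence                                 # 10 of 1000 people have the disease
true_positives = sick * sensitivity                            # 8 of those 10 test positive
false_positives = (population - sick) * false_positive_rate    # ~95 of the 990 healthy also test positive

ppv = true_positives / (true_positives + false_positives)
print(f"{true_positives:.0f} true positives vs {false_positives:.0f} false positives")
print(f"P(disease | positive test) = {ppv:.2f}")               # about 0.08
```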

  12. Automated interpretation of LIBS spectra using a fuzzy logic inference engine.

    PubMed

    Hatch, Jeremy J; McJunkin, Timothy R; Hanson, Cynthia; Scott, Jill R

    2012-03-01

    Automated interpretation of laser-induced breakdown spectroscopy (LIBS) data is necessary due to the plethora of spectra that can be acquired in a relatively short time. However, traditional chemometric and artificial neural network methods that have been employed are not always transparent to a skilled user. A fuzzy logic approach to data interpretation has now been adapted to LIBS spectral interpretation. Fuzzy logic inference rules were developed using methodology that includes data mining methods and operator expertise to differentiate between various copper-containing and stainless steel alloys as well as unknowns. Results using the fuzzy logic inference engine indicate a high degree of confidence in spectral assignment.
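
    A minimal fuzzy-rule sketch in the spirit of the engine described above; the line-intensity ratios, membership breakpoints and rules are invented for illustration and are not taken from the paper.

```python
def high(x, lo=0.4, hi=0.8):
    """Membership in the fuzzy set 'high', rising linearly from lo to hi."""
    return min(1.0, max(0.0, (x - lo) / (hi - lo)))

def low(x, lo=0.4, hi=0.8):
    return 1.0 - high(x, lo, hi)

# Hypothetical normalized emission-line intensity ratios from one LIBS spectrum.
cu_ratio, fe_ratio, cr_ratio = 0.75, 0.20, 0.15

rules = {
    # Rule firing strength = min of the antecedent memberships (Mamdani-style AND).
    "copper alloy":    min(high(cu_ratio), low(fe_ratio)),
    "stainless steel": min(high(fe_ratio), high(cr_ratio)),
}
label = max(rules, key=rules.get)
print(rules)
print(label if rules[label] > 0.5 else "unknown")
```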

  13. Inference and the introductory statistics course

    NASA Astrophysics Data System (ADS)

    Pfannkuch, Maxine; Regan, Matt; Wild, Chris; Budgett, Stephanie; Forbes, Sharleen; Harraway, John; Parsonage, Ross

    2011-10-01

    This article sets out some of the rationale and arguments for making major changes to the teaching and learning of statistical inference in introductory courses at our universities by changing from a norm-based, mathematical approach to more conceptually accessible computer-based approaches. The core problem of the inferential argument with its hypothetical probabilistic reasoning process is examined in some depth. We argue that the revolution in the teaching of inference must begin. We also discuss some perplexing issues, problematic areas and some new insights into language conundrums associated with introducing the logic of inference through randomization methods.
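
    The randomization-based reasoning advocated above can be demonstrated with a short permutation test on two invented groups of scores, in place of a normal-theory test.

```python
import random

random.seed(0)
group_a = [23, 25, 28, 31, 22, 27]      # invented scores
group_b = [19, 21, 24, 20, 22, 18]
observed = sum(group_a) / len(group_a) - sum(group_b) / len(group_b)

pooled = group_a + group_b
n_a, reps, at_least_as_extreme = len(group_a), 10_000, 0
for _ in range(reps):
    random.shuffle(pooled)                               # re-randomize the group labels
    diff = sum(pooled[:n_a]) / n_a - sum(pooled[n_a:]) / (len(pooled) - n_a)
    at_least_as_extreme += diff >= observed
print(f"observed difference = {observed:.2f}")
print(f"one-sided permutation p-value ~ {at_least_as_extreme / reps:.4f}")
```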

  14. Degradation monitoring using probabilistic inference

    NASA Astrophysics Data System (ADS)

    Alpay, Bulent

    In order to increase safety and improve economy and performance in a nuclear power plant (NPP), the source and extent of component degradations should be identified before failures and breakdowns occur. It is also crucial for the next generation of NPPs, which are designed to have a long core life and high fuel burnup to have a degradation monitoring system in order to keep the reactor in a safe state, to meet the designed reactor core lifetime and to optimize the scheduled maintenance. Model-based methods are based on determining the inconsistencies between the actual and expected behavior of the plant, and use these inconsistencies for detection and diagnostics of degradations. By defining degradation as a random abrupt change from the nominal to a constant degraded state of a component, we employed nonlinear filtering techniques based on state/parameter estimation. We utilized a Bayesian recursive estimation formulation in the sequential probabilistic inference framework and constructed a hidden Markov model to represent a general physical system. By addressing the problem of a filter's inability to estimate an abrupt change, which is called the oblivious filter problem in nonlinear extensions of Kalman filtering, and the sample impoverishment problem in particle filtering, we developed techniques to modify filtering algorithms by utilizing additional data sources to improve the filter's response to this problem. We utilized a reliability degradation database that can be constructed from plant specific operational experience and test and maintenance reports to generate proposal densities for probable degradation modes. These are used in a multiple hypothesis testing algorithm. We then test samples drawn from these proposal densities with the particle filtering estimates based on the Bayesian recursive estimation formulation with the Metropolis Hastings algorithm, which is a well-known Markov chain Monte Carlo method (MCMC). This multiple hypothesis testing

  15. An inference engine for embedded diagnostic systems

    NASA Technical Reports Server (NTRS)

    Fox, Barry R.; Brewster, Larry T.

    1987-01-01

    The implementation of an inference engine for embedded diagnostic systems is described. The system consists of two distinct parts. The first is an off-line compiler which accepts a propositional logical statement of the relationship between facts and conclusions and produces data structures required by the on-line inference engine. The second part consists of the inference engine and interface routines which accept assertions of fact and return the conclusions which necessarily follow. Given a set of assertions, it will generate exactly the conclusions which logically follow. At the same time, it will detect any inconsistencies which may propagate from an inconsistent set of assertions or a poorly formulated set of rules. The memory requirements are fixed and the worst case execution times are bounded at compile time. The data structures and inference algorithms are very simple and well understood. The data structures and algorithms are described in detail. The system has been implemented in Lisp, Pascal, and Modula-2.
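
    A toy forward-chaining engine over propositional (Horn-style) rules gives the flavor of the behaviour described above, generating exactly the conclusions that follow and flagging inconsistent assertions; the compiler stage and fixed-memory guarantees are omitted, and the rules are invented.

```python
# rules: (set of antecedent facts) -> conclusion
rules = [
    ({"low_pressure", "pump_on"}, "leak_suspected"),
    ({"leak_suspected"}, "isolate_line"),
    ({"pump_off"}, "no_flow"),
]
contradictions = [{"pump_on", "pump_off"}]       # mutually exclusive assertions

def infer(assertions):
    facts = set(assertions)
    changed = True
    while changed:                               # iterate to a fixed point
        changed = False
        for antecedents, conclusion in rules:
            if antecedents <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    inconsistent = any(bad <= facts for bad in contradictions)
    return facts, inconsistent

facts, inconsistent = infer({"low_pressure", "pump_on"})
print(sorted(facts), "inconsistent:", inconsistent)
# ['isolate_line', 'leak_suspected', 'low_pressure', 'pump_on'] inconsistent: False
```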

  16. Experimental evidence for circular inference in schizophrenia

    NASA Astrophysics Data System (ADS)

    Jardri, Renaud; Duverne, Sandrine; Litvinova, Alexandra S.; Denève, Sophie

    2017-01-01

    Schizophrenia (SCZ) is a complex mental disorder that may result in some combination of hallucinations, delusions and disorganized thinking. Here SCZ patients and healthy controls (CTLs) report their level of confidence on a forced-choice task that manipulated the strength of sensory evidence and prior information. Neither group's responses can be explained by simple Bayesian inference. Rather, individual responses are best captured by a model with different degrees of circular inference. Circular inference refers to a corruption of sensory data by prior information and vice versa, leading us to 'see what we expect' (through descending loops), to 'expect what we see' (through ascending loops) or both. Ascending loops are stronger for SCZ than CTLs and correlate with the severity of positive symptoms. Descending loops correlate with the severity of negative symptoms. Both loops correlate with disorganized symptoms. The findings suggest that circular inference might mediate the clinical manifestations of SCZ.

  17. Experimental evidence for circular inference in schizophrenia

    PubMed Central

    Jardri, Renaud; Duverne, Sandrine; Litvinova, Alexandra S; Denève, Sophie

    2017-01-01

    Schizophrenia (SCZ) is a complex mental disorder that may result in some combination of hallucinations, delusions and disorganized thinking. Here SCZ patients and healthy controls (CTLs) report their level of confidence on a forced-choice task that manipulated the strength of sensory evidence and prior information. Neither group's responses can be explained by simple Bayesian inference. Rather, individual responses are best captured by a model with different degrees of circular inference. Circular inference refers to a corruption of sensory data by prior information and vice versa, leading us to ‘see what we expect' (through descending loops), to ‘expect what we see' (through ascending loops) or both. Ascending loops are stronger for SCZ than CTLs and correlate with the severity of positive symptoms. Descending loops correlate with the severity of negative symptoms. Both loops correlate with disorganized symptoms. The findings suggest that circular inference might mediate the clinical manifestations of SCZ. PMID:28139642

  18. Adaptation of Instructional Materials: A Commentary on the Research on Adaptations of "Who Polluted the Potomac"

    ERIC Educational Resources Information Center

    Ercikan, Kadriye; Alper, Naim

    2009-01-01

    This commentary first summarizes and discusses the analysis of the two translation processes described in the Oliveira, Colak, and Akerson article and the inferences these researchers make based on their research. In the second part of the commentary, we describe procedures and criteria used in adapting tests into different languages and how they…

  19. Causal inference in economics and marketing

    PubMed Central

    Varian, Hal R.

    2016-01-01

    This is an elementary introduction to causal inference in economics written for readers familiar with machine learning methods. The critical step in any causal analysis is estimating the counterfactual—a prediction of what would have happened in the absence of the treatment. The powerful techniques used in machine learning may be useful for developing better estimates of the counterfactual, potentially improving causal inference. PMID:27382144

  20. Operation of the Bayes Inference Engine

    SciTech Connect

    Hanson, K.M.; Cunningham, G.S.

    1998-07-27

    The authors have developed a computer application, called the Bayes Inference Engine, to enable one to make inferences about models of a physical object from radiographs taken of it. In the BIE, calculational models are represented by a data-flow diagram that can be manipulated by the analyst in a graphical-programming environment. The authors demonstrate the operation of the BIE in terms of examples of two-dimensional tomographic reconstruction, including uncertainty estimation.

  1. Inferring ethnicity from mitochondrial DNA sequence

    PubMed Central

    2011-01-01

    Background: The assignment of DNA samples to coarse population groups can be a useful but difficult task. One such example is the inference of coarse ethnic groupings for forensic applications. Ethnicity plays an important role in forensic investigation and can be inferred with the help of genetic markers. Being maternally inherited, present in high copy number, and robustly persistent in degraded samples, mitochondrial DNA may be useful for inferring coarse ethnicity. In this study, we compare the performance of methods for inferring ethnicity from the sequence of the hypervariable region of the mitochondrial genome. Results: We present the results of comprehensive experiments conducted on datasets extracted from the mtDNA population database, showing that ethnicity inference based on support vector machines (SVM) achieves an overall accuracy of 80-90%, consistently outperforming nearest neighbor and discriminant analysis methods previously proposed in the literature. We also evaluate methods of handling missing data and characterize the most informative segments of the hypervariable region of the mitochondrial genome. Conclusions: Support vector machines can be used to infer coarse ethnicity from a small region of mitochondrial DNA sequence with surprisingly high accuracy. In the presence of missing data, utilizing only the regions common to the training sequences and a test sequence proves to be the best strategy. Given these results, SVM algorithms are likely to also be useful in other DNA sequence classification applications. PMID:21554759
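
    A minimal sketch of the SVM approach, using synthetic stand-in sequences rather than the mtDNA population database: hypervariable-region strings are encoded as k-mer counts and classified with a linear-kernel SVM in scikit-learn. The base-composition model and the 3-mer features are assumptions made for the example.

        # Sketch only: synthetic sequences stand in for the mtDNA hypervariable region.
        import numpy as np
        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.model_selection import cross_val_score
        from sklearn.svm import SVC

        rng = np.random.default_rng(1)
        BASES = np.array(list("ACGT"))

        def fake_sequence(group, length=300):
            # give each "population group" a slightly different base composition
            probs = {0: [0.4, 0.2, 0.2, 0.2], 1: [0.2, 0.4, 0.2, 0.2]}[group]
            return "".join(rng.choice(BASES, size=length, p=probs))

        seqs = [fake_sequence(g) for g in (0, 1) for _ in range(100)]
        labels = [g for g in (0, 1) for _ in range(100)]

        # 3-mer count features as a crude stand-in for encoded hypervariable sites
        vectorizer = CountVectorizer(analyzer="char", ngram_range=(3, 3))
        X = vectorizer.fit_transform(seqs)

        clf = SVC(kernel="linear", C=1.0)
        print("cross-validated accuracy:", cross_val_score(clf, X, labels, cv=5).mean())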

  2. Adaptive management

    USGS Publications Warehouse

    Allen, Craig R.; Garmestani, Ahjond S.

    2015-01-01

    Adaptive management is an approach to natural resource management that emphasizes learning through management where knowledge is incomplete, and when, despite inherent uncertainty, managers and policymakers must act. Unlike a traditional trial and error approach, adaptive management has explicit structure, including a careful elucidation of goals, identification of alternative management objectives and hypotheses of causation, and procedures for the collection of data followed by evaluation and reiteration. The process is iterative, and serves to reduce uncertainty, build knowledge and improve management over time in a goal-oriented and structured process.

  3. Adaptive evolution of molecular phenotypes

    NASA Astrophysics Data System (ADS)

    Held, Torsten; Nourmohammad, Armita; Lässig, Michael

    2014-09-01

    Molecular phenotypes link genomic information with organismic functions, fitness, and evolution. Quantitative traits are complex phenotypes that depend on multiple genomic loci. In this paper, we study the adaptive evolution of a quantitative trait under time-dependent selection, which arises from environmental changes or through fitness interactions with other co-evolving phenotypes. We analyze a model of trait evolution under mutations and genetic drift in a single-peak fitness seascape. The fitness peak performs a constrained random walk in the trait amplitude, which determines the time-dependent trait optimum in a given population. We derive analytical expressions for the distribution of the time-dependent trait divergence between populations and of the trait diversity within populations. Based on this solution, we develop a method to infer adaptive evolution of quantitative traits. Specifically, we show that the ratio of the average trait divergence and the diversity is a universal function of evolutionary time, which predicts the stabilizing strength and the driving rate of the fitness seascape. From an information-theoretic point of view, this function measures the macro-evolutionary entropy in a population ensemble, which determines the predictability of the evolutionary process. Our solution also quantifies two key characteristics of adapting populations: the cumulative fitness flux, which measures the total amount of adaptation, and the adaptive load, which is the fitness cost due to a population's lag behind the fitness peak.

  4. Bayesian inference on the sphere beyond statistical isotropy

    SciTech Connect

    Das, Santanu; Souradeep, Tarun; Wandelt, Benjamin D. E-mail: wandelt@iap.fr

    2015-10-01

    We present a general method for Bayesian inference of the underlying covariance structure of random fields on a sphere. We employ the Bipolar Spherical Harmonic (BipoSH) representation of general covariance structure on the sphere. We illustrate the efficacy of the method as a principled approach to assess violation of statistical isotropy (SI) in the sky maps of Cosmic Microwave Background (CMB) fluctuations. SI violation in observed CMB maps arises due to known physical effects such as Doppler boost and weak lensing; as yet unknown theoretical possibilities like cosmic topology and subtle violations of the cosmological principle; as well as expected observational artefacts of scanning the sky with a non-circular beam, masking, foreground residuals, anisotropic noise, etc. We explicitly demonstrate the recovery of the input SI violation signals with their full statistics in simulated CMB maps. Our formalism easily adapts to exploring parametric physical models with non-SI covariance, as we illustrate for the inference of the parameters of a Doppler boosted sky map. Our approach promises to provide a robust quantitative evaluation of the evidence for SI-violation-related anomalies in the CMB sky by estimating the BipoSH spectra along with their complete posterior.

  5. HLA Type Inference via Haplotypes Identical by Descent

    NASA Astrophysics Data System (ADS)

    Setty, Manu N.; Gusev, Alexander; Pe'Er, Itsik

    The Human Leukocyte Antigen (HLA) genes play a major role in adaptive immune response and are used to differentiate self antigens from non-self ones. HLA genes are hypervariable, with nearly every locus harboring over a dozen alleles. This variation plays an important role in susceptibility to multiple autoimmune diseases and needs to be matched on for organ transplantation. Unfortunately, HLA typing by serological methods is time-consuming and expensive compared to high-throughput Single Nucleotide Polymorphism (SNP) data. We present a new computational method to infer per-locus HLA types using shared segments Identical By Descent (IBD), inferred from SNP genotype data. IBD information is modeled as a graph where shared haplotypes are explored among clusters of individuals with known and unknown HLA types to identify the latter. We analyze the performance of the method in a previously typed subset of the HapMap population, achieving accuracy of 96% in HLA-A, 94% in HLA-B, 95% in HLA-C, 77% in HLA-DR1, 93% in HLA-DQA1 and 90% in HLA-DQB1 genes. We compare our method to a tag-SNP-based approach and demonstrate higher sensitivity and specificity. Our method demonstrates the power of using shared haplotype segments for large-scale imputation at the HLA locus.

  6. Palaeotemperature trend for Precambrian life inferred from resurrected proteins.

    PubMed

    Gaucher, Eric A; Govindarajan, Sridhar; Ganesh, Omjoy K

    2008-02-07

    Biosignatures and structures in the geological record indicate that microbial life has inhabited Earth for the past 3.5 billion years or so. Research in the physical sciences has been able to generate statements about the ancient environment that hosted this life. These include the chemical compositions and temperatures of the early ocean and atmosphere. Only recently have the natural sciences been able to provide experimental results describing the environments of ancient life. Our previous work with resurrected proteins indicated that ancient life lived in a hot environment. Here we expand the timescale of resurrected proteins to provide a palaeotemperature trend of the environments that hosted life from 3.5 to 0.5 billion years ago. The thermostability of more than 25 phylogenetically dispersed ancestral elongation factors suggests that the environment supporting ancient life cooled progressively by 30 degrees C during that period. Here we show that our results are robust to potential statistical bias associated with the posterior distribution of inferred character states, phylogenetic ambiguity, and uncertainties in the amino-acid equilibrium frequencies used by evolutionary models. Our results are further supported by a nearly identical cooling trend for the ancient ocean as inferred from the deposition of oxygen isotopes. The convergence of results from natural and physical sciences suggests that ancient life has continually adapted to changes in environmental temperatures throughout its evolutionary history.

  7. Impaired inference in a case of developmental amnesia

    PubMed Central

    D'Angelo, Maria C.; Rosenbaum, R. Shayna

    2016-01-01

    Amnesia is associated with impairments in relational memory, which is critically supported by the hippocampus. By adapting the transitivity paradigm, we previously showed that age-related impairments in inference were mitigated when judgments could be predicated on known pairwise relations; however, such advantages were not observed in the adult-onset amnesic case D.A. Here, we replicate and extend this finding in a developmental amnesic case (N.C.), who also shows impaired relational learning and transitive expression. Unlike D.A., N.C.'s damage affected the extended hippocampal system and diencephalic structures, and does not extend to neocortical areas that are affected in D.A. Critically, despite their differences in etiology and affected structures, N.C. and D.A. perform similarly on the task. N.C. showed intact pairwise knowledge, suggesting that he is able to use existing semantic information, but this semantic knowledge was insufficient to support transitive expression. The present results suggest a critical role for regions connected to the hippocampus and/or medial prefrontal cortex in inference beyond learning of pairwise relations. PMID:27258733

  8. Prediction, Bayesian inference and feedback in speech recognition

    PubMed Central

    Norris, Dennis; McQueen, James M.; Cutler, Anne

    2016-01-01

    Speech perception involves prediction, but how is that prediction implemented? In cognitive models prediction has often been taken to imply that there is feedback of activation from lexical to pre-lexical processes as implemented in interactive-activation models (IAMs). We show that simple activation feedback does not actually improve speech recognition. However, other forms of feedback can be beneficial. In particular, feedback can enable the listener to adapt to changing input, and can potentially help the listener to recognise unusual input, or recognise speech in the presence of competing sounds. The common feature of these helpful forms of feedback is that they are all ways of optimising the performance of speech recognition using Bayesian inference. That is, listeners make predictions about speech because speech recognition is optimal in the sense captured in Bayesian models. PMID:26740960
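
    A toy worked example, with invented numbers, of the Bayesian claim above: recognition scores each candidate word by combining the likelihood of the acoustic input given that word with the word's prior probability, rather than by feeding activation back.

        # Illustrative only; the word set, priors, and likelihoods are hypothetical.
        priors = {"speech": 0.70, "beach": 0.25, "peach": 0.05}          # lexical priors
        likelihoods = {"speech": 0.20, "beach": 0.60, "peach": 0.55}     # P(input | word)

        unnormalised = {w: priors[w] * likelihoods[w] for w in priors}
        Z = sum(unnormalised.values())
        posterior = {w: p / Z for w, p in unnormalised.items()}

        for word, p in sorted(posterior.items(), key=lambda kv: -kv[1]):
            print(f"P({word} | input) = {p:.3f}")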

  9. Adaptive Thresholds

    SciTech Connect

    Bremer, P. -T.

    2014-08-26

    ADAPT is a topological analysis code that allows the computation of local thresholds, in particular relevance-based thresholds, for features defined in scalar fields. The initial target application is vortex detection, but the software is more generally applicable to all threshold-based feature definitions.

  10. Inference of Isoforms from Short Sequence Reads

    NASA Astrophysics Data System (ADS)

    Feng, Jianxing; Li, Wei; Jiang, Tao

    Due to alternative splicing events in eukaryotic species, the identification of mRNA isoforms (or splicing variants) is a difficult problem. Traditional experimental methods for this purpose are time consuming and cost ineffective. The emerging RNA-Seq technology provides a possible effective method to address this problem. Although the advantages of RNA-Seq over traditional methods in transcriptome analysis have been confirmed by many studies, the inference of isoforms from millions of short sequence reads (e.g., Illumina/Solexa reads) has remained computationally challenging. In this work, we propose a method to calculate the expression levels of isoforms and infer isoforms from short RNA-Seq reads using exon-intron boundary, transcription start site (TSS) and poly-A site (PAS) information. We first formulate the relationship among exons, isoforms, and single-end reads as a convex quadratic program, and then use an efficient algorithm (called IsoInfer) to search for isoforms. IsoInfer can calculate the expression levels of isoforms accurately if all the isoforms are known and infer novel isoforms from scratch. Our experimental tests on known mouse isoforms with both simulated expression levels and reads demonstrate that IsoInfer is able to calculate the expression levels of isoforms with an accuracy comparable to the state-of-the-art statistical method and a 60 times faster speed. Moreover, our tests on both simulated and real reads show that it achieves a good precision and sensitivity in inferring isoforms when given accurate exon-intron boundary, TSS and PAS information, especially for isoforms whose expression levels are significantly high.
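
    IsoInfer itself is not reproduced here, but the core idea, recovering isoform abundances from a linear model of expected exon read densities via a small convex program, can be sketched with non-negative least squares; the exon/isoform structure and abundances below are hypothetical.

        # Sketch under the stated assumptions, not the IsoInfer formulation itself.
        import numpy as np
        from scipy.optimize import nnls

        # rows = exons, columns = isoforms; entry = 1 if the isoform contains the exon
        A = np.array([[1, 1, 0],
                      [1, 0, 1],
                      [0, 1, 1],
                      [1, 1, 1]], dtype=float)

        true_abundance = np.array([5.0, 2.0, 1.0])
        observed = A @ true_abundance + np.random.default_rng(2).normal(0, 0.1, A.shape[0])

        # non-negative least squares: a convex program over isoform abundances
        estimate, residual = nnls(A, observed)
        print("estimated isoform abundances:", estimate.round(2))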

  11. Adaptation of adaptive optics systems.

    NASA Astrophysics Data System (ADS)

    Xin, Yu; Zhao, Dazun; Li, Chen

    1997-10-01

    In the paper, a concept of an adaptation of adaptive optical system (AAOS) is proposed. The AAOS has certain real time optimization ability against the variation of the brightness of detected objects m, atmospheric coherence length rO and atmospheric time constant τ by means of changing subaperture number and diameter, dynamic range, and system's temporal response. The necessity of AAOS using a Hartmann-Shack wavefront sensor and some technical approaches are discussed. Scheme and simulation of an AAOS with variable subaperture ability by use of both hardware and software are presented as an example of the system.

  12. Estimating uncertainty of inference for validation

    SciTech Connect

    Booker, Jane M; Langenbrunner, James R; Hemez, Francois M; Ross, Timothy J

    2010-09-30

    We present a validation process based upon the concept that validation is an inference-making activity. This has always been true, but the association has not been as important before as it is now. Previously, theory had been confirmed by more data, and predictions were possible based on data. The process today is to infer from theory to code and from code to prediction, making the role of prediction somewhat automatic, and a machine function. Validation is defined as determining the degree to which a model and code is an accurate representation of experimental test data. Imbedded in validation is the intention to use the computer code to predict. To predict is to accept the conclusion that an observable final state will manifest; therefore, prediction is an inference whose goodness relies on the validity of the code. Quantifying the uncertainty of a prediction amounts to quantifying the uncertainty of validation, and this involves the characterization of uncertainties inherent in theory/models/codes and the corresponding data. An introduction to inference making and its associated uncertainty is provided as a foundation for the validation problem. A mathematical construction for estimating the uncertainty in the validation inference is then presented, including a possibility distribution constructed to represent the inference uncertainty for validation under uncertainty. The estimation of inference uncertainty for validation is illustrated using data and calculations from Inertial Confinement Fusion (ICF). The ICF measurements of neutron yield and ion temperature were obtained for direct-drive inertial fusion capsules at the Omega laser facility. The glass capsules, containing the fusion gas, were systematically selected with the intent of establishing a reproducible baseline of high-yield 10^13-10^14 neutron output. The deuterium-tritium ratio in these experiments was varied to study its influence upon yield. This paper on validation inference is the

  13. Deep Learning for Population Genetic Inference

    PubMed Central

    Sheehan, Sara; Song, Yun S.

    2016-01-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme. PMID:27018908
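
    A minimal stand-in for the approach, using a small scikit-learn network on a synthetic simulator rather than the authors' architecture or data: learn a regression from summary statistics to a parameter of interest.

        # Sketch only; the simulator and network size are illustrative assumptions.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(3)

        def simulate(theta, n_stats=20):
            # stand-in simulator: summary statistics are noisy nonlinear functions of theta
            base = np.linspace(0.5, 2.0, n_stats)
            return theta * base + np.sqrt(theta) * rng.normal(0, 0.1, n_stats)

        thetas = rng.uniform(1.0, 10.0, 2000)
        X = np.array([simulate(t) for t in thetas])

        X_tr, X_te, y_tr, y_te = train_test_split(X, thetas, random_state=0)
        net = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=2000, random_state=0)
        net.fit(X_tr, y_tr)
        print("held-out R^2:", round(net.score(X_te, y_te), 3))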

  14. Inferring learners' knowledge from their actions.

    PubMed

    Rafferty, Anna N; LaMar, Michelle M; Griffiths, Thomas L

    2015-04-01

    Watching another person take actions to complete a goal and making inferences about that person's knowledge is a relatively natural task for people. This ability can be especially important in educational settings, where the inferences can be used for assessment, diagnosing misconceptions, and providing informative feedback. In this paper, we develop a general framework for automatically making such inferences based on observed actions; this framework is particularly relevant for inferring student knowledge in educational games and other interactive virtual environments. Our approach relies on modeling action planning: We formalize the problem as a Markov decision process in which one must choose what actions to take to complete a goal, where choices will be dependent on one's beliefs about how actions affect the environment. We use a variation of inverse reinforcement learning to infer these beliefs. Through two lab experiments, we show that this model can recover people's beliefs in a simple environment, with accuracy comparable to that of human observers. We then demonstrate that the model can be used to provide real-time feedback and to model data from an existing educational game.
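
    A highly simplified sketch of the modelling idea: plan with value iteration under two candidate belief models of how actions affect a tiny environment, then score which belief better explains an observed action sequence. The environment, the two beliefs, and the matching score are invented for the example and are much cruder than the inverse-reinforcement-learning machinery described in the paper.

        # Sketch only: a toy chain world and two hypothetical belief models.
        import numpy as np

        N_STATES, GOAL, GAMMA = 5, 4, 0.9
        ACTIONS = (-1, +1)   # move left / move right along a small chain

        def transition(state, action, works):
            """Under one belief the +1 action succeeds; under the other it does nothing."""
            if action == +1 and not works:
                return state
            return int(np.clip(state + action, 0, N_STATES - 1))

        def greedy_policy(works):
            V = np.zeros(N_STATES)
            for _ in range(100):                       # value iteration
                for s in range(N_STATES):
                    V[s] = max((1.0 if transition(s, a, works) == GOAL else 0.0)
                               + GAMMA * V[transition(s, a, works)] for a in ACTIONS)
            return [max(ACTIONS,
                        key=lambda a: (1.0 if transition(s, a, works) == GOAL else 0.0)
                        + GAMMA * V[transition(s, a, works)])
                    for s in range(N_STATES)]

        observed = [(0, +1), (1, +1), (2, +1)]         # states visited and actions taken
        for belief in (True, False):
            policy = greedy_policy(belief)
            matches = sum(policy[s] == a for s, a in observed)
            print(f"belief 'action +1 works'={belief}: explains {matches}/{len(observed)} actions")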

  15. Scene Construction, Visual Foraging, and Active Inference

    PubMed Central

    Mirza, M. Berk; Adams, Rick A.; Mathys, Christoph D.; Friston, Karl J.

    2016-01-01

    This paper describes an active inference scheme for visual searches and the perceptual synthesis entailed by scene construction. Active inference assumes that perception and action minimize variational free energy, where actions are selected to minimize the free energy expected in the future. This assumption generalizes risk-sensitive control and expected utility theory to include epistemic value; namely, the value (or salience) of information inherent in resolving uncertainty about the causes of ambiguous cues or outcomes. Here, we apply active inference to saccadic searches of a visual scene. We consider the (difficult) problem of categorizing a scene, based on the spatial relationship among visual objects where, crucially, visual cues are sampled myopically through a sequence of saccadic eye movements. This means that evidence for competing hypotheses about the scene has to be accumulated sequentially, calling upon both prediction (planning) and postdiction (memory). Our aim is to highlight some simple but fundamental aspects of the requisite functional anatomy; namely, the link between approximate Bayesian inference under mean field assumptions and functional segregation in the visual cortex. This link rests upon the (neurobiologically plausible) process theory that accompanies the normative formulation of active inference for Markov decision processes. In future work, we hope to use this scheme to model empirical saccadic searches and identify the prior beliefs that underwrite intersubject variability in the way people forage for information in visual scenes (e.g., in schizophrenia). PMID:27378899

  16. Computational inference of neural information flow networks.

    PubMed

    Smith, V Anne; Yu, Jing; Smulders, Tom V; Hartemink, Alexander J; Jarvis, Erich D

    2006-11-24

    Determining how information flows along anatomical brain pathways is a fundamental requirement for understanding how animals perceive their environments, learn, and behave. Attempts to reveal such neural information flow have been made using linear computational methods, but neural interactions are known to be nonlinear. Here, we demonstrate that a dynamic Bayesian network (DBN) inference algorithm we originally developed to infer nonlinear transcriptional regulatory networks from gene expression data collected with microarrays is also successful at inferring nonlinear neural information flow networks from electrophysiology data collected with microelectrode arrays. The inferred networks we recover from the songbird auditory pathway are correctly restricted to a subset of known anatomical paths, are consistent with timing of the system, and reveal both the importance of reciprocal feedback in auditory processing and greater information flow to higher-order auditory areas when birds hear natural as opposed to synthetic sounds. A linear method applied to the same data incorrectly produces networks with information flow to non-neural tissue and over paths known not to exist. To our knowledge, this study represents the first biologically validated demonstration of an algorithm to successfully infer neural information flow networks.

  17. Reliability of the Granger causality inference

    NASA Astrophysics Data System (ADS)

    Zhou, Douglas; Zhang, Yaoyu; Xiao, Yanyang; Cai, David

    2014-04-01

    How to characterize information flows in physical, biological, and social systems remains a major theoretical challenge. Granger causality (GC) analysis has been widely used to investigate information flow through causal interactions. We address one of the central questions in GC analysis, that is, the reliability of the GC evaluation and its implications for the causal structures extracted by this analysis. Our work reveals that the manner in which a continuous dynamical process is projected or coarse-grained to a discrete process has a profound impact on the reliability of the GC inference, and different sampling may potentially yield completely opposite inferences. This inference hazard is present for both linear and nonlinear processes. We emphasize that there is a hazard of reaching incorrect conclusions about network topologies, even including statistical (such as small-world or scale-free) properties of the networks, when GC analysis is blindly applied to infer the network topology. We demonstrate this using a small-world network for which a drastic loss of small-world attributes occurs in the reconstructed network using the standard GC approach. We further show how to resolve the paradox that the GC analysis seemingly becomes less reliable when more information is incorporated using finer and finer sampling. Finally, we present strategies to overcome these inference artifacts in order to obtain a reliable GC result.
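
    A minimal pairwise Granger-causality check in the standard regression form (not the authors' code), showing the quantity on which such inferences rest: the reduction in residual variance when the other series' past is added to an autoregressive model. The coupled process below is synthetic.

        # Sketch of the standard GC regression comparison on a synthetic pair of series.
        import numpy as np

        rng = np.random.default_rng(4)

        def lagged_design(series, order):
            return np.column_stack([series[order - k - 1:len(series) - k - 1]
                                    for k in range(order)])

        def granger_rss(x, y, order=2):
            """Residual sum of squares for predicting x from its own past (restricted)
            versus its own past plus y's past (unrestricted)."""
            target = x[order:]
            X_own = lagged_design(x, order)
            X_full = np.column_stack([X_own, lagged_design(y, order)])
            rss = []
            for design in (X_own, X_full):
                design = np.column_stack([np.ones(len(target)), design])
                beta, *_ = np.linalg.lstsq(design, target, rcond=None)
                rss.append(np.sum((target - design @ beta) ** 2))
            return rss  # [restricted, unrestricted]

        # y drives x with a one-step delay; x does not drive y
        T = 2000
        y = rng.normal(0, 1, T)
        x = np.zeros(T)
        for t in range(1, T):
            x[t] = 0.5 * x[t - 1] + 0.8 * y[t - 1] + rng.normal(0, 1)

        r_restricted, r_full = granger_rss(x, y)
        print("RSS drop when adding y's past to predict x:", round(1 - r_full / r_restricted, 3))
        r_restricted, r_full = granger_rss(y, x)
        print("RSS drop when adding x's past to predict y:", round(1 - r_full / r_restricted, 3))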

  18. Deep Learning for Population Genetic Inference.

    PubMed

    Sheehan, Sara; Song, Yun S

    2016-03-01

    Given genomic variation data from multiple individuals, computing the likelihood of complex population genetic models is often infeasible. To circumvent this problem, we introduce a novel likelihood-free inference framework by applying deep learning, a powerful modern technique in machine learning. Deep learning makes use of multilayer neural networks to learn a feature-based function from the input (e.g., hundreds of correlated summary statistics of data) to the output (e.g., population genetic parameters of interest). We demonstrate that deep learning can be effectively employed for population genetic inference and learning informative features of data. As a concrete application, we focus on the challenging problem of jointly inferring natural selection and demography (in the form of a population size change history). Our method is able to separate the global nature of demography from the local nature of selection, without sequential steps for these two factors. Studying demography and selection jointly is motivated by Drosophila, where pervasive selection confounds demographic analysis. We apply our method to 197 African Drosophila melanogaster genomes from Zambia to infer both their overall demography, and regions of their genome under selection. We find many regions of the genome that have experienced hard sweeps, and fewer under selection on standing variation (soft sweep) or balancing selection. Interestingly, we find that soft sweeps and balancing selection occur more frequently closer to the centromere of each chromosome. In addition, our demographic inference suggests that previously estimated bottlenecks for African Drosophila melanogaster are too extreme.

  19. Computationally efficient Bayesian inference for inverse problems.

    SciTech Connect

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.

  20. Hierarchical cosmic shear power spectrum inference

    NASA Astrophysics Data System (ADS)

    Alsing, Justin; Heavens, Alan; Jaffe, Andrew H.; Kiessling, Alina; Wandelt, Benjamin; Hoffmann, Till

    2016-02-01

    We develop a Bayesian hierarchical modelling approach for cosmic shear power spectrum inference, jointly sampling from the posterior distribution of the cosmic shear field and its (tomographic) power spectra. Inference of the shear power spectrum is a powerful intermediate product for a cosmic shear analysis, since it requires very few model assumptions and can be used to perform inference on a wide range of cosmological models a posteriori without loss of information. We show that the joint posterior for the shear map and power spectrum can be sampled effectively by Gibbs sampling, iteratively drawing samples from the map and power spectrum, each conditional on the other. This approach neatly circumvents difficulties associated with complicated survey geometry and masks that plague frequentist power spectrum estimators, since the power spectrum inference provides prior information about the field in masked regions at every sampling step. We demonstrate this approach for inference of tomographic shear E-mode, B-mode and EB-cross power spectra from a simulated galaxy shear catalogue with a number of important features: galaxies distributed on the sky and in redshift with photometric redshift uncertainties, realistic random ellipticity noise for every galaxy and a complicated survey mask. The obtained posterior distributions for the tomographic power spectrum coefficients recover the underlying simulated power spectra for both E- and B-modes.
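
    A one-dimensional toy analogue of the Gibbs scheme (not the tomographic shear pipeline): alternately draw the latent "map" given the current power, here a single variance, and then the power given the current map, using standard Gaussian and inverse-gamma conditionals. The data model and prior choice are assumptions made for the example.

        # Toy Gibbs sampler: signal ~ N(0, power), data = signal + noise.
        import numpy as np

        rng = np.random.default_rng(5)

        n, true_power, noise_var = 400, 4.0, 1.0
        signal = rng.normal(0.0, np.sqrt(true_power), n)     # latent field
        data = signal + rng.normal(0.0, np.sqrt(noise_var), n)

        power = 1.0                                           # initial guess
        samples = []
        for it in range(3000):
            # 1) map | power, data  (Wiener-filter mean plus a fluctuation)
            post_var = 1.0 / (1.0 / power + 1.0 / noise_var)
            post_mean = post_var * data / noise_var
            field = post_mean + rng.normal(0.0, np.sqrt(post_var), n)

            # 2) power | map  (inverse-gamma draw under a Jeffreys-like prior)
            shape = 0.5 * n
            scale = 0.5 * np.sum(field ** 2)
            power = scale / rng.gamma(shape)                  # 1/Gamma draw = inverse-gamma
            samples.append(power)

        print("posterior mean power:", round(np.mean(samples[500:]), 2), "(truth 4.0)")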

  1. Children's and Adults' Judgments of the Certainty of Deductive Inferences, Inductive Inferences, and Guesses

    ERIC Educational Resources Information Center

    Pillow, Bradford H.; Pearson, RaeAnne M.; Hecht, Mary; Bremer, Amanda

    2010-01-01

    Children and adults rated their own certainty following inductive inferences, deductive inferences, and guesses. Beginning in kindergarten, participants rated deductions as more certain than weak inductions or guesses. Deductions were rated as more certain than strong inductions beginning in Grade 3, and fourth-grade children and adults…

  2. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    ERIC Educational Resources Information Center

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…

  3. The Manhattan Frame Model - Manhattan World Inference in the Space of Surface Normals.

    PubMed

    Straub, Julian; Freifeld, Oren; Rosman, Guy; Leonard, John J; Fisher, John W

    2017-02-01

    Objects and structures within man-made environments typically exhibit a high degree of organization in the form of orthogonal and parallel planes. Traditional approaches utilize these regularities via the restrictive, and rather local, Manhattan World (MW) assumption which posits that every plane is perpendicular to one of the axes of a single coordinate system. The aforementioned regularities are especially evident in the surface normal distribution of a scene where they manifest as orthogonally-coupled clusters. This motivates the introduction of the Manhattan-Frame (MF) model which captures the notion of a MW in the surface normals space, the unit sphere, and two probabilistic MF models over this space. First, for a single MF we propose novel real-time MAP inference algorithms, evaluate their performance and their use in drift-free rotation estimation. Second, to capture the complexity of real-world scenes at a global scale, we extend the MF model to a probabilistic mixture of Manhattan Frames (MMF). For MMF inference we propose a simple MAP inference algorithm and an adaptive Markov-Chain Monte-Carlo sampling algorithm with Metropolis-Hastings split/merge moves that let us infer the unknown number of mixture components. We demonstrate the versatility of the MMF model and inference algorithm across several scales of man-made environments.

  4. Inference of dense spectral reflectance images from sparse reflectance measurement using non-linear regression modeling

    NASA Astrophysics Data System (ADS)

    Deglint, Jason; Kazemzadeh, Farnoud; Wong, Alexander; Clausi, David A.

    2015-09-01

    One method to acquire multispectral images is to sequentially capture a series of images where each image contains information from a different bandwidth of light. Another method is to use a series of beamsplitters and dichroic filters to guide different bandwidths of light onto different cameras. However, these methods are very time consuming and expensive and perform poorly in dynamic scenes or when observing transient phenomena. An alternative strategy to capturing multispectral data is to infer this data using sparse spectral reflectance measurements captured using an imaging device with overlapping bandpass filters, such as a consumer digital camera using a Bayer filter pattern. Currently the only method of inferring dense reflectance spectra is the Wiener adaptive filter, which makes Gaussian assumptions about the data. However, these assumptions may not always hold true for all data. We propose a new technique to infer dense reflectance spectra from sparse spectral measurements through the use of a non-linear regression model. The non-linear regression model used in this technique is the random forest model, which is an ensemble of decision trees and trained via the spectral characterization of the optical imaging system and spectral data pair generation. This model is then evaluated by spectrally characterizing different patches on the Macbeth color chart, as well as by reconstructing inferred multispectral images. Results show that the proposed technique can produce inferred dense reflectance spectra that correlate well with the true dense reflectance spectra, which illustrates the merits of the technique.
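
    A minimal sketch of the regression idea on synthetic spectra (no camera characterization data is used): a random forest learns the mapping from three broad, overlapping band measurements to a densely sampled reflectance spectrum. The band shapes and spectra below are assumptions made for the example.

        # Sketch only; synthetic Gaussian reflectance spectra and "Bayer-like" bands.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(6)
        wavelengths = np.linspace(400, 700, 61)               # dense sampling grid, in nm

        def random_spectrum():
            centre, width = rng.uniform(450, 650), rng.uniform(30, 120)
            return 0.2 + 0.7 * np.exp(-0.5 * ((wavelengths - centre) / width) ** 2)

        def band_responses():
            centres = (460, 530, 600)                          # overlapping band filters
            return np.stack([np.exp(-0.5 * ((wavelengths - c) / 50.0) ** 2) for c in centres])

        bands = band_responses()
        spectra = np.array([random_spectrum() for _ in range(2000)])
        sparse = spectra @ bands.T + rng.normal(0, 0.005, (2000, 3))   # camera measurements

        X_tr, X_te, Y_tr, Y_te = train_test_split(sparse, spectra, random_state=0)
        model = RandomForestRegressor(n_estimators=100, random_state=0)
        model.fit(X_tr, Y_tr)                                  # multi-output regression
        print("held-out R^2 of inferred dense spectra:", round(model.score(X_te, Y_te), 3))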

  5. On protocols and measures for the validation of supervised methods for the inference of biological networks.

    PubMed

    Schrynemackers, Marie; Küffner, Robert; Geurts, Pierre

    2013-12-03

    Networks provide a natural representation of molecular biology knowledge, in particular to model relationships between biological entities such as genes, proteins, drugs, or diseases. Because of the effort, the cost, or the lack of the experiments necessary for the elucidation of these networks, computational approaches for network inference have been frequently investigated in the literature. In this paper, we examine the assessment of supervised network inference. Supervised inference is based on machine learning techniques that infer the network from a training sample of known interacting and possibly non-interacting entities and additional measurement data. While these methods are very effective, their reliable validation in silico poses a challenge, since both prediction and validation need to be performed on the basis of the same partially known network. Cross-validation techniques need to be specifically adapted to classification problems on pairs of objects. We perform a critical review and assessment of protocols and measures proposed in the literature and derive specific guidelines how to best exploit and evaluate machine learning techniques for network inference. Through theoretical considerations and in silico experiments, we analyze in depth how important factors influence the outcome of performance estimation. These factors include the amount of information available for the interacting entities, the sparsity and topology of biological networks, and the lack of experimentally verified non-interacting pairs.

  6. Phylodynamic Inference with Kernel ABC and Its Application to HIV Epidemiology.

    PubMed

    Poon, Art F Y

    2015-09-01

    The shapes of phylogenetic trees relating virus populations are determined by the adaptation of viruses within each host, and by the transmission of viruses among hosts. Phylodynamic inference attempts to reverse this flow of information, estimating parameters of these processes from the shape of a virus phylogeny reconstructed from a sample of genetic sequences from the epidemic. A key challenge to phylodynamic inference is quantifying the similarity between two trees in an efficient and comprehensive way. In this study, I demonstrate that a new distance measure, based on a subset tree kernel function from computational linguistics, confers a significant improvement over previous measures of tree shape for classifying trees generated under different epidemiological scenarios. Next, I incorporate this kernel-based distance measure into an approximate Bayesian computation (ABC) framework for phylodynamic inference. ABC bypasses the need for an analytical solution of model likelihood, as it only requires the ability to simulate data from the model. I validate this "kernel-ABC" method for phylodynamic inference by estimating parameters from data simulated under a simple epidemiological model. Results indicate that kernel-ABC attained greater accuracy for parameters associated with virus transmission than leading software on the same data sets. Finally, I apply the kernel-ABC framework to study a recent outbreak of a recombinant HIV subtype in China. Kernel-ABC provides a versatile framework for phylodynamic inference because it can fit a broader range of models than methods that rely on the computation of exact likelihoods.
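
    A toy rejection-ABC sketch (not the phylodynamic kernel method of the paper) showing the mechanism the abstract relies on: simulate under parameters drawn from the prior and keep those whose simulated summary lies within a tolerance of the observed summary. The simulator, summary statistics, and tolerance are illustrative assumptions.

        # Rejection ABC on a stand-in model; no tree kernel is involved here.
        import numpy as np

        rng = np.random.default_rng(7)

        def simulate(rate, n=50):
            """Stand-in simulator: waiting times whose mean is set by a transmission rate."""
            return rng.exponential(1.0 / rate, n)

        def summary(x):
            return np.array([x.mean(), x.std()])

        observed = summary(simulate(rate=2.0))

        accepted = []
        for _ in range(20000):
            rate = rng.uniform(0.1, 5.0)                  # draw from the prior
            dist = np.linalg.norm(summary(simulate(rate)) - observed)
            if dist < 0.1:                                # tolerance on the distance measure
                accepted.append(rate)

        print(f"accepted {len(accepted)} draws; posterior mean rate = {np.mean(accepted):.2f}")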

  7. Phylodynamic Inference with Kernel ABC and Its Application to HIV Epidemiology

    PubMed Central

    Poon, Art F.Y.

    2015-01-01

    The shapes of phylogenetic trees relating virus populations are determined by the adaptation of viruses within each host, and by the transmission of viruses among hosts. Phylodynamic inference attempts to reverse this flow of information, estimating parameters of these processes from the shape of a virus phylogeny reconstructed from a sample of genetic sequences from the epidemic. A key challenge to phylodynamic inference is quantifying the similarity between two trees in an efficient and comprehensive way. In this study, I demonstrate that a new distance measure, based on a subset tree kernel function from computational linguistics, confers a significant improvement over previous measures of tree shape for classifying trees generated under different epidemiological scenarios. Next, I incorporate this kernel-based distance measure into an approximate Bayesian computation (ABC) framework for phylodynamic inference. ABC bypasses the need for an analytical solution of model likelihood, as it only requires the ability to simulate data from the model. I validate this “kernel-ABC” method for phylodynamic inference by estimating parameters from data simulated under a simple epidemiological model. Results indicate that kernel-ABC attained greater accuracy for parameters associated with virus transmission than leading software on the same data sets. Finally, I apply the kernel-ABC framework to study a recent outbreak of a recombinant HIV subtype in China. Kernel-ABC provides a versatile framework for phylodynamic inference because it can fit a broader range of models than methods that rely on the computation of exact likelihoods. PMID:26006189

  8. Adaptive equalization

    NASA Astrophysics Data System (ADS)

    Qureshi, S. U. H.

    1985-09-01

    Theoretical work which has been effective in improving data transmission by telephone and radio links using adaptive equalization (AE) techniques is reviewed. AE has been applied to reducing the temporal dispersion effects, such as intersymbol interference, caused by the channel accessed. Attention is given to the Nyquist telegraph transmission theory, least mean square error adaptive filtering and the theory and structure of linear receive and transmit filters for reducing error. Optimum nonlinear receiver structures are discussed in terms of optimality criteria as a function of error probability. A suboptimum receiver structure is explored in the form of a decision-feedback equalizer. Consideration is also given to quadrature amplitude modulation and transversal equalization for receivers.
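
    A minimal least-mean-square (LMS) adaptive equalizer of the transversal kind reviewed above, run on synthetic BPSK symbols through an invented dispersive channel; the tap count, step size, and channel coefficients are illustrative choices, not values from the review.

        # Sketch of an LMS transversal equalizer reducing intersymbol interference.
        import numpy as np

        rng = np.random.default_rng(8)

        symbols = rng.choice([-1.0, 1.0], size=5000)                # BPSK training symbols
        channel = np.array([0.3, 1.0, 0.3])                         # dispersive channel taps
        received = np.convolve(symbols, channel, mode="same")
        received += rng.normal(0, 0.05, received.size)              # additive noise

        n_taps, mu = 11, 0.01                                       # equalizer length, step size
        w = np.zeros(n_taps)
        delay = n_taps // 2
        errors = []
        for k in range(n_taps, len(symbols)):
            x = received[k - n_taps:k][::-1]                        # most recent samples first
            y = w @ x                                               # equalizer output
            e = symbols[k - delay] - y                              # error vs delayed training symbol
            w += mu * e * x                                         # LMS tap update
            errors.append(e ** 2)

        print("mean squared error, first vs last 500 symbols:",
              round(np.mean(errors[:500]), 3), round(np.mean(errors[-500:]), 3))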

  9. Connector adapter

    NASA Technical Reports Server (NTRS)

    Hacker, Scott C. (Inventor); Dean, Richard J. (Inventor); Burge, Scott W. (Inventor); Dartez, Toby W. (Inventor)

    2007-01-01

    An adapter for installing a connector to a terminal post, wherein the connector is attached to a cable, is presented. In an embodiment, the adapter is comprised of an elongated collet member having a longitudinal axis comprised of a first collet member end, a second collet member end, an outer collet member surface, and an inner collet member surface. The inner collet member surface at the first collet member end is used to engage the connector. The outer collet member surface at the first collet member end is tapered for a predetermined first length at a predetermined taper angle. The collet includes a longitudinal slot that extends along the longitudinal axis initiating at the first collet member end for a predetermined second length. The first collet member end is formed of a predetermined number of sections segregated by a predetermined number of channels and the longitudinal slot.

  10. Inferring Mountain Basin Precipitation from Streamflow Observations Using Bayesian Model Calibration

    NASA Astrophysics Data System (ADS)

    Henn, B. M.; Kavetski, D.; Clark, M. P.; Lundquist, J. D.

    2014-12-01

    Estimating basin-mean precipitation in complex terrain is difficult due to uncertainty in precipitation gauges' topographical representativeness relative to the basin and undercatch. Inadequate distribution with elevation and spatial density can lead to large estimation errors in basin-mean precipitation. Streamflow offers additional information about the water balance of the basin with which to estimate precipitation. We applied a methodology for inferring basin-mean precipitation from streamflow using Bayes' Theorem, and adapted this approach to snow-dominated basins in the Sierra Nevada of California. We developed and coupled a temperature-index snow model to the FUSE conceptual rainfall-runoff model. We used the BATEA (Bayesian Total Error Analysis) calibration and inference environment, which seeks to robustly calibrate hydrologic models by using streamflow to estimate representativeness errors in the observed precipitation inputs and infer the correct basin-average precipitation. We inferred 1981-2006 annual average precipitation rates across a cluster of basins in and around the high country of Yosemite National Park, and compared the rates to those from approaches based on climatological precipitation patterns (PRISM). The inferred spatial patterns of precipitation showed reasonable match to PRISM, though some deviations were identified. We also investigated the precision and robustness of this approach for estimating mean annual precipitation rates. While this approach clearly identified differences in precipitation rates between basins in different climatic zones, uncertainties between +/-100 and +/-200 mm/yr were associated with the inferred precipitation rates. These were shown to be related to uncertainties in hydrologic model structure, potential evapotranspiration rates and soil storage capacities. Future work will investigate the extent to which observations of snow water content can constrain uncertainty in inferred precipitation rates.

  11. Automatic transformations in the inference process

    SciTech Connect

    Veroff, R. L.

    1980-07-01

    A technique for incorporating automatic transformations into processes such as the application of inference rules, subsumption, and demodulation provides a mechanism for improving search strategies for theorem proving problems arising from the field of program verification. The incorporation of automatic transformations into the inference process can alter the search space for a given problem, and is particularly useful for problems having broad rather than deep proofs. The technique can also be used to permit the generation of inferences that might otherwise be blocked and to build some commutativity or associativity into the unification process. Appropriate choice of transformations, and new literal clashing and unification algorithms for applying them, showed significant improvement on several real problems according to several distinct criteria. 22 references, 1 figure.

  12. Identification and Inference for Econometric Models

    NASA Astrophysics Data System (ADS)

    Andrews, Donald W. K.; Stock, James H.

    2005-07-01

    This volume contains the papers presented in honor of the lifelong achievements of Thomas J. Rothenberg on the occasion of his retirement. The authors of the chapters include many of the leading econometricians of our day, and the chapters address topics of current research significance in econometric theory. The chapters cover four themes: identification and efficient estimation in econometrics, asymptotic approximations to the distributions of econometric estimators and tests, inference involving potentially nonstationary time series, such as processes that might have a unit autoregressive root, and nonparametric and semiparametric inference. Several of the chapters provide overviews and treatments of basic conceptual issues, while others advance our understanding of the properties of existing econometric procedures and/or propose new ones. Specific topics include identification in nonlinear models, inference with weak instruments, tests for nonstationarity in time series and panel data, generalized empirical likelihood estimation, and the bootstrap.

  13. Single board system for fuzzy inference

    NASA Technical Reports Server (NTRS)

    Symon, James R.; Watanabe, Hiroyuki

    1991-01-01

    The very large scale integration (VLSI) implementation of a fuzzy logic inference mechanism allows the use of rule-based control and decision making in demanding real-time applications. Researchers designed a full custom VLSI inference engine. The chip was fabricated using CMOS technology. The chip consists of 688,000 transistors of which 476,000 are used for RAM memory. The fuzzy logic inference engine board system incorporates the custom designed integrated circuit into a standard VMEbus environment. The Fuzzy Logic system uses Transistor-Transistor Logic (TTL) parts to provide the interface between the Fuzzy chip and a standard, double height VMEbus backplane, allowing the chip to perform application process control through the VMEbus host. High level C language functions hide details of the hardware system interface from the applications level programmer. The first version of the board was installed on a robot at Oak Ridge National Laboratory in January of 1990.
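
    A software sketch of the min-max fuzzy inference that such a board implements in hardware: triangular membership functions, rule firing by min, aggregation by max, and centroid defuzzification. The rules, variables, and universes of discourse are invented for illustration and are not taken from the record.

        # Illustrative min-max fuzzy inference with centroid defuzzification.
        import numpy as np

        def tri(x, a, b, c):
            """Triangular membership function with feet at a, c and peak at b."""
            return np.maximum(np.minimum((x - a) / (b - a + 1e-12), (c - x) / (c - b + 1e-12)), 0.0)

        # input: temperature error; output: valve command on a 0-100 universe of discourse
        u = np.linspace(0.0, 100.0, 501)

        def infer(error):
            # rule 1: IF error is negative THEN valve is low
            # rule 2: IF error is near zero THEN valve is medium
            # rule 3: IF error is positive THEN valve is high
            fire = [tri(error, -10, -5, 0), tri(error, -2, 0, 2), tri(error, 0, 5, 10)]
            outputs = [tri(u, 0, 20, 40), tri(u, 30, 50, 70), tri(u, 60, 80, 100)]
            aggregated = np.zeros_like(u)
            for strength, out in zip(fire, outputs):
                aggregated = np.maximum(aggregated, np.minimum(strength, out))   # min-max
            if aggregated.sum() == 0:
                return 50.0
            return float(np.sum(u * aggregated) / np.sum(aggregated))            # centroid

        for e in (-6.0, 0.0, 6.0):
            print(f"error={e:+.1f}  ->  valve command {infer(e):.1f}")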

  14. Network inference in the nonequilibrium steady state

    NASA Astrophysics Data System (ADS)

    Dettmer, Simon L.; Nguyen, H. Chau; Berg, Johannes

    2016-11-01

    Nonequilibrium systems lack an explicit characterization of their steady state like the Boltzmann distribution for equilibrium systems. This has drastic consequences for the inference of the parameters of a model when its dynamics lacks detailed balance. Such nonequilibrium systems occur naturally in applications like neural networks and gene regulatory networks. Here, we focus on the paradigmatic asymmetric Ising model and show that we can learn its parameters from independent samples of the nonequilibrium steady state. We present both an exact inference algorithm and a computationally more efficient, approximate algorithm for weak interactions based on a systematic expansion around mean-field theory. Obtaining expressions for magnetizations and two- and three-point spin correlations, we establish that these observables are sufficient to infer the model parameters. Further, we discuss the symmetries characterizing the different orders of the expansion around the mean field and show how different types of dynamics can be distinguished on the basis of samples from the nonequilibrium steady state.

  15. A Learning Algorithm for Multimodal Grammar Inference.

    PubMed

    D'Ulizia, A; Ferri, F; Grifoni, P

    2011-12-01

    The high costs of development and maintenance of multimodal grammars in integrating and understanding input in multimodal interfaces lead to the investigation of novel algorithmic solutions in automating grammar generation and in updating processes. Many algorithms for context-free grammar inference have been developed in the natural language processing literature. An extension of these algorithms toward the inference of multimodal grammars is necessary for multimodal input processing. In this paper, we propose a novel grammar inference mechanism that allows us to learn a multimodal grammar from its positive samples of multimodal sentences. The algorithm first generates the multimodal grammar that is able to parse the positive samples of sentences and, afterward, makes use of two learning operators and the minimum description length metrics in improving the grammar description and in avoiding the over-generalization problem. The experimental results highlight the acceptable performances of the algorithm proposed in this paper since it has a very high probability of parsing valid sentences.

  16. Quality of computationally inferred gene ontology annotations.

    PubMed

    Skunca, Nives; Altenhoff, Adrian; Dessimoz, Christophe

    2012-05-01

    Gene Ontology (GO) has established itself as the undisputed standard for protein function annotation. Most annotations are inferred electronically, i.e. without individual curator supervision, but they are widely considered unreliable. At the same time, we crucially depend on those automated annotations, as most newly sequenced genomes are non-model organisms. Here, we introduce a methodology to systematically and quantitatively evaluate electronic annotations. By exploiting changes in successive releases of the UniProt Gene Ontology Annotation database, we assessed the quality of electronic annotations in terms of specificity, reliability, and coverage. Overall, we not only found that electronic annotations have significantly improved in recent years, but also that their reliability now rivals that of annotations inferred by curators when they use evidence other than experiments from primary literature. This work provides the means to identify the subset of electronic annotations that can be relied upon, an important outcome given that >98% of all annotations are inferred without direct curation.

  17. Adaptive sampler

    DOEpatents

    Watson, B.L.; Aeby, I.

    1980-08-26

    An adaptive data compression device for compressing data having a variable frequency content is described. The device includes a plurality of digital filters for analyzing the frequency content of the data over a plurality of frequency regions, a memory, and a control logic circuit for generating a variable-rate memory clock corresponding to the analyzed frequency content of the data in each frequency region and for clocking the data into the memory in response to the variable-rate memory clock.

  18. A formal model of interpersonal inference

    PubMed Central

    Moutoussis, Michael; Trujillo-Barreto, Nelson J.; El-Deredy, Wael; Dolan, Raymond J.; Friston, Karl J.

    2014-01-01

    Introduction: We propose that active Bayesian inference—a general framework for decision-making—can equally be applied to interpersonal exchanges. Social cognition, however, entails special challenges. We address these challenges through a novel formulation of a formal model and demonstrate its psychological significance. Method: We review relevant literature, especially with regards to interpersonal representations, formulate a mathematical model and present a simulation study. The model accommodates normative models from utility theory and places them within the broader setting of Bayesian inference. Crucially, we endow people's prior beliefs, into which utilities are absorbed, with preferences of self and others. The simulation illustrates the model's dynamics and furnishes elementary predictions of the theory. Results: (1) Because beliefs about self and others inform both the desirability and plausibility of outcomes, in this framework interpersonal representations become beliefs that have to be actively inferred. This inference, akin to “mentalizing” in the psychological literature, is based upon the outcomes of interpersonal exchanges. (2) We show how some well-known social-psychological phenomena (e.g., self-serving biases) can be explained in terms of active interpersonal inference. (3) Mentalizing naturally entails Bayesian updating of how people value social outcomes. Crucially this includes inference about one's own qualities and preferences. Conclusion: We inaugurate a Bayes optimal framework for modeling intersubject variability in mentalizing during interpersonal exchanges. Here, interpersonal representations are endowed with explicit functional and affective properties. We suggest the active inference framework lends itself to the study of psychiatric conditions where mentalizing is distorted. PMID:24723872

  19. Adaptive antennas

    NASA Astrophysics Data System (ADS)

    Barton, P.

    1987-04-01

    The basic principles of adaptive antennas are outlined in terms of the Wiener-Hopf expression for maximizing signal to noise ratio in an arbitrary noise environment; the analogy with generalized matched filter theory provides a useful aid to understanding. For many applications, there is insufficient information to achieve the above solution and thus non-optimum constrained null steering algorithms are also described, together with a summary of methods for preventing wanted signals being nulled by the adaptive system. The three generic approaches to adaptive weight control are discussed: correlation steepest descent, weight perturbation, and direct solutions based on sample matrix inversion. The tradeoffs between hardware complexity and performance in terms of null depth and convergence rate are outlined. The sidelobe canceller technique is described. Performance variation with jammer power and angular distribution is summarized and the key performance limitations identified. The configuration and performance characteristics of both multiple beam and phase scan array antennas are covered, with a brief discussion of performance factors.
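
    One of the "direct solutions based on sample matrix inversion" mentioned above can be illustrated with a short numerical sketch: estimate the sample covariance from array snapshots and solve for distortionless (MVDR-style) weights that place a null on a jammer. The array size, directions, and power levels below are invented for the example and are not taken from the article.

```python
import numpy as np

def steering(n_elems, theta_deg, spacing=0.5):
    """Steering vector for a uniform linear array (spacing in wavelengths)."""
    k = 2 * np.pi * spacing * np.sin(np.radians(theta_deg))
    return np.exp(1j * k * np.arange(n_elems))

rng = np.random.default_rng(0)
n, snaps = 8, 500
s_want = steering(n, 10.0)          # assumed direction of the wanted signal
s_jam = steering(n, -35.0)          # assumed jammer direction

# Simulated snapshots: weak wanted signal, strong jammer, unit complex noise.
X = (0.5 * rng.standard_normal(snaps) * s_want[:, None]
     + 10.0 * rng.standard_normal(snaps) * s_jam[:, None]
     + (rng.standard_normal((n, snaps)) + 1j * rng.standard_normal((n, snaps))) / np.sqrt(2))

R = X @ X.conj().T / snaps                      # sample covariance matrix
w = np.linalg.solve(R, s_want)                  # direct (sample matrix inversion) solution
w /= s_want.conj() @ w                          # unit gain toward the wanted direction

pattern = lambda th: abs(w.conj() @ steering(n, th))
print("gain toward signal :", pattern(10.0))
print("gain toward jammer :", pattern(-35.0))   # deep null expected here
```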

  20. A Novel Tool for the Spectroscopic Inference of Fundamental Stellar Parameters

    NASA Astrophysics Data System (ADS)

    Czekala, Ian; Andrews, Sean M.; Latham, David W.; Torres, Guillermo

    2014-06-01

    We present a novel approach for making accurate and unbiased inferences of fundamental stellar parameters (e.g., effective temperature, surface gravity, metallicity) from spectroscopic observations, with reference to a library of synthetic spectra. The forward-modeling formalism we have developed is generic (easily adaptable to data from any instrument or covering any wavelength range) and modular, in that it can incorporate external prior knowledge or additional data (e.g., broadband photometry) and account for instrumental and non-stellar effects on the spectrum (e.g., parametric treatments of extinction, spots, etc.). An approach that employs adaptive correlated noise is used to account for systematic discrepancies between the observations and the synthetic spectral library, ensuring that issues like uncertainties in atomic or molecular constants do not strongly bias the parameter inferences. In addition to extracting a set of unbiased inferences of the (posterior) probability distributions for basic stellar parameters, our modeling approach also "maps" out problematic spectral regions in the synthetic libraries that could be used as a basis for improving the models. As a demonstration, we present some preliminary results from modeling optical spectra of well-characterized exoplanet host stars and nearby pre-main sequence stars. A basic set of adaptable software that performs this modeling approach will be released publicly.

  1. Gene-network inference by message passing

    NASA Astrophysics Data System (ADS)

    Braunstein, A.; Pagnani, A.; Weigt, M.; Zecchina, R.

    2008-01-01

    The inference of gene-regulatory processes from gene-expression data belongs to the major challenges of computational systems biology. Here we address the problem from a statistical-physics perspective and develop a message-passing algorithm which is able to infer sparse, directed and combinatorial regulatory mechanisms. Using the replica technique, the algorithmic performance can be characterized analytically for artificially generated data. The algorithm is applied to genome-wide expression data of baker's yeast under various environmental conditions. We find clear cases of combinatorial control, and enrichment in common functional annotations of regulated genes and their regulators.

  2. Adaptive Thouless-Anderson-Palmer approach to inverse Ising problems with quenched random fields

    NASA Astrophysics Data System (ADS)

    Huang, Haiping; Kabashima, Yoshiyuki

    2013-06-01

    The adaptive Thouless-Anderson-Palmer equation is derived for inverse Ising problems in the presence of quenched random fields. We test the proposed scheme on Sherrington-Kirkpatrick, Hopfield, and random orthogonal models and find that the adaptive Thouless-Anderson-Palmer approach allows accurate inference of quenched random fields whose distribution can be either Gaussian or bimodal. In particular, another competitive method for inferring external fields, namely, the naive mean field method with diagonal weights, is compared and discussed.
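
    For orientation, the "naive mean field method" used as a baseline in this comparison can be sketched in a few lines: couplings come from the inverse correlation matrix and fields from the mean-field self-consistency relation. This is the standard nMF reconstruction, not the adaptive TAP scheme of the paper, and the Metropolis-sampled test system below is synthetic.

```python
import numpy as np

def nmf_inverse_ising(samples):
    """Naive mean-field reconstruction of couplings J and fields h
    from +/-1 spin samples (rows = configurations)."""
    m = samples.mean(axis=0)                         # magnetizations
    C = np.cov(samples, rowvar=False)                # connected correlations
    J = -np.linalg.inv(C)                            # nMF couplings
    np.fill_diagonal(J, 0.0)                         # no self-couplings
    h = np.arctanh(np.clip(m, -0.99, 0.99)) - J @ m  # mean-field self-consistency
    return J, h

# Synthetic test: spins sampled by Metropolis dynamics on random J, h.
rng = np.random.default_rng(1)
N = 10
J_true = rng.normal(0, 0.3 / np.sqrt(N), (N, N)); J_true = (J_true + J_true.T) / 2
np.fill_diagonal(J_true, 0.0)
h_true = rng.normal(0, 0.3, N)

s = rng.choice([-1, 1], N).astype(float)
samples = []
for t in range(200000):
    i = rng.integers(N)
    dE = 2 * s[i] * (J_true[i] @ s + h_true[i])   # energy change for flipping spin i
    if dE < 0 or rng.random() < np.exp(-dE):
        s[i] = -s[i]
    if t % 20 == 0:
        samples.append(s.copy())

J_hat, h_hat = nmf_inverse_ising(np.array(samples))
print("field reconstruction correlation:", np.corrcoef(h_true, h_hat)[0, 1])
```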

  3. Directions of strong winds on Mars inferred

    NASA Technical Reports Server (NTRS)

    Howard, A. D.

    1972-01-01

    Asymmetrical crater shadings and diffuse light and dark streaks visible on the photography returned by the 1969 Mars flyby of Mariners 6 and 7 are probably eolian in origin. Wind directions inferred from mapping of these features parallel motions of observed global dust storms or relate to expected patterns of topographic funneling of winds.

  4. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDGs based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related and organized-crime networks.

  5. Efficient Bayesian inference for ARFIMA processes

    NASA Astrophysics Data System (ADS)

    Graves, T.; Gramacy, R. B.; Franzke, C. L. E.; Watkins, N. W.

    2015-03-01

    Many geophysical quantities, like atmospheric temperature, water levels in rivers, and wind speeds, have shown evidence of long-range dependence (LRD). LRD means that these quantities experience non-trivial temporal memory, which potentially enhances their predictability, but also hampers the detection of externally forced trends. Thus, it is important to reliably identify whether or not a system exhibits LRD. In this paper we present a modern and systematic approach to the inference of LRD. Rather than Mandelbrot's fractional Gaussian noise, we use the more flexible Autoregressive Fractional Integrated Moving Average (ARFIMA) model which is widely used in time series analysis, and of increasing interest in climate science. Unlike most previous work on the inference of LRD, which is frequentist in nature, we provide a systematic treatment of Bayesian inference. In particular, we provide a new approximate likelihood for efficient parameter inference, and show how nuisance parameters (e.g. short memory effects) can be integrated over in order to focus on long memory parameters, and hypothesis testing more directly. We illustrate our new methodology on the Nile water level data, with favorable comparison to the standard estimators.
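
    The sketch below is not the paper's Bayesian machinery; it shows the standard Whittle (spectral) approximation to the ARFIMA(0, d, 0) likelihood, a common building block for inferring the long-memory parameter d, applied to a synthetic fractionally differenced series.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def whittle_neg_loglik(d, x):
    """Whittle approximation to the ARFIMA(0, d, 0) negative log-likelihood
    with the innovation variance profiled out."""
    n = len(x)
    lam = 2 * np.pi * np.arange(1, n // 2 + 1) / n                 # Fourier frequencies
    I = np.abs(np.fft.fft(x - x.mean())[1:n // 2 + 1]) ** 2 / (2 * np.pi * n)
    g = np.abs(2 * np.sin(lam / 2)) ** (-2 * d)                    # ARFIMA(0,d,0) spectral shape
    return np.sum(np.log(g)) + len(lam) * np.log(np.mean(I / g))

def estimate_d(x):
    res = minimize_scalar(whittle_neg_loglik, bounds=(-0.49, 0.49),
                          args=(x,), method="bounded")
    return res.x

# Quick check on fractionally differenced white noise with d = 0.3,
# simulated via the truncated MA(infinity) expansion of (1 - B)^(-d).
rng = np.random.default_rng(2)
n, d_true = 4096, 0.3
psi = np.cumprod(np.r_[1.0, (d_true + np.arange(n - 1)) / (1 + np.arange(n - 1))])
x = np.convolve(rng.standard_normal(2 * n), psi, mode="full")[n:2 * n]
print("estimated d:", round(estimate_d(x), 3))
```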

  6. Statistical inference for serial dilution assay data.

    PubMed

    Lee, M L; Whitmore, G A

    1999-12-01

    Serial dilution assays are widely employed for estimating substance concentrations and minimum inhibitory concentrations. The Poisson-Bernoulli model for such assays is appropriate for count data but not for continuous measurements that are encountered in applications involving substance concentrations. This paper presents practical inference methods based on a log-normal model and illustrates these methods using a case application involving bacterial toxins.

  7. Permutation inference for the general linear model

    PubMed Central

    Winkler, Anderson M.; Ridgway, Gerard R.; Webster, Matthew A.; Smith, Stephen M.; Nichols, Thomas E.

    2014-01-01

    Permutation methods can provide exact control of false positives and allow the use of non-standard statistics, making only weak assumptions about the data. With the availability of fast and inexpensive computing, their main limitation would be some lack of flexibility to work with arbitrary experimental designs. In this paper we report on results on approximate permutation methods that are more flexible with respect to the experimental design and nuisance variables, and conduct detailed simulations to identify the best method for settings that are typical for imaging research scenarios. We present a generic framework for permutation inference for complex general linear models (glms) when the errors are exchangeable and/or have a symmetric distribution, and show that, even in the presence of nuisance effects, these permutation inferences are powerful while providing excellent control of false positives in a wide range of common and relevant imaging research scenarios. We also demonstrate how the inference on glm parameters, originally intended for independent data, can be used in certain special but useful cases in which independence is violated. Detailed examples of common neuroimaging applications are provided, as well as a complete algorithm – the “randomise” algorithm – for permutation inference with the glm. PMID:24530839
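
    A toy version of the underlying idea, stripped of nuisance handling, exchangeability blocks, and the full GLM generality of the randomise algorithm: permute group labels, recompute the statistic, and use the maximum over outcome variables for family-wise error control.

```python
import numpy as np

def perm_test_two_groups(Y, group, n_perm=5000, seed=0):
    """Permutation p-values for a group-difference contrast, one per column
    of Y, with FWE correction via the maximum statistic over columns."""
    rng = np.random.default_rng(seed)
    group = np.asarray(group)

    def tstats(g):
        a, b = Y[g == 1], Y[g == 0]
        se = np.sqrt(a.var(axis=0, ddof=1) / len(a) + b.var(axis=0, ddof=1) / len(b))
        return (a.mean(axis=0) - b.mean(axis=0)) / se

    t_obs = tstats(group)
    max_null = np.empty(n_perm)
    for p in range(n_perm):
        max_null[p] = np.abs(tstats(rng.permutation(group))).max()
    # FWE-corrected p-value: how often the null max statistic beats each observed t.
    return (1 + np.sum(max_null[:, None] >= np.abs(t_obs), axis=0)) / (n_perm + 1)

# Toy data: 3 outcome variables, only the first carries a real group effect.
rng = np.random.default_rng(3)
group = np.repeat([0, 1], 20)
Y = rng.standard_normal((40, 3))
Y[group == 1, 0] += 1.0
print(perm_test_two_groups(Y, group))
```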

  8. Perceptual inferences about indeterminate arrangements of figures.

    PubMed

    Moreno-Ríos, Sergio; Rojas-Barahona, Cristian A; García-Madruga, Juan A

    2014-05-01

    Previous studies in spatial propositional reasoning showed that adults use a particular strategy for making representations and inferences from indeterminate descriptions (those consistent with different alternatives). They do not initially represent all the alternatives, but construct a unified mental representation that includes a kind of mental footnote. Only when the task requires access to alternatives is the unified representation re-inspected. The degree of generalisation of this proposal to other perceptual situations was evaluated in three experiments with children, adolescents and adults, using a perceptual inference task with diagrammatic premises that gave information about the location of one of three possible objects. Results obtained with this very quick perceptual task support the kind of representation proposed from propositional spatial reasoning studies. However, children and adults differed in accuracy, with the results gradually changing with age: indeterminacy leads adults to require extra time for understanding and inferring alternatives, whereas children commit errors. These results could help inform us about how people make inferences from diagrammatic information and how they come to make wrong interpretations.

  9. "Comments on Slavin": Synthesizing Causal Inferences

    ERIC Educational Resources Information Center

    Briggs, Derek C.

    2008-01-01

    When causal inferences are to be synthesized across multiple studies, efforts to establish the magnitude of a causal effect should be balanced by an effort to evaluate the generalizability of the effect. The evaluation of generalizability depends on two factors that are given little attention in current syntheses: construct validity and external…

  10. Evolutionary inference via the Poisson Indel Process.

    PubMed

    Bouchard-Côté, Alexandre; Jordan, Michael I

    2013-01-22

    We address the problem of the joint statistical inference of phylogenetic trees and multiple sequence alignments from unaligned molecular sequences. This problem is generally formulated in terms of string-valued evolutionary processes along the branches of a phylogenetic tree. The classic evolutionary process, the TKF91 model [Thorne JL, Kishino H, Felsenstein J (1991) J Mol Evol 33(2):114-124] is a continuous-time Markov chain model composed of insertion, deletion, and substitution events. Unfortunately, this model gives rise to an intractable computational problem: The computation of the marginal likelihood under the TKF91 model is exponential in the number of taxa. In this work, we present a stochastic process, the Poisson Indel Process (PIP), in which the complexity of this computation is reduced to linear. The Poisson Indel Process is closely related to the TKF91 model, differing only in its treatment of insertions, but it has a global characterization as a Poisson process on the phylogeny. Standard results for Poisson processes allow key computations to be decoupled, which yields the favorable computational profile of inference under the PIP model. We present illustrative experiments in which Bayesian inference under the PIP model is compared with separate inference of phylogenies and alignments.

  11. Causal Inferences with Group Based Trajectory Models

    ERIC Educational Resources Information Center

    Haviland, Amelia M.; Nagin, Daniel S.

    2005-01-01

    A central theme of research on human development and psychopathology is whether a therapeutic intervention or a turning-point event, such as a family break-up, alters the trajectory of the behavior under study. This paper lays out and applies a method for using observational longitudinal data to make more confident causal inferences about the…

  12. The Role of Inference in Effective Communication.

    ERIC Educational Resources Information Center

    Brown, Paula M.; Dell, Gary S.

    A study was conducted to determine whether speakers vary the explicitness of a message in accordance with a listener's likelihood of inferring the intended information. Thirty-six hearing and hearing-impaired college students were asked to read a series of 20 paragraphs. After each one, they were to re-tell the story in their own words to the…

  13. HIERARCHICAL PROBABILISTIC INFERENCE OF COSMIC SHEAR

    SciTech Connect

    Schneider, Michael D.; Dawson, William A.; Hogg, David W.; Marshall, Philip J.; Bard, Deborah J.; Meyers, Joshua; Lang, Dustin

    2015-07-01

    Point estimators for the shearing of galaxy images induced by gravitational lensing involve a complex inverse problem in the presence of noise, pixelization, and model uncertainties. We present a probabilistic forward modeling approach to gravitational lensing inference that has the potential to mitigate the biased inferences in most common point estimators and is practical for upcoming lensing surveys. The first part of our statistical framework requires specification of a likelihood function for the pixel data in an imaging survey given parameterized models for the galaxies in the images. We derive the lensing shear posterior by marginalizing over all intrinsic galaxy properties that contribute to the pixel data (i.e., not limited to galaxy ellipticities) and learn the distributions for the intrinsic galaxy properties via hierarchical inference with a suitably flexible conditional probability distribution specification. We use importance sampling to separate the modeling of small imaging areas from the global shear inference, thereby rendering our algorithm computationally tractable for large surveys. With simple numerical examples we demonstrate the improvements in accuracy from our importance sampling approach, as well as the significance of the conditional distribution specification for the intrinsic galaxy properties when the data are generated from an unknown number of distinct galaxy populations with different morphological characteristics.

  14. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.
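
    A small Monte Carlo illustration of the model described above, with made-up numbers: each remaining fault has its own exponential detection time, the observed failure times are the order statistics, and (by memorylessness) reliability over a future mission depends on the summed rates of the faults not yet found.

```python
import numpy as np

rng = np.random.default_rng(4)
n_faults = 30
rates = rng.uniform(0.001, 0.05, n_faults)   # per-hour detection rates (assumed)
k_observed = 20                              # failures observed so far
mission = 100.0                              # future mission length in hours

n_rep, rel = 20000, 0.0
for _ in range(n_rep):
    detect = rng.exponential(1.0 / rates)    # each fault's exponential detection time
    order = np.argsort(detect)               # observed failures = order statistics
    remaining = rates[order[k_observed:]]    # faults still present after k failures
    # By memorylessness, time to next failure ~ Exp(sum of remaining rates),
    # so reliability over the mission is exp(-sum(remaining rates) * mission).
    rel += np.exp(-remaining.sum() * mission)

print("estimated reliability over mission:", round(rel / n_rep, 3))
```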

  15. Inference and the Introductory Statistics Course

    ERIC Educational Resources Information Center

    Pfannkuch, Maxine; Regan, Matt; Wild, Chris; Budgett, Stephanie; Forbes, Sharleen; Harraway, John; Parsonage, Ross

    2011-01-01

    This article sets out some of the rationale and arguments for making major changes to the teaching and learning of statistical inference in introductory courses at our universities by changing from a norm-based, mathematical approach to more conceptually accessible computer-based approaches. The core problem of the inferential argument with its…

  16. What Children Infer from Social Categories

    ERIC Educational Resources Information Center

    Diesendruck, Gil; Eldror, Ehud

    2011-01-01

    Children hold the belief that social categories have essences. We investigated what kinds of properties children feel licensed to infer about a person based on social category membership. Seventy-two 4-6-year-olds were introduced to novel social categories defined as having one internal--psychological or biological--and one external--behavioral or…

  17. Active interoceptive inference and the emotional brain

    PubMed Central

    Friston, Karl J.

    2016-01-01

    We review a recent shift in conceptions of interoception and its relationship to hierarchical inference in the brain. The notion of interoceptive inference means that bodily states are regulated by autonomic reflexes that are enslaved by descending predictions from deep generative models of our internal and external milieu. This re-conceptualization illuminates several issues in cognitive and clinical neuroscience with implications for experiences of selfhood and emotion. We first contextualize interoception in terms of active (Bayesian) inference in the brain, highlighting its enactivist (embodied) aspects. We then consider the key role of uncertainty or precision and how this might translate into neuromodulation. We next examine the implications for understanding the functional anatomy of the emotional brain, surveying recent observations on agranular cortex. Finally, we turn to theoretical issues, namely, the role of interoception in shaping a sense of embodied self and feelings. We will draw links between physiological homoeostasis and allostasis, early cybernetic ideas of predictive control and hierarchical generative models in predictive processing. The explanatory scope of interoceptive inference ranges from explanations for autism and depression, through to consciousness. We offer a brief survey of these exciting developments. This article is part of the themed issue ‘Interoception beyond homeostasis: affect, cognition and mental health’. PMID:28080966

  18. Linguistic Markers of Inference Generation While Reading

    ERIC Educational Resources Information Center

    Clinton, Virginia; Carlson, Sarah E.; Seipel, Ben

    2016-01-01

    Words can be informative linguistic markers of psychological constructs. The purpose of this study is to examine associations between word use and the process of making meaningful connections to a text while reading (i.e., inference generation). To achieve this purpose, think-aloud data from third-fifth grade students (N = 218) reading narrative…

  19. Campbell's and Rubin's Perspectives on Causal Inference

    ERIC Educational Resources Information Center

    West, Stephen G.; Thoemmes, Felix

    2010-01-01

    Donald Campbell's approach to causal inference (D. T. Campbell, 1957; W. R. Shadish, T. D. Cook, & D. T. Campbell, 2002) is widely used in psychology and education, whereas Donald Rubin's causal model (P. W. Holland, 1986; D. B. Rubin, 1974, 2005) is widely used in economics, statistics, medicine, and public health. Campbell's approach focuses on…

  20. Rates inferred from the space debris catalog

    SciTech Connect

    Canavan, G.H.

    1996-08-01

    Collision and fragmentation rates are inferred from the AFSPC space debris catalog and compared with estimates from other treatments. The collision rate is evaluated without approximation. The fragmentation rate requires additional empirical assessments. The number of fragments per collision is low compared to analytic and numerical treatments, is peaked low, and falls rapidly with altitude.

  1. Quasi-Experimental Designs for Causal Inference

    ERIC Educational Resources Information Center

    Kim, Yongnam; Steiner, Peter

    2016-01-01

    When randomized experiments are infeasible, quasi-experimental designs can be exploited to evaluate causal treatment effects. The strongest quasi-experimental designs for causal inference are regression discontinuity designs, instrumental variable designs, matching and propensity score designs, and comparative interrupted time series designs. This…

  2. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1986-01-01

    Failure times of software undergoing random debugging can be modeled as order statistics of independent but nonidentically distributed exponential random variables. Using this model inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.

  3. Double jeopardy in inferring cognitive processes

    PubMed Central

    Fific, Mario

    2014-01-01

    Inferences we make about underlying cognitive processes can be jeopardized in two ways due to problematic forms of aggregation. First, averaging across individuals is typically considered a very useful tool for removing random variability. The threat is that averaging across subjects leads to averaging across different cognitive strategies, thus harming our inferences. The second threat comes from the construction of inadequate research designs possessing a low diagnostic accuracy of cognitive processes. For that reason we introduced the systems factorial technology (SFT), which has primarily been designed to make inferences about underlying processing order (serial, parallel, coactive), stopping rule (terminating, exhaustive), and process dependency. SFT proposes that the minimal research design complexity to learn about n cognitive processes should be equal to 2^n. In addition, SFT proposes that (a) each cognitive process should be controlled by a separate experimental factor, and (b) the saliency levels of all factors should be combined in a full factorial design. In the current study, the author cross-combined the levels of jeopardies in a 2 × 2 analysis, leading to four different analysis conditions. The results indicate a decline in the diagnostic accuracy of inferences made about cognitive processes due to the presence of each jeopardy in isolation and when combined. The results warrant the development of more individual subject analyses and the utilization of full-factorial (SFT) experimental designs. PMID:25374545

  4. Causal Inferences in the Campbellian Validity System

    ERIC Educational Resources Information Center

    Lund, Thorleif

    2010-01-01

    The purpose of the present paper is to critically examine causal inferences and internal validity as defined by Campbell and co-workers. Several arguments are given against their counterfactual effect definition, and this effect definition should be considered inadequate for causal research in general. Moreover, their defined independence between…

  5. Inferring Internet Denial-of-Service Activity

    DTIC Science & Technology

    2007-11-02

    Inferring Internet Denial-of-Service Activity David Moore CAIDA San Diego Supercomputer Center University of California, San Diego dmoore@caida.org...the local network topology. kc claffy and Colleen Shannon at CAIDA provided support and valuable feedback throughout the project. David Wetherall

  6. Investigating Mathematics Teachers' Thoughts of Statistical Inference

    ERIC Educational Resources Information Center

    Yang, Kai-Lin

    2012-01-01

    Research on statistical cognition and application suggests that statistical inference concepts are commonly misunderstood by students and even misinterpreted by researchers. Although some research has been done on students' misunderstanding or misconceptions of confidence intervals (CIs), few studies explore either students' or mathematics…

  7. An Efficient Approach in Analysis of DNA Base Calling Using Neural Fuzzy Model

    PubMed Central

    2017-01-01

    This paper addresses the problem of obtaining a faithful representation and a reliable quality measure for DNA base calling. The method deals with data-set quality in the analysis of DNA sequencing and investigates the use of neuro-fuzzy techniques to predict a confidence value for each base in DNA base calling. The simulation model is an ANFIS design consisting of three subsystems and a main system: the subsystems extract three features, which the main system then uses to predict the confidence value for each base. The approach achieves effective results with high performance. PMID:28261268

  8. Comparison of polynomial and neural fuzzy models as applied to the ethanolamine pulping of vine shoots.

    PubMed

    Jiménez, L; Angulo, V; Caparrós, S; Ariza, J

    2007-12-01

    The influence of operational variables in the pulping of vine shoots by use of ethanolamine [viz. temperature (155-185 degrees C), cooking time (30-90 min) and ethanolamine concentration (50-70% v/v)] on the properties of the resulting pulp (viz. yield, kappa index, viscosity and drainability) was studied. A central composite factorial design was used in conjunction with the software BMDP and ANFIS Edit Matlab 6.5 to develop polynomial and fuzzy neural models that reproduced the experimental results of the dependent variables with errors less than 10%. Both types of models are therefore effective with a view to simulating the ethanolamine pulping process. Based on the proposed equations, the best choice is to use values of the operational variables resulting in near-optimal pulp properties while saving energy and immobilized capital on industrial facilities by using lower temperatures and shorter processing times. One combination leading to near-optimal properties with reduced costs is using a temperature of 180 degrees C and an ethanolamine concentration of 60% for 60 min, to obtain pulp with a viscosity 6.13% lower than the maximum value (932.8 ml/g) and a drainability 5.49% lower than the maximum value (71 °SR).
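
    The polynomial (response-surface) side of such a comparison can be sketched as an ordinary least-squares fit of a full second-order model in the three coded operating variables. The design points and yield values below are invented placeholders, not the paper's measurements.

```python
import numpy as np
from itertools import combinations

def quadratic_design_matrix(X):
    """Columns: intercept, linear, squared, and pairwise interaction terms."""
    cols = [np.ones(len(X))]
    cols += [X[:, j] for j in range(X.shape[1])]
    cols += [X[:, j] ** 2 for j in range(X.shape[1])]
    cols += [X[:, i] * X[:, j] for i, j in combinations(range(X.shape[1]), 2)]
    return np.column_stack(cols)

# Coded levels (-1, 0, +1) for temperature, time, and ethanolamine concentration,
# plus hypothetical pulp-yield responses.
rng = np.random.default_rng(5)
X = np.array([[a, b, c] for a in (-1, 0, 1) for b in (-1, 0, 1) for c in (-1, 0, 1)], float)
y = 55 - 4 * X[:, 0] - 2 * X[:, 1] - 1.5 * X[:, 2] + 1.2 * X[:, 0] * X[:, 1] \
    + rng.normal(0, 0.5, len(X))

A = quadratic_design_matrix(X)
coef, *_ = np.linalg.lstsq(A, y, rcond=None)   # least-squares polynomial model
pred = A @ coef
print("max relative error (%):", round(100 * np.max(np.abs(pred - y) / y), 2))
```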

  9. Neural-fuzzy controller for real-time mobile robot navigation

    NASA Astrophysics Data System (ADS)

    Ng, Kim C.; Trivedi, Mohan M.

    1996-06-01

    A neural integrated fuzzy controller (NiF-T), which integrates the fuzzy logic representation of human knowledge with the learning capability of neural networks, is developed for nonlinear dynamic control problems. The NiF-T architecture comprises three distinct parts: (1) fuzzy logic membership functions (FMF), (2) a rule neural network (RNN), and (3) an output-refinement neural network (ORNN). FMF are utilized to fuzzify input parameters. RNN interpolates the fuzzy rule set; after defuzzification, the output is used to train ORNN. The weights of the ORNN can be adjusted on-line to fine-tune the controller. NiF-T can be applied to a wide range of sensor-driven robotics applications, which are characterized by high noise levels and nonlinear behavior, and where system models are unavailable or unreliable. In this paper, real-time implementations of autonomous mobile robot navigation utilizing the NiF-T are realized. Only five rules were used to train the wall-following behavior, while nine were used for hall centering. With learning capability, the robot, SMAR-T, successfully and reliably hugs the wall and locks onto the hall center. For all of the described behaviors, the RNNs are trained for only a few hundred iterations, and the ORNNs for fewer than one hundred, to learn their parent rule sets.
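
    A generic fuzzify/interpolate/defuzzify step in the spirit of the fuzzy-logic front end described above; the neural rule-interpolation and output-refinement stages of NiF-T are not reproduced, and the membership breakpoints and rule table are arbitrary choices for illustration.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with peak at b."""
    return np.maximum(0.0, np.minimum((x - a) / (b - a + 1e-9), (c - x) / (c - b + 1e-9)))

def wall_following_turn(dist_err, heading_err):
    """dist_err: wall distance minus setpoint (m); heading_err: rad.
    Returns a turn command in [-1, 1] by weighted-average defuzzification."""
    d = {"neg": tri(dist_err, -1.0, -0.5, 0.0),
         "zero": tri(dist_err, -0.5, 0.0, 0.5),
         "pos": tri(dist_err, 0.0, 0.5, 1.0)}
    h = {"neg": tri(heading_err, -1.0, -0.5, 0.0),
         "zero": tri(heading_err, -0.5, 0.0, 0.5),
         "pos": tri(heading_err, 0.0, 0.5, 1.0)}
    # Rule table: (distance label, heading label) -> crisp turn consequent.
    rules = {("neg", "neg"): 0.8, ("neg", "zero"): 0.5, ("neg", "pos"): 0.0,
             ("zero", "neg"): 0.4, ("zero", "zero"): 0.0, ("zero", "pos"): -0.4,
             ("pos", "neg"): 0.0, ("pos", "zero"): -0.5, ("pos", "pos"): -0.8}
    num = den = 0.0
    for (dl, hl), out in rules.items():
        w = min(d[dl], h[hl])        # rule firing strength (min T-norm)
        num += w * out
        den += w
    return num / den if den > 0 else 0.0

print(wall_following_turn(0.3, -0.2))   # crisp turn command for a sample error pair
```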

  10. Comparative evaluation of pattern recognition algorithms: statistical, neural, fuzzy, and neuro-fuzzy techniques

    NASA Astrophysics Data System (ADS)

    Mitra, Sunanda; Castellanos, Ramiro

    1998-10-01

    Pattern recognition by fuzzy, neural, and neuro-fuzzy approaches has gained popularity partly because of the intelligent decision processes involved in some of the above techniques, thus providing better classification, and partly because of the simplicity in computation required by these methods as opposed to traditional statistical approaches for complex data structures. However, the accuracy of pattern classification by various methods is often not considered. This paper considers the performance of major fuzzy, neural, and neuro-fuzzy pattern recognition algorithms and compares their performances with common statistical methods for the same data sets. For the specific data sets chosen, namely the Iris data set and the small Soybean data set, two neuro-fuzzy algorithms, AFLC and IAFC, outperform other well-known fuzzy, neural, and neuro-fuzzy algorithms in minimizing the classification error and equal the performance of the Bayesian classification. AFLC and IAFC also demonstrate excellent learning vector quantization capability in generating optimal code books for coding and decoding of large color images at very low bit rates with exceptionally high visual fidelity.

  11. Inference of R(0) and transmission heterogeneity from the size distribution of stuttering chains.

    PubMed

    Blumberg, Seth; Lloyd-Smith, James O

    2013-01-01

    For many infectious disease processes such as emerging zoonoses and vaccine-preventable diseases, R0 < 1 and infections occur as self-limited stuttering transmission chains. A mechanistic understanding of transmission is essential for characterizing the risk of emerging diseases and monitoring spatio-temporal dynamics. Thus methods for inferring R0 and the degree of heterogeneity in transmission from stuttering chain data have important applications in disease surveillance and management. Previous researchers have used chain size distributions to infer R0, but estimation of the degree of individual-level variation in infectiousness (as quantified by the dispersion parameter, k) has typically required contact tracing data. Utilizing branching process theory along with a negative binomial offspring distribution, we demonstrate how maximum likelihood estimation can be applied to chain size data to infer both R0 and the dispersion parameter that characterizes heterogeneity. While the maximum likelihood value for R0 is a simple function of the average chain size, the associated confidence intervals are dependent on the inferred degree of transmission heterogeneity. As demonstrated for monkeypox data from the Democratic Republic of Congo, this impacts when a statistically significant change in R0 is detectable. In addition, by allowing for superspreading events, inference of k shifts the threshold above which a transmission chain should be considered anomalously large for a given value of R0 (thus reducing the probability of false alarms about pathogen adaptation). Our analysis of monkeypox also clarifies the various ways that imperfect observation can impact inference of transmission parameters, and highlights the need to quantitatively evaluate whether observation is likely to significantly bias results.
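
    A sketch of the maximum-likelihood step, using the closed-form chain-size distribution for a branching process with negative-binomial offspring (mean R0, dispersion k) as reported in this literature; the surveillance counts below are hypothetical.

```python
import numpy as np
from scipy.special import gammaln
from scipy.optimize import minimize

def log_pmf_chain_size(j, R0, k):
    """log P(total chain size = j) under negative-binomial offspring (mean R0, dispersion k)."""
    j = np.asarray(j, dtype=float)
    return (gammaln(k * j + j - 1) - gammaln(k * j) - gammaln(j + 1)
            + (j - 1) * np.log(R0 / k) - (k * j + j - 1) * np.log1p(R0 / k))

def fit_R0_k(chain_sizes):
    """Joint MLE of (R0, k) by direct optimization over log-parameters."""
    nll = lambda p: -np.sum(log_pmf_chain_size(chain_sizes, np.exp(p[0]), np.exp(p[1])))
    res = minimize(nll, x0=[np.log(0.5), np.log(0.5)], method="Nelder-Mead")
    return np.exp(res.x)

# Hypothetical surveillance data: mostly isolated cases, a few larger chains.
chains = np.array([1] * 60 + [2] * 12 + [3] * 5 + [4] * 2 + [7, 10])
R0_hat, k_hat = fit_R0_k(chains)
print(f"R0 ~ {R0_hat:.2f}, dispersion k ~ {k_hat:.2f}")
```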

  12. ANUBIS: artificial neuromodulation using a Bayesian inference system.

    PubMed

    Smith, Benjamin J H; Saaj, Chakravarthini M; Allouis, Elie

    2013-01-01

    Gain tuning is a crucial part of controller design and depends not only on an accurate understanding of the system in question, but also on the designer's ability to predict what disturbances and other perturbations the system will encounter throughout its operation. This letter presents ANUBIS (artificial neuromodulation using a Bayesian inference system), a novel biologically inspired technique for automatically tuning controller parameters in real time. ANUBIS is based on the Bayesian brain concept and modifies it by incorporating a model of the neuromodulatory system comprising four artificial neuromodulators. It has been applied to the controller of EchinoBot, a prototype walking rover for Martian exploration. ANUBIS has been implemented at three levels of the controller; gait generation, foot trajectory planning using Bézier curves, and foot trajectory tracking using a terminal sliding mode controller. We compare the results to a similar system that has been tuned using a multilayer perceptron. The use of Bayesian inference means that the system retains mathematical interpretability, unlike other intelligent tuning techniques, which use neural networks, fuzzy logic, or evolutionary algorithms. The simulation results show that ANUBIS provides significant improvements in efficiency and adaptability of the three controller components; it allows the robot to react to obstacles and uncertainties faster than the system tuned with the MLP, while maintaining stability and accuracy. As well as advancing rover autonomy, ANUBIS could also be applied to other situations where operating conditions are likely to change or cannot be accurately modeled in advance, such as process control. In addition, it demonstrates one way in which neuromodulation could fit into the Bayesian brain framework.

  13. MIDER: Network Inference with Mutual Information Distance and Entropy Reduction

    PubMed Central

    Villaverde, Alejandro F.; Ross, John; Morán, Federico; Banga, Julio R.

    2014-01-01

    The prediction of links among variables from a given dataset is a task referred to as network inference or reverse engineering. It is an open problem in bioinformatics and systems biology, as well as in other areas of science. Information theory, which uses concepts such as mutual information, provides a rigorous framework for addressing it. While a number of information-theoretic methods are already available, most of them focus on a particular type of problem, introducing assumptions that limit their generality. Furthermore, many of these methods lack a publicly available implementation. Here we present MIDER, a method for inferring network structures with information theoretic concepts. It consists of two steps: first, it provides a representation of the network in which the distance among nodes indicates their statistical closeness. Second, it refines the prediction of the existing links to distinguish between direct and indirect interactions and to assign directionality. The method accepts as input time-series data related to some quantitative features of the network nodes (such as e.g. concentrations, if the nodes are chemical species). It takes into account time delays between variables, and allows choosing among several definitions and normalizations of mutual information. It is general purpose: it may be applied to any type of network, cellular or otherwise. A Matlab implementation including source code and data is freely available (http://www.iim.csic.es/~gingproc/mider.html). The performance of MIDER has been evaluated on seven different benchmark problems that cover the main types of cellular networks, including metabolic, gene regulatory, and signaling. Comparisons with state of the art information-theoretic methods have demonstrated the competitive performance of MIDER, as well as its versatility. Its use does not demand any a priori knowledge from the user; the default settings and the adaptive nature of the method provide good results for a wide range of problems.
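
    A bare-bones version of the first ingredient only: pairwise mutual information between time series, estimated by histogram binning, used to score candidate links. MIDER's distance representation, time delays, and entropy-reduction step for pruning indirect links are not included.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram (plug-in) estimate of the mutual information between two 1-D series, in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz])))

def mi_link_scores(data, bins=8):
    """data: array of shape (time, variables). Returns a symmetric MI matrix."""
    n = data.shape[1]
    M = np.zeros((n, n))
    for i in range(n):
        for j in range(i + 1, n):
            M[i, j] = M[j, i] = mutual_information(data[:, i], data[:, j], bins)
    return M

# Toy network: x0 drives x1, while x2 is independent noise.
rng = np.random.default_rng(6)
x0 = rng.standard_normal(2000)
x1 = 0.8 * x0 + 0.3 * rng.standard_normal(2000)
x2 = rng.standard_normal(2000)
print(np.round(mi_link_scores(np.column_stack([x0, x1, x2])), 3))
```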

  14. Inferring network topology via the propagation process

    NASA Astrophysics Data System (ADS)

    Zeng, An

    2013-11-01

    Inferring the network topology from the dynamics is a fundamental problem, with wide applications in geology, biology, and even counter-terrorism. Based on the propagation process, we present a simple method to uncover the network topology. A numerical simulation on artificial networks shows that our method enjoys a high accuracy in inferring the network topology. We find that the infection rate in the propagation process significantly influences the accuracy, and that each network corresponds to an optimal infection rate. Moreover, the method generally works better in large networks. These findings are confirmed in both real social and nonsocial networks. Finally, the method is extended to directed networks, and a similarity measure specific for directed networks is designed.

  15. Interoceptive inference, emotion, and the embodied self.

    PubMed

    Seth, Anil K

    2013-11-01

    The concept of the brain as a prediction machine has enjoyed a resurgence in the context of the Bayesian brain and predictive coding approaches within cognitive science. To date, this perspective has been applied primarily to exteroceptive perception (e.g., vision, audition), and action. Here, I describe a predictive, inferential perspective on interoception: 'interoceptive inference' conceives of subjective feeling states (emotions) as arising from actively-inferred generative (predictive) models of the causes of interoceptive afferents. The model generalizes 'appraisal' theories that view emotions as emerging from cognitive evaluations of physiological changes, and it sheds new light on the neurocognitive mechanisms that underlie the experience of body ownership and conscious selfhood in health and in neuropsychiatric illness.

  16. Pointwise probability reinforcements for robust statistical inference.

    PubMed

    Frénay, Benoît; Verleysen, Michel

    2014-02-01

    Statistical inference using machine learning techniques may be difficult with small datasets because of abnormally frequent data (AFDs). AFDs are observations that are much more frequent in the training sample than they should be, with respect to their theoretical probability, and include e.g. outliers. Estimates of parameters tend to be biased towards models which support such data. This paper proposes to introduce pointwise probability reinforcements (PPRs): the probability of each observation is reinforced by a PPR and a regularisation allows controlling the amount of reinforcement which compensates for AFDs. The proposed solution is very generic, since it can be used to robustify any statistical inference method which can be formulated as a likelihood maximisation. Experiments show that PPRs can be easily used to tackle regression, classification and projection: models are freed from the influence of outliers. Moreover, outliers can be filtered manually since an abnormality degree is obtained for each observation.
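
    The flavour of the idea can be conveyed with a related, simplified construction (not the authors' PPR formulation): give every observation its own L1-penalised slack so that only points the model cannot explain absorb a correction, alternating between updating the regression weights and soft-thresholding the slacks.

```python
import numpy as np

def robust_lstsq(X, y, lam=1.0, n_iter=50):
    """Least squares with per-observation L1-penalised slacks s:
    minimize ||y - X b - s||^2 + lam * ||s||_1 by alternating minimisation."""
    s = np.zeros(len(y))
    for _ in range(n_iter):
        b, *_ = np.linalg.lstsq(X, y - s, rcond=None)           # update weights
        r = y - X @ b
        s = np.sign(r) * np.maximum(np.abs(r) - lam / 2, 0.0)   # soft-threshold residuals
    return b, s

# A line with two gross outliers: the slacks soak them up.
rng = np.random.default_rng(7)
x = np.linspace(0, 1, 50)
y = 2.0 * x + 0.5 + 0.05 * rng.standard_normal(50)
y[[10, 40]] += 5.0
X = np.column_stack([x, np.ones_like(x)])
b, s = robust_lstsq(X, y, lam=1.0)
print("slope, intercept:", np.round(b, 2), "| flagged points:", np.nonzero(s)[0])
```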

  17. Neural Circuit Inference from Function to Structure.

    PubMed

    Real, Esteban; Asari, Hiroki; Gollisch, Tim; Meister, Markus

    2017-01-23

    Advances in technology are opening new windows on the structural connectivity and functional dynamics of brain circuits. Quantitative frameworks are needed that integrate these data from anatomy and physiology. Here, we present a modeling approach that creates such a link. The goal is to infer the structure of a neural circuit from sparse neural recordings, using partial knowledge of its anatomy as a regularizing constraint. We recorded visual responses from the output neurons of the retina, the ganglion cells. We then generated a systematic sequence of circuit models that represents retinal neurons and connections and fitted them to the experimental data. The optimal models faithfully recapitulated the ganglion cell outputs. More importantly, they made predictions about dynamics and connectivity among unobserved neurons internal to the circuit, and these were subsequently confirmed by experiment. This circuit inference framework promises to facilitate the integration and understanding of big data in neuroscience.

  18. Dopamine, reward learning, and active inference

    PubMed Central

    FitzGerald, Thomas H. B.; Dolan, Raymond J.; Friston, Karl

    2015-01-01

    Temporal difference learning models propose phasic dopamine signaling encodes reward prediction errors that drive learning. This is supported by studies where optogenetic stimulation of dopamine neurons can stand in lieu of actual reward. Nevertheless, a large body of data also shows that dopamine is not necessary for learning, and that dopamine depletion primarily affects task performance. We offer a resolution to this paradox based on an hypothesis that dopamine encodes the precision of beliefs about alternative actions, and thus controls the outcome-sensitivity of behavior. We extend an active inference scheme for solving Markov decision processes to include learning, and show that simulated dopamine dynamics strongly resemble those actually observed during instrumental conditioning. Furthermore, simulated dopamine depletion impairs performance but spares learning, while simulated excitation of dopamine neurons drives reward learning, through aberrant inference about outcome states. Our formal approach provides a novel and parsimonious reconciliation of apparently divergent experimental findings. PMID:26581305

  19. The empirical accuracy of uncertain inference models

    NASA Technical Reports Server (NTRS)

    Vaughan, David S.; Yadrick, Robert M.; Perrin, Bruce M.; Wise, Ben P.

    1987-01-01

    Uncertainty is a pervasive feature of the domains in which expert systems are designed to function. Research designed to test uncertain inference methods for accuracy and robustness, in accordance with standard engineering practice, is reviewed. Several studies were conducted to assess how well various methods perform on problems constructed so that correct answers are known, and to find out what underlying features of a problem cause strong or weak performance. For each method studied, situations were identified in which performance deteriorates dramatically. Over a broad range of problems, some well known methods do only about as well as a simple linear regression model, and often much worse than a simple independence probability model. The results indicate that some commercially available expert system shells should be used with caution, because the uncertain inference models that they implement can yield rather inaccurate results.

  20. The NIFTY way of Bayesian signal inference

    SciTech Connect

    Selig, Marco

    2014-12-05

    We introduce NIFTY, 'Numerical Information Field Theory', a software package for the development of Bayesian signal inference algorithms that operate independently from any underlying spatial grid and its resolution. A large number of Bayesian and Maximum Entropy methods for 1D signal reconstruction, 2D imaging, as well as 3D tomography, appear formally similar, but one often finds individualized implementations that are neither flexible nor easily transferable. Signal inference in the framework of NIFTY can be done in an abstract way, such that algorithms, prototyped in 1D, can be applied to real world problems in higher-dimensional settings. NIFTY as a versatile library is applicable and already has been applied in 1D, 2D, 3D and spherical settings. A recent application is the D³PO algorithm targeting the non-trivial task of denoising, deconvolving, and decomposing photon observations in high energy astronomy.

  1. Spectral likelihood expansions for Bayesian inference

    NASA Astrophysics Data System (ADS)

    Nagel, Joseph B.; Sudret, Bruno

    2016-03-01

    A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
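
    A one-parameter toy of the approach, under the assumption of a uniform prior on [-1, 1] and a single Gaussian observation: expand the likelihood in Legendre polynomials by least squares; orthogonality then gives the evidence and posterior mean directly from the first two coefficients (Z = c0 and E[theta | y] = c1 / (3 c0)).

```python
import numpy as np
from numpy.polynomial import legendre as L

# Data model: one observation y ~ N(theta, sigma^2), uniform prior on [-1, 1].
y_obs, sigma = 0.3, 0.4
lik = lambda th: np.exp(-0.5 * ((y_obs - th) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Spectral likelihood expansion: least-squares fit of Legendre coefficients.
nodes = np.linspace(-1, 1, 201)
c = L.legfit(nodes, lik(nodes), deg=12)

# Orthogonality turns the coefficients into posterior quantities directly.
evidence = c[0]                 # Z = integral of L(theta) * (1/2) over [-1, 1]
post_mean = c[1] / (3 * c[0])   # E[theta | y]

# Brute-force quadrature check of both quantities.
Z_ref = np.trapz(lik(nodes) * 0.5, nodes)
m_ref = np.trapz(nodes * lik(nodes) * 0.5, nodes) / Z_ref
print(round(evidence, 4), round(Z_ref, 4), round(post_mean, 4), round(m_ref, 4))
```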

  2. Unified Theory of Inference for Text Understanding

    DTIC Science & Technology

    1986-11-25

    restaurant script is recognized, script application would lead to inferences such as identifying the waiter as 'the waiter who is employed by the...relations between the objects. Objects have names as a convenience for the system modeler, but the names are not used for purposes other than...intent is that we can consider talking to be a frame with a talker slot which must be filled by a person. This is just a convenient notation; the

  3. Inferring Trust Based on Similarity with TILLIT

    NASA Astrophysics Data System (ADS)

    Tavakolifard, Mozhgan; Herrmann, Peter; Knapskog, Svein J.

    A network of people having established trust relations and a model for propagation of related trust scores are fundamental building blocks in many of today’s most successful e-commerce and recommendation systems. However, the web of trust is often too sparse to predict trust values between non-familiar people with high accuracy. Trust inferences are transitive associations among users in the context of an underlying social network and may provide additional information to alleviate the consequences of the sparsity and possible cold-start problems. Such approaches are helpful, provided that a complete trust path exists between the two users. An alternative approach to the problem is advocated in this paper. Based on collaborative filtering one can exploit the like-mindedness resp. similarity of individuals to infer trust to yet unknown parties which increases the trust relations in the web. For instance, if one knows that with respect to a specific property, two parties are trusted alike by a large number of different trusters, one can assume that they are similar. Thus, if one has a certain degree of trust to the one party, one can safely assume a very similar trustworthiness of the other one. In an attempt to provide high quality recommendations and proper initial trust values even when no complete trust propagation path or user profile exists, we propose TILLIT — a model based on combination of trust inferences and user similarity. The similarity is derived from the structure of the trust graph and users’ trust behavior as opposed to other collaborative-filtering based approaches which use ratings of items or user’s profile. We describe an algorithm realizing the approach based on a combination of trust inferences and user similarity, and validate the algorithm using a real large-scale data-set.

  4. Ambiguity and Uncertainty in Probabilistic Inference.

    DTIC Science & Technology

    1983-09-01

    Bruner, J. S. Going beyond the information given. In J. S. Bruner et al. (Eds.), Contemporary approaches to cognition. Cambridge, MA: Harvard University...the non-additivity of complementary probabilities, current psychological theories of risk, and Ellsberg's original paradox. The model is tested in...most psychological work on inference has been guided by a Bayesian or subjectivist view of probability, increasing concerns have been expressed about

  5. A Theory of Diagnostic Inference: Judging Causality.

    DTIC Science & Technology

    1983-08-01

    Lepper, 1981) and the lack of search for disconfirming hypotheses (e.g., Mynatt et al., 1977, 1978; Tweney et al., 1980), we stress that a...perception de la causalite. Paris: Vrin, 1946. Mynatt, C. R., Doherty, M. E., & Tweney, R. D. Confirmation bias in a simulated research environment: An...experimental study of scientific inference. Quarterly Journal of Experimental Psychology, 1977, 29, 85-95. Mynatt, C. R., Doherty, M. E., & Tweney, R

  6. Thermodynamics of statistical inference by cells.

    PubMed

    Lang, Alex H; Fisher, Charles K; Mora, Thierry; Mehta, Pankaj

    2014-10-03

    The deep connection between thermodynamics, computation, and information is now well established both theoretically and experimentally. Here, we extend these ideas to show that thermodynamics also places fundamental constraints on statistical estimation and learning. To do so, we investigate the constraints placed by (nonequilibrium) thermodynamics on the ability of biochemical signaling networks to estimate the concentration of an external signal. We show that accuracy is limited by energy consumption, suggesting that there are fundamental thermodynamic constraints on statistical inference.

  7. A Unified Approach to Abductive Inference

    DTIC Science & Technology

    2014-09-30

    performance hacks. Alchemy Lite allows for fast, exact inference for models formulated in terms of TML, as well as the ability to update models with...Kimelfeld (bennyk@gmail.com) Molham Aref (molham.aref@logicblox.com) Charles Rivers Analytics Avi Pfeffer (apfeffer@cra.com) Facebook ...works at Yahoo; now at Facebook) BAE systems Gregory Sullivan (gregory.sullivan@baesystems.com) Raytheon Kenric P Nelson

  8. Inferring Genetic Ancestry: Opportunities, Challenges, and Implications

    PubMed Central

    Royal, Charmaine D.; Novembre, John; Fullerton, Stephanie M.; Goldstein, David B.; Long, Jeffrey C.; Bamshad, Michael J.; Clark, Andrew G.

    2010-01-01

    Increasing public interest in direct-to-consumer (DTC) genetic ancestry testing has been accompanied by growing concern about issues ranging from the personal and societal implications of the testing to the scientific validity of ancestry inference. The very concept of “ancestry” is subject to misunderstanding in both the general and scientific communities. What do we mean by ancestry? How exactly is ancestry measured? How far back can such ancestry be defined and by which genetic tools? How do we validate inferences about ancestry in genetic research? What are the data that demonstrate our ability to do this correctly? What can we say and what can we not say from our research findings and the test results that we generate? This white paper from the American Society of Human Genetics (ASHG) Ancestry and Ancestry Testing Task Force builds upon the 2008 ASHG Ancestry Testing Summary Statement in providing a more in-depth analysis of key scientific and non-scientific aspects of genetic ancestry inference in academia and industry. It culminates with recommendations for advancing the current debate and facilitating the development of scientifically based, ethically sound, and socially attentive guidelines concerning the use of these continually evolving technologies. PMID:20466090

  9. Bayesian Inference for Skewed Stable Distributions

    NASA Astrophysics Data System (ADS)

    Shokripour, Mona; Nassiri, Vahid; Mohammadpour, Adel

    2011-03-01

    Stable distributions are a class of distributions which allow skewness and heavy tails. Non-Gaussian stable random variables play the role of the normal distribution in the central limit theorem for normalized sums of random variables with infinite variance. The lack of an analytic formula for the density and distribution functions of stable random variables has been a major drawback to their use, including for inference in the Bayesian framework. Buckle introduced priors for the parameters of stable random variables to obtain an analytic form of the posterior distribution; many other researchers have instead attacked the problem through Markov chain Monte Carlo methods, e.g. [8] and their references. In this paper a new class of heavy-tailed distributions, called skewed stable, is introduced. This class has two main advantages: it is a member of the exponential family, which brings many inferential advantages and allows Bayesian inference to be carried out as for other exponential-family distributions; and modelling of skewed data with stable distributions is dominated by this family. Finally, Bayesian inference for skewed stable distributions is compared to that for stable distributions through a few simulation studies.

  10. Can orangutans (Pongo abelii) infer tool functionality?

    PubMed

    Mulcahy, Nicholas J; Schubiger, Michèle N

    2014-05-01

    It is debatable whether apes can reason about the unobservable properties of tools. We tested orangutans for this ability with a range of tool tasks that they could solve by using observational cues to infer tool functionality. In experiment 1, subjects successfully chose an unbroken tool over a broken one when each tool's middle section was hidden. This prevented seeing which tool was functional but it could be inferred by noting the tools' visible ends that were either disjointed (broken tool) or aligned (unbroken tool). We investigated whether success in experiment 1 was best explained by inferential reasoning or by having a preference per se for a hidden tool with an aligned configuration. We conducted a similar task to experiment 1 and included a functional bent tool that could be arranged to have the same disjointed configuration as the broken tool. The results suggested that subjects had a preference per se for the aligned tool by choosing it regardless of whether it was paired with the broken tool or the functional bent tool. However, further experiments with the bent tool task suggested this preference was a result of additional demands of having to attend to and remember the properties of the tools from the beginning of the task. In our last experiment, we removed these task demands and found evidence that subjects could infer the functionality of a broken tool and an unbroken tool that both looked identical at the time of choice.

  11. Inference of magnetic fields in inhomogeneous prominences

    NASA Astrophysics Data System (ADS)

    Milić, I.; Faurobert, M.; Atanacković, O.

    2017-01-01

    Context. Most of the quantitative information about the magnetic field vector in solar prominences comes from the analysis of the Hanle effect acting on lines formed by scattering. As these lines can be of non-negligible optical thickness, it is of interest to study the line formation process further. Aims: We investigate the multidimensional effects on the interpretation of spectropolarimetric observations, particularly on the inference of the magnetic field vector. We do this by analyzing the differences between multidimensional models, which involve fully self-consistent radiative transfer computations in the presence of spatial inhomogeneities and velocity fields, and those which rely on simple one-dimensional geometry. Methods: We study the formation of a prototype line in ad hoc inhomogeneous, isothermal 2D prominence models. We solve the NLTE polarized line formation problem in the presence of a large-scale oriented magnetic field. The resulting polarized line profiles are then interpreted (i.e. inverted) assuming a simple 1D slab model. Results: We find that differences between input and the inferred magnetic field vector are non-negligible. Namely, we almost universally find that the inferred field is weaker and more horizontal than the input field. Conclusions: Spatial inhomogeneities and radiative transfer have a strong effect on scattering line polarization in the optically thick lines. In real-life situations, ignoring these effects could lead to a serious misinterpretation of spectropolarimetric observations of chromospheric objects such as prominences.

  12. Combinatorics of distance-based tree inference

    PubMed Central

    Pardi, Fabio; Gascuel, Olivier

    2012-01-01

    Several popular methods for phylogenetic inference (or hierarchical clustering) are based on a matrix of pairwise distances between taxa (or any kind of objects): The objective is to construct a tree with branch lengths so that the distances between the leaves in that tree are as close as possible to the input distances. If we hold the structure (topology) of the tree fixed, in some relevant cases (e.g., ordinary least squares) the optimal values for the branch lengths can be expressed using simple combinatorial formulae. Here we define a general form for these formulae and show that they all have two desirable properties: First, the common tree reconstruction approaches (least squares, minimum evolution), when used in combination with these formulae, are guaranteed to infer the correct tree when given enough data (consistency); second, the branch lengths of all the simple (nearest neighbor interchange) rearrangements of a tree can be calculated, optimally, in quadratic time in the size of the tree, thus allowing the efficient application of hill climbing heuristics. The study presented here is a continuation of that by Mihaescu and Pachter on branch length estimation [Mihaescu R, Pachter L (2008) Proc Natl Acad Sci USA 105:13206–13211]. The focus here is on the inference of the tree itself and on providing a basis for novel algorithms to reconstruct trees from distances. PMID:23012403

  13. Combinatorics of distance-based tree inference.

    PubMed

    Pardi, Fabio; Gascuel, Olivier

    2012-10-09

    Several popular methods for phylogenetic inference (or hierarchical clustering) are based on a matrix of pairwise distances between taxa (or any kind of objects): The objective is to construct a tree with branch lengths so that the distances between the leaves in that tree are as close as possible to the input distances. If we hold the structure (topology) of the tree fixed, in some relevant cases (e.g., ordinary least squares) the optimal values for the branch lengths can be expressed using simple combinatorial formulae. Here we define a general form for these formulae and show that they all have two desirable properties: First, the common tree reconstruction approaches (least squares, minimum evolution), when used in combination with these formulae, are guaranteed to infer the correct tree when given enough data (consistency); second, the branch lengths of all the simple (nearest neighbor interchange) rearrangements of a tree can be calculated, optimally, in quadratic time in the size of the tree, thus allowing the efficient application of hill climbing heuristics. The study presented here is a continuation of that by Mihaescu and Pachter on branch length estimation [Mihaescu R, Pachter L (2008) Proc Natl Acad Sci USA 105:13206-13211]. The focus here is on the inference of the tree itself and on providing a basis for novel algorithms to reconstruct trees from distances.

  14. Inferring Pedigree Graphs from Genetic Distances

    NASA Astrophysics Data System (ADS)

    Tamura, Takeyuki; Ito, Hiro

    In this paper, we study a problem of inferring blood relationships which satisfy a given matrix of genetic distances between all pairs of n nodes. Blood relationships are represented by our proposed graph class, which is called a pedigree graph. A pedigree graph is a directed acyclic graph in which the maximum indegree is at most two. We show that the number of pedigree graphs which satisfy the condition of given genetic distances may be exponential, but they can be represented by one directed acyclic graph with n nodes. Moreover, an O(n³) time algorithm which solves the problem is also given. Although phylogenetic trees and phylogenetic networks are similar data structures to pedigree graphs, it seems that inference methods for phylogenetic trees and networks cannot be applied to infer pedigree graphs since nodes of phylogenetic trees and networks represent species whereas nodes of pedigree graphs represent individuals. We also show an O(n²) time algorithm which detects a contradiction between a given pedigree graph and a matrix of genetic distances.

  15. Functional neuroanatomy of intuitive physical inference.

    PubMed

    Fischer, Jason; Mikhael, John G; Tenenbaum, Joshua B; Kanwisher, Nancy

    2016-08-23

    To engage with the world-to understand the scene in front of us, plan actions, and predict what will happen next-we must have an intuitive grasp of the world's physical structure and dynamics. How do the objects in front of us rest on and support each other, how much force would be required to move them, and how will they behave when they fall, roll, or collide? Despite the centrality of physical inferences in daily life, little is known about the brain mechanisms recruited to interpret the physical structure of a scene and predict how physical events will unfold. Here, in a series of fMRI experiments, we identified a set of cortical regions that are selectively engaged when people watch and predict the unfolding of physical events-a "physics engine" in the brain. These brain regions are selective to physical inferences relative to nonphysical but otherwise highly similar scenes and tasks. However, these regions are not exclusively engaged in physical inferences per se or, indeed, even in scene understanding; they overlap with the domain-general "multiple demand" system, especially the parts of that system involved in action planning and tool use, pointing to a close relationship between the cognitive and neural mechanisms involved in parsing the physical content of a scene and preparing an appropriate action.

  16. Inferring sparse networks for noisy transient processes

    NASA Astrophysics Data System (ADS)

    Tran, Hoang M.; Bukkapatnam, Satish T. S.

    2016-02-01

    Inferring causal structures of real world complex networks from measured time series signals remains an open issue. The current approaches are inadequate to discern between direct versus indirect influences (i.e., the presence or absence of a directed arc connecting two nodes) in the presence of noise, sparse interactions, as well as nonlinear and transient dynamics of real world processes. We report a sparse regression (referred to as the ℓ1-min) approach with theoretical bounds on the constraints on the allowable perturbation to recover the network structure that guarantees sparsity and robustness to noise. We also introduce averaging and perturbation procedures to further enhance prediction scores (i.e., reduce inference errors), and the numerical stability of the ℓ1-min approach. Extensive investigations have been conducted with multiple benchmark simulated genetic regulatory networks and Michaelis-Menten dynamics, as well as real world data sets from the DREAM5 challenge. These investigations suggest that our approach can significantly improve, oftentimes by 5 orders of magnitude, over the methods reported previously for inferring the structure of dynamic networks, such as Bayesian network, network deconvolution, silencing and modular response analysis methods based on optimizing for sparsity, transients, noise and high dimensionality issues.
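
    As a rough illustration of the idea (not the authors' ℓ1-min formulation or its perturbation bounds), the sketch below recovers a sparse interaction matrix by running one ℓ1-penalized (Lasso) regression per target node. For brevity, the data are instantaneous responses at randomly perturbed states rather than a full transient time series; the network, dynamics and regularization strength are all invented.

```python
# Illustrative sketch only: per-node L1-penalized regression to recover a sparse
# interaction matrix from noisy response data (generic Lasso, not the paper's l1-min).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, T = 10, 200
A_true = np.zeros((n, n))
for i in range(n):                              # up to two random incoming edges per node
    for j in rng.choice(n, size=2, replace=False):
        if j != i:
            A_true[i, j] = rng.uniform(0.2, 0.5) * rng.choice([-1, 1])

X = rng.normal(size=(T, n))                     # randomly perturbed network states
dX = X @ A_true.T - X + 0.05 * rng.normal(size=(T, n))   # noisy observed responses

A_hat = np.zeros((n, n))
for i in range(n):                              # one sparse regression per target node
    model = Lasso(alpha=0.02, max_iter=10000)
    model.fit(X, dX[:, i])
    A_hat[i] = model.coef_

off_diag = ~np.eye(n, dtype=bool)
true_edges = (A_true != 0) & off_diag
found = (np.abs(A_hat) > 0.1) & off_diag
print("recovered", int((true_edges & found).sum()), "of", int(true_edges.sum()),
      "true directed edges;", int((found & ~true_edges).sum()), "false positives")
```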

  17. Functional neuroanatomy of intuitive physical inference

    PubMed Central

    Mikhael, John G.; Tenenbaum, Joshua B.; Kanwisher, Nancy

    2016-01-01

    To engage with the world—to understand the scene in front of us, plan actions, and predict what will happen next—we must have an intuitive grasp of the world’s physical structure and dynamics. How do the objects in front of us rest on and support each other, how much force would be required to move them, and how will they behave when they fall, roll, or collide? Despite the centrality of physical inferences in daily life, little is known about the brain mechanisms recruited to interpret the physical structure of a scene and predict how physical events will unfold. Here, in a series of fMRI experiments, we identified a set of cortical regions that are selectively engaged when people watch and predict the unfolding of physical events—a “physics engine” in the brain. These brain regions are selective to physical inferences relative to nonphysical but otherwise highly similar scenes and tasks. However, these regions are not exclusively engaged in physical inferences per se or, indeed, even in scene understanding; they overlap with the domain-general “multiple demand” system, especially the parts of that system involved in action planning and tool use, pointing to a close relationship between the cognitive and neural mechanisms involved in parsing the physical content of a scene and preparing an appropriate action. PMID:27503892

  18. Orthologous repeats and mammalian phylogenetic inference

    PubMed Central

    Bashir, Ali; Ye, Chun; Price, Alkes L.; Bafna, Vineet

    2005-01-01

    Determining phylogenetic relationships between species is a difficult problem, and many phylogenetic relationships remain unresolved, even among eutherian mammals. Repetitive elements provide excellent markers for phylogenetic analysis, because their mode of evolution is predominantly homoplasy-free and unidirectional. Historically, phylogenetic studies using repetitive elements have relied on biological methods such as PCR analysis, and computational inference is limited to a few isolated repeats. Here, we present a novel computational method for inferring phylogenetic relationships from partial sequence data using orthologous repeats. We apply our method to reconstructing the phylogeny of 28 mammals, using more than 1000 orthologous repeats obtained from sequence data available from the NISC Comparative Sequencing Program. The resulting phylogeny has robust bootstrap numbers, and broadly matches results from previous studies which were obtained using entirely different data and methods. In addition, we shed light on some of the debatable aspects of the phylogeny. With rapid expansion of available partial sequence data, computational analysis of repetitive elements holds great promise for the future of phylogenetic inference. PMID:15998912

  19. To link, to infer, to understand.

    PubMed

    Kock, H

    1989-01-01

    A model of linkage in text processing is proposed: an external proposition and an inference belonging to one frame are superimposed to constitute understanding. A text covering four academic subjects was presented orally to students, who recalled it in writing. After the recalls were transformed into propositions, they were entered into a nonmetric multidimensional scaling to yield a text space. The subjects' interest choices among items of the four aspects were scaled to render an interest space. Decomposing both as subspaces of a common space yields an angle as their overall similarity and indicates the degree of predictability from interests. Because the aggregate of inferences shows directedness correlated with volitional-motivational orientation, an inference base is assumed to intervene. It is concluded that recipients try to superimpose and thereby construct a primary stage of processing. This allows for a very general algorithm of parallel information processing (the holographic thesis), perhaps constructing the properties we are used to. Motivated perception, knowledge influence, schema-directedness, and contribution to coherence are ruled out as explanations of this process.

  20. Quality of Computationally Inferred Gene Ontology Annotations

    PubMed Central

    Škunca, Nives; Altenhoff, Adrian; Dessimoz, Christophe

    2012-01-01

    Gene Ontology (GO) has established itself as the undisputed standard for protein function annotation. Most annotations are inferred electronically, i.e. without individual curator supervision, but they are widely considered unreliable. At the same time, we crucially depend on those automated annotations, as most newly sequenced genomes are non-model organisms. Here, we introduce a methodology to systematically and quantitatively evaluate electronic annotations. By exploiting changes in successive releases of the UniProt Gene Ontology Annotation database, we assessed the quality of electronic annotations in terms of specificity, reliability, and coverage. Overall, we not only found that electronic annotations have significantly improved in recent years, but also that their reliability now rivals that of annotations inferred by curators when they use evidence other than experiments from primary literature. This work provides the means to identify the subset of electronic annotations that can be relied upon—an important outcome given that >98% of all annotations are inferred without direct curation. PMID:22693439

  1. Gene network inference via structural equation modeling in genetical genomics experiments.

    PubMed

    Liu, Bing; de la Fuente, Alberto; Hoeschele, Ina

    2008-03-01

    Our goal is gene network inference in genetical genomics or systems genetics experiments. For species where sequence information is available, we first perform expression quantitative trait locus (eQTL) mapping by jointly utilizing cis-, cis-trans-, and trans-regulation. After using local structural models to identify regulator-target pairs for each eQTL, we construct an encompassing directed network (EDN) by assembling all retained regulator-target relationships. The EDN has nodes corresponding to expressed genes and eQTL and directed edges from eQTL to cis-regulated target genes, from cis-regulated genes to cis-trans-regulated target genes, from trans-regulator genes to target genes, and from trans-eQTL to target genes. For network inference within the strongly constrained search space defined by the EDN, we propose structural equation modeling (SEM), because it can model cyclic networks and the EDN indeed contains feedback relationships. On the basis of a factorization of the likelihood and the constrained search space, our SEM algorithm infers networks involving several hundred genes and eQTL. Structure inference is based on a penalized likelihood ratio and an adaptation of Occam's window model selection. The SEM algorithm was evaluated using data simulated with nonlinear ordinary differential equations and known cyclic network topologies and was applied to a real yeast data set.

  2. Data fusion and classification using a hybrid intrinsic cellular inference network

    NASA Astrophysics Data System (ADS)

    Woodley, Robert; Walenz, Brett; Seiffertt, John; Robinette, Paul; Wunsch, Donald

    2010-04-01

    Hybrid Intrinsic Cellular Inference Network (HICIN) is designed for battlespace decision support applications. We developed an automatic method of generating hypotheses for an entity-attribute classifier. A domain-specific ontology was used to automatically generate categories for data classification. Heterogeneous data is clustered using an Adaptive Resonance Theory (ART) inference engine on a sample (unclassified) data set. The data set is the Lahman baseball database. The actual data are immaterial to the architecture; however, parallels in the data can be easily drawn (i.e., "Team" maps to organization, "Runs scored/allowed" to a measure of organization performance (positive/negative), "Payroll" to organization resources, etc.). Results show that HICIN classifiers create known inferences from the heterogeneous data. These inferences are not explicitly stated in the ontological description of the domain and are strictly data driven. HICIN uses data uncertainty handling to reduce errors in the classification. The uncertainty handling is based on subjective logic. The belief mass allows evidence from multiple sources to be mathematically combined to increase or discount an assertion. In military operations, the ability to reduce uncertainty will be vital in the data fusion operation.

  3. Transitive inference in two lemur species (Eulemur macaco and Eulemur fulvus).

    PubMed

    Tromp, D; Meunier, H; Roeder, J J

    2015-03-01

    When confronted with tasks involving reasoning instead of simple learning through trial and error, lemurs appeared to be less competent than simians. Our study aims to investigate lemurs' capability for transitive inference, a form of deductive reasoning in which the subject deduces logical conclusions from preliminary information. Transitive inference may have an adaptive function, especially in species living in large, complex social groups, and is proposed to play a major role in rank estimation and establishment of dominance hierarchies. We proposed to test the capacities of reasoning using transitive inference in two species of lemurs, the brown lemur (Eulemur fulvus) and the black lemur (Eulemur macaco), both living in multimale-multifemale societies. For that purpose, we designed an original setup providing, for the first time in this kind of cognitive task, pictures of conspecifics' faces as stimuli. Subjects were trained to differentiate six photographs of unknown conspecifics named randomly from A to F to establish the order A > B > C > D > E > F and to consistently select the highest-ranking photograph in five adjacent pairs AB, BC, CD, DE, and EF. Then lemurs were presented with the same adjacent pairs and three new and non-adjacent pairs BD, BE, CE. The results showed that all subjects correctly selected the highest-ranking photograph in every non-adjacent pair, reflecting lemurs' capacity for transitive inference. Our results are discussed in the context of the still debated current theories about the mechanisms underlying this specific capacity.

  4. The role of gender-related information and self-endorsement of traits in preadolescents' inferences and judgments.

    PubMed

    Lobel, T E; Bempechat, J; Gewirtz, J C; Shoken-Topaz, T; Bashe, E

    1993-08-01

    The major purpose of this study was to examine the effects of a target child's gender typicality on different aspects of preadolescents' inferences and judgments. The secondary purpose of the study was to investigate the relation between children's self-endorsement of traits and their inferences and judgments. Fifth and sixth graders were shown a video film, portraying a child playing either a gender-appropriate game with members of the same sex or a gender-inappropriate game with members of the other sex. In addition, subjects completed an adapted version of the BSRI and were categorized into sex-typed, androgynous, and undifferentiated subjects. Subjects made a number of different types of judgments and inferences about the target, including inferences about traits, popularity, choice of gift and name, and willingness to engage in activities with the target. All types of inferences and judgments were affected by the variations in the targets' gender-related behaviors, whereas self-endorsement of traits was not related to the inferences and judgments. The results suggest that the gender typicality of the target behavior is salient to preadolescents, regardless of their sex-role orientation.

  5. Development of L2 Word-Meaning Inference while Reading

    ERIC Educational Resources Information Center

    Hamada, Megumi

    2009-01-01

    Ability to infer the meaning of unknown words encountered while reading plays an important role in learners' L2 word-knowledge development. Despite numerous findings reported on word-meaning inference, how learners develop this ability is still unclear. In order to provide a developmental inquiry into L2 word-meaning inference while reading, this…

  6. Towards a neural implementation of causal inference in cue combination.

    PubMed

    Ma, Wei Ji; Rahmati, Masih

    2013-01-01

    Causal inference in sensory cue combination is the process of determining whether multiple sensory cues have the same cause or different causes. Psychophysical evidence indicates that humans closely follow the predictions of a Bayesian causal inference model. Here, we explore how Bayesian causal inference could be implemented using probabilistic population coding and plausible neural operations, but conclude that the resulting architecture is unrealistic.

  7. Statistical Inference at Work: Statistical Process Control as an Example

    ERIC Educational Resources Information Center

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  8. Reasoning about Informal Statistical Inference: One Statistician's View

    ERIC Educational Resources Information Center

    Rossman, Allan J.

    2008-01-01

    This paper identifies key concepts and issues associated with the reasoning of informal statistical inference. I focus on key ideas of inference that I think all students should learn, including at secondary level as well as tertiary. I argue that a fundamental component of inference is to go beyond the data at hand, and I propose that statistical…

  9. Nonholonomic mobile system control by combining EEG-based BCI with ANFIS.

    PubMed

    Yu, Weiwei; Feng, Huashan; Feng, Yangyang; Madani, Kurosh; Sabourin, Christophe

    2015-01-01

    Motor imagery EEG-based BCI has advantages in the assistance of human control of peripheral devices, such as a mobile robot or wheelchair, because the subject is not exposed to any stimulation and suffers no risk of fatigue. However, the intensive training necessary to recognize the numerous classes of data makes it hard to control these nonholonomic mobile systems accurately and effectively. This paper proposes a new approach which combines motor imagery EEG with the Adaptive Neural Fuzzy Inference System. This approach fuses the intelligence of humans based on motor imagery EEG with the precise capabilities of a mobile system based on ANFIS. This approach realizes a multi-level control, which makes the nonholonomic mobile system highly controllable without stopping or relying on sensor information. Also, because the ANFIS controller can be trained while performing the control task, control accuracy and efficiency are increased for the user. Experimental results of the nonholonomic mobile robot verify the effectiveness of this approach.

  10. Probabilistic adaptation in changing microbial environments

    PubMed Central

    Springer, Michael

    2016-01-01

    Microbes growing in animal host environments face fluctuations that have elements of both randomness and predictability. In the mammalian gut, fluctuations in nutrient levels and other physiological parameters are structured by the host’s behavior, diet, health and microbiota composition. Microbial cells that can anticipate environmental fluctuations by exploiting this structure would likely gain a fitness advantage (by adapting their internal state in advance). We propose that the problem of adaptive growth in structured changing environments, such as the gut, can be viewed as probabilistic inference. We analyze environments that are “meta-changing”: where there are changes in the way the environment fluctuates, governed by a mechanism unobservable to cells. We develop a dynamic Bayesian model of these environments and show that a real-time inference algorithm (particle filtering) for this model can be used as a microbial growth strategy implementable in molecular circuits. The growth strategy suggested by our model outperforms heuristic strategies, and points to a class of algorithms that could support real-time probabilistic inference in natural or synthetic cellular circuits. PMID:27994963
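
    A generic stand-in for the real-time inference algorithm mentioned above is sketched below: a bootstrap particle filter that tracks a hidden, occasionally switching nutrient regime from noisy observations. All parameters (switch rate, regime means, noise level) are invented for illustration and are not taken from the paper's model.

```python
# Toy bootstrap particle filter: a cell-like agent infers a hidden environment
# state (which nutrient regime is active) from noisy observations.
import numpy as np

rng = np.random.default_rng(1)
T, N = 100, 500                 # time steps, particles
p_switch = 0.05                 # probability the hidden regime switches per step
means = {0: 0.2, 1: 0.8}        # mean observed nutrient level in each regime
sigma = 0.15                    # observation noise

# Simulate the "true" environment and the noisy observations.
true_state = np.zeros(T, dtype=int)
for t in range(1, T):
    true_state[t] = 1 - true_state[t-1] if rng.random() < p_switch else true_state[t-1]
obs = np.array([rng.normal(means[s], sigma) for s in true_state])

# Bootstrap particle filter over the binary regime.
particles = rng.integers(0, 2, size=N)
belief_regime1 = np.zeros(T)
for t in range(T):
    flip = rng.random(N) < p_switch                  # propagate: regime may switch
    particles = np.where(flip, 1 - particles, particles)
    mu = np.where(particles == 1, means[1], means[0])
    w = np.exp(-0.5 * ((obs[t] - mu) / sigma) ** 2)  # weight by observation likelihood
    w /= w.sum()
    particles = particles[rng.choice(N, size=N, p=w)]  # resample
    belief_regime1[t] = particles.mean()

accuracy = np.mean((belief_regime1 > 0.5) == (true_state == 1))
print(f"filter tracks the hidden regime with accuracy {accuracy:.2f}")
```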

  11. Craniofacial biomechanics and functional and dietary inferences in hominin paleontology.

    PubMed

    Grine, Frederick E; Judex, Stefan; Daegling, David J; Ozcivici, Engin; Ungar, Peter S; Teaford, Mark F; Sponheimer, Matt; Scott, Jessica; Scott, Robert S; Walker, Alan

    2010-04-01

    Finite element analysis (FEA) is a potentially powerful tool by which the mechanical behaviors of different skeletal and dental designs can be investigated, and, as such, has become increasingly popular for biomechanical modeling and inferring the behavior of extinct organisms. However, the use of FEA to extrapolate from characterization of the mechanical environment to questions of trophic or ecological adaptation in a fossil taxon is both challenging and perilous. Here, we consider the problems and prospects of FEA applications in paleoanthropology, and provide a critical examination of one such study of the trophic adaptations of Australopithecus africanus. This particular FEA is evaluated with regard to 1) the nature of the A. africanus cranial composite, 2) model validation, 3) decisions made with respect to model parameters, 4) adequacy of data presentation, and 5) interpretation of the results. Each suggests that the results reflect methodological decisions as much as any underlying biological significance. Notwithstanding these issues, this model yields predictions that follow from the posited emphasis on premolar use by A. africanus. These predictions are tested with data from the paleontological record, including a phylogenetically-informed consideration of relative premolar size, and postcanine microwear fabrics and antemortem enamel chipping. In each instance, the data fail to conform to predictions from the model. This model thus serves to emphasize the need for caution in the application of FEA in paleoanthropological enquiry. Theoretical models can be instrumental in the construction of testable hypotheses; but ultimately, the studies that serve to test these hypotheses - rather than data from the models - should remain the source of information pertaining to hominin paleobiology and evolution.

  12. Bayesian Estimation and Inference Using Stochastic Electronics

    PubMed Central

    Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M.; Hamilton, Tara J.; Tapson, Jonathan; van Schaik, André

    2016-01-01

    In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes on the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream. PMID:27047326
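
    The recursive Bayesian update that BEAST implements in stochastic hardware can be written down compactly in software. The sketch below runs an HMM forward filter for a target random-walking on a 1-D grid with a noisy position sensor; the grid size, motion model and noise level are invented for illustration.

```python
# Software sketch of the recursive Bayesian (HMM forward) update behind the tracker:
# a target random-walks on a 1-D grid, a noisy sensor reports its position,
# and the posterior over positions is updated each step.
import numpy as np

rng = np.random.default_rng(2)
K = 21                                        # grid positions 0..20
# Transition model: stay or move one cell left/right with equal probability.
Trans = np.zeros((K, K))
for i in range(K):
    for j in (i - 1, i, i + 1):
        if 0 <= j < K:
            Trans[i, j] = 1.0
Trans /= Trans.sum(axis=1, keepdims=True)

def obs_likelihood(z):
    """Likelihood of sensor reading z for every grid position (Gaussian noise)."""
    pos = np.arange(K)
    return np.exp(-0.5 * ((z - pos) / 1.5) ** 2)

true_pos, belief = K // 2, np.full(K, 1.0 / K)
for t in range(50):
    true_pos = int(np.clip(true_pos + rng.integers(-1, 2), 0, K - 1))   # target moves
    z = true_pos + rng.normal(0, 1.5)                                   # noisy reading
    belief = Trans.T @ belief                   # predict: p(x_t | z_1..t-1)
    belief *= obs_likelihood(z)                 # update with the new observation
    belief /= belief.sum()

print("true position:", true_pos, " MAP estimate:", int(np.argmax(belief)))
```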

  13. Nonparametric inference of network structure and dynamics

    NASA Astrophysics Data System (ADS)

    Peixoto, Tiago P.

    The network structure of complex systems determines their function and serves as evidence for the evolutionary mechanisms that lie behind them. Despite considerable effort in recent years, it remains an open challenge to formulate general descriptions of the large-scale structure of network systems, and how to reliably extract such information from data. Although many approaches have been proposed, few methods attempt to gauge the statistical significance of the uncovered structures, and hence the majority cannot reliably separate actual structure from stochastic fluctuations. Due to the sheer size and high-dimensionality of many networks, this represents a major limitation that prevents meaningful interpretations of the results obtained with such nonstatistical methods. In this talk, I will show how these issues can be tackled in a principled and efficient fashion by formulating appropriate generative models of network structure that can have their parameters inferred from data. By employing a Bayesian description of such models, the inference can be performed in a nonparametric fashion that does not require any a priori knowledge or ad hoc assumptions about the data. I will show how this approach can be used to perform model comparison, and how hierarchical models yield the most appropriate trade-off between model complexity and quality of fit based on the statistical evidence present in the data. I will also show how this general approach can be elegantly extended to networks with edge attributes, that are embedded in latent spaces, and that change in time. The latter is obtained via a fully dynamic generative network model, based on arbitrary-order Markov chains, that can also be inferred in a nonparametric fashion. Throughout the talk I will illustrate the application of the methods with many empirical networks such as the internet at the autonomous systems level, the global airport network, the network of actors and films, social networks, citations among

  14. Bayesian Estimation and Inference Using Stochastic Electronics.

    PubMed

    Thakur, Chetan Singh; Afshar, Saeed; Wang, Runchun M; Hamilton, Tara J; Tapson, Jonathan; van Schaik, André

    2016-01-01

    In this paper, we present the implementation of two types of Bayesian inference problems to demonstrate the potential of building probabilistic algorithms in hardware using a single set of building blocks with the ability to perform these computations in real time. The first implementation, referred to as the BEAST (Bayesian Estimation and Stochastic Tracker), demonstrates a simple problem where an observer uses an underlying Hidden Markov Model (HMM) to track a target in one dimension. In this implementation, sensors make noisy observations of the target position at discrete time steps. The tracker learns the transition model for target movement, and the observation model for the noisy sensors, and uses these to estimate the target position by solving the Bayesian recursive equation online. We show the tracking performance of the system and demonstrate how it can learn the observation model, the transition model, and the external distractor (noise) probability interfering with the observations. In the second implementation, referred to as the Bayesian INference in DAG (BIND), we show how inference can be performed in a Directed Acyclic Graph (DAG) using stochastic circuits. We show how these building blocks can be easily implemented using simple digital logic gates. An advantage of the stochastic electronic implementation is that it is robust to certain types of noise, which may become an issue in integrated circuit (IC) technology with feature sizes on the order of tens of nanometers due to their low noise margin, the effect of high-energy cosmic rays and the low supply voltage. In our framework, the flipping of random individual bits would not affect the system performance because information is encoded in a bit stream.

  15. Dynamical Inference in the Milky Way

    NASA Astrophysics Data System (ADS)

    Bovy, Jo

    Current and future surveys of the Galaxy contain a wealth of information about the structure and evolution of the Galactic disk and halo. Teasing out this information is complicated by measurement uncertainties, missing data, and sparse sampling. I develop and describe several applications of generative modeling--creating an approximate description of the probability of the data given the physical parameters of the system--to deal with these issues. I develop a method for inferring the Galactic potential from individual observations of stellar kinematics such as will be furnished by the upcoming Gaia space astrometry mission. This method takes uncertainties in our knowledge of the distribution function of stellar tracers into account through marginalization. I demonstrate the method by inferring the force law in the Solar System from observations of the positions and velocities of the eight planets at a single epoch. I apply a similar method to derive the Milky Way's circular velocity from observations of maser kinematics. I infer the velocity distribution of nearby stars from Hipparcos data, which only consist of tangential velocities, by forward modeling the underlying distribution with a flexible multi-Gaussian model. I characterize the contribution of several "moving groups"---overdensities of co-moving stars---to the full distribution. By studying the properties of stars in these moving groups, I show that they do not form a single-burst population and that they are most likely due to transient non-axisymmetric features of the disk, such as transient spiral structure. By forward modeling one such scenario, I show how the Hercules moving group can be traced around the Galaxy by future surveys, which would confirm that the Milky Way bar's outer Lindblad resonance lies near the Solar radius.

  16. Prediction of Earth rotation parameters by fuzzy inference systems

    NASA Astrophysics Data System (ADS)

    Akyilmaz, O.; Kutterer, H.

    2004-09-01

    The short-term prediction of Earth rotation parameters (ERP) (length-of-day and polar motion) is studied up to 10 days by means of ANFIS (adaptive network based fuzzy inference system). The prediction is then extended to 40 days into the future by using the formerly predicted values as input data. The ERP C04 time series with daily values from the International Earth Rotation Service (IERS) serve as the data base. Well-known effects in the ERP series, such as the impact of the tides of the solid Earth and the oceans or seasonal variations of the atmosphere, were removed a priori from the C04 series. The residual series were used for both training and validation of the network. Different network architectures are discussed and compared in order to optimize the network solution. The results of the prediction are analyzed and compared with those of other methods. Short-term ERP values predicted by ANFIS show root-mean-square errors which are equal to or even lower than those from the other considered methods. The presented method is easy to use.
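
    For readers unfamiliar with ANFIS, the sketch below shows the forward pass of a tiny zero-order Sugeno system that maps the two most recent residual LOD values to a one-day-ahead prediction. The membership-function parameters and rule consequents are invented; an actual ANFIS would learn them from the C04 residuals by hybrid (least-squares plus gradient) training.

```python
# Minimal forward pass of an ANFIS-style (zero-order Sugeno) predictor mapping the
# two most recent residual LOD values to a one-day-ahead prediction.
# Membership centres, widths and consequents are invented, not trained values.
import numpy as np
from itertools import product

def gauss(x, c, s):
    """Gaussian membership value of x for a fuzzy set centred at c with width s."""
    return np.exp(-0.5 * ((x - c) / s) ** 2)

# Two inputs (LOD residual at day t-1 and t), each with two fuzzy sets: "low", "high".
centres = [(-0.2, 0.2), (-0.2, 0.2)]       # (low, high) centres per input, in ms
width = 0.2
# One constant consequent per rule (zero-order Sugeno), 2 x 2 = 4 rules.
consequents = {(0, 0): -0.25, (0, 1): 0.0, (1, 0): 0.0, (1, 1): 0.25}

def anfis_predict(x1, x2):
    firing, weighted = [], []
    for i, j in product(range(2), repeat=2):              # layers 1-2: rule firing strengths
        w = gauss(x1, centres[0][i], width) * gauss(x2, centres[1][j], width)
        firing.append(w)
        weighted.append(w * consequents[(i, j)])          # layer 4: rule outputs
    return sum(weighted) / sum(firing)                    # layers 3 and 5: normalise and sum

print("predicted next-day LOD residual (ms):", round(anfis_predict(0.15, 0.18), 3))
```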

  17. Infants use relative numerical group size to infer social dominance

    PubMed Central

    Pun, Anthea; Birch, Susan A. J.; Baron, Andrew Scott

    2016-01-01

    Detecting dominance relationships, within and across species, provides a clear fitness advantage because this ability helps individuals assess their potential risk of injury before engaging in a competition. Previous research has demonstrated that 10- to 13-mo-old infants can represent the dominance relationship between two agents in terms of their physical size (larger agent = more dominant), whereas younger infants fail to do so. It is unclear whether infants younger than 10 mo fail to represent dominance relationships in general, or whether they lack sensitivity to physical size as a cue to dominance. Two studies explored whether infants, like many species across the animal kingdom, use numerical group size to assess dominance relationships and whether this capacity emerges before their sensitivity to physical size. A third study ruled out an alternative explanation for our findings. Across these studies, we report that infants 6–12 mo of age use numerical group size to infer dominance relationships. Specifically, preverbal infants expect an agent from a numerically larger group to win in a right-of-way competition against an agent from a numerically smaller group. In addition, this is, to our knowledge, the first study to demonstrate that infants 6–9 mo of age are capable of understanding social dominance relations. These results demonstrate that infants’ understanding of social dominance relations may be based on evolutionarily relevant cues and reveal infants’ early sensitivity to an important adaptive function of social groups. PMID:26884199

  18. Annual Rainfall Forecasting by Using Mamdani Fuzzy Inference System

    NASA Astrophysics Data System (ADS)

    Fallah-Ghalhary, G.-A.; Habibi Nokhandan, M.; Mousavi Baygi, M.

    2009-04-01

    Long-term rainfall prediction is very important to countries with agro-based economies. In general, climate and rainfall are highly non-linear natural phenomena, giving rise to what is known as the "butterfly effect". The parameters required to predict rainfall are enormous, even for a short period. Soft computing is an innovative approach to constructing computationally intelligent systems that are supposed to possess humanlike expertise within a specific domain, adapt themselves and learn to do better in changing environments, and explain how they make decisions. Unlike conventional artificial intelligence techniques, the guiding principle of soft computing is to exploit tolerance for imprecision, uncertainty, and partial truth to achieve tractability, robustness, and better rapport with reality. In this paper, 33 years of rainfall data from Khorasan province, the northeastern part of Iran, situated at latitude-longitude pairs (31°-38°N, 74°-80°E), were analyzed. This research attempted to train Fuzzy Inference System (FIS) based prediction models with these 33 years of rainfall data. For performance evaluation, the model's predicted outputs were compared with the actual rainfall data. Simulation results reveal that soft computing techniques are promising and efficient. The test results of the FIS model showed an RMSE of 52 millimeters.
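
    A minimal Mamdani inference step, of the kind the paper's FIS performs, is sketched below with one invented input variable (mean annual humidity) mapped to an annual-rainfall estimate via triangular memberships, min implication, max aggregation and centroid defuzzification. The variables, membership functions and rules are illustrative only, not the paper's rule base.

```python
# Minimal Mamdani fuzzy inference sketch: one invented input (mean annual humidity, %)
# mapped to an annual-rainfall estimate (mm). Illustrative rule base only.
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with feet at a, c and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

rain = np.linspace(0, 600, 601)           # output universe: annual rainfall in mm
out_sets = {"dry": (0, 100, 250), "normal": (150, 300, 450), "wet": (350, 500, 600)}
in_sets = {"low": (0, 20, 45), "medium": (35, 55, 75), "high": (65, 85, 100)}
rules = {"low": "dry", "medium": "normal", "high": "wet"}   # IF humidity IS x THEN rain IS y

def mamdani(humidity):
    aggregated = np.zeros_like(rain)
    for hum_set, rain_set in rules.items():
        strength = tri(humidity, *in_sets[hum_set])                     # rule firing strength
        clipped = np.minimum(strength, tri(rain, *out_sets[rain_set]))  # min implication
        aggregated = np.maximum(aggregated, clipped)                    # max aggregation
    if aggregated.sum() == 0:
        return float("nan")
    return float((rain * aggregated).sum() / aggregated.sum())          # centroid defuzzification

print("predicted annual rainfall (mm) for 60% humidity:", round(mamdani(60.0), 1))
```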

  19. The boundaries of language and thought in deductive inference.

    PubMed

    Monti, Martin M; Parsons, Lawrence M; Osherson, Daniel N

    2009-07-28

    Is human thought fully embedded in language, or do some forms of thought operate independently? To directly address this issue, we focus on inference-making, a central feature of human cognition. In a 3T fMRI study we compare logical inferences relying on sentential connectives (e.g., not, or, if ... then) to linguistic inferences based on syntactic transformation of sentences involving ditransitive verbs (e.g., give, say, take). When contrasted with matched grammaticality judgments, logic inference alone recruited "core" regions of deduction [Brodmann area (BA) 10p and 8m], whereas linguistic inference alone recruited perisylvian regions of linguistic competence, among others (BA 21, 22, 37, 39, 44, and 45 and caudate). In addition, the two inferences commonly recruited a set of general "support" areas in frontoparietal cortex (BA 6, 7, 8, 40, and 47). The results indicate that logical inference is not embedded in natural language and confirm the relative modularity of linguistic processes.

  20. Using alien coins to test whether simple inference is Bayesian.

    PubMed

    Cassey, Peter; Hawkins, Guy E; Donkin, Chris; Brown, Scott D

    2016-03-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we asked people for prior and posterior inferences about the probability that 1 of 2 coins would generate certain outcomes. Most participants' inferences were inconsistent with Bayes' rule. Only in the simplest version of the task did the majority of participants adhere to Bayes' rule, but even in that case, there was a significant proportion that failed to do so. The current results highlight the importance of close quantitative comparisons between Bayesian inference and human data at the individual-subject level when evaluating models of cognition.
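
    The Bayes-rule benchmark against which participants were compared can be worked through in a few lines. The sketch below computes the posterior probability that one of two coins, with invented biases and an invented run of flips, produced the observed outcome.

```python
# Worked Bayes-rule example: posterior probability that coin A (rather than coin B)
# produced the observed flips. Coin biases and data are invented for illustration.
from math import comb

p_heads = {"A": 0.5, "B": 0.8}      # hypothetical biases of the two "alien" coins
prior = {"A": 0.5, "B": 0.5}        # uniform prior before seeing any flips
heads, flips = 7, 10                # observed outcome

likelihood = {c: comb(flips, heads) * p**heads * (1 - p)**(flips - heads)
              for c, p in p_heads.items()}
evidence = sum(prior[c] * likelihood[c] for c in prior)
posterior = {c: prior[c] * likelihood[c] / evidence for c in prior}
print({c: round(v, 3) for c, v in posterior.items()})
```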

  1. Inferring cellular networks using probabilistic graphical models.

    PubMed

    Friedman, Nir

    2004-02-06

    High-throughput genome-wide molecular assays, which probe cellular networks from different perspectives, have become central to molecular biology. Probabilistic graphical models are useful for extracting meaningful biological insights from the resulting data sets. These models provide a concise representation of complex cellular networks by composing simpler submodels. Procedures based on well-understood principles for inferring such models from data facilitate a model-based methodology for analysis and discovery. This methodology and its capabilities are illustrated by several recent applications to gene expression data.

  2. Inferring Boolean network states from partial information

    PubMed Central

    2013-01-01

    Networks of molecular interactions regulate key processes in living cells. Therefore, understanding their functionality is a high priority in advancing biological knowledge. Boolean networks are often used to describe cellular networks mathematically and are fitted to experimental datasets. The fitting often results in ambiguities since the interpretation of the measurements is not straightforward and since the data contain noise. In order to facilitate a more reliable mapping between datasets and Boolean networks, we develop an algorithm that infers network trajectories from a dataset distorted by noise. We analyze our algorithm theoretically and demonstrate its accuracy using simulation and microarray expression data. PMID:24006954

  3. Inferring Evolutionary Scenarios for Protein Domain Compositions

    NASA Astrophysics Data System (ADS)

    Wiedenhoeft, John; Krause, Roland; Eulenstein, Oliver

    Essential cellular processes are controlled by functional interactions of protein domains, which can be inferred from their evolutionary histories. Methods to reconstruct these histories are challenged by the complexity of reconstructing macroevolutionary events. In this work we model these events using a novel network-like structure that represents the evolution of domain combinations, called plexus. We describe an algorithm to find a plexus that represents the evolution of a given collection of domain histories as phylogenetic trees with the minimum number of macroevolutionary events, and demonstrate its effectiveness in practice.

  4. Bayesian Inference in Satellite Gravity Inversion

    NASA Technical Reports Server (NTRS)

    Kis, K. I.; Taylor, Patrick T.; Wittmann, G.; Kim, Hyung Rae; Torony, B.; Mayer-Guerr, T.

    2005-01-01

    To solve a geophysical inverse problem means applying measurements to determine the parameters of the selected model. The inverse problem is formulated as Bayesian inference, with Gaussian probability density functions applied in Bayes' equation. The CHAMP satellite gravity data are determined at an altitude of 400 kilometers over the southern part of the Pannonian Basin. The interpretation model is a right vertical cylinder, whose parameters are obtained from the minimization problem solved by the simplex method.

  5. Data free inference with processed data products

    SciTech Connect

    Chowdhary, K.; Najm, H. N.

    2014-07-12

    Here, we consider the context of probabilistic inference of model parameters given error bars or confidence intervals on model output values, when the data is unavailable. We introduce a class of algorithms in a Bayesian framework, relying on maximum entropy arguments and approximate Bayesian computation methods, to generate consistent data with the given summary statistics. Once we obtain consistent data sets, we pool the respective posteriors, to arrive at a single, averaged density on the parameters. This approach allows us to perform accurate forward uncertainty propagation consistent with the reported statistics.

  6. Identifying inference attacks against healthcare data repositories

    PubMed Central

    Vaidya, Jaideep; Shafiq, Basit; Jiang, Xiaoqian; Ohno-Machado, Lucila

    Health care data repositories play an important role in driving progress in medical research. Finding new pathways to discovery requires having adequate data and relevant analysis. However, it is critical to ensure the privacy and security of the stored data. In this paper, we identify a dangerous inference attack against naive suppression based approaches that are used to protect sensitive information. We base our attack on the querying system provided by the Healthcare Cost and Utilization Project, though it applies in general to any medical database providing a query capability. We also discuss potential solutions to this problem. PMID:24303279

  7. Inverse Ising Inference Using All the Data

    NASA Astrophysics Data System (ADS)

    Aurell, Erik; Ekeberg, Magnus

    2012-03-01

    We show that a method based on logistic regression, using all the data, solves the inverse Ising problem far better than mean-field calculations relying only on sample pairwise correlation functions, while still computationally feasible for hundreds of nodes. The largest improvement in reconstruction occurs for strong interactions. Using two examples, a diluted Sherrington-Kirkpatrick model and a two-dimensional lattice, we also show that interaction topologies can be recovered from few samples with good accuracy and that the use of l1 regularization is beneficial in this process, pushing inference abilities further into low-temperature regimes.
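
    The node-wise logistic-regression (pseudolikelihood) idea can be sketched as follows on a toy diluted Ising model sampled by Gibbs sweeps; the couplings, sample counts and regularization strength are invented, and the code is only meant to show the shape of the method rather than reproduce the paper's experiments.

```python
# Sketch of node-wise logistic regression (pseudolikelihood) inverse Ising inference
# with l1 regularization, on a toy diluted model sampled by Gibbs sweeps.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 12
J = np.zeros((n, n))
for _ in range(15):                              # sparse symmetric couplings
    i, j = rng.choice(n, size=2, replace=False)
    J[i, j] = J[j, i] = rng.choice([-0.5, 0.5])

def gibbs_samples(J, n_samples, n_sweeps=50):
    """Draw spin configurations by single-site Gibbs updates (toy sampler)."""
    s = rng.choice([-1, 1], size=n)
    out = []
    for _ in range(n_samples):
        for _ in range(n_sweeps):
            for i in range(n):
                h = J[i] @ s                     # local field on spin i
                p_up = 1.0 / (1.0 + np.exp(-2.0 * h))
                s[i] = 1 if rng.random() < p_up else -1
        out.append(s.copy())
    return np.array(out)

S = gibbs_samples(J, 400)
J_hat = np.zeros((n, n))
for i in range(n):                               # regress spin i on all other spins
    X = np.delete(S, i, axis=1)
    clf = LogisticRegression(penalty="l1", C=2.0, solver="liblinear")
    clf.fit(X, S[:, i])
    J_hat[i, np.arange(n) != i] = 0.5 * clf.coef_[0]   # logistic weights = 2 * couplings

err = np.abs((J_hat + J_hat.T) / 2 - J).max()
print("max absolute coupling error:", round(float(err), 2))
```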

  8. AIMS: Asteroseismic Inference on a Massive Scale

    NASA Astrophysics Data System (ADS)

    Reese, Daniel R.

    2016-11-01

    AIMS (Asteroseismic Inference on a Massive Scale) estimates stellar parameters and credible intervals/error bars in a Bayesian manner from a set of seismic frequency data and so-called classic constraints. To achieve reliable parameter estimates and computational efficiency, it searches through a grid of pre-computed models using an MCMC algorithm; interpolation within the grid of models is performed by first tessellating the grid using a Delaunay triangulation and then doing a linear barycentric interpolation on matching simplexes. Inputs for the modeling consist of individual frequencies from peak-bagging, which can be complemented with classic spectroscopic constraints.
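
    The interpolation step described above (Delaunay tessellation followed by linear barycentric interpolation within each simplex) is exactly what SciPy's LinearNDInterpolator performs. The sketch below applies it to a random toy "stellar grid" with an arbitrary synthetic relation, not to a real AIMS model grid.

```python
# Sketch of the grid-interpolation step: tessellate a model grid with a Delaunay
# triangulation and interpolate linearly (barycentrically) inside each simplex.
# The "stellar grid" here is random toy data, not a real AIMS grid.
import numpy as np
from scipy.spatial import Delaunay
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(4)
# Toy grid: models parameterized by (mass, metallicity), predicting a mean
# large frequency separation delta_nu (arbitrary synthetic relation).
grid_points = rng.uniform([0.8, -0.5], [1.6, 0.4], size=(300, 2))
delta_nu = 135.0 * grid_points[:, 0] ** -0.5 + 10.0 * grid_points[:, 1]

tess = Delaunay(grid_points)                       # tessellate the grid once
interp = LinearNDInterpolator(tess, delta_nu)      # linear barycentric within each simplex

query = np.array([[1.1, 0.0], [1.45, -0.2]])       # parameter values requested by an MCMC step
print("interpolated delta_nu:", np.round(interp(query), 2))
```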

  9. Fitness Inference from Short-Read Data: Within-Host Evolution of a Reassortant H5N1 Influenza Virus

    PubMed Central

    Illingworth, Christopher J.R.

    2015-01-01

    We present a method to infer the role of selection acting during the within-host evolution of the influenza virus from short-read genome sequence data. Linkage disequilibrium between loci is accounted for by treating short-read sequences as noisy multilocus emissions from an underlying model of haplotype evolution. A hierarchical model-selection procedure is used to infer the underlying fitness landscape of the virus insofar as that landscape is explored by the viral population. In a first application of our method, we analyze data from an evolutionary experiment describing the growth of a reassortant H5N1 virus in ferrets. Across two sets of replica experiments we infer multiple alleles to be under selection, including variants associated with receptor binding specificity, glycosylation, and with the increased transmissibility of the virus. We identify epistasis as an important component of the within-host fitness landscape, and show that adaptation can proceed through multiple genetic pathways. PMID:26243288

  10. Statistical inference of the generation probability of T-cell receptors from sequence repertoires.

    PubMed

    Murugan, Anand; Mora, Thierry; Walczak, Aleksandra M; Callan, Curtis G

    2012-10-02

    Stochastic rearrangement of germline V-, D-, and J-genes to create variable coding sequence for certain cell surface receptors is at the origin of immune system diversity. This process, known as "VDJ recombination", is implemented via a series of stochastic molecular events involving gene choices and random nucleotide insertions between, and deletions from, genes. We use large sequence repertoires of the variable CDR3 region of human CD4+ T-cell receptor beta chains to infer the statistical properties of these basic biochemical events. Because any given CDR3 sequence can be produced in multiple ways, the probability distribution of hidden recombination events cannot be inferred directly from the observed sequences; we therefore develop a maximum likelihood inference method to achieve this end. To separate the properties of the molecular rearrangement mechanism from the effects of selection, we focus on nonproductive CDR3 sequences in T-cell DNA. We infer the joint distribution of the various generative events that occur when a new T-cell receptor gene is created. We find a rich picture of correlation (and absence thereof), providing insight into the molecular mechanisms involved. The generative event statistics are consistent between individuals, suggesting a universal biochemical process. Our probabilistic model predicts the generation probability of any specific CDR3 sequence by the primitive recombination process, allowing us to quantify the potential diversity of the T-cell repertoire and to understand why some sequences are shared between individuals. We argue that the use of formal statistical inference methods, of the kind presented in this paper, will be essential for quantitative understanding of the generation and evolution of diversity in the adaptive immune system.

  11. Phylogenetic Inference From Conserved Sites Alignments

    SciTech Connect

    grundy, W.N.; Naylor, G.J.P.

    1999-08-15

    Molecular sequences provide a rich source of data for inferring the phylogenetic relationships among species. However, recent work indicates that even an accurate multiple alignment of a large sequence set may yield an incorrect phylogeny and that the quality of the phylogenetic tree improves when the input consists only of the highly conserved, motif regions of the alignment. This work introduces two methods of producing multiple alignments that include only the conserved regions of the initial alignment. The first method retains conserved motifs, whereas the second retains individual conserved sites in the initial alignment. Using parsimony analysis on a mitochondrial data set containing 19 species among which the phylogenetic relationships are widely accepted, both conserved alignment methods produce better phylogenetic trees than the complete alignment. Unlike any of the 19 inference methods used before to analyze this data, both methods produce trees that are completely consistent with the known phylogeny. The motif-based method employs far fewer alignment sites for comparable error rates. For a larger data set containing mitochondrial sequences from 39 species, the site-based method produces a phylogenetic tree that is largely consistent with known phylogenetic relationships and suggests several novel placements.

  12. Models for inference in dynamic metacommunity systems

    USGS Publications Warehouse

    Dorazio, R.M.; Kery, M.; Royle, J. Andrew; Plattner, M.

    2010-01-01

    A variety of processes are thought to be involved in the formation and dynamics of species assemblages. For example, various metacommunity theories are based on differences in the relative contributions of dispersal of species among local communities and interactions of species within local communities. Interestingly, metacommunity theories continue to be advanced without much empirical validation. Part of the problem is that statistical models used to analyze typical survey data either fail to specify ecological processes with sufficient complexity or they fail to account for errors in detection of species during sampling. In this paper, we describe a statistical modeling framework for the analysis of metacommunity dynamics that is based on the idea of adopting a unified approach, multispecies occupancy modeling, for computing inferences about individual species, local communities of species, or the entire metacommunity of species. This approach accounts for errors in detection of species during sampling and also allows different metacommunity paradigms to be specified in terms of species- and location-specific probabilities of occurrence, extinction, and colonization: all of which are estimable. In addition, this approach can be used to address inference problems that arise in conservation ecology, such as predicting temporal and spatial changes in biodiversity for use in making conservation decisions. To illustrate, we estimate changes in species composition associated with the species-specific phenologies of flight patterns of butterflies in Switzerland for the purpose of estimating regional differences in biodiversity. © 2010 by the Ecological Society of America.

  13. Models for inference in dynamic metacommunity systems

    USGS Publications Warehouse

    Dorazio, Robert M.; Kery, Marc; Royle, J. Andrew; Plattner, Matthias

    2010-01-01

    A variety of processes are thought to be involved in the formation and dynamics of species assemblages. For example, various metacommunity theories are based on differences in the relative contributions of dispersal of species among local communities and interactions of species within local communities. Interestingly, metacommunity theories continue to be advanced without much empirical validation. Part of the problem is that statistical models used to analyze typical survey data either fail to specify ecological processes with sufficient complexity or they fail to account for errors in detection of species during sampling. In this paper, we describe a statistical modeling framework for the analysis of metacommunity dynamics that is based on the idea of adopting a unified approach, multispecies occupancy modeling, for computing inferences about individual species, local communities of species, or the entire metacommunity of species. This approach accounts for errors in detection of species during sampling and also allows different metacommunity paradigms to be specified in terms of species- and location-specific probabilities of occurrence, extinction, and colonization: all of which are estimable. In addition, this approach can be used to address inference problems that arise in conservation ecology, such as predicting temporal and spatial changes in biodiversity for use in making conservation decisions. To illustrate, we estimate changes in species composition associated with the species-specific phenologies of flight patterns of butterflies in Switzerland for the purpose of estimating regional differences in biodiversity.

  14. Inferring Seizure Frequency From Brief EEG Recordings

    PubMed Central

    Westover, M. Brandon; Bianchi, Matt T.; Shafi, Mouhsin; Hoch, Daniel B.; Cole, Andrew J.; Chiappa, Keith; Cash, Sydney S.

    2012-01-01

    Routine EEGs remain a cornerstone test in caring for people with epilepsy. Although rare, a self-limited seizure (clinical or electrographic only) may be observed during such brief EEGs. The implications of observing a seizure in this situation, especially with respect to inferring the underlying seizure frequency, are unclear. The issue is complicated by the inaccuracy of patient-reported estimations of seizure frequency. The treating clinician is often left to wonder whether the single seizure indicates very frequent seizures, or if it is of lesser significance. We applied standard concepts of probabilistic inference to a simple model of seizure incidence to provide some guidance for clinicians facing this situation. Our analysis establishes upper and lower bounds on the seizure rate implied by observing a single seizure during routine EEG. Not surprisingly, with additional information regarding the expected seizure rate, these bounds can be further constrained. This framework should aid the clinician in applying a more principled approach toward decision making in the setting of a single seizure on a routine EEG. PMID:23545768
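
    One standard way to make the abstract's upper and lower bounds concrete (though not necessarily the authors' exact model) is an exact Poisson confidence interval for the rate after observing a single event in a short recording, as sketched below with an assumed 30-minute EEG.

```python
# Exact Poisson (chi-square / gamma) confidence interval for the underlying seizure
# rate after observing exactly one seizure in a brief EEG. Recording length assumed.
from scipy.stats import chi2

k = 1              # seizures observed during the recording
T_hours = 0.5      # duration of the routine EEG, in hours (assumed)
alpha = 0.05       # for a 95% interval

lower = chi2.ppf(alpha / 2, 2 * k) / 2 / T_hours
upper = chi2.ppf(1 - alpha / 2, 2 * (k + 1)) / 2 / T_hours
print(f"95% CI for the rate: {lower:.2f} to {upper:.2f} seizures per hour")
print(f"i.e. roughly {lower*24:.1f} to {upper*24:.0f} seizures per day")
```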

  15. Causal inference, probability theory, and graphical insights.

    PubMed

    Baker, Stuart G

    2013-11-10

    Causal inference from observational studies is a fundamental topic in biostatistics. The causal graph literature typically views probability theory as insufficient to express causal concepts in observational studies. In contrast, the view here is that probability theory is a desirable and sufficient basis for many topics in causal inference for the following two reasons. First, probability theory is generally more flexible than causal graphs: Besides explaining such causal graph topics as M-bias (adjusting for a collider) and bias amplification and attenuation (when adjusting for instrumental variable), probability theory is also the foundation of the paired availability design for historical controls, which does not fit into a causal graph framework. Second, probability theory is the basis for insightful graphical displays including the BK-Plot for understanding Simpson's paradox with a binary confounder, the BK2-Plot for understanding bias amplification and attenuation in the presence of an unobserved binary confounder, and the PAD-Plot for understanding the principal stratification component of the paired availability design.

  16. An Ada inference engine for expert systems

    NASA Technical Reports Server (NTRS)

    Lavallee, David B.

    1986-01-01

    The purpose is to investigate the feasibility of using Ada for rule-based expert systems with real-time performance requirements. This includes exploring the Ada features which give improved performance to expert systems as well as optimizing the tradeoffs or workarounds that the use of Ada may require. A prototype inference engine was built using Ada, and rule firing rates in excess of 500 per second were demonstrated on a single MC68000 processor. The knowledge base uses a directed acyclic graph to represent production rules. The graph allows the use of AND, OR, and NOT logical operators. The inference engine uses a combination of both forward and backward chaining in order to reach goals as quickly as possible. Future efforts will include additional investigation of multiprocessing to improve performance and creating a user interface allowing rule input in an Ada-like syntax. Investigation of multitasking and alternate knowledge base representations will help to analyze some of the performance issues as they relate to larger problems.
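
    The Ada source is not reproduced in this record, but the forward-chaining idea described above can be sketched in a few lines of Python (the rule names and facts are hypothetical):

```python
# Each rule: (name, premise, conclusion). Premises combine facts with AND/OR/NOT,
# expressed as nested tuples so the rule base forms a directed acyclic graph.
rules = [
    ("r1", ("AND", "pressure_high", ("NOT", "valve_open")), "risk_overpressure"),
    ("r2", ("OR", "risk_overpressure", "temp_high"), "sound_alarm"),
]

def holds(expr, facts):
    """Evaluate an AND/OR/NOT premise against the current fact set."""
    if isinstance(expr, str):
        return expr in facts
    op, *args = expr
    if op == "AND":
        return all(holds(a, facts) for a in args)
    if op == "OR":
        return any(holds(a, facts) for a in args)
    if op == "NOT":
        return not holds(args[0], facts)
    raise ValueError(op)

def forward_chain(facts):
    """Fire rules until no new conclusions can be added (naive fixed point)."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for _, premise, conclusion in rules:
            if conclusion not in facts and holds(premise, facts):
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"pressure_high"}))   # derives risk_overpressure, then sound_alarm
```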

  17. Inferring social ties from geographic coincidences

    PubMed Central

    Crandall, David J.; Backstrom, Lars; Cosley, Dan; Suri, Siddharth; Huttenlocher, Daniel; Kleinberg, Jon

    2010-01-01

    We investigate the extent to which social ties between people can be inferred from co-occurrence in time and space: Given that two people have been in approximately the same geographic locale at approximately the same time, on multiple occasions, how likely are they to know each other? Furthermore, how does this likelihood depend on the spatial and temporal proximity of the co-occurrences? Such issues arise in data originating in both online and offline domains as well as settings that capture interfaces between online and offline behavior. Here we develop a framework for quantifying the answers to such questions, and we apply this framework to publicly available data from a social media site, finding that even a very small number of co-occurrences can result in a high empirical likelihood of a social tie. We then present probabilistic models showing how such large probabilities can arise from a natural model of proximity and co-occurrence in the presence of social ties. In addition to providing a method for establishing some of the first quantifiable estimates of these measures, our findings have potential privacy implications, particularly for the ways in which social structures can be inferred from public online records that capture individuals’ physical locations over time. PMID:21148099

  18. Inferring causal structure: a quantum advantage

    NASA Astrophysics Data System (ADS)

    Ried, Katja; Spekkens, Robert

    2014-03-01

    The problem of inferring causal relations from observed correlations is central to science, and extensive study has yielded both important conceptual insights and widely used practical applications. Yet some of the simplest questions are impossible to answer classically: for instance, if one observes correlations between two variables (such as taking a new medical treatment and the subject's recovery), does this show a direct causal influence, or is it due to some hidden common cause? We develop a framework for quantum causal inference, and show how quantum theory provides a unique advantage in this decision problem. The key insight is that certain quantum correlations can only arise from specific causal structures, whereas pairs of classical variables can exhibit any pattern of correlation regardless of whether they have a common cause or a direct-cause relation. For example, suppose one measures the same Pauli observable on two qubits. If they share a common cause, such as being prepared in an entangled state, then one never finds perfect (positive) correlations in every basis, whereas perfect anticorrelations are possible (if one prepares the singlet state). Conversely, if a channel connects the qubits, hence a direct causal influence, perfect anticorrelations are impossible.

  19. Brain Connectivity Inference under Network Spatial Subsampling

    NASA Astrophysics Data System (ADS)

    da Rocha Amaral, Selene; Vieira, Gilson; Baccala, Luiz

    2013-03-01

    Neurophysiological time series analysis using functional Magnetic Resonance Imaging (fMRI) data can be seen as a tool to investigate how complex networks of neuronal populations interact, naturally leading to brain connectivity description issues where it is desirable to process as many simultaneous structures as possible to avoid misleading interaction inferences. Here we systematically use simulations to gauge how connectivity inference is affected when only subsets of network structures are considered through exploratory tools like Partial Directed Coherence (PDC) and confirmatory methods like Dynamic Causal Modeling (DCM). PDC is based on Granger causality and uses autoregressive models to expose the direction of information flow, whereas DCM was proposed to characterize neural fMRI connectivity using prior knowledge of possible connectivity structures. SPM software was used to simulate the full network fMRI data, which was subjected to realistic noise levels prior to analysis of network structure subsets. This work has been financially supported by FAPESP/CINAPCE 2011/0150-4

  20. Natural frequencies facilitate diagnostic inferences of managers

    PubMed Central

    Hoffrage, Ulrich; Hafenbrädl, Sebastian; Bouquet, Cyril

    2015-01-01

    In Bayesian inference tasks, information about base rates as well as hit rate and false-alarm rate needs to be integrated according to Bayes’ rule after the result of a diagnostic test became known. Numerous studies have found that presenting information in a Bayesian inference task in terms of natural frequencies leads to better performance compared to variants with information presented in terms of probabilities or percentages. Natural frequencies are the tallies in a natural sample in which hit rate and false-alarm rate are not normalized with respect to base rates. The present research replicates the beneficial effect of natural frequencies with four tasks from the domain of management, and with management students as well as experienced executives as participants. The percentage of Bayesian responses was almost twice as high when information was presented in natural frequencies compared to a presentation in terms of percentages. In contrast to most tasks previously studied, the majority of numerical responses were lower than the Bayesian solutions. Having heard of Bayes’ rule prior to the study did not affect Bayesian performance. An implication of our work is that textbooks explaining Bayes’ rule should teach how to represent information in terms of natural frequencies instead of how to plug probabilities or percentages into a formula. PMID:26157397
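
    As a worked illustration of the natural-frequency representation (the base rate, hit rate, and false-alarm rate below are hypothetical, not the study's management tasks): out of 1,000 firms, 50 are fraudulent; an audit flags 40 of those 50 and also flags 95 of the 950 honest firms, so the probability of fraud given a flag is simply 40 / (40 + 95).

```python
# Hypothetical base rate, hit rate, and false-alarm rate
base_rate, hit_rate, false_alarm_rate = 0.05, 0.80, 0.10
population = 1000

# Natural-frequency representation: raw tallies in a sample of 1,000 firms
fraudulent = base_rate * population                              # 50
flagged_fraudulent = hit_rate * fraudulent                       # 40
flagged_honest = false_alarm_rate * (population - fraudulent)    # 95

p_fraud_given_flag = flagged_fraudulent / (flagged_fraudulent + flagged_honest)

# Equivalent Bayes' rule computation with probabilities
p_bayes = (hit_rate * base_rate) / (
    hit_rate * base_rate + false_alarm_rate * (1 - base_rate)
)

print(round(p_fraud_given_flag, 3), round(p_bayes, 3))   # both ~0.296
```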

  1. Spatial Inference for Distributed Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Braverman, A. J.; Katzfuss, M.; Nguyen, H.

    2014-12-01

    Remote sensing data are inherently spatial, and a substantial portion of their value for scientific analyses derives from the information they can provide about spatially dependent processes. Geophysical variables such as atmospheric temperature, cloud properties, humidity, aerosols and carbon dioxide all exhibit spatial patterns, and satellite observations can help us learn about the physical mechanisms driving them. However, remote sensing observations are often noisy and incomplete, so inferring properties of true geophysical fields from them requires some care. These data can also be massive, which is both a blessing and a curse: using more data drives uncertainties down, but also drives costs up, particularly when data are stored on different computers or in different physical locations. In this talk I will discuss a methodology for spatial inference on massive, distributed data sets that does not require moving large volumes of data. The idea is based on a combination of ideas including modeling spatial covariance structures with low-rank covariance matrices, and distributed estimation in sensor or wireless networks.
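
    One reason a low-rank spatial covariance avoids moving large data volumes is that each node only needs to ship small basis-function summaries. A hedged one-dimensional sketch (the basis, rank, and noise level are all assumptions, not the talk's actual method):

```python
import numpy as np

rng = np.random.default_rng(5)

# Low-rank spatial model: y = S(x) @ eta + noise, with r basis functions
n, r, sigma2 = 200, 10, 0.05
x = np.sort(rng.uniform(0, 1, n))
centers = np.linspace(0, 1, r)
S = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / 0.1) ** 2)   # Gaussian basis

K = np.eye(r)                              # prior covariance of the basis weights
eta = rng.multivariate_normal(np.zeros(r), K)
y = S @ eta + rng.normal(scale=np.sqrt(sigma2), size=n)

# Prediction only needs the small r x r and r x 1 summaries S.T @ S and S.T @ y,
# which each data node could compute locally and transmit.
A = np.linalg.inv(np.linalg.inv(K) + S.T @ S / sigma2)   # posterior covariance
eta_hat = A @ (S.T @ y) / sigma2                          # posterior mean

x_new = np.linspace(0, 1, 50)
S_new = np.exp(-0.5 * ((x_new[:, None] - centers[None, :]) / 0.1) ** 2)
y_pred = S_new @ eta_hat
print(y_pred[:5])
```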

  2. Information Theory, Inference and Learning Algorithms

    NASA Astrophysics Data System (ADS)

    Mackay, David J. C.

    2003-10-01

    Information theory and inference, often taught separately, are here united in one entertaining textbook. These topics lie at the heart of many exciting areas of contemporary science and engineering - communication, signal processing, data mining, machine learning, pattern recognition, computational neuroscience, bioinformatics, and cryptography. This textbook introduces theory in tandem with applications. Information theory is taught alongside practical communication systems, such as arithmetic coding for data compression and sparse-graph codes for error-correction. A toolbox of inference techniques, including message-passing algorithms, Monte Carlo methods, and variational approximations, is developed alongside applications of these tools to clustering, convolutional codes, independent component analysis, and neural networks. The final part of the book describes the state of the art in error-correcting codes, including low-density parity-check codes, turbo codes, and digital fountain codes -- the twenty-first century standards for satellite communications, disk drives, and data broadcast. Richly illustrated, filled with worked examples and over 400 exercises, some with detailed solutions, David MacKay's groundbreaking book is ideal for self-learning and for undergraduate or graduate courses. Interludes on crosswords, evolution, and sex provide entertainment along the way. In sum, this is a textbook on information, communication, and coding for a new generation of students, and an unparalleled entry point into these subjects for professionals in areas as diverse as computational biology, financial engineering, and machine learning.

  3. Inferring epigenetic dynamics from kin correlations

    PubMed Central

    Hormoz, Sahand; Desprat, Nicolas; Shraiman, Boris I.

    2015-01-01

    Populations of isogenic embryonic stem cells or clonal bacteria often exhibit extensive phenotypic heterogeneity that arises from intrinsic stochastic dynamics of cells. The phenotypic state of a cell can be transmitted epigenetically in cell division, leading to correlations in the states of cells related by descent. The extent of these correlations is determined by the rates of transitions between the phenotypic states. Therefore, a snapshot of the phenotypes of a collection of cells with known genealogical structure contains information on phenotypic dynamics. Here, we use a model of phenotypic dynamics on a genealogical tree to define an inference method that allows extraction of an approximate probabilistic description of the dynamics from observed phenotype correlations as a function of the degree of kinship. The approach is tested and validated on the example of Pyoverdine dynamics in Pseudomonas aeruginosa colonies. Interestingly, we find that correlations among pairs and triples of distant relatives have a simple but nontrivial structure indicating that observed phenotypic dynamics on the genealogical tree is approximately conformal—a symmetry characteristic of critical behavior in physical systems. The proposed inference method is sufficiently general to be applied in any system where lineage information is available. PMID:25902540

  4. Hierarchical Bayesian inference in the visual cortex

    NASA Astrophysics Data System (ADS)

    Lee, Tai Sing; Mumford, David

    2003-07-01

    Traditional views of visual processing suggest that early visual neurons in areas V1 and V2 are static spatiotemporal filters that extract local features from a visual scene. The extracted information is then channeled through a feedforward chain of modules in successively higher visual areas for further analysis. Recent electrophysiological recordings from early visual neurons in awake behaving monkeys reveal that there are many levels of complexity in the information processing of the early visual cortex, as seen in the long-latency responses of its neurons. These new findings suggest that activity in the early visual cortex is tightly coupled and highly interactive with the rest of the visual system. They lead us to propose a new theoretical setting based on the mathematical framework of hierarchical Bayesian inference for reasoning about the visual system. In this framework, the recurrent feedforward/feedback loops in the cortex serve to integrate top-down contextual priors and bottom-up observations so as to implement concurrent probabilistic inference along the visual hierarchy. We suggest that the algorithms of particle filtering and Bayesian-belief propagation might model these interactive cortical computations. We review some recent neurophysiological evidence that supports the plausibility of these ideas.

  5. Inferring seizure frequency from brief EEG recordings.

    PubMed

    Westover, M Brandon; Bianchi, Matt T; Shafi, Mouhsin; Hoch, Daniel B; Cole, Andrew J; Chiappa, Keith; Cash, Sydney S

    2013-04-01

    Routine EEGs remain a cornerstone test in caring for people with epilepsy. Although rare, a self-limited seizure (clinical or electrographic only) may be observed during such brief EEGs. The implications of observing a seizure in this situation, especially with respect to inferring the underlying seizure frequency, are unclear. The issue is complicated by the inaccuracy of patient-reported estimations of seizure frequency. The treating clinician is often left to wonder whether the single seizure indicates very frequent seizures, or if it is of lesser significance. We applied standard concepts of probabilistic inference to a simple model of seizure incidence to provide some guidance for clinicians facing this situation. Our analysis establishes upper and lower bounds on the seizure rate implied by observing a single seizure during routine EEG. Not surprisingly, with additional information regarding the expected seizure rate, these bounds can be further constrained. This framework should aid the clinician in applying a more principled approach toward decision making in the setting of a single seizure on a routine EEG.

  6. Cooperative inference: Features, objects, and collections.

    PubMed

    Searcy, Sophia Ray; Shafto, Patrick

    2016-10-01

    Cooperation plays a central role in theories of development, learning, cultural evolution, and education. We argue that existing models of learning from cooperative informants have fundamental limitations that prevent them from explaining how cooperation benefits learning. First, existing models are shown to be computationally intractable, suggesting that they cannot apply to realistic learning problems. Second, existing models assume a priori agreement about which concepts are favored in learning, which leads to a conundrum: Learning fails without precise agreement on bias yet there is no single rational choice. We introduce cooperative inference, a novel framework for cooperation in concept learning, which resolves these limitations. Cooperative inference generalizes the notion of cooperation used in previous models from omission of labeled objects to the omission values of features, labels for objects, and labels for collections of objects. The result is an approach that is computationally tractable, does not require a priori agreement about biases, applies to both Boolean and first-order concepts, and begins to approximate the richness of real-world concept learning problems. We conclude by discussing relations to and implications for existing theories of cognition, cognitive development, and cultural evolution.

  7. Representing Documents via Latent Keyphrase Inference

    PubMed Central

    Liu, Jialu; Ren, Xiang; Shang, Jingbo; Cassidy, Taylor; Voss, Clare R.; Han, Jiawei

    2017-01-01

    Many text mining approaches adopt bag-of-words or n-grams models to represent documents. Looking beyond just the words, i.e., the explicit surface forms, in a document can improve a computer’s understanding of text. Being aware of this, researchers have proposed concept-based models that rely on a human-curated knowledge base to incorporate other related concepts in the document representation. But these methods are not desirable when applied to vertical domains (e.g., literature, enterprise, etc.) due to low coverage of in-domain concepts in the general knowledge base and interference from out-of-domain concepts. In this paper, we propose a data-driven model named Latent Keyphrase Inference (LAKI) that represents documents with a vector of closely related domain keyphrases instead of single words or existing concepts in the knowledge base. We show that given a corpus of in-domain documents, topical content units can be learned for each domain keyphrase, which enables a computer to do smart inference to discover latent document keyphrases, going beyond just explicit mentions. Compared with the state-of-the-art document representation approaches, LAKI fills the gap between bag-of-words and concept-based models by using domain keyphrases as the basic representation unit. It removes dependency on a knowledge base while providing, with keyphrases, readily interpretable representations. When evaluated against 8 other methods on two text mining tasks over two corpora, LAKI outperformed all. PMID:28229132

  8. Adaptive Image Denoising by Mixture Adaptation

    NASA Astrophysics Data System (ADS)

    Luo, Enming; Chan, Stanley H.; Nguyen, Truong Q.

    2016-10-01

    We propose an adaptive learning procedure to learn patch-based image priors for image denoising. The new algorithm, called the Expectation-Maximization (EM) adaptation, takes a generic prior learned from a generic external database and adapts it to the noisy image to generate a specific prior. Different from existing methods that combine internal and external statistics in ad-hoc ways, the proposed algorithm is rigorously derived from a Bayesian hyper-prior perspective. There are two contributions of this paper: First, we provide full derivation of the EM adaptation algorithm and demonstrate methods to improve the computational complexity. Second, in the absence of the latent clean image, we show how EM adaptation can be modified based on pre-filtering. Experimental results show that the proposed adaptation algorithm yields consistently better denoising results than the one without adaptation and is superior to several state-of-the-art algorithms.

  9. Adaptive Image Denoising by Mixture Adaptation.

    PubMed

    Luo, Enming; Chan, Stanley H; Nguyen, Truong Q

    2016-10-01

    We propose an adaptive learning procedure to learn patch-based image priors for image denoising. The new algorithm, called the expectation-maximization (EM) adaptation, takes a generic prior learned from a generic external database and adapts it to the noisy image to generate a specific prior. Different from existing methods that combine internal and external statistics in ad hoc ways, the proposed algorithm is rigorously derived from a Bayesian hyper-prior perspective. There are two contributions of this paper. First, we provide full derivation of the EM adaptation algorithm and demonstrate methods to improve the computational complexity. Second, in the absence of the latent clean image, we show how EM adaptation can be modified based on pre-filtering. The experimental results show that the proposed adaptation algorithm yields consistently better denoising results than the one without adaptation and is superior to several state-of-the-art algorithms.

  10. Practical aspects of gene regulatory inference via conditional inference forests from expression data.

    PubMed

    Bessonov, Kyrylo; Van Steen, Kristel

    2016-12-01

    Gene regulatory network (GRN) inference is an active area of research that facilitates understanding the complex interplays between biological molecules. We propose a novel framework to create such GRNs, based on Conditional Inference Forests (CIFs) as proposed by Strobl et al. Our framework consists of using ensembles of Conditional Inference Trees (CITs) and selecting an appropriate aggregation scheme for variable selection prior to network construction. We show on synthetic microarray data that taking the original implementation of CIFs with conditional permutation scheme (CIFcond) may lead to improved performance compared to Breiman's implementation of Random Forests (RF). Among all newly introduced CIF-based methods and five network scenarios obtained from the DREAM4 challenge, CIFcond performed best. Networks derived from well-tuned CIFs, obtained by simply averaging P-values over tree ensembles (CIFmean), are particularly attractive, because they combine adequate performance with computational efficiency. Moreover, thresholds for variable selection are based on significance levels for P-values and, hence, do not need to be tuned. From a practical point of view, our extensive simulations show the potential advantages of CIFmean-based methods. Although more work is needed to improve on speed, especially when fully exploiting the advantages of CITs in the context of heterogeneous and correlated data, we have shown that CIF methodology can be flexibly inserted in a framework to infer biological interactions. Notably, we confirmed biologically relevant interaction between IL2RA and FOXP1, linked to the IL-2 signaling pathway and to type 1 diabetes.
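
    A hedged sketch of the general tree-ensemble recipe for expression-based network inference (regress each gene on all others and aggregate feature importances into directed edge scores); it uses scikit-learn's standard Random Forests rather than the conditional inference forests advocated here, and the expression data are synthetic:

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(6)

# Synthetic expression matrix: 100 samples x 5 genes; gene 4 is driven by genes 0 and 1
X = rng.normal(size=(100, 5))
X[:, 4] = 0.8 * X[:, 0] - 0.6 * X[:, 1] + 0.1 * rng.normal(size=100)

n_genes = X.shape[1]
scores = np.zeros((n_genes, n_genes))          # scores[i, j]: importance of gene i for gene j
for target in range(n_genes):
    predictors = [g for g in range(n_genes) if g != target]
    rf = RandomForestRegressor(n_estimators=200, random_state=0)
    rf.fit(X[:, predictors], X[:, target])
    scores[predictors, target] = rf.feature_importances_

# Rank candidate regulatory edges by importance
edges = sorted(((scores[i, j], i, j) for i in range(n_genes)
                for j in range(n_genes) if i != j), reverse=True)
print(edges[:3])   # the top edges should include (0 -> 4) and (1 -> 4)
```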

  11. Inferring Neuronal Dynamics from Calcium Imaging Data Using Biophysical Models and Bayesian Inference.

    PubMed

    Rahmati, Vahid; Kirmse, Knut; Marković, Dimitrije; Holthoff, Knut; Kiebel, Stefan J

    2016-02-01

    Calcium imaging has been used as a promising technique to monitor the dynamic activity of neuronal populations. However, the calcium trace is temporally smeared which restricts the extraction of quantities of interest such as spike trains of individual neurons. To address this issue, spike reconstruction algorithms have been introduced. One limitation of such reconstructions is that the underlying models are not informed about the biophysics of spike and burst generations. Such existing prior knowledge might be useful for constraining the possible solutions of spikes. Here we describe, in a novel Bayesian approach, how principled knowledge about neuronal dynamics can be employed to infer biophysical variables and parameters from fluorescence traces. By using both synthetic and in vitro recorded fluorescence traces, we demonstrate that the new approach is able to reconstruct different repetitive spiking and/or bursting patterns with accurate single spike resolution. Furthermore, we show that the high inference precision of the new approach is preserved even if the fluorescence trace is rather noisy or if the fluorescence transients show slow rise kinetics lasting several hundred milliseconds, and inhomogeneous rise and decay times. In addition, we discuss the use of the new approach for inferring parameter changes, e.g. due to a pharmacological intervention, as well as for inferring complex characteristics of immature neuronal circuits.

  12. Inferring Neuronal Dynamics from Calcium Imaging Data Using Biophysical Models and Bayesian Inference

    PubMed Central

    Rahmati, Vahid; Kirmse, Knut; Marković, Dimitrije; Holthoff, Knut; Kiebel, Stefan J.

    2016-01-01

    Calcium imaging has been used as a promising technique to monitor the dynamic activity of neuronal populations. However, the calcium trace is temporally smeared which restricts the extraction of quantities of interest such as spike trains of individual neurons. To address this issue, spike reconstruction algorithms have been introduced. One limitation of such reconstructions is that the underlying models are not informed about the biophysics of spike and burst generations. Such existing prior knowledge might be useful for constraining the possible solutions of spikes. Here we describe, in a novel Bayesian approach, how principled knowledge about neuronal dynamics can be employed to infer biophysical variables and parameters from fluorescence traces. By using both synthetic and in vitro recorded fluorescence traces, we demonstrate that the new approach is able to reconstruct different repetitive spiking and/or bursting patterns with accurate single spike resolution. Furthermore, we show that the high inference precision of the new approach is preserved even if the fluorescence trace is rather noisy or if the fluorescence transients show slow rise kinetics lasting several hundred milliseconds, and inhomogeneous rise and decay times. In addition, we discuss the use of the new approach for inferring parameter changes, e.g. due to a pharmacological intervention, as well as for inferring complex characteristics of immature neuronal circuits. PMID:26894748

  13. Automated Interpretation of LIBS Spectra using a Fuzzy Logic Inference Engine

    SciTech Connect

    Jeremy J. Hatch; Timothy R. McJunkin; Cynthia Hanson; Jill R. Scott

    2012-02-01

    Automated interpretation of laser-induced breakdown spectroscopy (LIBS) data is necessary due to the plethora of spectra that can be acquired in a relatively short time. However, traditional chemometric and artificial neural network methods that have been employed are not always transparent to a skilled user. A fuzzy logic approach to data interpretation has now been adapted to LIBS spectral interpretation. A fuzzy logic inference engine (FLIE) was used to differentiate between various copper containing and stainless steel alloys as well as unknowns. Results using FLIE indicate a high degree of confidence in spectral assignment.
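
    The internals of FLIE are not described in this record, so the following is only a generic Mamdani-style fuzzy inference step (hypothetical membership functions over normalized emission-line intensities), meant to show the kind of rule evaluation such an engine performs rather than the actual FLIE implementation:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

# Hypothetical normalized intensities of two emission lines in a spectrum
cu_line, fe_line = 0.78, 0.22

# Fuzzy sets "low" and "high" for each line (hypothetical breakpoints)
cu_high = tri(cu_line, 0.4, 1.0, 1.6)
fe_high = tri(fe_line, 0.4, 1.0, 1.6)
cu_low  = tri(cu_line, -0.6, 0.0, 0.6)
fe_low  = tri(fe_line, -0.6, 0.0, 0.6)

# Rules (min realizes AND); the consequents are class confidences
confidence = {
    "copper alloy":    min(cu_high, fe_low),
    "stainless steel": min(fe_high, cu_low),
}
label = max(confidence, key=confidence.get)
print(label, {k: round(v, 2) for k, v in confidence.items()})
```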

  14. Measure of librarian pressure using fuzzy inference system: A case study in Longyan University

    NASA Astrophysics Data System (ADS)

    Huang, Jian-Jing

    2014-10-01

    College librarians who serve as middle managers often experience considerable work-related pressure. How they cope with psychological strain, regulate their emotions, and maintain good relationships in the workplace is an important issue, particularly in the mainland China environment. How to estimate librarians' work pressure and how to improve the quality of service in college libraries are further serious questions. In this article, the authors discuss how fuzzy inference can be used to assess librarian work pressure.

  15. Children's inference generation: The role of vocabulary and working memory.

    PubMed

    Currie, Nicola Kate; Cain, Kate

    2015-09-01

    Inferences are crucial to successful discourse comprehension. We assessed the contributions of vocabulary and working memory to inference making in children aged 5 and 6 years (n=44), 7 and 8 years (n=43), and 9 and 10 years (n=43). Children listened to short narratives and answered questions to assess local and global coherence inferences after each one. Analysis of variance (ANOVA) confirmed developmental improvements on both types of inference. Although standardized measures of both vocabulary and working memory were correlated with inference making, multiple regression analyses determined that vocabulary was the key predictor. For local coherence inferences, only vocabulary predicted unique variance for the 6- and 8-year-olds; in contrast, none of the variables predicted performance for the 10-year-olds. For global coherence inferences, vocabulary was the only unique predictor for each age group. Mediation analysis confirmed that although working memory was associated with the ability to generate local and global coherence inferences in 6- to 10-year-olds, the effect was mediated by vocabulary. We conclude that vocabulary knowledge supports inference making in two ways: through knowledge of word meanings required to generate inferences and through its contribution to memory processes.

  16. Human brain lesion-deficit inference remapped.

    PubMed

    Mah, Yee-Haur; Husain, Masud; Rees, Geraint; Nachev, Parashkev

    2014-09-01

    Our knowledge of the anatomical organization of the human brain in health and disease draws heavily on the study of patients with focal brain lesions. Historically the first method of mapping brain function, it is still potentially the most powerful, establishing the necessity of any putative neural substrate for a given function or deficit. Great inferential power, however, carries a crucial vulnerability: without stronger alternatives any consistent error cannot be easily detected. A hitherto unexamined source of such error is the structure of the high-dimensional distribution of patterns of focal damage, especially in ischaemic injury-the commonest aetiology in lesion-deficit studies-where the anatomy is naturally shaped by the architecture of the vascular tree. This distribution is so complex that analysis of lesion data sets of conventional size cannot illuminate its structure, leaving us in the dark about the presence or absence of such error. To examine this crucial question we assembled the largest known set of focal brain lesions (n = 581), derived from unselected patients with acute ischaemic injury (mean age = 62.3 years, standard deviation = 17.8, male:female ratio = 0.547), visualized with diffusion-weighted magnetic resonance imaging, and processed with validated automated lesion segmentation routines. High-dimensional analysis of this data revealed a hidden bias within the multivariate patterns of damage that will consistently distort lesion-deficit maps, displacing inferred critical regions from their true locations, in a manner opaque to replication. Quantifying the size of this mislocalization demonstrates that past lesion-deficit relationships estimated with conventional inferential methodology are likely to be significantly displaced, by a magnitude dependent on the unknown underlying lesion-deficit relationship itself. Past studies therefore cannot be retrospectively corrected, except by new knowledge that would render them redundant

  17. The role of causal models in analogical inference.

    PubMed

    Lee, Hee Seung; Holyoak, Keith J

    2008-09-01

    Computational models of analogy have assumed that the strength of an inductive inference about the target is based directly on similarity of the analogs and in particular on shared higher order relations. In contrast, work in philosophy of science suggests that analogical inference is also guided by causal models of the source and target. In 3 experiments, the authors explored the possibility that people may use causal models to assess the strength of analogical inferences. Experiments 1-2 showed that reducing analogical overlap by eliminating a shared causal relation (a preventive cause present in the source) from the target increased inductive strength even though it decreased similarity of the analogs. These findings were extended in Experiment 3 to cross-domain analogical inferences based on correspondences between higher order causal relations. Analogical inference appears to be mediated by building and then running a causal model. The implications of the present findings for theories of both analogy and causal inference are discussed.

  18. Cancer evolution: mathematical models and computational inference.

    PubMed

    Beerenwinkel, Niko; Schwarz, Roland F; Gerstung, Moritz; Markowetz, Florian

    2015-01-01

    Cancer is a somatic evolutionary process characterized by the accumulation of mutations, which contribute to tumor growth, clinical progression, immune escape, and drug resistance development. Evolutionary theory can be used to analyze the dynamics of tumor cell populations and to make inference about the evolutionary history of a tumor from molecular data. We review recent approaches to modeling the evolution of cancer, including population dynamics models of tumor initiation and progression, phylogenetic methods to model the evolutionary relationship between tumor subclones, and probabilistic graphical models to describe dependencies among mutations. Evolutionary modeling helps to understand how tumors arise and will also play an increasingly important prognostic role in predicting disease progression and the outcome of medical interventions, such as targeted therapy.

  19. Bayesian Inference for Nonnegative Matrix Factorisation Models

    PubMed Central

    Cemgil, Ali Taylan

    2009-01-01

    We describe nonnegative matrix factorisation (NMF) with a Kullback-Leibler (KL) error measure in a statistical framework, with a hierarchical generative model consisting of an observation and a prior component. Omitting the prior leads to the standard KL-NMF algorithms as special cases, where maximum likelihood parameter estimation is carried out via the Expectation-Maximisation (EM) algorithm. Starting from this view, we develop full Bayesian inference via variational Bayes or Monte Carlo. Our construction retains conjugacy and enables us to develop more powerful models while retaining attractive features of standard NMF such as monotonic convergence and easy implementation. We illustrate our approach on model order selection and image reconstruction. PMID:19536273
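
    For reference, the maximum-likelihood special case that the paper starts from (KL-NMF fitted by the standard multiplicative, EM-like updates) can be sketched as follows; this is the classical algorithm, not the paper's variational Bayes or Monte Carlo machinery:

```python
import numpy as np

rng = np.random.default_rng(0)

def kl_nmf(X, rank, n_iter=200, eps=1e-9):
    """Multiplicative updates minimizing the generalized KL divergence D(X || WH)."""
    n, m = X.shape
    W = rng.random((n, rank)) + eps
    H = rng.random((rank, m)) + eps
    for _ in range(n_iter):
        WH = W @ H + eps
        W *= (X / WH) @ H.T / H.sum(axis=1)            # update W, rows of H normalize
        WH = W @ H + eps
        H *= W.T @ (X / WH) / W.sum(axis=0)[:, None]   # update H, columns of W normalize
    return W, H

# Synthetic nonnegative data with an underlying rank-2 structure
X = rng.poisson(rng.random((30, 2)) @ rng.random((2, 40)) * 5.0) + 0.0
W, H = kl_nmf(X, rank=2)
print("mean absolute reconstruction error:", np.abs(X - W @ H).mean())
```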

  20. SERIES - Satellite Emission Range Inferred Earth Surveying

    NASA Technical Reports Server (NTRS)

    Macdoran, P. F.; Spitzmesser, D. J.; Buennagel, L. A.

    1983-01-01

    The Satellite Emission Range Inferred Earth Surveying (SERIES) concept is based on the utilization of NAVSTAR Global Positioning System (GPS) radio transmissions without any satellite modifications and in a totally passive mode. The SERIES stations are equipped with lightweight 1.5 m diameter dish antennas mounted on trailers. A SERIES baseline measurement accuracy demonstration is considered, taking into account a 100 meter baseline estimation from approximately one hour of differential Doppler data. It is planned to conduct the next phase of experiments on a 150 m baseline. Attention is given to details regarding future baseline measurement accuracy demonstrations, aspects of ionospheric calibration in connection with subdecimeter baseline accuracy requirements of geodesy, and advantages related to the use of the differential Doppler or pseudoranging mode.

  1. Seasonal constraints on inferred planetary heat content

    NASA Astrophysics Data System (ADS)

    McKinnon, Karen A.; Huybers, Peter

    2016-10-01

    Planetary heating can be quantified using top of the atmosphere energy fluxes or through monitoring the heat content of the Earth system. It has been difficult, however, to compare the two methods with each other because of biases in satellite measurements and incomplete spatial coverage of ocean observations. Here we focus on the seasonal cycle, whose amplitude is large relative to satellite biases and observational errors. The seasonal budget can be closed through inferring contributions from high-latitude oceans and marginal seas using the covariance structure of National Center for Atmospheric Research (NCAR) Community Earth System Model (CESM1). In contrast, if these regions are approximated as the average across well-observed regions, the amplitude of the seasonal cycle is overestimated relative to satellite constraints. Analysis of the same CESM1 simulation indicates that complete measurement of the upper ocean would increase the magnitude and precision of interannual trend estimates in ocean heating more than fully measuring the deep ocean.

  2. Migration of objects and inferences across episodes.

    PubMed

    Hannigan, Sharon L; Reinitz, Mark Tippens

    2003-04-01

    Participants viewed episodes in the form of a series of photographs portraying ordinary routines (e.g., eating at a restaurant) and later received a recognition test. In Experiment 1, it was shown that objects (e.g., a vase of flowers, a pewter lantern) that appeared in a single episode during the study phase migrated between memories of episodes described by the same abstract schema (e.g., from Restaurant Episode A at study to Restaurant Episode B at test), and not between episodes anchored by different schemas. In Experiment 2, it was demonstrated that backward causal inferences from one study episode influenced memories of other episodes described by the same schema, and that high-schema-relevant items viewed in one episode were sometimes remembered as having occurred in another episode of the same schematic type.

  3. A Graphical Approach to Relatedness Inference

    PubMed Central

    Almudevar, Anthony

    2007-01-01

    The estimation of relatedness structure in natural populations using molecular marker data has become an important tool in population biology, resulting in a variety of estimation procedures for specific sampling scenarios. In this article a general approach is proposed, in which the detailed relationship structure, typically a pedigree graph or partition, is considered to be the object of inference. This makes available tools used in complex model selection theory which have demonstrated effectiveness. An important advantage of this approach is that it permits a fully Bayesian approach to the problem, providing a principled and accessible way to measure statistical error. The approach is demonstrated by applying the minimum description length principle. This technique is used in model selection to provide a rational way of comparing models of varying complexity. We show how the resulting score may be interpreted and applied as a Bayesian posterior density. PMID:17169391

  4. Cancer Evolution: Mathematical Models and Computational Inference

    PubMed Central

    Beerenwinkel, Niko; Schwarz, Roland F.; Gerstung, Moritz; Markowetz, Florian

    2015-01-01

    Cancer is a somatic evolutionary process characterized by the accumulation of mutations, which contribute to tumor growth, clinical progression, immune escape, and drug resistance development. Evolutionary theory can be used to analyze the dynamics of tumor cell populations and to make inference about the evolutionary history of a tumor from molecular data. We review recent approaches to modeling the evolution of cancer, including population dynamics models of tumor initiation and progression, phylogenetic methods to model the evolutionary relationship between tumor subclones, and probabilistic graphical models to describe dependencies among mutations. Evolutionary modeling helps to understand how tumors arise and will also play an increasingly important prognostic role in predicting disease progression and the outcome of medical interventions, such as targeted therapy. PMID:25293804

  5. Chloroplast Phylogenomic Inference of Green Algae Relationships.

    PubMed

    Sun, Linhua; Fang, Ling; Zhang, Zhenhua; Chang, Xin; Penny, David; Zhong, Bojian

    2016-02-05

    The green algal phylum Chlorophyta has six diverse classes, but the phylogenetic relationship of the classes within Chlorophyta remains uncertain. In order to better understand the ancient Chlorophyta evolution, we have applied a site pattern sorting method to study compositional heterogeneity and the model fit in the green algal chloroplast genomic data. We show that the fastest-evolving sites are significantly correlated with among-site compositional heterogeneity, and these sites have a much poorer fit to the evolutionary model. Our phylogenomic analyses suggest that the class Chlorophyceae is a monophyletic group, and the classes Ulvophyceae, Trebouxiophyceae and Prasinophyceae are non-monophyletic groups. Our proposed phylogenetic tree of Chlorophyta will offer new insights to investigate ancient green algae evolution, and our analytical framework will provide a useful approach for evaluating and mitigating the potential errors of phylogenomic inferences.

  6. Disabling conditional inferences: an EEG study.

    PubMed

    Bonnefond, Mathilde; Kaliuzhna, Mariia; Van der Henst, Jean-Baptiste; De Neys, Wim

    2014-04-01

    Although the Modus Ponens inference is one of the most basic logical rules, decades of conditional reasoning research show that it is often rejected when people consider stored background knowledge about potential disabling conditions. In the present study we used EEG to identify neural markers of this process. We presented participants with many and few disabler conditionals for which retrieval of disabling conditions was likely or unlikely. As in classic behavioral studies we observed that participants accepted the standard MP conclusion less for conditionals with many disablers. The key finding was that the presentation of the standard MP conclusion also resulted in a more pronounced N2 and less pronounced P3b for the many disabler conditionals. This specific N2/P3b pattern has been linked to the violation and satisfaction of expectations, respectively. Thereby, the present ERP findings support the idea that disabler retrieval lowers reasoners' expectations that the standard MP conclusion can be drawn.

  7. Inferring biological dynamics in heterogeneous cellular environments

    NASA Astrophysics Data System (ADS)

    Pressé, Steve

    In complex environments, it often appears that biomolecules such as proteins do not diffuse normally. That is, their mean square displacement does not scale linearly with time. This anomalous diffusion happens for multiple reasons: proteins can bind to structures and other proteins; fluorophores used to label proteins may flicker or blink making it appear that the labeled protein is diffusing anomalously; and proteins can diffuse in differently crowded environments. Here we describe methods for learning about such processes from imaging data collected inside the heterogeneous environment of the living cell. Refs.: "Inferring Diffusional Dynamics from FCS in Heterogeneous Nuclear Environments," Konstantinos Tsekouras, Amanda Siegel, Richard N. Day, Steve Pressé, Biophys. J. 109, 7 (2015); "A data-driven alternative to the fractional Fokker-Planck equation," Steve Pressé, J. Stat. Phys.: Th. and Expmt., P07009 (2015).
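
    A minimal sketch (synthetic trajectory, not the FCS analysis of the cited papers) of how an anomalous-diffusion exponent is usually read off: compute the time-averaged mean square displacement over lag times and fit its log-log slope.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic 2D trajectory: ordinary Brownian steps, so the fitted exponent should be ~1
steps = rng.normal(scale=0.1, size=(5000, 2))
traj = np.cumsum(steps, axis=0)

def msd(traj, max_lag):
    """Time-averaged mean square displacement for lags 1..max_lag."""
    return np.array([
        np.mean(np.sum((traj[lag:] - traj[:-lag]) ** 2, axis=1))
        for lag in range(1, max_lag + 1)
    ])

lags = np.arange(1, 101)
m = msd(traj, 100)

# MSD ~ K * t^alpha  =>  log MSD = alpha * log t + const
alpha, _ = np.polyfit(np.log(lags), np.log(m), 1)
print(f"fitted anomalous exponent alpha ~ {alpha:.2f}")   # ~1 for normal diffusion
```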

  8. Inferring character from faces: a developmental study.

    PubMed

    Cogsdill, Emily J; Todorov, Alexander T; Spelke, Elizabeth S; Banaji, Mahzarin R

    2014-05-01

    Human adults attribute character traits to faces readily and with high consensus. In two experiments investigating the development of face-to-trait inference, adults and children ages 3 through 10 attributed trustworthiness, dominance, and competence to pairs of faces. In Experiment 1, the attributions of 3- to 4-year-olds converged with those of adults, and 5- to 6-year-olds' attributions were at adult levels of consistency. Children ages 3 and above consistently attributed the basic mean/nice evaluation not only to faces varying in trustworthiness (Experiment 1) but also to faces varying in dominance and competence (Experiment 2). This research suggests that the predisposition to judge others using scant facial information appears in adultlike forms early in childhood and does not require prolonged social experience.

  9. Single-Trial Inference on Visual Attention

    NASA Astrophysics Data System (ADS)

    Dyrholm, Mads; Kyllingsbæk, Søren; Vangkilde, Signe; Habekost, Thomas; Bundesen, Claus

    2011-06-01

    In this paper we take a step towards single-trial behavioral modeling within a Theory of Visual Attention (TVA). In selective attention tasks, such as the Partial Report paradigm, the subject is asked to ignore distractors and only report stimuli that belong to the target class. Nothing about a distractor is observed directly in the subject's overt behavior, hence behavioral modeling of such trials involves out-marginalizing the variables that represent the distractors' influence on behavior. In this paper we derive equations for inferring a latent representation of the distractors on a Partial Report trial. This result retrodicts a latent attentional state of the subject using the observed response from that particular trial and thus differs from other predictions made with TVA which are based on expected values of observed variables. We show an example of the result in single-trial analysis of an occipital EEG component.

  10. Causal Inference in Multisensory Heading Estimation

    PubMed Central

    Katliar, Mikhail; Bülthoff, Heinrich H.

    2017-01-01

    A large body of research shows that the Central Nervous System (CNS) integrates multisensory information. However, this strategy should only apply to multisensory signals that have a common cause; independent signals should be segregated. Causal Inference (CI) models account for this notion. Surprisingly, previous findings suggested that visual and inertial cues on heading of self-motion are integrated regardless of discrepancy. We hypothesized that CI does occur, but that characteristics of the motion profiles affect multisensory processing. Participants estimated heading of visual-inertial motion stimuli with several different motion profiles and a range of intersensory discrepancies. The results support the hypothesis that judgments of signal causality are included in the heading estimation process. Moreover, the data suggest a decreasing tolerance for discrepancies and an increasing reliance on visual cues for longer duration motions. PMID:28060957

  11. Causal Network Inference Via Group Sparse Regularization

    PubMed Central

    Bolstad, Andrew; Van Veen, Barry D.; Nowak, Robert

    2011-01-01

    This paper addresses the problem of inferring sparse causal networks modeled by multivariate autoregressive (MAR) processes. Conditions are derived under which the Group Lasso (gLasso) procedure consistently estimates sparse network structure. The key condition involves a “false connection score” ψ. In particular, we show that consistent recovery is possible even when the number of observations of the network is far less than the number of parameters describing the network, provided that ψ < 1. The false connection score is also demonstrated to be a useful metric of recovery in nonasymptotic regimes. The conditions suggest a modified gLasso procedure which tends to improve the false connection score and reduce the chances of reversing the direction of causal influence. Computational experiments and a real network based electrocorticogram (ECoG) simulation study demonstrate the effectiveness of the approach. PMID:21918591

  12. Inferring Network Connectivity by Delayed Feedback Control

    PubMed Central

    Yu, Dongchuan; Parlitz, Ulrich

    2011-01-01

    We suggest a control-based approach to topology estimation of networks of coupled dynamical elements. This method first drives the network to steady states by a delayed feedback control; then performs structural perturbations to shift the steady states repeatedly; and finally infers the connection topology from the steady-state shifts, either by a matrix-inverse algorithm or by an l1-norm convex optimization strategy applicable to estimating the topology of sparse networks from perturbations. We discuss as well some aspects important for applications, such as the topology reconstruction quality and error sources, advantages and disadvantages of the suggested method, and the influence of (control) perturbations, inhomogeneity, sparsity, coupling functions, and measurement noise. Some examples of networks with Chua's oscillators are presented to illustrate the reliability of the suggested technique. PMID:21969856

  13. Nuclear Forensic Inferences Using Iterative Multidimensional Statistics

    SciTech Connect

    Robel, M; Kristo, M J; Heller, M A

    2009-06-09

    Nuclear forensics involves the analysis of interdicted nuclear material for specific material characteristics (referred to as 'signatures') that imply specific geographical locations, production processes, culprit intentions, etc. Predictive signatures rely on expert knowledge of physics, chemistry, and engineering to develop inferences from these material characteristics. Comparative signatures, on the other hand, rely on comparison of the material characteristics of the interdicted sample (the 'questioned sample' in FBI parlance) with those of a set of known samples. In the ideal case, the set of known samples would be a comprehensive nuclear forensics database, a database which does not currently exist. In fact, our ability to analyze interdicted samples and produce an extensive list of precise materials characteristics far exceeds our ability to interpret the results. Therefore, as we seek to develop the extensive databases necessary for nuclear forensics, we must also develop the methods necessary to produce the necessary inferences from comparison of our analytical results with these large, multidimensional sets of data. In the work reported here, we used a large, multidimensional dataset of results from quality control analyses of uranium ore concentrate (UOC, sometimes called 'yellowcake'). We have found that traditional multidimensional techniques, such as principal components analysis (PCA), are especially useful for understanding such datasets and drawing relevant conclusions. In particular, we have developed an iterative partial least squares-discriminant analysis (PLS-DA) procedure that has proven especially adept at identifying the production location of unknown UOC samples. By removing classes which fell far outside the initial decision boundary, and then rebuilding the PLS-DA model, we have consistently produced better and more definitive attributions than with a single pass classification approach. Performance of the iterative PLS-DA method

  14. Tracing retinal vessel trees by transductive inference

    PubMed Central

    2014-01-01

    Background Structural study of retinal blood vessels provides an early indication of diseases such as diabetic retinopathy, glaucoma, and hypertensive retinopathy. These studies require accurate tracing of retinal vessel tree structure from fundus images in an automated manner. However, the existing work encounters great difficulties when dealing with the crossover issue commonly seen in vessel networks. Results In this paper, we consider a novel graph-based approach to address this tracing with crossover problem: After initial steps of segmentation and skeleton extraction, its graph representation can be established, where each segment in the skeleton map becomes a node, and a direct contact between two adjacent segments is translated to an undirected edge of the two corresponding nodes. The segments in the skeleton map touching the optical disk area are considered as root nodes. This determines the number of trees to be found in the vessel network, which is always equal to the number of root nodes. Based on this undirected graph representation, the tracing problem is further connected to the well-studied transductive inference in machine learning, where the goal becomes that of properly propagating the tree labels from those known root nodes to the rest of the graph, such that the graph is partitioned into disjoint sub-graphs, or equivalently, each of the trees is traced and separated from the rest of the vessel network. This connection enables us to address the tracing problem by exploiting established development in transductive inference. Empirical experiments on publicly available fundus image datasets demonstrate the applicability of our approach. Conclusions We provide a novel and systematic approach to trace retinal vessel trees in the presence of crossovers by solving a transductive learning problem on induced undirected graphs. PMID:24438151
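
    On an undirected segment graph with labeled root nodes, the propagation step that this work casts as transductive inference can be sketched as a simple multi-source search (toy adjacency below; the segmentation and skeletonization steps are omitted):

```python
from collections import deque

# Toy skeleton-segment graph: nodes are vessel segments, edges are contacts
edges = [("r1", "a"), ("a", "b"), ("b", "c"),
         ("r2", "d"), ("d", "e"), ("e", "c")]     # "c" sits at a crossover
roots = {"r1": 1, "r2": 2}                        # segments touching the optic disc

graph = {}
for u, v in edges:
    graph.setdefault(u, set()).add(v)
    graph.setdefault(v, set()).add(u)

def propagate(graph, roots):
    """Assign each unlabeled node the tree label of its nearest root (multi-source BFS)."""
    labels = dict(roots)
    queue = deque(roots)
    while queue:
        u = queue.popleft()
        for v in graph[u]:
            if v not in labels:
                labels[v] = labels[u]
                queue.append(v)
    return labels

print(propagate(graph, roots))   # "c" is claimed by whichever tree reaches it first
```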

  15. Sympatry inference and network analysis in biogeography.

    PubMed

    Dos Santos, Daniel A; Fernández, Hugo R; Cuezzo, María Gabriela; Domínguez, Eduardo

    2008-06-01

    A new approach for biogeography to find patterns of sympatry, based on network analysis, is proposed. Biogeographic analysis focuses basically on sympatry patterns of species. Sympatry is a network (= relational) datum, but it has never been analyzed before using relational tools such as Network Analysis. Our approach to biogeographic analysis consists of two parts: first the sympatry inference and second the network analysis method (NAM). The sympatry inference method was designed to propose sympatry hypotheses, constructing a basal sympatry network based on punctual data, independent of a priori distributional area determination. In this way, two or more species are considered sympatric when there is interpenetration and relative proximity among their records of occurrence. In nature, groups of species presenting within-group sympatry and between-group allopatry constitute natural units (units of co-occurrence). These allopatric units are usually connected by intermediary species. The network analysis method (NAM) that we propose here is based on the identification and removal of intermediary species to segregate units of co-occurrence, using the betweenness measure and the clustering coefficient. The species ranges of the units of co-occurrence obtained are transferred to a map, being considered as candidates for areas of endemism. The new approach was implemented on three different real complex data sets (one of them a classic example previously used in biogeography) resulting in (1) independence of predefined spatial units; (2) definition of co-occurrence patterns from the sympatry network structure, not from species range similarities; (3) higher stability in results despite scale changes; (4) identification of candidates for areas of endemism supported by strictly endemic species; (5) identification of intermediary species with particular biological attributes.
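
    The segregation step of NAM can be sketched with networkx (assuming that library is acceptable; the sympatry network below is a toy, not the authors' data sets): rank species by betweenness, remove the most intermediary ones, and read off the connected components as candidate units of co-occurrence.

```python
import networkx as nx

# Toy sympatry network: two tight co-occurrence groups bridged by one species
G = nx.Graph()
G.add_edges_from([("a", "b"), ("b", "c"), ("a", "c"),      # group 1
                  ("d", "e"), ("e", "f"), ("d", "f"),      # group 2
                  ("c", "x"), ("x", "d")])                 # "x" is intermediary

betweenness = nx.betweenness_centrality(G)
intermediaries = [n for n, b in betweenness.items()
                  if b == max(betweenness.values())]

H = G.copy()
H.remove_nodes_from(intermediaries)
units = list(nx.connected_components(H))
print("removed:", intermediaries)          # ['x']
print("units of co-occurrence:", units)    # [{'a', 'b', 'c'}, {'d', 'e', 'f'}]
```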

  16. Pathway network inference from gene expression data

    PubMed Central

    2014-01-01

    Background The development of high-throughput omics technologies enabled genome-wide measurements of the activity of cellular elements and provides the analytical resources for the progress of the Systems Biology discipline. Analysis and interpretation of gene expression data has evolved from the gene to the pathway and interaction level, i.e. from the detection of differentially expressed genes, to the establishment of gene interaction networks and the identification of enriched functional categories. Still, the understanding of biological systems requires a further level of analysis that addresses the characterization of the interaction between functional modules. Results We present a novel computational methodology to study the functional interconnections among the molecular elements of a biological system. The PANA approach uses high-throughput genomics measurements and a functional annotation scheme to extract an activity profile from each functional block -or pathway- followed by machine-learning methods to infer the relationships between these functional profiles. The result is a global, interconnected network of pathways that represents the functional cross-talk within the molecular system. We have applied this approach to describe the functional transcriptional connections during the yeast cell cycle and to identify pathways that change their connectivity in a disease condition using an Alzheimer example. Conclusions PANA is a useful tool to deepen our understanding of the functional interdependences that operate within complex biological systems. We show the approach is algorithmically consistent and the inferred network is well supported by the available functional data. The method allows the dissection of the molecular basis of the functional connections and we describe the different regulatory mechanisms that explain the network's topology obtained for the yeast cell cycle data. PMID:25032889

  17. Network geometry inference using common neighbors

    NASA Astrophysics Data System (ADS)

    Papadopoulos, Fragkiskos; Aldecoa, Rodrigo; Krioukov, Dmitri

    2015-08-01

    We introduce and explore a method for inferring hidden geometric coordinates of nodes in complex networks based on the number of common neighbors between the nodes. We compare this approach to the HyperMap method, which is based only on the connections (and disconnections) between the nodes, i.e., on the links that the nodes have (or do not have). We find that for high degree nodes, the common-neighbors approach yields a more accurate inference than the link-based method, unless heuristic periodic adjustments (or "correction steps") are used in the latter. The common-neighbors approach is computationally intensive, requiring O(t^4) running time to map a network of t nodes, versus O(t^3) in the link-based method. But we also develop a hybrid method with O(t^3) running time, which combines the common-neighbors and link-based approaches, and we explore a heuristic that reduces its running time further to O(t^2), without significant reduction in the mapping accuracy. We apply this method to the autonomous systems (ASs) Internet, and we reveal how soft communities of ASs evolve over time in the similarity space. We further demonstrate the method's predictive power by forecasting future links between ASs. Taken altogether, our results advance our understanding of how to efficiently and accurately map real networks to their latent geometric spaces, which is an important necessary step toward understanding the laws that govern the dynamics of nodes in these spaces, and the fine-grained dynamics of network connections.
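
    A toy version of the basic quantity the method relies on, common-neighbor counts between node pairs (computed here with networkx on a random graph; the full hyperbolic-embedding procedure is not reproduced):

```python
import networkx as nx

G = nx.erdos_renyi_graph(50, 0.1, seed=4)

def common_neighbor_counts(G):
    """Number of common neighbors for every node pair (the method's basic input)."""
    nodes = list(G)
    cn = {}
    for i, u in enumerate(nodes):
        for v in nodes[i + 1:]:
            cn[(u, v)] = len(list(nx.common_neighbors(G, u, v)))
    return cn

cn = common_neighbor_counts(G)
# Pairs with many common neighbors are inferred to be close in the latent geometry
closest = sorted(cn, key=cn.get, reverse=True)[:5]
print([(pair, cn[pair]) for pair in closest])
```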

  18. Systematic parameter inference in stochastic mesoscopic modeling

    NASA Astrophysics Data System (ADS)

    Lei, Huan; Yang, Xiu; Li, Zhen; Karniadakis, George Em

    2017-02-01

    We propose a method to efficiently determine the optimal coarse-grained force field in mesoscopic stochastic simulations of Newtonian fluid and polymer melt systems modeled by dissipative particle dynamics (DPD) and energy conserving dissipative particle dynamics (eDPD). The response surfaces of various target properties (viscosity, diffusivity, pressure, etc.) with respect to model parameters are constructed based on the generalized polynomial chaos (gPC) expansion using simulation results on sampling points (e.g., individual parameter sets). To alleviate the computational cost of evaluating the target properties, we employ the compressive sensing method to compute the coefficients of the dominant gPC terms given the prior knowledge that the coefficients are "sparse". The proposed method shows accuracy comparable to the standard probabilistic collocation method (PCM) while imposing a much weaker restriction on the number of simulation samples, especially for systems with a high-dimensional parametric space. Full access to the response surfaces within the confidence range enables us to infer the optimal force parameters given the desirable values of target properties at the macroscopic scale. Moreover, it enables us to investigate the intrinsic relationship between the model parameters, identify possible degeneracies in the parameter space, and optimize the model by eliminating model redundancies. The proposed method provides an efficient alternative approach for constructing mesoscopic models by inferring model parameters to recover target properties of the physical systems (e.g., from experimental measurements), where those force field parameters and formulation cannot be derived from the microscopic level in a straightforward way.
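
    A sketch of the sparse-recovery step under stated assumptions: ordinary polynomial features stand in for a proper orthogonal gPC basis, scikit-learn's LASSO stands in for the compressive-sensing solver, and the "viscosity" response is invented, so this only illustrates how a sparse surrogate can be fit from fewer samples than basis terms.

      # Fit a sparse polynomial surrogate of a target property from few samples.
      import numpy as np
      from sklearn.linear_model import Lasso
      from sklearn.preprocessing import PolynomialFeatures

      rng = np.random.default_rng(0)
      n_samples, n_params = 40, 4                    # few simulations, 4 model parameters
      X = rng.uniform(-1, 1, size=(n_samples, n_params))

      def viscosity(x):                              # hypothetical target property
          return 1.0 + 2.0 * x[:, 0] - 0.5 * x[:, 1] * x[:, 2] + 0.1 * x[:, 3] ** 2

      y = viscosity(X) + 0.01 * rng.normal(size=n_samples)

      basis = PolynomialFeatures(degree=4)           # 70 basis terms from 40 samples
      Phi = basis.fit_transform(X)
      model = Lasso(alpha=1e-3, max_iter=50000).fit(Phi, y)

      active = np.flatnonzero(np.abs(model.coef_) > 1e-3)
      print("active terms:", [basis.get_feature_names_out()[i] for i in active])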

  19. Spurious correlations and inference in landscape genetics.

    PubMed

    Cushman, Samuel A; Landguth, Erin L

    2010-09-01

    Reliable interpretation of landscape genetic analyses depends on statistical methods that have high power to identify the correct process driving gene flow while rejecting incorrect alternative hypotheses. Little is known about statistical power and inference in individual-based landscape genetics. Our objective was to evaluate the power of causal modelling with partial Mantel tests in individual-based landscape genetic analysis. We used a spatially explicit simulation model to generate genetic data across a spatially distributed population as functions of several alternative gene flow processes. This allowed us to stipulate the actual process that is in action, enabling formal evaluation of the strength of spurious correlations with incorrect models. First, we evaluated the degree to which naïve correlational approaches can lead to incorrect attribution of the driver of observed genetic structure. Second, we evaluated the power of causal modelling with partial Mantel tests on resistance gradients to correctly identify the explanatory model and reject incorrect alternative models. Third, we evaluated how rapidly after the landscape genetic process is initiated we are able to reliably detect the effect of the correct model and reject the incorrect models. Our analyses suggest that simple correlational analyses between genetic data and proposed explanatory models produce strong spurious correlations, which lead to incorrect inferences. We found that causal modelling was extremely effective at rejecting incorrect explanations and correctly identifying the true causal process. We propose a generalized framework for landscape genetics based on analysis of the spatial genetic relationships among individual organisms relative to alternative hypotheses that define functional relationships between landscape features and spatial population processes.
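
    The basic building block of these analyses is the Mantel test; a minimal permutation version is sketched below on synthetic matrices (the resistance and genetic distances are simulated, and the partial Mantel machinery used for causal modelling is not included).

      # Permutation Mantel test between a genetic-distance matrix and a
      # hypothesized landscape-resistance matrix (both synthetic).
      import numpy as np

      rng = np.random.default_rng(0)
      n = 25
      coords = rng.uniform(0, 10, size=(n, 2))
      resist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)
      genet = 0.8 * resist + rng.normal(scale=1.0, size=(n, n))
      genet = (genet + genet.T) / 2
      np.fill_diagonal(genet, 0)

      def mantel(a, b, n_perm=999):
          iu = np.triu_indices_from(a, k=1)
          r_obs = np.corrcoef(a[iu], b[iu])[0, 1]
          exceed = 0
          for _ in range(n_perm):
              p = rng.permutation(a.shape[0])          # permute rows and columns together
              exceed += np.corrcoef(a[p][:, p][iu], b[iu])[0, 1] >= r_obs
          return r_obs, (exceed + 1) / (n_perm + 1)

      r, pval = mantel(genet, resist)
      print(f"Mantel r = {r:.3f}, one-sided p = {pval:.3f}")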

  20. Length Scales in Bayesian Automatic Adaptive Quadrature

    NASA Astrophysics Data System (ADS)

    Adam, Gh.; Adam, S.

    2016-02-01

    Two conceptual developments in the Bayesian automatic adaptive quadrature approach to the numerical solution of one-dimensional Riemann integrals [Gh. Adam, S. Adam, Springer LNCS 7125, 1-16 (2012)] are reported. First, it is shown that a numerical quadrature which avoids overcomputing and minimizes the hidden floating-point loss of precision requires the consideration of three classes of integration domain lengths, each endowed with a specific quadrature sum: microscopic (trapezoidal rule), mesoscopic (Simpson rule), and macroscopic (quadrature sums of high algebraic degrees of precision). Second, sensitive diagnostic tools for the Bayesian inference on macroscopic ranges, coming from the use of Clenshaw-Curtis quadrature, are derived.
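
    The length-scale idea can be caricatured in a few lines: pick the quadrature sum according to the size of the subinterval. The thresholds and the Gauss-Legendre order below are arbitrary illustrative choices, not the Bayesian diagnostics developed in the paper.

      # Choose a quadrature sum by integration-domain length.
      import numpy as np

      def integrate(f, a, b, micro=1e-8, meso=1e-3):
          h = b - a
          if h <= micro:                   # microscopic range: trapezoidal rule
              return 0.5 * h * (f(a) + f(b))
          if h <= meso:                    # mesoscopic range: Simpson rule
              return h / 6.0 * (f(a) + 4.0 * f(0.5 * (a + b)) + f(b))
          x, w = np.polynomial.legendre.leggauss(10)   # macroscopic: high-degree rule
          t = 0.5 * h * x + 0.5 * (a + b)  # map nodes from [-1, 1] to [a, b]
          return 0.5 * h * np.sum(w * f(t))

      print(integrate(np.sin, 0.0, np.pi))  # exact value is 2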

  1. Implementation of Fuzzy Inference Systems Using Neural Network Techniques

    DTIC Science & Technology

    1992-03-01

    rules required to implement the system, which are usually supplied by 'experts'. One alternative is to use a neural network-type architecture to implement...the fuzzy inference system, and neural network-type training techniques to 'learn' the control parameters needed by the fuzzy inference system. By...using a generalized version of a neural network, the rules of the fuzzy inference system can be learned without the assistance of experts.
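
    The idea of learning fuzzy rules from data rather than from experts can be illustrated with a minimal zero-order Sugeno-style system: Gaussian membership functions are fixed on a grid and the rule consequents are fitted by least squares. The target function, grid and widths are illustrative assumptions, not the architecture in this report.

      # Learn rule consequents of a tiny fuzzy inference system from data.
      import numpy as np

      rng = np.random.default_rng(0)
      x = rng.uniform(-3, 3, size=200)
      y = np.sin(x) + 0.05 * rng.normal(size=x.size)   # data the "rules" must capture

      centers = np.linspace(-3, 3, 7)                  # one rule per membership function
      sigma = 0.8

      def firing(x):
          w = np.exp(-0.5 * ((x[:, None] - centers[None, :]) / sigma) ** 2)
          return w / w.sum(axis=1, keepdims=True)      # normalized rule strengths

      W = firing(x)
      consequents, *_ = np.linalg.lstsq(W, y, rcond=None)   # learned rule outputs

      x_test = np.linspace(-3, 3, 9)
      print(np.round(firing(x_test) @ consequents, 2))      # fuzzy-system prediction
      print(np.round(np.sin(x_test), 2))                    # true function for comparison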

  2. Interplanetary magnetic sector polarity inferred from polar geomagnetic field observations

    NASA Technical Reports Server (NTRS)

    Friis-Christensen, E.; Lassen, K.; Wilcox, J. M.; Gonzalez, W.; Colburn, D. S.

    1971-01-01

    In order to infer the interplanetary sector polarity from polar geomagnetic field diurnal variations, measurements were carried out at Godhavn and Thule (Denmark) Geomagnetic Observatories. The inferred interplanetary sector polarity was compared with the polarity observed at the same time by Explorer 33 and 35 magnetometers. It is shown that the polarity (toward or away from the sun) of the interplanetary magnetic field can be reliably inferred from observations of the polar cap geomagnetic fields.

  3. Dynamical Logic Driven by Classified Inferences Including Abduction

    NASA Astrophysics Data System (ADS)

    Sawa, Koji; Gunji, Yukio-Pegio

    2010-11-01

    We propose a dynamical model of formal logic which realizes a representation of logical inferences, deduction and induction. In addition, it also represents abduction, which is classified by Peirce as the third inference following deduction and induction. The three types of inference are represented as transformations of a directed graph. The state of a relation between objects of the model fluctuates between the collective and the distinctive. In addition, the location of a relation within the sequence of relations influences its state.

  4. Fuzzy inference game approach to uncertainty in business decisions and market competitions.

    PubMed

    Oderanti, Festus Oluseyi

    2013-01-01

    The increasing challenges and complexity of business environments are making it more difficult for entrepreneurs to predict the outcomes of business decisions and operations. Therefore, we developed a decision support scheme that could be used and adapted to various business decision processes. These involve decisions that are made under uncertain situations such as business competition in the market or wage negotiation within a firm. The scheme uses game strategies and fuzzy inference concepts to effectively grasp the variables in these uncertain situations. The games are played between human and fuzzy players. The accuracy of the fuzzy rule base and the game strategies help to mitigate the adverse effects that a business may suffer from these uncertain factors. We also introduced learning, which enables the fuzzy player to adapt over time. We tested this scheme in different scenarios and discovered that it could be an invaluable tool in the hands of entrepreneurs operating under uncertain and competitive business environments.

  5. Habituation of visual adaptation

    PubMed Central

    Dong, Xue; Gao, Yi; Lv, Lili; Bao, Min

    2016-01-01

    Our sensory system adjusts its function driven by both shorter-term (e.g. adaptation) and longer-term (e.g. learning) experiences. Most past adaptation literature focuses on short-term adaptation. Only recently have researchers begun to investigate how adaptation changes over a span of days. This question is important, since in real life many environmental changes stretch over multiple days or longer. However, the answer to the question remains largely unclear. Here we addressed this issue by tracking perceptual bias (also known as aftereffect) induced by motion or contrast adaptation across multiple daily adaptation sessions. Aftereffects were measured every day after adaptation, which corresponded to the degree of adaptation on each day. For passively viewed adapters, repeated adaptation attenuated aftereffects. Once adapters were presented with an attentional task, aftereffects could either decrease for easy tasks or initially show an increase followed by a later decrease for demanding tasks. Quantitative analysis of the decay rates in contrast adaptation showed that repeated exposure to the adapter appeared to be equivalent to adaptation to a weaker stimulus. These results suggest that both attention and a non-attentional habituation-like mechanism jointly determine how adaptation develops across multiple daily sessions. PMID:26739917

  6. Inferring animal social networks and leadership: applications for passive monitoring arrays

    PubMed Central

    Papastamatiou, Yannis P.; Freeman, Robin

    2016-01-01

    Analyses of animal social networks have frequently benefited from techniques derived from other disciplines. Recently, machine learning algorithms have been adopted to infer social associations from time-series data gathered using remote, telemetry systems situated at provisioning sites. We adapt and modify existing inference methods to reveal the underlying social structure of wide-ranging marine predators moving through spatial arrays of passive acoustic receivers. From six months of tracking data for grey reef sharks (Carcharhinus amblyrhynchos) at Palmyra atoll in the Pacific Ocean, we demonstrate that some individuals emerge as leaders within the population and that this behavioural coordination is predicted by both sex and the duration of co-occurrences between conspecifics. In doing so, we provide the first evidence of long-term, spatially extensive social processes in wild sharks. To achieve these results, we interrogate simulated and real tracking data with the explicit purpose of drawing attention to the key considerations in the use and interpretation of inference methods and their impact on resultant social structure. We provide a modified translation of the GMMEvents method for R, including new analyses quantifying the directionality and duration of social events with the aim of encouraging the careful use of these methods more widely in less tractable social animal systems but where passive telemetry is already widespread. PMID:27881803

  7. Inferring topologies via driving-based generalized synchronization of two-layer networks

    NASA Astrophysics Data System (ADS)

    Wang, Yingfei; Wu, Xiaoqun; Feng, Hui; Lu, Jun-an; Xu, Yuhua

    2016-05-01

    The interaction topology among the constituents of a complex network plays a crucial role in the network’s evolutionary mechanisms and functional behaviors. However, some network topologies are usually unknown or uncertain. Meanwhile, coupling delays are ubiquitous in various man-made and natural networks. Hence, it is necessary to gain knowledge of the whole or partial topology of a complex dynamical network by taking into consideration communication delay. In this paper, topology identification of complex dynamical networks is investigated via generalized synchronization of a two-layer network. Particularly, based on the LaSalle-type invariance principle of stochastic differential delay equations, an adaptive control technique is proposed by constructing an auxiliary layer and designing proper control input and updating laws so that the unknown topology can be recovered upon successful generalized synchronization. Numerical simulations are provided to illustrate the effectiveness of the proposed method. The technique provides a certain theoretical basis for topology inference of complex networks. In particular, when the considered network is composed of systems with high-dimension or complicated dynamics, a simpler response layer can be constructed, which is conducive to circuit design. Moreover, it is practical to take into consideration perturbations caused by control input. Finally, the method is applicable to infer topology of a subnetwork embedded within a complex system and locate hidden sources. We hope the results can provide basic insight into further research endeavors on understanding practical and economical topology inference of networks.

  8. Inferring animal social networks and leadership: applications for passive monitoring arrays.

    PubMed

    Jacoby, David M P; Papastamatiou, Yannis P; Freeman, Robin

    2016-11-01

    Analyses of animal social networks have frequently benefited from techniques derived from other disciplines. Recently, machine learning algorithms have been adopted to infer social associations from time-series data gathered using remote, telemetry systems situated at provisioning sites. We adapt and modify existing inference methods to reveal the underlying social structure of wide-ranging marine predators moving through spatial arrays of passive acoustic receivers. From six months of tracking data for grey reef sharks (Carcharhinus amblyrhynchos) at Palmyra atoll in the Pacific Ocean, we demonstrate that some individuals emerge as leaders within the population and that this behavioural coordination is predicted by both sex and the duration of co-occurrences between conspecifics. In doing so, we provide the first evidence of long-term, spatially extensive social processes in wild sharks. To achieve these results, we interrogate simulated and real tracking data with the explicit purpose of drawing attention to the key considerations in the use and interpretation of inference methods and their impact on resultant social structure. We provide a modified translation of the GMMEvents method for R, including new analyses quantifying the directionality and duration of social events with the aim of encouraging the careful use of these methods more widely in less tractable social animal systems but where passive telemetry is already widespread.

  9. Adaptive Neuro-Fuzzy Inference System for Computing the Resonant Frequency of Circular Microstrip Antennas

    DTIC Science & Technology

    2004-11-01

    Use of Artificial Neural Networks,” Microwave and Optical Technology Letters, Vol. 14, pp. 89-93, 1997. [41] S. Sagiroglu, K. Guney, and M. Erler ...Computer-Aided Engineering, Vol. 8, pp. 270-277, 1998. [42] S. Sagiroglu, K. Guney, and M. Erler, “Calculation of Bandwidth for Electrically Thin and...S. Sagiroglu, and M. Erler, “Neural Computation of Resonant Frequency of Electrically Thin and Thick Rectangular Microstrip Antennas,” IEE Proc.

  10. Quantitative Modeling of Virus Evolutionary Dynamics and Adaptation in Serial Passages Using Empirically Inferred Fitness Landscapes

    DTIC Science & Technology

    2014-01-01

    Jaques Reifman, Biotechnology High Performance Computing Software Applications Institute, Telemedicine and Advanced Technology Research Center, U.S...hwoo@bhsai.org, or Jaques Reifman, jaques.reifman.civ@mail.mil. Copyright © 2014, American Society for Microbiology. All Rights Reserved. doi...Medical Research and Materiel Command, DoD Biotechnology High Performance Computing Software Applications Institute, Telemedicine and Advanced Technology

  11. A Comprehensive Approach to Fusion for Microsensor Networks: Distributed and Hierarchical Inference, Communication, and Adaption

    DTIC Science & Technology

    2000-08-01

    compression based on the Burrows-Wheeler transform: Analysis and optimality. IEEE Transactions on Information Theory, 2001. pp. 1461-1472. 82. K...Theory, 2005. 84. M.J. Wainwright, E. P. Simoncelli, A. S. Willsky. Random cascades on wavelet trees and their use in analyzing and modeling natural...images. Applied Computational and Harmonic Analysis (Special issue on wavelet applications), April 2001. 85. M.J. Wainwright, T.S. Jaakkola, A.S

  12. MI-ANFIS: A Multiple Instance Adaptive Neuro-Fuzzy Inference System

    DTIC Science & Technology

    2015-08-02

    commonly used to evaluate MIL methods. The data sets are namely the MUSK1, MUSK2 [11], and Fox, Tiger, and Elephant from the COREL data set [12]. MUSK1...MUSK1 has 92 bags, of which 47 are positive, and MUSK2 has 102 bags, of which 39 are positive. The other data sets from COREL: Fox, Tiger, and...Negative No.Instances MUSK1 166(25) 92 47 45 2→ 40 MUSK2 166(25) 102 39 63 1→ 1044 Fox 230(10) 200 100 100 2→ 13 Tiger 230(10) 200 100 100 1→ 13 Elephant

  13. Making inference from wildlife collision data: inferring predator absence from prey strikes.

    PubMed

    Caley, Peter; Hosack, Geoffrey R; Barry, Simon C

    2017-01-01

    Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to make valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application.

  14. Making inference from wildlife collision data: inferring predator absence from prey strikes

    PubMed Central

    Hosack, Geoffrey R.; Barry, Simon C.

    2017-01-01

    Wildlife collision data are ubiquitous, though challenging for making ecological inference due to typically irreducible uncertainty relating to the sampling process. We illustrate a new approach that is useful for generating inference from predator data arising from wildlife collisions. By simply conditioning on a second prey species sampled via the same collision process, and by using a biologically realistic numerical response function, we can produce a coherent numerical response relationship between predator and prey. This relationship can then be used to make inference on the population size of the predator species, including the probability of extinction. The statistical conditioning enables us to account for unmeasured variation in factors influencing the runway strike incidence for individual airports and to make valid comparisons. A practical application of the approach for testing hypotheses about the distribution and abundance of a predator species is illustrated using the hypothesized red fox incursion into Tasmania, Australia. We estimate that conditional on the numerical response between fox and lagomorph runway strikes on mainland Australia, the predictive probability of observing no runway strikes of foxes in Tasmania after observing 15 lagomorph strikes is 0.001. We conclude there is enough evidence to safely reject the null hypothesis that there is a widespread red fox population in Tasmania at a population density consistent with prey availability. The method is novel and has potential wider application. PMID:28243534

  15. The probabilistic convolution tree: efficient exact Bayesian inference for faster LC-MS/MS protein inference.

    PubMed

    Serang, Oliver

    2014-01-01

    Exact Bayesian inference can sometimes be performed efficiently for special cases where a function has commutative and associative symmetry of its inputs (called "causal independence"). For this reason, it is desirable to exploit such symmetry on big data sets. Here we present a method to exploit a general form of this symmetry on probabilistic adder nodes by transforming those probabilistic adder nodes into a probabilistic convolution tree with which dynamic programming computes exact probabilities. A substantial speedup is demonstrated using an illustrative example that can arise when identifying splice forms with bottom-up mass spectrometry-based proteomics. On this example, even state-of-the-art exact inference algorithms require a runtime more than exponential in the number of splice forms considered. By using the probabilistic convolution tree, we reduce the runtime to O(k log^2(k)) and the space to O(k log(k)), where k is the number of variables joined by an additive or cardinal operator. This approach, which can also be used with junction tree inference, is applicable to graphs with arbitrary dependency on counting variables or cardinalities and can be used on diverse problems and fields like forward error correcting codes, elemental decomposition, and spectral demixing. The approach also trivially generalizes to multiple dimensions.
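
    The core trick can be shown in miniature: the distribution of a sum of independent count variables is the convolution of the individual distributions, and performing the convolutions pairwise in a balanced tree keeps the work close to k log k convolution steps. The Bernoulli inputs below are toy stand-ins for the indicator nodes, not the proteomics model itself.

      # Convolve discrete distributions pairwise in a balanced tree.
      import numpy as np

      def convolution_tree(dists):
          """Return the distribution of the sum of independent count variables."""
          layer = [np.asarray(d, dtype=float) for d in dists]
          while len(layer) > 1:
              nxt = [np.convolve(layer[i], layer[i + 1])
                     for i in range(0, len(layer) - 1, 2)]
              if len(layer) % 2:           # carry an odd leftover up one level
                  nxt.append(layer[-1])
              layer = nxt
          return layer[0]

      ps = [0.1, 0.3, 0.5, 0.2, 0.7, 0.4, 0.6, 0.25]      # eight Bernoulli indicators
      dist = convolution_tree([[1 - p, p] for p in ps])   # P(total = 0..8)
      print(dist.round(4), dist.sum())                    # probabilities sum to 1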

  16. Expressing Adaptation Strategies Using Adaptation Patterns

    ERIC Educational Resources Information Center

    Zemirline, N.; Bourda, Y.; Reynaud, C.

    2012-01-01

    Today, there is a real challenge to enable personalized access to information. Several systems have been proposed to address this challenge including Adaptive Hypermedia Systems (AHSs). However, the specification of adaptation strategies remains a difficult task for creators of such systems. In this paper, we consider the problem of the definition…

  17. Inferring gene regression networks with model trees

    PubMed Central

    2010-01-01

    Background Novel strategies are required in order to handle the huge amount of data produced by microarray technologies. To infer gene regulatory networks, the first step is to find direct regulatory relationships between genes, building the so-called gene co-expression networks. They are typically generated using correlation statistics as pairwise similarity measures. Correlation-based methods are very useful in order to determine whether two genes have a strong global similarity but do not detect local similarities. Results We propose model trees as a method to identify gene interaction networks. While correlation-based methods analyze each pair of genes, in our approach we generate a single regression tree for each gene from the remaining genes. Finally, a graph from all the relationships among output and input genes is built taking into account whether the pair of genes is statistically significant. For this reason we apply a statistical procedure to control the false discovery rate. The performance of our approach, named REGNET, is experimentally tested on two well-known data sets: Saccharomyces cerevisiae and E. coli. First, the biological coherence of the results is tested. Second, the E. coli transcriptional network (in the Regulon database) is used as a control to compare the results with those of a correlation-based method. This experiment shows that REGNET performs more accurately at detecting true gene associations than the Pearson and Spearman zeroth and first-order correlation-based methods. Conclusions REGNET generates gene association networks from gene expression data, and differs from correlation-based methods in that the relationship between one gene and others is calculated simultaneously. Model trees are very useful techniques to estimate the numerical values for the target genes by linear regression functions. They are very often more precise than linear regression models because they can add just different linear regressions to separate
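
    The "one tree per gene" idea is sketched below on a synthetic expression matrix. scikit-learn has no model tree (linear models in the leaves), so a plain regression tree stands in, and a crude importance threshold stands in for the paper's significance and false-discovery-rate filtering.

      # Fit one regression tree per gene and keep the strongest predictors.
      import numpy as np
      from sklearn.tree import DecisionTreeRegressor

      rng = np.random.default_rng(0)
      n_samples, n_genes = 80, 12
      X = rng.normal(size=(n_samples, n_genes))
      X[:, 3] = 0.9 * X[:, 0] - 0.5 * X[:, 7] + 0.1 * rng.normal(size=n_samples)

      edges = []
      for target in range(n_genes):
          predictors = np.delete(np.arange(n_genes), target)
          tree = DecisionTreeRegressor(max_depth=3, random_state=0)
          tree.fit(X[:, predictors], X[:, target])
          for src, imp in zip(predictors, tree.feature_importances_):
              if imp > 0.15:               # crude threshold instead of an FDR test
                  edges.append((int(src), target, round(float(imp), 2)))

      print(edges)                         # associations involving genes 0, 3 and 7 dominate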

  18. The Strong-Inference Protocol: Not Just for Grant Proposals

    ERIC Educational Resources Information Center

    Hiebert, Sara M.

    2007-01-01

    The strong-inference protocol puts into action the important concepts in Platt's often-assigned, classic paper on the strong-inference method (10). Yet, perhaps because students are frequently performing experiments with known outcomes, the protocols they write as undergraduates are usually little more than step-by-step instructions for performing…

  19. Causal Inference and Language Comprehension: Event-Related Potential Investigations

    ERIC Educational Resources Information Center

    Davenport, Tristan S.

    2014-01-01

    The most important information conveyed by language is often contained not in the utterance itself, but in the interaction between the utterance and the comprehender's knowledge of the world and the current situation. This dissertation uses psycholinguistic methods to explore the effects of a common type of inference--causal inference--on language…

  20. Statistical Inference and Patterns of Inequality in the Global North

    ERIC Educational Resources Information Center

    Moran, Timothy Patrick

    2006-01-01

    Cross-national inequality trends have historically been a crucial field of inquiry across the social sciences, and new methodological techniques of statistical inference have recently improved the ability to analyze these trends over time. This paper applies Monte Carlo bootstrap inference methods to the income surveys of the Luxembourg Income…

  1. Investigation of Statistical Inference Methodologies Through Scale Model Propagation Experiments

    DTIC Science & Technology

    2015-09-30

    Investigation of Statistical Inference Methodologies Through Scale Model Propagation Experiments Jason D. Sagers Applied Research Laboratories...statistical inference methodologies for ocean-acoustic problems by investigating and applying statistical methods to data collected from scale-model...experiments over a translationally invariant wedge, (2) to plan and conduct 3D propagation experiments over the Hudson Canyon scale-model bathymetry, and (3

  2. Bayesian Statistical Inference for Coefficient Alpha. ACT Research Report Series.

    ERIC Educational Resources Information Center

    Li, Jun Corser; Woodruff, David J.

    Coefficient alpha is a simple and very useful index of test reliability that is widely used in educational and psychological measurement. Classical statistical inference for coefficient alpha is well developed. This paper presents two methods for Bayesian statistical inference for a single sample alpha coefficient. An approximate analytic method…
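
    For reference, the classical point estimate of coefficient alpha takes only a few lines; a nonparametric bootstrap interval is added below as a rough frequentist stand-in for the Bayesian intervals this report develops, computed on simulated item scores.

      # Cronbach's alpha with a bootstrap interval on simulated test data.
      import numpy as np

      rng = np.random.default_rng(0)
      n_examinees, k = 200, 10
      ability = rng.normal(size=(n_examinees, 1))
      items = ability + rng.normal(scale=1.2, size=(n_examinees, k))

      def cronbach_alpha(scores):
          k = scores.shape[1]
          item_var = scores.var(axis=0, ddof=1).sum()
          total_var = scores.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_var / total_var)

      alpha = cronbach_alpha(items)
      boot = [cronbach_alpha(items[rng.integers(0, n_examinees, n_examinees)])
              for _ in range(2000)]
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"alpha = {alpha:.3f}, 95% bootstrap interval = ({lo:.3f}, {hi:.3f})")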

  3. The Role of Causal Models in Analogical Inference

    ERIC Educational Resources Information Center

    Lee, Hee Seung; Holyoak, Keith J.

    2008-01-01

    Computational models of analogy have assumed that the strength of an inductive inference about the target is based directly on similarity of the analogs and in particular on shared higher order relations. In contrast, work in philosophy of science suggests that analogical inference is also guided by causal models of the source and target. In 3…

  4. Strategic Processing and Predictive Inference Generation in L2 Reading

    ERIC Educational Resources Information Center

    Nahatame, Shingo

    2014-01-01

    Predictive inference is the anticipation of the likely consequences of events described in a text. This study investigated predictive inference generation during second language (L2) reading, with a focus on the effects of strategy instructions. In this experiment, Japanese university students read several short narrative passages designed to…

  5. Contemporary Quantitative Methods and "Slow" Causal Inference: Response to Palinkas

    ERIC Educational Resources Information Center

    Stone, Susan

    2014-01-01

    This response considers together simultaneously occurring discussions about causal inference in social work and allied health and social science disciplines. It places emphasis on scholarship that integrates the potential outcomes model with directed acyclic graphing techniques to extract core steps in causal inference. Although this scholarship…

  6. Hemispheric inference priming during comprehension of conversations and narratives.

    PubMed

    Powers, Chivon; Bencic, Rachel; Horton, William S; Beeman, Mark

    2012-09-01

    In this study we examined asymmetric semantic activation patterns as people listened to conversations and narratives that promoted causal inferences. Based on the hypothesis that understanding the unique features of conversational input may benefit from or require a modified pattern of conceptual activation during conversation, we compared semantic priming in both hemispheres for inferences embedded in conversations and in narratives. Participants named inference-related target words or unrelated words presented to the left visual field-right hemisphere (lvf-RH) or to the right visual field-left hemisphere (rvf-LH) at critical coherence points that required an inference in order to correctly understand an utterance in the context of the conversation or narrative. Fifty-seven undergraduates listened to 36 conversations or narratives and were tested at 100 target inference points. During narrative comprehension, inference-related priming was reliable and equally strong in both hemispheres. In contrast, during conversation comprehension, inference-related priming was only reliable for target words presented to lvf-RH. This work demonstrates that priming for inference-related concepts can be measured with input in conversational form and suggests the language processing style of the RH is advantageous for comprehending conversation.

  7. Deduced Inference in the Analysis of Experimental Data

    ERIC Educational Resources Information Center

    Bird, Kevin D.

    2011-01-01

    Any set of confidence interval inferences on J - 1 linearly independent contrasts on J means, such as the two comparisons μ1 - μ2 and μ2 - μ3 on 3 means, provides a basis for the deduction of interval inferences on all other contrasts, such as the redundant comparison μ1 -…

  8. Aging and Predicting Inferences: A Diffusion Model Analysis

    ERIC Educational Resources Information Center

    McKoon, Gail; Ratcliff, Roger

    2013-01-01

    In the domain of discourse processing, it has been claimed that older adults (60-0-year-olds) are less likely to encode and remember some kinds of information from texts than young adults. The experiment described here shows that they do make a particular kind of inference to the same extent that college-age adults do. The inferences examined were…

  9. A Probability Index of the Robustness of a Causal Inference

    ERIC Educational Resources Information Center

    Pan, Wei; Frank, Kenneth A.

    2003-01-01

    Causal inference is an important, controversial topic in the social sciences, where it is difficult to conduct experiments or measure and control for all confounding variables. To address this concern, the present study presents a probability index to assess the robustness of a causal inference to the impact of a confounding variable. The…

  10. Developing Young Students' Informal Inference Skills in Data Analysis

    ERIC Educational Resources Information Center

    Paparistodemou, Efi; Meletiou-Mavrotheris, Maria

    2008-01-01

    This paper focuses on developing students' informal inference skills, reporting on how a group of third grade students formulated and evaluated data-based inferences using the dynamic statistics data-visualization environment TinkerPlots[TM] (Konold & Miller, 2005), software specifically designed to meet the learning needs of students in the…

  11. Deontic Introduction: A Theory of Inference from Is to Ought

    ERIC Educational Resources Information Center

    Elqayam, Shira; Thompson, Valerie A.; Wilkinson, Meredith R.; Evans, Jonathan St. B. T.; Over, David E.

    2015-01-01

    Humans have a unique ability to generate novel norms. Faced with the knowledge that there are hungry children in Somalia, we easily and naturally infer that we ought to donate to famine relief charities. Although a contentious and lively issue in metaethics, such inference from "is" to "ought" has not been systematically…

  12. Jumping to the Right Conclusions, Inferences, and Predictions

    ERIC Educational Resources Information Center

    Giannattasio, Jack; Bazler, Judith

    2005-01-01

    Writing meaningful conclusions, drawing accurate inferences, and making relevant predictions are essential skills that many adolescents lack. The differences among conclusions, inferences, and predictions, although subtle, must be recognized to accurately analyze and interpret lab data. During one of the authors' 14 years as a physics and…

  13. Virtual reality and consciousness inference in dreaming

    PubMed Central

    Hobson, J. Allan; Hong, Charles C.-H.; Friston, Karl J.

    2014-01-01

    This article explores the notion that the brain is genetically endowed with an innate virtual reality generator that – through experience-dependent plasticity – becomes a generative or predictive model of the world. This model, which is most clearly revealed in rapid eye movement (REM) sleep dreaming, may provide the theater for conscious experience. Functional neuroimaging evidence for brain activations that are time-locked to rapid eye movements (REMs) endorses the view that waking consciousness emerges from REM sleep – and dreaming lays the foundations for waking perception. In this view, the brain is equipped with a virtual model of the world that generates predictions of its sensations. This model is continually updated and entrained by sensory prediction errors in wakefulness to ensure veridical perception, but not in dreaming. In contrast, dreaming plays an essential role in maintaining and enhancing the capacity to model the world by minimizing model complexity and thereby maximizing both statistical and thermodynamic efficiency. This perspective suggests that consciousness corresponds to the embodied process of inference, realized through the generation of virtual realities (in both sleep and wakefulness). In short, our premise or hypothesis is that the waking brain engages with the world to predict the causes of sensations, while in sleep the brain’s generative model is actively refined so that it generates more efficient predictions during waking. We review the evidence in support of this hypothesis – evidence that grounds consciousness in biophysical computations whose neuronal and neurochemical infrastructure has been disclosed by sleep research. PMID:25346710

  14. Attention as a Bayesian inference process

    NASA Astrophysics Data System (ADS)

    Chikkerur, Sharat; Serre, Thomas; Tan, Cheston; Poggio, Tomaso

    2011-03-01

    David Marr famously defined vision as "knowing what is where by seeing". In the framework described here, attention is the inference process that solves the visual recognition problem of what is where. The theory proposes a computational role for attention and leads to a model that performs well in recognition tasks and that predicts some of the main properties of attention at the level of psychophysics and physiology. We propose an algorithmic implementation, a Bayesian network that can be mapped onto the basic functional anatomy of attention involving the ventral and dorsal streams. This description integrates bottom-up, feature-based as well as spatial (context-based) attentional mechanisms. We show that the Bayesian model accurately predicts human eye fixations (considered as a proxy for shifts of attention) in natural scenes, and can improve accuracy in object recognition tasks involving cluttered real-world images. In both cases, we found that the proposed model can predict human performance better than existing bottom-up and top-down computational models.

  15. Global atmospheric black carbon inferred from AERONET

    PubMed Central

    Sato, Makiko; Hansen, James; Koch, Dorothy; Lacis, Andrew; Ruedy, Reto; Dubovik, Oleg; Holben, Brent; Chin, Mian; Novakov, Tica

    2003-01-01

    AERONET, a network of well calibrated sunphotometers, provides data on aerosol optical depth and absorption optical depth at >250 sites around the world. The spectral range of AERONET allows discrimination between constituents that absorb most strongly in the UV region, such as soil dust and organic carbon, and the more ubiquitously absorbing black carbon (BC). AERONET locations, primarily continental, are not representative of the global mean, but they can be used to calibrate global aerosol climatologies produced by tracer transport models. We find that the amount of BC in current climatologies must be increased by a factor of 2–4 to yield best agreement with AERONET, in the approximation in which BC is externally mixed with other aerosols. The inferred climate forcing by BC, regardless of whether it is internally or externally mixed, is ≈1 W/m2, most of which is probably anthropogenic. This positive forcing (warming) by BC must substantially counterbalance cooling by anthropogenic reflective aerosols. Thus, especially if reflective aerosols such as sulfates are reduced, it is important to reduce BC to minimize global warming. PMID:12746494

  16. How prescriptive norms influence causal inferences.

    PubMed

    Samland, Jana; Waldmann, Michael R

    2016-11-01

    Recent experimental findings suggest that prescriptive norms influence causal inferences. The cognitive mechanism underlying this finding is still under debate. We compare three competing theories: The culpable control model of blame argues that reasoners tend to exaggerate the causal influence of norm-violating agents, which should lead to relatively higher causal strength estimates for these agents. By contrast, the counterfactual reasoning account of causal selection assumes that norms do not alter the representation of the causal model, but rather later causal selection stages. According to this view, reasoners tend to preferentially consider counterfactual states of abnormal rather than normal factors, which leads to the choice of the abnormal factor in a causal selection task. A third view, the accountability hypothesis, claims that the effects of prescriptive norms are generated by the ambiguity of the causal test question. Asking whether an agent is a cause can be understood as a request to assess her causal contribution but also her moral accountability. According to this theory norm effects on causal selection are mediated by accountability judgments that are not only sensitive to the abnormality of behavior but also to mitigating factors, such as intentionality and knowledge of norms. Five experiments are presented that favor the accountability account over the two alternative theories.

  17. Functional network inference of the suprachiasmatic nucleus

    PubMed Central

    Abel, John H.; Meeker, Kirsten; Granados-Fuentes, Daniel; St. John, Peter C.; Wang, Thomas J.; Bales, Benjamin B.; Doyle, Francis J.; Herzog, Erik D.; Petzold, Linda R.

    2016-01-01

    In the mammalian suprachiasmatic nucleus (SCN), noisy cellular oscillators communicate within a neuronal network to generate precise system-wide circadian rhythms. Although the intracellular genetic oscillator and intercellular biochemical coupling mechanisms have been examined previously, the network topology driving synchronization of the SCN has not been elucidated. This network has been particularly challenging to probe, due to its oscillatory components and slow coupling timescale. In this work, we investigated the SCN network at a single-cell resolution through a chemically induced desynchronization. We then inferred functional connections in the SCN by applying the maximal information coefficient statistic to bioluminescence reporter data from individual neurons while they resynchronized their circadian cycling. Our results demonstrate that the functional network of circadian cells associated with resynchronization has small-world characteristics, with a node degree distribution that is exponential. We show that hubs of this small-world network are preferentially located in the central SCN, with sparsely connected shells surrounding these cores. Finally, we used two computational models of circadian neurons to validate our predictions of network structure. PMID:27044085
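
    A simplified stand-in for this analysis pipeline is sketched below: score pairwise association between single-cell reporter traces (plain correlation here instead of the maximal information coefficient), threshold the scores into a functional graph, and summarize it with networkx. The synthetic traces, threshold and summary statistics are assumptions for illustration only.

      # Build a functional-connectivity graph from synthetic circadian traces.
      import numpy as np
      import networkx as nx

      rng = np.random.default_rng(0)
      n_cells, n_times = 40, 300
      t = np.arange(n_times)
      phase = rng.uniform(0, 2 * np.pi, n_cells)
      traces = np.array([np.sin(2 * np.pi * t / 24 + phase[i])
                         + 0.3 * rng.normal(size=n_times) for i in range(n_cells)])

      corr = np.corrcoef(traces)
      np.fill_diagonal(corr, 0)
      adj = (np.abs(corr) > 0.6).astype(int)          # arbitrary threshold

      G = nx.from_numpy_array(adj)
      print("mean degree:", np.mean([d for _, d in G.degree()]))
      print("average clustering:", round(nx.average_clustering(G), 3))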

  18. Functional network inference of the suprachiasmatic nucleus

    SciTech Connect

    Abel, John H.; Meeker, Kirsten; Granados-Fuentes, Daniel; St. John, Peter C.; Wang, Thomas J.; Bales, Benjamin B.; Doyle, Francis J.; Herzog, Erik D.; Petzold, Linda R.

    2016-04-04

    In the mammalian suprachiasmatic nucleus (SCN), noisy cellular oscillators communicate within a neuronal network to generate precise system-wide circadian rhythms. Although the intracellular genetic oscillator and intercellular biochemical coupling mechanisms have been examined previously, the network topology driving synchronization of the SCN has not been elucidated. This network has been particularly challenging to probe, due to its oscillatory components and slow coupling timescale. In this work, we investigated the SCN network at a single-cell resolution through a chemically induced desynchronization. We then inferred functional connections in the SCN by applying the maximal information coefficient statistic to bioluminescence reporter data from individual neurons while they resynchronized their circadian cycling. Our results demonstrate that the functional network of circadian cells associated with resynchronization has small-world characteristics, with a node degree distribution that is exponential. We show that hubs of this small-world network are preferentially located in the central SCN, with sparsely connected shells surrounding these cores. Finally, we used two computational models of circadian neurons to validate our predictions of network structure.

  19. Causal inference with a quantitative exposure.

    PubMed

    Zhang, Zhiwei; Zhou, Jie; Cao, Weihua; Zhang, Jun

    2016-02-01

    The current statistical literature on causal inference is mostly concerned with binary or categorical exposures, even though exposures of a quantitative nature are frequently encountered in epidemiologic research. In this article, we review the available methods for estimating the dose-response curve for a quantitative exposure, which include ordinary regression based on an outcome regression model, inverse propensity weighting and stratification based on a propensity function model, and an augmented inverse propensity weighting method that is doubly robust with respect to the two models. We note that an outcome regression model often imposes an implicit constraint on the dose-response curve, and propose a flexible modeling strategy that avoids constraining the dose-response curve. We also propose two new methods: a weighted regression method that combines ordinary regression with inverse propensity weighting and a stratified regression method that combines ordinary regression with stratification. The proposed methods are similar to the augmented inverse propensity weighting method in the sense of double robustness, but easier to implement and more generally applicable. The methods are illustrated with an obstetric example and compared in simulation studies.
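
    A sketch of inverse propensity weighting with a quantitative exposure, under simple assumptions: the exposure given the confounder is modeled as normal (a generalized propensity score), stabilized weights are formed, and a weighted quadratic fit approximates the dose-response curve. All data are simulated, and this is not the doubly robust augmented estimator discussed above.

      # Inverse propensity weighting for a continuous exposure on simulated data.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 2000
      z = rng.normal(size=n)                           # confounder
      dose = 0.8 * z + rng.normal(size=n)              # exposure depends on z
      y = 1.0 + 0.5 * dose - 0.2 * dose**2 + 1.5 * z + rng.normal(size=n)

      beta = np.polyfit(z, dose, 1)                    # propensity model: dose ~ z
      resid = dose - np.polyval(beta, z)
      sig = resid.std(ddof=1)
      gps = np.exp(-0.5 * (resid / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

      sd = dose.std(ddof=1)                            # marginal dose density (stabilizer)
      marg = np.exp(-0.5 * ((dose - dose.mean()) / sd) ** 2) / (sd * np.sqrt(2 * np.pi))
      w = marg / gps                                   # stabilized weights

      coef = np.polyfit(dose, y, 2, w=np.sqrt(w))      # weighted dose-response fit
      print("weighted fit [quadratic, linear, intercept]:", np.round(coef, 2))
      print("true marginal curve coefficients:            [-0.2, 0.5, 1.0]")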

  20. Inferences of Ice Processes From Properties

    NASA Astrophysics Data System (ADS)

    Alley, R. B.; Wilen, L. A.; Spencer, M. K.; Hansen, D. P.; Fitzpatrick, J. J.

    2001-12-01

    Barclay Kamb's pioneering work on the physics and mineralogy of laboratory and natural ices has guided glaciological research spanning 40 years. Much of that research required extremely tedious use of optical universal stages to study thin sections of ice. Recent advances in digital systems have revolutionized data collection and offer great opportunities to use ice properties to infer processes that operate too slowly for proper laboratory investigation, leading toward a greatly improved understanding of the history of ice and its softness for further deformation (Wilen, 1999; Hansen and Wilen, in review; Wilen et al., this meeting). Patterns of nearest-neighbor c-axis orientations reveal the influence of nucleation-and-growth recrystallization (typically indicative of steady-state deformation) or polygonization. Combining these results with correlations between grain sizes and dust and chemical loadings reveals impurity effects on active processes. The relations between mean grain size and c-axis-fabric strength may show the importance of grain-boundary processes in deformation. Bubble sizes reveal climate conditions during firnification, and bubble shapes can provide information on in situ strain rates. These and many other possibilities should enhance our understanding of ice flow and of the paleoclimatic records archived in ice.

  1. Inferring interaction partners from protein sequences

    PubMed Central

    Bitbol, Anne-Florence; Dwyer, Robert S.; Colwell, Lucy J.; Wingreen, Ned S.

    2016-01-01

    Specific protein−protein interactions are crucial in the cell, both to ensure the formation and stability of multiprotein complexes and to enable signal transduction in various pathways. Functional interactions between proteins result in coevolution between the interaction partners, causing their sequences to be correlated. Here we exploit these correlations to accurately identify, from sequence data alone, which proteins are specific interaction partners. Our general approach, which employs a pairwise maximum entropy model to infer couplings between residues, has been successfully used to predict the 3D structures of proteins from sequences. Thus inspired, we introduce an iterative algorithm to predict specific interaction partners from two protein families whose members are known to interact. We first assess the algorithm’s performance on histidine kinases and response regulators from bacterial two-component signaling systems. We obtain a striking 0.93 true positive fraction on our complete dataset without any a priori knowledge of interaction partners, and we uncover the origin of this success. We then apply the algorithm to proteins from ATP-binding cassette (ABC) transporter complexes, and obtain accurate predictions in these systems as well. Finally, we present two metrics that accurately distinguish interacting protein families from noninteracting ones, using only sequence data. PMID:27663738
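
    The pairing step can be shown in miniature: given an inter-family compatibility score matrix (here random noise plus a planted signal on the true pairing), a one-to-one assignment of partners is computed with the Hungarian algorithm. The real method iterates between inferring residue couplings and re-pairing; only the assignment step is sketched, on an entirely synthetic score matrix.

      # Assign interaction partners from a synthetic compatibility score matrix.
      import numpy as np
      from scipy.optimize import linear_sum_assignment

      rng = np.random.default_rng(0)
      n = 20
      scores = rng.normal(size=(n, n))
      true_partner = rng.permutation(n)
      scores[np.arange(n), true_partner] += 3.0        # planted "coevolution" signal

      row, col = linear_sum_assignment(-scores)        # maximize total score
      print("fraction correctly paired:", np.mean(col == true_partner))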

  2. Inferring unstable equilibrium configurations from experimental data

    NASA Astrophysics Data System (ADS)

    Virgin, L. N.; Wiebe, R.; Spottswood, S. M.; Beberniss, T.

    2016-09-01

    This research considers the structural behavior of slender, mechanically buckled beams and panels of the type commonly found in aerospace structures. The specimens were deflected and then clamped in a rigid frame in order to exhibit snap-through. That is, the initial equilibrium and the buckled (snapped-through) equilibrium configurations both co-existed for the given clamped conditions. In order to transition between these two stable equilibrium configurations (for example, under the action of an externally applied load), it is necessary for the structural component to pass through an intermediate unstable equilibrium configuration. A sequence of sudden impacts was imparted to the system, of various strengths and at various locations. The goal of this impact force was to induce relatively intermediate-sized transients that effectively slowed down in the vicinity of the unstable equilibrium configuration. Thus, monitoring the velocity of the motion, and specifically its slowing down, should give an indication of the presence of an equilibrium configuration, even though it is unstable and not amenable to direct experimental observation. A digital image correlation (DIC) system was used in conjunction with an instrumented impact hammer to track trajectories, and statistical methods were used to infer the presence of unstable equilibria in both a beam and a panel.

  3. Bayesian inference in physics: case studies

    NASA Astrophysics Data System (ADS)

    Dose, V.

    2003-09-01

    This report describes the Bayesian approach to probability theory with emphasis on the application to the evaluation of experimental data. A brief summary of Bayesian principles is given, with a discussion of concepts, terminology and pitfalls. The step from Bayesian principles to data processing involves major numerical efforts. We address the presently employed procedures of numerical integration, which are mainly based on the Monte Carlo method. The case studies include examples from electron spectroscopies, plasma physics, ion beam analysis and mass spectrometry. Bayesian solutions to the ubiquitous problem of spectrum restoration are presented and advantages and limitations are discussed. Parameter estimation within the Bayesian framework is shown to allow for the incorporation of expert knowledge which in turn allows the treatment of under-determined problems which are inaccessible by the traditional maximum likelihood method. A unique and extremely valuable feature of Bayesian theory is the model comparison option. Bayesian model comparison rests on Ockham's razor which limits the complexity of a model to the amount necessary to explain the data without fitting noise. Finally we deal with the treatment of inconsistent data. They arise frequently in experimental work either from incorrect estimation of the errors associated with a measurement or alternatively from distortions of the measurement signal by some unrecognized spurious source. Bayesian data analysis sometimes meets with spectacular success. However, the approach cannot do wonders, but it does result in optimal robust inferences on the basis of all available and explicitly declared information.

  4. Mathematical inference in one point microrheology

    NASA Astrophysics Data System (ADS)

    Hohenegger, Christel; McKinley, Scott

    2016-11-01

    Pioneered by the work of Mason and Weitz, one-point passive microrheology has been successfully applied to obtaining estimates of the loss and storage modulus of viscoelastic fluids when the mean-square displacement obeys a local power law. Using numerical simulations of a fluctuating viscoelastic fluid model, we study the problem of recovering the mechanical parameters of the fluid's memory kernel using statistical inference on quantities such as mean-square displacements and increment auto-correlation functions. Seeking a better understanding of the influence of the assumptions made in the inversion process, we mathematically quantify the uncertainty in traditional one-point microrheology for simulated data and demonstrate that a large family of memory kernels yields the same statistical signature. We consider both simulated data obtained from a full viscoelastic fluid simulation of the unsteady Stokes equations with fluctuations and from a Generalized Langevin Equation of the particle's motion described by the same memory kernel. From the theory of inverse problems, we propose an alternative method that can be used to recover information about the loss and storage modulus and discuss its limitations and uncertainties. NSF-DMS 1412998.

  5. BAYESIAN INFERENCE OF CMB GRAVITATIONAL LENSING

    SciTech Connect

    Anderes, Ethan; Wandelt, Benjamin D.; Lavaux, Guilhem

    2015-08-01

    The Planck satellite, along with several ground-based telescopes, has mapped the cosmic microwave background (CMB) at sufficient resolution and signal-to-noise so as to allow a detection of the subtle distortions due to the gravitational influence of the intervening matter distribution. A natural modeling approach is to write a Bayesian hierarchical model for the lensed CMB in terms of the unlensed CMB and the lensing potential. So far there has been no feasible algorithm for inferring the posterior distribution of the lensing potential from the lensed CMB map. We propose a solution that allows efficient Markov Chain Monte Carlo sampling from the joint posterior of the lensing potential and the unlensed CMB map using the Hamiltonian Monte Carlo technique. The main conceptual step in the solution is a re-parameterization of CMB lensing in terms of the lensed CMB and the “inverse lensing” potential. We demonstrate a fast implementation on simulated data, including noise and a sky cut, that uses a further acceleration based on a very mild approximation of the inverse lensing potential. We find that the resulting Markov Chain has short correlation lengths and excellent convergence properties, making it promising for applications to high-resolution CMB data sets in the future.
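
    For readers unfamiliar with the sampling technique named above, a generic Hamiltonian Monte Carlo step on a toy two-dimensional Gaussian target is sketched below; the step size, trajectory length and target are arbitrary choices, and nothing of the lensing re-parameterization is reproduced.

      # Generic Hamiltonian Monte Carlo on a correlated 2-D Gaussian target.
      import numpy as np

      rng = np.random.default_rng(0)
      cov = np.array([[1.0, 0.8], [0.8, 1.0]])
      prec = np.linalg.inv(cov)

      def U(q):                    # negative log target density (up to a constant)
          return 0.5 * q @ prec @ q

      def grad_U(q):
          return prec @ q

      def hmc_step(q, step=0.15, n_leap=20):
          p = rng.normal(size=q.shape)                 # resample momentum
          q_new, p_new = q.copy(), p.copy()
          p_new -= 0.5 * step * grad_U(q_new)          # leapfrog integration
          for _ in range(n_leap - 1):
              q_new += step * p_new
              p_new -= step * grad_U(q_new)
          q_new += step * p_new
          p_new -= 0.5 * step * grad_U(q_new)
          h_old = U(q) + 0.5 * p @ p
          h_new = U(q_new) + 0.5 * p_new @ p_new
          return q_new if np.log(rng.uniform()) < h_old - h_new else q

      q, samples = np.zeros(2), []
      for _ in range(3000):
          q = hmc_step(q)
          samples.append(q)
      print(np.cov(np.array(samples[500:]).T).round(2))   # should approach cov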

  6. Probabilistic learning and inference in schizophrenia

    PubMed Central

    Averbeck, Bruno B.; Evans, Simon; Chouhan, Viraj; Bristow, Eleanor; Shergill, Sukhwinder S.

    2010-01-01

    Patients with schizophrenia make decisions on the basis of less evidence when required to collect information to make an inference, a behaviour often called jumping to conclusions. The underlying basis for this behaviour remains controversial. We examined the cognitive processes underpinning this finding by testing subjects on the beads task, which has been used previously to elicit jumping-to-conclusions behaviour, and a stochastic sequence learning task with a similar decision-theoretic structure. During the sequence learning task, subjects had to learn a sequence of button presses while receiving noisy feedback on their choices. We fit a Bayesian decision-making model to the sequence task and compared model parameters to the choice behaviour in the beads task in both patients and healthy subjects. We found that patients did show a jumping-to-conclusions style, and those who picked early in the beads task tended to learn less from positive feedback in the sequence task. This favours the interpretation that patients select early because they have a low threshold for making decisions and make choices on the basis of relatively little evidence. PMID:20810252
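
    The beads task itself has a textbook Bayesian formalization, sketched below: two urns with complementary colour ratios and a sequential posterior update after each draw; a jumping-to-conclusions style corresponds to deciding once this posterior crosses a low threshold. This is the standard update, not the fitted decision model from the study.

      # Sequential posterior update for the beads task (85:15 urns).
      p = 0.85                     # proportion of the majority colour in urn A
      draws = [1, 1, 0, 1, 1]      # 1 = colour favouring urn A, 0 = the other colour

      posterior_A = 0.5            # equal prior on the two urns
      for i, d in enumerate(draws, start=1):
          like_A = p if d == 1 else 1 - p
          like_B = 1 - p if d == 1 else p
          posterior_A = like_A * posterior_A / (
              like_A * posterior_A + like_B * (1 - posterior_A))
          print(f"after draw {i}: P(urn A) = {posterior_A:.3f}")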

  7. Atomic Inference from Weak Gravitational Lensing Data

    SciTech Connect

    Marshall, Phil; /KIPAC, Menlo Park

    2005-12-14

    We present a novel approach to reconstructing the projected mass distribution from the sparse and noisy weak gravitational lensing shear data. The reconstructions are regularized via the knowledge gained from numerical simulations of clusters, with trial mass distributions constructed from n NFW profile ellipsoidal components. The parameters of these "atoms" are distributed a priori as in the simulated clusters. Sampling the mass distributions from the atom parameter probability density function allows estimates of the properties of the mass distribution to be generated, with error bars. The appropriate number of atoms is inferred from the data itself via the Bayesian evidence, and is typically found to be small, reflecting the quality of the data. Ensemble average mass maps are found to be robust to the details of the noise realization, and succeed in recovering the demonstration input mass distribution (from a realistic simulated cluster) over a wide range of scales. As an application of such a reliable mapping algorithm, we comment on the residuals of the reconstruction and the implications for predicting convergence and shear at specific points on the sky.

  8. Inference by replication in densely connected systems

    SciTech Connect

    Neirotti, Juan P.; Saad, David

    2007-10-15

    An efficient Bayesian inference method for problems that can be mapped onto dense graphs is presented. The approach is based on message passing where messages are averaged over a large number of replicated variable systems exposed to the same evidential nodes. An assumption about the symmetry of the solutions is required for carrying out the averages; here we extend the previous derivation based on a replica-symmetric (RS)-like structure to include a more complex one-step replica-symmetry-breaking-like (1RSB-like) ansatz. To demonstrate the potential of the approach, it is employed for studying critical properties of the Ising linear perceptron and for multiuser detection in code division multiple access (CDMA) under different noise models. Results obtained under the RS assumption in the noncritical regime give rise to a highly efficient signal detection algorithm in the context of CDMA, while in the critical regime one observes a first-order transition line that ends in a continuous phase transition point. Finite size effects are also observed. While the 1RSB ansatz is not required for the original problems, it was applied to the CDMA signal detection problem with a more complex noise model that exhibits RSB behavior, resulting in an improvement in performance.

  9. Inferring human mobility using communication patterns.

    PubMed

    Palchykov, Vasyl; Mitrović, Marija; Jo, Hang-Hyun; Saramäki, Jari; Pan, Raj Kumar

    2014-08-22

    Understanding the patterns of mobility of individuals is crucial for a number of reasons, from city planning to disaster management. There are two common ways of quantifying the amount of travel between locations: by direct observations that often involve privacy issues, e.g., tracking mobile phone locations, or by estimations from models. Typically, such models build on accurate knowledge of the population size at each location. However, when this information is not readily available, their applicability is rather limited. As mobile phones are ubiquitous, our aim is to investigate if mobility patterns can be inferred from aggregated mobile phone call data alone. Using data released by Orange for Ivory Coast, we show that human mobility is well predicted by a simple model based on the frequency of mobile phone calls between two locations and their geographical distance. We argue that the strength of the model comes from directly incorporating the social dimension of mobility. Furthermore, as only aggregated call data is required, the model helps to avoid potential privacy problems.
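
    The abstract states only that travel volume is predicted from call frequency and geographical distance; one plausible parameterisation, not necessarily the one used in the paper, is a power-law form T_ij ~ C_ij^a * d_ij^b, which can be fitted by log-log least squares as sketched below on synthetic location-pair data.

      import numpy as np

      rng = np.random.default_rng(2)
      n_pairs = 200
      calls = rng.lognormal(mean=3.0, sigma=1.0, size=n_pairs)                  # C_ij
      dist = rng.uniform(5.0, 300.0, size=n_pairs)                              # d_ij (km)
      travel = 2.0 * calls**0.9 * dist**-0.7 * rng.lognormal(0, 0.2, n_pairs)   # toy "observed" fluxes

      X = np.column_stack([np.ones(n_pairs), np.log(calls), np.log(dist)])
      coef, *_ = np.linalg.lstsq(X, np.log(travel), rcond=None)
      print("fitted exponents (calls, distance):", coef[1], coef[2])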

  10. Inferring human mobility using communication patterns

    NASA Astrophysics Data System (ADS)

    Palchykov, Vasyl; Mitrović, Marija; Jo, Hang-Hyun; Saramäki, Jari; Pan, Raj Kumar

    2014-08-01

    Understanding the patterns of mobility of individuals is crucial for a number of reasons, from city planning to disaster management. There are two common ways of quantifying the amount of travel between locations: by direct observations that often involve privacy issues, e.g., tracking mobile phone locations, or by estimations from models. Typically, such models build on accurate knowledge of the population size at each location. However, when this information is not readily available, their applicability is rather limited. As mobile phones are ubiquitous, our aim is to investigate if mobility patterns can be inferred from aggregated mobile phone call data alone. Using data released by Orange for Ivory Coast, we show that human mobility is well predicted by a simple model based on the frequency of mobile phone calls between two locations and their geographical distance. We argue that the strength of the model comes from directly incorporating the social dimension of mobility. Furthermore, as only aggregated call data is required, the model helps to avoid potential privacy problems.

  11. Human Inferences about Sequences: A Minimal Transition Probability Model

    PubMed Central

    2016-01-01

    The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations includes explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects, and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge. PMID:28030543
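
    The full model in the paper is Bayesian with a single free parameter; the sketch below captures the core computation in a simplified form, tracking leaky counts of the two transition types in a binary sequence and reporting the surprise of each observation. The leak value, prior count and example sequence are illustrative assumptions, not the authors' settings.

      import math

      def transition_surprise(seq, leak=0.1, prior=1.0):
          # counts[prev][next], initialised with a symmetric prior count
          counts = {0: [prior, prior], 1: [prior, prior]}
          surprises = []
          for prev, nxt in zip(seq, seq[1:]):
              p = counts[prev][nxt] / sum(counts[prev])      # current estimate of p(next | prev)
              surprises.append(-math.log2(p))                # surprise of this observation (bits)
              for a in (0, 1):                               # exponential forgetting (the single leak)
                  counts[a] = [(1.0 - leak) * c for c in counts[a]]
              counts[prev][nxt] += 1.0
          return surprises

      seq = [0, 1, 0, 1, 1, 1, 0, 0, 1, 0, 1, 1]
      print([round(s, 2) for s in transition_surprise(seq)])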

  12. Toddlers infer higher-order relational principles in causal learning.

    PubMed

    Walker, Caren M; Gopnik, Alison

    2014-01-01

    Children make inductive inferences about the causal properties of individual objects from a very young age. When can they infer higher-order relational properties? In three experiments, we examined 18- to 30-month-olds' relational inferences in a causal task. Results suggest that at this age, children are able to infer a higher-order relational causal principle from just a few observations and use this inference to guide their own subsequent actions and bring about a novel causal outcome. Moreover, the children passed a revised version of the relational match-to-sample task that has proven very difficult for nonhuman primates. The findings are considered in light of their implications for understanding the nature of relational and causal reasoning, and their evolutionary origins.

  13. Wisdom of crowds for robust gene network inference

    PubMed Central

    Marbach, Daniel; Costello, James C.; Küffner, Robert; Vega, Nicci; Prill, Robert J.; Camacho, Diogo M.; Allison, Kyle R.; Kellis, Manolis; Collins, James J.; Stolovitzky, Gustavo

    2012-01-01

    Reconstructing gene regulatory networks from high-throughput data is a long-standing problem. Through the DREAM project (Dialogue on Reverse Engineering Assessment and Methods), we performed a comprehensive blind assessment of over thirty network inference methods on Escherichia coli, Staphylococcus aureus, Saccharomyces cerevisiae, and in silico microarray data. We characterize the performance, data requirements, and inherent biases of different inference approaches, offering guidelines for both algorithm application and development. We observe that no single inference method performs optimally across all datasets. In contrast, integration of predictions from multiple inference methods shows robust and high performance across diverse datasets. We thereby construct high-confidence networks for E. coli and S. aureus, each comprising ~1700 transcriptional interactions at an estimated precision of 50%. We experimentally tested 53 novel interactions in E. coli, of which 23 were supported (43%). Our results establish community-based methods as a powerful and robust tool for the inference of transcriptional gene regulatory networks. PMID:22796662
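
    The integration step behind such community predictions can be sketched as simple rank averaging of edge confidences across methods (a Borda-style aggregation); the edge names and confidence scores below are made up for illustration and do not come from the DREAM data.

      import numpy as np

      edges = ["G1->G2", "G1->G3", "G2->G3", "G3->G4"]
      # Toy confidence scores from three hypothetical inference methods (rows).
      scores = np.array([[0.9, 0.2, 0.5, 0.1],
                         [0.4, 0.8, 0.6, 0.3],
                         [0.7, 0.6, 0.9, 0.2]])

      ranks = scores.argsort(axis=1).argsort(axis=1)      # per-method rank (higher score -> higher rank)
      community = ranks.mean(axis=0)                      # average rank of each edge across methods
      order = np.argsort(-community)
      print([edges[i] for i in order])                    # community-ranked edge list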

  14. Quantum-Like Representation of Non-Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.

    2013-01-01

    This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. There are experimental studies whose statistical data cannot be described by classical probability theory, and the process of decision making that generates these data cannot be reduced to classical Bayesian inference. To address this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented classical Bayesian inference in a natural way within the framework of quantum mechanics. Using this representation, in this paper we discuss non-Bayesian (irrational) inference that is biased by effects such as quantum interference. Further, we describe the "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of the usual Bayesian inference.
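
    The departure from classical updating in this quantum-like literature is usually written as a modified formula of total probability for a dichotomous conditioning variable A, with an interference term that vanishes in the classical (Bayesian) case; the form below is a restatement of that standard expression, not a quotation from this paper.

      P(B=b) = \sum_{a \in \{0,1\}} P(A=a)\,P(B=b \mid A=a)
               + 2\cos\theta_b \sqrt{\prod_{a \in \{0,1\}} P(A=a)\,P(B=b \mid A=a)}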

  15. RulNet: A Web-Oriented Platform for Regulatory Network Inference, Application to Wheat –Omics Data

    PubMed Central

    Vincent, Jonathan; Martre, Pierre; Gouriou, Benjamin; Ravel, Catherine; Dai, Zhanwu; Petit, Jean-Marc; Pailloux, Marie

    2015-01-01

    With the increasing amount of –omics data available, a particular effort has to be made to provide suitable analysis tools. A major challenge is that of unraveling the molecular regulatory networks from massive and heterogeneous datasets. Here we describe RulNet, a web-oriented platform dedicated to the inference and analysis of regulatory networks (RNs) from qualitative and quantitative –omics data by means of rule discovery. Queries for rule discovery can be written in an extended form of the RQL query language, which has a syntax similar to SQL. RulNet also offers users interactive features that progressively adjust and refine the inferred networks. In this paper, we present a functional characterization of RulNet and compare inferred networks with correlation-based approaches. The performance of RulNet has been evaluated using the three benchmark datasets used for the transcriptional network inference challenge DREAM5. Overall, RulNet performed as well as the best methods that participated in this challenge, and it behaved more consistently when compared across the three datasets. Finally, we assessed the suitability of RulNet to analyze experimental –omics data and to infer regulatory networks involved in the response to nitrogen (N) and sulfur (S) supply in wheat (Triticum aestivum L.) grains. The results highlight putative actors governing the response to nitrogen and sulfur supply in wheat grains. We evaluate the main characteristics and features of RulNet as an all-in-one solution for RN inference, visualization and editing. Simple yet powerful RulNet queries allowed RNs involved in the adaptation of wheat grain to N and S supply to be discovered. We demonstrate the effectiveness and suitability of RulNet as a platform for the analysis of RNs involving different types of –omics data. The results are promising since they are consistent with what was previously established by the scientific community. PMID:25993562

  16. METABOLIC INTERMEDIATES IN ADAPTIVE FERMENTATION OF GALACTOSE BY YEAST

    PubMed Central

    Reiner, John M.

    1947-01-01

    Pyruvic acid, which is known to be an intermediate of glucose fermentation, was added to yeast during adaptation to galactose fermentation. It was found to neutralize the inhibition by sodium fluoride, and to decrease the apparent time of adaptation from 90 to about 45 or 60 minutes. In control experiments, it was shown that intact yeast is unable to ferment or decarboxylate pyruvate alone to any appreciable extent, although it oxidizes the compound readily. Experiments in which galactose and pyruvate were added at various times and in different orders were used to eliminate the possible complications of the rates at which these compounds penetrate the cells. Under these conditions, it was not possible to reduce the time of adaptation below 45 minutes. It was concluded that the rôle of added pyruvate was to serve as a source of acetaldehyde, which in turn could accept hydrogen and be reduced to alcohol. Substances, such as triose phosphate, which could serve as hydrogen donors were not produced from galactose in appreciable quantities until 45 minutes had elapsed. This time was therefore inferred to be the true adaptation time, during which the first synthesis of adaptive enzymes occurred. Some determinations of the distribution of phosphorylated intermediates at various stages during the adaptive process were carried out. It was found that ATP, which usually serves to phosphorylate hexoses, accumulates during the preadaptive phase, diminishes rapidly after 60 minutes, and subsequently increases once more. The source of the ATP phosphate appeared to be PPA or triose phosphate initially present in the cells. It was inferred that the adaptive enzyme was concerned with the phosphorylation of galactose and the conversion of the phosphate ester to a glucose ester, which could then be fermented by the normal enzymes of the cell. Added ATP was found to stimulate adaptation to a considerable extent, but did not shorten the time of adaptation below 75 minutes. This seemed consistent with the r

  17. Inferring climate variability from skewed proxy records

    NASA Astrophysics Data System (ADS)

    Emile-Geay, J.; Tingley, M.

    2013-12-01

    Many paleoclimate analyses assume a linear relationship between the proxy and the target climate variable, and that both the climate quantity and the errors follow normal distributions. An ever-increasing number of proxy records, however, are better modeled using distributions that are heavy-tailed, skewed, or otherwise non-normal, on account of the proxies reflecting non-normally distributed climate variables, or having non-linear relationships with a normally distributed climate variable. The analysis of such proxies requires a different set of tools, and this work serves as a cautionary tale on the danger of drawing conclusions about the underlying climate from applying classic statistical procedures to heavily skewed proxy records. Inspired by runoff proxies, we consider an idealized proxy characterized by a nonlinear, thresholded relationship with climate, and describe three approaches to using such a record to infer past climate: (i) applying standard methods commonly used in the paleoclimate literature, without considering the non-linearities inherent to the proxy record; (ii) applying a power transform prior to using these standard methods; (iii) constructing a Bayesian model to invert the mechanistic relationship between the climate and the proxy. We find that neglecting the skewness in the proxy leads to erroneous conclusions and often exaggerates changes in climate variability between different time intervals. In contrast, an explicit treatment of the skewness, using either power transforms or a Bayesian inversion of the mechanistic model for the proxy, yields significantly better estimates of past climate variations. We apply these insights in two paleoclimate settings: (1) a classical sedimentary record from Laguna Pallcacocha, Ecuador (Moy et al., 2002). Our results agree with the qualitative aspects of previous analyses of this record, but quantitative departures are evident and hold implications for how such records are interpreted, and
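
    Approach (ii) can be illustrated with synthetic data: a runoff-like proxy responding nonlinearly (thresholded and exponentiated) to a normally distributed climate variable exaggerates apparent changes in variability when analysed raw, and a log (power) transform reduces the distortion considerably. The response function, interval means and sample sizes below are assumptions for illustration only.

      import numpy as np

      rng = np.random.default_rng(3)

      def proxy(climate, threshold=0.0):
          # thresholded, convex response -> heavily right-skewed proxy values
          return np.exp(1.5 * np.clip(climate - threshold, 0.0, None))

      early = rng.normal(0.0, 1.0, 20000)     # two intervals with identical climate variance
      late = rng.normal(0.3, 1.0, 20000)      # only the mean shifts

      for label, transform in [("raw proxy", lambda v: v), ("log proxy", np.log)]:
          ratio = transform(proxy(early)).std() / transform(proxy(late)).std()
          print(label, "apparent variability ratio (early/late):", round(ratio, 2))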

  18. Inferring mental states from neuroimaging data: From reverse inference to large-scale decoding

    PubMed Central

    Poldrack, Russell A.

    2011-01-01

    A common goal of neuroimaging research is to use imaging data to identify the mental processes that are engaged when a subject performs a mental task. The use of reasoning from activation to mental functions, known as “reverse inference”, has been previously criticized on the basis that it does not take into account how selectively the area is activated by the mental process in question. In this Perspective, I outline the critique of informal reverse inference, and describe a number of new developments that provide the ability to more formally test the predictive power of neuroimaging data. PMID:22153367
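
    The formal point can be stated with Bayes' rule: the posterior probability that a mental process is engaged given an activation depends on how selective the activation is and on the base rate of the process, not on P(activation | process) alone. The numbers in the sketch below are hypothetical.

      def reverse_inference(p_act_given_proc, p_act_given_not, prior):
          # Posterior P(process | activation) from likelihoods and the base rate.
          numer = p_act_given_proc * prior
          denom = numer + p_act_given_not * (1.0 - prior)
          return numer / denom

      # A region activated by the process 80% of the time, but also 60% of the
      # time by other tasks, yields weak evidence despite the "80%" figure.
      print(reverse_inference(0.8, 0.6, prior=0.5))   # ~0.57
      print(reverse_inference(0.8, 0.1, prior=0.5))   # ~0.89 when activation is selective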

  19. Causal inference in biology networks with integrated belief propagation.

    PubMed

    Chang, Rui; Karr, Jonathan R; Schadt, Eric E

    2015-01-01

    Inferring causal relationships among molecular and higher-order phenotypes is a critical step in elucidating the complexity of living systems. Here we propose a novel method for inferring causality that is no longer constrained by the conditional-dependency arguments that limit the ability of statistical causal inference methods to resolve causal relationships within sets of graphical models that are Markov equivalent. Our method utilizes Bayesian belief propagation to infer the responses of perturbation events on molecular traits given a hypothesized graph structure. A distance measure between the inferred response distribution and the observed data is defined to assess the 'fitness' of the hypothesized causal relationships. To test our algorithm, we infer causal relationships within equivalence classes of gene networks, in which the possible functional interactions are assumed to be nonlinear, given synthetic microarray and RNA sequencing data. We also apply our method to infer causality in a real metabolic network containing a v-structure and a feedback loop. We show that our method can recapitulate the causal structure and recover the feedback loop from steady-state data alone, which conventional methods cannot.
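
    Only the scoring idea is sketched here, not the belief-propagation machinery: Markov-equivalent chain structures predict different downstream responses to a perturbation, and a distance between predicted and observed responses can separate them. The graphs, the toy propagation rule and the "observed" values are all illustrative assumptions.

      import numpy as np

      def predicted_response(graph, perturbed):
          # Propagate a unit perturbation along directed edges (toy rule, unit weights).
          nodes = ["A", "B", "C"]
          resp = {n: 0.0 for n in nodes}
          resp[perturbed] = 1.0
          for _ in nodes:                                   # enough passes for a 3-node chain
              for src, dst in graph:
                  resp[dst] = max(resp[dst], resp[src])
          return np.array([resp[n] for n in nodes])

      candidates = {"A->B->C": [("A", "B"), ("B", "C")],
                    "C->B->A": [("C", "B"), ("B", "A")]}
      observed = np.array([1.0, 0.9, 0.8])                  # hypothetical response to perturbing A

      for name, g in candidates.items():
          distance = np.sum((predicted_response(g, "A") - observed) ** 2)
          print(name, "distance:", round(distance, 2))      # lower distance = better "fitness"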

  20. Active inference and robot control: a case study.

    PubMed

    Pio-Lopez, Léo; Nizard, Ange; Friston, Karl; Pezzulo, Giovanni

    2016-09-01

    Active inference is a general framework for perception and action that is gaining prominence in computational and systems neuroscience but is less known outside these fields. Here, we discuss a proof-of-principle implementation of the active inference scheme for the control of the 7-DoF arm of a (simulated) PR2 robot. By manipulating visual and proprioceptive noise levels, we show under which conditions robot control under the active inference scheme is accurate. Besides accurate control, our analysis of the internal system dynamics (e.g. the dynamics of the hidden states that are inferred during inference) sheds light on key aspects of the framework, such as the quintessentially multimodal nature of control and the differential roles of proprioception and vision. In the discussion, we consider the potential importance of being able to implement active inference in robots. In particular, we briefly review the opportunities for modelling psychophysiological phenomena such as sensory attenuation and related failures of gain control, of the sort seen in Parkinson's disease. We also consider the fundamental difference between active inference and optimal control formulations, showing that in the former the heavy lifting shifts from solving a dynamical inverse problem to creating deep forward or generative models with dynamics, whose attracting sets prescribe desired behaviours.
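
    A minimal one-dimensional sketch of the control scheme described here (far simpler than the 7-DoF PR2 arm) is given below: perception updates the internal estimate by descending precision-weighted prediction errors, and action moves the "hand" so that sensations come to match the model's predictions. The precisions, gains and the one-dimensional plant are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(4)
      target_prior = 1.0              # prior belief about hand position (the "goal")
      pi_prop, pi_prior = 1.0, 0.5    # precisions of proprioception and of the prior
      dt, k_a = 0.01, 4.0

      x, mu = 0.0, 0.0                # true hand position and its internal estimate
      for _ in range(3000):
          s = x + 0.05 * rng.normal()             # noisy proprioceptive signal
          eps_s = s - mu                          # sensory prediction error
          eps_p = mu - target_prior               # prediction error w.r.t. the prior belief
          mu += dt * (pi_prop * eps_s - pi_prior * eps_p)   # perception: gradient of free energy
          a = -k_a * pi_prop * eps_s              # action: reduce sensory error (ds/da = +1)
          x += dt * a                             # acting moves the hand

      print(round(x, 2), round(mu, 2))            # both settle near target_prior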