Sample records for functional equivalency inferred

  1. Event-related potential correlates of emergent inference in human arbitrary relational learning.

    PubMed

    Wang, Ting; Dymond, Simon

    2013-01-01

    Two experiments investigated the functional-anatomical correlates of cognition supporting untrained, emergent relational inference in a stimulus equivalence task. In Experiment 1, after learning a series of conditional relations involving words and pseudowords, participants performed a relatedness task during which EEG was recorded. Behavioural performance was faster and more accurate on untrained, indirectly related symmetry (i.e., learn AB and infer BA) and equivalence trials (i.e., learn AB and AC and infer CB) than on unrelated trials, regardless of whether or not a formal test for stimulus equivalence relations had been conducted. Consistent with previous results, event-related potentials (ERPs) evoked by trained and emergent trials at parietal and occipital sites differed only for those participants who had not received a prior equivalence test. Experiment 2 further replicated and extended these behavioural and ERP findings using arbitrary symbols as stimuli and demonstrated time and frequency differences for trained and untrained relatedness trials. Overall, the findings demonstrate convincingly the ERP correlates of intra-experimentally established stimulus equivalence relations consisting entirely of arbitrary symbols and offer support for a contemporary cognitive-behavioural model of symbolic categorisation and relational inference. Copyright © 2012 Elsevier B.V. All rights reserved.

  2. Causal inference in biology networks with integrated belief propagation.

    PubMed

    Chang, Rui; Karr, Jonathan R; Schadt, Eric E

    2015-01-01

    Inferring causal relationships among molecular and higher order phenotypes is a critical step in elucidating the complexity of living systems. Here we propose a novel method for inferring causality that is no longer constrained by the conditional dependency arguments that limit the ability of statistical causal inference methods to resolve causal relationships within sets of graphical models that are Markov equivalent. Our method utilizes Bayesian belief propagation to infer the responses of perturbation events on molecular traits given a hypothesized graph structure. A distance measure between the inferred response distribution and the observed data is defined to assess the 'fitness' of the hypothesized causal relationships. To test our algorithm, we infer causal relationships within equivalence classes of gene networks in which the form of the possible functional interactions is assumed to be nonlinear, given synthetic microarray and RNA sequencing data. We also apply our method to infer causality in a real metabolic network with a v-structure and a feedback loop. We show that our method can recapitulate the causal structure and recover the feedback loop from steady-state data alone, which conventional methods cannot.
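
    The structure-scoring idea in this abstract can be sketched schematically. The sketch below is illustrative only: predict_response is a hypothetical stand-in for the Bayesian belief-propagation step, not the authors' implementation, and the distance is a plain Euclidean norm.

      import numpy as np

      def structure_score(graph, perturbation, observed, predict_response):
          # Smaller distance = better "fitness" of the hypothesized causal structure.
          predicted = predict_response(graph, perturbation)  # inferred response distribution
          return np.linalg.norm(np.asarray(predicted) - np.asarray(observed))

      def select_structure(candidates, perturbation, observed, predict_response):
          # Rank members of a Markov equivalence class by how well their predicted
          # perturbation responses match the measurements.
          return min(candidates,
                     key=lambda g: structure_score(g, perturbation, observed, predict_response))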

  3. Inferring consistent functional interaction patterns from natural stimulus FMRI data

    PubMed Central

    Sun, Jiehuan; Hu, Xintao; Huang, Xiu; Liu, Yang; Li, Kaiming; Li, Xiang; Han, Junwei; Guo, Lei

    2014-01-01

    There has been increasing interest in the neuroimaging field in how the human brain responds to natural stimuli such as video watching. Along this direction, this paper presents our effort in inferring consistent and reproducible functional interaction patterns under the natural stimulus of video watching among known functional brain regions identified by task-based fMRI. Then, we applied and compared four statistical approaches, including Bayesian network modeling with searching algorithms: greedy equivalence search (GES), Peter and Clark (PC) analysis, independent multiple greedy equivalence search (IMaGES), and the commonly used Granger causality analysis (GCA), to infer consistent and reproducible functional interaction patterns among these brain regions. It is interesting that a number of reliable and consistent functional interaction patterns were identified by the GES, PC and IMaGES algorithms in different participating subjects when they watched multiple video shots of the same semantic category. These interaction patterns are meaningful given current neuroscience knowledge and are reasonably reproducible across different brains and video shots. In particular, these consistent functional interaction patterns are supported by structural connections derived from diffusion tensor imaging (DTI) data, suggesting the structural underpinnings of consistent functional interactions. Our work demonstrates that specific consistent patterns of functional interactions among relevant brain regions might reflect the brain's fundamental mechanisms of online processing and comprehension of video messages. PMID:22440644

  4. Evolution in Mind: Evolutionary Dynamics, Cognitive Processes, and Bayesian Inference.

    PubMed

    Suchow, Jordan W; Bourgin, David D; Griffiths, Thomas L

    2017-07-01

    Evolutionary theory describes the dynamics of population change in settings affected by reproduction, selection, mutation, and drift. In the context of human cognition, evolutionary theory is most often invoked to explain the origins of capacities such as language, metacognition, and spatial reasoning, framing them as functional adaptations to an ancestral environment. However, evolutionary theory is useful for understanding the mind in a second way: as a mathematical framework for describing evolving populations of thoughts, ideas, and memories within a single mind. In fact, deep correspondences exist between the mathematics of evolution and of learning, with perhaps the deepest being an equivalence between certain evolutionary dynamics and Bayesian inference. This equivalence permits reinterpretation of evolutionary processes as algorithms for Bayesian inference and has relevance for understanding diverse cognitive capacities, including memory and creativity. Copyright © 2017 Elsevier Ltd. All rights reserved.
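
    The equivalence noted in this abstract can be made concrete in a few lines. The numbers below are illustrative, not from the paper: one step of discrete replicator dynamics, with fitness playing the role of likelihood, is algebraically the same update as Bayes' rule.

      import numpy as np

      prior = np.array([0.5, 0.3, 0.2])    # population frequencies / prior over hypotheses
      fitness = np.array([1.2, 0.8, 1.0])  # relative fitness / likelihood of the data

      replicator_step = prior * fitness / np.sum(prior * fitness)  # discrete selection update
      bayes_posterior = prior * fitness / np.sum(prior * fitness)  # Bayes' rule
      print(np.allclose(replicator_step, bayes_posterior))         # True: the same update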

  5. Functional equivalency inferred from "authoritative sources" in networks of homologous proteins.

    PubMed

    Natarajan, Shreedhar; Jakobsson, Eric

    2009-06-12

    A one-on-one mapping of protein functionality across different species is a critical component of comparative analysis. This paper presents a heuristic algorithm for discovering the Most Likely Functional Counterparts (MoLFunCs) of a protein, based on simple concepts from network theory. A key feature of our algorithm is utilization of the user's knowledge to assign high confidence to selected functional identification. We show use of the algorithm to retrieve functional equivalents for 7 membrane proteins, from an exploration of almost 40 genomes from multiple online resources. We verify the functional equivalency of our dataset through a series of tests that include sequence, structure and function comparisons. Comparison is made to the OMA methodology, which also identifies one-on-one mapping between proteins from different species. Based on that comparison, we believe that incorporation of the user's knowledge as a key aspect of the technique adds value to purely statistical formal methods.

  6. Functional Equivalency Inferred from “Authoritative Sources” in Networks of Homologous Proteins

    PubMed Central

    Natarajan, Shreedhar; Jakobsson, Eric

    2009-01-01

    A one-on-one mapping of protein functionality across different species is a critical component of comparative analysis. This paper presents a heuristic algorithm for discovering the Most Likely Functional Counterparts (MoLFunCs) of a protein, based on simple concepts from network theory. A key feature of our algorithm is utilization of the user's knowledge to assign high confidence to selected functional identification. We show use of the algorithm to retrieve functional equivalents for 7 membrane proteins, from an exploration of almost 40 genomes from multiple online resources. We verify the functional equivalency of our dataset through a series of tests that include sequence, structure and function comparisons. Comparison is made to the OMA methodology, which also identifies one-on-one mapping between proteins from different species. Based on that comparison, we believe that incorporation of the user's knowledge as a key aspect of the technique adds value to purely statistical formal methods. PMID:19521530

  7. Bacterial growth laws reflect the evolutionary importance of energy efficiency.

    PubMed

    Maitra, Arijit; Dill, Ken A

    2015-01-13

    We are interested in the balance of energy and protein synthesis in bacterial growth. How has evolution optimized this balance? We describe an analytical model that leverages extensive literature data on growth laws to infer the underlying fitness landscape and to draw inferences about what evolution has optimized in Escherichia coli. Is E. coli optimized for growth speed, energy efficiency, or some other property? Experimental data show that at its replication speed limit, E. coli produces about four mass equivalents of nonribosomal proteins for every mass equivalent of ribosomes. This ratio can be explained if the cell's fitness function is the energy efficiency of cells under fast growth conditions, indicating a tradeoff between the high energy costs of ribosomes under fast growth and the high energy costs of turning over nonribosomal proteins under slow growth. This model gives insight into some of the complex nonlinear relationships between energy utilization and ribosomal and nonribosomal production as a function of cell growth conditions.

  8. Conceptual influences on category-based induction

    PubMed Central

    Gelman, Susan A.; Davidson, Natalie S.

    2013-01-01

    One important function of categories is to permit rich inductive inferences. Prior work shows that children use category labels to guide their inductive inferences. However, there are competing theories to explain this phenomenon, differing in the roles attributed to conceptual information versus perceptual similarity. Seven experiments with 4- to 5-year-old children and adults (N = 344) test these theories by teaching categories for which category membership and perceptual similarity are in conflict, and varying the conceptual basis of the novel categories. Results indicate that for non-natural kind categories that have little conceptual coherence, children make inferences based on perceptual similarity, whereas adults make inferences based on category membership. In contrast, for basic- and ontological-level categories that have a principled conceptual basis, children and adults alike make use of category membership more than perceptual similarity as the basis of their inferences. These findings provide evidence in favor of the role of conceptual information in preschoolers’ inferences, and further demonstrate that labeled categories are not all equivalent; they differ in their inductive potential. PMID:23517863

  9. Neural correlates of species-typical illogical cognitive bias in human inference.

    PubMed

    Ogawa, Akitoshi; Yamazaki, Yumiko; Ueno, Kenichi; Cheng, Kang; Iriki, Atsushi

    2010-09-01

    The ability to think logically is a hallmark of human intelligence, yet our innate inferential abilities are marked by implicit biases that often lead to illogical inference. For example, given AB ("if A then B"), people frequently but fallaciously infer the inverse, BA. This mode of inference, called symmetry, is logically invalid because, although it may be true, it is not necessarily true. Given pairs of conditional relations, such as AB and BC, humans reflexively perform two additional modes of inference: transitivity, whereby one (validly) infers AC; and equivalence, whereby one (invalidly) infers CA. In sharp contrast, nonhuman animals can handle transitivity but can rarely be made to acquire symmetry or equivalence. In the present study, human subjects performed logical and illogical inferences about the relations between abstract, visually presented figures while their brain activation was monitored with fMRI. The prefrontal, medial frontal, and intraparietal cortices were activated during all modes of inference. Additional activation in the precuneus and posterior parietal cortex was observed during transitivity and equivalence, which may reflect the need to retrieve the intermediate stimulus (B) from memory. Surprisingly, the patterns of brain activation in illogical and logical inference were very similar. We conclude that the observed inference-related fronto-parietal network is adapted for processing categorical, but not logical, structures of association among stimuli. Humans might prefer categorization over the memorization of logical structures in order to minimize the cognitive working memory load when processing large volumes of information.

  10. Towards the unification of inference structures in medical diagnostic tasks.

    PubMed

    Mira, J; Rives, J; Delgado, A E; Martínez, R

    1998-01-01

    The central purpose of artificial intelligence applied to medicine is to develop models for diagnosis and therapy planning at the knowledge level, in the Newell sense, and software environments to facilitate the reduction of these models to the symbol level. The usual methodology (KADS, Common-KADS, GAMES, HELIOS, Protégé, etc) has been to develop libraries of generic tasks and reusable problem-solving methods with explicit ontologies. The principal problem which clinicians have with these methodological developments concerns the diversity and complexity of new terms whose meaning is not sufficiently clear, precise, unambiguous and consensual for them to be accessible in the daily clinical environment. As a contribution to the solution of this problem, we develop in this article the conjecture that one inference structure is enough to describe the set of analysis tasks associated with medical diagnoses. To this end, we first propose a modification of the systematic diagnostic inference scheme to obtain an analysis generic task and then compare it with the monitoring and the heuristic classification task inference schemes using as comparison criteria the compatibility of domain roles (data structures), the similarity in the inferences, and the commonality in the set of assumptions which underlie the functionally equivalent models. The equivalences proposed are illustrated with several examples. Note that though our ongoing work aims to simplify the methodology and to increase the precision of the terms used, the proposal presented here should be viewed more in the nature of a conjecture.

  11. Backward renormalization-group inference of cortical dipole sources and neural connectivity efficacy

    NASA Astrophysics Data System (ADS)

    Amaral, Selene da Rocha; Baccalá, Luiz A.; Barbosa, Leonardo S.; Caticha, Nestor

    2017-06-01

    Proper neural connectivity inference has become essential for understanding cognitive processes associated with human brain function. Its efficacy is often hampered by the curse of dimensionality. In the case of the electroencephalogram, a noninvasive electrophysiological monitoring technique for recording the electrical activity of the brain, a possible way around this is to replace multichannel electrode information with dipole-reconstructed data. We use a method based on maximum entropy and the renormalization group to infer the position of the sources, whose success hinges on transmitting information from low- to high-resolution representations of the cortex. The performance of this method compares favorably to other available source inference algorithms, which are ranked here in terms of their performance with respect to directed connectivity inference by using artificially generated dynamic data. We examine some representative scenarios comprising different numbers of dynamically connected dipoles over distinct cortical surface positions and under different sensor noise impairment levels. The overall conclusion is that inverse problem solutions do not affect the correct inference of the direction of the flow of information as long as the equivalent dipole sources are correctly found.

  12. Modularity-like objective function in annotated networks

    NASA Astrophysics Data System (ADS)

    Xie, Jia-Rong; Wang, Bing-Hong

    2017-12-01

    We ascertain the modularity-like objective function whose optimization is equivalent to the maximum likelihood in annotated networks. We demonstrate that the modularity-like objective function is a linear combination of modularity and conditional entropy. In contrast with statistical inference methods, in our method, the influence of the metadata is adjustable; when its influence is strong enough, the metadata can be recovered. Conversely, when it is weak, the detection may correspond to another partition. Between the two, there is a transition. This paper provides a concept for expanding the scope of modularity methods.
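
    As a rough illustration of the objective described here (the weighting scheme and entropy convention below are assumptions, not the paper's exact formula), one can combine structural modularity with the conditional entropy of node metadata given the community partition, with an adjustable metadata weight lam:

      import math
      from collections import Counter
      import networkx as nx

      def conditional_entropy(metadata, communities):
          # H(metadata | community), in nats, over all nodes.
          n = len(metadata)
          h = 0.0
          for c in set(communities.values()):
              members = [v for v in communities if communities[v] == c]
              counts = Counter(metadata[v] for v in members)
              for k in counts.values():
                  p = k / len(members)
                  h -= (len(members) / n) * p * math.log(p)
          return h

      def annotated_objective(G, communities, metadata, lam=1.0):
          # Structural modularity minus lam times H(metadata | community);
          # larger lam gives the metadata more influence on the detected partition.
          parts = {}
          for v, c in communities.items():
              parts.setdefault(c, set()).add(v)
          Q = nx.algorithms.community.modularity(G, list(parts.values()))
          return Q - lam * conditional_entropy(metadata, communities)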

  13. Stochastic reconstructions of spectral functions: Application to lattice QCD

    NASA Astrophysics Data System (ADS)

    Ding, H.-T.; Kaczmarek, O.; Mukherjee, Swagato; Ohno, H.; Shu, H.-T.

    2018-05-01

    We present a detailed study of the applications of two stochastic approaches, stochastic optimization method (SOM) and stochastic analytical inference (SAI), to extract spectral functions from Euclidean correlation functions. SOM has the advantage that it does not require prior information. On the other hand, SAI is a more generalized method based on Bayesian inference. Under mean field approximation SAI reduces to the often-used maximum entropy method (MEM) and for a specific choice of the prior SAI becomes equivalent to SOM. To test the applicability of these two stochastic methods to lattice QCD, we first apply them to various reasonably chosen model correlation functions and present detailed comparisons of the reconstructed spectral functions obtained from SOM, SAI and MEM. Next, we present similar studies for charmonia correlation functions obtained from lattice QCD computations using clover-improved Wilson fermions on large, fine, isotropic lattices at 0.75 and 1.5 Tc, Tc being the deconfinement transition temperature of a pure gluon plasma. We find that SAI and SOM give results consistent with MEM at these two temperatures.

  14. Theory of mind broad and narrow: reasoning about social exchange engages ToM areas, precautionary reasoning does not.

    PubMed

    Ermer, Elsa; Guerin, Scott A; Cosmides, Leda; Tooby, John; Miller, Michael B

    2006-01-01

    Baron-Cohen (1995) proposed that the theory of mind (ToM) inference system evolved to promote strategic social interaction. Social exchange--a form of co-operation for mutual benefit--involves strategic social interaction and requires ToM inferences about the contents of other individuals' mental states, especially their desires, goals, and intentions. There are behavioral and neuropsychological dissociations between reasoning about social exchange and reasoning about equivalent problems tapping other, more general content domains. It has therefore been proposed that social exchange behavior is regulated by social contract algorithms: a domain-specific inference system that is functionally specialized for reasoning about social exchange. We report an fMRI study using the Wason selection task that provides further support for this hypothesis. Precautionary rules share so many properties with social exchange rules--they are conditional, deontic, and involve subjective utilities--that most reasoning theories claim they are processed by the same neurocomputational machinery. Nevertheless, neuroimaging shows that reasoning about social exchange activates brain areas not activated by reasoning about precautionary rules, and vice versa. As predicted, neural correlates of ToM (anterior and posterior temporal cortex) were activated when subjects interpreted social exchange rules, but not precautionary rules (where ToM inferences are unnecessary). We argue that the interaction between ToM and social contract algorithms can be reciprocal: social contract algorithms require ToM inferences, but their functional logic also allows ToM inferences to be made. By considering interactions between ToM in the narrower sense (belief-desire reasoning) and all the social inference systems that create the logic of human social interaction--ones that enable as well as use inferences about the content of mental states--a broader conception of ToM may emerge: a computational model embodying a Theory of Human Nature (ToHN).

  15. Mathematical properties and bounds on haplotyping populations by pure parsimony.

    PubMed

    Wang, I-Lin; Chang, Chia-Yuan

    2011-06-01

    Although haplotype data can be used to analyze the function of DNA, collecting them requires significant effort; in practice, genotype data are usually collected, and the population haplotype inference (PHI) problem is then solved to infer haplotype data from genotype data for a population. This paper investigates the PHI problem based on the pure parsimony criterion (HIPP), which seeks the minimum number of distinct haplotypes needed to resolve a given set of genotype data. We analyze the mathematical structure and properties for the HIPP problem, propose techniques to reduce the given genotype data into an equivalent dataset of much smaller size, and analyze the relations of genotype data using a compatible graph. Based on the mathematical properties in the compatible graph, we propose a maximal clique heuristic to obtain an upper bound, and a new polynomial-sized integer linear programming formulation to obtain a lower bound for the HIPP problem. Copyright © 2011 Elsevier Inc. All rights reserved.
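
    The compatible-graph relation used above can be sketched directly (an illustration under an assumed encoding, 0 and 1 for the two homozygous states and 2 for heterozygous sites, not the authors' code): two genotypes may share a haplotype only if they agree at every site where both are homozygous.

      import itertools
      import networkx as nx

      def compatible(g1, g2):
          # Two genotypes can share a common haplotype only if their homozygous
          # sites never conflict (heterozygous sites, coded 2, impose no constraint).
          return all(a == b or a == 2 or b == 2 for a, b in zip(g1, g2))

      def compatibility_graph(genotypes):
          G = nx.Graph()
          G.add_nodes_from(range(len(genotypes)))
          for i, j in itertools.combinations(range(len(genotypes)), 2):
              if compatible(genotypes[i], genotypes[j]):
                  G.add_edge(i, j)
          return G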

  16. Neural Net Gains Estimation Based on an Equivalent Model

    PubMed Central

    Aguilar Cruz, Karen Alicia; Medel Juárez, José de Jesús; Fernández Muñoz, José Luis; Esmeralda Vigueras Velázquez, Midory

    2016-01-01

    A model of an Equivalent Artificial Neural Net (EANN) describes the gains set, viewed as parameters in a layer, and this consideration is a reproducible process, applicable to a neuron in a neural net (NN). The EANN helps to estimate the NN gains or parameters, so we propose two methods to determine them. The first considers a fuzzy inference combined with the traditional Kalman filter, obtaining the equivalent model and estimating in a fuzzy sense the gains matrix A and the proper gain K into the traditional filter identification. The second develops a direct estimation in state space, describing an EANN using the expected value and the recursive description of the gains estimation. Finally, a comparison of both descriptions is performed, highlighting that the analytical method describes the neural net coefficients in a direct form, whereas the other technique requires selecting into the Knowledge Base (KB) the factors based on the functional error and the reference signal built with the past information of the system. PMID:27366146

  17. Neural Net Gains Estimation Based on an Equivalent Model.

    PubMed

    Aguilar Cruz, Karen Alicia; Medel Juárez, José de Jesús; Fernández Muñoz, José Luis; Esmeralda Vigueras Velázquez, Midory

    2016-01-01

    A model of an Equivalent Artificial Neural Net (EANN) describes the gains set, viewed as parameters in a layer, and this consideration is a reproducible process, applicable to a neuron in a neural net (NN). The EANN helps to estimate the NN gains or parameters, so we propose two methods to determine them. The first considers a fuzzy inference combined with the traditional Kalman filter, obtaining the equivalent model and estimating in a fuzzy sense the gains matrix A and the proper gain K into the traditional filter identification. The second develops a direct estimation in state space, describing an EANN using the expected value and the recursive description of the gains estimation. Finally, a comparison of both descriptions is performed, highlighting that the analytical method describes the neural net coefficients in a direct form, whereas the other technique requires selecting into the Knowledge Base (KB) the factors based on the functional error and the reference signal built with the past information of the system.

  18. Measurement equivalence and differential item functioning in family psychology.

    PubMed

    Bingenheimer, Jeffrey B; Raudenbush, Stephen W; Leventhal, Tama; Brooks-Gunn, Jeanne

    2005-09-01

    Several hypotheses in family psychology involve comparisons of sociocultural groups. Yet the potential for cross-cultural inequivalence in widely used psychological measurement instruments threatens the validity of inferences about group differences. Methods for dealing with these issues have been developed via the framework of item response theory. These methods deal with an important type of measurement inequivalence, called differential item functioning (DIF). The authors introduce DIF analytic methods, linking them to a well-established framework for conceptualizing cross-cultural measurement equivalence in psychology (C.H. Hui and H.C. Triandis, 1985). They illustrate the use of DIF methods using data from the Project on Human Development in Chicago Neighborhoods (PHDCN). Focusing on the Caregiver Warmth and Environmental Organization scales from the PHDCN's adaptation of the Home Observation for Measurement of the Environment Inventory, the authors obtain results that exemplify the range of outcomes that may result when these methods are applied to psychological measurement instruments. (c) 2005 APA, all rights reserved

  19. Black-footed ferrets and Siberian polecats as ecological surrogates and ecological equivalents

    USGS Publications Warehouse

    Biggins, D.E.; Hanebury, L.R.; Miller, B.J.; Powell, R.A.

    2011-01-01

    Ecologically equivalent species serve similar functions in different communities, and an ecological surrogate species can be used as a substitute for an equivalent species in a community. Siberian polecats (Mustela eversmanii) and black-footed ferrets (M. nigripes) have long been considered ecological equivalents. Polecats also have been used as investigational surrogates for black-footed ferrets, yet the similarities and differences between the 2 species are poorly understood. We contrasted activity patterns of radiotagged polecats and ferrets released onto ferret habitat. Ferrets tended to be nocturnal and most active after midnight. Polecats were not highly selective for any period of the day or night. Ferrets and polecats moved most during brightly moonlit nights. The diel activity pattern of ferrets was consistent with avoidance of coyotes (Canis latrans) and diurnal birds of prey. Similarly, polecat activity was consistent with avoidance of red foxes (Vulpes vulpes) in their natural range. Intraguild predation (including interference competition) is inferred as a selective force influencing behaviors of these mustelines. Examination of our data suggests that black-footed ferrets and Siberian polecats might be ecological equivalents but are not perfect surrogates. Nonetheless, polecats as surrogates for black-footed ferrets have provided critical insight needed, especially related to predation, to improve the success of ferret reintroductions. © 2011 American Society of Mammalogists.

  20. A major crustal feature in the southeastern United States inferred from the MAGSAT equivalent source anomaly field

    NASA Technical Reports Server (NTRS)

    Ruder, M. E.; Alexander, S. S.

    1985-01-01

    The MAGSAT equivalent-source anomaly field evaluated at 325 km altitude depicts a prominent anomaly centered over southeast Georgia, which is adjacent to the high-amplitude positive Kentucky anomaly. To overcome the satellite resolution constraint in studying this anomaly, conventional geophysical data were included in the analysis: Bouguer gravity, seismic reflection and refraction, aeromagnetic, and in-situ stress-strain measurements. This integrated geophysical approach infers more specifically the nature and extent of the crustal and/or lithospheric source of the Georgia MAGSAT anomaly. Physical properties and tectonic evolution of the area are all important in the interpretation.

  1. Modification of the USLE K factor for soil erodibility assessment on calcareous soils in Iran

    NASA Astrophysics Data System (ADS)

    Ostovari, Yaser; Ghorbani-Dashtaki, Shoja; Bahrami, Hossein-Ali; Naderi, Mehdi; Dematte, Jose Alexandre M.; Kerry, Ruth

    2016-11-01

    The measurement of soil erodibility (K) in the field is tedious, time-consuming and expensive; therefore, its prediction through pedotransfer functions (PTFs) could be far less costly and time-consuming. The aim of this study was to develop new PTFs to estimate the K factor using multiple linear regression, Mamdani fuzzy inference systems, and artificial neural networks. For this purpose, K was measured in 40 erosion plots with natural rainfall. Various soil properties including the soil particle size distribution, calcium carbonate equivalent, organic matter, permeability, and wet-aggregate stability were measured. The results showed that the mean measured K was 0.014 t h MJ⁻¹ mm⁻¹ and 2.08 times less than the estimated mean K (0.030 t h MJ⁻¹ mm⁻¹) using the USLE model. Permeability, wet-aggregate stability, very fine sand, and calcium carbonate were selected as independent variables by forward stepwise regression in order to assess the ability of multiple linear regression, Mamdani fuzzy inference systems and artificial neural networks to predict K. The calcium carbonate equivalent, which is not accounted for in the USLE model, had a significant impact on K in multiple linear regression due to its strong influence on the stability of aggregates and soil permeability. Statistical indices in validation and calibration datasets determined that the artificial neural networks method with the highest R², lowest RMSE, and lowest ME was the best model for estimating the K factor. A strong correlation (R² = 0.81, n = 40, p < 0.05) between the estimated K from multiple linear regression and measured K indicates that the use of calcium carbonate equivalent as a predictor variable gives a better estimation of K in areas with calcareous soils.
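
    A minimal sketch of the multiple-linear-regression pedotransfer function described above (the predictor set mirrors the abstract, but the arrays, column order and any fitted coefficients are assumptions, not the authors' results): regress measured K on permeability, wet-aggregate stability, very fine sand and calcium carbonate equivalent by ordinary least squares.

      import numpy as np

      def fit_ptf(X, k):
          # Ordinary least squares with an intercept; X has one column per predictor
          # (e.g. permeability, wet-aggregate stability, very fine sand, CaCO3 equivalent).
          A = np.column_stack([np.ones(len(k)), X])
          coeffs, *_ = np.linalg.lstsq(A, k, rcond=None)
          return coeffs

      def predict_k(coeffs, X):
          return np.column_stack([np.ones(len(X)), X]) @ coeffs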

  2. Pattern formation, logistics, and maximum path probability

    NASA Astrophysics Data System (ADS)

    Kirkaldy, J. S.

    1985-05-01

    The concept of pattern formation, which to current researchers is a synonym for self-organization, carries the connotation of deductive logic together with the process of spontaneous inference. Defining a pattern as an equivalence relation on a set of thermodynamic objects, we establish that a large class of irreversible pattern-forming systems, evolving along idealized quasisteady paths, approaches the stable steady state as a mapping upon the formal deductive imperatives of a propositional function calculus. In the preamble the classical reversible thermodynamics of composite systems is analyzed as an externally manipulated system of space partitioning and classification based on ideal enclosures and diaphragms. The diaphragms have discrete classification capabilities which are designated in relation to conserved quantities by descriptors such as impervious, diathermal, and adiabatic. Differentiability in the continuum thermodynamic calculus is invoked as equivalent to analyticity and consistency in the underlying class or sentential calculus. The seat of inference, however, rests with the thermodynamicist. In the transition to an irreversible pattern-forming system the defined nature of the composite reservoirs remains, but a given diaphragm is replaced by a pattern-forming system which by its nature is a spontaneously evolving volume partitioner and classifier of invariants. The seat of volition or inference for the classification system is thus transferred from the experimenter or theoretician to the diaphragm, and with it the full deductive facility. The equivalence relations or partitions associated with the emerging patterns may thus be associated with theorems of the natural pattern-forming calculus. The entropy function, together with its derivatives, is the vehicle which relates the logistics of reservoirs and diaphragms to the analog logistics of the continuum. Maximum path probability or second-order differentiability of the entropy in isolation are sufficiently strong interpretations of the second law of thermodynamics to define the approach to and the nature of patterned stable steady states. For many pattern-forming systems these principles define quantifiable stable states as maxima or minima (or both) in the dissipation. An elementary statistical-mechanical proof is offered. To turn the argument full circle, the transformations of the partitions and classes which are predicated upon such minimax entropic paths can through digital modeling be directly identified with the syntactic and inferential elements of deductive logic. It follows therefore that all self-organizing or pattern-forming systems which possess stable steady states approach these states according to the imperatives of formal logic, the optimum pattern with its rich endowment of equivalence relations representing the central theorem of the associated calculus. Logic is thus "the stuff of the universe," and biological evolution with its culmination in the human brain is the most significant example of all the irreversible pattern-forming processes. We thus conclude with a few remarks on the relevance of the contribution to the theory of evolution and to research on artificial intelligence.

  3. Abductive Equivalential Translation and its application to Natural Language Database Interfacing

    NASA Astrophysics Data System (ADS)

    Rayner, Manny

    1994-05-01

    The thesis describes a logical formalization of natural-language database interfacing. We assume the existence of a "natural language engine" capable of mediating between surface linguistic strings and their representations as "literal" logical forms: the focus of interest will be the question of relating "literal" logical forms to representations in terms of primitives meaningful to the underlying database engine. We begin by describing the nature of the problem, and show how a variety of interface functionalities can be considered as instances of a type of formal inference task which we call "Abductive Equivalential Translation" (AET); functionalities which can be reduced to this form include answering questions, responding to commands, reasoning about the completeness of answers, answering meta-questions of type "Do you know...", and generating assertions and questions. In each case, a "linguistic domain theory" (LDT) Γ and an input formula F are given, and the goal is to construct a formula with certain properties which is equivalent to F, given Γ and a set of permitted assumptions. If the LDT is of a certain specified type, whose formulas are either conditional equivalences or Horn-clauses, we show that the AET problem can be reduced to a goal-directed inference method. We present an abstract description of this method, and sketch its realization in Prolog. The relationship between AET and several problems previously discussed in the literature is discussed. In particular, we show how AET can provide a simple and elegant solution to the so-called "Doctor on Board" problem, and in effect allows a "relativization" of the Closed World Assumption. The ideas in the thesis have all been implemented concretely within the SRI CLARE project, using a real projects and payments database. The LDT for the example database is described in detail, and examples of the types of functionality that can be achieved within the example domain are presented.

  4. Application of p-i-n photodiodes to charged particle fluence measurements beyond 10¹⁵ 1-MeV-neutron-equivalent/cm²

    NASA Astrophysics Data System (ADS)

    Hoeferkamp, M. R.; Grummer, A.; Rajen, I.; Seidel, S.

    2018-05-01

    Methods are developed for the application of forward biased p-i-n photodiodes to measurements of charged particle fluence beyond 10¹⁵ 1-MeV-neutron-equivalent/cm². An order of magnitude extension of the regime where forward voltage can be used to infer fluence is achieved for OSRAM BPW34F devices.

  5. The Mean Metal-line Absorption Spectrum of Damped Ly α Systems in BOSS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mas-Ribas, Lluís; Miralda-Escudé, Jordi; Pérez-Ràfols, Ignasi

    We study the mean absorption spectrum of the Damped Ly α (DLA) population at z ∼ 2.6 by stacking normalized, rest-frame-shifted spectra of ∼27,000 DLA systems from the DR12 of the Baryon Oscillation Spectroscopic Survey (BOSS)/SDSS-III. We measure the equivalent widths of 50 individual metal absorption lines in five intervals of DLA hydrogen column density, five intervals of DLA redshift, and overall mean equivalent widths for an additional 13 absorption features from groups of strongly blended lines. The mean equivalent width of low-ionization lines increases with N(H I), whereas for high-ionization lines the increase is much weaker. The mean metal line equivalent widths decrease by a factor ∼1.1–1.5 from z ∼ 2.1 to z ∼ 3.5, with small or no differences between low- and high-ionization species. We develop a theoretical model, inspired by the presence of multiple absorption components observed in high-resolution spectra, to infer mean metal column densities from the equivalent widths of partially saturated metal lines. We apply this model to 14 low-ionization species and to Al III, S III, Si III, C IV, Si IV, N V, and O VI. We use an approximate derivation for separating the equivalent width contributions of several lines to blended absorption features, and infer mean equivalent widths and column densities from lines of the additional species N I, Zn II, C II*, Fe III, and S IV. Several of these mean column densities of metal lines in DLAs are obtained for the first time; their values generally agree with measurements of individual DLAs from high-resolution, high signal-to-noise ratio spectra when they are available.
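
    The basic quantity this study stacks and measures, the rest-frame equivalent width of a line in a continuum-normalized spectrum, can be sketched as a simple numerical integral (an illustrative sketch, not the BOSS/SDSS analysis pipeline):

      import numpy as np

      def equivalent_width(wavelength, normalized_flux, line_centre, half_window):
          # W = integral of (1 - F/F_continuum) d(lambda), with F already normalized
          # to the continuum; integrate over a window around the line centre.
          sel = np.abs(wavelength - line_centre) <= half_window
          return np.trapz(1.0 - normalized_flux[sel], wavelength[sel])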

  6. Inference of beliefs and emotions in patients with Alzheimer's disease.

    PubMed

    Zaitchik, Deborah; Koff, Elissa; Brownell, Hiram; Winner, Ellen; Albert, Marilyn

    2006-01-01

    The present study compared 20 patients with mild to moderate Alzheimer's disease with 20 older controls (ages 69-94 years) on their ability to make inferences about emotions and beliefs in others. Six tasks tested their ability to make 1st-order and 2nd-order inferences as well as to offer explanations and moral evaluations of human action by appeal to emotions and beliefs. Results showed that the ability to infer emotions and beliefs in 1st-order tasks remains largely intact in patients with mild to moderate Alzheimer's. Patients were able to use mental states in the prediction, explanation, and moral evaluation of behavior. Impairment on 2nd-order tasks involving inference of mental states was equivalent to impairment on control tasks, suggesting that patients' difficulty is secondary to their cognitive impairments. ((c) 2006 APA, all rights reserved).

  7. Direct measurement of the effective infrared dielectric response of a highly doped semiconductor metamaterial.

    PubMed

    Al Mohtar, Abeer; Kazan, Michel; Taliercio, Thierry; Cerutti, Laurent; Blaize, Sylvain; Bruyant, Aurélien

    2017-03-24

    We have investigated the effective dielectric response of a subwavelength grating made of highly doped semiconductors (HDS) excited in reflection, using numerical simulations and spectroscopic measurement. The studied system can exhibit strong localized surface resonances and has, therefore, a great potential for surface-enhanced infrared absorption (SEIRA) spectroscopy application. It consists of a highly doped InAsSb grating deposited on lattice-matched GaSb. The numerical analysis demonstrated that the resonance frequencies can be inferred from the dielectric function of an equivalent homogeneous slab by accounting for the complex reflectivity of the composite layer. Fourier transform infrared reflectivity (FTIR) measurements, analyzed with the Kramers-Kronig conversion technique, were used to deduce the effective response in reflection of the investigated system. From the knowledge of this phenomenological dielectric function, transversal and longitudinal energy-loss functions were extracted and attributed to transverse and longitudinal resonance modes frequencies.

  8. A Novel Way to Relate Ontology Classes

    PubMed Central

    Choksi, Ami T.; Jinwala, Devesh C.

    2015-01-01

    The existing ontologies in the semantic web typically have anonymous union and intersection classes. The anonymous classes are limited in scope and may not be part of the whole inference process. The tools, namely, Pellet, Jena, and Protégé, interpret collection classes as (a) equivalent/subclasses of union class and (b) superclasses of intersection class. As a result, there is a possibility that the tools will produce error-prone inference results for relations, namely, sub-, union, intersection, equivalent relations, and those dependent on these relations, namely, complement. Verifying whether a class is the complement of another involves utilization of sub- and equivalent relations. Motivated by the same, we (i) refine the test data set of the conference ontology by adding named, union, and intersection classes and (ii) propose a match algorithm to (a) calculate a corrected subclass list, (b) correctly relate intersection and union classes with their collection classes, and (c) match union, intersection, sub-, complement, and equivalent classes in a proper sequence, to avoid error-prone match results. We compare the results of our algorithms with those of a candidate reasoner, namely, the Pellet reasoner. To the best of our knowledge, ours is a unique attempt in establishing a novel way to relate ontology classes. PMID:25984560

  9. Bayesian analysis of time-series data under case-crossover designs: posterior equivalence and inference.

    PubMed

    Li, Shi; Mukherjee, Bhramar; Batterman, Stuart; Ghosh, Malay

    2013-12-01

    Case-crossover designs are widely used to study short-term exposure effects on the risk of acute adverse health events. While the frequentist literature on this topic is vast, there is no Bayesian work in this general area. The contribution of this paper is twofold. First, the paper establishes Bayesian equivalence results that require characterization of the set of priors under which the posterior distributions of the risk ratio parameters based on a case-crossover and time-series analysis are identical. Second, the paper studies inferential issues under case-crossover designs in a Bayesian framework. Traditionally, a conditional logistic regression is used for inference on risk-ratio parameters in case-crossover studies. We consider instead a more general full likelihood-based approach which makes less restrictive assumptions on the risk functions. Formulation of a full likelihood leads to growth in the number of parameters proportional to the sample size. We propose a semi-parametric Bayesian approach using a Dirichlet process prior to handle the random nuisance parameters that appear in a full likelihood formulation. We carry out a simulation study to compare the Bayesian methods based on full and conditional likelihood with the standard frequentist approaches for case-crossover and time-series analysis. The proposed methods are illustrated through the Detroit Asthma Morbidity, Air Quality and Traffic study, which examines the association between acute asthma risk and ambient air pollutant concentrations. © 2013, The International Biometric Society.

  10. Spatial assessment of land degradation through key ecosystem services: The role of globally available data.

    PubMed

    Cerretelli, Stefania; Poggio, Laura; Gimona, Alessandro; Yakob, Getahun; Boke, Shiferaw; Habte, Mulugeta; Coull, Malcolm; Peressotti, Alessandro; Black, Helaina

    2018-07-01

    Land degradation is a serious issue, especially in dry and developing countries, leading to the degradation of ecosystem services (ESS) through the depletion of soil functions. Reliably mapping the spatial distribution of land degradation is therefore important for policy decisions. The main objectives of this paper were to infer land degradation through ESS assessment and compare the modelling results obtained using different sets of data. We modelled important physical processes (sediment erosion and nutrient export) and the equivalent ecosystem services (sediment and nutrient retention) to infer land degradation in an area in the Ethiopian Great Rift Valley. To model soil erosion/retention capability and nitrogen export/retention capability, two datasets were used: a 'global' dataset derived from existing global-coverage data and a hybrid dataset where global data were integrated with data from local surveys. The results showed that ESS assessments can be used to infer land degradation and identify priority areas for interventions. The comparison between the modelling results of the two different input datasets showed that caution is necessary if only global-coverage data are used at a local scale. In remote and data-poor areas, an approach that integrates global data with targeted local sampling campaigns might be a good compromise for using ecosystem services in decision-making. Copyright © 2018. Published by Elsevier B.V.

  11. On methods of estimating cosmological bulk flows

    NASA Astrophysics Data System (ADS)

    Nusser, Adi

    2016-01-01

    We explore similarities and differences between several estimators of the cosmological bulk flow, B, from the observed radial peculiar velocities of galaxies. A distinction is made between two theoretical definitions of B as a dipole moment of the velocity field weighted by a radial window function. One definition involves the three-dimensional (3D) peculiar velocity, while the other is based on its radial component alone. Different methods attempt at inferring B for either of these definitions which coincide only for the case of a velocity field which is constant in space. We focus on the Wiener Filtering (WF) and the Constrained Minimum Variance (CMV) methodologies. Both methodologies require a prior expressed in terms of the radial velocity correlation function. Hoffman et al. compute B in Top-Hat windows from a WF realization of the 3D peculiar velocity field. Feldman et al. infer B directly from the observed velocities for the second definition of B. The WF methodology could easily be adapted to the second definition, in which case it will be equivalent to the CMV with the exception of the imposed constraint. For a prior with vanishing correlations or very noisy data, CMV reproduces the standard Maximum Likelihood estimation for B of the entire sample independent of the radial weighting function. Therefore, this estimator is likely more susceptible to observational biases that could be present in measurements of distant galaxies. Finally, two additional estimators are proposed.

  12. The orbital PDF: general inference of the gravitational potential from steady-state tracers

    NASA Astrophysics Data System (ADS)

    Han, Jiaxin; Wang, Wenting; Cole, Shaun; Frenk, Carlos S.

    2016-02-01

    We develop two general methods to infer the gravitational potential of a system using steady-state tracers, i.e. tracers with a time-independent phase-space distribution. Combined with the phase-space continuity equation, the time independence implies a universal orbital probability density function (oPDF) dP(λ|orbit) ∝ dt, where λ is the coordinate of the particle along the orbit. The oPDF is equivalent to Jeans theorem, and is the key physical ingredient behind most dynamical modelling of steady-state tracers. In the case of a spherical potential, we develop a likelihood estimator that fits analytical potentials to the system and a non-parametric method ('phase-mark') that reconstructs the potential profile, both assuming only the oPDF. The methods involve no extra assumptions about the tracer distribution function and can be applied to tracers with any arbitrary distribution of orbits, with possible extension to non-spherical potentials. The methods are tested on Monte Carlo samples of steady-state tracers in dark matter haloes to show that they are unbiased as well as efficient. A fully documented C/Python code implementing our method is freely available at a GitHub repository linked from http://icc.dur.ac.uk/data/#oPDF.
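
    The key ingredient dP(λ|orbit) ∝ dt can be illustrated with a small sketch (assumptions of this sketch: a spherical potential supplied as a Python callable; this is not the code in the linked repository): for a tracer of given energy and angular momentum, the probability of observing it near radius r scales as the time spent there, i.e. as 1/|v_r(r)| between pericentre and apocentre.

      import numpy as np

      def radial_pdf(r, energy, ang_mom, potential):
          # Unnormalized p(r) proportional to 1/|v_r| for a tracer of given specific
          # energy and angular momentum in a spherical potential phi(r); outside the
          # pericentre/apocentre range the orbit never visits r (returned as NaN).
          vr_sq = 2.0 * (energy - potential(r)) - (ang_mom / r) ** 2
          vr_sq = np.where(vr_sq > 0, vr_sq, np.nan)
          return 1.0 / np.sqrt(vr_sq)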

  13. Dark matter, long-range forces, and large-scale structure

    NASA Technical Reports Server (NTRS)

    Gradwohl, Ben-Ami; Frieman, Joshua A.

    1992-01-01

    If the dark matter in galaxies and clusters is nonbaryonic, it can interact with additional long-range fields that are invisible to experimental tests of the equivalence principle. We discuss the astrophysical and cosmological implications of a long-range force coupled only to the dark matter and find rather tight constraints on its strength. If the force is repulsive (attractive), the masses of galaxy groups and clusters (and the mean density of the universe inferred from them) have been systematically underestimated (overestimated). We explore the consequent effects on the two-point correlation function, large-scale velocity flows, and microwave background anisotropies, for models with initial scale-invariant adiabatic perturbations and cold dark matter.

  14. Time-Resolved Molecular Characterization of Limonene/Ozone Aerosol using High-Resolution Electrospray Ionization Mass Spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bateman, Adam P.; Nizkorodov, Serguei; Laskin, Julia

    2009-09-09

    Molecular composition of limonene/O3 secondary organic aerosol (SOA) was investigated using high-resolution electrospray ionization mass spectrometry (HR-ESI-MS) as a function of reaction time. SOA was generated by ozonation of D-limonene in a reaction chamber and sampled at different time intervals using a cascade impactor. The SOA samples were extracted into acetonitrile and analyzed using a HR-ESI-MS instrument with a resolving power of 100,000 (m/Δm). The resulting mass spectra provided detailed information about the extent of oxidation inferred from the O:C ratios, double bond equivalency (DBE) factors, and aromaticity indexes (AI) in hundreds of identified individual SOA species.

  15. Oracle estimation of parametric models under boundary constraints.

    PubMed

    Wong, Kin Yau; Goldberg, Yair; Fine, Jason P

    2016-12-01

    In many classical estimation problems, the parameter space has a boundary. In most cases, the standard asymptotic properties of the estimator do not hold when some of the underlying true parameters lie on the boundary. However, without knowledge of the true parameter values, confidence intervals constructed assuming that the parameters lie in the interior are generally over-conservative. A penalized estimation method is proposed in this article to address this issue. An adaptive lasso procedure is employed to shrink the parameters to the boundary, yielding oracle inference, which adapts to whether or not the true parameters are on the boundary. When the true parameters are on the boundary, the inference is equivalent to that which would be achieved with a priori knowledge of the boundary, while if the converse is true, the inference is equivalent to that which is obtained in the interior of the parameter space. The method is demonstrated under two practical scenarios, namely the frailty survival model and linear regression with order-restricted parameters. Simulation studies and real data analyses show that the method performs well with realistic sample sizes and exhibits certain advantages over standard methods. © 2016, The International Biometric Society.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    La Russa, D

    Purpose: The purpose of this project is to develop a robust method of parameter estimation for a Poisson-based TCP model using Bayesian inference. Methods: Bayesian inference was performed using the PyMC3 probabilistic programming framework written in Python. A Poisson-based TCP regression model that accounts for clonogen proliferation was fit to observed rates of local relapse as a function of equivalent dose in 2 Gy fractions for a population of 623 stage-I non-small-cell lung cancer patients. The Slice Markov Chain Monte Carlo sampling algorithm was used to sample the posterior distributions, and was initiated using the maximum of the posterior distributions found by optimization. The calculation of TCP with each sample step required integration over the free parameter α, which was performed using an adaptive 24-point Gauss-Legendre quadrature. Convergence was verified via inspection of the trace plot and posterior distribution for each of the fit parameters, as well as with comparisons of the most probable parameter values with their respective maximum likelihood estimates. Results: Posterior distributions for α, the standard deviation of α (σ), the average tumour cell-doubling time (Td), and the repopulation delay time (Tk), were generated assuming α/β = 10 Gy, and a fixed clonogen density of 10⁷ cm⁻³. Posterior predictive plots generated from samples from these posterior distributions are in excellent agreement with the observed rates of local relapse used in the Bayesian inference. The most probable values of the model parameters also agree well with maximum likelihood estimates. Conclusion: A robust method of performing Bayesian inference of TCP data using a complex TCP model has been established.
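
    A deliberately simplified sketch of this approach (hypothetical dose-bin data; no proliferation term and no integration over the spread in α, unlike the full model described above): a Poisson TCP curve fit to grouped local-control counts with PyMC3 and the slice sampler.

      import numpy as np
      import pymc3 as pm

      eqd2 = np.array([40.0, 60.0, 80.0, 100.0])  # equivalent dose in 2 Gy fractions (hypothetical)
      n_pat = np.array([50, 80, 90, 60])           # patients per dose bin (hypothetical)
      n_ctrl = np.array([20, 50, 75, 55])          # locally controlled patients (hypothetical)

      with pm.Model() as tcp_model:
          alpha = pm.Lognormal("alpha", mu=np.log(0.3), sigma=0.5)   # radiosensitivity (1/Gy)
          log_n0 = pm.Normal("log_n0", mu=np.log(1e7), sigma=2.0)    # log clonogen number
          # Poisson TCP: exp(-N0 * surviving fraction), linear cell kill only.
          tcp = pm.Deterministic(
              "tcp", pm.math.exp(-pm.math.exp(log_n0) * pm.math.exp(-alpha * eqd2)))
          pm.Binomial("controlled", n=n_pat, p=tcp, observed=n_ctrl)
          trace = pm.sample(2000, tune=1000, step=pm.Slice())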

  17. Human factors of intelligent computer aided display design

    NASA Technical Reports Server (NTRS)

    Hunt, R. M.

    1985-01-01

    Design concepts for a decision support system being studied at NASA Langley as an aid to visual display unit (VDU) designers are described. Ideally, human factors should be taken into account by VDU designers. In reality, although the human factors database on VDUs is small, such systems must be constantly developed. Human factors are therefore a secondary consideration. An expert system will thus serve mainly in an advisory capacity. Functions can include facilitating the design process by shortening the time to generate and alter drawings, enhancing the capability of breaking design requirements down into simpler functions, and providing visual displays equivalent to the final product. The VDU system could also discriminate, and display the difference, between designer decisions and machine inferences. The system could also aid in analyzing the effects of designer choices on future options and in enunciating when there are data available on a design selection.

  18. Free energy reconstruction from steered dynamics without post-processing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Athenes, Manuel, E-mail: Manuel.Athenes@cea.f; Condensed Matter and Materials Division, Physics and Life Sciences Directorate, LLNL, Livermore, CA 94551; Marinica, Mihai-Cosmin

    2010-09-20

    Various methods achieving importance sampling in ensembles of nonequilibrium trajectories enable one to estimate free energy differences and, by maximum-likelihood post-processing, to reconstruct free energy landscapes. Here, based on Bayes theorem, we propose a more direct method in which a posterior likelihood function is used both to construct the steered dynamics and to infer the contribution to equilibrium of all the sampled states. The method is implemented with two steering schedules. First, using non-autonomous steering, we calculate the migration barrier of the vacancy in Fe-α. Second, using an autonomous scheduling related to metadynamics and equivalent to temperature-accelerated molecular dynamics, we accurately reconstruct the two-dimensional free energy landscape of the 38-atom Lennard-Jones cluster as a function of an orientational bond-order parameter and energy, down to the solid-solid structural transition temperature of the cluster and without maximum-likelihood post-processing.

  19. Equivalence testing using existing reference data: An example with genetically modified and conventional crops in animal feeding studies.

    PubMed

    van der Voet, Hilko; Goedhart, Paul W; Schmidt, Kerstin

    2017-11-01

    An equivalence testing method is described to assess the safety of regulated products using relevant data obtained in historical studies with assumedly safe reference products. The method is illustrated using data from a series of animal feeding studies with genetically modified and reference maize varieties. Several criteria for quantifying equivalence are discussed, and study-corrected distribution-wise equivalence is selected as being appropriate for the example case study. An equivalence test is proposed based on a high probability of declaring equivalence in a simplified situation, where there is no between-group variation, where the historical and current studies have the same residual variance, and where the current study is assumed to have a sample size as set by a regulator. The method makes use of generalized fiducial inference methods to integrate uncertainties from both the historical and the current data. Copyright © 2017 The Authors. Published by Elsevier Ltd. All rights reserved.

  20. Assessment of Cognitive Function in the Water Maze Task: Maximizing Data Collection and Analysis in Animal Models of Brain Injury.

    PubMed

    Whiting, Mark D; Kokiko-Cochran, Olga N

    2016-01-01

    Animal models play a critical role in understanding the biomechanical, pathophysiological, and behavioral consequences of traumatic brain injury (TBI). In preclinical studies, cognitive impairment induced by TBI is often assessed using the Morris water maze (MWM). Frequently described as a hippocampally dependent spatial navigation task, the MWM is a highly integrative behavioral task that requires intact functioning in numerous brain regions and involves an interdependent set of mnemonic and non-mnemonic processes. In this chapter, we review the special considerations involved in using the MWM in animal models of TBI, with an emphasis on maximizing the degree of information extracted from performance data. We include a theoretical framework for examining deficits in discrete stages of cognitive function and offer suggestions for how to make inferences regarding the specific nature of TBI-induced cognitive impairment. The ultimate goal is more precise modeling of the animal equivalents of the cognitive deficits seen in human TBI.

  1. Beyond statistical inference: A decision theory for science

    PubMed Central

    KILLEEN, PETER R.

    2008-01-01

    Traditional null hypothesis significance testing does not yield the probability of the null or its alternative and, therefore, cannot logically ground scientific decisions. The decision theory proposed here calculates the expected utility of an effect on the basis of (1) the probability of replicating it and (2) a utility function on its size. It takes significance tests—which place all value on the replicability of an effect and none on its magnitude—as a special case, one in which the cost of a false positive is revealed to be an order of magnitude greater than the value of a true positive. More realistic utility functions credit both replicability and effect size, integrating them for a single index of merit. The analysis incorporates opportunity cost and is consistent with alternate measures of effect size, such as r2 and information transmission, and with Bayesian model selection criteria. An alternate formulation is functionally equivalent to the formal theory, transparent, and easy to compute. PMID:17201351

  2. Beyond statistical inference: a decision theory for science.

    PubMed

    Killeen, Peter R

    2006-08-01

    Traditional null hypothesis significance testing does not yield the probability of the null or its alternative and, therefore, cannot logically ground scientific decisions. The decision theory proposed here calculates the expected utility of an effect on the basis of (1) the probability of replicating it and (2) a utility function on its size. It takes significance tests--which place all value on the replicability of an effect and none on its magnitude--as a special case, one in which the cost of a false positive is revealed to be an order of magnitude greater than the value of a true positive. More realistic utility functions credit both replicability and effect size, integrating them for a single index of merit. The analysis incorporates opportunity cost and is consistent with alternate measures of effect size, such as r2 and information transmission, and with Bayesian model selection criteria. An alternate formulation is functionally equivalent to the formal theory, transparent, and easy to compute.

  3. Spontaneous evaluative inferences and their relationship to spontaneous trait inferences.

    PubMed

    Schneid, Erica D; Carlston, Donal E; Skowronski, John J

    2015-05-01

    Three experiments are reported that explore affectively based spontaneous evaluative impressions (SEIs) of stimulus persons. Experiments 1 and 2 used modified versions of the savings in relearning paradigm (Carlston & Skowronski, 1994) to confirm the occurrence of SEIs, indicating that they are equivalent whether participants are instructed to form trait impressions, evaluative impressions, or neither. These experiments also show that SEIs occur independently of explicit recall for the trait implications of the stimuli. Experiment 3 provides a single dissociation test to distinguish SEIs from spontaneous trait inferences (STIs), showing that disrupting cognitive processing interferes with a trait-based prediction task that presumably reflects STIs, but not with an affectively based social approach task that presumably reflects SEIs. Implications of these findings for the potential independence of spontaneous trait and evaluative inferences, as well as limitations and important steps for future study, are discussed. (c) 2015 APA, all rights reserved.

  4. Identification of Boolean Network Models From Time Series Data Incorporating Prior Knowledge.

    PubMed

    Leifeld, Thomas; Zhang, Zhihua; Zhang, Ping

    2018-01-01

    Motivation: Mathematical models take an important place in science and engineering. A model can help scientists to explain the dynamic behavior of a system and to understand the functionality of system components. Since the length of a time series and the number of replicates are limited by the cost of experiments, Boolean networks, as a structurally simple and parameter-free logical model for gene regulatory networks, have attracted the interest of many scientists. In order to fit the biological context and to lower the data requirements, biological prior knowledge is taken into consideration during the inference procedure. In the literature, the existing identification approaches can only deal with a subset of possible types of prior knowledge. Results: We propose a new approach to identify Boolean networks from time series data incorporating prior knowledge, such as partial network structure, canalizing property, positive and negative unateness. Using the vector form of Boolean variables and applying a generalized matrix multiplication called the semi-tensor product (STP), each Boolean function can be equivalently converted into a matrix expression. Based on this, the identification problem is reformulated as an integer linear programming problem to reveal the system matrix of the Boolean model in a computationally efficient way, whose dynamics are consistent with the important dynamics captured in the data. By using prior knowledge the number of candidate functions can be reduced during the inference. Hence, identification incorporating prior knowledge is especially suitable for the case of small-size time series data and data without sufficient stimuli. The proposed approach is illustrated with the help of a biological model of the oxidative stress response network. Conclusions: The combination of an efficient reformulation of the identification problem with the possibility to incorporate various types of prior knowledge enables the application of computational model inference to systems with a limited amount of time series data. The general applicability of this methodological approach makes it suitable for a variety of biological systems and of general interest for biological and medical research.
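
    The STP reformulation can be illustrated with a toy example (assumptions: the common convention that True is the vector [1, 0] and False is [0, 1]; for column vectors the semi-tensor product reduces to the Kronecker product). The structure matrix below encodes logical AND; the paper's identification procedure itself is not reproduced here.

```python
# Toy illustration of the STP idea: Boolean values as 2-vectors
# ([1, 0] = True, [0, 1] = False); a Boolean function becomes a structure
# matrix acting on the Kronecker product of its argument vectors (the
# semi-tensor product of column vectors reduces to the Kronecker product).
import numpy as np

T = np.array([1, 0])     # vector form of True
F = np.array([0, 1])     # vector form of False

# Structure matrix of logical AND; columns correspond to the argument
# combinations (T,T), (T,F), (F,T), (F,F).
M_AND = np.array([[1, 0, 0, 0],
                  [0, 1, 1, 1]])

def apply_boolean(M, *args):
    """Evaluate a Boolean function given by its structure matrix M."""
    v = args[0]
    for a in args[1:]:
        v = np.kron(v, a)      # semi-tensor product for these column vectors
    return M @ v

for x, name_x in ((T, "T"), (F, "F")):
    for y, name_y in ((T, "T"), (F, "F")):
        print(name_x, "AND", name_y, "->", apply_boolean(M_AND, x, y))
```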

  5. Equivalent circuit models for interpreting impedance perturbation spectroscopy data

    NASA Astrophysics Data System (ADS)

    Smith, R. Lowell

    2004-07-01

    As in-situ structural integrity monitoring disciplines mature, there is a growing need to process sensor/actuator data efficiently in real time. Although smaller, faster embedded processors will contribute to this, it is also important to develop straightforward, robust methods to reduce the overall computational burden for practical applications of interest. This paper addresses the use of equivalent circuit modeling techniques for inferring structure attributes monitored using impedance perturbation spectroscopy. In pioneering work about ten years ago significant progress was associated with the development of simple impedance models derived from the piezoelectric equations. Using mathematical modeling tools currently available from research in ultrasonics and impedance spectroscopy is expected to provide additional synergistic benefits. For purposes of structural health monitoring the objective is to use impedance spectroscopy data to infer the physical condition of structures to which small piezoelectric actuators are bonded. Features of interest include stiffness changes, mass loading, and damping or mechanical losses. Equivalent circuit models are typically simple enough to facilitate the development of practical analytical models of the actuator-structure interaction. This type of parametric structure model allows raw impedance/admittance data to be interpreted optimally using standard multiple, nonlinear regression analysis. One potential long-term outcome is the possibility of cataloging measured viscoelastic properties of the mechanical subsystems of interest as simple lists of attributes and their statistical uncertainties, whose evolution can be followed in time. Equivalent circuit models are well suited for addressing calibration and self-consistency issues such as temperature corrections, Poisson mode coupling, and distributed relaxation processes.
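
    A hypothetical sketch of the parametric fitting step mentioned above: a simple series R-L-C equivalent circuit fit to synthetic impedance-magnitude data by nonlinear least squares. The circuit topology, parameter values, and noise model are illustrative, not those of the paper.

```python
# Hypothetical sketch: fit a series R-L-C equivalent circuit to synthetic
# impedance-magnitude data by nonlinear least squares (scipy curve_fit).
import numpy as np
from scipy.optimize import curve_fit

def z_series_rlc(freq, R, L, C):
    """Impedance magnitude of a series R-L-C branch at frequency freq (Hz)."""
    w = 2 * np.pi * freq
    return np.abs(R + 1j * w * L + 1 / (1j * w * C))

rng = np.random.default_rng(0)
freq = np.linspace(1e3, 50e3, 200)                 # sweep spanning the resonance
true = (50.0, 2e-3, 5e-8)                          # R [ohm], L [H], C [F]
z_meas = z_series_rlc(freq, *true) * (1 + 0.02 * rng.standard_normal(freq.size))

popt, pcov = curve_fit(z_series_rlc, freq, z_meas, p0=(30.0, 1e-3, 1e-7))
print("fitted R, L, C:", popt)
print("1-sigma uncertainties:", np.sqrt(np.diag(pcov)))
```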

  6. Equivalent statistics and data interpretation.

    PubMed

    Francis, Gregory

    2017-08-01

    Recent reform efforts in psychological science have led to a plethora of choices for scientists to analyze their data. A scientist making an inference about their data must now decide whether to report a p value, summarize the data with a standardized effect size and its confidence interval, report a Bayes Factor, or use other model comparison methods. To make good choices among these options, it is necessary for researchers to understand the characteristics of the various statistics used by the different analysis frameworks. Toward that end, this paper makes two contributions. First, it shows that for the case of a two-sample t test with known sample sizes, many different summary statistics are mathematically equivalent in the sense that they are based on the very same information in the data set. When the sample sizes are known, the p value provides as much information about a data set as the confidence interval of Cohen's d or a JZS Bayes factor. Second, this equivalence means that different analysis methods differ only in their interpretation of the empirical data. At first glance, it might seem that mathematical equivalence of the statistics suggests that it does not matter much which statistic is reported, but the opposite is true because the appropriateness of a reported statistic is relative to the inference it promotes. Accordingly, scientists should choose an analysis method appropriate for their scientific investigation. A direct comparison of the different inferential frameworks provides some guidance for scientists to make good choices and improve scientific practice.
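
    A minimal numerical illustration of the stated equivalence for a two-sample t test with known sample sizes: the t statistic, the two-sided p value, and Cohen's d are mutually convertible, so each summary carries the same information about the data.

```python
# Illustration: for a two-sample t test with known sample sizes, t, the
# two-sided p value, and Cohen's d are mutually convertible summaries.
import numpy as np
from scipy import stats

def summaries_from_t(t, n1, n2):
    df = n1 + n2 - 2
    p = 2 * stats.t.sf(abs(t), df)          # two-sided p value
    d = t * np.sqrt(1 / n1 + 1 / n2)        # Cohen's d recovered from t
    return p, d

def t_from_d(d, n1, n2):
    return d / np.sqrt(1 / n1 + 1 / n2)     # invert the mapping

p, d = summaries_from_t(2.5, 30, 30)
print(p, d, t_from_d(d, 30, 30))            # the last value recovers t = 2.5
```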

  7. Bayesian Inference for Functional Dynamics Exploring in fMRI Data.

    PubMed

    Guo, Xuan; Liu, Bing; Chen, Le; Chen, Guantao; Pan, Yi; Zhang, Jing

    2016-01-01

    This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among the variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, Bayesian Magnitude Change Point Model (BMCPM), Bayesian Connectivity Change Point Model (BCCPM), and Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more delicate Bayesian inference models will be emerging and play increasingly important roles in modeling brain functions in the years to come.

  8. Dark matter and the equivalence principle

    NASA Technical Reports Server (NTRS)

    Frieman, Joshua A.; Gradwohl, Ben-Ami

    1991-01-01

    If the dark matter in galaxies and clusters is nonbaryonic, it can interact with additional long-range fields that are invisible to experimental tests of the equivalence principle. The astrophysical and cosmological implications of a long-range force coupled only to the dark matter are discussed and rather tight constraints on its strength are found. If the force is repulsive (attractive), the masses of galaxy groups and clusters (and the mean density of the universe inferred from them) have been systematically underestimated (overestimated). Such an interaction also has unusual implications for the growth of large-scale structure.

  9. BiKEGG: a COBRA toolbox extension for bridging the BiGG and KEGG databases.

    PubMed

    Jamialahmadi, Oveis; Motamedian, Ehsan; Hashemi-Najafabadi, Sameereh

    2016-10-18

    Development of an interface tool between the Biochemical, Genetic and Genomic (BiGG) and KEGG databases is necessary for simultaneous access to the features of both databases. For this purpose, we present the BiKEGG toolbox, an open source COBRA toolbox extension providing a set of functions to infer the reaction correspondences between the KEGG reaction identifiers and those in the BiGG knowledgebase using a combination of manual verification and computational methods. Inferred reaction correspondences using this approach are supported by evidence from the literature, which provides a higher number of reconciled reactions between these two databases compared to the MetaNetX and MetRxn databases. This set of equivalent reactions is then used to automatically superimpose the predicted fluxes using COBRA methods on classical KEGG pathway maps or to create a customized metabolic map based on the KEGG global metabolic pathway, and to find the corresponding reactions in BiGG based on the genome annotation of an organism in the KEGG database. Customized metabolic maps can be created for a set of pathways of interest, for the whole KEGG global map or exclusively for all pathways for which there exists at least one flux carrying reaction. This flexibility in visualization enables BiKEGG to indicate reaction directionality as well as to visualize the reaction fluxes for different static or dynamic conditions in an animated manner. BiKEGG allows the user to export (1) the output visualized metabolic maps to various standard image formats or save them as a video or animated GIF file, and (2) the equivalent reactions for an organism as an Excel spreadsheet.

  10. Behavioral inference of diving metabolic rate in free-ranging leatherback turtles.

    PubMed

    Bradshaw, Corey J A; McMahon, Clive R; Hays, Graeme C

    2007-01-01

    Good estimates of metabolic rate in free-ranging animals are essential for understanding behavior, distribution, and abundance. For the critically endangered leatherback turtle (Dermochelys coriacea), one of the world's largest reptiles, there has been a long-standing debate over whether this species demonstrates any metabolic endothermy. In short, do leatherbacks have a purely ectothermic reptilian metabolic rate or one that is elevated as a result of regional endothermy? Recent measurements have provided the first estimates of field metabolic rate (FMR) in leatherback turtles using doubly labeled water; however, the technique is prohibitively expensive and logistically difficult and produces estimates that are highly variable across individuals in this species. We therefore examined dive duration and depth data collected for nine free-swimming leatherback turtles over long periods (up to 431 d) to infer aerobic dive limits (ADLs) based on the asymptotic increase in maximum dive duration with depth. From this index of ADL and the known mass-specific oxygen storage capacity (TO2) of leatherbacks, we inferred diving metabolic rate (DMR) as TO2/ADL. We predicted that if leatherbacks conform to the purely ectothermic reptilian model of oxygen consumption, these inferred estimates of DMR should fall between predicted and measured values of reptilian resting and field metabolic rates, as well as being substantially lower than the FMR predicted for an endotherm of equivalent mass. Indeed, our behaviorally derived DMR estimates (mean = 0.73 ± 0.11 mL O2 min⁻¹ kg⁻¹) were 3.00 ± 0.54 times the resting metabolic rate measured in unrestrained leatherbacks and 0.50 ± 0.08 times the average FMR for a reptile of equivalent mass. These DMRs were also nearly one order of magnitude lower than the FMR predicted for an endotherm of equivalent mass. Thus, our findings lend support to the notion that diving leatherback turtles are indeed ectothermic and do not demonstrate elevated metabolic rates that might be expected due to regional endothermy. Their capacity to have a warm body core even in cold water therefore seems to derive from their large size, heat exchangers, thermal inertia, and insulating fat layers and not from an elevated metabolic rate.
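
    The inference chain can be sketched numerically (illustrative values only, not the study's data): fit an asymptote to maximum dive duration versus depth to estimate the aerobic dive limit, then take DMR = TO2/ADL. The assumed oxygen store of 27 mL O2 per kg and the dive data below are placeholders.

```python
# Illustrative numbers only: estimate the aerobic dive limit (ADL) as the
# asymptote of maximum dive duration versus depth, then infer diving
# metabolic rate as assumed O2 stores divided by ADL.
import numpy as np
from scipy.optimize import curve_fit

def saturating(depth, t_max, k):
    """Asymptotic increase of maximum dive duration with depth."""
    return t_max * (1 - np.exp(-depth / k))

depth = np.array([20, 50, 100, 200, 400, 600, 800], dtype=float)   # m
max_dur = np.array([12, 22, 30, 36, 39, 40, 40.5])                 # min (synthetic)

(t_max, k), _ = curve_fit(saturating, depth, max_dur, p0=(40.0, 100.0))
adl = t_max                                  # asymptote taken as the ADL
to2 = 27.0                                   # assumed O2 store, mL O2 per kg
dmr = to2 / adl                              # mL O2 per min per kg
print(f"ADL ~ {adl:.1f} min, inferred DMR ~ {dmr:.2f} mL O2 min^-1 kg^-1")
```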

  11. On Modeling of If-Then Rules for Probabilistic Inference

    DTIC Science & Technology

    1993-02-01

    conditionals b -- a. This space contains A strictly. Contrary to a statement in Gilio and Spezzaferri (1992), these conditionals are equivalent to... Wiley, N.Y. [4] Gilio, A. and Spezzaferri, F. (1992). Knowledge integration for conditional probability assessments. Proceedings 8th Conf. Uncertainty

  12. Mongoose: Creation of a Rad-Hard MIPS R3000

    NASA Technical Reports Server (NTRS)

    Lincoln, Dan; Smith, Brian

    1993-01-01

    This paper describes the development of a 32 Bit, full MIPS R3000 code-compatible Rad-Hard CPU, code named Mongoose. Mongoose progressed from contract award, through the design cycle, to operational silicon in 12 months to meet a space mission for NASA. The goal was the creation of a fully static device capable of operation to the maximum Mil-883 derated speed, worst-case post-rad exposure with full operational integrity. This included consideration of features for functional enhancements relating to mission compatibility and removal of commercial practices not supported by Rad-Hard technology. 'Mongoose' developed from an evolution of LSI Logic's MIPS-I embedded processor, LR33000, code named Cobra, to its Rad-Hard 'equivalent', Mongoose. The term 'equivalent' is used to indicate that the core of the processor is functionally identical, allowing the same use and optimizations of the MIPS-I Instruction Set software tool suite for compilation, software program trace, etc. This activity was started in September of 1991 under a contract from NASA-Goddard Space Flight Center (GSFC)-Flight Data Systems. The approach effected a teaming of NASA-GSFC for program development, LSI Logic for system and ASIC design coupled with the Rad-Hard process technology, and Harris (GASD) for Rad-Hard microprocessor design expertise. The program culminated with the generation of Rad-Hard Mongoose prototypes one year later.

  13. Inverse Bayesian inference as a key of consciousness featuring a macroscopic quantum logical structure.

    PubMed

    Gunji, Yukio-Pegio; Shinohara, Shuji; Haruna, Taichi; Basios, Vasileios

    2017-02-01

    To overcome the dualism between mind and matter and to implement consciousness in science, a physical entity has to be embedded with a measurement process. Although quantum mechanics has been regarded as a candidate for implementing consciousness, nature at its macroscopic level is inconsistent with quantum mechanics. We propose a measurement-oriented inference system comprising Bayesian and inverse Bayesian inferences. While Bayesian inference contracts probability space, the newly defined inverse one relaxes the space. These two inferences allow an agent to make a decision corresponding to an immediate change in their environment. They generate a particular pattern of joint probability for data and hypotheses, comprising multiple diagonal and noisy matrices. This is expressed as a nondistributive orthomodular lattice equivalent to quantum logic. We also show that an orthomodular lattice can reveal information generated by inverse syllogism as well as the solutions to the frame and symbol-grounding problems. Our model is the first to connect macroscopic cognitive processes with the mathematical structure of quantum mechanics with no additional assumptions. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.

  14. Inverse methods-based estimation of plate coupling in a plate motion model governed by mantle flow

    NASA Astrophysics Data System (ADS)

    Ratnaswamy, V.; Stadler, G.; Gurnis, M.

    2013-12-01

    Plate motion is primarily controlled by buoyancy (slab pull) which occurs at convergent plate margins where oceanic plates undergo deformation near the seismogenic zone. Yielding within subducting plates, lateral variations in viscosity, and the strength of seismic coupling between plate margins likely have an important control on plate motion. Here, we wish to infer the inter-plate coupling for different subduction zones, and develop a method for inferring it as a PDE-constrained optimization problem, where the cost functional is the misfit in plate velocities and is constrained by the nonlinear Stokes equation. The inverse models have well resolved slabs, plates, and plate margins in addition to a power law rheology with yielding in the upper mantle. Additionally, a Newton method is used to solve the nonlinear Stokes equation with viscosity bounds. We infer plate boundary strength using an inexact Gauss-Newton method with line search for backtracking. Each inverse model is applied to two simple 2-D scenarios (each with three subduction zones), one with back-arc spreading and one without. For each case we examine the sensitivity of the inversion to the amount of surface velocity used: 1) full surface velocity data and 2) surface velocity data simplified using a single scalar average (2-D equivalent to an Euler pole) for each plate. We can recover plate boundary strength in each case, even in the presence of highly nonlinear flow with extreme variations in viscosity. Additionally, we ascribe an uncertainty in each plate's velocity and perform an uncertainty quantification (UQ) through the Hessian of the misfit in plate velocities. We find that as plate boundaries become strongly coupled, the uncertainty in the inferred plate boundary strength decreases. For very weak, uncoupled subduction zones, the uncertainty of inferred plate margin strength increases since there is little sensitivity between plate margin strength and plate velocity. This result is significant because it implies we can infer which plate boundaries are more coupled (seismically) for a realistic dynamic model of plates and mantle flow.

  15. Education: The Overcoming of Experience.

    ERIC Educational Resources Information Center

    Buchmann, Margret; Schwille, John

    1983-01-01

    Looks at what is entailed when education and experience are regarded as equivalent. Identifies faulty inferences resulting from learning by experience. Considers how experience can close avenues to conceptual and social change. Argues that ideas based on secondhand information are more likely than firsthand experience to manifest both the real and…

  16. 40 CFR 312.10 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    .... Institutional controls means: non-engineered instruments, such as administrative and/or legal controls, that... be presumed to be deserted, or an intent to relinquish possession or control can be inferred from the..., tribe, or U.S. territory (or the Commonwealth of Puerto Rico) and have the equivalent of three (3) years...

  17. 40 CFR 312.10 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    .... Institutional controls means: non-engineered instruments, such as administrative and/or legal controls, that... be presumed to be deserted, or an intent to relinquish possession or control can be inferred from the..., tribe, or U.S. territory (or the Commonwealth of Puerto Rico) and have the equivalent of three (3) years...

  18. 40 CFR 312.10 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... Institutional controls means: non-engineered instruments, such as administrative and/or legal controls, that... be presumed to be deserted, or an intent to relinquish possession or control can be inferred from the..., tribe, or U.S. territory (or the Commonwealth of Puerto Rico) and have the equivalent of three (3) years...

  19. LLNL Partners with IBM on Brain-Like Computing Chip

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Essen, Brian

    Lawrence Livermore National Laboratory (LLNL) will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power. The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.

  20. LLNL Partners with IBM on Brain-Like Computing Chip

    ScienceCinema

    Van Essen, Brian

    2018-06-25

    Lawrence Livermore National Laboratory (LLNL) will receive a first-of-a-kind brain-inspired supercomputing platform for deep learning developed by IBM Research. Based on a breakthrough neurosynaptic computer chip called IBM TrueNorth, the scalable platform will process the equivalent of 16 million neurons and 4 billion synapses and consume the energy equivalent of a hearing aid battery – a mere 2.5 watts of power. The brain-like, neural network design of the IBM Neuromorphic System is able to infer complex cognitive tasks such as pattern recognition and integrated sensory processing far more efficiently than conventional chips.

  1. How Can Comparison Groups Strengthen Regression Discontinuity Designs?

    ERIC Educational Resources Information Center

    Wing, Coady; Cook, Thomas D.

    2011-01-01

    In this paper, the authors examine some of the ways that different types of non-equivalent comparison groups can be used to strengthen causal inferences based on regression discontinuity design (RDD). First, they consider a design that incorporates pre-test data on assignment scores and outcomes that were collected either before the treatment…

  2. Regression Discontinuity Design in Gifted and Talented Education Research

    ERIC Educational Resources Information Center

    Matthews, Michael S.; Peters, Scott J.; Housand, Angela M.

    2012-01-01

    This Methodological Brief introduces the reader to the regression discontinuity design (RDD), which is a method that when used correctly can yield estimates of research treatment effects that are equivalent to those obtained through randomized control trials and can therefore be used to infer causality. However, RDD does not require the random…

  3. Hydrogen Distribution in the Lunar Polar Regions

    NASA Technical Reports Server (NTRS)

    Sanin, A. B.; Mitrofanov, I. G.; Litvak, M. L.; Bakhtin, B. N.; Bodnarik, J. G.; Boynton, W. V.; Chin, G.; Evans, L. G.; Harshmann, K.; Fedosov, F.; hide

    2016-01-01

    We present a method of conversion of the lunar neutron counting rate measured by the Lunar Reconnaissance Orbiter (LRO) Lunar Exploration Neutron Detector (LEND) instrument collimated neutron detectors, to water equivalent hydrogen (WEH) in the top approximately 1 m layer of lunar regolith. Polar maps of the Moon’s inferred hydrogen abundance are presented and discussed.

  4. Evaluating Score Equity Assessment for State NAEP

    ERIC Educational Resources Information Center

    Wells, Craig S.; Baldwin, Su; Hambleton, Ronald K.; Sireci, Stephen G.; Karatonis, Ana; Jirka, Stephen

    2009-01-01

    Score equity assessment is an important analysis to ensure inferences drawn from test scores are comparable across subgroups of examinees. The purpose of the present evaluation was to assess the extent to which the Grade 8 NAEP Math and Reading assessments for 2005 were equivalent across selected states. More specifically, the present study…

  5. Confidence-Based Assessments within an Adult Learning Environment

    ERIC Educational Resources Information Center

    Novacek, Paul

    2013-01-01

    Traditional knowledge assessments rely on multiple-choice type questions that only report a right or wrong answer. The reliance within the education system on this technique implies that a student who provides a correct answer purely through guesswork possesses knowledge equivalent to a student who actually knows the correct answer. A more complete…

  6. Land Water Storage within the Congo Basin Inferred from GRACE Satellite Gravity Data

    NASA Technical Reports Server (NTRS)

    Crowley, John W.; Mitrovica, Jerry X.; Bailey, Richard C.; Tamisiea, Mark E.; Davis, James L.

    2006-01-01

    GRACE satellite gravity data is used to estimate terrestrial (surface plus ground) water storage within the Congo Basin in Africa for the period of April, 2002 - May, 2006. These estimates exhibit significant seasonal (30 ± 6 mm of equivalent water thickness) and long-term trends, the latter yielding a total loss of approximately 280 km³ of water over the 50-month span of data. We also combine GRACE and precipitation data sets (CMAP, TRMM) to explore the relative contributions of the source term to the seasonal hydrological balance within the Congo Basin. We find that the seasonal water storage tends to saturate for anomalies greater than 30-44 mm of equivalent water thickness. Furthermore, precipitation contributed roughly three times the peak water storage after anomalously rainy seasons, in early 2003 and 2005, implying an approximately 60-70% loss from runoff and evapotranspiration. Finally, a comparison of residual land water storage (monthly estimates minus best-fitting trends) in the Congo and Amazon Basins shows an anticorrelation, in agreement with the 'see-saw' variability inferred by others from runoff data.

  7. The Aeroacoustics of Supersonic Coaxial Jets

    NASA Technical Reports Server (NTRS)

    Dahl, Milo D.

    1994-01-01

    Instability waves have been established as the dominant source of mixing noise radiating into the downstream arc of a supersonic jet when the waves have phase velocities that are supersonic relative to ambient conditions. Recent theories for supersonic jet noise have used the concepts of growing and decaying linear instability waves for predicting radiated noise. This analysis is extended to the prediction of noise radiation from supersonic coaxial jets. Since the analysis requires a known mean flow and the coaxial jet mean flow is not described easily in terms of analytic functions, a numerical prediction is made for its development. The Reynolds averaged, compressible, boundary layer equations are solved using a mixing length turbulence model. Empirical correlations are developed for the effects of velocity and temperature ratios and Mach number. Both normal and inverted velocity profile coaxial jets are considered. Comparisons with measurements for both single and coaxial jets show good agreement. The results from mean flow and stability calculations are used to predict the noise radiation from coaxial jets with different operating conditions. Comparisons are made between different coaxial jets and a single equivalent jet with the same total thrust, mass flow, and exit area. Results indicate that normal velocity profile jets can have noise reductions compared to the single equivalent jet. No noise reductions are found for inverted velocity profile jets operated at the minimum noise condition compared to the single equivalent jet. However, it is inferred that changes in area ratio may provide noise reduction benefits for inverted velocity profile jets.

  8. Statistical Validation of Surrogate Endpoints: Another Look at the Prentice Criterion and Other Criteria.

    PubMed

    Saraf, Sanatan; Mathew, Thomas; Roy, Anindya

    2015-01-01

    For the statistical validation of surrogate endpoints, an alternative formulation is proposed for testing Prentice's fourth criterion, under a bivariate normal model. In such a setup, the criterion involves inference concerning an appropriate regression parameter, and the criterion holds if the regression parameter is zero. Testing such a null hypothesis has been criticized in the literature since it can only be used to reject a poor surrogate, and not to validate a good surrogate. In order to circumvent this, an equivalence hypothesis is formulated for the regression parameter, namely the hypothesis that the parameter is equivalent to zero. Such an equivalence hypothesis is formulated as an alternative hypothesis, so that the surrogate endpoint is statistically validated when the null hypothesis is rejected. Confidence intervals for the regression parameter and tests for the equivalence hypothesis are proposed using bootstrap methods and small sample asymptotics, and their performances are numerically evaluated and recommendations are made. The choice of the equivalence margin is a regulatory issue that needs to be addressed. The proposed equivalence testing formulation is also adopted for other parameters that have been proposed in the literature on surrogate endpoint validation, namely, the relative effect and proportion explained.
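
    A hedged sketch of the equivalence-testing logic (not the paper's small-sample asymptotics or fiducial machinery): bootstrap a confidence interval for the regression parameter and declare equivalence to zero only if the entire interval falls within a pre-specified margin. Data, margin, and sample size are illustrative.

```python
# Illustrative bootstrap equivalence check: the regression parameter is
# declared equivalent to zero only if its whole bootstrap confidence
# interval lies inside a pre-specified margin.
import numpy as np

rng = np.random.default_rng(1)
n = 60
surrogate = rng.normal(size=n)
true_endpoint = 0.02 * surrogate + rng.normal(scale=1.0, size=n)

def slope(x, y):
    return np.polyfit(x, y, 1)[0]             # least-squares regression slope

boot = []
for _ in range(2000):
    idx = rng.integers(0, n, n)               # resample cases with replacement
    boot.append(slope(surrogate[idx], true_endpoint[idx]))
lo, hi = np.percentile(boot, [2.5, 97.5])

margin = 0.15                                  # pre-specified equivalence margin
verdict = "equivalent to zero" if (-margin < lo and hi < margin) else "equivalence not shown"
print(f"95% bootstrap CI for slope: [{lo:.3f}, {hi:.3f}] -> {verdict}")
```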

  9. Role of Utility and Inference in the Evolution of Functional Information

    PubMed Central

    Sharov, Alexei A.

    2009-01-01

    Functional information means an encoded network of functions in living organisms from molecular signaling pathways to an organism’s behavior. It is represented by two components: code and an interpretation system, which together form a self-sustaining semantic closure. Semantic closure allows some freedom between components because small variations of the code are still interpretable. The interpretation system consists of inference rules that control the correspondence between the code and the function (phenotype) and determines the shape of the fitness landscape. The utility factor operates at multiple time scales: short-term selection drives evolution towards higher survival and reproduction rate within a given fitness landscape, and long-term selection favors those fitness landscapes that support adaptability and lead to evolutionary expansion of certain lineages. Inference rules make short-term selection possible by shaping the fitness landscape and defining possible directions of evolution, but they are under control of the long-term selection of lineages. Communication normally occurs within a set of agents with compatible interpretation systems, which I call communication system. Functional information cannot be directly transferred between communication systems with incompatible inference rules. Each biological species is a genetic communication system that carries unique functional information together with inference rules that determine evolutionary directions and constraints. This view of the relation between utility and inference can resolve the conflict between realism/positivism and pragmatism. Realism overemphasizes the role of inference in evolution of human knowledge because it assumes that logic is embedded in reality. Pragmatism substitutes usefulness for truth and therefore ignores the advantage of inference. The proposed concept of evolutionary pragmatism rejects the idea that logic is embedded in reality; instead, inference rules are constructed within each communication system to represent reality and they evolve towards higher adaptability on a long time scale. PMID:20160960

  10. Inhomogeneous Poisson process rate function inference from dead-time limited observations.

    PubMed

    Verma, Gunjan; Drost, Robert J

    2017-05-01

    The estimation of an inhomogeneous Poisson process (IHPP) rate function from a set of process observations is an important problem arising in optical communications and a variety of other applications. However, because of practical limitations of detector technology, one is often only able to observe a corrupted version of the original process. In this paper, we consider how inference of the rate function is affected by dead time, a period of time after the detection of an event during which a sensor is insensitive to subsequent IHPP events. We propose a flexible nonparametric Bayesian approach to infer an IHPP rate function given dead-time limited process realizations. Simulation results illustrate the effectiveness of our inference approach and suggest its ability to extend the utility of existing sensor technology by permitting more accurate inference on signals whose observations are dead-time limited. We apply our inference algorithm to experimentally collected optical communications data, demonstrating the practical utility of our approach in the context of channel modeling and validation.
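
    A small simulation sketch of the measurement model considered above: an inhomogeneous Poisson process is generated by thinning and then censored by a fixed detector dead time. The rate function and dead-time value are illustrative; the paper's nonparametric Bayesian inference itself is not reproduced.

```python
# Simulation sketch of the measurement model: an inhomogeneous Poisson
# process generated by thinning, then censored by a fixed dead time.
import numpy as np

rng = np.random.default_rng(0)

def rate(t):
    return 50.0 * (1 + np.sin(2 * np.pi * t))   # illustrative rate function

def sample_ihpp(t_end, rate_fn, rate_max):
    """Simulate an IHPP on [0, t_end] by thinning a homogeneous process."""
    t, events = 0.0, []
    while True:
        t += rng.exponential(1.0 / rate_max)
        if t > t_end:
            return np.array(events)
        if rng.uniform() < rate_fn(t) / rate_max:
            events.append(t)

def apply_dead_time(events, tau):
    """Drop events arriving within tau of the last accepted event."""
    kept, last = [], -np.inf
    for t in events:
        if t - last >= tau:
            kept.append(t)
            last = t
    return np.array(kept)

true_events = sample_ihpp(10.0, rate, rate_max=100.0)
observed = apply_dead_time(true_events, tau=0.01)
print(len(true_events), "events generated;", len(observed), "survive the dead time")
```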

  11. Equivalence between Step Selection Functions and Biased Correlated Random Walks for Statistical Inference on Animal Movement.

    PubMed

    Duchesne, Thierry; Fortin, Daniel; Rivest, Louis-Paul

    2015-01-01

    Animal movement has a fundamental impact on population and community structure and dynamics. Biased correlated random walks (BCRW) and step selection functions (SSF) are commonly used to study movements. Because no studies have contrasted the parameters and the statistical properties of their estimators for models constructed under these two Lagrangian approaches, it remains unclear whether or not they allow for similar inference. First, we used the Weak Law of Large Numbers to demonstrate that the log-likelihood function for estimating the parameters of BCRW models can be approximated by the log-likelihood of SSFs. Second, we illustrated the link between the two approaches by fitting BCRW with maximum likelihood and with SSF to simulated movement data in virtual environments and to the trajectory of bison (Bison bison L.) trails in natural landscapes. Using simulated and empirical data, we found that the parameters of a BCRW estimated directly from maximum likelihood and by fitting an SSF were remarkably similar. Movement analysis is increasingly used as a tool for understanding the influence of landscape properties on animal distribution. In the rapidly developing field of movement ecology, management and conservation biologists must decide which method they should implement to accurately assess the determinants of animal movement. We showed that BCRW and SSF can provide similar insights into the environmental features influencing animal movements. Both techniques have advantages. BCRW has already been extended to allow for multi-state modeling. Unlike BCRW, however, SSF can be estimated using most statistical packages, it can simultaneously evaluate habitat selection and movement biases, and can easily integrate a large number of movement taxes at multiple scales. SSF thus offers a simple, yet effective, statistical technique to identify movement taxis.

  12. Similar Associations of Tooth Microwear and Morphology Indicate Similar Diet across Marsupial and Placental Mammals

    PubMed Central

    Christensen, Hilary B.

    2014-01-01

    Low-magnification microwear techniques have been used effectively to infer diets within many unrelated mammalian orders, but the extent to which patterns are comparable among such different groups, including long extinct mammal lineages, is unknown. Microwear patterns between ecologically equivalent placental and marsupial mammals are found to be statistically indistinguishable, indicating that microwear can be used to infer diet across the mammals. Microwear data were compared to body size and molar shearing crest length in order to develop a system to distinguish the diet of mammals. Insectivores and carnivores were difficult to distinguish from herbivores using microwear alone, but combining microwear data with body size estimates and tooth morphology provides robust dietary inferences. This approach is a powerful tool for dietary assessment of fossils from extinct lineages and from museum specimens of living species where field study would be difficult owing to the animal’s behavior, habitat, or conservation status. PMID:25099537

  13. Isotopic abundances of Hg in mercury stars inferred from the Hg II line at 3984 A

    NASA Technical Reports Server (NTRS)

    White, R. E.; Vaughan, A. H., Jr.; Preston, G. W.; Swings, J. P.

    1976-01-01

    Wavelengths of the Hg II absorption feature at 3984 Å in 30 Hg stars are distributed uniformly from the value for the terrestrial mix to a value that corresponds to nearly pure Hg-204. The wavelengths are correlated loosely with effective temperatures inferred from Q(UBV). Relative isotopic abundances derived from partially resolved profiles of the 3984 Å line in iota CrB, chi Lup, and HR 4072 suggest that mass-dependent fractionation has occurred in all three stars. It is supposed that such fractionation occurs in all Hg stars, and a scheme is proposed whereby isotopic compositions can be inferred from a comparison of stellar wavelengths and equivalent widths with those calculated for a family of fractionated isotopic mixes. Theoretical profiles calculated for the derived isotopic composition agree well with high-resolution interferometric profiles obtained for three of the stars.

  14. Factorial Structure of the Family Values Scale from a Multilevel-Multicultural Perspective

    ERIC Educational Resources Information Center

    Byrne, Barbara M.; van de Vijver, Fons J. R.

    2014-01-01

    In cross-cultural research, there is a tendency for researchers to draw inferences at the country level based on individual-level data. Such action implicitly and often mistakenly assumes that both the measuring instrument and its underlying construct(s) are operating equivalently across both levels. Based on responses from 5,482 college students…

  15. Granger Causality and Transfer Entropy Are Equivalent for Gaussian Variables

    NASA Astrophysics Data System (ADS)

    Barnett, Lionel; Barrett, Adam B.; Seth, Anil K.

    2009-12-01

    Granger causality is a statistical notion of causal influence based on prediction via vector autoregression. Developed originally in the field of econometrics, it has since found application in a broader arena, particularly in neuroscience. More recently transfer entropy, an information-theoretic measure of time-directed information transfer between jointly dependent processes, has gained traction in a similarly wide field. While it has been recognized that the two concepts must be related, the exact relationship has until now not been formally described. Here we show that for Gaussian variables, Granger causality and transfer entropy are entirely equivalent, thus bridging autoregressive and information-theoretic approaches to data-driven causal inference.
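
    The result can be checked numerically with a toy bivariate VAR(1) in which Y drives X (a sketch under the stated Gaussian assumption, not the paper's derivation): the Granger causality is the log-ratio of restricted to full residual variances, and the corresponding transfer entropy is half that value in nats.

```python
# Toy bivariate VAR(1) in which Y drives X: Granger causality is the
# log-ratio of restricted to full residual variances, and for Gaussian
# variables the transfer entropy is half that value (in nats).
import numpy as np

rng = np.random.default_rng(42)
n = 20000
x, y = np.zeros(n), np.zeros(n)
for t in range(1, n):
    y[t] = 0.7 * y[t - 1] + rng.normal()
    x[t] = 0.5 * x[t - 1] + 0.4 * y[t - 1] + rng.normal()

def residual_var(target, predictors):
    """OLS residual variance of target regressed on the given predictors."""
    X = np.column_stack(predictors + [np.ones(len(target))])
    beta, *_ = np.linalg.lstsq(X, target, rcond=None)
    return np.var(target - X @ beta)

full = residual_var(x[1:], [x[:-1], y[:-1]])      # past of X and past of Y
restricted = residual_var(x[1:], [x[:-1]])        # past of X only

granger_y_to_x = np.log(restricted / full)        # Granger causality Y -> X
transfer_entropy = 0.5 * granger_y_to_x           # Gaussian equivalence
print(granger_y_to_x, transfer_entropy)
```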

  16. The quark condensate in multi-flavour QCD – planar equivalence confronting lattice simulations

    DOE PAGES

    Armoni, Adi; Shifman, Mikhail; Shore, Graham; ...

    2015-02-01

    Planar equivalence between the large N limits of N=1 Super Yang–Mills (SYM) theory and a variant of QCD with fermions in the antisymmetric representation is a powerful tool to obtain analytic non-perturbative results in QCD itself. In particular, it allows the quark condensate for N=3 QCD with quarks in the fundamental representation to be inferred from exact calculations of the gluino condensate in N=1 SYM. In this paper, we review and refine our earlier predictions for the quark condensate in QCD with a general number nf of flavours and confront these with lattice results.

  17. Rational hypocrisy: a Bayesian analysis based on informal argumentation and slippery slopes.

    PubMed

    Rai, Tage S; Holyoak, Keith J

    2014-01-01

    Moral hypocrisy is typically viewed as an ethical accusation: Someone is applying different moral standards to essentially identical cases, dishonestly claiming that one action is acceptable while otherwise equivalent actions are not. We suggest that in some instances the apparent logical inconsistency stems from different evaluations of a weak argument, rather than dishonesty per se. Extending Corner, Hahn, and Oaksford's (2006) analysis of slippery slope arguments, we develop a Bayesian framework in which accusations of hypocrisy depend on inferences of shared category membership between proposed actions and previous standards, based on prior probabilities that inform the strength of competing hypotheses. Across three experiments, we demonstrate that inferences of hypocrisy increase as perceptions of the likelihood of shared category membership between precedent cases and current cases increase, that these inferences follow established principles of category induction, and that the presence of self-serving motives increases inferences of hypocrisy independent of changes in the actions themselves. Taken together, these results demonstrate that Bayesian analyses of weak arguments may have implications for assessing moral reasoning. © 2014 Cognitive Science Society, Inc.

  18. Choose, rate or squeeze: Comparison of economic value functions elicited by different behavioral tasks

    PubMed Central

    Pessiglione, Mathias

    2017-01-01

    A standard view in neuroeconomics is that to make a choice, an agent first assigns subjective values to available options, and then compares them to select the best. In choice tasks, these cardinal values are typically inferred from the preference expressed by subjects between options presented in pairs. Alternatively, cardinal values can be directly elicited by asking subjects to place a cursor on an analog scale (rating task) or to exert a force on a power grip (effort task). These tasks can vary in many respects: they can notably be more or less costly and consequential. Here, we compared the value functions elicited by choice, rating and effort tasks on options composed of two monetary amounts: one for the subject (gain) and one for a charity (donation). Bayesian model selection showed that despite important differences between the three tasks, they all elicited the same value function, with similar weighting of gain and donation, but variable concavity. Moreover, value functions elicited by the different tasks could predict choices with equivalent accuracy. Our finding therefore suggests that comparable value functions can account for various motivated behaviors, beyond economic choice. Nevertheless, we report slight differences in the computational efficiency of parameter estimation that may guide the design of future studies. PMID:29161252

  19. Reasoning from an incompatibility: False dilemma fallacies and content effects.

    PubMed

    Brisson, Janie; Markovits, Henry; Robert, Serge; Schaeken, Walter

    2018-03-23

    In the present studies, we investigated inferences from an incompatibility statement. Starting with two propositions that cannot be true at the same time, these inferences consist of deducing the falsity of one from the truth of the other or deducing the truth of one from the falsity of the other. Inferences of this latter form are relevant to human reasoning since they are the formal equivalent of a discourse manipulation called the false dilemma fallacy, often used in politics and advertising in order to force a choice between two selected options. Based on research on content-related variability in conditional reasoning, we predicted that content would have an impact on how reasoners treat incompatibility inferences. Like conditional inferences, they present two invalid forms for which the logical response is one of uncertainty. We predicted that participants would endorse a smaller proportion of the invalid incompatibility inferences when more counterexamples are available. In Study 1, we found the predicted pattern using causal premises translated into incompatibility statements with many and few counterexamples. In Study 2A, we replicated the content effects found in Study 1, but with premises for which the incompatibility statement is a non-causal relation between classes. These results suggest that the tendency to fall into the false dilemma fallacy is modulated by the background knowledge of the reasoner. They also provide additional evidence on the link between semantic information retrieval and deduction.

  20. Defects in the synthetic pathway prevent DIF-1 mediated stalk lineage specification cascade in the non-differentiating social amoeba, Acytostelium subglobosum.

    PubMed

    Mohri, Kurato; Hata, Takashi; Kikuchi, Haruhisa; Oshima, Yoshiteru; Urushihara, Hideko

    2014-05-29

    Separation of somatic cells from germ-line cells is a crucial event for multicellular organisms, but how this step was achieved during evolution remains elusive. In Dictyostelium discoideum and many other dictyostelid species, solitary amoebae gather and form a multicellular fruiting body in which germ-line spores and somatic stalk cells differentiate, whereas in Acytostelium subglobosum, acellular stalks form and all aggregated amoebae become spores. In this study, because most D. discoideum genes known to be required for stalk cell differentiation have homologs in A. subglobosum, we inferred functional variations in these genes and examined conservation of the stalk cell specification cascade of D. discoideum mediated by the polyketide differentiation-inducing factor-1 (DIF-1) in A. subglobosum. Through heterologous expression of A. subglobosum orthologs of DIF-1 biosynthesis genes in D. discoideum, we confirmed that two of the three genes were functional equivalents, while DIF-methyltransferase (As-dmtA) involved at the final step of DIF-1 synthesis was not. In fact, DIF-1 activity was undetectable in A. subglobosum lysates and amoebae of this species were not responsive to DIF-1, suggesting a lack of DIF-1 production in this species. On the other hand, the molecular function of an A. subglobosum ortholog of DIF-1 responsive transcription factor was equivalent with that of D. discoideum and inhibition of polyketide synthesis caused developmental arrest in A. subglobosum, which could not be rescued by DIF-1 addition. These results suggest that non-DIF-1 polyketide cascades involving downstream transcription factors are required for fruiting body development of A. subglobosum. © 2014. Published by The Company of Biologists Ltd.

  1. Coinductive Logic Programming with Negation

    NASA Astrophysics Data System (ADS)

    Min, Richard; Gupta, Gopal

    We introduce negation into coinductive logic programming (co-LP) via what we term Coinductive SLDNF (co-SLDNF) resolution. We present declarative and operational semantics of co-SLDNF resolution and establish their equivalence under the restriction of rationality. Co-LP with co-SLDNF resolution provides a powerful, practical and efficient operational semantics for Fitting's Kripke-Kleene three-valued logic with the restriction of rationality. Further, applications of co-SLDNF resolution are discussed and illustrated, showing that it allows one to develop elegant implementations of modal logics. Moreover, it provides the capability of non-monotonic inference (e.g., predicate Answer Set Programming) that can be used to develop novel and effective first-order modal non-monotonic inference engines.

  2. A new learning algorithm for a fully connected neuro-fuzzy inference system.

    PubMed

    Chen, C L Philip; Wang, Jing; Wang, Chi-Hsu; Chen, Long

    2014-10-01

    A traditional neuro-fuzzy system is transformed into an equivalent fully connected three-layer neural network (NN), namely, the fully connected neuro-fuzzy inference system (F-CONFIS). The F-CONFIS differs from traditional NNs by its dependent and repeated weights between the input and hidden layers and can be considered a variant of multilayer NN. Therefore, an efficient learning algorithm for the F-CONFIS to cope with these repeated weights is derived. Furthermore, a dynamic learning rate is proposed for neuro-fuzzy systems via F-CONFIS, where both premise (hidden) and consequent portions are considered. Several simulation results indicate that the proposed approach achieves much better accuracy and fast convergence.

  3. Inertia and Double Bending of Light from Equivalence

    NASA Technical Reports Server (NTRS)

    Shuler, Robert L., Jr.

    2010-01-01

    Careful examination of light paths in an accelerated reference frame, with use of Special Relativity, can account fully for the observed bending of light in a gravitational field, not just half of it as reported in 1911. This analysis also leads to a Machian formulation of inertia similar to the one proposed by Einstein in 1912 and later derived from gravitational field equations in Minkowski space by Sciama in 1953. There is a clear inference from equivalence that there is some type of inertial mass increase in a gravitational field. It is the purpose of the current paper to suggest that equivalence provides a more complete picture of gravitational effects than previously thought, correctly predicting full light bending, and that since the theory of inertia is derivable from equivalence, any theory based on equivalence must take account of it. Einstein himself clearly was not satisfied with the status of inertia in GRT, as our quotes have shown. Many have tried to account for inertia and met with less than success, for example Davidson's integration of Sciama's inertia into GRT but only for a steady-state cosmology [10], and the Machian gravity theory of Brans and Dicke [11]. Yet Mach's idea hasn't gone away, and now it seems that it cannot go away without also disposing of equivalence.

  4. Model averaging, optimal inference, and habit formation

    PubMed Central

    FitzGerald, Thomas H. B.; Dolan, Raymond J.; Friston, Karl J.

    2014-01-01

    Postulating that the brain performs approximate Bayesian inference generates principled and empirically testable models of neuronal function—the subject of much current interest in neuroscience and related disciplines. Current formulations address inference and learning under some assumed and particular model. In reality, organisms are often faced with an additional challenge—that of determining which model or models of their environment are the best for guiding behavior. Bayesian model averaging—which says that an agent should weight the predictions of different models according to their evidence—provides a principled way to solve this problem. Importantly, because model evidence is determined by both the accuracy and complexity of the model, optimal inference requires that these be traded off against one another. This means an agent's behavior should show an equivalent balance. We hypothesize that Bayesian model averaging plays an important role in cognition, given that it is both optimal and realizable within a plausible neuronal architecture. We outline model averaging and how it might be implemented, and then explore a number of implications for brain and behavior. In particular, we propose that model averaging can explain a number of apparently suboptimal phenomena within the framework of approximate (bounded) Bayesian inference, focusing particularly upon the relationship between goal-directed and habitual behavior. PMID:25018724
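
    A toy sketch of the averaging rule described above (numbers are placeholders): each model's prediction is weighted by its posterior model probability, which is proportional to its evidence under an equal prior over models.

```python
# Placeholder numbers: weight each model's prediction by its posterior
# model probability (proportional to evidence, equal prior over models).
import numpy as np

log_evidence = np.array([-10.2, -11.0, -14.5])   # one entry per candidate model
predictions = np.array([0.80, 0.55, 0.20])       # each model's prediction

w = np.exp(log_evidence - log_evidence.max())    # stabilised exponentiation
w /= w.sum()                                     # posterior model probabilities

print("model weights:", np.round(w, 3))
print("model-averaged prediction:", round(float(np.sum(w * predictions)), 3))
```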

  5. Data Analysis Techniques for Physical Scientists

    NASA Astrophysics Data System (ADS)

    Pruneau, Claude A.

    2017-10-01

    Preface; How to read this book; 1. The scientific method; Part I. Foundation in Probability and Statistics: 2. Probability; 3. Probability models; 4. Classical inference I: estimators; 5. Classical inference II: optimization; 6. Classical inference III: confidence intervals and statistical tests; 7. Bayesian inference; Part II. Measurement Techniques: 8. Basic measurements; 9. Event reconstruction; 10. Correlation functions; 11. The multiple facets of correlation functions; 12. Data correction methods; Part III. Simulation Techniques: 13. Monte Carlo methods; 14. Collision and detector modeling; List of references; Index.

  6. Satellite gravity measurement monitoring terrestrial water storage change and drought in the continental United States.

    PubMed

    Yi, Hang; Wen, Lianxing

    2016-01-27

    We use satellite gravity measurements in the Gravity Recovery and Climate Experiment (GRACE) to estimate terrestrial water storage (TWS) change in the continental United States (US) from 2003 to 2012, and establish a GRACE-based Hydrological Drought Index (GHDI) for drought monitoring. GRACE-inferred TWS exhibits opposite patterns between the north and south of the continental US from 2003 to 2012, with the equivalent water thickness increasing from -4.0 to 9.4 cm in the north and decreasing from 4.1 to -6.7 cm in the south. The equivalent water thickness also decreases by 5.1 cm in the middle south in 2006. GHDI is established to represent the extent to which the GRACE-inferred TWS anomaly departs from its historical average and is calibrated to resemble the traditional Palmer Hydrological Drought Index (PHDI) in the continental US. GHDI exhibits good correlations with PHDI in the continental US, indicating its feasibility for drought monitoring. Since GHDI is GRACE-based and has minimal dependence on ground-based hydrological parameters, it can be extended for global drought monitoring, which is particularly useful for countries that lack sufficient hydrological monitoring infrastructure on the ground.
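
    The index described above is, at its core, an anomaly of GRACE-inferred water storage relative to its historical average. The sketch below computes a generic standardised monthly anomaly from a synthetic TWS series; it shows the structure of such an index only and is not the authors' GHDI calibration against PHDI.

```python
import numpy as np

def drought_index(tws, months):
    """Standardised anomaly of terrestrial water storage (toy GHDI-like index).

    tws    : 1-D array of equivalent water thickness (cm), one value per month
    months : 1-D array of calendar month numbers (1-12) matching `tws`
    Returns the anomaly of each observation from the long-term mean of its
    calendar month, scaled by that month's standard deviation.
    """
    tws, months = np.asarray(tws, float), np.asarray(months)
    index = np.empty_like(tws)
    for m in range(1, 13):
        sel = months == m
        if not sel.any():
            continue
        mu, sigma = tws[sel].mean(), tws[sel].std(ddof=1)
        index[sel] = (tws[sel] - mu) / sigma if sigma > 0 else 0.0
    return index

# Synthetic 10-year monthly series with a drying shift in the second half.
rng = np.random.default_rng(1)
months = np.tile(np.arange(1, 13), 10)
tws = 3 * np.sin(2 * np.pi * (months - 3) / 12) + rng.normal(0, 1, 120)
tws[60:] -= 2.0
print(drought_index(tws, months)[-12:].round(2))
```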

  7. Bayesian Redshift Classification of Emission-line Galaxies with Photometric Equivalent Widths

    NASA Astrophysics Data System (ADS)

    Leung, Andrew S.; Acquaviva, Viviana; Gawiser, Eric; Ciardullo, Robin; Komatsu, Eiichiro; Malz, A. I.; Zeimann, Gregory R.; Bridge, Joanna S.; Drory, Niv; Feldmeier, John J.; Finkelstein, Steven L.; Gebhardt, Karl; Gronwall, Caryl; Hagen, Alex; Hill, Gary J.; Schneider, Donald P.

    2017-07-01

    We present a Bayesian approach to the redshift classification of emission-line galaxies when only a single emission line is detected spectroscopically. We consider the case of surveys for high-redshift Lyα-emitting galaxies (LAEs), which have traditionally been classified via an inferred rest-frame equivalent width (EW, W_Lyα) greater than 20 Å. Our Bayesian method relies on known prior probabilities in measured emission-line luminosity functions and EW distributions for the galaxy populations, and returns the probability that an object in question is an LAE given the characteristics observed. This approach will be directly relevant for the Hobby-Eberly Telescope Dark Energy Experiment (HETDEX), which seeks to classify ~10^6 emission-line galaxies into LAEs and low-redshift [O II] emitters. For a simulated HETDEX catalog with realistic measurement noise, our Bayesian method recovers 86% of LAEs missed by the traditional W_Lyα > 20 Å cutoff over 2 < z < 3, outperforming the EW cut in both contamination and incompleteness. This is due to the method's ability to trade off between the two types of binary classification error by adjusting the stringency of the probability requirement for classifying an observed object as an LAE. In our simulations of HETDEX, this method reduces the uncertainty in cosmological distance measurements by 14% with respect to the EW cut, equivalent to recovering 29% more cosmological information. Rather than using binary object labels, this method enables the use of classification probabilities in large-scale structure analyses. It can be applied to narrowband emission-line surveys as well as upcoming large spectroscopic surveys including Euclid and WFIRST.
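
    The classification rule reduces to Bayes' theorem: the posterior probability that a detected line is Lyα combines prior population abundances with the likelihood of the observed properties under each population's luminosity function and EW distribution. The sketch below applies that rule to a single observed equivalent width; the exponential EW distributions, scale lengths, and prior fraction are illustrative placeholders, not the HETDEX inputs.

```python
import numpy as np

# Two-class Bayesian line classification: is a single detected emission line
# Ly-alpha at high z or [O II] at low z?  Posterior odds combine an assumed
# prior abundance ratio with likelihoods of the observed equivalent width under
# each population's EW distribution.  All numbers here are illustrative only.
def exp_pdf(w, scale):
    """Exponential EW distribution, a common parametric choice."""
    return np.exp(-w / scale) / scale

def p_lae(ew_obs, prior_lae=0.3, scale_lae=100.0, scale_oii=10.0):
    like_lae = exp_pdf(ew_obs, scale_lae)    # p(EW | LAE)
    like_oii = exp_pdf(ew_obs, scale_oii)    # p(EW | [O II] emitter)
    num = prior_lae * like_lae
    return num / (num + (1 - prior_lae) * like_oii)

for ew in (5.0, 20.0, 60.0):
    print(ew, round(p_lae(ew), 3))
```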

  8. 78 FR 255 - Resumption of the Population Estimates Challenge Program

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-03

    ... governmental unit. In those instances where a non-functioning county-level government or statistical equivalent...) A non-functioning county or statistical equivalent means a sub- state entity that does not function... represents a non-functioning county or statistical equivalent, the governor will serve as the chief executive...

  9. Toward understanding the evolution of vertebrate gene regulatory networks: comparative genomics and epigenomic approaches.

    PubMed

    Martinez-Morales, Juan R

    2016-07-01

    Vertebrates, as most animal phyla, originated >500 million years ago during the Cambrian explosion, and progressively radiated into the extant classes. Inferring the evolutionary history of the group requires understanding the architecture of the developmental programs that constrain the vertebrate anatomy. Here, I review recent comparative genomic and epigenomic studies, based on ChIP-seq and chromatin accessibility, which focus on the identification of functionally equivalent cis-regulatory modules among species. This pioneer work, primarily centered in the mammalian lineage, has set the groundwork for further studies in representative vertebrate and chordate species. Mapping of active regulatory regions across lineages will shed new light on the evolutionary forces stabilizing ancestral developmental programs, as well as allowing their variation to sustain morphological adaptations on the inherited vertebrate body plan. © The Author 2015. Published by Oxford University Press. All rights reserved. For permissions, please email: journals.permissions@oup.com.

  10. 75 FR 28076 - New Postal Products

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-19

    ...) contracts.\\1\\ The Postal Service believes the instant contracts are functionally equivalent to previously..., which established GEPS 1 as a product, also authorized functionally equivalent agreements to be included... of Four Functionally Equivalent Global Expedited Package Services 2 Negotiated Service Agreements and...

  11. 77 FR 12888 - International Mail Contract

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-02

    ... instant contract is functionally equivalent to the IBRS 3 baseline contract originally filed in Docket Nos... product, also authorized functionally equivalent agreements to be included within the product, provided... Service Filing of a Functionally Equivalent International Business Reply Service Competitive Contract 3...

  12. 75 FR 65386 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-10-22

    ... Postal Service believes the instant contract is functionally equivalent to previously submitted GEPS... GEPS 1 as a product, also authorized functionally equivalent agreements to be included within the... Postal Service of Filing a Functionally Equivalent Global Expedited Package Services 3 Negotiated Service...

  13. 75 FR 22633 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-29

    ...) contracts.\\1\\ The Postal Service believes the instant contracts are functionally equivalent to previously..., which established GEPS 1 as a product, also authorized functionally equivalent agreements to be included... Filing of Two Functionally Equivalent Global Expedited Package Services 2 Negotiated Service Agreements...

  14. 75 FR 53002 - New Postal Products

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-30

    ... contracts are functionally equivalent to previously submitted GEPS contracts, and are supported by Governors... authorized functionally equivalent agreements to be included within the product, provided that they meet the... Functionally Equivalent Global Expedited Package Services 3 Negotiated Service Agreements and Application for...

  15. 75 FR 14475 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-25

    ... Service believes the instant contracts are functionally equivalent to previously submitted GEPS 2... established GEPS 1 as a product, also authorized functionally equivalent agreements to be included within the... Functionally Equivalent Global Expedited Package Services 2 Negotiated Service Agreements and Application for...

  16. 75 FR 9005 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-26

    ... functionally equivalent to previously submitted GEPS 2 contracts, and is supported by Governors' Decision No... functionally equivalent agreements to be included within the product, provided that they meet the requirements...]\\ Notice of United States Postal Service Filing of Functionally Equivalent Global Expedited Package...

  17. 75 FR 25303 - New Postal Products

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-07

    ... believes the instant contracts are functionally equivalent to previously submitted GEPS 2 contracts, and... GEPS 1 as a product, also authorized functionally equivalent agreements to be included within the... Functionally Equivalent Global Expedited Package Services 2 Negotiated Service Agreements and Application for...

  18. Connection between Dynamically Derived Initial Mass Function Normalization and Stellar Population Parameters

    NASA Astrophysics Data System (ADS)

    McDermid, Richard M.; Cappellari, Michele; Alatalo, Katherine; Bayet, Estelle; Blitz, Leo; Bois, Maxime; Bournaud, Frédéric; Bureau, Martin; Crocker, Alison F.; Davies, Roger L.; Davis, Timothy A.; de Zeeuw, P. T.; Duc, Pierre-Alain; Emsellem, Eric; Khochfar, Sadegh; Krajnović, Davor; Kuntschner, Harald; Morganti, Raffaella; Naab, Thorsten; Oosterloo, Tom; Sarzi, Marc; Scott, Nicholas; Serra, Paolo; Weijmans, Anne-Marie; Young, Lisa M.

    2014-09-01

    We report on empirical trends between the dynamically determined stellar initial mass function (IMF) and stellar population properties for a complete, volume-limited sample of 260 early-type galaxies from the ATLAS3D project. We study trends between our dynamically derived IMF normalization α_dyn ≡ (M/L)_stars/(M/L)_Salp and absorption line strengths, and interpret these via single stellar population-equivalent ages, abundance ratios (measured as [α/Fe]), and total metallicity, [Z/H]. We find that old and alpha-enhanced galaxies tend to have on average heavier (Salpeter-like) mass normalization of the IMF, but stellar population does not appear to be a good predictor of the IMF, with a large range of α_dyn at a given population parameter. As a result, we find weak α_dyn-[α/Fe] and α_dyn-Age correlations and no significant α_dyn-[Z/H] correlation. The observed trends appear significantly weaker than those reported in studies that measure the IMF normalization via the low-mass star demographics inferred through stellar spectral analysis.

  19. 20 CFR 416.926a - Functional equivalence for children.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... development and functioning, and that not all children within an age category are expected to be able to do... 20 Employees' Benefits 2 2014-04-01 2014-04-01 false Functional equivalence for children. 416.926a... Functional equivalence for children. (a) General. If you have a severe impairment or combination of...

  20. 20 CFR 416.926a - Functional equivalence for children.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... development and functioning, and that not all children within an age category are expected to be able to do... 20 Employees' Benefits 2 2013-04-01 2013-04-01 false Functional equivalence for children. 416.926a... Functional equivalence for children. (a) General. If you have a severe impairment or combination of...

  1. 20 CFR 416.926a - Functional equivalence for children.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... development and functioning, and that not all children within an age category are expected to be able to do... 20 Employees' Benefits 2 2012-04-01 2012-04-01 false Functional equivalence for children. 416.926a... Functional equivalence for children. (a) General. If you have a severe impairment or combination of...

  2. The Low-Mass Stellar Initial Mass Function: Ultra-Faint Dwarf Galaxies Revisited

    NASA Astrophysics Data System (ADS)

    Platais, Imants

    2017-08-01

    The stellar Initial Mass Function plays a critical role in the evolution of the baryonic content of the Universe. The form of the low-mass IMF - stars of mass less than the solar mass - determines the fraction of baryons locked up for a Hubble time, and thus indicates how gas and metals are cycled through galaxies. Inferences from resolved stellar populations, where the low-mass luminosity function and associated IMF can be derived from direct star counts, generally favor an invariant and universal IMF. However, a recent study of the ultra-faint dwarf galaxies Hercules and Leo IV indicates a bottom-light IMF, over a narrow range of stellar mass (only 0.55-0.75 M_sun), correlated with the internal velocity dispersion and/or metallicity. We propose to obtain ultra-deep imaging for a significantly closer ultra-faint dwarf, Bootes I, which will allow us to construct the luminosity function down to M_v=+10 (equivalent to 0.35 solar mass). We will also re-analyze the HST archival observations for the Hercules and Leo IV dwarfs using the same updated techniques as for Bootes I. The combined datasets should provide a reliable answer to the question of how variable the low-mass stellar IMF is.

  3. On Galactic Density Modeling in the Presence of Dust Extinction

    NASA Astrophysics Data System (ADS)

    Bovy, Jo; Rix, Hans-Walter; Green, Gregory M.; Schlafly, Edward F.; Finkbeiner, Douglas P.

    2016-02-01

    Inferences about the spatial density or phase-space structure of stellar populations in the Milky Way require a precise determination of the effective survey volume. The volume observed by surveys such as Gaia or near-infrared spectroscopic surveys, which have good coverage of the Galactic midplane region, is highly complex because of the abundant small-scale structure in the three-dimensional interstellar dust extinction. We introduce a novel framework for analyzing the importance of small-scale structure in the extinction. This formalism demonstrates that the spatially complex effect of extinction on the selection function of a pencil-beam or contiguous sky survey is equivalent to a low-pass filtering of the extinction-affected selection function with the smooth density field. We find that the angular resolution of current 3D extinction maps is sufficient for analyzing Gaia sub-samples of millions of stars. However, the current distance resolution is inadequate and needs to be improved by an order of magnitude, especially in the inner Galaxy. We also present a practical and efficient method for properly taking the effect of extinction into account in analyses of Galactic structure through an effective selection function. We illustrate its use with the selection function of red-clump stars in APOGEE using and comparing a variety of current 3D extinction maps.
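
    The practical upshot is that density inferences require integrating an extinction-modified selection function over the survey volume. The sketch below computes an effective volume and expected star counts along a single toy line of sight; the extinction profile, density law, magnitude limit, and beam size are all illustrative assumptions rather than values from the paper or from any 3D dust map.

```python
import numpy as np

# Toy effective-volume calculation for a pencil-beam survey: the raw magnitude
# limit is modulated along the line of sight by cumulative dust extinction A(d),
# which makes the selection function S(d) spatially complex.
M_abs = 0.0                 # absolute magnitude of a standard-candle tracer
m_lim = 14.0                # survey apparent-magnitude limit
omega = 1e-3                # beam solid angle in steradians

d = np.linspace(0.05, 10.0, 2000)                  # distance grid in kpc
extinction = 0.3 * (1 - np.exp(-d / 2.0))          # toy cumulative A(d) in mag
app_mag = M_abs + 5 * np.log10(d * 1000 / 10) + extinction
selection = (app_mag < m_lim).astype(float)        # sharp magnitude cut
density = np.exp(-d / 3.0)                         # toy stellar density profile

# Effective volume and expected counts follow from integrating the
# extinction-modified selection function against the density field.
eff_volume = np.trapz(selection * omega * d**2, d)
expected_counts = np.trapz(density * selection * omega * d**2, d)
print(eff_volume, expected_counts)
```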

  4. Evidence and Clinical Trials.

    NASA Astrophysics Data System (ADS)

    Goodman, Steven N.

    1989-11-01

    This dissertation explores the use of a mathematical measure of statistical evidence, the log likelihood ratio, in clinical trials. The methods and thinking behind the use of an evidential measure are contrasted with traditional methods of analyzing data, which depend primarily on a p-value as an estimate of the statistical strength of an observed data pattern. It is contended that neither the behavioral dictates of Neyman-Pearson hypothesis testing methods, nor the coherency dictates of Bayesian methods are realistic models on which to base inference. The use of the likelihood alone is applied to four aspects of trial design or conduct: the calculation of sample size, the monitoring of data, testing for the equivalence of two treatments, and meta-analysis--the combining of results from different trials. Finally, a more general model of statistical inference, using belief functions, is used to see if it is possible to separate the assessment of evidence from our background knowledge. It is shown that traditional and Bayesian methods can be modeled as two ends of a continuum of structured background knowledge, methods which summarize evidence at the point of maximum likelihood assuming no structure, and Bayesian methods assuming complete knowledge. Both schools are seen to be missing a concept of ignorance- -uncommitted belief. This concept provides the key to understanding the problem of sampling to a foregone conclusion and the role of frequency properties in statistical inference. The conclusion is that statistical evidence cannot be defined independently of background knowledge, and that frequency properties of an estimator are an indirect measure of uncommitted belief. Several likelihood summaries need to be used in clinical trials, with the quantitative disparity between summaries being an indirect measure of our ignorance. This conclusion is linked with parallel ideas in the philosophy of science and cognitive psychology.
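
    As a concrete illustration of the evidential measure discussed above, the log likelihood ratio compares how well two pre-specified hypotheses predict the observed data, without reference to error rates. A minimal sketch for a single-arm binomial trial, with made-up response counts and candidate response rates:

```python
from math import comb, log

# Log likelihood ratio as a measure of evidence for one hypothesis over
# another, here two candidate response rates in a single-arm trial with
# x responders out of n patients (binomial model, illustrative numbers).
def binom_loglik(x, n, p):
    return log(comb(n, x)) + x * log(p) + (n - x) * log(1 - p)

x, n = 14, 40
p_null, p_alt = 0.20, 0.40

log_lr = binom_loglik(x, n, p_alt) - binom_loglik(x, n, p_null)
print(f"log likelihood ratio (alt vs null): {log_lr:.2f}")
# Benchmarks such as LR = 8 or LR = 32 (log LR of roughly 2 and 3.5) are
# sometimes quoted for "moderate" and "strong" evidence, but any such cut-off
# is a convention rather than part of the statistic itself.
```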

  5. Universal Darwinism As a Process of Bayesian Inference.

    PubMed

    Campbell, John O

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an "experiment" in the external world environment, and the results of that "experiment" or the "surprise" entailed by predicted and actual outcomes of the "experiment." Minimization of free energy implies that the implicit measure of "surprise" experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature.

  6. Universal Darwinism As a Process of Bayesian Inference

    PubMed Central

    Campbell, John O.

    2016-01-01

    Many of the mathematical frameworks describing natural selection are equivalent to Bayes' Theorem, also known as Bayesian updating. By definition, a process of Bayesian Inference is one which involves a Bayesian update, so we may conclude that these frameworks describe natural selection as a process of Bayesian inference. Thus, natural selection serves as a counter example to a widely-held interpretation that restricts Bayesian Inference to human mental processes (including the endeavors of statisticians). As Bayesian inference can always be cast in terms of (variational) free energy minimization, natural selection can be viewed as comprising two components: a generative model of an “experiment” in the external world environment, and the results of that “experiment” or the “surprise” entailed by predicted and actual outcomes of the “experiment.” Minimization of free energy implies that the implicit measure of “surprise” experienced serves to update the generative model in a Bayesian manner. This description closely accords with the mechanisms of generalized Darwinian process proposed both by Dawkins, in terms of replicators and vehicles, and Campbell, in terms of inferential systems. Bayesian inference is an algorithm for the accumulation of evidence-based knowledge. This algorithm is now seen to operate over a wide range of evolutionary processes, including natural selection, the evolution of mental models and cultural evolutionary processes, notably including science itself. The variational principle of free energy minimization may thus serve as a unifying mathematical framework for universal Darwinism, the study of evolutionary processes operating throughout nature. PMID:27375438

  7. REVERSAL LEARNING SET AND FUNCTIONAL EQUIVALENCE IN CHILDREN WITH AND WITHOUT AUTISM

    PubMed Central

    Lionello-DeNolf, Karen M.; McIlvane, William J.; Canovas, Daniela S.; de Souza, Deisy G.; Barros, Romariz S.

    2009-01-01

    To evaluate whether children with and without autism could exhibit (a) functional equivalence in the course of yoked repeated-reversal training and (b) reversal learning set, 6 children, in each of two experiments, were exposed to simple discrimination contingencies with three sets of stimuli. The discriminative functions of the set members were yoked and repeatedly reversed. In Experiment 1, all the children (of preschool age) showed gains in the efficiency of reversal learning across reversal problems and behavior that suggested formation of functional equivalence. In Experiment 2, 3 nonverbal children with autism exhibited strong evidence of reversal learning set and 2 showed evidence of functional equivalence. The data suggest a possible relationship between efficiency of reversal learning and functional equivalence test outcomes. Procedural variables may prove important in assessing the potential of young or nonverbal children to classify stimuli on the basis of shared discriminative functions. PMID:20186287

  8. Humans treat unreliable filled-in percepts as more real than veridical ones

    PubMed Central

    Ehinger, Benedikt V; Häusser, Katja; Ossandón, José P; König, Peter

    2017-01-01

    Humans often evaluate sensory signals according to their reliability for optimal decision-making. However, how do we evaluate percepts generated in the absence of direct input that are, therefore, completely unreliable? Here, we utilize the phenomenon of filling-in occurring at the physiological blind-spots to compare partially inferred and veridical percepts. Subjects chose between stimuli that elicit filling-in, and perceptually equivalent ones presented outside the blind-spots, looking for a Gabor stimulus without a small orthogonal inset. In ambiguous conditions, when the stimuli were physically identical and the inset was absent in both, subjects behaved opposite to optimal, preferring the blind-spot stimulus as the better example of a collinear stimulus, even though no relevant veridical information was available. Thus, a percept that is partially inferred is paradoxically considered more reliable than a percept based on external input. In other words: Humans treat filled-in inferred percepts as more real than veridical ones. DOI: http://dx.doi.org/10.7554/eLife.21761.001 PMID:28506359

  9. Algorithmic methods to infer the evolutionary trajectories in cancer progression

    PubMed Central

    Graudenzi, Alex; Ramazzotti, Daniele; Sanz-Pamplona, Rebeca; De Sano, Luca; Mauri, Giancarlo; Moreno, Victor; Antoniotti, Marco; Mishra, Bud

    2016-01-01

    The genomic evolution inherent to cancer relates directly to a renewed focus on the voluminous next-generation sequencing data and machine learning for the inference of explanatory models of how the (epi)genomic events are choreographed in cancer initiation and development. However, despite the increasing availability of multiple additional -omics data, this quest has been frustrated by various theoretical and technical hurdles, mostly stemming from the dramatic heterogeneity of the disease. In this paper, we build on our recent work on the “selective advantage” relation among driver mutations in cancer progression and investigate its applicability to the modeling problem at the population level. Here, we introduce PiCnIc (Pipeline for Cancer Inference), a versatile, modular, and customizable pipeline to extract ensemble-level progression models from cross-sectional sequenced cancer genomes. The pipeline has many translational implications because it combines state-of-the-art techniques for sample stratification, driver selection, identification of fitness-equivalent exclusive alterations, and progression model inference. We demonstrate PiCnIc’s ability to reproduce much of the current knowledge on colorectal cancer progression as well as to suggest novel experimentally verifiable hypotheses. PMID:27357673

  10. Effect of gamma and neutron irradiation on the mechanical properties of Spectralon™ porous PTFE

    DOE PAGES

    Gourdin, William H.; Datte, Philip; Jensen, Wayne; ...

    2016-07-21

    Here, we establish a correspondence between the mechanical properties (maximum load and failure elongation) of Spectralon™ porous PTFE irradiated with 14 MeV neutrons and 1.17 and 1.33 MeV gammas from a cobalt-60 source. From this correspondence we infer that the effects of neutrons and gammas on this material are approximately equivalent for a given absorbed dose.

  11. Boys Will Be Boys; Cows Will Be Cows: Children's Essentialist Reasoning about Gender Categories and Animal Species

    ERIC Educational Resources Information Center

    Taylor, Marianne G.; Rhodes, Marjorie; Gelman, Susan A.

    2009-01-01

    Two studies (N = 456) compared the development of concepts of animal species and human gender, using a switched-at-birth reasoning task. Younger children (5- and 6-year-olds) treated animal species and human gender as equivalent; they made similar levels of category-based inferences and endorsed similar explanations for development in these 2…

  12. 76 FR 53160 - Postal Service Rate Adjustment

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-25

    ... Functionally Equivalent Agreement, August 12, 2011 (Notice). See also Docket Nos. MC2010-35, R2010-5 and R2010... Operators 1 product and two functionally equivalent agreements, Strategic Bilateral Agreement Between United...ès Service Agreement is functionally equivalent to the delivery confirmation service provided with...

  13. The Social Attribution Task - Multiple Choice (SAT-MC): Psychometric comparison with social cognitive measures for schizophrenia research.

    PubMed

    Johannesen, Jason K; Fiszdon, Joanna M; Weinstein, Andrea; Ciosek, David; Bell, Morris D

    2018-04-01

    The Social Attribution Task-Multiple Choice (SAT-MC) tests the ability to extract social themes from viewed object motion. This form of animacy perception is thought to aid the development of social inference, but appears impaired in schizophrenia. The current study was undertaken to examine the psychometric equivalence of two forms of the SAT-MC and to compare their performance against social cognitive tests recommended for schizophrenia research. Thirty-two schizophrenia (SZ) and 30 substance use disorder (SUD) participants completed both SAT-MC forms, the Bell-Lysaker Emotion Recognition Task (BLERT), Hinting Task, The Awareness of Social Inference Test (TASIT), Ambiguous Intentions and Hostility Questionnaire (AIHQ) and questionnaire measures of interpersonal function. Test sensitivity, construct and external validity, test-retest reliability, and internal consistency were evaluated. SZ scored significantly lower than SUD on both SAT-MC forms, each classifying ~60% of SZ as impaired, compared with ~30% of SUD. SAT-MC forms demonstrated good test-retest and parallel form reliability, minimal practice effect, high internal consistency, and similar patterns of correlation with social cognitive and external validity measures. The SAT-MC compared favorably to recommended social cognitive tests across psychometric features and, with the exception of TASIT, was most sensitive to impairment in schizophrenia when compared to a chronic substance use sample. Published by Elsevier B.V.

  14. The Information Content of Discrete Functions and Their Application in Genetic Data Analysis

    DOE PAGES

    Sakhanenko, Nikita A.; Kunert-Graf, James; Galas, David J.

    2017-10-13

    The complex of central problems in data analysis consists of three components: (1) detecting the dependence of variables using quantitative measures, (2) defining the significance of these dependence measures, and (3) inferring the functional relationships among dependent variables. We have argued previously that an information theory approach allows separation of the detection problem from the inference of functional form problem. We approach here the third component of inferring functional forms based on information encoded in the functions. We present here a direct method for classifying the functional forms of discrete functions of three variables represented in data sets. Discrete variables are frequently encountered in data analysis, both as the result of inherently categorical variables and from the binning of continuous numerical variables into discrete alphabets of values. The fundamental question of how much information is contained in a given function is answered for these discrete functions, and their surprisingly complex relationships are illustrated. The all-important effect of noise on the inference of function classes is found to be highly heterogeneous and reveals some unexpected patterns. We apply this classification approach to an important area of biological data analysis—that of inference of genetic interactions. Genetic analysis provides a rich source of real and complex biological data analysis problems, and our general methods provide an analytical basis and tools for characterizing genetic problems and for analyzing genetic data. Finally, we illustrate the functional description and the classes of a number of common genetic interaction modes and also show how different modes vary widely in their sensitivity to noise.
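
    One way to make the notion of "information contained in a discrete function" concrete is to tabulate the function over all inputs and compare the output entropy with what each input explains on its own. The sketch below does this for a three-variable function over a uniform input distribution; it is a simplified stand-in for the paper's measures, and the example function is chosen only to show a purely joint dependence.

```python
import numpy as np
from itertools import product

# Information carried by a discrete function of three variables: enumerate all
# inputs (assumed uniformly distributed), tabulate the output, and compute the
# entropy of the output and the mutual information between each single input
# and the output.
def entropy(labels):
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return float(-(p * np.log2(p)).sum())

def analyse(f, alphabet=(0, 1, 2)):
    grid = np.array(list(product(alphabet, repeat=3)))   # all (x1, x2, x3)
    out = np.array([f(*row) for row in grid])
    h_out = entropy(out)
    mi = []
    for i in range(3):
        h_cond = 0.0
        for v in alphabet:                               # H(out | x_i) by averaging
            sel = grid[:, i] == v
            h_cond += sel.mean() * entropy(out[sel])
        mi.append(h_out - h_cond)                        # I(x_i ; out)
    return h_out, mi

# Example: a 3-valued "sum modulo 3" interaction, a strongly joint function.
print(analyse(lambda a, b, c: (a + b + c) % 3))
```

    For the modulo-3 sum shown, the output entropy is log2(3) bits while every single-input mutual information is zero, the signature of a fully collective interaction that only joint analysis can detect.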

  15. The Information Content of Discrete Functions and Their Application in Genetic Data Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sakhanenko, Nikita A.; Kunert-Graf, James; Galas, David J.

    The complex of central problems in data analysis consists of three components: (1) detecting the dependence of variables using quantitative measures, (2) defining the significance of these dependence measures, and (3) inferring the functional relationships among dependent variables. We have argued previously that an information theory approach allows separation of the detection problem from the inference of functional form problem. We approach here the third component of inferring functional forms based on information encoded in the functions. We present here a direct method for classifying the functional forms of discrete functions of three variables represented in data sets. Discrete variables are frequently encountered in data analysis, both as the result of inherently categorical variables and from the binning of continuous numerical variables into discrete alphabets of values. The fundamental question of how much information is contained in a given function is answered for these discrete functions, and their surprisingly complex relationships are illustrated. The all-important effect of noise on the inference of function classes is found to be highly heterogeneous and reveals some unexpected patterns. We apply this classification approach to an important area of biological data analysis—that of inference of genetic interactions. Genetic analysis provides a rich source of real and complex biological data analysis problems, and our general methods provide an analytical basis and tools for characterizing genetic problems and for analyzing genetic data. Finally, we illustrate the functional description and the classes of a number of common genetic interaction modes and also show how different modes vary widely in their sensitivity to noise.

  16. The Information Content of Discrete Functions and Their Application in Genetic Data Analysis.

    PubMed

    Sakhanenko, Nikita A; Kunert-Graf, James; Galas, David J

    2017-12-01

    The complex of central problems in data analysis consists of three components: (1) detecting the dependence of variables using quantitative measures, (2) defining the significance of these dependence measures, and (3) inferring the functional relationships among dependent variables. We have argued previously that an information theory approach allows separation of the detection problem from the inference of functional form problem. We approach here the third component of inferring functional forms based on information encoded in the functions. We present here a direct method for classifying the functional forms of discrete functions of three variables represented in data sets. Discrete variables are frequently encountered in data analysis, both as the result of inherently categorical variables and from the binning of continuous numerical variables into discrete alphabets of values. The fundamental question of how much information is contained in a given function is answered for these discrete functions, and their surprisingly complex relationships are illustrated. The all-important effect of noise on the inference of function classes is found to be highly heterogeneous and reveals some unexpected patterns. We apply this classification approach to an important area of biological data analysis-that of inference of genetic interactions. Genetic analysis provides a rich source of real and complex biological data analysis problems, and our general methods provide an analytical basis and tools for characterizing genetic problems and for analyzing genetic data. We illustrate the functional description and the classes of a number of common genetic interaction modes and also show how different modes vary widely in their sensitivity to noise.

  17. Single TRAM domain RNA-binding proteins in Archaea: functional insight from Ctr3 from the Antarctic methanogen Methanococcoides burtonii.

    PubMed

    Taha; Siddiqui, K S; Campanaro, S; Najnin, T; Deshpande, N; Williams, T J; Aldrich-Wright, J; Wilkins, M; Curmi, P M G; Cavicchioli, R

    2016-09-01

    TRAM domain proteins present in Archaea and Bacteria have a β-barrel shape with anti-parallel β-sheets that form a nucleic acid binding surface; a structure also present in cold shock proteins (Csps). Aside from protein structures, experimental data defining the function of TRAM domains is lacking. Here, we explore the possible functional properties of a single TRAM domain protein, Ctr3 (cold-responsive TRAM domain protein 3) from the Antarctic archaeon Methanococcoides burtonii that has increased abundance during low temperature growth. Ribonucleic acid (RNA) bound by Ctr3 in vitro was determined using RNA-seq. Ctr3-bound M. burtonii RNA with a preference for transfer (t)RNA and 5S ribosomal RNA, and a potential binding motif was identified. In tRNA, the motif represented the C loop; a region that is conserved in tRNA from all domains of life and appears to be solvent exposed, potentially providing access for Ctr3 to bind. Ctr3 and Csps are structurally similar and are both inferred to function in low temperature translation. The broad representation of single TRAM domain proteins within Archaea compared with their apparent absence in Bacteria, and scarcity of Csps in Archaea but prevalence in Bacteria, suggests they represent distinct evolutionary lineages of functionally equivalent RNA-binding proteins. © 2016 Society for Applied Microbiology and John Wiley & Sons Ltd.

  18. Form and function in hillslope hydrology: characterization of subsurface flow based on response observations

    NASA Astrophysics Data System (ADS)

    Angermann, Lisa; Jackisch, Conrad; Allroggen, Niklas; Sprenger, Matthias; Zehe, Erwin; Tronicke, Jens; Weiler, Markus; Blume, Theresa

    2017-07-01

    The phrase form and function was established in architecture and biology and refers to the idea that form and functionality are closely correlated, influence each other, and co-evolve. We suggest transferring this idea to hydrological systems to separate and analyze their two main characteristics: their form, which is equivalent to the spatial structure and static properties, and their function, equivalent to internal responses and hydrological behavior. While this approach is not particularly new to hydrological field research, we want to employ this concept to explicitly pursue the question of what information is most advantageous to understand a hydrological system. We applied this concept to subsurface flow within a hillslope, with a methodological focus on function: we conducted observations during a natural storm event and followed this with a hillslope-scale irrigation experiment. The results are used to infer hydrological processes of the monitored system. Based on these findings, the explanatory power and conclusiveness of the data are discussed. The measurements included basic hydrological monitoring methods, like piezometers, soil moisture, and discharge measurements. These were accompanied by isotope sampling and a novel application of 2-D time-lapse GPR (ground-penetrating radar). The main finding regarding the processes in the hillslope was that preferential flow paths were established quickly, despite unsaturated conditions. These flow paths also caused a detectable signal in the catchment response following a natural rainfall event, showing that these processes are relevant also at the catchment scale. Thus, we conclude that response observations (dynamics and patterns, i.e., indicators of function) were well suited to describing processes at the observational scale. Especially the use of 2-D time-lapse GPR measurements, providing detailed subsurface response patterns, as well as the combination of stream-centered and hillslope-centered approaches, allowed us to link processes and put them in a larger context. Transfer to other scales beyond observational scale and generalizations, however, rely on the knowledge of structures (form) and remain speculative. The complementary approach with a methodological focus on form (i.e., structure exploration) is presented and discussed in the companion paper by Jackisch et al.(2017).

  19. 49 CFR 214.505 - Required environmental control and protection systems for new on-track roadway maintenance...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... brooms; (4) Rotary scarifiers; (5) Undercutters; and (6) Functional equivalents of any of the machines... types identified in paragraphs (a)(1) through (a)(5) of this section, or functionally equivalent thereto...) of this section, or functionally equivalent thereto. The list shall be kept current and made...

  20. 49 CFR 214.505 - Required environmental control and protection systems for new on-track roadway maintenance...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... brooms; (4) Rotary scarifiers; (5) Undercutters; and (6) Functional equivalents of any of the machines... types identified in paragraphs (a)(1) through (a)(5) of this section, or functionally equivalent thereto...) of this section, or functionally equivalent thereto. The list shall be kept current and made...

  1. 49 CFR 214.505 - Required environmental control and protection systems for new on-track roadway maintenance...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... brooms; (4) Rotary scarifiers; (5) Undercutters; and (6) Functional equivalents of any of the machines... types identified in paragraphs (a)(1) through (a)(5) of this section, or functionally equivalent thereto...) of this section, or functionally equivalent thereto. The list shall be kept current and made...

  2. 49 CFR 214.505 - Required environmental control and protection systems for new on-track roadway maintenance...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... brooms; (4) Rotary scarifiers; (5) Undercutters; and (6) Functional equivalents of any of the machines... types identified in paragraphs (a)(1) through (a)(5) of this section, or functionally equivalent thereto...) of this section, or functionally equivalent thereto. The list shall be kept current and made...

  3. 49 CFR 214.505 - Required environmental control and protection systems for new on-track roadway maintenance...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... brooms; (4) Rotary scarifiers; (5) Undercutters; and (6) Functional equivalents of any of the machines... types identified in paragraphs (a)(1) through (a)(5) of this section, or functionally equivalent thereto...) of this section, or functionally equivalent thereto. The list shall be kept current and made...

  4. Interoceptive inference: From computational neuroscience to clinic.

    PubMed

    Owens, Andrew P; Allen, Micah; Ondobaka, Sasha; Friston, Karl J

    2018-04-22

    The central and autonomic nervous systems can be defined by their anatomical, functional and neurochemical characteristics, but neither functions in isolation. For example, fundamental components of autonomically mediated homeostatic processes are afferent interoceptive signals reporting the internal state of the body and efferent signals acting on interoceptive feedback assimilated by the brain. Recent predictive coding (interoceptive inference) models formulate interoception in terms of embodied predictive processes that support emotion and selfhood. We propose that interoception may serve as a way to investigate holistic nervous system function and dysfunction in disorders of brain, body and behaviour. We appeal to predictive coding and (active) interoceptive inference to describe the homeostatic functions of the central and autonomic nervous systems. We do so by (i) reviewing the active inference formulation of interoceptive and autonomic function, (ii) surveying clinical applications of this formulation and (iii) describing how it offers an integrative approach to human physiology, particularly interactions between the central and peripheral nervous systems in health and disease. Crown Copyright © 2018. Published by Elsevier Ltd. All rights reserved.

  5. Whatever Gave You That Idea? False Memories Following Equivalence Training: A Behavioral Account of the Misinformation Effect

    PubMed Central

    Challies, Danna M; Hunt, Maree; Garry, Maryanne; Harper, David N

    2011-01-01

    The misinformation effect is a term used in the cognitive psychological literature to describe both experimental and real-world instances in which misleading information is incorporated into an account of an historical event. In many real-world situations, it is not possible to identify a distinct source of misinformation, and it appears that the witness may have inferred a false memory by integrating information from a variety of sources. In a stimulus equivalence task, a small number of trained relations between some members of a class of arbitrary stimuli result in a large number of untrained, or emergent relations, between all members of the class. Misleading information was introduced into a simple memory task between a learning phase and a recognition test by means of a match-to-sample stimulus equivalence task that included both stimuli from the original learning task and novel stimuli. At the recognition test, participants given equivalence training were more likely to misidentify patterns than those who were not given such training. The misinformation effect was distinct from the effects of prior stimulus exposure, or partial stimulus control. In summary, stimulus equivalence processes may underlie some real-world manifestations of the misinformation effect. PMID:22084495

  6. On equivalent characterizations of convexity of functions

    NASA Astrophysics Data System (ADS)

    Gkioulekas, Eleftherios

    2013-04-01

    A detailed development of the theory of convex functions, not often found in complete form in most textbooks, is given. We adopt the strict secant line definition as the definitive definition of convexity. We then show that for differentiable functions, this definition becomes logically equivalent with the first derivative monotonicity definition and the tangent line definition. Consequently, for differentiable functions, all three characterizations are logically equivalent.
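
    The three characterisations referred to in the abstract can be written out compactly. The statement below follows the standard textbook formulation for a differentiable function on an interval, in the strict-inequality form the abstract emphasises; it is a summary in conventional notation, not text reproduced from the article.

```latex
% Strict convexity of a differentiable f on an interval I: three equivalent statements.
% (1) Strict secant-line (definitional) inequality:
f(\lambda a + (1-\lambda) b) < \lambda f(a) + (1-\lambda) f(b)
  \quad \text{for all } a \neq b \in I,\ \lambda \in (0,1).
% (2) Strictly increasing first derivative:
f'(x) < f'(y) \quad \text{whenever } x < y,\ x, y \in I.
% (3) Tangent lines lie strictly below the graph:
f(y) > f(x) + f'(x)\,(y - x) \quad \text{for all } x \neq y \in I.
```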

  7. THE NON-UNIVERSALITY OF THE LOW-MASS END OF THE IMF IS ROBUST AGAINST THE CHOICE OF SSP MODEL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spiniello, C.; Trager, S. C.; Koopmans, L. V. E.

    2015-04-20

    We perform a direct comparison of two state-of-the-art single stellar population (SSP) models that have been used to demonstrate the non-universality of the low-mass end of the initial mass function (IMF) slope. The two public versions of the SSP models are restricted to either solar abundance patterns or solar metallicity, too restrictive if one aims to disentangle elemental enhancements, metallicity changes, and IMF variations in massive early-type galaxies (ETGs) with star formation histories different from those in the solar neighborhood. We define response functions (to metallicity and α-abundance) to extend the parameter space for each set of models. We compare these extended models with a sample of Sloan Digital Sky Survey (SDSS) ETG spectra with varying velocity dispersions. We measure equivalent widths of optical IMF-sensitive stellar features to examine the effect of the underlying model assumptions and ingredients, such as stellar libraries or isochrones, on the inference of the IMF slope down to ~0.1 M_⊙. We demonstrate that the steepening of the low-mass end of the IMF based on a non-degenerate set of spectroscopic optical indicators is robust against the choice of the stellar population model. Although the models agree in a relative sense (i.e., both imply more bottom-heavy IMFs for more massive systems), we find non-negligible differences in the absolute values of the IMF slope inferred at each velocity dispersion by using the two different models. In particular, we find large inconsistencies in the quantitative predictions of the IMF slope variations and abundance patterns when sodium lines are used. We investigate the possible reasons for these inconsistencies.

  8. The influence of high viscosity slabs on post-glacial sea-level change: the case of Barbados

    NASA Astrophysics Data System (ADS)

    Austermann, Jacqueline; Mitrovica, Jerry X.; Latychev, Konstantin

    2013-04-01

    The coral record at Barbados is one of the best available measures of relative sea level during the last glacial cycle and has been widely used to reconstruct ice volume (or, equivalently, eustatic sea-level, ESL) changes during the last deglaciation phase of the ice age. However, to estimate ESL variations from the local relative sea level (RSL) history at Barbados, one has to account for the contaminating effect of glacial isostatic adjustment (GIA). In previous work, the GIA signal at this site has been corrected for by assuming a spherically symmetric (i.e., 1-D) viscoelastic Earth. Since Barbados is located at the margin of the South American - Caribbean subduction zone, this assumption may introduce a significant error in inferences of ice volumes. To address this issue, we use a finite-volume numerical code to model GIA in the Caribbean region including the effects of a lithosphere with variable elastic thickness, plate boundaries, lateral variations in lower mantle viscosity, and a high viscosity slab within the upper mantle. The geometry of the subducted slab is inferred from local seismicity. We find that predictions of relative sea-level change since the Last Glacial Maximum (LGM) in the Caribbean region are diminished by ~10 m, relative to 1-D calculations, which suggests that previous studies have underestimated post-LGM ESL change by the same amount. This perturbation, which largely reflects the impact of the high viscosity slab, is nearly twice the total GIA-induced departure from eustasy predicted at Barbados using the 1-D Earth model. Our calculations imply an excess ice-volume equivalent to ~130 m ESL at the LGM, which brings the Barbados-based estimate into agreement with inferences based on other far-field RSL histories, such as at Bonaparte Gulf. This inference, together with recent studies that have substantially lowered estimates of Antarctic Ice Sheet mass at LGM, suggests that a significant amount of ice remains unaccounted for in sea-level based ice sheet reconstructions. In addition, we conclude that inference of ice age ice volumes derived from RSL histories at sites in proximity to subduction zones must incorporate slab structure into the numerical predictions of the GIA process.

  9. Seasonal dynamics of freshwater pathogens as measured by microarray at Lake Sapanca, a drinking water source in the north-eastern part of Turkey.

    PubMed

    Akçaalan, Reyhan; Albay, Meric; Koker, Latife; Baudart, Julia; Guillebault, Delphine; Fischer, Sabine; Weigel, Wilfried; Medlin, Linda K

    2017-12-22

    Monitoring drinking water quality is an important public health issue. Two objectives of the four-year, six-nation EU project μAqua were to develop hierarchically specific probes to detect and quantify pathogens in drinking water using a PCR-free microarray platform and to design a standardised water sampling program from different sources in Europe to obtain sufficient material for downstream analysis. Our phylochip contains barcodes (probes) that specifically identify freshwater pathogens that are human health risks in a taxonomic hierarchical fashion such that if a species is present, the entire taxonomic hierarchy (genus, family, order, phylum, kingdom) leading to it must also be present, which avoids false positives. Molecular tools are more rapid, accurate and reliable than traditional methods, which means faster mitigation strategies with less harm to humans and the community. We present microarray results for the presence of freshwater pathogens from a Turkish lake used as a drinking water source and inferred cyanobacterial cell equivalents from samples concentrated from 40 L into 1 L in 45 min using hollow fibre filters. In two companion studies from the same samples, cyanobacterial toxins were analysed using chemical methods, and the dates with the highest toxin values also had the highest cell equivalents as inferred from this microarray study.

  10. Isoprene photo-oxidation products quantify the effect of pollution on hydroxyl radicals over Amazonia.

    PubMed

    Liu, Yingjun; Seco, Roger; Kim, Saewung; Guenther, Alex B; Goldstein, Allen H; Keutsch, Frank N; Springston, Stephen R; Watson, Thomas B; Artaxo, Paulo; Souza, Rodrigo A F; McKinney, Karena A; Martin, Scot T

    2018-04-01

    Nitrogen oxides (NOx) emitted from human activities are believed to regulate the atmospheric oxidation capacity of the troposphere. However, observational evidence is limited for the low-to-median NOx concentrations prevalent outside of polluted regions. Directly measuring oxidation capacity, represented primarily by hydroxyl radicals (OH), is challenging, and the span in NOx concentrations at a single observation site is often not wide. Concentrations of isoprene and its photo-oxidation products were used to infer the equivalent noontime OH concentrations. The fetch at an observation site in central Amazonia experienced varied contributions from background regional air, urban pollution, and biomass burning. The afternoon concentrations of reactive nitrogen oxides (NOy), indicative of NOx exposure during the preceding few hours, spanned from 0.3 to 3.5 parts per billion. Accompanying the increase of NOy concentration, the inferred equivalent noontime OH concentrations increased by at least 250% from 0.6 × 10^6 to 1.6 × 10^6 cm^-3. The conclusion is that, compared to background conditions of low NOx concentrations over the Amazon forest, pollution increased NOx concentrations and amplified OH concentrations, indicating the susceptibility of the atmospheric oxidation capacity over the forest to anthropogenic influence and reinforcing the important role of NOx in sustaining OH concentrations.

  11. Quantifying uncertainty in soot volume fraction estimates using Bayesian inference of auto-correlated laser-induced incandescence measurements

    NASA Astrophysics Data System (ADS)

    Hadwin, Paul J.; Sipkens, T. A.; Thomson, K. A.; Liu, F.; Daun, K. J.

    2016-01-01

    Auto-correlated laser-induced incandescence (AC-LII) infers the soot volume fraction (SVF) of soot particles by comparing the spectral incandescence from laser-energized particles to the pyrometrically inferred peak soot temperature. This calculation requires detailed knowledge of model parameters such as the absorption function of soot, which may vary with combustion chemistry, soot age, and the internal structure of the soot. This work presents a Bayesian methodology to quantify such uncertainties. This technique treats the additional "nuisance" model parameters, including the soot absorption function, as stochastic variables and incorporates the current state of knowledge of these parameters into the inference process through maximum entropy priors. While standard AC-LII analysis provides a point estimate of the SVF, Bayesian techniques infer the posterior probability density, which will allow scientists and engineers to better assess the reliability of AC-LII inferred SVFs in the context of environmental regulations and competing diagnostics.
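
    Structurally, the approach amounts to marginalising the quantity of interest over nuisance parameters that carry their own priors. The grid-based sketch below shows that marginalisation for a single noisy measurement, with one nuisance parameter standing in for the soot absorption function; the linear forward model, noise level, and prior range are placeholders, not the LII physics or the maximum-entropy priors used in the paper.

```python
import numpy as np

# Toy posterior for a quantity of interest ("SVF") given one noisy measurement,
# marginalising over a nuisance parameter E(m) with a flat prior, the same
# structure as treating the soot absorption function as a stochastic variable.
# The forward model d = svf * E(m) plus Gaussian noise is a placeholder.
d_obs, sigma = 0.30, 0.02                       # measured signal and its noise
svf_grid = np.linspace(0.5, 2.0, 601)           # candidate SVF values (arb. units)
e_grid = np.linspace(0.2, 0.4, 201)             # nuisance parameter grid

# Likelihood p(d | svf, E) for every (svf, E) pair under Gaussian noise.
model = svf_grid[:, None] * e_grid[None, :]
like = np.exp(-0.5 * ((d_obs - model) / sigma) ** 2)

post = like.mean(axis=1)                        # marginalise E under its flat prior
post /= np.trapz(post, svf_grid)                # normalise p(svf | d)

mean = np.trapz(svf_grid * post, svf_grid)
print(f"posterior mean SVF = {mean:.3f}")
```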

  12. Functional networks inference from rule-based machine learning models.

    PubMed

    Lazzarini, Nicola; Widera, Paweł; Williamson, Stuart; Heer, Rakesh; Krasnogor, Natalio; Bacardit, Jaume

    2016-01-01

    Functional networks play an important role in the analysis of biological processes and systems. The inference of these networks from high-throughput (-omics) data is an area of intense research. So far, the similarity-based inference paradigm (e.g. gene co-expression) has been the most popular approach. It assumes a functional relationship between genes which are expressed at similar levels across different samples. An alternative to this paradigm is the inference of relationships from the structure of machine learning models. These models are able to capture complex relationships between variables, which are often different from or complementary to those found by similarity-based methods. We propose a protocol to infer functional networks from machine learning models, called FuNeL. It assumes that genes used together within a rule-based machine learning model to classify the samples might also be functionally related at a biological level. The protocol is first tested on synthetic datasets and then evaluated on a test suite of 8 real-world datasets related to human cancer. The networks inferred from the real-world data are compared against gene co-expression networks of equal size, generated with 3 different methods. The comparison is performed from two different points of view. We analyse the enriched biological terms in the set of network nodes and the relationships between known disease-associated genes in the context of the network topology. The comparison confirms both the biological relevance and the complementary character of the knowledge captured by the FuNeL networks in relation to similarity-based methods and demonstrates its potential to identify known disease associations as core elements of the network. Finally, using a prostate cancer dataset as a case study, we confirm that the biological knowledge captured by our method is relevant to the disease and consistent with the specialised literature and with an independent dataset not used in the inference process. The implementation of our network inference protocol is available at: http://ico2s.org/software/funel.html.
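
    The central assumption, that genes appearing together in the same classification rule are candidates for a functional link, translates directly into a co-occurrence graph. The sketch below builds such a graph from a handful of made-up rules; it mirrors the spirit of the protocol only and is not the FuNeL implementation linked above.

```python
from collections import Counter
from itertools import combinations

# Build a functional network from the rules of a rule-based classifier: two
# genes become linked whenever they are used together in the same rule, and
# the edge weight counts how often that happens.  The rules below are made-up
# placeholders, not output of the actual protocol.
rules = [
    {"if": ["TP53", "MYC"], "then": "tumour"},
    {"if": ["TP53", "BRCA1", "PTEN"], "then": "tumour"},
    {"if": ["GAPDH", "ACTB"], "then": "normal"},
    {"if": ["MYC", "PTEN"], "then": "tumour"},
]

edges = Counter()
for rule in rules:
    for g1, g2 in combinations(sorted(rule["if"]), 2):   # all gene pairs within a rule
        edges[(g1, g2)] += 1

for (g1, g2), weight in edges.most_common():
    print(f"{g1} -- {g2}  (weight {weight})")
```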

  13. A Note on the Equivalence between Observed and Expected Information Functions with Polytomous IRT Models

    ERIC Educational Resources Information Center

    Magis, David

    2015-01-01

    The purpose of this note is to study the equivalence of observed and expected (Fisher) information functions with polytomous item response theory (IRT) models. It is established that observed and expected information functions are equivalent for the class of divide-by-total models (including partial credit, generalized partial credit, rating…

  14. Nonparametric Bayesian inference of the microcanonical stochastic block model

    NASA Astrophysics Data System (ADS)

    Peixoto, Tiago P.

    2017-01-01

    A principled approach to characterize the hidden modular structure of networks is to formulate generative models and then infer their parameters from data. When the desired structure is composed of modules or "communities," a suitable choice for this task is the stochastic block model (SBM), where nodes are divided into groups, and the placement of edges is conditioned on the group memberships. Here, we present a nonparametric Bayesian method to infer the modular structure of empirical networks, including the number of modules and their hierarchical organization. We focus on a microcanonical variant of the SBM, where the structure is imposed via hard constraints, i.e., the generated networks are not allowed to violate the patterns imposed by the model. We show how this simple model variation allows simultaneously for two important improvements over more traditional inference approaches: (1) deeper Bayesian hierarchies, with noninformative priors replaced by sequences of priors and hyperpriors, which not only remove limitations that seriously degrade the inference on large networks but also reveal structures at multiple scales; (2) a very efficient inference algorithm that scales well not only for networks with a large number of nodes and edges but also with an unlimited number of modules. We show also how this approach can be used to sample modular hierarchies from the posterior distribution, as well as to perform model selection. We discuss and analyze the differences between sampling from the posterior and simply finding the single parameter estimate that maximizes it. Furthermore, we expose a direct equivalence between our microcanonical approach and alternative derivations based on the canonical SBM.
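    For readers who want to try this class of model, a short sketch using the graph-tool library, which implements nested microcanonical SBM inference; the function names reflect my understanding of that library's API and may differ between versions.

```python
# Minimal sketch, assuming the graph-tool library is installed; API names are
# my understanding of graph-tool and may differ between versions.
import graph_tool.all as gt

g = gt.collection.data["football"]     # example network shipped with graph-tool

# Fit the nested (hierarchical) microcanonical SBM by minimising the
# description length, i.e. maximising the posterior under hard constraints.
state = gt.minimize_nested_blockmodel_dl(g)

state.print_summary()                  # number of groups at each hierarchy level
print("description length:", state.entropy())
```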

  15. Adipose-derived stromal cells for the reconstruction of a human vesical equivalent.

    PubMed

    Rousseau, Alexandre; Fradette, Julie; Bernard, Geneviève; Gauvin, Robert; Laterreur, Véronique; Bolduc, Stéphane

    2015-11-01

    Despite a wide panel of tissue-engineering models available for vesical reconstruction, the lack of a differentiated urothelium remains their main common limitation. For the first time to our knowledge, an entirely human vesical equivalent, free of exogenous matrix, has been reconstructed using the self-assembly method. Moreover, we tested the contribution of adipose-derived stromal cells, an easily available source of mesenchymal cells featuring many potential advantages, by reconstructing three types of equivalent, named fibroblast vesical equivalent, adipose-derived stromal cell vesical equivalent and hybrid vesical equivalent--the latter containing both adipose-derived stromal cells and fibroblasts. The new substitutes have been compared and characterized for matrix composition and organization, functionality and mechanical behaviour. Although all three vesical equivalents displayed adequate collagen type I and III expression, only two of them, fibroblast vesical equivalent and hybrid vesical equivalent, sustained the development of a differentiated and functional urothelium. The presence of uroplakins Ib, II and III and the tight junction marker ZO-1 was detected and correlated with impermeability. The mechanical resistance of these tissues was sufficient for use by surgeons. We present here in vitro tissue-engineered vesical equivalents, built without the use of any exogenous matrix, able to sustain mechanical stress and to support the formation of a functional urothelium, i.e. able to display a barrier function similar to that of native tissue. Copyright © 2013 John Wiley & Sons, Ltd.

  16. Learning Quantitative Sequence-Function Relationships from Massively Parallel Experiments

    NASA Astrophysics Data System (ADS)

    Atwal, Gurinder S.; Kinney, Justin B.

    2016-03-01

    A fundamental aspect of biological information processing is the ubiquity of sequence-function relationships—functions that map the sequence of DNA, RNA, or protein to a biochemically relevant activity. Most sequence-function relationships in biology are quantitative, but only recently have experimental techniques for effectively measuring these relationships been developed. The advent of such "massively parallel" experiments presents an exciting opportunity for the concepts and methods of statistical physics to inform the study of biological systems. After reviewing these recent experimental advances, we focus on the problem of how to infer parametric models of sequence-function relationships from the data produced by these experiments. Specifically, we retrace and extend recent theoretical work showing that inference based on mutual information, not the standard likelihood-based approach, is often necessary for accurately learning the parameters of these models. Closely connected with this result is the emergence of "diffeomorphic modes"—directions in parameter space that are far less constrained by data than likelihood-based inference would suggest. Analogous to Goldstone modes in physics, diffeomorphic modes arise from an arbitrarily broken symmetry of the inference problem. An analytically tractable model of a massively parallel experiment is then described, providing an explicit demonstration of these fundamental aspects of statistical inference. This paper concludes with an outlook on the theoretical and computational challenges currently facing studies of quantitative sequence-function relationships.
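    A small Python sketch of the mutual-information-based fitting idea, on a toy model. It also illustrates the "diffeomorphic mode" point: rescaling the parameters (a monotonic reparametrisation of the prediction) leaves the mutual information essentially unchanged, while changing the relative weights does not. All data and parameter values are synthetic assumptions.

```python
# Toy illustration of mutual-information-based inference (not the paper's code).
import numpy as np
from sklearn.metrics import mutual_info_score

rng = np.random.default_rng(1)

# Toy "sequences": binary features; the measured activity is a weighted sum
# passed through an unknown monotonic nonlinearity, plus noise.
X = rng.integers(0, 2, size=(2000, 4))
true_w = np.array([1.0, 2.0, -1.0, 0.5])
measurements = np.tanh(X @ true_w) + rng.normal(0, 0.3, size=X.shape[0])

def mi_objective(w):
    """Mutual information between binned model predictions and binned measurements."""
    pred = X @ w
    p_bins = np.digitize(pred, np.quantile(pred, np.linspace(0, 1, 11)[1:-1]))
    m_bins = np.digitize(measurements, np.quantile(measurements, np.linspace(0, 1, 11)[1:-1]))
    return mutual_info_score(p_bins, m_bins)

# MI is invariant under monotonic reparametrisations of the prediction, which is
# exactly the "diffeomorphic mode" behaviour described above.
print("MI(true w)     :", round(mi_objective(true_w), 3))
print("MI(2 x true w) :", round(mi_objective(2 * true_w), 3), "(rescaling: MI unchanged)")
print("MI(reversed w) :", round(mi_objective(true_w[::-1]), 3), "(relative weights changed: MI drops)")
```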

  17. An expert system shell for inferring vegetation characteristics: Atmospheric techniques (Task G)

    NASA Technical Reports Server (NTRS)

    Harrison, P. Ann; Harrison, Patrick R.

    1993-01-01

    The NASA VEGetation Workbench (VEG) is a knowledge based system that infers vegetation characteristics from reflectance data. The VEG Subgoals have been reorganized into categories. A new subgoal category 'Atmospheric Techniques' containing two new subgoals has been implemented. The subgoal Atmospheric Passes allows the scientist to take reflectance data measured at ground level and predict what the reflectance values would be if the data were measured at a different atmospheric height. The subgoal Atmospheric Corrections allows atmospheric corrections to be made to data collected from an aircraft or by a satellite to determine what the equivalent reflectance values would be if the data were measured at ground level. The report describes the implementation and testing of the basic framework and interface for the Atmospheric Techniques Subgoals.

  18. Inference of gene regulatory networks from time series by Tsallis entropy

    PubMed Central

    2011-01-01

    Background The inference of gene regulatory networks (GRNs) from large-scale expression profiles is one of the most challenging problems of Systems Biology nowadays. Many techniques and models have been proposed for this task. However, it is not generally possible to recover the original topology with great accuracy, mainly due to the short time series data in the face of the high complexity of the networks and the intrinsic noise of the expression measurements. In order to improve the accuracy of GRNs inference methods based on entropy (mutual information), a new criterion function is here proposed. Results In this paper we introduce the use of generalized entropy proposed by Tsallis, for the inference of GRNs from time series expression profiles. The inference process is based on a feature selection approach and the conditional entropy is applied as criterion function. In order to assess the proposed methodology, the algorithm is applied to recover the network topology from temporal expressions generated by an artificial gene network (AGN) model as well as from the DREAM challenge. The adopted AGN is based on theoretical models of complex networks and its gene transference function is obtained from random drawing on the set of possible Boolean functions, thus creating its dynamics. On the other hand, DREAM time series data presents variation of network size and its topologies are based on real networks. The dynamics are generated by continuous differential equations with noise and perturbation. By adopting both data sources, it is possible to estimate the average quality of the inference with respect to different network topologies, transfer functions and network sizes. Conclusions A remarkable improvement in accuracy was observed in the experimental results, with the non-Shannon entropy reducing the number of false connections in the inferred topology. The obtained best free parameter of the Tsallis entropy was on average in the range 2.5 ≤ q ≤ 3.5 (hence, subextensive entropy), which opens new perspectives for GRNs inference methods based on information theory and for investigation of the nonextensivity of such networks. The inference algorithm and criterion function proposed here were implemented and included in the DimReduction software, which is freely available at http://sourceforge.net/projects/dimreduction and http://code.google.com/p/dimreduction/. PMID:21545720
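    A minimal sketch of the Tsallis entropy itself, the quantity used (in conditional form) as the criterion function above; the example distribution is illustrative.

```python
# Tsallis entropy of a discrete distribution. The entropic index q is the free
# parameter; q -> 1 recovers Shannon entropy.
import numpy as np

def tsallis_entropy(p, q):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):                  # Shannon limit
        return -np.sum(p * np.log(p))
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

p = [0.7, 0.2, 0.1]                          # illustrative gene-state distribution
for q in (1.0, 2.5, 3.5):                    # the reported best range was 2.5 <= q <= 3.5
    print(f"q = {q}: S_q = {tsallis_entropy(p, q):.4f}")
```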

  19. Predictive regulatory models in Drosophila melanogaster by integrative inference of transcriptional networks

    PubMed Central

    Marbach, Daniel; Roy, Sushmita; Ay, Ferhat; Meyer, Patrick E.; Candeias, Rogerio; Kahveci, Tamer; Bristow, Christopher A.; Kellis, Manolis

    2012-01-01

    Gaining insights on gene regulation from large-scale functional data sets is a grand challenge in systems biology. In this article, we develop and apply methods for transcriptional regulatory network inference from diverse functional genomics data sets and demonstrate their value for gene function and gene expression prediction. We formulate the network inference problem in a machine-learning framework and use both supervised and unsupervised methods to predict regulatory edges by integrating transcription factor (TF) binding, evolutionarily conserved sequence motifs, gene expression, and chromatin modification data sets as input features. Applying these methods to Drosophila melanogaster, we predict ∼300,000 regulatory edges in a network of ∼600 TFs and 12,000 target genes. We validate our predictions using known regulatory interactions, gene functional annotations, tissue-specific expression, protein–protein interactions, and three-dimensional maps of chromosome conformation. We use the inferred network to identify putative functions for hundreds of previously uncharacterized genes, including many in nervous system development, which are independently confirmed based on their tissue-specific expression patterns. Last, we use the regulatory network to predict target gene expression levels as a function of TF expression, and find significantly higher predictive power for integrative networks than for motif or ChIP-based networks. Our work reveals the complementarity between physical evidence of regulatory interactions (TF binding, motif conservation) and functional evidence (coordinated expression or chromatin patterns) and demonstrates the power of data integration for network inference and studies of gene regulation at the systems level. PMID:22456606

  20. Functional Inference of Complex Anatomical Tendinous Networks at a Macroscopic Scale via Sparse Experimentation

    PubMed Central

    Saxena, Anupam; Lipson, Hod; Valero-Cuevas, Francisco J.

    2012-01-01

    In systems and computational biology, much effort is devoted to functional identification of systems and networks at the molecular or cellular scale. However, similarly important networks exist at anatomical scales such as the tendon network of human fingers: the complex array of collagen fibers that transmits and distributes muscle forces to finger joints. This network is critical to the versatility of the human hand, and its function has been debated since at least the 16th century. Here, we experimentally infer the structure (both topology and parameter values) of this network through sparse interrogation with force inputs. A population of models representing this structure co-evolves in simulation with a population of informative future force inputs via the predator-prey estimation-exploration algorithm. Model fitness depends on their ability to explain experimental data, while the fitness of future force inputs depends on causing maximal functional discrepancy among current models. We validate our approach by inferring two known synthetic Latex networks, and one anatomical tendon network harvested from a cadaver's middle finger. We find that functionally similar but structurally diverse models can exist within a narrow range of the training set and cross-validation errors. For the Latex networks, models with low training set error [<4%] and resembling the known network have the smallest cross-validation errors [∼5%]. The low training set [<4%] and cross validation [<7.2%] errors for models of the cadaveric specimen demonstrate what, to our knowledge, is the first experimental inference of the functional structure of complex anatomical networks. This work expands current bioinformatics inference approaches by demonstrating that sparse, yet informative interrogation of biological specimens holds significant computational advantages in accurate and efficient inference over random testing, or assuming model topology and only inferring parameter values. These findings also hold clues to both our evolutionary history and the development of versatile machines. PMID:23144601
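    A compact, illustrative sketch of the estimation-exploration loop described above (not the authors' predator-prey implementation): candidate models are selected by how well they explain the measurements gathered so far, and the next force input is the one on which the surviving models disagree most. The hidden "network" here is a simple two-parameter function standing in for the physical specimen.

```python
import numpy as np

rng = np.random.default_rng(2)

def hidden_network(force):                 # stands in for the physical specimen
    true_params = np.array([0.8, -0.3])
    return true_params[0] * force + true_params[1] * force**2

def predict(params, force):
    return params[0] * force + params[1] * force**2

models = rng.uniform(-1, 1, size=(50, 2))  # initial random model population
data = []                                  # (force, response) pairs actually measured

for _ in range(6):
    # Estimation: keep the models that best explain the measurements so far
    if data:
        forces, resp = map(np.array, zip(*data))
        errors = [np.mean((predict(m, forces) - resp) ** 2) for m in models]
        models = models[np.argsort(errors)[:10]]
        models = np.vstack([models, models + rng.normal(0, 0.05, models.shape)])
    # Exploration: pick the force input where surviving models disagree most
    candidate_forces = np.linspace(0.0, 2.0, 41)
    spread = [np.var([predict(m, f) for m in models]) for f in candidate_forces]
    next_force = candidate_forces[int(np.argmax(spread))]
    data.append((next_force, hidden_network(next_force)))

print("best model parameters:", models[0], "(true: [0.8, -0.3])")
```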

  1. Functional inference of complex anatomical tendinous networks at a macroscopic scale via sparse experimentation.

    PubMed

    Saxena, Anupam; Lipson, Hod; Valero-Cuevas, Francisco J

    2012-01-01

    In systems and computational biology, much effort is devoted to functional identification of systems and networks at the molecular or cellular scale. However, similarly important networks exist at anatomical scales such as the tendon network of human fingers: the complex array of collagen fibers that transmits and distributes muscle forces to finger joints. This network is critical to the versatility of the human hand, and its function has been debated since at least the 16th century. Here, we experimentally infer the structure (both topology and parameter values) of this network through sparse interrogation with force inputs. A population of models representing this structure co-evolves in simulation with a population of informative future force inputs via the predator-prey estimation-exploration algorithm. Model fitness depends on their ability to explain experimental data, while the fitness of future force inputs depends on causing maximal functional discrepancy among current models. We validate our approach by inferring two known synthetic Latex networks, and one anatomical tendon network harvested from a cadaver's middle finger. We find that functionally similar but structurally diverse models can exist within a narrow range of the training set and cross-validation errors. For the Latex networks, models with low training set error [<4%] and resembling the known network have the smallest cross-validation errors [∼5%]. The low training set [<4%] and cross validation [<7.2%] errors for models of the cadaveric specimen demonstrate what, to our knowledge, is the first experimental inference of the functional structure of complex anatomical networks. This work expands current bioinformatics inference approaches by demonstrating that sparse, yet informative interrogation of biological specimens holds significant computational advantages in accurate and efficient inference over random testing, or assuming model topology and only inferring parameter values. These findings also hold clues to both our evolutionary history and the development of versatile machines.

  2. Inference of Vohradský's Models of Genetic Networks by Solving Two-Dimensional Function Optimization Problems

    PubMed Central

    Kimura, Shuhei; Sato, Masanao; Okada-Hatakeyama, Mariko

    2013-01-01

    The inference of a genetic network is a problem in which mutual interactions among genes are inferred from time-series of gene expression levels. While a number of models have been proposed to describe genetic networks, this study focuses on a mathematical model proposed by Vohradský. Because of its advantageous features, several researchers have proposed inference methods based on Vohradský's model. When trying to analyze large-scale networks consisting of dozens of genes, however, these methods must solve high-dimensional non-linear function optimization problems. In order to resolve the difficulty of estimating the parameters of Vohradský's model, this study proposes a new method that defines the problem as several two-dimensional function optimization problems. Through numerical experiments on artificial genetic network inference problems, we showed that, although the computation time of the proposed method is not the shortest, the method has the ability to estimate parameters of Vohradský's models more effectively with sufficiently short computation times. This study then applied the proposed method to an actual inference problem of the bacterial SOS DNA repair system, and succeeded in finding several reasonable regulations. PMID:24386175
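    A minimal simulation sketch of a Vohradský-type model as described above, in which each gene's synthesis rate is a sigmoid of a weighted sum of regulator levels minus linear degradation; the weights and rate constants are illustrative assumptions.

```python
import numpy as np
from scipy.integrate import odeint

W = np.array([[0.0, -2.0],      # gene 1 repressed by gene 2
              [ 3.0,  0.0]])    # gene 2 activated by gene 1
b = np.array([1.0, -1.5])       # basal activation terms
k1 = np.array([1.0, 0.8])       # maximal synthesis rates
k2 = np.array([0.5, 0.4])       # degradation rates

def vohradsky(z, t):
    # dz_i/dt = k1_i * sigmoid(sum_j W_ij z_j + b_i) - k2_i * z_i
    return k1 / (1.0 + np.exp(-(W @ z + b))) - k2 * z

t = np.linspace(0, 20, 200)
traj = odeint(vohradsky, y0=[0.1, 0.1], t=t)
print("steady-state expression levels ~", traj[-1].round(3))
```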

  3. Approximate Bayesian computation in large-scale structure: constraining the galaxy-halo connection

    NASA Astrophysics Data System (ADS)

    Hahn, ChangHoon; Vakili, Mohammadjavad; Walsh, Kilian; Hearin, Andrew P.; Hogg, David W.; Campbell, Duncan

    2017-08-01

    Standard approaches to Bayesian parameter inference in large-scale structure assume a Gaussian functional form (chi-squared form) for the likelihood. This assumption, in detail, cannot be correct. Likelihood-free inference methods such as approximate Bayesian computation (ABC) relax these restrictions and make inference possible without making any assumptions on the likelihood. Instead, ABC relies on a forward generative model of the data and a metric for measuring the distance between the model and data. In this work, we demonstrate that ABC is feasible for LSS parameter inference by using it to constrain parameters of the halo occupation distribution (HOD) model for populating dark matter haloes with galaxies. Using a specific implementation of ABC supplemented with population Monte Carlo importance sampling, a generative forward model using HOD and a distance metric based on galaxy number density, two-point correlation function and galaxy group multiplicity function, we constrain the HOD parameters of a mock observation generated from selected 'true' HOD parameters. The parameter constraints we obtain from ABC are consistent with the 'true' HOD parameters, demonstrating that ABC can be reliably used for parameter inference in LSS. Furthermore, we compare our ABC constraints to constraints we obtain using a pseudo-likelihood function of Gaussian form with MCMC and find consistent HOD parameter constraints. Ultimately, our results suggest that ABC can and should be applied in parameter inference for LSS analyses.
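    A toy Python sketch of the likelihood-free (ABC rejection) idea on a one-parameter problem; the HOD forward model and clustering summaries of the paper are replaced here by a Poisson simulator and its mean, purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

def forward_model(theta, n=500):
    # Stand-in generative model (in the paper this is an HOD-based mock catalogue)
    return rng.poisson(theta, size=n)

obs = forward_model(4.0)                         # "observed" data from a true theta = 4
obs_summary = obs.mean()                         # summary statistic (e.g. number density)

accepted = []
epsilon = 0.05                                   # distance tolerance
for _ in range(20000):
    theta = rng.uniform(0.0, 10.0)               # draw from the prior
    sim_summary = forward_model(theta).mean()
    if abs(sim_summary - obs_summary) < epsilon: # keep draws whose mocks match the data
        accepted.append(theta)

accepted = np.array(accepted)
print(f"ABC posterior: mean = {accepted.mean():.2f}, std = {accepted.std():.2f}, "
      f"n_accepted = {accepted.size}")
```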

  4. PlaNet: Combined Sequence and Expression Comparisons across Plant Networks Derived from Seven Species

    PubMed Central

    Mutwil, Marek; Klie, Sebastian; Tohge, Takayuki; Giorgi, Federico M.; Wilkins, Olivia; Campbell, Malcolm M.; Fernie, Alisdair R.; Usadel, Björn; Nikoloski, Zoran; Persson, Staffan

    2011-01-01

    The model organism Arabidopsis thaliana is readily used in basic research due to resource availability and relative speed of data acquisition. A major goal is to transfer acquired knowledge from Arabidopsis to crop species. However, the identification of functional equivalents of well-characterized Arabidopsis genes in other plants is a nontrivial task. It is well documented that transcriptionally coordinated genes tend to be functionally related and that such relationships may be conserved across different species and even kingdoms. To exploit such relationships, we constructed whole-genome coexpression networks for Arabidopsis and six important plant crop species. The interactive networks, clustered using the HCCA algorithm, are provided under the banner PlaNet (http://aranet.mpimp-golm.mpg.de). We implemented a comparative network algorithm that estimates similarities between network structures. Thus, the platform can be used to swiftly infer similar coexpressed network vicinities within and across species and can predict the identity of functional homologs. We exemplify this using the PSA-D and chalcone synthase-related gene networks. Finally, we assessed how ontology terms are transcriptionally connected in the seven species and provide the corresponding MapMan term coexpression networks. The data support the contention that this platform will considerably improve transfer of knowledge generated in Arabidopsis to valuable crop species. PMID:21441431

  5. 75 FR 7296 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-18

    ... Equivalent Global Expedited package Services 2 Negotiated Service Agreement and Application for Non-Public... Postal Service believes the instant contract is functionally equivalent to previously submitted GEPS 2... established GEPS 1 as a product, also authorized functionally equivalent agreements to be included within the...

  6. Efficient Bayesian hierarchical functional data analysis with basis function approximations using Gaussian-Wishart processes.

    PubMed

    Yang, Jingjing; Cox, Dennis D; Lee, Jong Soo; Ren, Peng; Choi, Taeryon

    2017-12-01

    Functional data are defined as realizations of random functions (mostly smooth functions) varying over a continuum, which are usually collected on discretized grids with measurement errors. In order to accurately smooth noisy functional observations and deal with the issue of high-dimensional observation grids, we propose a novel Bayesian method based on the Bayesian hierarchical model with a Gaussian-Wishart process prior and basis function representations. We first derive an induced model for the basis-function coefficients of the functional data, and then use this model to conduct posterior inference through Markov chain Monte Carlo methods. Compared to the standard Bayesian inference that suffers from a serious computational burden and instability in analyzing high-dimensional functional data, our method greatly improves the computational scalability and stability, while inheriting the advantage of simultaneously smoothing raw observations and estimating the mean-covariance functions in a nonparametric way. In addition, our method can naturally handle functional data observed on random or uncommon grids. Simulation and real-data studies demonstrate that our method produces similar results to those obtainable by the standard Bayesian inference with low-dimensional common grids, while efficiently smoothing and estimating functional data with random and high-dimensional observation grids when the standard Bayesian inference fails. In conclusion, our method can efficiently smooth and estimate high-dimensional functional data, providing one way to resolve the curse of dimensionality for Bayesian functional data analysis with Gaussian-Wishart processes. © 2017, The International Biometric Society.
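    A minimal sketch of the basis-function step described above (not the authors' Gaussian-Wishart sampler): projecting a noisy functional observation on a random, high-dimensional grid onto a small B-spline basis, so that inference can proceed on low-dimensional coefficients. Grid, noise level and knot placement are assumptions.

```python
import numpy as np
from scipy.interpolate import BSpline

rng = np.random.default_rng(4)

grid = np.sort(rng.uniform(0, 1, 300))                 # random, high-dimensional grid
truth = np.sin(2 * np.pi * grid)
y = truth + rng.normal(0, 0.2, grid.size)              # one noisy functional observation

# Cubic B-spline basis with a handful of interior knots
knots = np.concatenate(([0, 0, 0, 0], np.linspace(0.1, 0.9, 9), [1, 1, 1, 1]))
n_basis = len(knots) - 4
B = np.column_stack([BSpline.basis_element(knots[i:i + 5], extrapolate=False)(grid)
                     for i in range(n_basis)])
B = np.nan_to_num(B)                                   # basis elements are 0 outside their support

coef, *_ = np.linalg.lstsq(B, y, rcond=None)           # least-squares stand-in for the posterior mean
smooth = B @ coef
print("basis dimension:", n_basis,
      " RMSE vs truth:", round(np.sqrt(np.mean((smooth - truth) ** 2)), 3))
```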

  7. On imputing function to structure from the behavioural effects of brain lesions.

    PubMed

    Young, M P; Hilgetag, C C; Scannell, J W

    2000-01-29

    What is the link, if any, between the patterns of connections in the brain and the behavioural effects of localized brain lesions? We explored this question in four related ways. First, we investigated the distribution of activity decrements that followed simulated damage to elements of the thalamocortical network, using integrative mechanisms that have recently been used to successfully relate connection data to information on the spread of activation, and to account simultaneously for a variety of lesion effects. Second, we examined the consequences of the patterns of decrement seen in the simulation for each type of inference that has been employed to impute function to structure on the basis of the effects of brain lesions. Every variety of conventional inference, including double dissociation, readily misattributed function to structure. Third, we tried to derive a more reliable framework of inference for imputing function to structure, by clarifying concepts of function, and exploring a more formal framework, in which knowledge of connectivity is necessary but insufficient, based on concepts capable of mathematical specification. Fourth, we applied this framework to inferences about function relating to a simple network that reproduces intact, lesioned and paradoxically restored orientating behaviour. Lesion effects could be used to recover detailed and reliable information on which structures contributed to particular functions in this simple network. Finally, we explored how the effects of brain lesions and this formal approach could be used in conjunction with information from multiple neuroscience methodologies to develop a practical and reliable approach to inferring the functional roles of brain structures.

  8. Making Inferences: Comprehension of Physical Causality, Intentionality, and Emotions in Discourse by High-Functioning Older Children, Adolescents, and Adults with Autism

    ERIC Educational Resources Information Center

    Bodner, Kimberly E.; Engelhardt, Christopher R.; Minshew, Nancy J.; Williams, Diane L.

    2015-01-01

    Studies investigating inferential reasoning in autism spectrum disorder (ASD) have focused on the ability to make socially-related inferences or inferences more generally. Important variables for intervention planning such as whether inferences depend on physical experiences or the nature of social information have received less consideration. A…

  9. Inference of neuronal network spike dynamics and topology from calcium imaging data

    PubMed Central

    Lütcke, Henry; Gerhard, Felipe; Zenke, Friedemann; Gerstner, Wulfram; Helmchen, Fritjof

    2013-01-01

    Two-photon calcium imaging enables functional analysis of neuronal circuits by inferring action potential (AP) occurrence (“spike trains”) from cellular fluorescence signals. It remains unclear how experimental parameters such as signal-to-noise ratio (SNR) and acquisition rate affect spike inference and whether additional information about network structure can be extracted. Here we present a simulation framework for quantitatively assessing how well spike dynamics and network topology can be inferred from noisy calcium imaging data. For simulated AP-evoked calcium transients in neocortical pyramidal cells, we analyzed the quality of spike inference as a function of SNR and data acquisition rate using a recently introduced peeling algorithm. Given experimentally attainable values of SNR and acquisition rate, neural spike trains could be reconstructed accurately and with up to millisecond precision. We then applied statistical neuronal network models to explore how remaining uncertainties in spike inference affect estimates of network connectivity and topological features of network organization. We define the experimental conditions suitable for inferring whether the network has a scale-free structure and determine how well hub neurons can be identified. Our findings provide a benchmark for future calcium imaging studies that aim to reliably infer neuronal network properties. PMID:24399936

  10. ON GALACTIC DENSITY MODELING IN THE PRESENCE OF DUST EXTINCTION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bovy, Jo; Rix, Hans-Walter; Schlafly, Edward F.

    Inferences about the spatial density or phase-space structure of stellar populations in the Milky Way require a precise determination of the effective survey volume. The volume observed by surveys such as Gaia or near-infrared spectroscopic surveys, which have good coverage of the Galactic midplane region, is highly complex because of the abundant small-scale structure in the three-dimensional interstellar dust extinction. We introduce a novel framework for analyzing the importance of small-scale structure in the extinction. This formalism demonstrates that the spatially complex effect of extinction on the selection function of a pencil-beam or contiguous sky survey is equivalent to a low-pass filtering of the extinction-affected selection function with the smooth density field. We find that the angular resolution of current 3D extinction maps is sufficient for analyzing Gaia sub-samples of millions of stars. However, the current distance resolution is inadequate and needs to be improved by an order of magnitude, especially in the inner Galaxy. We also present a practical and efficient method for properly taking the effect of extinction into account in analyses of Galactic structure through an effective selection function. We illustrate its use with the selection function of red-clump stars in APOGEE using and comparing a variety of current 3D extinction maps.

  11. Understanding the contribution of phytoplankton phase functions to uncertainties in the water colour signal.

    PubMed

    Lain, Lisl Robertson; Bernard, Stewart; Matthews, Mark W

    2017-02-20

    The accurate description of a water body's volume scattering function (VSF), and hence its phase functions, is critical to the determination of the constituent inherent optical properties (IOPs), the associated spectral water-leaving reflectance, and consequently the retrieval of phytoplankton functional type (PFT) information. The equivalent algal populations (EAP) model has previously been evaluated for phytoplankton-dominated waters, and offers the ability to provide phytoplankton population-specific phase functions, unveiling a new opportunity to further understanding of the causality of the PFT signal. This study presents and evaluates the wavelength dependent, spectrally variable EAP particle phase functions and the subsequent effects on water-leaving reflectance. Comparisons are made with frequently used phase function approximations e.g. the Fournier Forand formulation, as well as with phase functions inferred from measured VSFs in coastal waters. Relative differences in shape and magnitude are quantified. Reflectance modelled with the EAP phase functions is then compared against measured reflectance data from phytoplankton-dominated waters. Further examples of modelled phytoplankton-dominated waters are discussed with reference to choice of phase function for two PFTs (eukaryote and prokaryote) across a range of biomass. Finally a demonstration of the sensitivity of reflectance due to the choice of phase function is presented. The EAP model phase functions account for both spectral and angular variability in phytoplankton backscattering i.e. they display variability which is both spectral and shape-related. It is concluded that phase functions modelled in this way are necessary for investigating the effects of assemblage variability on the ocean colour signal, and should be considered for model closure even in relatively low scattering conditions where phytoplankton dominate the IOPs.

  12. The Evolution of Cooperation in the Finitely Repeated Prisoner’s Dilemma

    DTIC Science & Technology

    1989-09-01

    biological evolutionary game theory, mathematical ecology (the replicator dynamics are formally equivalent to the Lotka-Volterra dynamics), and...repeated prisoner's dilemma. Under the dynamics considered, if there is convergence to a limit (in general there need not be), then that limit must...of time. It will be noted also that this same behavior can create computation problems making it imprudent in general to try to infer limiting

  13. Effects of select and reject control on equivalence class formation and transfer of function.

    PubMed

    Perez, William F; Tomanari, Gerson Y; Vaidya, Manish

    2015-09-01

    The present study used a single-subject design to evaluate the effects of select or reject control on equivalence class formation and transfer of function. Adults were exposed to a matching-to-sample task with observing requirements (MTS-OR) in order to bias the establishment of sample/S+ (select) or sample/S- (reject) relations. In Experiment 1, four sets of baseline conditional relations were taught: two under reject control (A1B2C1, A2B1C2) and two under select control (D1E1F1, D2E2F2). Participants were tested for transitivity, symmetry, equivalence and reflexivity. They also learned a simple discrimination involving one of the stimuli from the equivalence classes and were tested for the transfer of the discriminative function. In general, participants performed with high accuracy on all equivalence-related probes as well as the transfer of function probes under select control. Under reject control, participants had high scores only on the symmetry test; transfer of function was attributed to stimuli programmed as S-. In Experiment 2, the equivalence class under reject control was expanded to four members (A1B2C1D2; A2B1C2D1). Participants had high scores only on symmetry and on transitivity and equivalence tests involving two nodes. Transfer of function was extended to the programmed S- added to each class. Results from both experiments suggest that select and reject controls might differently affect the formation of equivalence classes and the transfer of stimulus functions. © Society for the Experimental Analysis of Behavior.

  14. Nonparametric Bayesian inference for mean residual life functions in survival analysis.

    PubMed

    Poynor, Valerie; Kottas, Athanasios

    2018-01-19

    Modeling and inference for survival analysis problems typically revolves around different functions related to the survival distribution. Here, we focus on the mean residual life (MRL) function, which provides the expected remaining lifetime given that a subject has survived (i.e. is event-free) up to a particular time. This function is of direct interest in reliability, medical, and actuarial fields. In addition to its practical interpretation, the MRL function characterizes the survival distribution. We develop general Bayesian nonparametric inference for MRL functions built from a Dirichlet process mixture model for the associated survival distribution. The resulting model for the MRL function admits a representation as a mixture of the kernel MRL functions with time-dependent mixture weights. This model structure allows for a wide range of shapes for the MRL function. Particular emphasis is placed on the selection of the mixture kernel, taken to be a gamma distribution, to obtain desirable properties for the MRL function arising from the mixture model. The inference method is illustrated with a data set of two experimental groups and a data set involving right censoring. The supplementary material available at Biostatistics online provides further results on empirical performance of the model, using simulated data examples. © The Author 2018. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
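    A small numerical sketch of the mean residual life function m(t) = E[T - t | T > t] computed from a survival function, here for an illustrative two-component gamma mixture of the kind used as the mixture kernel above; the weights and gamma parameters are made up.

```python
import numpy as np
from scipy.stats import gamma
from scipy.integrate import quad

# Illustrative mixture: weights and gamma(shape, scale) components (assumed values)
weights = [0.6, 0.4]
components = [gamma(a=2.0, scale=1.0), gamma(a=6.0, scale=0.8)]

def survival(t):
    return sum(w * comp.sf(t) for w, comp in zip(weights, components))

def mrl(t, upper=80.0):
    # m(t) = (integral_t^inf S(u) du) / S(t)
    integral, _ = quad(survival, t, upper)
    return integral / survival(t)

for t in (0.0, 2.0, 5.0):
    print(f"m({t}) = {mrl(t):.3f}")
```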

  15. Strategies of readers with autism when responding to inferential questions: An eye-movement study.

    PubMed

    Micai, Martina; Joseph, Holly; Vulchanova, Mila; Saldaña, David

    2017-05-01

    Previous research suggests that individuals with autism spectrum disorder (ASD) have difficulties with inference generation in reading tasks. However, most previous studies have examined how well children understand a text after reading or have measured on-line reading behavior without response to questions. The aim of this study was to investigate the online strategies of children and adolescents with autism while reading and simultaneously responding to questions, by monitoring their eye movements. The reading behavior of participants with ASD was compared with that of age-, language-, nonverbal intelligence-, reading-, and receptive language skills-matched participants without ASD (control group). The results showed that the ASD group was as accurate as the control group in generating inferences when answering questions about the short texts, and no differences were found between the two groups in the global paragraph reading and responding times. However, the ASD group displayed longer gaze latencies on a target word necessary to produce an inference. They also showed more regressions into the word that supported the inference compared to the control group after reading the question, irrespective of whether an inference was required or not. In conclusion, the ASD group achieved an equivalent level of inferential comprehension, but showed subtle differences in reading comprehension strategies compared to the control group. Autism Res 2017, 10: 888-900. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  16. Developing animals flout prominent assumptions of ecological physiology.

    PubMed

    Burggren, Warren W

    2005-08-01

    Every field of biology has its assumptions, but when they grow to be dogma, they can become constraining. This essay presents data-based challenges to several prominent assumptions of developmental physiologists. The ubiquity of allometry is such an assumption, yet animal development is characterized by rate changes that are counter to allometric predictions. Physiological complexity is assumed to increase with development, but examples are provided showing that complexity can be greatest at intermediate developmental stages. It is assumed that organs have functional equivalency in embryos and adults, yet embryonic structures can have quite different functions than inferred from adults. Another assumption challenged is the duality of neural control (typically sympathetic and parasympathetic), since one of these two regulatory mechanisms typically considerably precedes in development the appearance of the other. A final assumption challenged is the notion that divergent phylogeny creates divergent physiologies in embryos just as in adults, when in fact early in development disparate vertebrate taxa show great quantitative as well as qualitative similarity. Collectively, the inappropriateness of these prominent assumptions based on adult studies suggests that investigation of embryos, larvae and fetuses be conducted with appreciation for their potentially unique physiologies.

  17. Does Exercise Improve Cognitive Performance? A Conservative Message from Lord's Paradox.

    PubMed

    Liu, Sicong; Lebeau, Jean-Charles; Tenenbaum, Gershon

    2016-01-01

    Although extant meta-analyses support the notion that exercise results in cognitive performance enhancement, methodological shortcomings are noted among the primary evidence. The present study examined relevant randomized controlled trials (RCTs) published in the past 20 years (1996-2015) for methodological concerns arising from Lord's paradox. Our analysis revealed that RCTs supporting the positive effect of exercise on cognition are likely to include Type I error(s). This result can be attributed to the use of gain score analysis on pretest-posttest data as well as the presence of control group superiority over the exercise group on baseline cognitive measures. To improve the accuracy of causal inferences in this area, analysis of covariance on pretest-posttest data is recommended under the assumption of group equivalence. Important experimental procedures are discussed to maintain group equivalence.
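    A short simulation sketch of the statistical point above: with a chance baseline imbalance and regression to the mean, a gain-score analysis can indicate a "treatment effect" that an ANCOVA-style adjustment for the pretest does not, even when no true effect exists. The effect sizes and sample sizes are arbitrary assumptions.

```python
# Data are simulated with NO true treatment effect.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(5)
n = 200

# Chance imbalance: the control group happens to score higher at baseline
pre_exercise = rng.normal(-0.3, 1.0, n)
pre_control = rng.normal(0.3, 1.0, n)
post_exercise = 0.5 * pre_exercise + rng.normal(0, 1.0, n)   # regression to the mean
post_control = 0.5 * pre_control + rng.normal(0, 1.0, n)

group = np.r_[np.ones(n), np.zeros(n)]                       # 1 = exercise, 0 = control
pre = np.r_[pre_exercise, pre_control]
post = np.r_[post_exercise, post_control]

# Gain-score analysis: regress (post - pre) on group
gain_fit = sm.OLS(post - pre, sm.add_constant(group)).fit()
# ANCOVA-style analysis: regress post on group, adjusting for pre
ancova_fit = sm.OLS(post, sm.add_constant(np.column_stack([group, pre]))).fit()

print("gain-score group effect p-value :", round(gain_fit.pvalues[1], 4))
print("ANCOVA group effect p-value     :", round(ancova_fit.pvalues[1], 4))
```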

  18. Snow water equivalent determination by microwave radiometry

    NASA Technical Reports Server (NTRS)

    Chang, A. T. C.; Foster, J. L.; Hall, D. K.; Rango, A.; Hartline, B. K.

    1981-01-01

    One of the most important parameters for accurate snowmelt runoff prediction is snow water equivalent (SWE), which is conventionally monitored using observations made at widely scattered points in or around specific watersheds. Remote sensors, which provide data with better spatial and temporal coverage, can be used to improve the SWE estimates. Microwave radiation, which can penetrate through a snowpack, may be used to infer the SWE. Calculations made from a microscopic scattering model were used to simulate the effect of varying SWE on the microwave brightness temperature. Data obtained from truck-mounted, airborne and spaceborne systems from various test sites were studied. The simulated SWE compares favorably with the measured SWE. In addition, whether the underlying soil is frozen or thawed can be discriminated successfully on the basis of the polarization of the microwave radiation.
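    A minimal sketch of the retrieval idea: deeper (higher-SWE) dry snow scatters more strongly at ~37 GHz than at ~18 GHz, so the brightness-temperature difference maps to SWE. The linear coefficient below follows the often-quoted Chang-type retrieval but should be treated as an illustrative assumption rather than the calibration used in this work.

```python
def swe_mm(tb_18h_K, tb_37h_K, coefficient=4.8):
    """Estimate snow water equivalent in mm from horizontally polarised
    brightness temperatures (K) at ~18 and ~37 GHz (coefficient is illustrative)."""
    delta = tb_18h_K - tb_37h_K
    return max(coefficient * delta, 0.0)   # negative differences imply no dry-snow signal

print(swe_mm(245.0, 225.0), "mm SWE for a 20 K brightness-temperature difference")
```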

  19. Remote sensing: Snow monitoring tool for today and tomorrow

    NASA Technical Reports Server (NTRS)

    Rango, A.

    1977-01-01

    Various types of remote sensing are now available or will be in the future for snowpack monitoring. Aircraft reconnaissance is now used in a conventional manner by various water resources agencies to obtain information on snowlines, depth, and melting of the snowpack for forecasting purposes. The use of earth resources satellites for mapping snowcovered area, snowlines, and changes in snowcover during the spring has increased during the last five years. Gamma ray aircraft flights, although confined to an extremely low altitude, provide a means for obtaining valuable information on snow water equivalent. The most recently developed remote sensing technology for snow, namely, microwave monitoring, has provided initial results that may eventually allow us to infer snow water equivalent or depth, snow wetness, and the hydrologic condition of the underlying soil.

  20. Pythran: enabling static optimization of scientific Python programs

    NASA Astrophysics Data System (ADS)

    Guelton, Serge; Brunet, Pierrick; Amini, Mehdi; Merlini, Adrien; Corbillon, Xavier; Raynaud, Alan

    2015-01-01

    Pythran is an open source static compiler that turns modules written in a subset of Python language into native ones. Assuming that scientific modules do not rely much on the dynamic features of the language, it trades them for powerful, possibly inter-procedural, optimizations. These optimizations include detection of pure functions, temporary allocation removal, constant folding, Numpy ufunc fusion and parallelization, explicit thread-level parallelism through OpenMP annotations, false variable polymorphism pruning, and automatic vector instruction generation such as AVX or SSE. In addition to these compilation steps, Pythran provides a C++ runtime library that leverages the C++ STL to provide generic containers, and the Numeric Template Toolbox for Numpy support. It takes advantage of modern C++11 features such as variadic templates, type inference, move semantics and perfect forwarding, as well as classical idioms such as expression templates. Unlike the Cython approach, Pythran input code remains compatible with the Python interpreter. Output code is generally as efficient as the annotated Cython equivalent, if not more, but without the backward compatibility loss.
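    A small example of the kind of module Pythran targets; the export comment is Pythran's annotation syntax as I understand it, and the function itself is plain Python/NumPy that still runs unmodified in the CPython interpreter.

```python
#pythran export pairwise_sq_dist(float64[:,:])
import numpy as np

def pairwise_sq_dist(points):
    """Squared Euclidean distances between all rows of `points`."""
    diff = points[:, None, :] - points[None, :, :]
    return (diff ** 2).sum(axis=-1)

if __name__ == "__main__":
    pts = np.random.rand(4, 3)
    print(pairwise_sq_dist(pts).shape)   # (4, 4); compile with: pythran this_module.py
```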

  1. Positional orthology: putting genomic evolutionary relationships into context.

    PubMed

    Dewey, Colin N

    2011-09-01

    Orthology is a powerful refinement of homology that allows us to describe more precisely the evolution of genomes and understand the function of the genes they contain. However, because orthology is not concerned with genomic position, it is limited in its ability to describe genes that are likely to have equivalent roles in different genomes. Because of this limitation, the concept of 'positional orthology' has emerged, which describes the relation between orthologous genes that retain their ancestral genomic positions. In this review, we formally define this concept, for which we introduce the shorter term 'toporthology', with respect to the evolutionary events experienced by a gene's ancestors. Through a discussion of recent studies on the role of genomic context in gene evolution, we show that the distinction between orthology and toporthology is biologically significant. We then review a number of orthology prediction methods that take genomic context into account and thus that may be used to infer the important relation of toporthology.

  2. Positional orthology: putting genomic evolutionary relationships into context

    PubMed Central

    2011-01-01

    Orthology is a powerful refinement of homology that allows us to describe more precisely the evolution of genomes and understand the function of the genes they contain. However, because orthology is not concerned with genomic position, it is limited in its ability to describe genes that are likely to have equivalent roles in different genomes. Because of this limitation, the concept of ‘positional orthology’ has emerged, which describes the relation between orthologous genes that retain their ancestral genomic positions. In this review, we formally define this concept, for which we introduce the shorter term ‘toporthology’, with respect to the evolutionary events experienced by a gene’s ancestors. Through a discussion of recent studies on the role of genomic context in gene evolution, we show that the distinction between orthology and toporthology is biologically significant. We then review a number of orthology prediction methods that take genomic context into account and thus that may be used to infer the important relation of toporthology. PMID:21705766

  3. 3D structure of macropore networks within natural and de-embanked estuary saltmarsh sediments: towards an improved understanding of network structural control over hydrologic function

    NASA Astrophysics Data System (ADS)

    Carr, Simon; Spencer, Kate; Tempest, James; Diggens, Lucy

    2015-04-01

    Saltmarshes are globally important environments which, though occupying < 4% of the Earth's surface, provide a range of ecosystem services. Yet, they are threatened by sea level rise, human population growth, urbanization and pollution resulting in degradation. To compensate for this habitat loss many coastal restoration projects have been implemented over the last few decades, largely driven by legislative requirements for improved biodiversity e.g. the EU Habitats Directive and Birds Directive. However, there is growing evidence that restored saltmarshes, recreated through the return to tidal inundation of previously drained and defended low-lying coastal land, do not have the same species composition even after 100 years; while environmental enhancement has been achieved, there may be consequences for ecosystem functioning. This study presents the findings of a comparative analysis of detailed sediment structure and hydrological functioning of equivalent natural and de-embanked saltmarsh sediments at Orplands Farm, Essex, UK. 3D x-ray CT scanning of triplicate undisturbed sediment cores recovered in 2013 has been used to derive detailed volumetric reconstructions of macropore structure and networks, and to infer differences in bulk microporosity between natural and de-embanked saltmarshes. These volumes have been further visualised for qualitative analysis of the main sediment components, and extraction of key macropore space parameters for quantified analysis including total porosity and connectivity, as well as structure, organisation and efficiency (tortuosity) of macropore networks. Although total porosity was significantly greater within the de-embanked saltmarsh sediments, pore networks in these samples were less organised and more tortuous, and were also inferred to have significantly lower micro-porosity than those of the natural saltmarsh. These datasets are applied to explain significant differences in the hydraulic behaviour and functioning observed between natural and de-embanked saltmarsh at Orplands. Piezometer wells and pressure transducers recorded fluctuations in water level at 15 minute intervals over a 4.5 month period (winter 2011-2012). Basic patterns for water level fluctuations in both the natural and de-embanked saltmarsh are similar and reflect tidal flooding. However, in the de-embanked saltmarsh, water levels are higher and less responsive to tidal flooding.

  4. Genome size, cytogenetic data and transferability of EST-SSRs markers in wild and cultivated species of the genus Theobroma L. (Byttnerioideae, Malvaceae)

    PubMed Central

    da Silva, Rangeline Azevedo; Souza, Gustavo; Lemos, Lívia Santos Lima; Lopes, Uilson Vanderlei; Patrocínio, Nara Geórgia Ribeiro Braz; Alves, Rafael Moysés; Marcellino, Lucília Helena; Clement, Didier; Micheli, Fabienne

    2017-01-01

    The genus Theobroma comprises several tree species native to the Amazon. Theobroma cacao L. plays a key economic role mainly in the chocolate industry. Both cultivated and wild forms are described within the genus. Variations in genome size and chromosome number have been used for prediction purposes including the frequency of interspecific hybridization or inference about evolutionary relationships. In this study, the nuclear DNA content, karyotype and genetic diversity using functional microsatellites (EST-SSR) of seven Theobroma species were characterized. The nuclear content of DNA for all analyzed Theobroma species was 1C = ~ 0.46 pg. These species presented 2n = 20 with small chromosomes and only one pair of terminal heterochromatic bands positively stained (CMA+/DAPI− bands). The small size of Theobroma spp. genomes was equivalent to other Byttnerioideae species, suggesting that the basal lineage of Malvaceae have smaller genomes and that there was an expansion of 2C values in the more specialized family clades. A set of 20 EST-SSR primers were characterized for related species of Theobroma, in which 12 loci were polymorphic. The polymorphism information content (PIC) ranged from 0.23 to 0.65, indicating a high level of information per locus. Combined results of flow cytometry, cytogenetic data and EST-SSRs markers will contribute to better describe the species and infer about the evolutionary relationships among Theobroma species. In addition, the importance of a core collection for conservation purposes is highlighted. PMID:28187131
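    A minimal sketch of the polymorphism information content (PIC) statistic reported above, using the standard Botstein-style formula; the allele frequencies are illustrative.

```python
from itertools import combinations

def pic(allele_freqs):
    """PIC for one marker locus from its allele frequencies."""
    het = 1.0 - sum(p ** 2 for p in allele_freqs)
    correction = sum(2 * p_i ** 2 * p_j ** 2 for p_i, p_j in combinations(allele_freqs, 2))
    return het - correction

print(round(pic([0.5, 0.5]), 3))            # two equally frequent alleles -> 0.375
print(round(pic([0.4, 0.3, 0.2, 0.1]), 3))  # more alleles, higher information content
```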

  5. Genome size, cytogenetic data and transferability of EST-SSRs markers in wild and cultivated species of the genus Theobroma L. (Byttnerioideae, Malvaceae).

    PubMed

    da Silva, Rangeline Azevedo; Souza, Gustavo; Lemos, Lívia Santos Lima; Lopes, Uilson Vanderlei; Patrocínio, Nara Geórgia Ribeiro Braz; Alves, Rafael Moysés; Marcellino, Lucília Helena; Clement, Didier; Micheli, Fabienne; Gramacho, Karina Peres

    2017-01-01

    The genus Theobroma comprises several tree species native to the Amazon. Theobroma cacao L. plays a key economic role mainly in the chocolate industry. Both cultivated and wild forms are described within the genus. Variations in genome size and chromosome number have been used for prediction purposes including the frequency of interspecific hybridization or inference about evolutionary relationships. In this study, the nuclear DNA content, karyotype and genetic diversity using functional microsatellites (EST-SSR) of seven Theobroma species were characterized. The nuclear content of DNA for all analyzed Theobroma species was 1C = ~ 0.46 pg. These species presented 2n = 20 with small chromosomes and only one pair of terminal heterochromatic bands positively stained (CMA+/DAPI- bands). The small size of Theobroma spp. genomes was equivalent to other Byttnerioideae species, suggesting that the basal lineage of Malvaceae have smaller genomes and that there was an expansion of 2C values in the more specialized family clades. A set of 20 EST-SSR primers were characterized for related species of Theobroma, in which 12 loci were polymorphic. The polymorphism information content (PIC) ranged from 0.23 to 0.65, indicating a high level of information per locus. Combined results of flow cytometry, cytogenetic data and EST-SSRs markers will contribute to better describe the species and infer about the evolutionary relationships among Theobroma species. In addition, the importance of a core collection for conservation purposes is highlighted.

  6. Quantifying crustal thickness over time in magmatic arcs

    NASA Astrophysics Data System (ADS)

    Profeta, Lucia; Ducea, Mihai N.; Chapman, James B.; Paterson, Scott R.; Gonzales, Susana Marisol Henriquez; Kirsch, Moritz; Petrescu, Lucian; Decelles, Peter G.

    2015-12-01

    We present global and regional correlations between whole-rock values of Sr/Y and La/Yb and crustal thickness for intermediate rocks from modern subduction-related magmatic arcs formed around the Pacific. These correlations bolster earlier ideas that various geochemical parameters can be used to track changes of crustal thickness through time in ancient subduction systems. Inferred crustal thicknesses using our proposed empirical fits are consistent with independent geologic constraints for the Cenozoic evolution of the central Andes, as well as various Mesozoic magmatic arc segments currently exposed in the Coast Mountains, British Columbia, and the Sierra Nevada and Mojave-Transverse Range regions of California. We propose that these geochemical parameters can be used, when averaged over the typical lifetimes and spatial footprints of composite volcanoes and their intrusive equivalents to infer crustal thickness changes over time in ancient orogens.
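    A minimal sketch of how such an empirical calibration is applied in practice: average the whole-rock ratio over an arc segment and map it through a fitted curve to crustal thickness. The logarithmic form and coefficients below are placeholders, not the published calibration; the paper's fit parameters should be used for real work.

```python
import numpy as np

def crustal_thickness_km(sr_y_values, a=20.0, b=-25.0):
    """Hypothetical logarithmic calibration: thickness = a * ln(median Sr/Y) + b."""
    median_ratio = np.median(np.asarray(sr_y_values, dtype=float))
    return a * np.log(median_ratio) + b

# Whole-rock Sr/Y values averaged over one arc segment (illustrative numbers)
print(round(crustal_thickness_km([28.0, 35.0, 31.0, 40.0]), 1), "km")
```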

  7. Quantifying crustal thickness over time in magmatic arcs

    PubMed Central

    Profeta, Lucia; Ducea, Mihai N.; Chapman, James B.; Paterson, Scott R.; Gonzales, Susana Marisol Henriquez; Kirsch, Moritz; Petrescu, Lucian; DeCelles, Peter G.

    2015-01-01

    We present global and regional correlations between whole-rock values of Sr/Y and La/Yb and crustal thickness for intermediate rocks from modern subduction-related magmatic arcs formed around the Pacific. These correlations bolster earlier ideas that various geochemical parameters can be used to track changes of crustal thickness through time in ancient subduction systems. Inferred crustal thicknesses using our proposed empirical fits are consistent with independent geologic constraints for the Cenozoic evolution of the central Andes, as well as various Mesozoic magmatic arc segments currently exposed in the Coast Mountains, British Columbia, and the Sierra Nevada and Mojave-Transverse Range regions of California. We propose that these geochemical parameters can be used, when averaged over the typical lifetimes and spatial footprints of composite volcanoes and their intrusive equivalents to infer crustal thickness changes over time in ancient orogens. PMID:26633804

  8. Infrared photometry of the dwarf nova V2051 Ophiuchi - I. The mass-donor star and the distance

    NASA Astrophysics Data System (ADS)

    Wojcikiewicz, Eduardo; Baptista, Raymundo; Ribeiro, Tiago

    2018-04-01

    We report the analysis of time series of infrared JHKs photometry of the dwarf nova V2051 Oph in quiescence. We modelled the ellipsoidal variations caused by the distorted mass-donor star to infer its JHKs fluxes. From its infrared colours, we estimate a spectral type of M(8.0 ± 1.5) and an equivalent blackbody temperature of TBB = (2700 ± 270) K. We used the Barnes & Evans relation to infer a photometric parallax distance of dBE = (102 ± 16) pc to the binary. At this short distance, the corresponding accretion disc temperatures in outburst are too low to be explained by the disc-instability model for dwarf nova outbursts, underscoring a previous suggestion that the outbursts of this binary are powered by mass-transfer bursts.

  9. MicroRNA-Target Network Inference and Local Network Enrichment Analysis Identify Two microRNA Clusters with Distinct Functions in Head and Neck Squamous Cell Carcinoma

    PubMed Central

    Sass, Steffen; Pitea, Adriana; Unger, Kristian; Hess, Julia; Mueller, Nikola S.; Theis, Fabian J.

    2015-01-01

    MicroRNAs represent ~22 nt long endogenous small RNA molecules that have been experimentally shown to regulate gene expression post-transcriptionally. One main interest in miRNA research is the investigation of their functional roles, which can typically be accomplished by identification of mi-/mRNA interactions and functional annotation of target gene sets. We here present a novel method “miRlastic”, which infers miRNA-target interactions using transcriptomic data as well as prior knowledge and performs functional annotation of target genes by exploiting the local structure of the inferred network. For the network inference, we applied linear regression modeling with elastic net regularization on matched microRNA and messenger RNA expression profiling data to perform feature selection on prior knowledge from sequence-based target prediction resources. The novelty of miRlastic inference originates in predicting data-driven intra-transcriptome regulatory relationships through feature selection. With synthetic data, we showed that miRlastic outperformed commonly used methods and was suitable even for low sample sizes. To gain insight into the functional role of miRNAs and to determine joint functional properties of miRNA clusters, we introduced a local enrichment analysis procedure. The principle of this procedure lies in identifying regions of high functional similarity by evaluating the shortest paths between genes in the network. We can finally assign functional roles to the miRNAs by taking their regulatory relationships into account. We thoroughly evaluated miRlastic on a cohort of head and neck cancer (HNSCC) patients provided by The Cancer Genome Atlas. We inferred an mi-/mRNA regulatory network for human papilloma virus (HPV)-associated miRNAs in HNSCC. The resulting network best enriched for experimentally validated miRNA-target interaction, when compared to common methods. Finally, the local enrichment step identified two functional clusters of miRNAs that were predicted to mediate HPV-associated dysregulation in HNSCC. Our novel approach was able to characterize distinct pathway regulations from matched miRNA and mRNA data. An R package of miRlastic was made available through: http://icb.helmholtz-muenchen.de/mirlastic. PMID:26694379

  10. MicroRNA-Target Network Inference and Local Network Enrichment Analysis Identify Two microRNA Clusters with Distinct Functions in Head and Neck Squamous Cell Carcinoma.

    PubMed

    Sass, Steffen; Pitea, Adriana; Unger, Kristian; Hess, Julia; Mueller, Nikola S; Theis, Fabian J

    2015-12-18

    MicroRNAs represent ~22 nt long endogenous small RNA molecules that have been experimentally shown to regulate gene expression post-transcriptionally. One main interest in miRNA research is the investigation of their functional roles, which can typically be accomplished by identification of mi-/mRNA interactions and functional annotation of target gene sets. We here present a novel method "miRlastic", which infers miRNA-target interactions using transcriptomic data as well as prior knowledge and performs functional annotation of target genes by exploiting the local structure of the inferred network. For the network inference, we applied linear regression modeling with elastic net regularization on matched microRNA and messenger RNA expression profiling data to perform feature selection on prior knowledge from sequence-based target prediction resources. The novelty of miRlastic inference originates in predicting data-driven intra-transcriptome regulatory relationships through feature selection. With synthetic data, we showed that miRlastic outperformed commonly used methods and was suitable even for low sample sizes. To gain insight into the functional role of miRNAs and to determine joint functional properties of miRNA clusters, we introduced a local enrichment analysis procedure. The principle of this procedure lies in identifying regions of high functional similarity by evaluating the shortest paths between genes in the network. We can finally assign functional roles to the miRNAs by taking their regulatory relationships into account. We thoroughly evaluated miRlastic on a cohort of head and neck cancer (HNSCC) patients provided by The Cancer Genome Atlas. We inferred an mi-/mRNA regulatory network for human papilloma virus (HPV)-associated miRNAs in HNSCC. The resulting network best enriched for experimentally validated miRNA-target interaction, when compared to common methods. Finally, the local enrichment step identified two functional clusters of miRNAs that were predicted to mediate HPV-associated dysregulation in HNSCC. Our novel approach was able to characterize distinct pathway regulations from matched miRNA and mRNA data. An R package of miRlastic was made available through: http://icb.helmholtz-muenchen.de/mirlastic.

  11. 78 FR 8090 - Misuse of Internet Protocol (IP) Captioned Telephone Service; Telecommunications Relay Services...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-05

    ... communicate in a manner that is functionally equivalent to communication by conventional voice telephone users... actually need the service to communicate in a manner that is functionally equivalent to communication by... equivalent to communication by conventional voice telephone users. In fact, the unobtrusive nature of IP CTS...

  12. Isoprene photo-oxidation products quantify the effect of pollution on hydroxyl radicals over Amazonia

    DOE PAGES

    Liu, Yingjun; Seco, Roger; Kim, Saewung; ...

    2018-04-11

    Nitrogen oxides (NOx) emitted from human activities are believed to regulate the atmospheric oxidation capacity of the troposphere. However, observational evidence is limited for the low-to-median NOx concentrations prevalent outside of polluted regions. Directly measuring oxidation capacity, represented primarily by hydroxyl radicals (OH), is challenging, and the span in NOx concentrations at a single observation site is often not wide. Concentrations of isoprene and its photo-oxidation products were used to infer the equivalent noontime OH concentrations. The fetch at an observation site in central Amazonia experienced varied contributions from background regional air, urban pollution, and biomass burning. The afternoon concentrations of reactive nitrogen oxides (NOy), indicative of NOx exposure during the preceding few hours, spanned from 0.3 to 3.5 parts per billion. Accompanying the increase of NOy concentration, the inferred equivalent noontime OH concentrations increased by at least 250% from 0.6 × 10⁶ to 1.6 × 10⁶ cm⁻³. The conclusion is that, compared to background conditions of low NOx concentrations over the Amazon forest, pollution increased NOx concentrations and amplified OH concentrations, indicating the susceptibility of the atmospheric oxidation capacity over the forest to anthropogenic influence and reinforcing the important role of NOx in sustaining OH concentrations.
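
    One way to see how an "equivalent OH concentration" can be backed out of such measurements is to invert a two-step first-order chain (isoprene to oxidation products to further loss) for the OH exposure that reproduces an observed product-to-isoprene ratio, then divide by an assumed photochemical ageing time. In the sketch below the rate constants, the observed ratio and the ageing time are nominal, illustrative values, not those used in the study.

```python
# Hedged sketch: infer an equivalent OH concentration from the ratio of
# isoprene oxidation products to isoprene. All numbers are illustrative.
import numpy as np
from scipy.optimize import brentq

k1 = 1.0e-10   # OH + isoprene rate constant [cm^3 s^-1], nominal value
k2 = 2.5e-11   # OH + products (e.g. MVK/MACR) rate constant, nominal value

def product_to_isoprene_ratio(x):
    """Ratio P/I after OH exposure x = [OH]*t for the chain I -> P -> loss."""
    return k1 / (k2 - k1) * (1.0 - np.exp(-(k2 - k1) * x))

ratio_obs = 0.6          # illustrative observed product/isoprene ratio
t_age = 2.0 * 3600.0     # assumed photochemical ageing time [s]

# Solve for the OH exposure that matches the observed ratio, then divide by t_age.
x = brentq(lambda x: product_to_isoprene_ratio(x) - ratio_obs, 1e6, 1e12)
print(f"equivalent [OH] ~ {x / t_age:.2e} molecules cm^-3")
```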

  13. Isoprene photo-oxidation products quantify the effect of pollution on hydroxyl radicals over Amazonia

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yingjun; Seco, Roger; Kim, Saewung

    Nitrogen oxides (NOx) emitted from human activities are believed to regulate the atmospheric oxidation capacity of the troposphere. However, observational evidence is limited for the low-to-median NOx concentrations prevalent outside of polluted regions. Directly measuring oxidation capacity, represented primarily by hydroxyl radicals (OH), is challenging, and the span in NOx concentrations at a single observation site is often not wide. Concentrations of isoprene and its photo-oxidation products were used to infer the equivalent noontime OH concentrations. The fetch at an observation site in central Amazonia experienced varied contributions from background regional air, urban pollution, and biomass burning. The afternoon concentrations of reactive nitrogen oxides (NOy), indicative of NOx exposure during the preceding few hours, spanned from 0.3 to 3.5 parts per billion. Accompanying the increase of NOy concentration, the inferred equivalent noontime OH concentrations increased by at least 250% from 0.6 × 10⁶ to 1.6 × 10⁶ cm⁻³. The conclusion is that, compared to background conditions of low NOx concentrations over the Amazon forest, pollution increased NOx concentrations and amplified OH concentrations, indicating the susceptibility of the atmospheric oxidation capacity over the forest to anthropogenic influence and reinforcing the important role of NOx in sustaining OH concentrations.

  14. Robust functional regression model for marginal mean and subject-specific inferences.

    PubMed

    Cao, Chunzheng; Shi, Jian Qing; Lee, Youngjo

    2017-01-01

    We introduce flexible robust functional regression models, using various heavy-tailed processes, including a Student t-process. We propose efficient algorithms in estimating parameters for the marginal mean inferences and in predicting conditional means as well as interpolation and extrapolation for the subject-specific inferences. We develop bootstrap prediction intervals (PIs) for conditional mean curves. Numerical studies show that the proposed model provides a robust approach against data contamination or distribution misspecification, and the proposed PIs maintain the nominal confidence levels. A real data application is presented as an illustrative example.
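
    The central idea here, replacing Gaussian errors with a heavy-tailed Student-t model so that contaminated observations carry less weight, can be illustrated with a far simpler scalar sketch than the functional model of the paper: maximising a Student-t log-likelihood and comparing the fit with ordinary least squares on contaminated data. The basis functions, degrees of freedom and data below are placeholders.

```python
# Hedged sketch: heavy-tailed (Student-t) regression vs. least squares on contaminated data.
# This is a toy scalar illustration, not the functional regression model of the paper.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 80)
y = np.sin(2 * np.pi * t) + rng.normal(scale=0.1, size=t.size)
y[::10] += 3.0                                   # contaminate every 10th observation

X = np.column_stack([np.ones_like(t), np.sin(2 * np.pi * t), np.cos(2 * np.pi * t)])

def neg_t_loglik(params, nu=4.0):                # nu: assumed degrees of freedom
    beta, log_scale = params[:3], params[3]
    resid = y - X @ beta
    return -np.sum(stats.t.logpdf(resid, df=nu, scale=np.exp(log_scale)))

beta_ols = np.linalg.lstsq(X, y, rcond=None)[0]
res = optimize.minimize(neg_t_loglik, x0=np.r_[beta_ols, 0.0], method="Nelder-Mead")
beta_t = res.x[:3]

print("OLS coefficients      :", np.round(beta_ols, 2))
print("Student-t coefficients:", np.round(beta_t, 2))   # less pulled by the outliers
```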

  15. Pipeline for inferring protein function from dynamics using coarse-grained molecular mechanics forcefield.

    PubMed

    Bhadra, Pratiti; Pal, Debnath

    2017-04-01

    Dynamics is integral to the function of proteins, yet the use of molecular dynamics (MD) simulation as a technique remains under-explored for molecular function inference. This is more important in the context of genomics projects where novel proteins are determined with limited evolutionary information. Recently we developed a method to match the query protein's flexible segments to infer function using a novel approach combining analysis of residue fluctuation-graphs and auto-correlation vectors derived from coarse-grained (CG) MD trajectory. The method was validated on a diverse dataset with sequence identity between proteins as low as 3%, with high function-recall rates. Here we share its implementation as a publicly accessible web service, named DynFunc (Dynamics Match for Function) to query protein function from ≥1 µs long CG dynamics trajectory information of protein subunits. Users are provided with the custom-developed coarse-grained molecular mechanics (CGMM) forcefield to generate the MD trajectories for their protein of interest. On upload of trajectory information, the DynFunc web server identifies specific flexible regions of the protein linked to putative molecular function. Our unique application does not use evolutionary information to infer molecular function from MD information and can, therefore, work for all proteins, including moonlighting and the novel ones, whenever structural information is available. Our pipeline is expected to be of utility to all structural biologists working with novel proteins and interested in moonlighting functions. Copyright © 2017 Elsevier Ltd. All rights reserved.
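
    To give a concrete feel for the kind of trajectory-derived descriptors mentioned (residue fluctuations and auto-correlation vectors), the sketch below computes both from a coarse-grained trajectory stored as a NumPy array. The array shapes, lag grid and random-walk trajectory are assumptions for illustration; the actual DynFunc pipeline and its fluctuation-graph analysis are considerably more involved.

```python
# Hedged sketch: per-residue fluctuation auto-correlation from a CG trajectory.
# A random-walk trajectory stands in for a real (>=1 us) coarse-grained MD trajectory.
import numpy as np

rng = np.random.default_rng(2)
n_frames, n_residues = 5000, 120
traj = np.cumsum(rng.normal(scale=0.05, size=(n_frames, n_residues, 3)), axis=0)

# Fluctuations about the mean structure (no alignment, for brevity).
fluct = traj - traj.mean(axis=0)
rmsf = np.sqrt((fluct ** 2).sum(axis=2).mean(axis=0))        # per-residue RMSF

def autocorr(x, lags):
    """Normalised auto-correlation of a 1-D fluctuation signal at the given lags."""
    x = x - x.mean()
    var = np.dot(x, x)
    return np.array([np.dot(x[:-lag], x[lag:]) / var for lag in lags])

lags = [1, 10, 50, 100, 500]
# One auto-correlation vector per residue, from the magnitude of its displacement.
acf_vectors = np.array([autocorr(np.linalg.norm(f, axis=1), lags)
                        for f in np.moveaxis(fluct, 1, 0)])
print("RMSF shape:", rmsf.shape, "ACF-vector matrix shape:", acf_vectors.shape)
```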

  16. Large Nc equivalence and baryons

    NASA Astrophysics Data System (ADS)

    Blake, Mike; Cherman, Aleksey

    2012-09-01

    In the large Nc limit, gauge theories with different gauge groups and matter content sometimes turn out to be “large Nc equivalent,” in the sense of having a set of coincident correlation functions. Large Nc equivalence has mainly been explored in the glueball and meson sectors. However, a recent proposal to dodge the fermion sign problem of QCD with a quark number chemical potential using large Nc equivalence motivates investigating the applicability of large Nc equivalence to correlation functions involving baryon operators. Here we present evidence that large Nc equivalence extends to the baryon sector, under the same type of symmetry realization assumptions as in the meson sector, by adapting the classic Witten analysis of large Nc baryons.

  17. High-dimensional inference with the generalized Hopfield model: principal component analysis and corrections.

    PubMed

    Cocco, S; Monasson, R; Sessak, V

    2011-05-01

    We consider the problem of inferring the interactions between a set of N binary variables from the knowledge of their frequencies and pairwise correlations. The inference framework is based on the Hopfield model, a special case of the Ising model where the interaction matrix is defined through a set of patterns in the variable space, and is of rank much smaller than N. We show that maximum likelihood inference is deeply related to principal component analysis when the amplitude of the pattern components ξ is negligible compared to √N. Using techniques from statistical mechanics, we calculate the corrections to the patterns to the first order in ξ/√N. We stress the need to generalize the Hopfield model and include both attractive and repulsive patterns in order to correctly infer networks with sparse and strong interactions. We present a simple geometrical criterion to decide how many attractive and repulsive patterns should be considered as a function of the sampling noise. We moreover discuss how many sampled configurations are required for a good inference, as a function of the system size N and of the amplitude ξ. The inference approach is illustrated on synthetic and biological data.
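
    A rough numerical illustration of the PCA connection: the couplings of a Hopfield-like model can be assembled from a few leading (attractive) and trailing (repulsive) eigenvectors of the sample correlation matrix. The (1 - 1/λ) weighting below follows the spirit of that construction but should be checked against the paper's exact first-order expressions; the data and the number of retained patterns are synthetic choices.

```python
# Hedged sketch: low-rank, Hopfield-style coupling inference from pairwise correlations.
# The weighting (1 - 1/lambda) is used here only to illustrate the PCA link described
# in the abstract; consult the paper for the exact first-order formulas.
import numpy as np

rng = np.random.default_rng(3)
B, N = 2000, 30                                   # samples, binary variables
samples = np.sign(rng.normal(size=(B, N)))        # placeholder +/-1 data

C = np.corrcoef(samples, rowvar=False)            # sample correlation matrix
eigval, eigvec = np.linalg.eigh(C)                # eigenvalues in ascending order

p_attr, p_rep = 3, 3                              # attractive / repulsive patterns kept
J = np.zeros((N, N))
for lam, v in list(zip(eigval[-p_attr:], eigvec[:, -p_attr:].T)) + \
              list(zip(eigval[:p_rep], eigvec[:, :p_rep].T)):
    J += (1.0 - 1.0 / lam) * np.outer(v, v)       # >0 weight for lam>1, <0 for lam<1
np.fill_diagonal(J, 0.0)
print("inferred coupling matrix J:", J.shape)
```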

  18. 15 CFR 90.3 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... non-functioning county or statistical equivalent means a sub-state entity that does not function as an... program, an eligible governmental unit also includes the District of Columbia and non-functioning counties or statistical equivalents represented by a FSCPE member agency. ...

  19. Pragmatic Inference Abilities in Individuals with Asperger Syndrome or High-Functioning Autism. A Review

    ERIC Educational Resources Information Center

    Loukusa, Soile; Moilanen, Irma

    2009-01-01

    This review summarizes studies involving pragmatic language comprehension and inference abilities in individuals with Asperger syndrome or high-functioning autism. Systematic searches of three electronic databases, selected journals, and reference lists identified 20 studies meeting the inclusion criteria. These studies were evaluated in terms of:…

  20. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis

    USDA-ARS?s Scientific Manuscript database

    Objective: To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. Methods: The conditional inference tree analysis, a data mining approach, was used to con...

  1. Methodology for the inference of gene function from phenotype data.

    PubMed

    Ascensao, Joao A; Dolan, Mary E; Hill, David P; Blake, Judith A

    2014-12-12

    Biomedical ontologies are increasingly instrumental in the advancement of biological research primarily through their use to efficiently consolidate large amounts of data into structured, accessible sets. However, ontology development and usage can be hampered by the segregation of knowledge by domain that occurs due to independent development and use of the ontologies. The ability to infer data associated with one ontology to data associated with another ontology would prove useful in expanding information content and scope. We here focus on relating two ontologies: the Gene Ontology (GO), which encodes canonical gene function, and the Mammalian Phenotype Ontology (MP), which describes non-canonical phenotypes, using statistical methods to suggest GO functional annotations from existing MP phenotype annotations. This work is in contrast to previous studies that have focused on inferring gene function from phenotype primarily through lexical or semantic similarity measures. We have designed and tested a set of algorithms that represents a novel methodology to define rules for predicting gene function by examining the emergent structure and relationships between the gene functions and phenotypes rather than inspecting the terms semantically. The algorithms inspect relationships among multiple phenotype terms to deduce if there are cases where they all arise from a single gene function. We apply this methodology to data about genes in the laboratory mouse that are formally represented in the Mouse Genome Informatics (MGI) resource. From the data, 7444 rule instances were generated from five generalized rules, resulting in 4818 unique GO functional predictions for 1796 genes. We show that our method is capable of inferring high-quality functional annotations from curated phenotype data. As well as creating inferred annotations, our method has the potential to allow for the elucidation of unforeseen, biologically significant associations between gene function and phenotypes that would be overlooked by a semantics-based approach. Future work will include the implementation of the described algorithms for a variety of other model organism databases, taking full advantage of the abundance of available high quality curated data.

  2. Inferring probabilistic stellar rotation periods using Gaussian processes

    NASA Astrophysics Data System (ADS)

    Angus, Ruth; Morton, Timothy; Aigrain, Suzanne; Foreman-Mackey, Daniel; Rajpaul, Vinesh

    2018-02-01

    Variability in the light curves of spotted, rotating stars is often non-sinusoidal and quasi-periodic - spots move on the stellar surface and have finite lifetimes, causing stellar flux variations to slowly shift in phase. A strictly periodic sinusoid therefore cannot accurately model a rotationally modulated stellar light curve. Physical models of stellar surfaces have many drawbacks preventing effective inference, such as highly degenerate or high-dimensional parameter spaces. In this work, we test an appropriate effective model: a Gaussian Process with a quasi-periodic covariance kernel function. This highly flexible model allows sampling of the posterior probability density function of the periodic parameter, marginalizing over the other kernel hyperparameters using a Markov Chain Monte Carlo approach. To test the effectiveness of this method, we infer rotation periods from 333 simulated stellar light curves, demonstrating that the Gaussian process method produces periods that are more accurate than both a sine-fitting periodogram and an autocorrelation function method. We also demonstrate that it works well on real data, by inferring rotation periods for 275 Kepler stars with previously measured periods. We provide a table of rotation periods for these and many more, altogether 1102 Kepler objects of interest, and their posterior probability density function samples. Because this method delivers posterior probability density functions, it will enable hierarchical studies involving stellar rotation, particularly those involving population modelling, such as inferring stellar ages, obliquities in exoplanet systems, or characterizing star-planet interactions. The code used to implement this method is available online.
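
    A minimal sketch of the effective model described here: a quasi-periodic covariance function (a squared-exponential decay multiplied by a periodic term) whose log marginal likelihood is scanned over candidate periods. The fixed hyperparameters, the synthetic light curve and the grid search stand in for the full MCMC marginalisation used in the paper.

```python
# Hedged sketch: rotation-period inference with a quasi-periodic Gaussian-process kernel.
# A coarse grid over the period replaces the MCMC marginalisation used in the paper.
import numpy as np

rng = np.random.default_rng(4)
t = np.sort(rng.uniform(0, 60, 150))                       # observation times [days]
true_period = 7.3
flux = np.sin(2 * np.pi * t / true_period) * np.exp(-t / 80.0) + 0.2 * rng.normal(size=t.size)
yerr = 0.2

def qp_kernel(t1, t2, amp=1.0, length=30.0, gamma=5.0, period=7.0):
    """Quasi-periodic kernel: squared-exponential decay times a periodic term."""
    tau = t1[:, None] - t2[None, :]
    return amp * np.exp(-tau**2 / (2 * length**2) - gamma * np.sin(np.pi * tau / period)**2)

def log_marginal_likelihood(period):
    K = qp_kernel(t, t, period=period) + (yerr**2 + 1e-10) * np.eye(t.size)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, flux))
    return -0.5 * flux @ alpha - np.log(np.diag(L)).sum() - 0.5 * t.size * np.log(2 * np.pi)

periods = np.linspace(2, 20, 200)
best = periods[np.argmax([log_marginal_likelihood(p) for p in periods])]
print(f"recovered period ~ {best:.2f} d (true {true_period} d)")
```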

  3. The transfer of Cfunc contextual control through equivalence relations.

    PubMed

    Perez, William F; Fidalgo, Adriana P; Kovac, Roberta; Nico, Yara C

    2015-05-01

    Derived relational responding is affected by contextual stimuli (Cfunc) that select specific stimulus functions. The present study investigated the transfer of Cfunc contextual control through equivalence relations by evaluating both (a) the maintenance of Cfunc contextual control after the expansion of a relational network, and (b) the establishment of novel contextual stimuli by the transfer of Cfunc contextual control through equivalence relations. Initially, equivalence relations were established and contingencies were arranged so that colors functioned as Cfunc stimuli controlling participants' key-pressing responses in the presence of any stimulus from a three-member equivalence network. To investigate the first research question, the three-member equivalence relations were expanded to five members and the novel members were presented with the Cfunc stimuli in the key-pressing task. To address the second goal of this study, the colors (Cfunc) were established as equivalent to certain line patterns. The transfer of contextual cue function (Cfunc) was tested replacing the colored backgrounds with line patterns in the key-pressing task. Results suggest that the Cfunc contextual control was transferred to novel stimuli that were added to the relational network. In addition, the line patterns indirectly acquired the contextual cue function (Cfunc) initially established for the colored backgrounds. The conceptual and applied implications of Cfunc contextual control are discussed. © Society for the Experimental Analysis of Behavior.

  4. FuncPatch: a web server for the fast Bayesian inference of conserved functional patches in protein 3D structures.

    PubMed

    Huang, Yi-Fei; Golding, G Brian

    2015-02-15

    A number of statistical phylogenetic methods have been developed to infer conserved functional sites or regions in proteins. Many methods, e.g. Rate4Site, apply the standard phylogenetic models to infer site-specific substitution rates and totally ignore the spatial correlation of substitution rates in protein tertiary structures, which may reduce their power to identify conserved functional patches in protein tertiary structures when the sequences used in the analysis are highly similar. The 3D sliding window method has been proposed to infer conserved functional patches in protein tertiary structures, but the window size, which reflects the strength of the spatial correlation, must be predefined and is not inferred from data. We recently developed GP4Rate to solve these problems under the Bayesian framework. Unfortunately, GP4Rate is computationally slow. Here, we present an intuitive web server, FuncPatch, to perform a fast approximate Bayesian inference of conserved functional patches in protein tertiary structures. Both simulations and four case studies based on empirical data suggest that FuncPatch is a good approximation to GP4Rate. However, FuncPatch is orders of magnitudes faster than GP4Rate. In addition, simulations suggest that FuncPatch is potentially a useful tool complementary to Rate4Site, but the 3D sliding window method is less powerful than FuncPatch and Rate4Site. The functional patches predicted by FuncPatch in the four case studies are supported by experimental evidence, which corroborates the usefulness of FuncPatch. The software FuncPatch is freely available at the web site, http://info.mcmaster.ca/yifei/FuncPatch golding@mcmaster.ca Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  5. Image-Data Compression Using Edge-Optimizing Algorithm for WFA Inference.

    ERIC Educational Resources Information Center

    Culik, Karel II; Kari, Jarkko

    1994-01-01

    Presents an inference algorithm that produces a weighted finite automata (WFA), in particular, the grayness functions of graytone images. Image-data compression results based on the new inference algorithm produces a WFA with a relatively small number of edges. Image-data compression results alone and in combination with wavelets are discussed.…

  6. Sequence of ligand binding and structure change in the diphtheria toxin repressor upon activation by divalent transition metals.

    PubMed

    Rangachari, Vijayaraghavan; Marin, Vedrana; Bienkiewicz, Ewa A; Semavina, Maria; Guerrero, Luis; Love, John F; Murphy, John R; Logan, Timothy M

    2005-04-19

    The diphtheria toxin repressor (DtxR) is an Fe(II)-activated transcriptional regulator of iron homeostatic and virulence genes in Corynebacterium diphtheriae. DtxR is a two-domain protein that contains two structurally and functionally distinct metal binding sites. Here, we investigate the molecular steps associated with activation by Ni(II)Cl(2) and Cd(II)Cl(2). Equilibrium binding energetics for Ni(II) were obtained from isothermal titration calorimetry, indicating apparent metal dissociation constants of 0.2 and 1.7 microM for two independent sites. The binding isotherms for Ni(II) and Cd(II) exhibited a characteristic exothermic-endothermic pattern that was used to infer the metal binding sequence by comparing the wild-type isotherm with those of several binding site mutants. These data were complemented by measuring the distance between specific backbone amide nitrogens and the first equivalent of metal through heteronuclear NMR relaxation measurements. Previous studies indicated that metal binding affects a disordered to ordered transition in the metal binding domain. The coupling between metal binding and structure change was investigated using near-UV circular dichroism spectroscopy. Together, the data show that the first equivalent of metal is bound by the primary metal binding site. This binding orients the DNA binding helices and begins to fold the N-terminal domain. Subsequent binding at the ancillary site completes the folding of this domain and formation of the dimer interface. This model is used to explain the behavior of several mutants.

  7. Does Exercise Improve Cognitive Performance? A Conservative Message from Lord's Paradox

    PubMed Central

    Liu, Sicong; Lebeau, Jean-Charles; Tenenbaum, Gershon

    2016-01-01

    Although extant meta-analyses support the notion that exercise results in cognitive performance enhancement, methodological shortcomings are noted in the primary evidence. The present study examined relevant randomized controlled trials (RCTs) published in the past 20 years (1996–2015) for methodological concerns arising from Lord's paradox. Our analysis revealed that RCTs supporting the positive effect of exercise on cognition are likely to include Type I Error(s). This result can be attributed to the use of gain score analysis on pretest-posttest data as well as the presence of control group superiority over the exercise group on baseline cognitive measures. To improve accuracy of causal inferences in this area, analysis of covariance on pretest-posttest data is recommended under the assumption of group equivalence. Important experimental procedures are discussed to maintain group equivalence. PMID:27493637
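
    To make the statistical point concrete, the simulation below (invented data, not the meta-analytic sample) contrasts a gain-score analysis with ANCOVA when the control group happens to start higher at baseline and there is no true treatment effect: the gain-score model can then signal a spurious "exercise" benefit that ANCOVA does not.

```python
# Hedged sketch: Lord's paradox on simulated pretest-posttest data (no true treatment effect).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(5)
n = 200
group = np.repeat(["control", "exercise"], n)
pre = np.where(group == "control",
               rng.normal(105, 10, 2 * n),                    # control starts higher
               rng.normal(100, 10, 2 * n))
post = 50 + 0.5 * pre + rng.normal(0, 5, 2 * n)               # post depends on pre only

df = pd.DataFrame({"group": group, "pre": pre, "post": post, "gain": post - pre})

gain_model = smf.ols("gain ~ group", data=df).fit()            # gain-score analysis
ancova = smf.ols("post ~ group + pre", data=df).fit()          # ANCOVA

print("gain-score p-value :", round(gain_model.pvalues["group[T.exercise]"], 4))  # spuriously small
print("ANCOVA p-value     :", round(ancova.pvalues["group[T.exercise]"], 4))      # non-significant
```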

  8. Developmental Changes in Children's Inductive Inferences for Biological Concepts: Implications for the Development of Essentialist Beliefs

    ERIC Educational Resources Information Center

    Farrar, M. Jeffrey; Boyer-Pennington, Michelle

    2011-01-01

    We examined developmental changes in children's inductive inferences about biological concepts as a function of knowledge of properties and concepts. Specifically, 4- to 5-year-olds and 9- to 10-year-olds were taught either familiar or unfamiliar internal, external, or functional properties about known and unknown target animals. Children were…

  9. Specificity of Emotion Inferences as a Function of Emotional Contextual Support

    ERIC Educational Resources Information Center

    Gillioz, Christelle; Gygax, Pascal M.

    2017-01-01

    Research on emotion inferences has shown that readers include a representation of the main character's emotional state in their mental representations of the text. We examined the specificity of emotion representations as a function of the emotion content of short narratives, in terms of the quantity and quality of emotion components included in…

  10. Independence of Hot and Cold Executive Function Deficits in High-Functioning Adults with Autism Spectrum Disorder.

    PubMed

    Zimmerman, David L; Ownsworth, Tamara; O'Donovan, Analise; Roberts, Jacqueline; Gullo, Matthew J

    2016-01-01

    Individuals with autistic spectrum disorder (ASD) display diverse deficits in social, cognitive and behavioral functioning. To date, there have been mixed findings on the profile of executive function deficits for high-functioning adults (IQ > 70) with ASD. A conceptual distinction is commonly made between "cold" and "hot" executive functions. Cold executive functions refer to mechanistic higher-order cognitive operations (e.g., working memory), whereas hot executive functions entail cognitive abilities supported by emotional awareness and social perception (e.g., social cognition). This study aimed to determine the independence of deficits in hot and cold executive functions for high-functioning adults with ASD. Forty-two adults with ASD (64% male, aged 18-66 years) and 40 age and gender matched controls were administered The Awareness of Social Inference Test (TASIT; emotion recognition and social inference), Letter Number Sequencing (working memory) and Hayling Sentence Completion Test (response initiation and suppression). Between-group analyses identified that the ASD group performed significantly worse than matched controls on all measures of cold and hot executive functions (d = 0.54 - 1.5). Hierarchical multiple regression analyses revealed that the ASD sample performed more poorly on emotion recognition and social inference tasks than matched controls after controlling for cold executive functions and employment status. The findings also indicated that the ability to recognize emotions and make social inferences was supported by working memory and response initiation and suppression processes. Overall, this study supports the distinction between hot and cold executive function impairments for adults with ASD. Moreover, it advances understanding of higher-order impairments underlying social interaction difficulties for this population which, in turn, may assist with diagnosis and inform intervention programs.

  11. Independence of Hot and Cold Executive Function Deficits in High-Functioning Adults with Autism Spectrum Disorder

    PubMed Central

    Zimmerman, David L.; Ownsworth, Tamara; O'Donovan, Analise; Roberts, Jacqueline; Gullo, Matthew J.

    2016-01-01

    Individuals with autistic spectrum disorder (ASD) display diverse deficits in social, cognitive and behavioral functioning. To date, there have been mixed findings on the profile of executive function deficits for high-functioning adults (IQ > 70) with ASD. A conceptual distinction is commonly made between “cold” and “hot” executive functions. Cold executive functions refer to mechanistic higher-order cognitive operations (e.g., working memory), whereas hot executive functions entail cognitive abilities supported by emotional awareness and social perception (e.g., social cognition). This study aimed to determine the independence of deficits in hot and cold executive functions for high-functioning adults with ASD. Forty-two adults with ASD (64% male, aged 18–66 years) and 40 age and gender matched controls were administered The Awareness of Social Inference Test (TASIT; emotion recognition and social inference), Letter Number Sequencing (working memory) and Hayling Sentence Completion Test (response initiation and suppression). Between-group analyses identified that the ASD group performed significantly worse than matched controls on all measures of cold and hot executive functions (d = 0.54 − 1.5). Hierarchical multiple regression analyses revealed that the ASD sample performed more poorly on emotion recognition and social inference tasks than matched controls after controlling for cold executive functions and employment status. The findings also indicated that the ability to recognize emotions and make social inferences was supported by working memory and response initiation and suppression processes. Overall, this study supports the distinction between hot and cold executive function impairments for adults with ASD. Moreover, it advances understanding of higher-order impairments underlying social interaction difficulties for this population which, in turn, may assist with diagnosis and inform intervention programs. PMID:26903836

  12. Inferences from Images: Final Report 1984-1987

    DTIC Science & Technology

    1988-12-01

    can be described by a twice-differentiable surface (not necessarily planar) Z(X,Y). Image coordinates will be denoted by lower-case letters (z, yi...velocity at the origin in the z and y directions. The next four can be thought of as describing the deformation of a differential neighborhood around...equivalent, they give the same solution for (N1e..., A4, T1, T2). This establishes the claim for the general case. Two special cases that were excluded from

  13. Reconstructing temperatures from lake sediments in northern Europe: what do the biological proxies really tell us?

    NASA Astrophysics Data System (ADS)

    Cunningham, Laura; Holmes, Naomi; Bigler, Christian; Dadal, Anna; Bergman, Jonas; Eriksson, Lars; Brooks, Stephen; Langdon, Pete; Caseldine, Chris

    2010-05-01

    Over the past two decades considerable effort has been devoted to quantitatively reconstructing temperatures from biological proxies preserved in lake sediments, via transfer functions. Such transfer functions typically consist of modern sediment samples, collected over a broad environmental gradient. Correlations between the biological communities and environmental parameters observed over these broad gradients are assumed to be equally valid temporally. The predictive ability of such spatially based transfer functions has traditionally been assessed by comparisons of measured and inferred temperatures within the calibration sets, with little validation against historical data. Although statistical techniques such as bootstrapping may improve error estimation, this approach remains partly a circular argument. This raises the question of how reliable such reconstructions are for inferring past changes in temperature? In order to address this question, we used transfer functions to reconstruct July temperatures from diatoms and chironomids from several locations across northern Europe. The transfer functions used showed good internal calibration statistics (r2 = 0.66 - 0.91). The diatom and chironomid inferred July air temperatures were compared to local observational records. As the sediment records were non-annual, all data were first smoothed using a 15 yr moving average filter. None of the five biologically-inferred temperature records were correlated with the local meteorological records. Furthermore, diatom inferred temperatures did not agree with chironomid inferred temperatures from the same cores from the same sites. In an attempt to understand this poor performance the biological proxy data was compressed using principal component analysis (PCA), and the PCA axes compared to the local meteorological data. These analyses clearly demonstrated that July temperatures were not correlated with the biological data at these locations. Some correlations were observed between the biological proxies and autumn and spring temperatures, although this varied slightly between sites and proxies. For example, chironomid data from Iceland was most strongly correlated with temperatures in February, March and April whilst in northern Sweden, the chironomid data was most strongly correlated with temperatures in March, April and May. It is suggested that the biological data at these sites may be responding to changes in the length of the ice-free period or hydrological regimes (including snow melt), rather than temperature per se. Our findings demonstrate the need to validate inferred temperatures against local meteorological data. Where such validation cannot be undertaken, inferred temperature reconstructions should be treated cautiously.
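
    The validation step described here is easy to reproduce for any proxy series: smooth both the proxy-inferred and the instrumental record with a 15-year moving average and correlate them, as in the sketch below (synthetic series stand in for the reconstructions; note that smoothing reduces the effective degrees of freedom, so significance should be judged cautiously).

```python
# Hedged sketch: compare a proxy-inferred temperature series with an instrumental record
# after 15-yr moving-average smoothing (synthetic series used as placeholders).
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(8)
years = np.arange(1900, 2001)
observed = 10 + 0.01 * (years - 1900) + rng.normal(scale=0.5, size=years.size)
inferred = 10 + rng.normal(scale=0.8, size=years.size)          # a proxy record with no skill

def moving_average(x, window=15):
    return np.convolve(x, np.ones(window) / window, mode="valid")

r, _ = pearsonr(moving_average(observed), moving_average(inferred))
print(f"correlation after 15-yr smoothing: r = {r:.2f}")
# Smoothing inflates apparent correlations (fewer effective degrees of freedom),
# so agreement should also be checked on the unsmoothed series or with surrogates.
```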

  14. Morphology and function of Neandertal and modern human ear ossicles

    PubMed Central

    David, Romain; Gunz, Philipp; Schmidt, Tobias; Spoor, Fred; Hublin, Jean-Jacques

    2016-01-01

    The diminutive middle ear ossicles (malleus, incus, stapes) housed in the tympanic cavity of the temporal bone play an important role in audition. The few known ossicles of Neandertals are distinctly different from those of anatomically modern humans (AMHs), despite the close relationship between both human species. Although not mutually exclusive, these differences may affect hearing capacity or could reflect covariation with the surrounding temporal bone. Until now, detailed comparisons were hampered by the small sample of Neandertal ossicles and the unavailability of methods combining analyses of ossicles with surrounding structures. Here, we present an analysis of the largest sample of Neandertal ossicles to date, including many previously unknown specimens, covering a wide geographic and temporal range. Microcomputed tomography scans and 3D geometric morphometrics were used to quantify shape and functional properties of the ossicles and the tympanic cavity and make comparisons with recent and extinct AMHs as well as African apes. We find striking morphological differences between ossicles of AMHs and Neandertals. Ossicles of both Neandertals and AMHs appear derived compared with the inferred ancestral morphology, albeit in different ways. Brain size increase evolved separately in AMHs and Neandertals, leading to differences in the tympanic cavity and, consequently, the shape and spatial configuration of the ossicles. Despite these different evolutionary trajectories, functional properties of the middle ear of AMHs and Neandertals are largely similar. The relevance of these functionally equivalent solutions is likely to conserve a similar auditory sensitivity level inherited from their last common ancestor. PMID:27671643

  15. Bayesian functional integral method for inferring continuous data from discrete measurements.

    PubMed

    Heuett, William J; Miller, Bernard V; Racette, Susan B; Holloszy, John O; Chow, Carson C; Periwal, Vipul

    2012-02-08

    Inference of the insulin secretion rate (ISR) from C-peptide measurements as a quantification of pancreatic β-cell function is clinically important in diseases related to reduced insulin sensitivity and insulin action. ISR derived from C-peptide concentration is an example of nonparametric Bayesian model selection where a proposed ISR time-course is considered to be a "model". An inferred value of inaccessible continuous variables from discrete observable data is often problematic in biology and medicine, because it is a priori unclear how robust the inference is to the deletion of data points, and a closely related question, how much smoothness or continuity the data actually support. Predictions weighted by the posterior distribution can be cast as functional integrals as used in statistical field theory. Functional integrals are generally difficult to evaluate, especially for nonanalytic constraints such as positivity of the estimated parameters. We propose a computationally tractable method that uses the exact solution of an associated likelihood function as a prior probability distribution for a Markov-chain Monte Carlo evaluation of the posterior for the full model. As a concrete application of our method, we calculate the ISR from actual clinical C-peptide measurements in human subjects with varying degrees of insulin sensitivity. Our method demonstrates the feasibility of functional integral Bayesian model selection as a practical method for such data-driven inference, allowing the data to determine the smoothing timescale and the width of the prior probability distribution on the space of models. In particular, our model comparison method determines the discrete time-step for interpolation of the unobservable continuous variable that is supported by the data. Attempts to go to finer discrete time-steps lead to less likely models. Copyright © 2012 Biophysical Society. Published by Elsevier Inc. All rights reserved.
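
    A drastically simplified sketch of the underlying inverse problem: if C-peptide is assumed to follow one-compartment kinetics, dC/dt = ISR(t) - k·C, the secretion rate on a discrete time grid can be recovered from noisy C-peptide samples by regularised non-negative least squares, with the penalty weight playing the role of the smoothing timescale that the paper infers with its Bayesian functional-integral machinery. The kinetic constant, grid and data below are illustrative.

```python
# Hedged sketch: recover an insulin secretion rate (ISR) time course from C-peptide
# samples by regularised, non-negative least squares under one-compartment kinetics.
# This is a caricature of the inverse problem, not the paper's Bayesian method.
import numpy as np
from scipy.optimize import lsq_linear

k = 0.05                         # assumed C-peptide clearance rate [1/min]
t = np.arange(0, 125, 5.0)       # sampling grid [min]
dt = 5.0

true_isr = 1.0 + 2.0 * np.exp(-0.5 * ((t - 40) / 15) ** 2)     # illustrative secretion pulse
A = dt * np.exp(-k * np.clip(t[:, None] - t[None, :], 0, None)) * (t[:, None] >= t[None, :])
rng = np.random.default_rng(6)
c_obs = A @ true_isr + rng.normal(scale=0.3, size=t.size)       # noisy C-peptide "data"

# Second-difference penalty enforces smoothness; lam sets the smoothing timescale.
D = np.diff(np.eye(t.size), n=2, axis=0)
lam = 5.0
A_aug = np.vstack([A, lam * D])
b_aug = np.concatenate([c_obs, np.zeros(D.shape[0])])

isr_hat = lsq_linear(A_aug, b_aug, bounds=(0, np.inf)).x
print("estimated ISR at t=40 min:", round(isr_hat[np.argmin(abs(t - 40))], 2))
```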

  16. Inferring Higher Functional Information for RIKEN Mouse Full-Length cDNA Clones With FACTS

    PubMed Central

    Nagashima, Takeshi; Silva, Diego G.; Petrovsky, Nikolai; Socha, Luis A.; Suzuki, Harukazu; Saito, Rintaro; Kasukawa, Takeya; Kurochkin, Igor V.; Konagaya, Akihiko; Schönbach, Christian

    2003-01-01

    FACTS (Functional Association/Annotation of cDNA Clones from Text/Sequence Sources) is a semiautomated knowledge discovery and annotation system that integrates molecular function information derived from sequence analysis results (sequence inferred) with functional information extracted from text. Text-inferred information was extracted from keyword-based retrievals of MEDLINE abstracts and by matching of gene or protein names to OMIM, BIND, and DIP database entries. Using FACTS, we found that 47.5% of the 60,770 RIKEN mouse cDNA FANTOM2 clone annotations were informative for text searches. MEDLINE queries yielded molecular interaction-containing sentences for 23.1% of the clones. When disease MeSH and GO terms were matched with retrieved abstracts, 22.7% of clones were associated with potential diseases, and 32.5% with GO identifiers. A significant number (23.5%) of disease MeSH-associated clones were also found to have a hereditary disease association (OMIM Morbidmap). Inferred neoplastic and nervous system disease represented 49.6% and 36.0% of disease MeSH-associated clones, respectively. A comparison of sequence-based GO assignments with informative text-based GO assignments revealed that for 78.2% of clones, identical GO assignments were provided for that clone by either method, whereas for 21.8% of clones, the assignments differed. In contrast, for OMIM assignments, only 28.5% of clones had identical sequence-based and text-based OMIM assignments. Sequence, sentence, and term-based functional associations are included in the FACTS database (http://facts.gsc.riken.go.jp/), which permits results to be annotated and explored through web-accessible keyword and sequence search interfaces. The FACTS database will be a critical tool for investigating the functional complexity of the mouse transcriptome, cDNA-inferred interactome (molecular interactions), and pathome (pathologies). PMID:12819151

  17. Context recognition for a hyperintensional inference machine

    NASA Astrophysics Data System (ADS)

    Duží, Marie; Fait, Michal; Menšík, Marek

    2017-07-01

    The goal of this paper is to introduce the algorithm of context recognition in the functional programming language TIL-Script, which is a necessary condition for the implementation of the TIL-Script inference machine. The TIL-Script language is an operationally isomorphic syntactic variant of Tichý's Transparent Intensional Logic (TIL). From the formal point of view, TIL is a hyperintensional, partial, typed λ-calculus with procedural semantics. Hyperintensional, because TIL λ-terms denote procedures (defined as TIL constructions) producing set-theoretic functions rather than the functions themselves; partial, because TIL is a logic of partial functions; and typed, because all the entities of TIL ontology, including constructions, receive a type within a ramified hierarchy of types. These features make it possible to distinguish three levels of abstraction at which TIL constructions operate. At the highest hyperintensional level the object to operate on is a construction (though a higher-order construction is needed to present this lower-order construction as an object of predication). At the middle intensional level the object to operate on is the function presented, or constructed, by a construction, while at the lowest extensional level the object to operate on is the value (if any) of the presented function. Thus a necessary condition for the development of an inference machine for the TIL-Script language is recognizing a context in which a construction occurs, namely extensional, intensional and hyperintensional context, in order to determine the type of an argument at which a given inference rule can be properly applied. As a result, our logic does not flout logical rules of extensional logic, which makes it possible to develop a hyperintensional inference machine for the TIL-Script language.

  18. Making Inferences: Comprehension of Physical Causality, Intentionality, and Emotions in Discourse by High-Functioning Older Children, Adolescents, and Adults with Autism.

    PubMed

    Bodner, Kimberly E; Engelhardt, Christopher R; Minshew, Nancy J; Williams, Diane L

    2015-09-01

    Studies investigating inferential reasoning in autism spectrum disorder (ASD) have focused on the ability to make socially-related inferences or inferences more generally. Important variables for intervention planning such as whether inferences depend on physical experiences or the nature of social information have received less consideration. A measure of bridging inferences of physical causation, mental states, and emotional states was administered to older children, adolescents, and adults with and without ASD. The ASD group had more difficulty making inferences, particularly related to emotional understanding. Results suggest that individuals with ASD may not have the stored experiential knowledge that specific inferences depend upon or have difficulties accessing relevant experiences due to linguistic limitations. Further research is needed to tease these elements apart.

  19. Ancestral sequence reconstruction in primate mitochondrial DNA: compositional bias and effect on functional inference.

    PubMed

    Krishnan, Neeraja M; Seligmann, Hervé; Stewart, Caro-Beth; De Koning, A P Jason; Pollock, David D

    2004-10-01

    Reconstruction of ancestral DNA and amino acid sequences is an important means of inferring information about past evolutionary events. Such reconstructions suggest changes in molecular function and evolutionary processes over the course of evolution and are used to infer adaptation and convergence. Maximum likelihood (ML) is generally thought to provide relatively accurate reconstructed sequences compared to parsimony, but both methods lead to the inference of multiple directional changes in nucleotide frequencies in primate mitochondrial DNA (mtDNA). To better understand this surprising result, as well as to better understand how parsimony and ML differ, we constructed a series of computationally simple "conditional pathway" methods that differed in the number of substitutions allowed per site along each branch, and we also evaluated the entire Bayesian posterior frequency distribution of reconstructed ancestral states. We analyzed primate mitochondrial cytochrome b (Cyt-b) and cytochrome oxidase subunit I (COI) genes and found that ML reconstructs ancestral frequencies that are often more different from tip sequences than are parsimony reconstructions. In contrast, frequency reconstructions based on the posterior ensemble more closely resemble extant nucleotide frequencies. Simulations indicate that these differences in ancestral sequence inference are probably due to deterministic bias caused by high uncertainty in the optimization-based ancestral reconstruction methods (parsimony, ML, Bayesian maximum a posteriori). In contrast, ancestral nucleotide frequencies based on an average of the Bayesian set of credible ancestral sequences are much less biased. The methods involving simpler conditional pathway calculations have slightly reduced likelihood values compared to full likelihood calculations, but they can provide fairly unbiased nucleotide reconstructions and may be useful in more complex phylogenetic analyses than considered here due to their speed and flexibility. To determine whether biased reconstructions using optimization methods might affect inferences of functional properties, ancestral primate mitochondrial tRNA sequences were inferred and helix-forming propensities for conserved pairs were evaluated in silico. For ambiguously reconstructed nucleotides at sites with high base composition variability, ancestral tRNA sequences from Bayesian analyses were more compatible with canonical base pairing than were those inferred by other methods. Thus, nucleotide bias in reconstructed sequences apparently can lead to serious bias and inaccuracies in functional predictions.

  20. Inference comprehension in text reading: Performance of individuals with right- versus left-hemisphere lesions and the influence of cognitive functions.

    PubMed

    Silagi, Marcela Lima; Radanovic, Marcia; Conforto, Adriana Bastos; Mendonça, Lucia Iracema Zanotto; Mansur, Leticia Lessa

    2018-01-01

    Right-hemisphere lesions (RHL) may impair inference comprehension. However, comparative studies between left-hemisphere lesions (LHL) and RHL are rare, especially regarding reading comprehension. Moreover, further knowledge of the influence of cognition on inferential processing in this task is needed. To compare the performance of patients with RHL and LHL on an inference reading comprehension task. We also aimed to analyze the effects of lesion site and to verify correlations between cognitive functions and performance on the task. Seventy-five subjects were equally divided into the groups RHL, LHL, and control group (CG). The Implicit Management Test was used to evaluate inference comprehension. In this test, subjects read short written passages and subsequently answer five types of questions (explicit, logical, distractor, pragmatic, and other), which require different types of inferential reasoning. The cognitive functional domains of attention, memory, executive functions, language, and visuospatial abilities were assessed using the Cognitive Linguistic Quick Test (CLQT). The LHL and RHL groups presented difficulties in inferential comprehension in comparison with the CG. However, the RHL group presented lower scores than the LHL group on logical, pragmatic and other questions. A covariance analysis did not show any effect of lesion site within the hemispheres. Overall, all cognitive domains were correlated with all the types of questions from the inference test (especially logical, pragmatic, and other). Attention and visuospatial abilities affected the scores of both the RHL and LHL groups, and only memory influenced the performance of the RHL group. Lesions in either hemisphere may cause difficulties in making inferences during reading. However, processing more complex inferences was more difficult for patients with RHL than for those with LHL, which suggests that the right hemisphere plays an important role in tasks with higher comprehension demands. Cognition influences inferential processing during reading in brain-injured subjects.

  1. Unified Theory of Inference for Text Understanding

    DTIC Science & Technology

    1986-11-25

    Technical Report, S. L. Graham, Principal Investigator. "The views and conclusions contained in this document...obtain X? Function - Infer P will use X for its normal purpose, if it has one. Intervention - How could C keep P from obtaining X? Knowledge Propagation...likelihood is low otherwise, likelihood is moderate otherwise, does X have a normal function? if so, does P do actions like this function? if so

  2. Neural system prediction and identification challenge.

    PubMed

    Vlachos, Ioannis; Zaytsev, Yury V; Spreizer, Sebastian; Aertsen, Ad; Kumar, Arvind

    2013-01-01

    Can we infer the function of a biological neural network (BNN) if we know the connectivity and activity of all its constituent neurons? This question is at the core of neuroscience and, accordingly, various methods have been developed to record the activity and connectivity of as many neurons as possible. Surprisingly, there is no theoretical or computational demonstration that neuronal activity and connectivity are indeed sufficient to infer the function of a BNN. Therefore, we pose the Neural Systems Identification and Prediction Challenge (nuSPIC). We provide the connectivity and activity of all neurons and invite participants (1) to infer the functions implemented (hard-wired) in spiking neural networks (SNNs) by stimulating and recording the activity of neurons and, (2) to implement predefined mathematical/biological functions using SNNs. The nuSPICs can be accessed via a web-interface to the NEST simulator and the user is not required to know any specific programming language. Furthermore, the nuSPICs can be used as a teaching tool. Finally, nuSPICs use the crowd-sourcing model to address scientific issues. With this computational approach we aim to identify which functions can be inferred by systematic recordings of neuronal activity and connectivity. In addition, nuSPICs will help the design and application of new experimental paradigms based on the structure of the SNN and the presumed function which is to be discovered.

  3. Neural system prediction and identification challenge

    PubMed Central

    Vlachos, Ioannis; Zaytsev, Yury V.; Spreizer, Sebastian; Aertsen, Ad; Kumar, Arvind

    2013-01-01

    Can we infer the function of a biological neural network (BNN) if we know the connectivity and activity of all its constituent neurons? This question is at the core of neuroscience and, accordingly, various methods have been developed to record the activity and connectivity of as many neurons as possible. Surprisingly, there is no theoretical or computational demonstration that neuronal activity and connectivity are indeed sufficient to infer the function of a BNN. Therefore, we pose the Neural Systems Identification and Prediction Challenge (nuSPIC). We provide the connectivity and activity of all neurons and invite participants (1) to infer the functions implemented (hard-wired) in spiking neural networks (SNNs) by stimulating and recording the activity of neurons and, (2) to implement predefined mathematical/biological functions using SNNs. The nuSPICs can be accessed via a web-interface to the NEST simulator and the user is not required to know any specific programming language. Furthermore, the nuSPICs can be used as a teaching tool. Finally, nuSPICs use the crowd-sourcing model to address scientific issues. With this computational approach we aim to identify which functions can be inferred by systematic recordings of neuronal activity and connectivity. In addition, nuSPICs will help the design and application of new experimental paradigms based on the structure of the SNN and the presumed function which is to be discovered. PMID:24399966

  4. Integrating evolutionary and functional approaches to infer adaptation at specific loci.

    PubMed

    Storz, Jay F; Wheat, Christopher W

    2010-09-01

    Inferences about adaptation at specific loci are often exclusively based on the static analysis of DNA sequence variation. Ideally, population-genetic evidence for positive selection serves as a stepping-off point for experimental studies to elucidate the functional significance of the putatively adaptive variation. We argue that inferences about adaptation at specific loci are best achieved by integrating the indirect, retrospective insights provided by population-genetic analyses with the more direct, mechanistic insights provided by functional experiments. Integrative studies of adaptive genetic variation may sometimes be motivated by experimental insights into molecular function, which then provide the impetus to perform population genetic tests to evaluate whether the functional variation is of adaptive significance. In other cases, studies may be initiated by genome scans of DNA variation to identify candidate loci for recent adaptation. Results of such analyses can then motivate experimental efforts to test whether the identified candidate loci do in fact contribute to functional variation in some fitness-related phenotype. Functional studies can provide corroborative evidence for positive selection at particular loci, and can potentially reveal specific molecular mechanisms of adaptation.

  5. A Derived Transfer of Mood Functions through Equivalence Relations

    ERIC Educational Resources Information Center

    Barnes-Holmes, Yvonne; Barnes-Holmes, Dermot; Smeets, Paul M.; Luciano, Carmen

    2004-01-01

    The present study investigated the transfer of induced happy and sad mood functions through equivalence relations. Sixteen subjects participated in a combined equivalence and mood induction procedure. In Phase 1, all subjects were trained in 2 conditional discriminations using a matching-to-sample format (i.e., A1-B1, A2-B2, A1-C1, A2-C2). In…

  6. Diverse Effects, Complex Causes: Children Use Information about Machines' Functional Diversity to Infer Internal Complexity

    ERIC Educational Resources Information Center

    Ahl, Richard E.; Keil, Frank C.

    2017-01-01

    Four studies explored the abilities of 80 adults and 180 children (4-9 years), from predominantly middle-class families in the Northeastern United States, to use information about machines' observable functional capacities to infer their internal, "hidden" mechanistic complexity. Children as young as 4 and 5 years old used machines'…

  7. Shear wave velocity variation across the Taupo Volcanic Zone, New Zealand, from receiver function inversion

    USGS Publications Warehouse

    Bannister, S.; Bryan, C.J.; Bibby, H.M.

    2004-01-01

    The Taupo Volcanic Zone (TVZ), New Zealand is a region characterized by very high magma eruption rates and extremely high heat flow, which is manifest in high-temperature geothermal waters. The shear wave velocity structure across the region is inferred using non-linear inversion of receiver functions, which were derived from teleseismic earthquake data. Results from the non-linear inversion, and from forward synthetic modelling, indicate low S velocities at ~6-16 km depth near the Rotorua and Reporoa calderas. We infer these low-velocity layers to represent the presence of high-level bodies of partial melt associated with the volcanism. Receiver functions at other stations are complicated by reverberations associated with near-surface sedimentary layers. The receiver function data also indicate that the Moho lies between 25 and 30 km, deeper than the 15 ± 2 km depth previously inferred for the crust-mantle boundary beneath the TVZ. © 2004 RAS.

  8. Inferring Functional Neural Connectivity with Phase Synchronization Analysis: A Review of Methodology

    PubMed Central

    Sun, Junfeng; Li, Zhijun; Tong, Shanbao

    2012-01-01

    Functional neural connectivity is drawing increasing attention in neuroscience research. To infer functional connectivity from observed neural signals, various methods have been proposed. Among them, phase synchronization analysis is an important and effective one which examines the relationship of instantaneous phase between neural signals but neglecting the influence of their amplitudes. In this paper, we review the advances in methodologies of phase synchronization analysis. In particular, we discuss the definitions of instantaneous phase, the indexes of phase synchronization and their significance test, the issues that may affect the detection of phase synchronization and the extensions of phase synchronization analysis. In practice, phase synchronization analysis may be affected by observational noise, insufficient samples of the signals, volume conduction, and reference in recording neural signals. We make comments and suggestions on these issues so as to better apply phase synchronization analysis to inferring functional connectivity from neural signals. PMID:22577470
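
    As a concrete example of the indexes reviewed here, the phase-locking value between two signals can be computed from instantaneous phases obtained with the Hilbert transform, as sketched below on synthetic data; band-pass filtering and the surrogate-based significance testing discussed in the review are omitted.

```python
# Hedged sketch: phase-locking value (PLV) between two signals via the Hilbert transform.
# Real use would band-pass filter first and assess significance with surrogate data.
import numpy as np
from scipy.signal import hilbert

fs, f0 = 500.0, 10.0                                  # sampling rate, oscillation frequency [Hz]
t = np.arange(0, 10, 1 / fs)
rng = np.random.default_rng(7)
x = np.sin(2 * np.pi * f0 * t) + 0.5 * rng.normal(size=t.size)
y = np.sin(2 * np.pi * f0 * t + 0.8) + 0.5 * rng.normal(size=t.size)   # constant phase lag

phase_x = np.angle(hilbert(x))
phase_y = np.angle(hilbert(y))
plv = np.abs(np.mean(np.exp(1j * (phase_x - phase_y))))
print(f"PLV = {plv:.2f}")          # close to 1 for consistently locked phases
```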

  9. Lyapounov Functions of Closed Cone Fields: From Conley Theory to Time Functions

    NASA Astrophysics Data System (ADS)

    Bernard, Patrick; Suhr, Stefan

    2018-03-01

    We propose a theory "à la Conley" for cone fields using a notion of relaxed orbits based on cone enlargements, in the spirit of space time geometry. We work in the setting of closed (or equivalently semi-continuous) cone fields with singularities. This setting contains (for questions which are parametrization independent such as the existence of Lyapounov functions) the case of continuous vector-fields on manifolds, of differential inclusions, of Lorentzian metrics, and of continuous cone fields. We generalize to this setting the equivalence between stable causality and the existence of temporal functions. We also generalize the equivalence between global hyperbolicity and the existence of a steep temporal function.

  10. Optical equivalence of isotropic ensembles of ellipsoidal particles in the Rayleigh-Gans-Debye and anomalous diffraction approximations and its consequences

    NASA Astrophysics Data System (ADS)

    Paramonov, L. E.

    2012-05-01

    Light scattering by isotropic ensembles of ellipsoidal particles is considered in the Rayleigh-Gans-Debye approximation. It is proved that randomly oriented ellipsoidal particles are optically equivalent to polydisperse randomly oriented spheroidal particles and polydisperse spherical particles. Density functions of the shape and size distributions for equivalent ensembles of spheroidal and spherical particles are presented. In the anomalous diffraction approximation, equivalent ensembles of particles are shown to also have equal extinction, scattering, and absorption coefficients. Consequences of optical equivalence are considered. The results are illustrated by numerical calculations of the angular dependence of the scattering phase function using the T-matrix method and the Mie theory.

  11. A Bayesian account of ‘hysteria’

    PubMed Central

    Adams, Rick A.; Brown, Harriet; Pareés, Isabel; Friston, Karl J.

    2012-01-01

    This article provides a neurobiological account of symptoms that have been called ‘hysterical’, ‘psychogenic’ or ‘medically unexplained’, which we will call functional motor and sensory symptoms. We use a neurobiologically informed model of hierarchical Bayesian inference in the brain to explain functional motor and sensory symptoms in terms of perception and action arising from inference based on prior beliefs and sensory information. This explanation exploits the key balance between prior beliefs and sensory evidence that is mediated by (body focused) attention, symptom expectations, physical and emotional experiences and beliefs about illness. Crucially, this furnishes an explanation at three different levels: (i) underlying neuromodulatory (synaptic) mechanisms; (ii) cognitive and experiential processes (attention and attribution of agency); and (iii) formal computations that underlie perceptual inference (representation of uncertainty or precision). Our explanation involves primary and secondary failures of inference; the primary failure is the (autonomous) emergence of a percept or belief that is held with undue certainty (precision) following top-down attentional modulation of synaptic gain. This belief can constitute a sensory percept (or its absence) or induce movement (or its absence). The secondary failure of inference is when the ensuing percept (and any somatosensory consequences) is falsely inferred to be a symptom to explain why its content was not predicted by the source of attentional modulation. This account accommodates several fundamental observations about functional motor and sensory symptoms, including: (i) their induction and maintenance by attention; (ii) their modification by expectation, prior experience and cultural beliefs and (iii) their involuntary and symptomatic nature. PMID:22641838

  12. Equivalence between contextuality and negativity of the Wigner function for qudits

    NASA Astrophysics Data System (ADS)

    Delfosse, Nicolas; Okay, Cihan; Bermejo-Vega, Juan; Browne, Dan E.; Raussendorf, Robert

    2017-12-01

    Understanding what distinguishes quantum mechanics from classical mechanics is crucial for quantum information processing applications. In this work, we consider two notions of non-classicality for quantum systems, negativity of the Wigner function and contextuality for Pauli measurements. We prove that these two notions are equivalent for multi-qudit systems with odd local dimension. For a single qudit, the equivalence breaks down. We show that there exist single qudit states that admit a non-contextual hidden variable model description and whose Wigner functions are negative.

  13. Functional Equivalence Acceptance Testing of FUN3D for Entry Descent and Landing Applications

    NASA Technical Reports Server (NTRS)

    Gnoffo, Peter A.; Wood, William A.; Kleb, William L.; Alter, Stephen J.; Glass, Christopher E.; Padilla, Jose F.; Hammond, Dana P.; White, Jeffery A.

    2013-01-01

    The functional equivalence of the unstructured grid code FUN3D to the structured grid code LAURA (Langley Aerothermodynamic Upwind Relaxation Algorithm) is documented for applications of interest to the Entry, Descent, and Landing (EDL) community. Examples from an existing suite of regression tests are used to demonstrate the functional equivalence, encompassing various thermochemical models and vehicle configurations. Algorithm modifications required for the node-based unstructured grid code (FUN3D) to reproduce functionality of the cell-centered structured code (LAURA) are also documented. Challenges associated with computation on tetrahedral grids versus computation on structured-grid derived hexahedral systems are discussed.

  14. Gaussian process-based Bayesian nonparametric inference of population size trajectories from gene genealogies.

    PubMed

    Palacios, Julia A; Minin, Vladimir N

    2013-03-01

    Changes in population size influence genetic diversity of the population and, as a result, leave a signature of these changes in individual genomes in the population. We are interested in the inverse problem of reconstructing past population dynamics from genomic data. We start with a standard framework based on the coalescent, a stochastic process that generates genealogies connecting randomly sampled individuals from the population of interest. These genealogies serve as a glue between the population demographic history and genomic sequences. It turns out that only the times of genealogical lineage coalescences contain information about population size dynamics. Viewing these coalescent times as a point process, estimating population size trajectories is equivalent to estimating a conditional intensity of this point process. Therefore, our inverse problem is similar to estimating an inhomogeneous Poisson process intensity function. We demonstrate how recent advances in Gaussian process-based nonparametric inference for Poisson processes can be extended to Bayesian nonparametric estimation of population size dynamics under the coalescent. We compare our Gaussian process (GP) approach to one of the state-of-the-art Gaussian Markov random field (GMRF) methods for estimating population trajectories. Using simulated data, we demonstrate that our method has better accuracy and precision. Next, we analyze two genealogies reconstructed from real sequences of hepatitis C and human Influenza A viruses. In both cases, we recover more believed aspects of the viral demographic histories than the GMRF approach. We also find that our GP method produces more reasonable uncertainty estimates than the GMRF method. Copyright © 2013, The International Biometric Society.

  15. Bayesian inference of uncertainties in precipitation-streamflow modeling in a snow affected catchment

    NASA Astrophysics Data System (ADS)

    Koskela, J. J.; Croke, B. W. F.; Koivusalo, H.; Jakeman, A. J.; Kokkonen, T.

    2012-11-01

    Bayesian inference is used to study the effect of precipitation and model structural uncertainty on estimates of model parameters and confidence limits of predictive variables in a conceptual rainfall-runoff model in the snow-fed Rudbäck catchment (142 ha) in southern Finland. The IHACRES model is coupled with a simple degree day model to account for snow accumulation and melt. The posterior probability distribution of the model parameters is sampled by using the Differential Evolution Adaptive Metropolis (DREAM(ZS)) algorithm and the generalized likelihood function. Precipitation uncertainty is taken into account by introducing additional latent variables that were used as multipliers for individual storm events. Results suggest that occasional snow water equivalent (SWE) observations together with daily streamflow observations do not contain enough information to simultaneously identify model parameters, precipitation uncertainty and model structural uncertainty in the Rudbäck catchment. The addition of an autoregressive component to account for model structure error and latent variables having uniform priors to account for input uncertainty led to dubious posterior distributions of model parameters. Thus our hypothesis that informative priors for latent variables could be replaced by additional SWE data could not be confirmed. The model was found to work adequately in 1-day-ahead simulation mode, but the results were poor in the simulation batch mode. This was caused by the interaction of parameters that were used to describe different sources of uncertainty. The findings may have lessons for other cases where the number of parameters is similarly high in relation to the available prior information.

  16. Infrasound and seismic detections associated with the 7 September 2015 Bangkok fireball

    DOE PAGES

    Caudron, Corentin; Taisne, Benoit; Perttu, Anna; ...

    2016-08-22

    A bright fireball was reported at 01:43:35 UTC on September 7, 2015 at a height of ~30 km above 14.5°N, 98.9°E near Bangkok, Thailand. It had a TNT yield equivalent of 3.9 kilotons (kt), making it the largest fireball detected in South–East Asia since the ~50 kt 2009 Sumatra bolide. Infrasonic signals were observed at four infrasound arrays that are part of the International Monitoring System (IMS) and one infrasound array located in Singapore. Acoustic bearings and event origin times inferred from array processing are consistent with the eyewitness accounts. A seismic signal associated with this event was also likely recorded at station SRDT, in Thailand. As a result, an acoustic energy equivalent of 1.15 ± 0.24 kt is derived from the Singaporean acoustic data using the period of the peak energy.

  18. Realistic Subsurface Anomaly Discrimination Using Electromagnetic Induction and an SVM Classifier

    NASA Astrophysics Data System (ADS)

    Pablo Fernández, Juan; Shubitidze, Fridon; Shamatava, Irma; Barrowes, Benjamin E.; O'Neill, Kevin

    2010-12-01

    The environmental research program of the United States military has set up blind tests for detection and discrimination of unexploded ordnance. One such test consists of measurements taken with the EM-63 sensor at Camp Sibert, AL. We review the performance on the test of a procedure that combines a field-potential (HAP) method to locate targets, the normalized surface magnetic source (NSMS) model to characterize them, and a support vector machine (SVM) to classify them. The HAP method infers location from the scattered magnetic field and its associated scalar potential, the latter reconstructed using equivalent sources. NSMS replaces the target with an enclosing spheroid of equivalent radial magnetization whose integral it uses as a discriminator. SVM generalizes from empirical evidence and can be adapted for multiclass discrimination using a voting system. Our method identifies all potentially dangerous targets correctly and has a false-alarm rate of about 5%.

  19. Comment on "The effect of same-sex marriage laws on different-sex marriage: evidence from the Netherlands".

    PubMed

    Dinno, Alexis

    2014-12-01

    In the recent Demography article titled "The Effect of Same-Sex Marriage Laws on Different-Sex Marriage: Evidence From the Netherlands," Trandafir attempted to answer the question, Are rates of opposite-sex marriage affected by legal recognition of same-sex marriages? The results of his approach to statistical inference (looking for evidence of a difference in rates of opposite-sex marriage) provide an absence of evidence of such effects. However, the validity of his conclusion of no causal relationship between same-sex marriage laws and rates of opposite-sex marriage is threatened by the fact that Trandafir did not also look for equivalence in rates of opposite-sex marriage in order to provide evidence of an absence of such an effect. Equivalence tests in combination with difference tests are introduced and presented in this article as a more valid inferential approach to the substantive question Trandafir attempted to answer.
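
    The combined approach advocated here can be sketched in a few lines: a standard difference test plus two one-sided tests (TOST) against an equivalence margin. The data and the margin delta below are hypothetical placeholders, not the Dutch marriage-rate data analyzed in the original article.

      import numpy as np
      from scipy import stats

      def tost_two_sample(a, b, delta):
          # Two one-sided t-tests of H0: |mean(a) - mean(b)| >= delta.
          # Returns the TOST p-value (the larger of the two one-sided p-values).
          diff = np.mean(a) - np.mean(b)
          se = np.sqrt(np.var(a, ddof=1) / len(a) + np.var(b, ddof=1) / len(b))
          df = len(a) + len(b) - 2  # pooled df; Welch df would also be reasonable
          p_lower = stats.t.sf((diff + delta) / se, df)   # evidence that diff > -delta
          p_upper = stats.t.cdf((diff - delta) / se, df)  # evidence that diff < +delta
          return max(p_lower, p_upper)

      rng = np.random.default_rng(1)
      before = rng.normal(5.00, 1.0, 80)   # hypothetical marriage rates before the law
      after = rng.normal(5.05, 1.0, 80)    # hypothetical marriage rates after the law
      p_diff = stats.ttest_ind(before, after).pvalue
      p_equiv = tost_two_sample(before, after, delta=0.5)
      print(f"difference test p = {p_diff:.3f}; equivalence (TOST) p = {p_equiv:.3f}")

    A non-significant difference test alone cannot establish the absence of an effect; it is a small TOST p-value, relative to a substantively chosen margin, that licenses the "no meaningful effect" conclusion.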

  20. Statistical analysis of loopy belief propagation in random fields

    NASA Astrophysics Data System (ADS)

    Yasuda, Muneki; Kataoka, Shun; Tanaka, Kazuyuki

    2015-10-01

    Loopy belief propagation (LBP), which is equivalent to the Bethe approximation in statistical mechanics, is a message-passing-type inference method that is widely used to analyze systems based on Markov random fields (MRFs). In this paper, we propose a message-passing-type method to analytically evaluate the quenched average of LBP in random fields by using the replica cluster variation method. The proposed analytical method is applicable to general pairwise MRFs with random fields whose distributions differ from each other and can give the quenched averages of the Bethe free energies over random fields, which are consistent with numerical results. The order of its computational cost is equivalent to that of standard LBP. In the latter part of this paper, we describe the application of the proposed method to Bayesian image restoration, in which we observed that our theoretical results are in good agreement with the numerical results for natural images.
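
    To make the message-passing scheme concrete, the following self-contained Python sketch runs sum-product loopy belief propagation on a small binary pairwise MRF. The graph, potentials, and synchronous update schedule are illustrative choices only and do not reproduce the replica cluster variation analysis of the paper.

      import numpy as np

      def loopy_bp(n, edges, unary, pairwise, iters=50):
          # Sum-product loopy BP for a pairwise MRF with binary variables.
          # unary[i]: length-2 node potential; pairwise[(i, j)]: 2x2 edge potential.
          msgs = {(i, j): np.ones(2) for (a, b) in edges for (i, j) in [(a, b), (b, a)]}
          nbrs = {i: [] for i in range(n)}
          for a, b in edges:
              nbrs[a].append(b)
              nbrs[b].append(a)
          for _ in range(iters):
              new = {}
              for (i, j) in msgs:
                  psi = pairwise[(i, j)] if (i, j) in pairwise else pairwise[(j, i)].T
                  incoming = [msgs[(k, i)] for k in nbrs[i] if k != j]
                  prod = unary[i] * (np.prod(incoming, axis=0) if incoming else 1.0)
                  m = psi.T @ prod              # marginalize over x_i
                  new[(i, j)] = m / m.sum()     # normalize for numerical stability
              msgs = new
          beliefs = []
          for i in range(n):
              b = unary[i] * np.prod([msgs[(k, i)] for k in nbrs[i]], axis=0)
              beliefs.append(b / b.sum())
          return beliefs

      # Toy 3-node cycle with attractive couplings and one biased node.
      J = np.array([[np.e, 1.0], [1.0, np.e]])          # exp(coupling) when states agree
      unary = [np.array([2.0, 1.0]), np.ones(2), np.ones(2)]
      edges = [(0, 1), (1, 2), (0, 2)]
      pairwise = {e: J for e in edges}
      for i, b in enumerate(loopy_bp(3, edges, unary, pairwise)):
          print(f"approximate marginal of x{i}: {np.round(b, 3)}")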

  1. Specimen-level phylogenetics in paleontology using the Fossilized Birth-Death model with sampled ancestors.

    PubMed

    Cau, Andrea

    2017-01-01

    Bayesian phylogenetic methods integrating simultaneously morphological and stratigraphic information have been applied increasingly among paleontologists. Most of these studies have used Bayesian methods as an alternative to the widely-used parsimony analysis, to infer macroevolutionary patterns and relationships among species-level or higher taxa. Among recently introduced Bayesian methodologies, the Fossilized Birth-Death (FBD) model allows incorporation of hypotheses on ancestor-descendant relationships in phylogenetic analyses including fossil taxa. Here, the FBD model is used to infer the relationships among an ingroup formed exclusively by fossil individuals, i.e., dipnoan tooth plates from four localities in the Ain el Guettar Formation of Tunisia. Previous analyses of this sample compared the results of phylogenetic analysis using parsimony with stratigraphic methods, inferred a high diversity (five or more genera) in the Ain el Guettar Formation, and interpreted it as an artifact inflated by depositional factors. In the analysis performed here, the uncertainty on the chronostratigraphic relationships among the specimens was included among the prior settings. The results of the analysis confirm the referral of most of the specimens to the taxa Asiatoceratodus, Equinoxiodus, Lavocatodus and Neoceratodus, but reject those to Ceratodus and Ferganoceratodus. The resulting phylogeny constrained the evolution of the Tunisian sample exclusively in the Early Cretaceous, contrasting with the previous scenario inferred by the stratigraphically-calibrated topology resulting from parsimony analysis. The phylogenetic framework also suggests that (1) the sampled localities are laterally equivalent, but (2) three localities are restricted to the youngest part of the section; both results are in agreement with previous stratigraphic analyses of these localities. The FBD model of specimen-level units provides a novel tool for phylogenetic inference among fossils but also for independent tests of stratigraphic scenarios.

  2. Assessment of organ-specific neutron equivalent doses in proton therapy using computational whole-body age-dependent voxel phantoms

    NASA Astrophysics Data System (ADS)

    Zacharatou Jarlskog, Christina; Lee, Choonik; Bolch, Wesley E.; Xu, X. George; Paganetti, Harald

    2008-02-01

    Proton beams used for radiotherapy will produce neutrons when interacting with matter. The purpose of this study was to quantify the equivalent dose to tissue due to secondary neutrons in pediatric and adult patients treated by proton therapy for brain lesions. Assessment of the equivalent dose to organs away from the target requires whole-body geometrical information. Furthermore, because the patient geometry depends on age at exposure, age-dependent representations are also needed. We implemented age-dependent phantoms into our proton Monte Carlo dose calculation environment. We considered eight typical radiation fields, two of which had been previously used to treat pediatric patients. The other six fields were additionally considered to allow a systematic study of equivalent doses as a function of field parameters. For all phantoms and all fields, we simulated organ-specific equivalent neutron doses and analyzed for each organ (1) the equivalent dose due to neutrons as a function of distance to the target; (2) the equivalent dose due to neutrons as a function of patient age; (3) the equivalent dose due to neutrons as a function of field parameters; and (4) the ratio of contributions to secondary dose from the treatment head versus the contribution from the patient's body tissues. This work reports organ-specific equivalent neutron doses for up to 48 organs in a patient. We demonstrate quantitatively how organ equivalent doses for adult and pediatric patients vary as a function of patient's age, organ and field parameters. Neutron doses increase with increasing range and modulation width but decrease with field size (as defined by the aperture). We analyzed the ratio of neutron dose contributions from the patient and from the treatment head, and found that neutron-equivalent doses fall off rapidly as a function of distance from the target, in agreement with experimental data. It appears that for the fields used in this study, the neutron dose lateral to the field is smaller than the reported scattered photon doses in a typical intensity-modulated photon treatment. Most importantly, our study shows that neutron doses to specific organs depend considerably on the patient's age and body stature. The younger the patient, the higher the dose deposited due to neutrons. Given the fact that the risk also increases with decreasing patient age, this factor needs to be taken into account when treating pediatric patients of very young ages and/or of small body size. The neutron dose from a course of proton therapy treatment (assuming 70 Gy in 30 fractions) could potentially (depending on patient's age, organ, treatment site and area of CT scan) be equivalent to up to ~30 CT scans.

  3. Circulant Matrices and Affine Equivalence of Monomial Rotation Symmetric Boolean Functions

    DTIC Science & Technology

    2015-01-01

    definitions, including monomial rotation symmetric (MRS) Boolean functions and affine equivalence, and a known result for such quadratic functions... degree of the MRS is, we have a similar result as [40, Theorem 1.1] for n = 4p (p prime), or squarefree integers n, which along with our Theorem 5.2...

  4. Map LineUps: Effects of spatial structure on graphical inference.

    PubMed

    Beecham, Roger; Dykes, Jason; Meulemans, Wouter; Slingsby, Aidan; Turkay, Cagatay; Wood, Jo

    2017-01-01

    Fundamental to the effective use of visualization as an analytic and descriptive tool is the assurance that presenting data visually provides the capability of making inferences from what we see. This paper explores two related approaches to quantifying the confidence we may have in making visual inferences from mapped geospatial data. We adapt Wickham et al.'s 'Visual Line-up' method as a direct analogy with Null Hypothesis Significance Testing (NHST) and propose a new approach for generating more credible spatial null hypotheses. Rather than using as a spatial null hypothesis the unrealistic assumption of complete spatial randomness, we propose spatially autocorrelated simulations as alternative nulls. We conduct a set of crowdsourced experiments (n=361) to determine the just noticeable difference (JND) between pairs of choropleth maps of geographic units controlling for spatial autocorrelation (Moran's I statistic) and geometric configuration (variance in spatial unit area). Results indicate that people's abilities to perceive differences in spatial autocorrelation vary with baseline autocorrelation structure and the geometric configuration of geographic units. These results allow us, for the first time, to construct a visual equivalent of statistical power for geospatial data. Our JND results add to those provided in recent years by Klippel et al. (2011), Harrison et al. (2014) and Kay & Heer (2015) for correlation visualization. Importantly, they provide an empirical basis for an improved construction of visual line-ups for maps and the development of theory to inform geospatial tests of graphical inference.

  5. Inferring evolution of gene duplicates using probabilistic models and nonparametric belief propagation.

    PubMed

    Zeng, Jia; Hannenhalli, Sridhar

    2013-01-01

    Gene duplication, followed by functional evolution of duplicate genes, is a primary engine of evolutionary innovation. In turn, gene expression evolution is a critical component of overall functional evolution of paralogs. Inferring evolutionary history of gene expression among paralogs is therefore a problem of considerable interest. It also represents significant challenges. The standard approaches of evolutionary reconstruction assume that at an internal node of the duplication tree, the two duplicates evolve independently. However, because of various selection pressures functional evolution of the two paralogs may be coupled. The coupling of paralog evolution corresponds to three major fates of gene duplicates: subfunctionalization (SF), conserved function (CF) or neofunctionalization (NF). Quantitative analysis of these fates is of great interest and clearly influences evolutionary inference of expression. These two interrelated problems of inferring gene expression and evolutionary fates of gene duplicates have not been studied together previously and motivate the present study. Here we propose a novel probabilistic framework and algorithm to simultaneously infer (i) ancestral gene expression and (ii) the likely fate (SF, NF, CF) at each duplication event during the evolution of gene family. Using tissue-specific gene expression data, we develop a nonparametric belief propagation (NBP) algorithm to predict the ancestral expression level as a proxy for function, and describe a novel probabilistic model that relates the predicted and known expression levels to the possible evolutionary fates. We validate our model using simulation and then apply it to a genome-wide set of gene duplicates in human. Our results suggest that SF tends to be more frequent at the earlier stage of gene family expansion, while NF occurs more frequently later on.

  6. Experimental and statistical study on fracture boundary of non-irradiated Zircaloy-4 cladding tube under LOCA conditions

    NASA Astrophysics Data System (ADS)

    Narukawa, Takafumi; Yamaguchi, Akira; Jang, Sunghyon; Amaya, Masaki

    2018-02-01

    For estimating the fracture probability of fuel cladding tubes under loss-of-coolant accident conditions in light-water reactors, laboratory-scale integral thermal shock tests were conducted on non-irradiated Zircaloy-4 cladding tube specimens. The resulting binary data (fracture or non-fracture of each cladding tube specimen) were then analyzed statistically. A method to obtain the fracture probability curve as a function of equivalent cladding reacted (ECR) was proposed using Bayesian inference for generalized linear models: probit, logit, and log-probit. Model selection was then performed in terms of physical characteristics and information criteria, namely the widely applicable information criterion (WAIC) and the widely applicable Bayesian information criterion (WBIC). The log-probit model proved the best of the three at estimating the fracture probability, in terms of both predictive accuracy for future data and identification of the true model. Using the log-probit model, it was shown that 20% ECR corresponds to a 5% fracture probability level with 95% confidence for the cladding tube specimens.
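
    The shape of such a fracture-probability curve is easy to sketch: below, a log-probit model P(fracture) = Φ(a + b·ln ECR) is fitted by maximum likelihood to made-up fracture/non-fracture observations. This is only an illustrative stand-in; the study itself used Bayesian inference with WAIC/WBIC model selection, and its data are not reproduced here.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.stats import norm

      # Hypothetical observations: ECR in percent, 1 = fractured, 0 = survived.
      ecr = np.array([10, 12, 15, 17, 18, 20, 22, 24, 26, 28, 30, 32], dtype=float)
      frac = np.array([0, 0, 0, 0, 1, 0, 1, 1, 1, 1, 1, 1])

      def neg_log_lik(theta):
          a, b = theta
          p = norm.cdf(a + b * np.log(ecr))        # log-probit link
          p = np.clip(p, 1e-12, 1 - 1e-12)         # guard against log(0)
          return -np.sum(frac * np.log(p) + (1 - frac) * np.log(1 - p))

      fit = minimize(neg_log_lik, x0=np.array([-15.0, 5.0]), method="Nelder-Mead")
      a_hat, b_hat = fit.x
      # ECR corresponding to a 5% fracture probability under the fitted curve.
      ecr_5pct = np.exp((norm.ppf(0.05) - a_hat) / b_hat)
      print(f"a = {a_hat:.2f}, b = {b_hat:.2f}, 5% fracture ECR ~ {ecr_5pct:.1f}%")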

  7. 76 FR 593 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-05

    ... 1-2. \\1\\ Notice of United States Postal Service of Filing a Functionally Equivalent Global Expedited Package Services 3 Negotiated Service Agreement and Application for Non-Public Treatment of Materials... Filing a Functionally Equivalent Global Expedited Package Services 3 Negotiated Service Agreement and...

  8. 78 FR 79710 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-31

    ... Functionally Equivalent Global Expedited Package Services 3 Negotiated Service Agreement and Application for Non-Public Treatment of Materials Filed Under Seal, December 23, 2013 (Notice). II. Background The.... CP2010-71 to serve as the baseline agreement for comparison of potentially functionally equivalent...

  9. 76 FR 80412 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-23

    ... Service of Filing Functionally Equivalent Inbound Competitive Multi-Service Agreement with a Foreign... application for non-public treatment of certain materials. The Postal Service also provided a redacted copy of... that the Postal Service proposed adding other functionally equivalent agreements as price categories...

  10. Two- and Three-Year-Olds Infer and Reason about Design Intentions in Order to Categorize Broken Objects

    ERIC Educational Resources Information Center

    Nelson, Deborah G. Kemler; Holt, Morghan B.; Egan, Louisa Chan

    2004-01-01

    In naming artifacts, do young children infer and reason about the intended functions of the objects? Participants between the ages of 2 and 4 years were shown two kinds of objects derived from familiar categories. One kind was damaged so as to undermine its usual function. The other kind was also dysfunctional, but made so by adding features that…

  11. The role of fMRI in cognitive neuroscience: where do we stand?

    PubMed

    Poldrack, Russell A

    2008-04-01

    Functional magnetic resonance imaging (fMRI) has quickly become the most prominent tool in cognitive neuroscience. In this article, I outline some of the limits on the kinds of inferences that can be supported by fMRI, focusing particularly on reverse inference, in which the engagement of specific mental processes is inferred from patterns of brain activation. Although this form of inference is weak, newly developed methods from the field of machine learning offer the potential to formalize and strengthen reverse inferences. I conclude by discussing the increasing presence of fMRI results in the popular media and the ethical implications of the increasing predictive power of fMRI.

  12. The line continuum luminosity ratio in AGN: Or on the Baldwin Effect

    NASA Technical Reports Server (NTRS)

    Mushotzky, R.; Ferland, F. J.

    1983-01-01

    The luminosity dependence of the equivalent width of CIV in active galaxies, the "Baldwin" effect, is shown to be a consequence of a luminosity dependent ionization parameter. This law also agrees with the lack of a "Baldwin" effect in Ly alpha or other hydrogen lines. A fit to the available data gives a weak indication that the mean covering factor decreases with increasing luminosity, consistent with the inference from X-ray observations. The effects of continuum shape and density on various line ratios of interest are discussed.

  13. Impaired self-agency inferences in schizophrenia: The role of cognitive capacity and causal reasoning style.

    PubMed

    Prikken, M; van der Weiden, A; Kahn, R S; Aarts, H; van Haren, N E M

    2018-01-01

    The sense of self-agency, i.e., experiencing oneself as the cause of one's own actions, is impaired in patients with schizophrenia. Normally, inferences of self-agency are enhanced when actual outcomes match with pre-activated outcome information, where this pre-activation can result from explicitly set goals (i.e., goal-based route) or implicitly primed outcome information (i.e., prime-based route). Previous research suggests that patients show specific impairments in the prime-based route, implicating that they do not rely on matches between implicitly available outcome information and actual action-outcomes when inferring self-agency. The question remains: Why? Here, we examine whether neurocognitive functioning and self-serving bias (SSB) may explain abnormalities in patients' agency inferences. Thirty-six patients and 36 healthy controls performed a commonly used agency inference task to measure goal- and prime-based self-agency inferences. Neurocognitive functioning was assessed with the Brief Assessment of Cognition in Schizophrenia (BACS) and the SSB was assessed with the Internal Personal and Situational Attributions Questionnaire. Results showed a substantial smaller effect of primed outcome information on agency experiences in patients compared with healthy controls. Whereas patients and controls differed on BACS and marginally on SSB scores, these differences were not related to patients' impairments in prime-based agency inferences. Patients showed impairments in prime-based agency inferences, thereby replicating previous studies. This finding could not be explained by cognitive dysfunction or SSB. Results are discussed in the context of the recent surge to understand and examine deficits in agency experiences in schizophrenia. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  14. Time-varying coupling functions: Dynamical inference and cause of synchronization transitions

    NASA Astrophysics Data System (ADS)

    Stankovski, Tomislav

    2017-02-01

    Interactions in nature can be described by their coupling strength, direction of coupling, and coupling function. The coupling strength and directionality are relatively well understood and studied, at least for two interacting systems; however, there can be a complexity in the interactions uniquely dependent on the coupling functions. Such a special case is studied here: synchronization transition occurs only due to the time variability of the coupling functions, while the net coupling strength is constant throughout the observation time. To motivate the investigation, an example is used to present an analysis of cross-frequency coupling functions between delta and alpha brain waves extracted from the electroencephalography recording of a healthy human subject in a free-running resting state. The results indicate that time-varying coupling functions are a reality for biological interactions. A model of phase oscillators is used to demonstrate and detect the synchronization transition caused by the varying coupling functions during an invariant coupling strength. The ability to detect this phenomenon is discussed with the method of dynamical Bayesian inference, which was able to infer the time-varying coupling functions. The form of the coupling function acts as an additional dimension for the interactions, and it should be taken into account when detecting biological or other interactions from data.
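
    The core phenomenon (a synchronization transition driven purely by the form of the coupling function at fixed coupling strength) can be reproduced with a toy pair of phase oscillators. The parameters below are arbitrary, and the sketch does not implement the dynamical Bayesian inference used in the paper; it only demonstrates the effect being inferred.

      import numpy as np

      # Two phase oscillators, bidirectionally coupled with fixed strength eps.
      # The *form* of the coupling function changes halfway through: an even
      # (cos) form cannot cancel the frequency detuning, while an odd (sin)
      # form can, producing a synchronization transition at constant eps.
      w1, w2, eps = 1.0, 1.3, 0.3
      dt, T = 0.01, 400.0
      t = np.arange(0.0, T, dt)
      phi1, phi2 = np.zeros(t.size), np.zeros(t.size)
      for k in range(t.size - 1):
          q = np.cos if t[k] < T / 2 else np.sin    # time-varying coupling function
          phi1[k + 1] = phi1[k] + dt * (w1 + eps * q(phi2[k] - phi1[k]))
          phi2[k + 1] = phi2[k] + dt * (w2 + eps * q(phi1[k] - phi2[k]))

      psi = phi2 - phi1
      for label, sl in [("cos half", t < T / 2), ("sin half", t >= T / 2)]:
          drift = (psi[sl][-1] - psi[sl][0]) / (T / 2)   # mean growth of phase difference
          state = "locked" if abs(drift) < 0.05 else "drifting"
          print(f"{label}: phase-difference drift ~ {drift:.3f} rad per unit time ({state})")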

  15. Inference of a Nonlinear Stochastic Model of the Cardiorespiratory Interaction

    NASA Astrophysics Data System (ADS)

    Smelyanskiy, V. N.; Luchinsky, D. G.; Stefanovska, A.; McClintock, P. V.

    2005-03-01

    We reconstruct a nonlinear stochastic model of the cardiorespiratory interaction in terms of a set of polynomial basis functions representing the nonlinear force governing system oscillations. The strength and direction of coupling and noise intensity are simultaneously inferred from a univariate blood pressure signal. Our new inference technique does not require extensive global optimization, and it is applicable to a wide range of complex dynamical systems subject to noise.

  16. The Dopaminergic Midbrain Encodes the Expected Certainty about Desired Outcomes.

    PubMed

    Schwartenbeck, Philipp; FitzGerald, Thomas H B; Mathys, Christoph; Dolan, Ray; Friston, Karl

    2015-10-01

    Dopamine plays a key role in learning; however, its exact function in decision making and choice remains unclear. Recently, we proposed a generic model based on active (Bayesian) inference wherein dopamine encodes the precision of beliefs about optimal policies. Put simply, dopamine discharges reflect the confidence that a chosen policy will lead to desired outcomes. We designed a novel task to test this hypothesis, where subjects played a "limited offer" game in a functional magnetic resonance imaging experiment. Subjects had to decide how long to wait for a high offer before accepting a low offer, with the risk of losing everything if they waited too long. Bayesian model comparison showed that behavior strongly supported active inference, based on surprise minimization, over classical utility maximization schemes. Furthermore, midbrain activity, encompassing dopamine projection neurons, was accurately predicted by trial-by-trial variations in model-based estimates of precision. Our findings demonstrate that human subjects infer both optimal policies and the precision of those inferences, and thus support the notion that humans perform hierarchical probabilistic Bayesian inference. In other words, subjects have to infer both what they should do as well as how confident they are in their choices, where confidence may be encoded by dopaminergic firing. © The Author 2014. Published by Oxford University Press.

  18. Category inference as a function of correlational structure, category discriminability, and number of available cues.

    PubMed

    Lancaster, Matthew E; Shelhamer, Ryan; Homa, Donald

    2013-04-01

    Two experiments investigated category inference when categories were composed of correlated or uncorrelated dimensions and the categories overlapped minimally or moderately. When the categories minimally overlapped, the dimensions were strongly correlated with the category label. Following a classification learning phase, subsequent transfer required the selection of either a category label or a feature when one, two, or three features were missing. Experiments 1 and 2 differed primarily in the number of learning blocks prior to transfer. In each experiment, the inference of the category label or category feature was influenced by both dimensional and category correlations, as well as their interaction. The number of cues available at test impacted performance more when the dimensional correlations were zero and category overlap was high. However, a minimal number of cues were sufficient to produce high levels of inference when the dimensions were highly correlated; additional cues had a positive but reduced impact, even when overlap was high. Subjects were generally more accurate in inferring the category label than a category feature regardless of dimensional correlation, category overlap, or number of cues available at test. Whether the category label functioned as a special feature or not was critically dependent upon these embedded correlations, with feature inference driven more strongly by dimensional correlations.

  19. Mineral and Geochemical Classification From Spectroscopy/Diffraction Through Neural Networks

    NASA Astrophysics Data System (ADS)

    Ferralis, N.; Grossman, J.; Summons, R. E.

    2017-12-01

    Spectroscopy and diffraction techniques are essential for understanding structural, chemical and functional properties of geological materials for Earth and Planetary Sciences. Beyond data collection, quantitative insight relies on experimentally assembled or computationally derived spectra. Inference on the geochemical or geophysical properties (such as crystallographic order, chemical functionality, elemental composition, etc.) of a particular geological material (mineral, organic matter, etc.) is based on fitting unknown spectra and comparing the fit with consolidated databases. The complexity of fitting highly convoluted spectra often limits the ability to infer geochemical characteristics, and limits the throughput for extensive datasets. With the emergence of heuristic approaches to pattern recognition through machine learning, in this work we investigate the possibility and potential of using supervised neural networks trained on available public spectroscopic databases to directly infer geochemical parameters from unknown spectra. Using Raman, infrared spectroscopy and powder X-ray diffraction from the publicly available RRUFF database, we train neural network models to classify mineral and organic compounds (pure or mixtures) based on crystallographic structure from diffraction, chemical functionality, elemental composition and bonding from spectroscopy. As expected, the accuracy of the inference is strongly dependent on the quality and extent of the training data. We will identify a series of requirements and guidelines for the training dataset needed to achieve consistent, high-accuracy inference, along with methods to compensate for limited data.

  20. Transcriptional network inference from functional similarity and expression data: a global supervised approach.

    PubMed

    Ambroise, Jérôme; Robert, Annie; Macq, Benoit; Gala, Jean-Luc

    2012-01-06

    An important challenge in systems biology is the inference of biological networks from postgenomic data. Among these biological networks, a gene transcriptional regulatory network focuses on interactions existing between transcription factors (TFs) and their corresponding target genes. A large number of reverse engineering algorithms were proposed to infer such networks from gene expression profiles, but most current methods have relatively low predictive performances. In this paper, we introduce the novel TNIFSED method (Transcriptional Network Inference from Functional Similarity and Expression Data), which infers a transcriptional network from the integration of correlations and partial correlations of gene expression profiles and gene functional similarities through a supervised classifier. In the current work, TNIFSED was applied to predict the transcriptional network in Escherichia coli and in Saccharomyces cerevisiae, using datasets of 445 and 170 Affymetrix arrays, respectively. Using the area under the curve of the receiver operating characteristics and the F-measure as indicators, we showed the predictive performance of TNIFSED to be better than unsupervised state-of-the-art methods. TNIFSED performed slightly worse than the supervised SIRENE algorithm at identifying the target genes of TFs that already have a wide range of identified targets, but better for TFs with only a few identified target genes. Our results indicate that TNIFSED is complementary to the SIRENE algorithm, and particularly suitable for discovering target genes of "orphan" TFs.
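
    The supervised formulation can be illustrated generically: each candidate TF-gene pair is described by a few features (here correlation, partial correlation, and functional similarity, all drawn from synthetic distributions) and a classifier trained on known regulations scores the remaining pairs. This is a schematic stand-in for that formulation, not the TNIFSED implementation or its E. coli / S. cerevisiae data.

      import numpy as np
      from sklearn.ensemble import RandomForestClassifier

      rng = np.random.default_rng(0)
      n_pairs = 2000
      is_edge = rng.random(n_pairs) < 0.1                  # ~10% true TF-target pairs
      # Three features per candidate (TF, gene) pair; true edges are shifted upward.
      corr = np.clip(rng.normal(0.10 + 0.40 * is_edge, 0.2), -1, 1)   # expression correlation
      pcorr = np.clip(rng.normal(0.05 + 0.30 * is_edge, 0.2), -1, 1)  # partial correlation
      sim = np.clip(rng.normal(0.30 + 0.30 * is_edge, 0.2), 0, 1)     # functional similarity
      X = np.column_stack([corr, pcorr, sim])

      train = rng.random(n_pairs) < 0.7                    # "known" regulations used for training
      clf = RandomForestClassifier(n_estimators=200, random_state=0)
      clf.fit(X[train], is_edge[train])
      scores = clf.predict_proba(X[~train])[:, 1]          # ranked candidate regulations
      print("top-ranked held-out pair score:", round(float(scores.max()), 3))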

  1. 78 FR 70601 - International Mail Contract

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... Notice of Filing Functionally Equivalent Agreement, November 15, 2013 (collectively, Notice). II. The... Service filed an application for non-public treatment of materials filed under seal (Attachment 1); a... agreement to one previously found to be functionally equivalent to the Inbound Market-Dominant Multi-Service...

  2. Equivalence-Equivalence: Matching Stimuli with Same Discriminative Functions

    ERIC Educational Resources Information Center

    Carpentier, Franck; Smeets, Paul M.; Barnes-Holmes, Dermot

    2004-01-01

    Previous studies have shown that after being trained on A-B and A-C match-to-sample tasks, adults match not only same-class B and C stimuli (equivalence) but also BC compounds with same-class elements and with different-class elements (BC-BC). The assumption was that the BC-BC performances are based on matching equivalence and nonequivalence…

  3. Evaluation of artificial time series microarray data for dynamic gene regulatory network inference.

    PubMed

    Xenitidis, P; Seimenis, I; Kakolyris, S; Adamopoulos, A

    2017-08-07

    High-throughput technology like microarrays is widely used in the inference of gene regulatory networks (GRNs). We focused on time series data since we are interested in the dynamics of GRNs and the identification of dynamic networks. We evaluated the amount of information that exists in artificial time series microarray data and the ability of an inference process to produce accurate models based on them. We used dynamic artificial gene regulatory networks in order to create artificial microarray data. Key features that characterize microarray data such as the time separation of directly triggered genes, the percentage of directly triggered genes and the triggering function type were altered in order to reveal the limits that are imposed by the nature of microarray data on the inference process. We examined the effect of various factors on the inference performance such as the network size, the presence of noise in microarray data, and the network sparseness. We used a system theory approach and examined the relationship between the pole placement of the inferred system and the inference performance. We examined the relationship between the inference performance in the time domain and the true system parameter identification. Simulation results indicated that time separation and the percentage of directly triggered genes are crucial factors. Also, network sparseness, the triggering function type and noise in input data affect the inference performance. When two factors were simultaneously varied, it was found that variation of one parameter significantly affects the dynamic response of the other. Crucial factors were also examined using a real GRN and acquired results confirmed simulation findings with artificial data. Different initial conditions were also used as an alternative triggering approach. Relevant results confirmed that the number of datasets constitutes the most significant parameter with regard to the inference performance. Copyright © 2017 Elsevier Ltd. All rights reserved.

  4. Approximation Of Multi-Valued Inverse Functions Using Clustering And Sugeno Fuzzy Inference

    NASA Technical Reports Server (NTRS)

    Walden, Maria A.; Bikdash, Marwan; Homaifar, Abdollah

    1998-01-01

    Finding the inverse of a continuous function can be challenging and computationally expensive when the inverse function is multi-valued. Difficulties may be compounded when the function itself is difficult to evaluate. We show that we can use fuzzy-logic approximators such as Sugeno inference systems to compute the inverse on-line. To do so, a fuzzy clustering algorithm can be used in conjunction with a discriminating function to split the function data into branches for the different values of the forward function. These data sets are then fed into a recursive least-squares learning algorithm that finds the proper coefficients of the Sugeno approximators; each Sugeno approximator finds one value of the inverse function. Discussions about the accuracy of the approximation will be included.
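
    A stripped-down version of the idea is sketched below, with a slope-sign discriminator standing in for the fuzzy clustering step and per-branch cubic polynomials standing in for the Sugeno approximators trained by recursive least squares; the forward function y = x**2 and all settings are illustrative assumptions.

      import numpy as np

      # Forward function y = x**2 on [-2, 2]; its inverse x = ±sqrt(y) is two-valued.
      x = np.linspace(-2.0, 2.0, 401)
      y = x ** 2

      # Discriminate branches by the sign of dy/dx: each monotone piece of the
      # forward function yields one single-valued branch of the inverse.
      slope_sign = np.sign(np.gradient(y, x))
      branches = [(slope_sign < 0), (slope_sign > 0)]

      # Fit one simple approximator per branch (a cubic in y, standing in for the
      # per-branch Sugeno fuzzy systems of the paper).
      coeffs = [np.polyfit(y[b], x[b], deg=3) for b in branches]

      y_query = 2.5
      print("inverse values at y = 2.5:",
            [round(float(np.polyval(c, y_query)), 2) for c in coeffs], "(exact: ±1.58)")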

  5. Dynamical inference: where phase synchronization and generalized synchronization meet.

    PubMed

    Stankovski, Tomislav; McClintock, Peter V E; Stefanovska, Aneta

    2014-06-01

    Synchronization is a widespread phenomenon that occurs among interacting oscillatory systems. It facilitates their temporal coordination and can lead to the emergence of spontaneous order. The detection of synchronization from the time series of such systems is of great importance for the understanding and prediction of their dynamics, and several methods for doing so have been introduced. However, the common case where the interacting systems have time-variable characteristic frequencies and coupling parameters, and may also be subject to continuous external perturbation and noise, still presents a major challenge. Here we apply recent developments in dynamical Bayesian inference to tackle these problems. In particular, we discuss how to detect phase slips and the existence of deterministic coupling from measured data, and we unify the concepts of phase synchronization and general synchronization. Starting from phase or state observables, we present methods for the detection of both phase and generalized synchronization. The consistency and equivalence of phase and generalized synchronization are further demonstrated, by the analysis of time series from analog electronic simulations of coupled nonautonomous van der Pol oscillators. We demonstrate that the detection methods work equally well on numerically simulated chaotic systems. In all the cases considered, we show that dynamical Bayesian inference can clearly identify noise-induced phase slips and distinguish coherence from intrinsic coupling-induced synchronization.

  6. 76 FR 23690 - Version One Regional Reliability Standards for Facilities Design, Connections, and Maintenance...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-28

    ... Maintenance; Protection and Control; and Voltage and Reactive AGENCY: Federal Energy Regulatory Commission..., Connections, and Maintenance; Protection and Control; and Voltage and Reactive, Notice of Proposed Rulemaking... regional definitions for Functionally Equivalent Protection System, Functionally Equivalent Remedial Action...

  7. Function of pretribosphenic and tribosphenic mammalian molars inferred from 3D animation

    NASA Astrophysics Data System (ADS)

    Schultz, Julia A.; Martin, Thomas

    2014-10-01

    Appearance of the tribosphenic molar in the Late Jurassic (160 Ma) is a crucial innovation for food processing in mammalian evolution. This molar type is characterized by a protocone, a talonid basin and a two-phased chewing cycle, all of which are apomorphic. In this functional study on the teeth of Late Jurassic Dryolestes leiriensis and the living marsupial Monodelphis domestica, we demonstrate that pretribosphenic and tribosphenic molars show fundamental differences of food reduction strategies, representing a shift in dental function during the transition of tribosphenic mammals. By using the Occlusal Fingerprint Analyser (OFA), we simulated the chewing motions of the pretribosphenic Dryolestes that represents an evolutionary precursor condition to such tribosphenic mammals as Monodelphis. Animation of chewing path and detection of collisional contacts between virtual models of teeth suggests that Dryolestes differs from the classical two-phased chewing movement of tribosphenidans, due to the narrowing of the interdental space in cervical (crown-root transition) direction, the inclination angle of the hypoflexid groove, and the unicuspid talonid. The pretribosphenic chewing cycle is equivalent to phase I of the tribosphenic chewing cycle, but the former lacks phase II of the tribosphenic chewing. The new approach can analyze the chewing cycle of the jaw by using polygonal 3D models of tooth surfaces, in a way that is complementary to the electromyography and strain gauge studies of muscle function of living animals. The technique allows alignment and scaling of isolated fossil teeth and utilizes the wear facet orientation and striation of the teeth to reconstruct the chewing path of extinct mammals.

  9. The stellar population and initial mass function of NGC 1399 with MUSE

    NASA Astrophysics Data System (ADS)

    Vaughan, Sam P.; Davies, Roger L.; Zieleniewski, Simon; Houghton, Ryan C. W.

    2018-06-01

    We present spatially resolved measurements of the stellar initial mass function (IMF) in NGC 1399, the largest elliptical galaxy in the Fornax Cluster. Using data from the Multi Unit Spectroscopic Explorer (MUSE) and updated state-of-the-art stellar population synthesis models from Conroy et al. (2018), we use full spectral fitting to measure the low-mass IMF, as well as a number of individual elemental abundances, as a function of radius in this object. We find that the IMF in NGC 1399 is heavier than the Milky Way in its centre and remains radially constant at a super-salpeter slope out to 0.7 Re. At radii larger than this, the IMF slope decreases to become marginally consistent with a Milky Way IMF just beyond Re. The inferred central V-band M/L ratio is in excellent agreement with the previously reported dynamical M/L measurement from Houghton et al. (2006). The measured radial form of the M/L ratio may be evidence for a two-phase formation in this object, with the central regions forming differently to the outskirts. We also report measurements of a spatially resolved filament of ionised gas extending 4"(404 pc at DL = 21.1 Mpc) from the centre of NGC 1399, with very narrow equivalent width and low velocity dispersion (65 ± 14 kms-1). The location of the emission, combined with an analysis of the emission line ratios, leads us to conclude that NGC 1399's AGN is the source of ionising radiation.

  10. A statistical approach for inferring the 3D structure of the genome.

    PubMed

    Varoquaux, Nelle; Ay, Ferhat; Noble, William Stafford; Vert, Jean-Philippe

    2014-06-15

    Recent technological advances allow the measurement, in a single Hi-C experiment, of the frequencies of physical contacts among pairs of genomic loci at a genome-wide scale. The next challenge is to infer, from the resulting DNA-DNA contact maps, accurate 3D models of how chromosomes fold and fit into the nucleus. Many existing inference methods rely on multidimensional scaling (MDS), in which the pairwise distances of the inferred model are optimized to resemble pairwise distances derived directly from the contact counts. These approaches, however, often optimize a heuristic objective function and require strong assumptions about the biophysics of DNA to transform interaction frequencies to spatial distance, and thereby may lead to incorrect structure reconstruction. We propose a novel approach to infer a consensus 3D structure of a genome from Hi-C data. The method incorporates a statistical model of the contact counts, assuming that the counts between two loci follow a Poisson distribution whose intensity decreases with the physical distances between the loci. The method can automatically adjust the transfer function relating the spatial distance to the Poisson intensity and infer a genome structure that best explains the observed data. We compare two variants of our Poisson method, with or without optimization of the transfer function, to four different MDS-based algorithms (two metric MDS methods using different stress functions, a non-metric version of MDS, and ChromSDE, a recently described, advanced MDS method) on a wide range of simulated datasets. We demonstrate that the Poisson models reconstruct better structures than all MDS-based methods, particularly at low coverage and high resolution, and we highlight the importance of optimizing the transfer function. On publicly available Hi-C data from mouse embryonic stem cells, we show that the Poisson methods lead to more reproducible structures than MDS-based methods when we use data generated using different restriction enzymes, and when we reconstruct structures at different resolutions. A Python implementation of the proposed method is available at http://cbio.ensmp.fr/pastis. © The Author 2014. Published by Oxford University Press.
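
    The heart of the Poisson formulation can be sketched directly: treat each contact count as Poisson with intensity beta * d**alpha and minimize the negative log-likelihood over the 3D coordinates. The toy below fixes alpha and beta (whereas the published method also optimizes this transfer function) and uses simulated counts rather than real Hi-C data.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.spatial.distance import pdist, squareform

      rng = np.random.default_rng(0)
      n, alpha, beta = 8, -3.0, 100.0                    # toy size and transfer parameters
      X_true = rng.normal(size=(n, 3))
      d_true = squareform(pdist(X_true))
      iu = np.triu_indices(n, k=1)
      counts = rng.poisson(beta * d_true[iu] ** alpha)   # c_ij ~ Poisson(beta * d_ij**alpha)

      def neg_log_lik(x_flat):
          d = pdist(x_flat.reshape(n, 3)) + 1e-9
          lam = beta * d ** alpha
          return np.sum(lam - counts * np.log(lam))      # Poisson NLL up to a constant

      fit = minimize(neg_log_lik, x0=rng.normal(size=3 * n), method="L-BFGS-B")
      d_fit = pdist(fit.x.reshape(n, 3))
      r = np.corrcoef(d_fit, d_true[iu])[0, 1]
      print(f"correlation between fitted and true pairwise distances: {r:.2f}")

    The recovered structure is only defined up to rotation, translation, reflection, and (with a fixed transfer function) scale, which is why the agreement is summarized on pairwise distances rather than coordinates.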

  11. A general framework for updating belief distributions.

    PubMed

    Bissiri, P G; Holmes, C C; Walker, S G

    2016-11-01

    We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
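
    In the simplest case the update described above takes the form of a 'Gibbs posterior', posterior ∝ prior × exp(-w × cumulative loss). The sketch below evaluates such an update on a grid for a median targeted through the absolute-error loss; the loss scale w and the flat prior are illustrative choices, and calibrating w is a separate issue discussed in the paper.

      import numpy as np

      def gibbs_posterior(theta_grid, log_prior, data, loss, w=1.0):
          # General Bayesian update: density proportional to
          # prior(theta) * exp(-w * sum_i loss(theta, x_i)), normalised on the grid.
          cum_loss = np.array([sum(loss(th, x) for x in data) for th in theta_grid])
          log_post = log_prior(theta_grid) - w * cum_loss
          log_post -= log_post.max()                       # numerical stability
          post = np.exp(log_post)
          return post / (post.sum() * (theta_grid[1] - theta_grid[0]))

      # Learning about a median via the absolute-error loss, with no likelihood specified.
      data = np.random.default_rng(1).standard_normal(50) + 2.0
      grid = np.linspace(-2.0, 6.0, 801)
      posterior = gibbs_posterior(grid, lambda t: np.zeros_like(t), data,
                                  loss=lambda t, x: abs(x - t), w=1.0)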

  12. The transfer of Crel contextual control (same, opposite, less than, more than) through equivalence relations.

    PubMed

    Perez, William F; Kovac, Roberta; Nico, Yara C; Caro, Daniel M; Fidalgo, Adriana P; Linares, Ila; de Almeida, João Henrique; de Rose, Júlio C

    2017-11-01

    According to Relational Frame Theory (RFT), Crel denotes a contextual stimulus that controls a particular type of relational response (sameness, opposition, comparative, temporal, hierarchical, etc.) in a given situation. Previous studies suggest that contextual functions may be indirectly acquired via transfer of function. The present study investigated the transfer of Crel contextual control through equivalence relations. Experiment 1 evaluated the transfer of Crel contextual functions for relational responses based on sameness and opposition. Experiment 2 extended these findings by evaluating transfer of function using comparative Crel stimuli. Both experiments followed a similar sequence of phases. First, abstract forms were established as Crel stimuli via multiple exemplar training (Phase 1). The contextual cues were then applied to establish arbitrary relations among nonsense words and to test derived relations (Phase 2). After that, equivalence relations involving the original Crel stimuli and other abstract forms were trained and tested (Phase 3). Transfer of function was evaluated by replacing the directly established Crel stimuli with their equivalent stimuli in the former experimental tasks (Phases 1 and 2). Results from both experiments suggest that Crel contextual control may be extended via equivalence relations, allowing other arbitrarily related stimuli to indirectly acquire Crel functions and regulate behavior by evoking appropriate relational responses in the presence of both previously known and novel stimuli. © 2017 Society for the Experimental Analysis of Behavior.

  13. When ab ≠ c - c': published errors in the reports of single-mediator models.

    PubMed

    Petrocelli, John V; Clarkson, Joshua J; Whitmire, Melanie B; Moon, Paul E

    2013-06-01

    Accurate reports of mediation analyses are critical to the assessment of inferences related to causality, since these inferences are consequential for both the evaluation of previous research (e.g., meta-analyses) and the progression of future research. However, upon reexamination, approximately 15% of published articles in psychology contain at least one incorrect statistical conclusion (Bakker & Wicherts, Behavior research methods, 43, 666-678 2011), disparities that beget the question of inaccuracy in mediation reports. To quantify this question of inaccuracy, articles reporting standard use of single-mediator models in three high-impact journals in personality and social psychology during 2011 were examined. More than 24% of the 156 models coded failed an equivalence test (i.e., ab = c - c'), suggesting that one or more regression coefficients in mediation analyses are frequently misreported. The authors cite common sources of errors, provide recommendations for enhanced accuracy in reports of single-mediator models, and discuss implications for alternative methods.
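
    For readers who want to reproduce the equivalence test, the sketch below checks the OLS identity a*b = c - c' on simulated data; the variable names and simulated effect sizes are illustrative and are not taken from the coded articles.

      import numpy as np

      def single_mediator_check(x, m, y):
          # a: effect of X on M;  b: effect of M on Y controlling for X;
          # c: total effect of X on Y;  c': direct effect of X on Y controlling for M.
          X1 = np.column_stack([np.ones_like(x), x])        # intercept + X
          XM = np.column_stack([np.ones_like(x), x, m])     # intercept + X + M
          a = np.linalg.lstsq(X1, m, rcond=None)[0][1]
          c = np.linalg.lstsq(X1, y, rcond=None)[0][1]
          c_prime, b = np.linalg.lstsq(XM, y, rcond=None)[0][1:3]
          return a * b, c - c_prime                          # should coincide up to rounding

      rng = np.random.default_rng(0)
      x = rng.standard_normal(200)
      m = 0.5 * x + rng.standard_normal(200)
      y = 0.3 * m + 0.2 * x + rng.standard_normal(200)
      print(single_mediator_check(x, m, y))                  # the two numbers match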

  14. Sheathfolds in rheomorphic ignimbrites

    USGS Publications Warehouse

    Branney, M.J.; Barry, T.L.; Godchaux, Martha

    2004-01-01

    Structural reappraisal of several classic rheomorphic ignimbrites in Colorado, Idaho, the Canary Islands and Italy has, for the first time, revealed abundant oblique folds, curvilinear folds and sheathfolds which formed during emplacement. Like their equivalents in tectonic shear-zones, the sheathfold axes lie sub-parallel to a pervasive elongation lineation, and appear as eye structures on rock surfaces normal to the transport direction. With the recognition of sheathfolds, ignimbrites previously inferred to have undergone complex rheomorphic deformation histories are re-interpreted as recording a single, progressive deformation event. In some examples, the trends of sheathfolds and related lineations change with height through a single ignimbrite, suggesting that rheomorphism did not affect the entire thickness of ignimbrite synchronously. Instead, we infer that in these ignimbrites a thin ductile shear-zone rose gradually through the aggrading agglutinating mass whilst the flow direction varied with time. This suggests that, in some cases, both welding and rheomorphism can be extremely rapid, with ductile strain rates significantly exceeding rates of ignimbrite aggradation. © Springer-Verlag 2004.

  15. The HCUP SID Imputation Project: Improving Statistical Inferences for Health Disparities Research by Imputing Missing Race Data.

    PubMed

    Ma, Yan; Zhang, Wei; Lyman, Stephen; Huang, Yihe

    2018-06-01

    To identify the most appropriate imputation method for missing data in the HCUP State Inpatient Databases (SID) and assess the impact of different missing data methods on racial disparities research. HCUP SID. A novel simulation study compared four imputation methods (random draw, hot deck, joint multiple imputation [MI], conditional MI) for missing values for multiple variables, including race, gender, admission source, median household income, and total charges. The simulation was built on real data from the SID to retain their hierarchical data structures and missing data patterns. Additional predictive information from the U.S. Census and American Hospital Association (AHA) database was incorporated into the imputation. Conditional MI prediction was equivalent or superior to the best performing alternatives for all missing data structures and substantially outperformed each of the alternatives in various scenarios. Conditional MI substantially improved statistical inferences for racial health disparities research with the SID. © Health Research and Educational Trust.
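
    A rough, purely illustrative stand-in for conditional multiple imputation is chained-equation imputation with posterior sampling, sketched below with scikit-learn's IterativeImputer; the study's actual models, the handling of categorical variables such as race, and the linkage to Census and AHA predictors are not reproduced here.

      import numpy as np
      from sklearn.experimental import enable_iterative_imputer  # noqa: F401
      from sklearn.impute import IterativeImputer

      def conditional_mi(data, n_imputations=5, seed=0):
          # Draw several completed datasets, each from a chained-equations model
          # with posterior sampling (a simplified analogue of conditional MI).
          return [IterativeImputer(sample_posterior=True, random_state=seed + m).fit_transform(data)
                  for m in range(n_imputations)]

      # Toy numeric example with roughly 10% of values missing at random.
      rng = np.random.default_rng(0)
      x = rng.normal(size=(500, 4))
      x[rng.random(x.shape) < 0.1] = np.nan
      imputed_sets = conditional_mi(x)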

  16. Modern inhalation anesthetics: Potent greenhouse gases in the global atmosphere

    NASA Astrophysics Data System (ADS)

    Vollmer, Martin K.; Rhee, Tae Siek; Rigby, Matt; Hofstetter, Doris; Hill, Matthias; Schoenenberger, Fabian; Reimann, Stefan

    2015-03-01

    Modern halogenated inhalation anesthetics undergo little metabolization during clinical application and evaporate almost completely to the atmosphere. Based on their first measurements in a range of environments, from urban areas to the pristine Antarctic environment, we detect a rapid accumulation and ubiquitous presence of isoflurane, desflurane, and sevoflurane in the global atmosphere. Over the past decade, their abundances in the atmosphere have increased to global mean mole fractions in 2014 of 0.097 ppt, 0.30 ppt, and 0.13 ppt (parts per trillion, 10^-12, in dry air), respectively. Emissions of these long-lived greenhouse gases inferred from the observations suggest a global combined release to the atmosphere of 3.1 ± 0.6 million t CO2 equivalent in 2014, of which ≈80% stems from desflurane. We also report on halothane, a previously widely used anesthetic. Its global mean mole fraction has declined to 9.2 ppq (parts per quadrillion, 10^-15) by 2014. However, the inferred present usage is still 280 ± 120 t yr^-1.

  17. The 4D hyperspherical diffusion wavelet: A new method for the detection of localized anatomical variation.

    PubMed

    Hosseinbor, Ameer Pasha; Kim, Won Hwa; Adluru, Nagesh; Acharya, Amit; Vorperian, Houri K; Chung, Moo K

    2014-01-01

    Recently, the HyperSPHARM algorithm was proposed to parameterize multiple disjoint objects in a holistic manner using the 4D hyperspherical harmonics. The HyperSPHARM coefficients are global; they cannot be used to directly infer localized variations in signal. In this paper, we present a unified wavelet framework that links HyperSPHARM to the diffusion wavelet transform. Specifically, we will show that the HyperSPHARM basis forms a subset of a wavelet-based multiscale representation of surface-based signals. This wavelet, termed the hyperspherical diffusion wavelet, is a consequence of the equivalence of isotropic heat diffusion smoothing and the diffusion wavelet transform on the hypersphere. Our framework allows for the statistical inference of highly localized anatomical changes, which we demonstrate in the first-ever developmental study on the hyoid bone investigating gender and age effects. We also show that the hyperspherical wavelet successfully picks up group-wise differences that are barely detectable using SPHARM.

  18. The 4D Hyperspherical Diffusion Wavelet: A New Method for the Detection of Localized Anatomical Variation

    PubMed Central

    Hosseinbor, A. Pasha; Kim, Won Hwa; Adluru, Nagesh; Acharya, Amit; Vorperian, Houri K.; Chung, Moo K.

    2014-01-01

    Recently, the HyperSPHARM algorithm was proposed to parameterize multiple disjoint objects in a holistic manner using the 4D hyperspherical harmonics. The HyperSPHARM coefficients are global; they cannot be used to directly infer localized variations in signal. In this paper, we present a unified wavelet framework that links HyperSPHARM to the diffusion wavelet transform. Specifically, we will show that the HyperSPHARM basis forms a subset of a wavelet-based multiscale representation of surface-based signals. This wavelet, termed the hyperspherical diffusion wavelet, is a consequence of the equivalence of isotropic heat diffusion smoothing and the diffusion wavelet transform on the hypersphere. Our framework allows for the statistical inference of highly localized anatomical changes, which we demonstrate in the first-ever developmental study on the hyoid bone investigating gender and age effects. We also show that the hyperspherical wavelet successfully picks up group-wise differences that are barely detectable using SPHARM. PMID:25320783

  19. 76 FR 36583 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-06-22

    ... Docket No. CP2010-36 serve as the baseline contract for future functional equivalence analyses of the... commitments, do not alter the contract's functional equivalency. Id. at 4. The Postal Service asserts that... offered or the fundamental structure of the contract. Therefore, it requests that the instant contract be...

  20. 75 FR 72845 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-26

    ... contract for future functional equivalence analyses of the GEPS 3 product. \\1\\ Notice of United States... analysis of the formulas, and certification of the Governors' vote; and Attachment 4--an application for... costing information and volume commitments, do not alter the contracts' functional equivalency. Id. at 3-4...

  1. 75 FR 69715 - New Postal Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-11-15

    ... contract for future functional equivalence analyses of the GEPS 3 product. \\1\\ Notice of United States... GEPS contracts, a description of applicable GEPS contracts, formulas for prices, an analysis of the... information and volume commitments, do not alter the contracts' functional equivalency. Id. at 3-4. The Postal...

  2. Exponential localization of Wannier functions in insulators.

    PubMed

    Brouder, Christian; Panati, Gianluca; Calandra, Matteo; Mourougane, Christophe; Marzari, Nicola

    2007-01-26

    The exponential localization of Wannier functions in two or three dimensions is proven for all insulators that display time-reversal symmetry, settling a long-standing conjecture. Our proof relies on the equivalence between the existence of analytic quasi-Bloch functions and the nullity of the Chern numbers (or of the Hall current) for the system under consideration. The same equivalence implies that Chern insulators cannot display exponentially localized Wannier functions. An explicit condition for the reality of the Wannier functions is identified.

  3. Inferring the temperature dependence of population parameters: the effects of experimental design and inference algorithm

    PubMed Central

    Palamara, Gian Marco; Childs, Dylan Z; Clements, Christopher F; Petchey, Owen L; Plebani, Marco; Smith, Matthew J

    2014-01-01

    Understanding and quantifying the temperature dependence of population parameters, such as intrinsic growth rate and carrying capacity, is critical for predicting the ecological responses to environmental change. Many studies provide empirical estimates of such temperature dependencies, but a thorough investigation of the methods used to infer them has not been performed yet. We created artificial population time series using a stochastic logistic model parameterized with the Arrhenius equation, so that activation energy drives the temperature dependence of population parameters. We simulated different experimental designs and used different inference methods, varying the likelihood functions and other aspects of the parameter estimation methods. Finally, we applied the best performing inference methods to real data for the species Paramecium caudatum. The relative error of the estimates of activation energy varied between 5% and 30%. The fraction of habitat sampled played the most important role in determining the relative error; sampling at least 1% of the habitat kept it below 50%. We found that methods that simultaneously use all time series data (direct methods) and methods that estimate population parameters separately for each temperature (indirect methods) are complementary. Indirect methods provide a clearer insight into the shape of the functional form describing the temperature dependence of population parameters; direct methods enable a more accurate estimation of the parameters of such functional forms. Using both methods, we found that growth rate and carrying capacity of Paramecium caudatum scale with temperature according to different activation energies. Our study shows how careful choice of experimental design and inference methods can increase the accuracy of the inferred relationships between temperature and population parameters. The comparison of estimation methods provided here can increase the accuracy of model predictions, with important implications in understanding and predicting the effects of temperature on the dynamics of populations. PMID:25558365
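
    The sketch below generates an artificial time series in the spirit described above: population parameters scaled with temperature by the Arrhenius equation, a stochastic logistic update, and binomial sampling of a fraction of the habitat. The activation energies, noise model and sampling fraction are placeholder values, not those of the study.

      import numpy as np

      BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV / K

      def arrhenius(value_at_tref, activation_energy, temp_k, tref_k=293.15):
          # Arrhenius scaling of a rate-like parameter with absolute temperature.
          return value_at_tref * np.exp(-activation_energy / BOLTZMANN_EV
                                        * (1.0 / temp_k - 1.0 / tref_k))

      def simulate_logistic(r, k, n0=10, steps=100, dt=0.1, sampling_fraction=0.01, seed=0):
          # Logistic growth with crude demographic noise and binomial habitat sampling.
          rng = np.random.default_rng(seed)
          n = float(n0)
          observed = []
          for _ in range(steps):
              growth = r * n * (1.0 - n / k) * dt
              n = max(n + growth + rng.normal(scale=np.sqrt(max(n, 1.0) * dt)), 0.0)
              observed.append(rng.binomial(int(round(n)), sampling_fraction))
          return np.array(observed)

      # Illustrative activation energies: 0.65 eV for growth rate, -0.3 eV for carrying capacity.
      series = simulate_logistic(arrhenius(0.5, 0.65, 298.15), arrhenius(1e4, -0.3, 298.15))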

  4. Reading biological processes from nucleotide sequences

    NASA Astrophysics Data System (ADS)

    Murugan, Anand

    Cellular processes have traditionally been investigated by techniques of imaging and biochemical analysis of the molecules involved. The recent rapid progress in our ability to manipulate and read nucleic acid sequences gives us direct access to the genetic information that directs and constrains biological processes. While sequence data is being used widely to investigate genotype-phenotype relationships and population structure, here we use sequencing to understand biophysical mechanisms. We present work on two different systems. First, in chapter 2, we characterize the stochastic genetic editing mechanism that produces diverse T-cell receptors in the human immune system. We do this by inferring statistical distributions of the underlying biochemical events that generate T-cell receptor coding sequences from the statistics of the observed sequences. This inferred model quantitatively describes the potential repertoire of T-cell receptors that can be produced by an individual, providing insight into its potential diversity and the probability of generation of any specific T-cell receptor. Then in chapter 3, we present work on understanding the functioning of regulatory DNA sequences in both prokaryotes and eukaryotes. Here we use experiments that measure the transcriptional activity of large libraries of mutagenized promoters and enhancers and infer models of the sequence-function relationship from this data. For the bacterial promoter, we infer a physically motivated 'thermodynamic' model of the interaction of DNA-binding proteins and RNA polymerase determining the transcription rate of the downstream gene. For the eukaryotic enhancers, we infer heuristic models of the sequence-function relationship and use these models to find synthetic enhancer sequences that optimize inducibility of expression. Both projects demonstrate the utility of sequence information in conjunction with sophisticated statistical inference techniques for dissecting underlying biophysical mechanisms.

  5. Spectral likelihood expansions for Bayesian inference

    NASA Astrophysics Data System (ADS)

    Nagel, Joseph B.; Sudret, Bruno

    2016-03-01

    A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
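
    A minimal one-dimensional version of the expansion is sketched below, assuming a standard normal prior as the reference density and normalised probabilists' Hermite polynomials as the orthogonal basis; the likelihood coefficients are obtained by linear least squares on prior samples, and the model evidence is read off as the leading coefficient. The toy likelihood is an assumption for illustration only.

      import numpy as np
      from numpy.polynomial.hermite_e import hermevander
      from math import factorial

      def spectral_likelihood_expansion(likelihood, degree=10, n_samples=4000, seed=0):
          # Fit L(theta) ~ sum_k a_k psi_k(theta) by least squares, where the psi_k
          # are Hermite polynomials orthonormal under the standard normal prior.
          rng = np.random.default_rng(seed)
          theta = rng.standard_normal(n_samples)                 # samples from the prior
          norms = np.array([np.sqrt(factorial(k)) for k in range(degree + 1)])
          psi = hermevander(theta, degree) / norms               # orthonormal basis values
          coeffs, *_ = np.linalg.lstsq(psi, likelihood(theta), rcond=None)
          evidence = coeffs[0]                                   # prior mean of L equals a_0
          return coeffs, evidence

      # Toy inverse problem: Gaussian likelihood around an observation y = 1.2.
      coeffs, evidence = spectral_likelihood_expansion(
          lambda t: np.exp(-0.5 * (1.2 - t) ** 2 / 0.25) / np.sqrt(2 * np.pi * 0.25))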

  6. In search of functional association from time-series microarray data based on the change trend and level of gene expression

    PubMed Central

    He, Feng; Zeng, An-Ping

    2006-01-01

    Background The increasing availability of time-series expression data opens up new possibilities to study functional linkages of genes. Present methods used to infer functional linkages between genes from expression data are mainly based on a point-to-point comparison. Change trends between consecutive time points in time-series data have been so far not well explored. Results In this work we present a new method based on extracting main features of the change trend and level of gene expression between consecutive time points. The method, termed as trend correlation (TC), includes two major steps: 1, calculating a maximal local alignment of change trend score by dynamic programming and a change trend correlation coefficient between the maximal matched change levels of each gene pair; 2, inferring relationships of gene pairs based on two statistical extraction procedures. The new method considers time shifts and inverted relationships in a similar way as the local clustering (LC) method but the latter is merely based on a point-to-point comparison. The TC method is demonstrated with data from yeast cell cycle and compared with the LC method and the widely used Pearson correlation coefficient (PCC) based clustering method. The biological significance of the gene pairs is examined with several large-scale yeast databases. Although the TC method predicts an overall lower number of gene pairs than the other two methods at a same p-value threshold, the additional number of gene pairs inferred by the TC method is considerable: e.g. 20.5% compared with the LC method and 49.6% with the PCC method for a p-value threshold of 2.7E-3. Moreover, the percentage of the inferred gene pairs consistent with databases by our method is generally higher than the LC method and similar to the PCC method. A significant number of the gene pairs only inferred by the TC method are process-identity or function-similarity pairs or have well-documented biological interactions, including 443 known protein interactions and some known cell cycle related regulatory interactions. It should be emphasized that the overlapping of gene pairs detected by the three methods is normally not very high, indicating a necessity of combining the different methods in search of functional association of genes from time-series data. For a p-value threshold of 1E-5 the percentage of process-identity and function-similarity gene pairs among the shared part of the three methods reaches 60.2% and 55.6% respectively, building a good basis for further experimental and functional study. Furthermore, the combined use of methods is important to infer more complete regulatory circuits and network as exemplified in this study. Conclusion The TC method can significantly augment the current major methods to infer functional linkages and biological network and is well suitable for exploring temporal relationships of gene expression in time-series data. PMID:16478547
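
    The fragment below illustrates the flavour of step 1: change trends between consecutive time points are encoded as symbols, and a local alignment score over the trend symbols is computed with a Smith-Waterman-style recursion. The scoring scheme and gap penalty are assumptions and do not reproduce the TC method's exact score or its statistical extraction procedures.

      import numpy as np

      def change_trend(series):
          # Encode the change between consecutive time points as +1 / 0 / -1.
          return np.sign(np.diff(series)).astype(int)

      def local_trend_alignment(trend_a, trend_b, match=1.0, mismatch=-1.0):
          # Smith-Waterman-style local alignment score of two trend sequences.
          n, m = len(trend_a), len(trend_b)
          h = np.zeros((n + 1, m + 1))
          for i in range(1, n + 1):
              for j in range(1, m + 1):
                  s = match if trend_a[i - 1] == trend_b[j - 1] else mismatch
                  h[i, j] = max(0.0, h[i - 1, j - 1] + s,
                                h[i - 1, j] + mismatch, h[i, j - 1] + mismatch)
          return h.max()

      g1 = np.array([1.0, 1.4, 2.0, 1.8, 1.1, 0.9])
      g2 = np.array([0.5, 0.9, 1.5, 1.6, 1.0, 0.7])     # similar trend, shifted expression level
      score = local_trend_alignment(change_trend(g1), change_trend(g2))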

  7. 78 FR 57470 - Special Conditions: Eclipse, EA500, Certification of Autothrottle Functions

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-19

    ... Engine Control System 23-112A-SC for High Intensity Radiated Fields (HIRF) Protection Equivalent Levels... transient. (e) Under rare normal and non-normal conditions, disengagement of any automatic control function... standards that the Administrator considers necessary to establish a level of safety equivalent to that...

  8. Diagnostic causal reasoning with verbal information.

    PubMed

    Meder, Björn; Mayrhofer, Ralf

    2017-08-01

    In diagnostic causal reasoning, the goal is to infer the probability of causes from one or multiple observed effects. Typically, studies investigating such tasks provide subjects with precise quantitative information regarding the strength of the relations between causes and effects or sample data from which the relevant quantities can be learned. By contrast, we sought to examine people's inferences when causal information is communicated through qualitative, rather vague verbal expressions (e.g., "X occasionally causes A"). We conducted three experiments using a sequential diagnostic inference task, where multiple pieces of evidence were obtained one after the other. Quantitative predictions of different probabilistic models were derived using the numerical equivalents of the verbal terms, taken from an unrelated study with different subjects. We present a novel Bayesian model that allows for incorporating the temporal weighting of information in sequential diagnostic reasoning, which can be used to model both primacy and recency effects. On the basis of 19,848 judgments from 292 subjects, we found a remarkably close correspondence between the diagnostic inferences made by subjects who received only verbal information and those of a matched control group to whom information was presented numerically. Whether information was conveyed through verbal terms or numerical estimates, diagnostic judgments closely resembled the posterior probabilities entailed by the causes' prior probabilities and the effects' likelihoods. We observed interindividual differences regarding the temporal weighting of evidence in sequential diagnostic reasoning. Our work provides pathways for investigating judgment and decision making with verbal information within a computational modeling framework. Copyright © 2017 Elsevier Inc. All rights reserved.
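
    A toy version of sequential diagnostic updating with temporal weighting is sketched below: each piece of evidence contributes a log-likelihood ratio whose weight decays with its distance from the most recent observation. The decay scheme and the numerical readings of the verbal terms are illustrative assumptions, not the paper's fitted model.

      import numpy as np

      def weighted_diagnostic_posterior(prior, likelihoods, decay=1.0):
          # likelihoods: (p(effect | cause), p(effect | not cause)) tuples in observed order.
          # decay < 1 down-weights earlier evidence (recency); decay > 1 would emphasise it.
          log_odds = np.log(prior / (1.0 - prior))
          n = len(likelihoods)
          for i, (p_e_cause, p_e_other) in enumerate(likelihoods):
              weight = decay ** (n - 1 - i)          # most recent evidence gets weight 1
              log_odds += weight * np.log(p_e_cause / p_e_other)
          return 1.0 / (1.0 + np.exp(-log_odds))

      # "X occasionally causes A" read as p(A|X)=0.2 vs p(A|not X)=0.05 (assumed numerical equivalents).
      p = weighted_diagnostic_posterior(0.5, [(0.2, 0.05), (0.7, 0.3)], decay=0.8)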

  9. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving

    PubMed Central

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-01-01

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using less information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires less inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as for example lowering cognitive effort and working memory load. PMID:25652466

  10. Equivalent electron fluence for solar proton damage in GaAs shallow junction cells

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Stock, L. V.

    1984-01-01

    The short-circuit current reduction in GaAs shallow junction heteroface solar cells was calculated according to a simplified solar cell damage model in which the nonuniformity of the damage as a function of penetration depth is treated explicitly. Although the equivalent electron fluence was not uniquely defined for low-energy monoenergetic proton exposure, an equivalent electron fluence is found for proton spectra characteristic of the space environment. The equivalent electron fluence ratio was calculated for a typical large solar flare event, for which the proton spectrum is Φ_p(E) = A/E protons cm^-2, where E is in MeV. The equivalent fluence ratio is a function of the cover glass shield thickness or the corresponding cutoff energy E_c. In terms of the cutoff energy, the equivalent 1 MeV electron fluence ratio is r_p(E_c) = 10^9/E_c^1.8, where E_c is in keV.

  11. Inferring Toxicological Responses of HepG2 Cells from ...

    EPA Pesticide Factsheets

    Understanding the dynamic perturbation of cell states by chemicals can aid in predicting their adverse effects. High-content imaging (HCI) was used to measure the state of HepG2 cells over three time points (1, 24, and 72 h) in response to 976 ToxCast chemicals at 10 different concentrations (0.39-200 µM). Cell state was characterized by p53 activation (p53), c-Jun activation (SK), phospho-Histone H2A.x (OS), phospho-Histone H3 (MA), alpha tubulin (Mt), mitochondrial membrane potential (MMP), mitochondrial mass (MM), cell cycle arrest (CCA), nuclear size (NS) and cell number (CN). Dynamic cell state perturbations due to each chemical concentration were used to infer coarse-grained dependencies between cellular functions as Boolean networks (BNs). BNs were inferred from the data in two steps. First, the data for each state variable were discretized into changed/active (> 1 standard deviation) and unchanged/inactive values. Second, the discretized data were used to learn Boolean relationships between variables. In our case, a BN is a wiring diagram between nodes that represent the 10 previously described observable phenotypes. Functional relationships between nodes were represented as Boolean functions. We found that the inferred BNs show that the HepG2 cell response is chemical and concentration specific. We observed the presence of both point and cycle BN attractors. In addition, there are instances where Boolean functions were not found. We believe that this may be either
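
    The two inference steps described above can be illustrated on a toy scale as below: readouts are discretized as changed (beyond one baseline standard deviation) versus unchanged, and the best-fitting two-input Boolean function linking a pair of phenotypes to a third is found by exhaustive search. The thresholding, the variable pairing and the restriction to two-input functions are simplifying assumptions.

      import numpy as np
      from itertools import product

      def discretize(values, baseline_mean, baseline_std):
          # Step 1: 1 if the readout changed by more than one baseline SD, else 0.
          return (np.abs(values - baseline_mean) > baseline_std).astype(int)

      def best_boolean_rule(inputs, target):
          # Step 2: exhaustively search all 2-input Boolean functions for the one that
          # best fits `target`; returns the truth table (outputs for 00, 01, 10, 11).
          best, best_err = None, np.inf
          idx = inputs[:, 0] * 2 + inputs[:, 1]
          for table in product([0, 1], repeat=4):
              err = np.sum(np.array(table)[idx] != target)
              if err < best_err:
                  best, best_err = table, err
          return best, best_err

      rng = np.random.default_rng(0)
      x = rng.integers(0, 2, size=(50, 2))
      y = x[:, 0] & x[:, 1]                       # underlying rule: logical AND
      print(best_boolean_rule(x, y))              # typically recovers the AND table (0, 0, 0, 1)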

  12. Factorizing the motion sensitivity function into equivalent input noise and calculation efficiency.

    PubMed

    Allard, Rémy; Arleo, Angelo

    2017-01-01

    The photopic motion sensitivity function of the energy-based motion system is band-pass peaking around 8 Hz. Using an external noise paradigm to factorize the sensitivity into equivalent input noise and calculation efficiency, the present study investigated if the variation in photopic motion sensitivity as a function of the temporal frequency is due to a variation of equivalent input noise (e.g., early temporal filtering) or calculation efficiency (ability to select and integrate motion). For various temporal frequencies, contrast thresholds for a direction discrimination task were measured in presence and absence of noise. Up to 15 Hz, the sensitivity variation was mainly due to a variation of equivalent input noise and little variation in calculation efficiency was observed. The sensitivity fall-off at very high temporal frequencies (from 15 to 30 Hz) was due to a combination of a drop of calculation efficiency and a rise of equivalent input noise. A control experiment in which an artificial temporal integration was applied to the stimulus showed that an early temporal filter (generally assumed to affect equivalent input noise, not calculation efficiency) could impair both the calculation efficiency and equivalent input noise at very high temporal frequencies. We conclude that at the photopic luminance intensity tested, the variation of motion sensitivity as a function of the temporal frequency was mainly due to early temporal filtering, not to the ability to select and integrate motion. More specifically, we conclude that photopic motion sensitivity at high temporal frequencies is limited by internal noise occurring after the transduction process (i.e., neural noise), not by quantal noise resulting from the probabilistic absorption of photons by the photoreceptors as previously suggested.
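
    The factorization rests on the standard linear-amplifier relation in which the squared contrast threshold grows linearly with external noise, threshold^2 = (N_ext + N_eq)/efficiency. The fit below uses hypothetical threshold values purely to show how equivalent input noise and calculation efficiency are separated; it is not the study's data or exact model.

      import numpy as np
      from scipy.optimize import curve_fit

      def threshold_model(n_ext, n_eq, efficiency):
          # Linear-amplifier model: squared threshold is linear in external noise energy.
          return np.sqrt((n_ext + n_eq) / efficiency)

      # Hypothetical thresholds measured in absence (n_ext = 0) and presence of external noise.
      n_ext = np.array([0.0, 0.0, 1e-4, 1e-4])
      thresholds = np.array([0.01, 0.012, 0.03, 0.032])
      (n_eq, eff), _ = curve_fit(threshold_model, n_ext, thresholds,
                                 p0=[1e-5, 0.1], bounds=(0, np.inf))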

  13. Emotional ties that bind: the roles of valence and consistency of group emotion in inferences of cohesiveness and common fate.

    PubMed

    Magee, Joe C; Tiedens, Larissa Z

    2006-12-01

    In three studies, observers based inferences about the cohesiveness and common fate of groups on the emotions expressed by group members. The valence of expressions affected cohesiveness inferences, whereas the consistency of expressions affected inferences of whether members have common fate. These emotion composition effects were stronger than those due to the race or sex composition of the group. Furthermore, the authors show that emotion valence and consistency are differentially involved in judgments about the degree to which the group as a whole was responsible for group performance. Finally, it is demonstrated that valence-cohesiveness effects are mediated by inferences of interpersonal liking and that consistency-common fate effects are mediated by inferences of psychological similarity. These findings have implications for the literature on entitativity and regarding the function of emotions in social contexts.

  14. Human Inferences about Sequences: A Minimal Transition Probability Model

    PubMed Central

    2016-01-01

    The brain constantly infers the causes of the inputs it receives and uses these inferences to generate statistical expectations about future observations. Experimental evidence for these expectations and their violations include explicit reports, sequential effects on reaction times, and mismatch or surprise signals recorded in electrophysiology and functional MRI. Here, we explore the hypothesis that the brain acts as a near-optimal inference device that constantly attempts to infer the time-varying matrix of transition probabilities between the stimuli it receives, even when those stimuli are in fact fully unpredictable. This parsimonious Bayesian model, with a single free parameter, accounts for a broad range of findings on surprise signals, sequential effects and the perception of randomness. Notably, it explains the pervasive asymmetry between repetitions and alternations encountered in those studies. Our analysis suggests that a neural machinery for inferring transition probabilities lies at the core of human sequence knowledge. PMID:28030543
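
    A minimal sketch of the idea, assuming a simple 'leaky counting' scheme: transition counts decay at every step so that recent observations dominate, with the decay rate playing the role of the model's single free parameter. The paper's inference is a full Bayesian treatment; this is only a caricature of it.

      import numpy as np

      def leaky_transition_estimates(sequence, n_symbols=2, leak=0.95, prior_count=1.0):
          # Running estimates of a (possibly time-varying) transition matrix with forgetting.
          counts = np.full((n_symbols, n_symbols), prior_count)
          estimates = []
          for prev, nxt in zip(sequence[:-1], sequence[1:]):
              counts *= leak                        # forget old evidence
              counts[prev, nxt] += 1.0              # incorporate the new transition
              estimates.append(counts / counts.sum(axis=1, keepdims=True))
          return estimates                          # one estimated matrix per observation

      seq = np.random.default_rng(3).integers(0, 2, size=200)   # fully unpredictable stimuli
      mats = leaky_transition_estimates(seq)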

  15. A prior-based integrative framework for functional transcriptional regulatory network inference

    PubMed Central

    Siahpirani, Alireza F.

    2017-01-01

    Abstract Transcriptional regulatory networks specify regulatory proteins controlling the context-specific expression levels of genes. Inference of genome-wide regulatory networks is central to understanding gene regulation, but remains an open challenge. Expression-based network inference is among the most popular methods to infer regulatory networks, however, networks inferred from such methods have low overlap with experimentally derived (e.g. ChIP-chip and transcription factor (TF) knockouts) networks. Currently we have a limited understanding of this discrepancy. To address this gap, we first develop a regulatory network inference algorithm, based on probabilistic graphical models, to integrate expression with auxiliary datasets supporting a regulatory edge. Second, we comprehensively analyze our and other state-of-the-art methods on different expression perturbation datasets. Networks inferred by integrating sequence-specific motifs with expression have substantially greater agreement with experimentally derived networks, while remaining more predictive of expression than motif-based networks. Our analysis suggests natural genetic variation as the most informative perturbation for network inference, and, identifies core TFs whose targets are predictable from expression. Multiple reasons make the identification of targets of other TFs difficult, including network architecture and insufficient variation of TF mRNA level. Finally, we demonstrate the utility of our inference algorithm to infer stress-specific regulatory networks and for regulator prioritization. PMID:27794550

  16. Comparative Analysis of Membership Function on Mamdani Fuzzy Inference System for Decision Making

    NASA Astrophysics Data System (ADS)

    harliana, Putri; Rahim, Robbi

    2017-12-01

    A membership function is a curve that maps input data points to a membership value (degree of membership) in the interval between 0 and 1. One way to obtain a membership value is through a functional approach. Several membership functions can be used in a Mamdani fuzzy inference system, including triangular, trapezoidal, singleton, sigmoid, and Gaussian functions. This paper discusses only three membership functions: triangular, trapezoidal, and Gaussian. These three membership functions are compared to examine the differences in parameter values and in the results obtained. The case study in this paper is the admission of students to a popular school, using three variables: students' report grades, IQ score, and parents' income, from which if-then rules are then created.
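
    The three membership functions compared in the paper can be written in a few lines, as sketched below; the breakpoints used for the illustrative 'IQ score' variable are hypothetical and not taken from the case study.

      import numpy as np

      def triangular(x, a, b, c):
          # Triangular membership: 0 at a and c, 1 at the peak b.
          return np.clip(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0, 1.0)

      def trapezoidal(x, a, b, c, d):
          # Trapezoidal membership: rises over [a, b], flat over [b, c], falls over [c, d].
          return np.clip(np.minimum.reduce([(x - a) / (b - a),
                                            np.ones_like(x, dtype=float),
                                            (d - x) / (d - c)]), 0.0, 1.0)

      def gaussian(x, mean, sigma):
          # Gaussian membership centred on `mean` with spread `sigma`.
          return np.exp(-0.5 * ((x - mean) / sigma) ** 2)

      # Illustrative 'IQ score' values and hypothetical breakpoints.
      iq = np.array([85.0, 100.0, 115.0, 130.0])
      print(triangular(iq, 90, 110, 130), trapezoidal(iq, 80, 95, 115, 135), gaussian(iq, 110, 12))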

  17. Conceptual Influences on Category-Based Induction

    ERIC Educational Resources Information Center

    Gelman, Susan A.; Davidson, Natalie S.

    2013-01-01

    One important function of categories is to permit rich inductive inferences. Prior work shows that children use category labels to guide their inductive inferences. However, there are competing theories to explain this phenomenon, differing in the roles attributed to conceptual information vs. perceptual similarity. Seven experiments with 4- to…

  18. Flexible Retrieval: When True Inferences Produce False Memories

    ERIC Educational Resources Information Center

    Carpenter, Alexis C.; Schacter, Daniel L.

    2017-01-01

    Episodic memory involves flexible retrieval processes that allow us to link together distinct episodes, make novel inferences across overlapping events, and recombine elements of past experiences when imagining future events. However, the same flexible retrieval and recombination processes that underpin these adaptive functions may also leave…

  19. IMNN: Information Maximizing Neural Networks

    NASA Astrophysics Data System (ADS)

    Charnock, Tom; Lavaux, Guilhem; Wandelt, Benjamin D.

    2018-04-01

    This software trains artificial neural networks to find non-linear functionals of data that maximize Fisher information: information maximizing neural networks (IMNNs). As compressing large data sets vastly simplifies both frequentist and Bayesian inference, important information may be inadvertently missed. Likelihood-free inference based on automatically derived IMNN summaries produces summaries that are good approximations to sufficient statistics. IMNNs are robustly capable of automatically finding optimal, non-linear summaries of the data even in cases where linear compression fails: inferring the variance of Gaussian signal in the presence of noise, inferring cosmological parameters from mock simulations of the Lyman-α forest in quasar spectra, and inferring frequency-domain parameters from LISA-like detections of gravitational waveforms. In this final case, the IMNN summary outperforms linear data compression by avoiding the introduction of spurious likelihood maxima.
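
    The quantity the network maximizes can be illustrated, for a single parameter, by assembling the Fisher information of already-compressed summaries from simulations at the fiducial value and at small parameter offsets. The sketch below shows only that assembly, not the network or its training, and the array shapes are assumptions.

      import numpy as np

      def fisher_information(summaries, summaries_plus, summaries_minus, delta_theta):
          # summaries: (n_sims, n_summaries) at the fiducial parameter value;
          # summaries_plus / summaries_minus: the same simulations at theta +/- delta_theta.
          cov = np.atleast_2d(np.cov(summaries, rowvar=False))
          d_mu = np.atleast_1d((summaries_plus.mean(0) - summaries_minus.mean(0))
                               / (2.0 * delta_theta))        # numerical derivative of the mean
          return d_mu @ np.linalg.solve(cov, d_mu)            # scalar Fisher information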

  20. Accounting for Non-Gaussian Sources of Spatial Correlation in Parametric Functional Magnetic Resonance Imaging Paradigms I: Revisiting Cluster-Based Inferences.

    PubMed

    Gopinath, Kaundinya; Krishnamurthy, Venkatagiri; Sathian, K

    2018-02-01

    In a recent study, Eklund et al. employed resting-state functional magnetic resonance imaging data as a surrogate for null functional magnetic resonance imaging (fMRI) datasets and posited that cluster-wise family-wise error (FWE) rate-corrected inferences made by using parametric statistical methods in fMRI studies over the past two decades may have been invalid, particularly for cluster defining thresholds less stringent than p < 0.001; this was principally because the spatial autocorrelation functions (sACF) of fMRI data had been modeled incorrectly to follow a Gaussian form, whereas empirical data suggested otherwise. Here, we show that accounting for non-Gaussian signal components such as those arising from resting-state neural activity as well as physiological responses and motion artifacts in the null fMRI datasets yields first- and second-level general linear model analysis residuals with nearly uniform and Gaussian sACF. Further comparison with nonparametric permutation tests indicates that cluster-based FWE corrected inferences made with Gaussian spatial noise approximations are valid.

  1. The Derived Transfer and Reversal of Mood Functions through Equivalence Relations: II

    ERIC Educational Resources Information Center

    Cahill, Jane; Barnes-Holmes, Yvonne; Barnes-Holmes, Dermot; Rodriguez-Valverde, Miguel; Luciano, Carmen; Smeets, Paul M.

    2007-01-01

    Recent research has demonstrated the transfer of induced mood functions through equivalence relations by means of a musical mood-induction procedure. The research described in this article replicated and extended such work, primarily with the inclusion of a baseline and two types of reversal procedures. First, 16 adult participants were trained…

  2. 21 CFR 1404.925 - Conviction.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 9 2010-04-01 2010-04-01 false Conviction. 1404.925 Section 1404.925 Food and... a plea of nolo contendere; or (b) Any other resolution that is the functional equivalent of a... participation of the court is the functional equivalent of a judgment only if it includes an admission of guilt. ...

  3. 77 FR 48448 - Connect America Fund; A National Broadband Plan for Our Future; Establishing Just and Reasonable...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-14

    ... to the total reduction required in 2012. In addition, the Bureau clarifies that non-commercial mobile... than their functionally equivalent interstate rates in making this transition. 6. Carriers and state... functionally equivalent interstate switched access rate element rates. Other of the carrier's intrastate...

  4. Are Letter Detection and Proofreading Tasks Equivalent?

    ERIC Educational Resources Information Center

    Saint-Aubin, Jean; Losier, Marie-Claire; Roy, Macha; Lawrence, Mike

    2015-01-01

    When readers search for misspellings in a proofreading task or for a letter in a letter detection task, they are more likely to omit function words than content words. However, with misspelled words, previous findings for the letter detection task were mixed. In two experiments, the authors tested the functional equivalence of both tasks. Results…

  5. 21 CFR 1404.925 - Conviction.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 9 2011-04-01 2011-04-01 false Conviction. 1404.925 Section 1404.925 Food and... a plea of nolo contendere; or (b) Any other resolution that is the functional equivalent of a... participation of the court is the functional equivalent of a judgment only if it includes an admission of guilt. ...

  6. Mapping causal functional contributions derived from the clinical assessment of brain damage after stroke

    PubMed Central

    Zavaglia, Melissa; Forkert, Nils D.; Cheng, Bastian; Gerloff, Christian; Thomalla, Götz; Hilgetag, Claus C.

    2015-01-01

    Lesion analysis reveals causal contributions of brain regions to mental functions, aiding the understanding of normal brain function as well as rehabilitation of brain-damaged patients. We applied a novel lesion inference technique based on game theory, Multi-perturbation Shapley value Analysis (MSA), to a large clinical lesion dataset. We used MSA to analyze the lesion patterns of 148 acute stroke patients together with their neurological deficits, as assessed by the National Institutes of Health Stroke Scale (NIHSS). The results revealed regional functional contributions to essential behavioral and cognitive functions as reflected in the NIHSS, particularly by subcortical structures. There were also side specific differences of functional contributions between the right and left hemispheric brain regions which may reflect the dominance of the left hemispheric syndrome aphasia in the NIHSS. Comparison of MSA to established lesion inference methods demonstrated the feasibility of the approach for analyzing clinical data and indicated its capability for objectively inferring functional contributions from multiple injured, potentially interacting sites, at the cost of having to predict the outcome of unknown lesion configurations. The analysis of regional functional contributions to neurological symptoms measured by the NIHSS contributes to the interpretation of this widely used standardized stroke scale in clinical practice as well as clinical trials and provides a first approximation of a ‘map of stroke’. PMID:26448908

  7. Mapping causal functional contributions derived from the clinical assessment of brain damage after stroke.

    PubMed

    Zavaglia, Melissa; Forkert, Nils D; Cheng, Bastian; Gerloff, Christian; Thomalla, Götz; Hilgetag, Claus C

    2015-01-01

    Lesion analysis reveals causal contributions of brain regions to mental functions, aiding the understanding of normal brain function as well as rehabilitation of brain-damaged patients. We applied a novel lesion inference technique based on game theory, Multi-perturbation Shapley value Analysis (MSA), to a large clinical lesion dataset. We used MSA to analyze the lesion patterns of 148 acute stroke patients together with their neurological deficits, as assessed by the National Institutes of Health Stroke Scale (NIHSS). The results revealed regional functional contributions to essential behavioral and cognitive functions as reflected in the NIHSS, particularly by subcortical structures. There were also side specific differences of functional contributions between the right and left hemispheric brain regions which may reflect the dominance of the left hemispheric syndrome aphasia in the NIHSS. Comparison of MSA to established lesion inference methods demonstrated the feasibility of the approach for analyzing clinical data and indicated its capability for objectively inferring functional contributions from multiple injured, potentially interacting sites, at the cost of having to predict the outcome of unknown lesion configurations. The analysis of regional functional contributions to neurological symptoms measured by the NIHSS contributes to the interpretation of this widely used standardized stroke scale in clinical practice as well as clinical trials and provides a first approximation of a 'map of stroke'.
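
    The game-theoretic core of MSA is the Shapley value of each region, i.e. its average marginal contribution to performance over all orderings of the other regions. The sketch below computes exact Shapley values for a toy performance predictor; in MSA this predictor is trained from the observed lesion patterns, and exhaustive enumeration is only feasible for small numbers of regions.

      from itertools import combinations
      from math import factorial

      def shapley_values(regions, performance):
          # `performance` maps a frozenset of intact regions to a behavioural score.
          n = len(regions)
          values = {}
          for r in regions:
              others = [x for x in regions if x != r]
              total = 0.0
              for k in range(n):
                  for subset in combinations(others, k):
                      s = frozenset(subset)
                      weight = factorial(k) * factorial(n - k - 1) / factorial(n)
                      total += weight * (performance(s | {r}) - performance(s))
              values[r] = total
          return values

      # Toy predictor: region A contributes 2, B contributes 1, and A and B interact (+0.5).
      perf = lambda intact: (2.0 * ('A' in intact) + 1.0 * ('B' in intact)
                             + 0.5 * ('A' in intact and 'B' in intact))
      print(shapley_values(['A', 'B', 'C'], perf))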

  8. Obesity as a risk factor for developing functional limitation among older adults: A conditional inference tree analysis.

    PubMed

    Cheng, Feon W; Gao, Xiang; Bao, Le; Mitchell, Diane C; Wood, Craig; Sliwinski, Martin J; Smiciklas-Wright, Helen; Still, Christopher D; Rolston, David D K; Jensen, Gordon L

    2017-07-01

    To examine the risk factors of developing functional decline and make probabilistic predictions by using a tree-based method that allows higher order polynomials and interactions of the risk factors. The conditional inference tree analysis, a data mining approach, was used to construct a risk stratification algorithm for developing functional limitation based on BMI and other potential risk factors for disability in 1,951 older adults without functional limitations at baseline (baseline age 73.1 ± 4.2 y). We also analyzed the data with multivariate stepwise logistic regression and compared the two approaches (e.g., cross-validation). Over a mean of 9.2 ± 1.7 years of follow-up, 221 individuals developed functional limitation. Higher BMI, age, and comorbidity were consistently identified as significant risk factors for functional decline using both methods. Based on these factors, individuals were stratified into four risk groups via the conditional inference tree analysis. Compared to the low-risk group, all other groups had a significantly higher risk of developing functional limitation. The odds ratio comparing two extreme categories was 9.09 (95% confidence interval: 4.68, 17.6). Higher BMI, age, and comorbid disease were consistently identified as significant risk factors for functional decline among older individuals across all approaches and analyses. © 2017 The Obesity Society.
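
    Conditional inference trees (e.g. R's partykit::ctree) choose splits with permutation tests; as a rough stand-in, the sketch below grows an ordinary CART tree on simulated BMI, age and comorbidity data to illustrate stratification into a handful of risk groups. The simulated data and coefficients are placeholders, not the study's.

      import numpy as np
      from sklearn.tree import DecisionTreeClassifier, export_text

      # Simulated stand-in data: BMI, age (years), and comorbidity count.
      rng = np.random.default_rng(42)
      n = 1951
      X = np.column_stack([rng.normal(27, 4, n),
                           rng.normal(73, 4, n),
                           rng.poisson(1.5, n)])
      risk = 1 / (1 + np.exp(-(-6 + 0.08 * X[:, 0] + 0.04 * X[:, 1] + 0.3 * X[:, 2])))
      y = rng.binomial(1, risk)                      # 1 = developed functional limitation

      # An ordinary CART tree as a stand-in for a permutation-test-based conditional inference tree.
      tree = DecisionTreeClassifier(max_depth=2, min_samples_leaf=100).fit(X, y)
      print(export_text(tree, feature_names=["BMI", "age", "comorbidity"]))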

  9. Large-scale inference of gene function through phylogenetic annotation of Gene Ontology terms: case study of the apoptosis and autophagy cellular processes.

    PubMed

    Feuermann, Marc; Gaudet, Pascale; Mi, Huaiyu; Lewis, Suzanna E; Thomas, Paul D

    2016-01-01

    We previously reported a paradigm for large-scale phylogenomic analysis of gene families that takes advantage of the large corpus of experimentally supported Gene Ontology (GO) annotations. This 'GO Phylogenetic Annotation' approach integrates GO annotations from evolutionarily related genes across ∼100 different organisms in the context of a gene family tree, in which curators build an explicit model of the evolution of gene functions. GO Phylogenetic Annotation models the gain and loss of functions in a gene family tree, which is used to infer the functions of uncharacterized (or incompletely characterized) gene products, even for human proteins that are relatively well studied. Here, we report our results from applying this paradigm to two well-characterized cellular processes, apoptosis and autophagy. This revealed several important observations with respect to GO annotations and how they can be used for function inference. Notably, we applied only a small fraction of the experimentally supported GO annotations to infer function in other family members. The majority of other annotations describe indirect effects, phenotypes or results from high throughput experiments. In addition, we show here how feedback from phylogenetic annotation leads to significant improvements in the PANTHER trees, the GO annotations and GO itself. Thus GO phylogenetic annotation both increases the quantity and improves the accuracy of the GO annotations provided to the research community. We expect these phylogenetically based annotations to be of broad use in gene enrichment analysis as well as other applications of GO annotations. Database URL: http://amigo.geneontology.org/amigo. © The Author(s) 2016. Published by Oxford University Press.

  10. Receiver function deconvolution using transdimensional hierarchical Bayesian inference

    NASA Astrophysics Data System (ADS)

    Kolb, J. M.; Lekić, V.

    2014-06-01

    Teleseismic waves can convert from shear to compressional (Sp) or compressional to shear (Ps) across impedance contrasts in the subsurface. Deconvolving the parent waveforms (P for Ps or S for Sp) from the daughter waveforms (S for Ps or P for Sp) generates receiver functions which can be used to analyse velocity structure beneath the receiver. Though a variety of deconvolution techniques have been developed, they are all adversely affected by background and signal-generated noise. In order to take into account the unknown noise characteristics, we propose a method based on transdimensional hierarchical Bayesian inference in which both the noise magnitude and noise spectral character are parameters in calculating the likelihood probability distribution. We use a reversible-jump implementation of a Markov chain Monte Carlo algorithm to find an ensemble of receiver functions whose relative fits to the data have been calculated while simultaneously inferring the values of the noise parameters. Our noise parametrization is determined from pre-event noise so that it approximates observed noise characteristics. We test the algorithm on synthetic waveforms contaminated with noise generated from a covariance matrix obtained from observed noise. We show that the method retrieves easily interpretable receiver functions even in the presence of high noise levels. We also show that we can obtain useful estimates of noise amplitude and frequency content. Analysis of the ensemble solutions produced by our method can be used to quantify the uncertainties associated with individual receiver functions as well as with individual features within them, providing an objective way for deciding which features warrant geological interpretation. This method should make possible more robust inferences on subsurface structure using receiver function analysis, especially in areas of poor data coverage or under noisy station conditions.
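
    A heavily simplified, fixed-dimension sketch of the hierarchical idea is shown below: a Metropolis sampler treats the noise magnitude as an unknown parameter alongside the model parameters, so the data themselves constrain how much misfit to tolerate. The real method is transdimensional (reversible-jump) and also infers the noise spectral character; nothing here is specific to receiver functions.

      import numpy as np

      def hierarchical_metropolis(data, forward, n_params, n_iter=20000, step=0.05, seed=0):
          # Joint sampling of model parameters and the log noise standard deviation
          # under a Gaussian likelihood with unknown noise level (flat priors assumed).
          rng = np.random.default_rng(seed)
          theta = np.zeros(n_params + 1)                 # model parameters + log noise std

          def log_post(t):
              sigma = np.exp(t[-1])
              resid = data - forward(t[:-1])
              return -0.5 * np.sum(resid ** 2) / sigma ** 2 - data.size * np.log(sigma)

          lp = log_post(theta)
          samples = []
          for _ in range(n_iter):
              prop = theta + step * rng.standard_normal(theta.size)
              lp_prop = log_post(prop)
              if np.log(rng.uniform()) < lp_prop - lp:
                  theta, lp = prop, lp_prop
              samples.append(theta.copy())
          return np.array(samples)

      # Toy usage: infer a constant signal level and the (unknown) noise level jointly.
      rng = np.random.default_rng(1)
      data = 2.0 + 0.3 * rng.standard_normal(200)
      chain = hierarchical_metropolis(data, lambda p: p[0] * np.ones(data.size), n_params=1)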

  11. Imaging of the native inversion layer in Silicon-On-Insulator wafers via Scanning Surface Photovoltage: Implications for RF device performance

    NASA Astrophysics Data System (ADS)

    Dahanayaka, Daminda; Wong, Andrew; Kaszuba, Philip; Moszkowicz, Leon; Slinkman, James; IBM SPV Lab Team

    2014-03-01

    Silicon-On-Insulator (SOI) technology has proved beneficial for RF cell phone technologies, offering performance equivalent to that of GaAs technologies. However, there is an evident parasitic inversion layer under the Buried Oxide (BOX) at the interface with the high-resistivity Si substrate. The latter is inferred from capacitance-voltage measurements on MOSCAPs. The inversion layer has adverse effects on RF device performance. We present data which, for the first time, show the extent of the inversion layer in the underlying substrate. This knowledge has driven processing techniques to suppress the inversion layer.

  12. The curious incident of the photo that was accused of being false: issues of domain specificity in development, autism, and brain imaging.

    PubMed

    Perner, Josef; Leekam, Susan

    2008-01-01

    We resume an exchange of ideas with Uta Frith that started before the turn of the century. The curious incident responsible for this exchange was the finding that children with autism fail tests of false belief, while they pass Zaitchik's (1990) photograph task (Leekam & Perner, 1991). This finding led to the conclusion that children with autism have a domain-specific impairment in Theory of Mind (mental representations), because the photograph task and the false-belief task are structurally equivalent except for the nonmental character of photographs. In this paper we argue that the false-belief task and the false-photograph task are not structurally equivalent and are not empirically associated. Instead a truly structurally equivalent task is the false-sign task. Performance on this task is strongly associated with the false-belief task. A version of this task, the misleading-signal task, also poses severe problems for children with autism (Bowler, Briskman, Gurvidi, & Fornells-Ambrojo, 2005). These new findings therefore challenge the earlier interpretation of a domain-specific difficulty in inferring mental states and suggest that children with autism also have difficulty understanding misleading nonmental objects. Brain imaging data using false-belief, "false"-photo, and false-sign scenarios provide further supporting evidence for our conclusions.

  13. Inter-Individual Variability in High-Throughput Risk ...

    EPA Pesticide Factsheets

    We incorporate realistic human variability into an open-source high-throughput (HT) toxicokinetics (TK) modeling framework for use in a next-generation risk prioritization approach. Risk prioritization involves rapid triage of thousands of environmental chemicals, most of which have little or no existing TK data. Chemicals are prioritized based on model estimates of hazard and exposure, to decide which chemicals should be first in line for further study. Hazard may be estimated with in vitro HT screening assays, e.g., U.S. EPA's ToxCast program. Bioactive ToxCast concentrations can be extrapolated to doses that produce equivalent concentrations in body tissues using a reverse TK approach in which generic TK models are parameterized with 1) chemical-specific parameters derived from in vitro measurements and predicted from chemical structure, and 2) physiological parameters for a virtual population. Here we draw physiological parameters from realistic estimates of distributions of demographic and anthropometric quantities in the modern U.S. population, based on the most recent CDC NHANES data. A Monte Carlo approach, accounting for the correlation structure in physiological parameters, is used to estimate ToxCast equivalent doses for the most sensitive portion of the population. To quantify risk, ToxCast equivalent doses are compared to estimates of exposure rates based on Bayesian inferences drawn from NHANES urinary analyte biomonitoring data. The inclusion

  14. [Equivalent continuous noise level in neonatal intensive care unit associated to burnout syndrome].

    PubMed

    Garrido Galindo, A P; Camargo Caicedo, Y; Vélez-Pereira, A M

    2015-01-01

    Noise levels in neonatal intensive care units can trigger symptoms associated with burnout, such as stress, irritability, fatigue and emotional instability, in health care personnel. The aim of this study was to evaluate the equivalent continuous noise levels in the neonatal intensive care unit and compare the results with the noise levels associated with the occurrence of burnout syndrome in the care team. Continuous sampling was conducted for 20 days using a type I sound level meter on the unit. The maximum, the ninetieth percentile and the equivalent continuous noise level (Leq) values were recorded. Noise levels were in the range of 51.4-77.6 decibels A (dBA), with an average of 64 dBA, a maximum of 100.6 dBA, and an average background noise level of 57.9 dBA. Noise levels exceed the standards suggested for neonatal intensive care units, approach the maximum values specified for occupational noise exposure, and approach the noise levels associated with the onset of burnout; this suggests that the high noise levels present in the unit may contribute to the development of burnout in caregivers. Copyright © 2013 Elsevier España, S.L.U. y SEEIUC. All rights reserved.
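
    For reference, the equivalent continuous level reported above is the energy average of the sampled levels, Leq = 10 log10(mean(10^(L_i/10))); a one-line implementation is sketched below with illustrative values.

      import numpy as np

      def equivalent_continuous_level(levels_dba):
          # Leq from equally spaced sound level samples, in dBA.
          levels = np.asarray(levels_dba, dtype=float)
          return 10.0 * np.log10(np.mean(10.0 ** (levels / 10.0)))

      print(equivalent_continuous_level([51.4, 64.0, 77.6, 57.9]))   # illustrative values only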

  15. Ecosystem Food Web Lift-The-Flap Pages

    ERIC Educational Resources Information Center

    Atwood-Blaine, Dana; Rule, Audrey C.; Morgan, Hannah

    2016-01-01

    In the lesson on which this practical article is based, third grade students constructed a "lift-the-flap" page to explore food webs on the prairie. The moveable papercraft focused student attention on prairie animals' external structures and how the inferred functions of those structures could support further inferences about the…

  16. Investigation of Mentalizing and Visuospatial Perspective Taking for Self and Other in Asperger Syndrome

    ERIC Educational Resources Information Center

    David, Nicole; Aumann, Carolin; Bewernick, Bettina H.; Santos, Natacha S.; Lehnhardt, Fritz-G.; Vogeley, Kai

    2010-01-01

    Mentalizing refers to making inferences about other people's mental states, whereas visuospatial perspective taking refers to inferring other people's viewpoints. Both abilities seem vital for social functioning; yet, their exact relationship is unclear. We directly compared mentalizing and visuospatial perspective taking in nineteen adults with…

  17. Application of Adjoint Methodology to Supersonic Aircraft Design Using Reversed Equivalent Areas

    NASA Technical Reports Server (NTRS)

    Rallabhandi, Sriram K.

    2013-01-01

    This paper presents an approach to shaping an aircraft to meet equivalent-area-based objectives using the discrete adjoint approach. Equivalent areas can be obtained either using a reversed augmented Burgers equation or by direct conversion of off-body pressures into equivalent area. Formal coupling with CFD allows computation of sensitivities of equivalent-area objectives with respect to aircraft shape parameters. The exactness of the adjoint sensitivities is verified against derivatives obtained using the complex-step approach. This methodology has the benefit of using designer-friendly equivalent areas in the shape design of low-boom aircraft. Shape optimization results with equivalent-area cost functionals are discussed and further refined using ground-loudness-based objectives.
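
    The complex-step approach used here to verify the adjoint sensitivities perturbs the design variable along the imaginary axis, so the derivative is recovered without subtractive cancellation. A minimal sketch on a toy scalar function (the actual verification in the paper is against CFD-based equivalent-area objectives):

```python
import numpy as np

def complex_step(f, x, h=1e-30):
    """Complex-step derivative: accurate to machine precision, no subtractive cancellation."""
    return np.imag(f(x + 1j * h)) / h

# Toy surrogate for an equivalent-area objective as a function of one shape parameter
f = lambda x: np.exp(x) * np.sin(x) ** 2

x0 = 0.7
analytic = np.exp(x0) * (np.sin(x0) ** 2 + 2 * np.sin(x0) * np.cos(x0))
print(complex_step(f, x0), analytic)  # the two values agree to ~16 significant digits
```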

  18. Response of a tissue equivalent proportional counter to neutrons

    NASA Technical Reports Server (NTRS)

    Badhwar, G. D.; Robbins, D. E.; Gibbons, F.; Braby, L. A.

    2002-01-01

    The absorbed dose as a function of lineal energy was measured at the CERN-EC Reference-field Facility (CERF) using a 512-channel tissue equivalent proportional counter (TEPC), and the neutron dose equivalent response was evaluated. Although there are some differences, the measured dose equivalent is in agreement with that measured by the 16-channel HANDI tissue equivalent counter. A comparison of TEPC measurements with those made by a silicon solid-state detector for low linear energy transfer particles produced by the same beam is presented. The measurements show that about 4% of the dose equivalent is delivered by particles heavier than protons generated in the conducting tissue equivalent plastic. Copyright © 2002 Elsevier Science Ltd. All rights reserved.
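
    Dose equivalent is obtained from a TEPC by weighting the measured dose distribution in lineal energy with a quality factor. The sketch below uses the ICRP 60 Q(L) relationship and treats lineal energy as a surrogate for LET, with a made-up four-bin spectrum; a real analysis would use the instrument's 512 channels and the appropriate Q(y) convention.

```python
import numpy as np

def icrp60_quality_factor(L):
    """ICRP 60 quality factor as a function of unrestricted LET (keV/um)."""
    L = np.asarray(L, dtype=float)
    return np.where(L < 10, 1.0,
                    np.where(L <= 100, 0.32 * L - 2.2, 300.0 / np.sqrt(L)))

def mean_quality_factor(y_bins, dose_fraction):
    """Dose-weighted mean quality factor, treating lineal energy y as a surrogate for LET
    (a common TEPC approximation). Dose equivalent H = mean_Q * absorbed dose."""
    q = icrp60_quality_factor(y_bins)
    return float(np.sum(q * dose_fraction) / np.sum(dose_fraction))

# Illustrative 4-bin dose distribution in lineal energy (keV/um)
y = np.array([1.0, 15.0, 60.0, 150.0])
d = np.array([0.40, 0.30, 0.20, 0.10])   # fraction of absorbed dose in each bin
print("mean quality factor:", mean_quality_factor(y, d))
```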

  19. Assessing population genetic structure via the maximisation of genetic distance

    PubMed Central

    2009-01-01

    Background The inference of the hidden structure of a population is an essential issue in population genetics. Recently, several methods have been proposed to infer population structure. Methods In this study, a new method to infer the number of clusters and to assign individuals to the inferred populations is proposed, based on the maximisation of genetic distance (MGD). This approach does not make any assumption about Hardy-Weinberg or linkage equilibrium. The implemented criterion is the maximisation (via a simulated annealing algorithm) of the averaged genetic distance between a predefined number of clusters. The performance of this method is compared with two Bayesian approaches: STRUCTURE and BAPS, using simulated data and also a real human data set. Results The simulations show that with a reduced number of markers, BAPS overestimates the number of clusters and presents a reduced proportion of correct groupings. The accuracy of the new method is approximately the same as for STRUCTURE. Also, in cases of Hardy-Weinberg and linkage disequilibrium, BAPS performs incorrectly. In these situations, STRUCTURE and the new method show an equivalent behaviour with respect to the number of inferred clusters, although the proportion of correct groupings is slightly better with the new method. Re-establishing equilibrium with the randomisation procedures improves the precision of the Bayesian approaches. All methods have good precision for FST ≥ 0.03, but only STRUCTURE estimates the correct number of clusters for FST as low as 0.01. In situations with a high number of clusters or a more complex population structure, MGD performs better than STRUCTURE and BAPS. The results for a human data set analysed with the new method are congruent with the geographical regions previously found. Conclusion This new method to infer the hidden structure of a population, based on the maximisation of genetic distance and making no assumptions about Hardy-Weinberg or linkage equilibrium, performs well under different simulated scenarios and with real data. Therefore, it could be a useful tool to determine genetically homogeneous groups, especially in those situations where the number of clusters is high, with complex population structure, and where Hardy-Weinberg and/or linkage disequilibrium are present. PMID:19900278
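
    A minimal sketch of the MGD idea: assign individuals to a fixed number of clusters and let simulated annealing search for the labelling that maximises the average distance between cluster allele-frequency centroids. The centroid-based distance, cooling schedule and toy genotypes below are illustrative choices, not necessarily those used in the paper.

```python
import numpy as np

rng = np.random.default_rng(1)

def between_cluster_distance(genotypes, labels, k):
    """Average pairwise Euclidean distance between cluster allele-frequency centroids
    (a simple stand-in for the genetic distance maximised by the method)."""
    centroids = np.array([genotypes[labels == c].mean(axis=0) for c in range(k)])
    dists = [np.linalg.norm(centroids[i] - centroids[j])
             for i in range(k) for j in range(i + 1, k)]
    return float(np.mean(dists))

def anneal(genotypes, k, steps=20_000, t0=1.0, cooling=0.9995):
    """Simulated annealing over cluster assignments, maximising between-cluster distance."""
    n = len(genotypes)
    labels = np.concatenate([np.arange(k), rng.integers(0, k, n - k)])  # every cluster non-empty
    rng.shuffle(labels)
    score = between_cluster_distance(genotypes, labels, k)
    t = t0
    for _ in range(steps):
        i, new_c = rng.integers(n), rng.integers(k)
        old_c = labels[i]
        if new_c == old_c or np.sum(labels == old_c) == 1:
            continue                                   # never empty a cluster
        labels[i] = new_c
        new_score = between_cluster_distance(genotypes, labels, k)
        if new_score >= score or rng.random() < np.exp((new_score - score) / t):
            score = new_score                          # accept the move
        else:
            labels[i] = old_c                          # reject and revert
        t *= cooling
    return labels, score

# Toy genotypes: three populations with different allele frequencies at 50 loci
genos = np.vstack([rng.binomial(2, p, (30, 50)) for p in (0.2, 0.5, 0.8)]).astype(float)
labels, score = anneal(genos, k=3)
```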

  20. Centralized PI control for high dimensional multivariable systems based on equivalent transfer function.

    PubMed

    Luan, Xiaoli; Chen, Qiang; Liu, Fei

    2014-09-01

    This article presents a new scheme to design a full-matrix controller for high-dimensional multivariable processes based on an equivalent transfer function (ETF). Differing from existing ETF methods, the proposed ETF is derived directly by exploiting the relationship between the equivalent closed-loop transfer function and the inverse of the open-loop transfer function. Based on the obtained ETF, the full-matrix controller is designed using existing PI tuning rules. The newly proposed ETF model can more accurately represent the original processes. Furthermore, the full-matrix centralized controller design method proposed in this paper is applicable to high-dimensional multivariable systems with satisfactory performance. Comparison with other multivariable controllers shows that the designed ETF-based controller is superior with respect to design complexity and obtained performance. Copyright © 2014 ISA. Published by Elsevier Ltd. All rights reserved.

  1. Inferring multi-scale neural mechanisms with brain network modelling

    PubMed Central

    Schirner, Michael; McIntosh, Anthony Randal; Jirsa, Viktor; Deco, Gustavo

    2018-01-01

    The neurophysiological processes underlying non-invasive brain activity measurements are incompletely understood. Here, we developed a connectome-based brain network model that integrates individual structural and functional data with neural population dynamics to support multi-scale neurophysiological inference. Simulated populations were linked by structural connectivity and, as a novelty, driven by electroencephalography (EEG) source activity. Simulations not only predicted subjects' individual resting-state functional magnetic resonance imaging (fMRI) time series and spatial network topologies over 20 minutes of activity, but more importantly, they also revealed precise neurophysiological mechanisms that underlie and link six empirical observations from different scales and modalities: (1) resting-state fMRI oscillations, (2) functional connectivity networks, (3) excitation-inhibition balance, (4, 5) inverse relationships between α-rhythms, spike-firing and fMRI on short and long time scales, and (6) fMRI power-law scaling. These findings underscore the potential of this new modelling framework for general inference and integration of neurophysiological knowledge to complement empirical studies. PMID:29308767

  2. Shared neural circuits for mentalizing about the self and others.

    PubMed

    Lombardo, Michael V; Chakrabarti, Bhismadev; Bullmore, Edward T; Wheelwright, Sally J; Sadek, Susan A; Suckling, John; Baron-Cohen, Simon

    2010-07-01

    Although many examples exist for shared neural representations of self and other, it is unknown how such shared representations interact with the rest of the brain. Furthermore, do high-level inference-based shared mentalizing representations interact with lower level embodied/simulation-based shared representations? We used functional neuroimaging (fMRI) and a functional connectivity approach to assess these questions during high-level inference-based mentalizing. Shared mentalizing representations in ventromedial prefrontal cortex, posterior cingulate/precuneus, and temporo-parietal junction (TPJ) all exhibited identical functional connectivity patterns during mentalizing of both self and other. Connectivity patterns were distributed across low-level embodied neural systems such as the frontal operculum/ventral premotor cortex, the anterior insula, the primary sensorimotor cortex, and the presupplementary motor area. These results demonstrate that identical neural circuits are implementing processes involved in mentalizing of both self and other and that the nature of such processes may be the integration of low-level embodied processes within higher level inference-based mentalizing.

  3. Revisiting Evidence for Modularity and Functional Equivalence across Verbal and Spatial Domains in Memory

    ERIC Educational Resources Information Center

    Guerard, Katherine; Tremblay, Sebastien

    2008-01-01

    The authors revisited evidence in favor of modularity and of functional equivalence between the processing of verbal and spatial information in short-term memory. This was done by investigating the patterns of intrusions, omissions, transpositions, and fill-ins in verbal and spatial serial recall and order reconstruction tasks under control,…

  4. Evaluating Treatments for Functionally Equivalent Problem Behavior Maintained by Adult Compliance with Mands during Interactive Play

    ERIC Educational Resources Information Center

    Schmidt, Jonathan D.; Bednar, Mary K.; Willse, Lena V.; Goetzel, Amanda L.; Concepcion, Anthony; Pincus, Shari M.; Hardesty, Samantha L.; Bowman, Lynn G.

    2017-01-01

    A primary goal of behavioral interventions is to reduce dangerous or inappropriate behavior and to generalize treatment effects across various settings. However, there is a lack of research evaluating generalization of treatment effects while individuals with functionally equivalent problem behavior interact with each other. For the current study,…

  5. Contingency Mapping: Use of a Novel Visual Support Strategy as an Adjunct to Functional Equivalence Training

    ERIC Educational Resources Information Center

    Brown, Kenneth E.; Mirenda, Pat

    2006-01-01

    This study evaluated the effectiveness of contingency mapping, a new visual support strategy designed to enhance clients' understanding of the contingencies associated with functional equivalence training (FET). The study was conducted in a general education classroom with an adolescent boy with autism who engaged in prompt dependent behavior. A…

  6. 46 CFR 62.15-1 - Conditions under which equivalents may be used.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 2 2014-10-01 2014-10-01 false Conditions under which equivalents may be used. 62.15-1 Section 62.15-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING VITAL... level of safety and reliability. Demonstration of functional equivalence must include comparison of a...

  7. 46 CFR 62.15-1 - Conditions under which equivalents may be used.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 2 2012-10-01 2012-10-01 false Conditions under which equivalents may be used. 62.15-1 Section 62.15-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING VITAL... level of safety and reliability. Demonstration of functional equivalence must include comparison of a...

  8. 46 CFR 62.15-1 - Conditions under which equivalents may be used.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 2 2013-10-01 2013-10-01 false Conditions under which equivalents may be used. 62.15-1 Section 62.15-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING VITAL... level of safety and reliability. Demonstration of functional equivalence must include comparison of a...

  9. 46 CFR 62.15-1 - Conditions under which equivalents may be used.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Conditions under which equivalents may be used. 62.15-1 Section 62.15-1 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING VITAL... level of safety and reliability. Demonstration of functional equivalence must include comparison of a...

  10. Bees Algorithm for Construction of Multiple Test Forms in E-Testing

    ERIC Educational Resources Information Center

    Songmuang, Pokpong; Ueno, Maomi

    2011-01-01

    The purpose of this research is to automatically construct multiple equivalent test forms that have equivalent qualities indicated by test information functions based on item response theory. There has been a trade-off in previous studies between the computational costs and the equivalent qualities of test forms. To alleviate this problem, we…

  11. EMR-based medical knowledge representation and inference via Markov random fields and distributed representation learning.

    PubMed

    Zhao, Chao; Jiang, Jingchi; Guan, Yi; Guo, Xitong; He, Bin

    2018-05-01

    Electronic medical records (EMRs) contain medical knowledge that can be used for clinical decision support (CDS). Our objective is to develop a general system that can extract and represent knowledge contained in EMRs to support three CDS tasks (test recommendation, initial diagnosis, and treatment plan recommendation) given the condition of a patient. We extracted four kinds of medical entities from records and constructed an EMR-based medical knowledge network (EMKN), in which nodes are entities and edges reflect their co-occurrence in a record. Three bipartite subgraphs (bigraphs) were extracted from the EMKN, one to support each task. One part of the bigraph was the given condition (e.g., symptoms), and the other was the condition to be inferred (e.g., diseases). Each bigraph was regarded as a Markov random field (MRF) to support the inference. We proposed three graph-based energy functions and three likelihood-based energy functions. Two of these functions are based on knowledge representation learning and can provide distributed representations of medical entities. Two EMR datasets and three metrics were utilized to evaluate the performance. As a whole, the evaluation results indicate that the proposed system outperformed the baseline methods. The distributed representation of medical entities does reflect similarity relationships with respect to knowledge level. Combining EMKN and MRF is an effective approach for general medical knowledge representation and inference. Different tasks, however, require individually designed energy functions. Copyright © 2018 Elsevier B.V. All rights reserved.
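
    As a toy illustration of scoring a candidate inference on a symptom-disease bigraph, the sketch below builds co-occurrence counts from a handful of hypothetical records and defines one simple graph-based energy (lower is better). The paper's six energy functions, including the representation-learning ones, are more elaborate than this.

```python
import numpy as np
from collections import defaultdict

# Toy EMR-style co-occurrence data between symptoms and diseases (hypothetical records)
records = [
    ({"cough", "fever"}, "pneumonia"),
    ({"cough"}, "bronchitis"),
    ({"fever", "rash"}, "measles"),
    ({"cough", "fever"}, "pneumonia"),
]

cooc = defaultdict(int)
disease_count = defaultdict(int)
for symptoms, disease in records:
    disease_count[disease] += 1
    for s in symptoms:
        cooc[(s, disease)] += 1

def energy(symptoms, disease, eps=1e-6):
    """Graph-based energy of a (symptoms, disease) configuration on the bigraph:
    lower energy = stronger co-occurrence support. One of many possible forms."""
    return -sum(np.log(cooc[(s, disease)] / disease_count[disease] + eps) for s in symptoms)

query = {"cough", "fever"}
ranked = sorted(disease_count, key=lambda d: energy(query, d))
print(ranked)  # diseases ordered by increasing energy (most plausible first)
```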

  12. Symbolic computation of equivalence transformations and parameter reduction for nonlinear physical models

    NASA Astrophysics Data System (ADS)

    Cheviakov, Alexei F.

    2017-11-01

    An efficient systematic procedure is provided for symbolic computation of Lie groups of equivalence transformations and generalized equivalence transformations of systems of differential equations that contain arbitrary elements (arbitrary functions and/or arbitrary constant parameters), using the software package GeM for Maple. Application of equivalence transformations to the reduction of the number of arbitrary elements in a given system of equations is discussed, and several examples are considered. The first computational example of generalized equivalence transformations where the transformation of the dependent variable involves an arbitrary constitutive function is presented. As a detailed physical example, a three-parameter family of nonlinear wave equations describing finite anti-plane shear displacements of an incompressible hyperelastic fiber-reinforced medium is considered. Equivalence transformations are computed and employed to radically simplify the model for an arbitrary fiber direction, invertibly reducing the model to a simple form that corresponds to a special fiber direction, and involves no arbitrary elements. The presented computation algorithm is applicable to wide classes of systems of differential equations containing arbitrary elements.

  13. Water storage in marine sediment and implications for inferences of past global ice volume

    NASA Astrophysics Data System (ADS)

    Ferrier, K.; Li, Q.; Pico, T.; Austermann, J.

    2017-12-01

    Changes in past sea level are of wide interest because they provide information on the sensitivity of ice sheets to climate change, and thus inform predictions of future sea-level change. Sea level changes are influenced by many processes, including the storage of water in sedimentary pore space. Here we use a recent extension of gravitationally self-consistent sea-level models to explore the effects of marine sedimentary water storage on the global seawater balance and inferences of past global ice volume. Our analysis suggests that sedimentary water storage can be a significant component of the global seawater budget over the 10^5-year timescales associated with glacial-interglacial cycles, and an even larger component over longer timescales. Estimates of global sediment fluxes to the oceans suggest that neglecting marine sedimentary water storage may produce meter-scale errors in estimates of peak global mean sea level equivalent (GMSL) during the Last Interglacial (LIG). These calculations show that marine sedimentary water storage can be a significant contributor to the overall effects of sediment redistribution on sea-level change, and that neglecting sedimentary water storage can lead to substantial errors in inferences of global ice volume at past interglacials. This highlights the importance of accounting for the influences of sediment fluxes and sedimentary water storage on sea-level change over glacial-interglacial timescales.

  14. Fast and Accurate Multivariate Gaussian Modeling of Protein Families: Predicting Residue Contacts and Protein-Interaction Partners

    PubMed Central

    Feinauer, Christoph; Procaccini, Andrea; Zecchina, Riccardo; Weigt, Martin; Pagnani, Andrea

    2014-01-01

    In the course of evolution, proteins show a remarkable conservation of their three-dimensional structure and their biological function, leading to strong evolutionary constraints on the sequence variability between homologous proteins. Our method aims at extracting such constraints from rapidly accumulating sequence data, and thereby at inferring protein structure and function from sequence information alone. Recently, global statistical inference methods (e.g. direct-coupling analysis, sparse inverse covariance estimation) have achieved a breakthrough towards this aim, and their predictions have been successfully implemented into tertiary and quaternary protein structure prediction methods. However, due to the discrete nature of the underlying variable (amino-acids), exact inference requires exponential time in the protein length, and efficient approximations are needed for practical applicability. Here we propose a very efficient multivariate Gaussian modeling approach as a variant of direct-coupling analysis: the discrete amino-acid variables are replaced by continuous Gaussian random variables. The resulting statistical inference problem is efficiently and exactly solvable. We show that the quality of inference is comparable or superior to the one achieved by mean-field approximations to inference with discrete variables, as done by direct-coupling analysis. This is true for (i) the prediction of residue-residue contacts in proteins, and (ii) the identification of protein-protein interaction partners in bacterial signal transduction. An implementation of our multivariate Gaussian approach is available at the website http://areeweb.polito.it/ricerca/cmp/code. PMID:24663061
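
    A compact sketch of the multivariate Gaussian variant of direct-coupling analysis: one-hot encode the alignment, regularise and invert the empirical covariance, and score residue pairs by the Frobenius norm of the corresponding coupling block with the usual average-product correction. Regularisation strength, sequence reweighting and gap handling are simplified relative to the published method.

```python
import numpy as np

AA = "ACDEFGHIKLMNPQRSTVWY-"   # 20 amino acids plus gap

def one_hot(msa):
    """Encode an aligned MSA (equal-length strings) as binary features, dropping the
    last state per column so the covariance matrix is not trivially singular."""
    n, L = len(msa), len(msa[0])
    q = len(AA) - 1
    X = np.zeros((n, L * q))
    for i, seq in enumerate(msa):
        for j, a in enumerate(seq):
            k = AA.find(a)
            if 0 <= k < q:
                X[i, j * q + k] = 1.0
    return X, L, q

def gaussian_dca_scores(msa, reg=0.1):
    """Contact scores from the inverse of a regularised covariance matrix, i.e. a
    multivariate Gaussian stand-in for direct-coupling analysis."""
    X, L, q = one_hot(msa)
    C = np.cov(X, rowvar=False) + reg * np.eye(L * q)
    J = np.linalg.inv(C)                                # Gaussian "couplings"
    S = np.zeros((L, L))
    for i in range(L):
        for j in range(i + 1, L):
            block = J[i * q:(i + 1) * q, j * q:(j + 1) * q]
            S[i, j] = S[j, i] = np.linalg.norm(block)   # Frobenius norm of the coupling block
    ap = np.outer(S.mean(axis=0), S.mean(axis=0)) / S.mean()
    return S - ap                                       # average-product corrected scores

scores = gaussian_dca_scores(["ACDA-", "ACDAF", "GCDKF", "GCEKF"])  # toy alignment
```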

  15. Simulation-based Bayesian inference for latent traits of item response models: Introduction to the ltbayes package for R.

    PubMed

    Johnson, Timothy R; Kuhn, Kristine M

    2015-12-01

    This paper introduces the ltbayes package for R. This package includes a suite of functions for investigating the posterior distribution of latent traits of item response models. These include functions for simulating realizations from the posterior distribution, profiling the posterior density or likelihood function, calculation of posterior modes or means, Fisher information functions and observed information, and profile likelihood confidence intervals. Inferences can be based on individual response patterns or sets of response patterns such as sum scores. Functions are included for several common binary and polytomous item response models, but the package can also be used with user-specified models. This paper introduces some background and motivation for the package, and includes several detailed examples of its use.

  16. Social perception in adults with Parkinson's disease.

    PubMed

    Pell, Marc D; Monetta, Laura; Rothermich, Kathrin; Kotz, Sonja A; Cheang, Henry S; McDonald, Skye

    2014-11-01

    Our study assessed how nondemented patients with Parkinson's disease (PD) interpret the affective and mental states of others from spoken language (adopt a "theory of mind") in ecologically valid social contexts. A secondary goal was to examine the relationship between emotion processing, mentalizing, and executive functions in PD during interpersonal communication. Fifteen adults with PD and 16 healthy adults completed The Awareness of Social Inference Test, a standardized tool comprised of videotaped vignettes of everyday social interactions (McDonald, Flanagan, Rollins, & Kinch, 2003). Individual subtests assessed participants' ability to recognize basic emotions and to infer speaker intentions (sincerity, lies, sarcasm) from verbal and nonverbal cues, and to judge speaker knowledge, beliefs, and feelings. A comprehensive neuropsychological evaluation was also conducted. Patients with mild-moderate PD were impaired in the ability to infer "enriched" social intentions, such as sarcasm or lies, from nonliteral remarks; in contrast, adults with and without PD showed a similar capacity to recognize emotions and social intentions meant to be literal. In the PD group, difficulties using theory of mind to draw complex social inferences were significantly correlated with limitations in working memory and executive functioning. In early PD, functional compromise of the frontal-striatal-dorsal system yields impairments in social perception and understanding nonliteral speaker intentions that draw upon cognitive theory of mind. Deficits in social perception in PD are exacerbated by a decline in executive resources, which could hamper the strategic deployment of attention to multiple information sources necessary to infer social intentions. PsycINFO Database Record (c) 2014 APA, all rights reserved.

  17. Correlation between Hox code and vertebral morphology in archosaurs.

    PubMed

    Böhmer, Christine; Rauhut, Oliver W M; Wörheide, Gert

    2015-07-07

    The relationship between developmental genes and phenotypic variation is of central interest in evolutionary biology. An excellent example is the role of Hox genes in the anteroposterior regionalization of the vertebral column in vertebrates. Archosaurs (crocodiles, dinosaurs including birds) are highly variable both in vertebral morphology and number. Nevertheless, functionally equivalent Hox genes are active in the axial skeleton during embryonic development, indicating that the morphological variation across taxa is likely owing to modifications in the pattern of Hox gene expression. By using geometric morphometrics, we demonstrate a correlation between vertebral Hox code and quantifiable vertebral morphology in modern archosaurs, in which the boundaries between morphological subgroups of vertebrae can be linked to anterior Hox gene expression boundaries. Our findings reveal homologous units of cervical vertebrae in modern archosaurs, each with their specific Hox gene pattern, enabling us to trace these homologies in the extinct sauropodomorph dinosaurs, a group with highly variable vertebral counts. Based on the quantifiable vertebral morphology, this allows us to infer the underlying genetic mechanisms in vertebral evolution in fossils, which represents not only an important case study, but will lead to a better understanding of the origin of morphological disparity in recent archosaur vertebral columns.

  18. Correlation between Hox code and vertebral morphology in archosaurs

    PubMed Central

    Böhmer, Christine; Rauhut, Oliver W. M.; Wörheide, Gert

    2015-01-01

    The relationship between developmental genes and phenotypic variation is of central interest in evolutionary biology. An excellent example is the role of Hox genes in the anteroposterior regionalization of the vertebral column in vertebrates. Archosaurs (crocodiles, dinosaurs including birds) are highly variable both in vertebral morphology and number. Nevertheless, functionally equivalent Hox genes are active in the axial skeleton during embryonic development, indicating that the morphological variation across taxa is likely owing to modifications in the pattern of Hox gene expression. By using geometric morphometrics, we demonstrate a correlation between vertebral Hox code and quantifiable vertebral morphology in modern archosaurs, in which the boundaries between morphological subgroups of vertebrae can be linked to anterior Hox gene expression boundaries. Our findings reveal homologous units of cervical vertebrae in modern archosaurs, each with their specific Hox gene pattern, enabling us to trace these homologies in the extinct sauropodomorph dinosaurs, a group with highly variable vertebral counts. Based on the quantifiable vertebral morphology, this allows us to infer the underlying genetic mechanisms in vertebral evolution in fossils, which represents not only an important case study, but will lead to a better understanding of the origin of morphological disparity in recent archosaur vertebral columns. PMID:26085583

  19. Optic Flow Dominates Visual Scene Polarity in Causing Adaptive Modification of Locomotor Trajectory

    NASA Technical Reports Server (NTRS)

    Nomura, Y.; Mulavara, A. P.; Richards, J. T.; Brady, R.; Bloomberg, Jacob J.

    2005-01-01

    Locomotion and posture are influenced and controlled by vestibular, visual and somatosensory information. Optic flow and scene polarity are two characteristics of a visual scene that have been identified as critical in how they affect perceived body orientation and self-motion. The goal of this study was to determine the role of optic flow and visual scene polarity in adaptive modification of locomotor trajectory. Two computer-generated virtual reality scenes were shown to subjects during 20 minutes of treadmill walking. One scene was highly polarized while the other was composed of objects displayed in a non-polarized fashion. Both virtual scenes depicted constant-rate self-motion equivalent to walking counterclockwise around the perimeter of a room. Subjects performed Stepping Tests blindfolded before and after scene exposure to assess adaptive changes in locomotor trajectory. Subjects showed a significant difference in heading direction between pre- and post-adaptation Stepping Tests when exposed to either scene during treadmill walking. However, there was no significant difference in the subjects' heading direction between the two visual scene polarity conditions. Therefore, it was inferred from these data that optic flow has a greater role than visual polarity in influencing adaptive locomotor function.

  20. Generalisation within specialization: inter-individual diet variation in the only specialized salamander in the world

    PubMed Central

    Costa, Andrea; Salvidio, Sebastiano; Posillico, Mario; Matteucci, Giorgio; De Cinti, Bruno; Romano, Antonio

    2015-01-01

    Specialization is typically inferred at the population and species level, but in the last decade many authors have highlighted this trait at the individual level, finding that generalist populations can be composed of both generalist and specialist individuals. Despite hundreds of reported cases of individual specialization, there is a complete lack of information on inter-individual diet variation in specialist species. We studied the diet of the Italian endemic Spectacled Salamander (Salamandrina perspicillata), in a temperate forest ecosystem, to disclose the realised trophic niche, prey selection strategy as a function of phenotypic variation, and inter-individual diet variation. Our results showed that Salamandrina is highly specialized on Collembola and that the more specialized individuals are the better performing ones. Analyses of inter-individual diet variation showed that a subset of animals exhibited a broader trophic niche, adopting different foraging strategies. Our findings reflect the optimal foraging theory both at the population and individual level, since animals in better physiological condition are able to exploit the most profitable prey, suggesting that the two coexisting strategies are not equivalent. Finally, this species, feeding on decomposers of litter detritus, could play a key role in determining litter retention rate, nutrient cycling and carbon sequestration. PMID:26292804

  1. Inferring extinction risks from sighting records.

    PubMed

    Thompson, C J; Lee, T E; Stone, L; McCarthy, M A; Burgman, M A

    2013-12-07

    Estimating the probability that a species is extinct based on historical sighting records is important when deciding how much effort and money to invest in conservation policies. The framework we offer is more general than others in the literature to date. Our formulation allows for definite and uncertain observations, and thus better accommodates the realities of sighting record quality. Typically, the probability of observing a species given it is extant/extinct is challenging to define, especially when the possibility of a false observation is included. As such, we assume that observation probabilities derive from a representative probability density function. We incorporate this randomness in two different ways ("quenched" versus "annealed") using a framework that is equivalent to a Bayes formulation. The two methods can lead to significantly different estimates for extinction. In the case of definite sightings only, we provide an explicit deterministic calculation (in which observation probabilities are point estimates). Furthermore, our formulation replicates previous work in certain limiting cases. In the case of uncertain sightings, we allow for the possibility of several independent observational types (specimen, photographs, etc.). The method is applied to the Caribbean monk seal, Monachus tropicalis (which has only definite sightings), and synthetic data, with uncertain sightings. © 2013 Elsevier Ltd. All rights reserved.
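
    A minimal Bayesian sketch for the definite-sightings case: assume a fixed per-year detection probability while the species is extant and a uniform prior over candidate extinction years, then compare the extinct and extant hypotheses. The sighting years, detection probability and prior below are illustrative; the paper's quenched/annealed treatment of random observation probabilities is richer than this.

```python
import numpy as np

def extinction_posterior(sighting_years, start, end, p_detect=0.2, prior_extant=0.5):
    """Posterior probability that the species is extinct by `end`, given definite
    sightings only, under an independent per-year detection model (illustrative)."""
    years = np.arange(start, end + 1)
    seen = np.isin(years, sighting_years)
    last = max(sighting_years)

    def likelihood(extinct_after):
        """Probability of the sighting record if the species died right after `extinct_after`."""
        lik = 1.0
        for y, s in zip(years, seen):
            if y <= extinct_after:
                lik *= p_detect if s else (1 - p_detect)
            else:
                lik *= 0.0 if s else 1.0
        return lik

    # Extinction year must be at or after the last sighting; weight candidate years equally.
    ext_years = [y for y in years if y >= last]
    lik_extinct = np.mean([likelihood(y) for y in ext_years[:-1]]) if len(ext_years) > 1 else 0.0
    lik_extant = likelihood(end)
    return ((1 - prior_extant) * lik_extinct /
            ((1 - prior_extant) * lik_extinct + prior_extant * lik_extant))

# Illustrative sighting record (years are made up except the 1952 last sighting)
print(extinction_posterior([1915, 1922, 1932, 1952], start=1915, end=2013))
```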

  2. On the inherent competition between valid and spurious inductive inferences in Boolean data

    NASA Astrophysics Data System (ADS)

    Andrecut, M.

    Inductive inference is the process of extracting general rules from specific observations. This problem also arises in the analysis of biological networks, such as genetic regulatory networks, where the interactions are complex and the observations are incomplete. A typical task in these problems is to extract general interaction rules as combinations of Boolean covariates that explain a measured response variable. The inductive inference process can be considered as an incompletely specified Boolean function synthesis problem. This incompleteness of the problem will also generate spurious inferences, which are a serious threat to valid inductive inference rules. Using random Boolean data as a null model, here we attempt to measure the competition between valid and spurious inductive inference rules from a given data set. We formulate two greedy search algorithms, which synthesize a given Boolean response variable in a sparse disjunctive normal form and a sparse generalized algebraic normal form, respectively, of the variables from the observation data, and we evaluate their performance numerically.
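
    A sketch of one of the two greedy strategies described above: cover the positive observations with short conjunctions that never fire on a negative observation, yielding a sparse disjunctive normal form. The search order and tie-breaking below are simplified choices, not necessarily those of the paper.

```python
import itertools
import numpy as np

def greedy_dnf(X, y, max_literals=3):
    """Greedy synthesis of a sparse DNF explaining a Boolean response y from Boolean
    covariates X (rows = observations): cover positive rows with short conjunctions
    while excluding all negative rows."""
    n, m = X.shape
    literals = [(j, v) for j in range(m) for v in (0, 1)]   # (variable, required value)
    uncovered = set(np.flatnonzero(y == 1))
    negatives = np.flatnonzero(y == 0)
    terms = []

    def satisfies(term, row):
        return all(X[row, j] == v for j, v in term)

    while uncovered:
        best_term, best_cover = None, set()
        for size in range(1, max_literals + 1):
            for term in itertools.combinations(literals, size):
                if any(satisfies(term, r) for r in negatives):
                    continue                       # term must not fire on negatives
                cover = {r for r in uncovered if satisfies(term, r)}
                if len(cover) > len(best_cover):
                    best_term, best_cover = term, cover
            if best_term is not None:
                break                              # prefer the shortest valid term
        if best_term is None:
            break                                  # remaining positives cannot be covered
        terms.append(best_term)
        uncovered -= best_cover
    return terms

# Toy data: response is x0 AND NOT x2
X = np.array([[1, 0, 0], [1, 1, 0], [0, 1, 0], [1, 0, 1], [0, 0, 1]])
y = np.array([1, 1, 0, 0, 0])
print(greedy_dnf(X, y))   # [((0, 1), (2, 0))], i.e. the single term x0=1 AND x2=0
```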

  3. Learning control of inverted pendulum system by neural network driven fuzzy reasoning: The learning function of NN-driven fuzzy reasoning under changes of reasoning environment

    NASA Technical Reports Server (NTRS)

    Hayashi, Isao; Nomura, Hiroyoshi; Wakami, Noboru

    1991-01-01

    Whereas conventional fuzzy reasoning is associated with tuning problems, namely the difficulty of designing membership functions and inference rules, a neural network driven fuzzy reasoning (NDF) capable of determining membership functions by neural networks is formulated. In the antecedent parts of the neural network driven fuzzy reasoning, the optimum membership function is determined by a neural network, while in the consequent parts, the amount of control for each rule is determined by additional neural networks. By introducing the algorithm of neural network driven fuzzy reasoning, inference rules for making a pendulum stand up from its lowest suspended point are determined to verify the usefulness of the algorithm.

  4. Helping Students Bridge Inferences in Science Texts Using Graphic Organizers

    ERIC Educational Resources Information Center

    Roman, Diego; Jones, Francesca; Basaraba, Deni; Hironaka, Stephanie

    2016-01-01

    The difficulties that students face when reading science texts go beyond understanding vocabulary and syntactic structures. Comprehension of science texts requires students to infer how these texts function as a unit to communicate scientific meaning. To help students in this process, science texts sometimes employ logical connectives (e.g.,…

  5. Principle-Based Inferences in Young Children's Categorization: Revisiting the Impact of Function on the Naming of Artifacts.

    ERIC Educational Resources Information Center

    Nelson, Deborah G. Kemler

    1995-01-01

    Three studies investigated the influence of principle-based inferences and unprincipled similarity relations on new category learning by three- to six-year-old children. Results indicated that categorization into newly learned categories may activate self-initiated, principle-based reasoning in young children, suggesting that spontaneous…

  6. The Argumentative Connective "Même" in French: An Experimental Study in Eight- to Ten-Year-Old Children.

    ERIC Educational Resources Information Center

    Bassano, Dominique; Champaud, Christian

    1989-01-01

    Examines how children understand the argumentative function of the French connective "même" (even). Two completion tasks, related to the argumentative properties of the morpheme, were used: 1) to infer the conclusion of an "even" sentence, and 2) to infer the argument position. (34 references) (Author/CB)

  7. Premise and Inference Memory as a Function of Age and Context.

    ERIC Educational Resources Information Center

    Wagner, Michael; Rohwer, William D., Jr.

    A sentence completion task was used to investigate age differences in children's ability or inclination to invoke premises and inferences during prose processing, tasks which (according to the constructive hypothesis) adults typically perform. A pilot study confirmed that the sentence completion task was preferable to the recognition paradigm,…

  8. Lessons for Religious Education from Cognitive Science of Religion

    ERIC Educational Resources Information Center

    Brelsford, Theodore

    2005-01-01

    Recent work in the cognitive sciences provides new neurological/biological and evolutionary bases for understanding the construction of knowledge (in the form of sets of ideas containing functionally useful inferences) and the capacity for imagination (as the ability to run inferences and generate ideas from information) in the human mind. In…

  9. Quantitative inferences on the locomotor behaviour of extinct species applied to Simocyon batalleri (Ailuridae, Late Miocene, Spain)

    NASA Astrophysics Data System (ADS)

    Fabre, Anne-Claire; Salesa, Manuel J.; Cornette, Raphael; Antón, Mauricio; Morales, Jorge; Peigné, Stéphane

    2015-06-01

    Inferences of function and ecology in extinct taxa have long been a subject of interest because they are fundamental to understanding the evolutionary history of species. In this study, we use a quantitative approach to investigate the locomotor behaviour of Simocyon batalleri, a key taxon related to the ailurid family. To do so, we use 3D surface geometric morphometric approaches on the three long bones of the forelimb of an extant reference sample. Next, we test the locomotor strategy of S. batalleri using a leave-one-out cross-validated linear discriminant analysis. Our results show that S. batalleri is included in the morphospace of the living species of musteloids. However, each bone of the forelimb appears to show a different functional signal, suggesting that inferring the lifestyle or locomotor behaviour of fossils can be difficult and dependent on the bone investigated. This highlights the importance of studying, where possible, a maximum of skeletal elements to be able to make robust inferences on the lifestyle of extinct species. Finally, our results suggest that S. batalleri may be more arboreal than previously suggested.

  10. A multi-criteria inference approach for anti-desertification management.

    PubMed

    Tervonen, Tommi; Sepehr, Adel; Kadziński, Miłosz

    2015-10-01

    We propose an approach for classifying land zones into categories indicating their resilience against desertification. Environmental management support is provided by a multi-criteria inference method that derives a set of value functions compatible with the given classification examples, and applies them to define, for the rest of the zones, their possible classes. In addition, a representative value function is inferred to explain the relative importance of the criteria to the stakeholders. We use the approach for classifying 28 administrative regions of the Khorasan Razavi province in Iran into three equilibrium classes: collapsed, transition, and sustainable zones. The model is parameterized with enhanced vegetation index measurements from 2005 to 2012, and 7 other natural and anthropogenic indicators for the status of the region in 2012. Results indicate that grazing density and land use changes are the main anthropogenic factors affecting desertification in Khorasan Razavi. The inference procedure suggests that the classification model is underdetermined in terms of attributes, but the approach itself is promising for supporting the management of anti-desertification efforts. Copyright © 2015 Elsevier Ltd. All rights reserved.

  11. A comparison of ROC inferred from FROC and conventional ROC

    NASA Astrophysics Data System (ADS)

    McEntee, Mark F.; Littlefair, Stephen; Pietrzyk, Mariusz W.

    2014-03-01

    This study aims to determine whether receiver operating characteristic (ROC) scores inferred from free-response receiver operating characteristic (FROC) data were equivalent to conventional ROC scores for the same readers and cases. Forty-five examining radiologists of the American Board of Radiology independently reviewed 47 PA chest radiographs under at least two conditions. Thirty-seven cases had abnormal findings and 10 cases had normal findings. Half the readers were asked first to locate any visualized lung nodules, mark them and assign a level of confidence [the FROC mark-rating pair], and second to give an overall rating to the entire image on the same scale [the ROC score]. The second half of readers gave the ROC rating first, followed by the FROC mark-rating pairs. A normal image was represented with the number 1 and malignant lesions with the numbers 2-5. A jackknife free-response receiver operating characteristic (JAFROC) figure of merit and an inferred ROC (infROC) AUC were calculated from the mark-rating pairs using JAFROC V4.1 software. The ROC AUC based on the overall rating of the image was calculated using DBM MRMC software, which was also used to compare the infROC and ROC AUCs, treating the methods as modalities. Pearson's correlation coefficient and linear regression were used to examine their relationship using SPSS, version 21.0 (SPSS, Chicago, IL). The results of this study showed no significant difference between the ROC and inferred ROC AUCs (p≤0.25), while Pearson's correlation coefficient was 0.7 (p≤0.01). Inter-reader correlations calculated from Obuchowski-Rockette covariances ranged from 0.43 to 0.86, while intra-reader agreement was greater than previously reported, ranging from 0.68 to 0.82.
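
    Inferred ROC scores are typically obtained from FROC data by taking the highest mark rating on each image (with a floor rating for unmarked images) and computing an ordinary AUC from those image-level scores. A minimal sketch with hypothetical ratings; the study itself used JAFROC and DBM MRMC software for the actual figures of merit.

```python
import numpy as np

def inferred_roc_auc(mark_ratings, truth, no_mark_rating=1):
    """Infer an image-level ROC AUC from FROC mark-rating pairs by scoring each image
    with its highest-rated mark; images without marks get the lowest rating."""
    scores = np.array([max(r) if r else no_mark_rating for r in mark_ratings], float)
    truth = np.asarray(truth)
    pos, neg = scores[truth == 1], scores[truth == 0]
    # AUC as the probability that a random abnormal image outscores a random normal one
    greater = (pos[:, None] > neg[None, :]).mean()
    ties = (pos[:, None] == neg[None, :]).mean()
    return greater + 0.5 * ties

# Hypothetical ratings for 6 images (lists of per-mark ratings on the 1-5 scale)
ratings = [[4, 2], [], [5], [2], [], [3]]
truth = [1, 0, 1, 0, 0, 1]
print(inferred_roc_auc(ratings, truth))
```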

  12. Inferring influenza dynamics and control in households

    PubMed Central

    Lau, Max S.Y.; Cowling, Benjamin J.; Cook, Alex R.; Riley, Steven

    2015-01-01

    Household-based interventions are the mainstay of public health policy against epidemic respiratory pathogens when vaccination is not available. Although the efficacy of these interventions has traditionally been measured by their ability to reduce the proportion of household contacts who exhibit symptoms [household secondary attack rate (hSAR)], this metric is difficult to interpret and makes only partial use of data collected by modern field studies. Here, we use Bayesian transmission model inference to analyze jointly both symptom reporting and viral shedding data from a three-armed study of influenza interventions. The reduction in hazard of infection in the increased hand hygiene intervention arm was 37.0% [8.3%, 57.8%], whereas the equivalent reduction in the other intervention arm was 27.2% [−0.46%, 52.3%] (increased hand hygiene and face masks). By imputing the presence and timing of unobserved infection, we estimated that only 61.7% [43.1%, 76.9%] of infections met the case criteria and were thus detected by the study design. An assessment of interventions using inferred infections produced more intuitively consistent attack rates when households were stratified by the speed of intervention, compared with the crude hSAR. Compared with adults, children were 2.29 [1.66, 3.23] times as infectious and 3.36 [2.31, 4.82] times as susceptible. The mean generation time was 3.39 d [3.06, 3.70]. Laboratory confirmation of infections by RT-PCR was only able to detect 79.6% [76.5%, 83.0%] of symptomatic infections, even at the peak of shedding. Our results highlight the potential use of robust inference with well-designed mechanistic transmission models to improve the design of intervention studies. PMID:26150502

  13. Divide et impera: subgoaling reduces the complexity of probabilistic inference and problem solving.

    PubMed

    Maisto, Domenico; Donnarumma, Francesco; Pezzulo, Giovanni

    2015-03-06

    It has long been recognized that humans (and possibly other animals) usually break problems down into smaller and more manageable problems using subgoals. Despite a general consensus that subgoaling helps problem solving, it is still unclear what the mechanisms guiding online subgoal selection are during the solution of novel problems for which predefined solutions are not available. Under which conditions does subgoaling lead to optimal behaviour? When is subgoaling better than solving a problem from start to finish? Which is the best number and sequence of subgoals to solve a given problem? How are these subgoals selected during online inference? Here, we present a computational account of subgoaling in problem solving. Following Occam's razor, we propose that good subgoals are those that permit planning solutions and controlling behaviour using less information resources, thus yielding parsimony in inference and control. We implement this principle using approximate probabilistic inference: subgoals are selected using a sampling method that considers the descriptive complexity of the resulting sub-problems. We validate the proposed method using a standard reinforcement learning benchmark (four-rooms scenario) and show that the proposed method requires less inferential steps and permits selecting more compact control programs compared to an equivalent procedure without subgoaling. Furthermore, we show that the proposed method offers a mechanistic explanation of the neuronal dynamics found in the prefrontal cortex of monkeys that solve planning problems. Our computational framework provides a novel integrative perspective on subgoaling and its adaptive advantages for planning, control and learning, such as for example lowering cognitive effort and working memory load. © 2015 The Author(s) Published by the Royal Society. All rights reserved.

  14. One-dimensional Euclidean matching problem: exact solutions, correlation functions, and universality.

    PubMed

    Caracciolo, Sergio; Sicuro, Gabriele

    2014-10-01

    We discuss the equivalence relation between the Euclidean bipartite matching problem on the line and on the circumference and the Brownian bridge process on the same domains. The equivalence allows us to compute the correlation function and the optimal cost of the original combinatorial problem in the thermodynamic limit; moreover, we also solve the minimax problem on the line and on the circumference. The properties of the average cost and correlation functions are discussed.
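
    One concrete consequence of working on the line is that, for convex costs such as the quadratic one, the optimal bipartite matching simply pairs the two point sets in sorted order. A small numerical check of this fact (not of the Brownian-bridge correspondence itself):

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

def sorted_matching_cost(reds, blues, p=2):
    """Cost of pairing order statistics: optimal on the line for convex costs |r-b|^p, p >= 1."""
    return float(np.sum(np.abs(np.sort(reds) - np.sort(blues)) ** p))

def brute_force_cost(reds, blues, p=2):
    """Exhaustive minimum over all permutations, for verification on small instances."""
    return min(np.sum(np.abs(reds - blues[list(perm)]) ** p)
               for perm in itertools.permutations(range(len(blues))))

reds, blues = rng.random(6), rng.random(6)
print(sorted_matching_cost(reds, blues), brute_force_cost(reds, blues))  # identical values
```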

  15. Statistical inference based on the nonparametric maximum likelihood estimator under double-truncation.

    PubMed

    Emura, Takeshi; Konno, Yoshihiko; Michimae, Hirofumi

    2015-07-01

    Doubly truncated data consist of samples whose observed values fall between the right- and left-truncation limits. With such samples, the distribution function of interest is estimated using the nonparametric maximum likelihood estimator (NPMLE) that is obtained through a self-consistency algorithm. Owing to the complicated asymptotic distribution of the NPMLE, the bootstrap method has been suggested for statistical inference. This paper proposes a closed-form estimator for the asymptotic covariance function of the NPMLE, which is a computationally attractive alternative to bootstrapping. Furthermore, we develop various statistical inference procedures, such as confidence intervals, goodness-of-fit tests, and confidence bands, to demonstrate the usefulness of the proposed covariance estimator. Simulations are performed to compare the proposed method with both the bootstrap and jackknife methods. The methods are illustrated using the childhood cancer dataset.
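
    A compact sketch of the self-consistency iteration for the NPMLE under double truncation: each observed value contributes mass inversely weighted by its probability of being observable, and those observability probabilities are recomputed from the current estimate until convergence. The toy data and convergence settings are illustrative, and the sketch omits the variance estimation that is the paper's contribution.

```python
import numpy as np

def npmle_double_truncation(t, u, v, tol=1e-8, max_iter=5000):
    """Self-consistency iteration for the NPMLE of the event-time distribution under
    double truncation (each t[i] is observed only because u[i] <= t[i] <= v[i])."""
    t, u, v = map(np.asarray, (t, u, v))
    s, n_j = np.unique(t, return_counts=True)            # support points and multiplicities
    J = ((u[:, None] <= s[None, :]) & (s[None, :] <= v[:, None])).astype(float)
    f = np.full(len(s), 1.0 / len(s))
    for _ in range(max_iter):
        F = J @ f                                         # P(observable) for each subject
        f_new = n_j / (J / F[:, None]).sum(axis=0)        # inverse-weighted mass update
        f_new /= f_new.sum()
        if np.max(np.abs(f_new - f)) < tol:
            f = f_new
            break
        f = f_new
    return s, f                                           # support points and NPMLE masses

# Small illustrative dataset: event times with their truncation windows
s, f = npmle_double_truncation(t=[2, 3, 5, 7, 8], u=[1, 1, 3, 4, 6], v=[4, 6, 8, 9, 10])
print(dict(zip(s, np.round(f, 3))))
```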

  16. Inter-Vertebral Flexibility of the Ostrich Neck: Implications for Estimating Sauropod Neck Flexibility

    PubMed Central

    Cobley, Matthew J.; Rayfield, Emily J.; Barrett, Paul M.

    2013-01-01

    The flexibility and posture of the neck in sauropod dinosaurs has long been contentious. Improved constraints on sauropod neck function will have major implications for what we know of their foraging strategies, ecology and overall biology. Several hypotheses have been proposed, based primarily on osteological data, suggesting different degrees of neck flexibility. This study attempts to assess the effects of reconstructed soft tissues on sauropod neck flexibility through systematic removal of muscle groups and measures of flexibility of the neck in a living analogue, the ostrich (Struthio camelus). The possible effect of cartilage on flexibility is also examined, as this was previously overlooked in osteological estimates of sauropod neck function. These comparisons show that soft tissues are likely to have limited the flexibility of the neck beyond the limits suggested by osteology alone. In addition, the inferred presence of cartilage, and varying the inter-vertebral spacing within the synovial capsule, also affect neck flexibility. One hypothesis proposed that flexibility is constrained by requiring a minimum overlap between successive zygapophyses equivalent to 50% of zygapophyseal articular surface length (ONP50). This assumption is tested by comparing the maximum flexibility of the articulated cervical column in ONP50 and the flexibility of the complete neck with all tissues intact. It is found that this model does not adequately convey the pattern of flexibility in the ostrich neck, suggesting that the ONP50 model may not be useful in determining neck function if considered in isolation from myological and other soft tissue data. PMID:23967284

  17. On the Mathematical Modeling of Single and Multiple Scattering of Ultrasonic Guided Waves by Small Scatterers: A Structural Health Monitoring Measurement Model

    NASA Astrophysics Data System (ADS)

    Strom, Brandon William

    In an effort to assist in the paradigm shift from schedule-based maintenance to condition-based maintenance, we derive measurement models to be used within structural health monitoring algorithms. Our models are physics-based, and use scattered Lamb waves to detect and quantify pitting corrosion. After covering the basics of Lamb waves and the reciprocity theorem, we develop a technique for the scattered wave solution. The first application is two-dimensional, and is employed in two different ways. The first approach integrates a traction distribution and replaces it by an equivalent force. The second approach is higher order and uses the actual traction distribution. We find that the equivalent force version of the solution technique holds well for small pits at low frequencies. The second application is three-dimensional. The equivalent force caused by the scattered wave of an arbitrary equivalent force is calculated. We obtain functions for the scattered wave displacements as a function of equivalent forces, equivalent forces as a function of incident wave, and scattered wave amplitudes as a function of incident amplitude. The third application uses self-consistency to derive governing equations for the scattered waves due to multiple corrosion pits. We decouple the implicit set of equations and solve explicitly by using a recursive series solution. Alternatively, we solve via an undetermined coefficient method which results in an interaction operator and solution via matrix inversion. The general solution is given for N pits including mode conversion. We show that the two approaches are equivalent, and give a solution for three pits. Various approximations are advanced to simplify the problem while retaining the leading order physics. As a final application, we use the multiple scattering model to investigate resonance of Lamb waves. We begin with a one-dimensional problem and progress to a three-dimensional problem. A directed graph enables interpretation of the interaction operator, and we show that a series solution converges due to loss of energy in the system. We see that there are four causes of resonance and plot the modulation depth as a function of spacing between the pits.

  18. Gene-network inference by message passing

    NASA Astrophysics Data System (ADS)

    Braunstein, A.; Pagnani, A.; Weigt, M.; Zecchina, R.

    2008-01-01

    The inference of gene-regulatory processes from gene-expression data belongs to the major challenges of computational systems biology. Here we address the problem from a statistical-physics perspective and develop a message-passing algorithm which is able to infer sparse, directed and combinatorial regulatory mechanisms. Using the replica technique, the algorithmic performance can be characterized analytically for artificially generated data. The algorithm is applied to genome-wide expression data of baker's yeast under various environmental conditions. We find clear cases of combinatorial control, and enrichment in common functional annotations of regulated genes and their regulators.

  19. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses

    PubMed Central

    Stephen, Emily P.; Lepage, Kyle Q.; Eden, Uri T.; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S.; Guenther, Frank H.; Kramer, Mark A.

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty—both in the functional network edges and the corresponding aggregate measures of network topology—are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here—appropriate for static and dynamic network inference and different statistical measures of coupling—permits the evaluation of confidence in network measures in a variety of settings common to neuroscience. PMID:24678295
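
    A minimal sketch of the trial-resampling idea: recompute a correlation-based functional network on bootstrap resamples of trials, then report per-edge stability and a confidence interval for an aggregate measure such as network density. The thresholded correlation network and the density measure are simplified stand-ins for the coupling measures and topology statistics discussed in the paper.

```python
import numpy as np

rng = np.random.default_rng(3)

def functional_network(trials, threshold=0.2):
    """Binary functional network from trial-averaged pairwise correlations.
    `trials` has shape (n_trials, n_channels, n_samples)."""
    corrs = np.mean([np.corrcoef(tr) for tr in trials], axis=0)
    np.fill_diagonal(corrs, 0.0)
    return np.abs(corrs) > threshold, corrs

def bootstrap_edges(trials, n_boot=200):
    """Resample trials with replacement to estimate per-edge stability and the
    uncertainty of an aggregate measure (here, network density)."""
    n_trials = trials.shape[0]
    edge_counts, densities = 0, []
    for _ in range(n_boot):
        idx = rng.integers(0, n_trials, n_trials)
        net, _ = functional_network(trials[idx])
        edge_counts = edge_counts + net.astype(int)
        densities.append(net.sum() / (net.size - net.shape[0]))
    edge_prob = edge_counts / n_boot                   # fraction of resamples containing each edge
    return edge_prob, np.percentile(densities, [2.5, 97.5])

# Synthetic example: 40 trials, 5 channels, 200 samples; channels 0 and 1 are correlated
trials = rng.standard_normal((40, 5, 200))
trials[:, 1] = 0.7 * trials[:, 0] + 0.5 * rng.standard_normal((40, 200))
edge_prob, density_ci = bootstrap_edges(trials)
print(edge_prob[0, 1], density_ci)
```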

  20. Predicted Arabidopsis Interactome Resource and Gene Set Linkage Analysis: A Transcriptomic Analysis Resource.

    PubMed

    Yao, Heng; Wang, Xiaoxuan; Chen, Pengcheng; Hai, Ling; Jin, Kang; Yao, Lixia; Mao, Chuanzao; Chen, Xin

    2018-05-01

    An advanced functional understanding of omics data is important for elucidating the design logic of physiological processes in plants and effectively controlling desired traits in plants. We present the latest versions of the Predicted Arabidopsis Interactome Resource (PAIR) and of the gene set linkage analysis (GSLA) tool, which enable the interpretation of an observed transcriptomic change (differentially expressed genes [DEGs]) in Arabidopsis ( Arabidopsis thaliana ) with respect to its functional impact for biological processes. PAIR version 5.0 integrates functional association data between genes in multiple forms and infers 335,301 putative functional interactions. GSLA relies on this high-confidence inferred functional association network to expand our perception of the functional impacts of an observed transcriptomic change. GSLA then interprets the biological significance of the observed DEGs using established biological concepts (annotation terms), describing not only the DEGs themselves but also their potential functional impacts. This unique analytical capability can help researchers gain deeper insights into their experimental results and highlight prospective directions for further investigation. We demonstrate the utility of GSLA with two case studies in which GSLA uncovered how molecular events may have caused physiological changes through their collective functional influence on biological processes. Furthermore, we showed that typical annotation-enrichment tools were unable to produce similar insights to PAIR/GSLA. The PAIR version 5.0-inferred interactome and GSLA Web tool both can be accessed at http://public.synergylab.cn/pair/. © 2018 American Society of Plant Biologists. All Rights Reserved.

  1. Assessing dynamics, spatial scale, and uncertainty in task-related brain network analyses.

    PubMed

    Stephen, Emily P; Lepage, Kyle Q; Eden, Uri T; Brunner, Peter; Schalk, Gerwin; Brumberg, Jonathan S; Guenther, Frank H; Kramer, Mark A

    2014-01-01

    The brain is a complex network of interconnected elements, whose interactions evolve dynamically in time to cooperatively perform specific functions. A common technique to probe these interactions involves multi-sensor recordings of brain activity during a repeated task. Many techniques exist to characterize the resulting task-related activity, including establishing functional networks, which represent the statistical associations between brain areas. Although functional network inference is commonly employed to analyze neural time series data, techniques to assess the uncertainty-both in the functional network edges and the corresponding aggregate measures of network topology-are lacking. To address this, we describe a statistically principled approach for computing uncertainty in functional networks and aggregate network measures in task-related data. The approach is based on a resampling procedure that utilizes the trial structure common in experimental recordings. We show in simulations that this approach successfully identifies functional networks and associated measures of confidence emergent during a task in a variety of scenarios, including dynamically evolving networks. In addition, we describe a principled technique for establishing functional networks based on predetermined regions of interest using canonical correlation. Doing so provides additional robustness to the functional network inference. Finally, we illustrate the use of these methods on example invasive brain voltage recordings collected during an overt speech task. The general strategy described here-appropriate for static and dynamic network inference and different statistical measures of coupling-permits the evaluation of confidence in network measures in a variety of settings common to neuroscience.

  2. Using the Rasch Model to Determine Equivalence of Forms In the Trilingual Lollipop Readiness Test

    ERIC Educational Resources Information Center

    Lang, W. Steve; Chew, Alex L.; Crownover, Carol; Wilkerson, Judy R.

    2007-01-01

    Determining the cross-cultural equivalence of multilingual tests is a challenge that is more complex than simple horizontal equating of test forms. This study examines the functioning of a trilingual test of preschool readiness to determine the equivalence. Different forms of the test have previously been examined using classical statistical…

  3. Probability of Equivalence Formation: Familiar Stimuli and Training Sequence

    ERIC Educational Resources Information Center

    Arntzen, Erik

    2004-01-01

    The present study was conducted to show how responding in accord with equivalence relations changes as a function of position of familiar stimuli, pictures, and with the use of nonsense syllables in an MTO-training structure. Fifty college students were tested for responding in accord with equivalence in an AB, CB, DB, and EB training structure.…

  4. Functional Equivalence of Spatial Images from Touch and Vision: Evidence from Spatial Updating in Blind and Sighted Individuals

    ERIC Educational Resources Information Center

    Giudice, Nicholas A.; Betty, Maryann R.; Loomis, Jack M.

    2011-01-01

    This research examined whether visual and haptic map learning yield functionally equivalent spatial images in working memory, as evidenced by similar encoding bias and updating performance. In 3 experiments, participants learned 4-point routes either by seeing or feeling the maps. At test, blindfolded participants made spatial judgments about the…

  5. Using Stimulus Equivalence-Based Instruction to Teach Graduate Students in Applied Behavior Analysis to Interpret Operant Functions of Behavior

    ERIC Educational Resources Information Center

    Albright, Leif; Schnell, Lauren; Reeve, Kenneth F.; Sidener, Tina M.

    2016-01-01

    Stimulus equivalence-based instruction (EBI) was used to teach four, 4-member classes representing functions of behavior to ten graduate students. The classes represented behavior maintained by attention (Class 1), escape (Class 2), access to tangibles (Class 3), and automatic reinforcement (Class 4). Stimuli within each class consisted of a…

  6. [The equivalence and interchangeability of medical articles].

    PubMed

    Antonov, V S

    2013-11-01

    Information on the interchangeability of medical articles is highly valuable because it makes it possible to match medical articles precisely to medical technologies and standards of medical care, and to optimize budget costs in public procurement. The proposed procedure for determining interchangeability is based on criteria of equivalence of prescriptions, of functional, technical, and technological characteristics, and of the operational effectiveness of medical articles.

  7. Sample size determination for equivalence assessment with multiple endpoints.

    PubMed

    Sun, Anna; Dong, Xiaoyu; Tsong, Yi

    2014-01-01

    Equivalence assessment between a reference and test treatment is often conducted by two one-sided tests (TOST). The corresponding power function and sample size determination can be derived from a joint distribution of the sample mean and sample variance. When an equivalence trial is designed with multiple endpoints, it often involves several sets of two one-sided tests. A naive approach for sample size determination in this case would select the largest sample size required for each endpoint. However, such a method ignores the correlation among endpoints. When the objective is to reject all endpoints and the endpoints are uncorrelated, the power function is the product of the power functions for the individual endpoints. With correlated endpoints, the sample size and power should be adjusted for such a correlation. In this article, we propose the exact power function for the equivalence test with multiple endpoints adjusted for correlation under both crossover and parallel designs. We further discuss the differences in sample size between the naive method and the correlation-adjusted method and illustrate with an in vivo bioequivalence crossover study with area under the curve (AUC) and maximum concentration (Cmax) as the two endpoints.
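
    To make the role of correlation concrete, here is a minimal numerical sketch, not the authors' exact power function: it assumes a parallel design with known variance, a bivariate normal model for the two endpoint estimates, and hypothetical margins and effect sizes, and contrasts the naive product-of-marginal-powers calculation with a Monte Carlo estimate of the joint power under correlation.

```python
import numpy as np
from scipy import stats

# Hypothetical settings (not from the paper): two endpoints (e.g., AUC, Cmax),
# equivalence margin delta, true differences, per-group SD, and n subjects per arm.
rng = np.random.default_rng(1)
alpha, delta = 0.05, 0.2
true_diff = np.array([0.05, 0.05])
sd = np.array([0.4, 0.5])
rho, n = 0.6, 60

se = sd * np.sqrt(2.0 / n)                    # SE of the difference in means (parallel design)
z = stats.norm.ppf(1 - alpha)
lo, hi = -delta + z * se, delta - z * se      # TOST rejection region for each endpoint estimate

# Marginal power per endpoint and the naive (independence) product.
p_marginal = stats.norm.cdf((hi - true_diff) / se) - stats.norm.cdf((lo - true_diff) / se)
print("marginal powers:", np.round(p_marginal, 3), "product:", round(p_marginal.prod(), 3))

# Joint power by Monte Carlo with correlated endpoint estimates.
cov = np.array([[se[0]**2, rho*se[0]*se[1]], [rho*se[0]*se[1], se[1]**2]])
draws = rng.multivariate_normal(true_diff, cov, size=200_000)
joint = np.mean(np.all((draws > lo) & (draws < hi), axis=1))
print("joint power with rho=%.1f: %.3f" % (rho, joint))
```

    With positively correlated endpoints the joint power typically exceeds the naive product, which is what a correlation-adjusted sample size calculation exploits.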

  8. Ostracod-inferred conductivity transfer function and its utility in palaeo-conductivity reconstruction in Tibetan Lakes

    NASA Astrophysics Data System (ADS)

    Peng, P.; Zhu, L.; Guo, Y.; Wang, J.; Fürstenberg, S.; Ju, J.; Wang, Y.; Frenzel, P.

    2016-12-01

    Ostracods are sensitive monitors of palaeo-environmental change, and ostracod transfer functions are being developed as quantitative indicators in palaeolimnological research. The many lakes scattered across the Tibetan Plateau supply sediments suitable for analysing environmental indices in past climate change research. In this study, sub-fossil ostracods and their habitat conditions, including water samples and water parameters, were sampled to build a database for a forward transfer function based on gradient analyses. This transfer function was then used for environmental reconstruction of Tibetan lakes to reveal past climate changes. Twelve species belonging to ten genera were documented from 114 samples in 34 lakes. Along an increasing specific conductivity gradient, the assemblages L. sinensis-L. dorsotuberosa-C. xizangensis, L. dorsotuberosa-L. inopinata, and L. inopinata indicate fresh to slightly brackish, brackish, and brine water conditions, respectively. Gradient analysis revealed that specific conductivity was the most important variable driving the distribution of sub-fossil ostracods. A specific conductivity transfer function using a weighted averaging partial least squares (WA-PLS) model was set up to reconstruct palaeo-specific conductivity. The model showed a good correlation between measured and estimated specific conductivity (R2 = 0.67) and a relatively low root mean squared error of prediction (RMSEP = 0.47). Multiple proxies, including ostracod assemblages, ostracod-inferred lake level and specific conductivity, mean grain size, and total organic and inorganic carbon of sediment cores from Tibetan lakes, were used to infer the palaeoclimatic history of the study area. The environmental change since the mid-Holocene was probably a response to the weakening of the Indian monsoon, as inferred from comparable climate records from the Tibetan Plateau and adjacent monsoonal areas.
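
    For illustration, the sketch below implements simple weighted-averaging (WA) calibration rather than the WA-PLS model used in the study; the taxon counts and conductivity values are invented, and a real reconstruction would also apply deshrinking and cross-validation (e.g., to obtain RMSEP).

```python
import numpy as np

# Toy calibration set: counts of three taxa in four surface samples and the measured
# specific conductivity of each lake (all values are invented for illustration).
counts = np.array([[10, 2, 0],      # rows: samples, cols: taxa
                   [ 6, 6, 1],
                   [ 1, 8, 4],
                   [ 0, 3, 9]], float)
conductivity = np.array([0.3, 0.8, 2.0, 6.0])   # mS/cm, hypothetical

rel = counts / counts.sum(axis=1, keepdims=True)           # relative abundances

# Step 1: taxon optima = abundance-weighted average of conductivity across samples.
optima = (rel * conductivity[:, None]).sum(axis=0) / rel.sum(axis=0)

# Step 2: reconstruct a fossil sample's conductivity as the abundance-weighted
# average of the taxon optima (classical deshrinking is omitted here).
fossil = np.array([2.0, 7.0, 1.0])
fossil_rel = fossil / fossil.sum()
estimate = (fossil_rel * optima).sum()
print("taxon optima:", optima.round(2), "reconstructed conductivity:", round(estimate, 2))
```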

  9. Penalized spline estimation for functional coefficient regression models.

    PubMed

    Cao, Yanrong; Lin, Haiqun; Wu, Tracy Z; Yu, Yan

    2010-04-01

    The functional coefficient regression models assume that the regression coefficients vary with some "threshold" variable, providing appreciable flexibility in capturing the underlying dynamics in data and avoiding the so-called "curse of dimensionality" in multivariate nonparametric estimation. We first investigate the estimation, inference, and forecasting for the functional coefficient regression models with dependent observations via penalized splines. The P-spline approach, as a direct ridge regression shrinkage type global smoothing method, is computationally efficient and stable. With established fixed-knot asymptotics, inference is readily available. Exact inference can be obtained for fixed smoothing parameter λ, which is most appealing for finite samples. Our penalized spline approach gives an explicit model expression, which also enables multi-step-ahead forecasting via simulations. Furthermore, we examine different methods of choosing the important smoothing parameter λ: modified multi-fold cross-validation (MCV), generalized cross-validation (GCV), and an extension of empirical bias bandwidth selection (EBBS) to P-splines. In addition, we implement smoothing parameter selection using mixed model framework through restricted maximum likelihood (REML) for P-spline functional coefficient regression models with independent observations. The P-spline approach also easily allows different smoothness for different functional coefficients, which is enabled by assigning different penalty λ accordingly. We demonstrate the proposed approach by both simulation examples and a real data application.
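
    A minimal sketch of the P-spline idea for a single functional coefficient, assuming a fixed smoothing parameter λ and simulated data; the knot placement, basis dimension, and second-order difference penalty below are illustrative choices, not the authors' exact specification.

```python
import numpy as np
from scipy.interpolate import BSpline

# Simulated functional-coefficient data: y = beta(u) * x + noise,
# where beta(u) varies smoothly with a "threshold" variable u.
rng = np.random.default_rng(0)
n = 400
u = np.sort(rng.uniform(0, 1, n))
x = rng.normal(size=n)
y = np.sin(2 * np.pi * u) * x + 0.3 * rng.normal(size=n)

# Cubic B-spline basis on equally spaced knots (boundary knots repeated).
k, n_knots = 3, 20
knots = np.concatenate(([0.0] * k, np.linspace(0, 1, n_knots), [1.0] * k))
n_basis = len(knots) - k - 1
B = np.column_stack([BSpline(knots, np.eye(n_basis)[i], k)(u) for i in range(n_basis)])

# Design matrix for the varying coefficient: each basis column scaled by the regressor x.
Z = B * x[:, None]

# Ridge-type shrinkage via a second-order difference penalty with fixed lambda.
D = np.diff(np.eye(n_basis), n=2, axis=0)
lam = 1.0
coef = np.linalg.solve(Z.T @ Z + lam * D.T @ D, Z.T @ y)

beta_hat = B @ coef                              # estimated beta(u) at the observed u values
print("max abs error of beta(u):", np.abs(beta_hat - np.sin(2 * np.pi * u)).max().round(2))
```

    In practice λ would be chosen by MCV, GCV, EBBS, or REML as described in the abstract, and separate penalties could be assigned to different functional coefficients.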

  10. Mobile sensing of point-source fugitive methane emissions using Bayesian inference: the determination of the likelihood function

    NASA Astrophysics Data System (ADS)

    Zhou, X.; Albertson, J. D.

    2016-12-01

    Natural gas is considered a bridge fuel towards clean energy because of its potentially lower greenhouse gas emissions compared with other fossil fuels. Despite numerous efforts, an efficient and cost-effective approach to monitor fugitive methane emissions along the natural gas production-supply chain has not yet been developed. Recently, mobile methane measurement has been introduced, which applies a Bayesian approach to probabilistically infer methane emission rates and update the estimates recursively as new measurements become available. However, the likelihood function, especially the error term that determines the shape of the estimate uncertainty, has not been rigorously defined and evaluated with field data. To address this issue, we performed a series of near-source (< 30 m) controlled methane release experiments using a specialized vehicle mounted with fast-response methane analyzers and a GPS unit. Methane concentrations were measured at two different heights along mobile traversals downwind of the sources, and concurrent wind and temperature data were recorded by nearby 3-D sonic anemometers. With known methane release rates, the measurements were used to determine the functional form and the parameterization of the likelihood function in the Bayesian inference scheme under different meteorological conditions.
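
    The recursive Bayesian update can be sketched as follows, under strong simplifying assumptions: a single point source, a flat prior on the release rate, a known dispersion transfer coefficient per traversal, and a Gaussian error term whose standard deviation is exactly the quantity such field experiments aim to calibrate. All numbers are hypothetical.

```python
import numpy as np

# Hypothetical setup: the concentration enhancement c_i measured on traversal i is
# modeled as c_i = a_i * q + e_i, where q is the source rate (g/s), a_i is a dispersion
# transfer coefficient supplied by a plume model, and e_i ~ N(0, sigma^2) is the error
# term whose spread shapes the posterior uncertainty.
rng = np.random.default_rng(3)
q_true, sigma = 0.5, 0.15
a = rng.uniform(0.5, 2.0, size=8)                  # transfer coefficients for 8 traversals
c = a * q_true + rng.normal(0, sigma, size=8)      # synthetic mobile measurements

q_grid = np.linspace(0.0, 2.0, 1001)               # support of a flat prior on the rate
dq = q_grid[1] - q_grid[0]
posterior = np.ones_like(q_grid)                   # flat prior

for ai, ci in zip(a, c):                           # recursive update as traversals arrive
    log_like = -0.5 * ((ci - ai * q_grid) / sigma) ** 2
    posterior *= np.exp(log_like - log_like.max())
    posterior /= posterior.sum() * dq              # renormalize after each update

post_mean = (q_grid * posterior).sum() * dq
post_sd = np.sqrt(((q_grid - post_mean) ** 2 * posterior).sum() * dq)
print("posterior rate: %.3f +/- %.3f g/s (true %.2f)" % (post_mean, post_sd, q_true))
```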

  11. Efficient Exact Inference With Loss Augmented Objective in Structured Learning.

    PubMed

    Bauer, Alexander; Nakajima, Shinichi; Muller, Klaus-Robert

    2016-08-19

    The structural support vector machine (SVM) is an elegant approach for building complex and accurate models with structured outputs. However, its applicability relies on the availability of efficient inference algorithms: state-of-the-art training algorithms repeatedly perform inference to compute a subgradient or to find the most violating configuration. In this paper, we propose an exact inference algorithm for maximizing nondecomposable objectives arising from a special type of high-order potential with a decomposable internal structure. As an important application, our method covers loss-augmented inference, which enables the slack and margin scaling formulations of the structural SVM with a variety of dissimilarity measures, e.g., Hamming loss, precision and recall, Fβ-loss, intersection over union, and many other functions that can be efficiently computed from the contingency table. We demonstrate the advantages of our approach in natural language parsing and sequence segmentation applications.
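
    As a concrete, simple instance of loss-augmented inference (here with the decomposable Hamming loss rather than the nondecomposable objectives the paper targets), the sketch below finds the most violating labeling of a linear-chain model by adding the per-position loss to the unary scores inside a standard Viterbi pass; all scores are random toy values.

```python
import numpy as np

def loss_augmented_viterbi(unary, trans, y_true):
    """argmax_y  sum_t unary[t, y_t] + sum_t trans[y_{t-1}, y_t] + Hamming(y, y_true)."""
    T, L = unary.shape
    aug = unary + (np.arange(L)[None, :] != y_true[:, None])   # add 1 for every wrong label
    score = aug[0].copy()
    back = np.zeros((T, L), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + trans + aug[t][None, :]        # L x L candidate scores
        back[t] = cand.argmax(axis=0)
        score = cand.max(axis=0)
    y = [int(score.argmax())]
    for t in range(T - 1, 0, -1):                              # backtrace
        y.append(int(back[t, y[-1]]))
    return np.array(y[::-1]), float(score.max())

rng = np.random.default_rng(0)
T, L = 6, 3
unary, trans = rng.normal(size=(T, L)), rng.normal(size=(L, L))
y_true = rng.integers(0, L, size=T)
y_hat, val = loss_augmented_viterbi(unary, trans, y_true)
print("most violating labeling:", y_hat, "augmented score: %.2f" % val)
```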

  12. The Impact of Experiential Avoidance on the Inference of Characters' Emotions: Evidence for an Emotional Processing Bias.

    PubMed

    Pickett, Scott M; Kurby, Christopher A

    2010-12-01

    Experiential avoidance is a functional class of maladaptive strategies that contribute to the development and maintenance of psychopathology. Although previous research has demonstrated group differences in the interpretation of aversive stimuli, there is limited work on the influence of experiential avoidance during the online processing of emotion. An experimental design investigated the influence of self-reported experiential avoidance during emotion processing by assessing emotion inferences during the comprehension of narratives that imply different emotions. Results suggest that experiential avoidance is partially characterized by an emotional information processing bias. Specifically, individuals reporting higher experiential avoidance scores exhibited a bias towards activating negative emotion inferences, whereas individuals reporting lower experiential avoidance scores exhibited a bias towards activating positive emotion inferences. Minimal emotional inference was observed for the non-bias affective valence. Findings are discussed in terms of the implications of experiential avoidance as a cognitive vulnerability for psychopathology.

  13. Laser-based irradiation apparatus and method to measure the functional dose-rate response of semiconductor devices

    DOEpatents

    Horn, Kevin M [Albuquerque, NM

    2008-05-20

    A broad-beam laser irradiation apparatus can measure the parametric or functional response of a semiconductor device to exposure to dose-rate equivalent infrared laser light. Comparisons of dose-rate response from before, during, and after accelerated aging of a device, or from periodic sampling of devices from fielded operational systems can determine if aging has affected the device's overall functionality. The dependence of these changes on equivalent dose-rate pulse intensity and/or duration can be measured with the apparatus. The synchronized introduction of external electrical transients into the device under test can be used to simulate the electrical effects of the surrounding circuitry's response to a radiation exposure while exposing the device to dose-rate equivalent infrared laser light.

  14. A caveat regarding diatom-inferred nitrogen concentrations in oligotrophic lakes

    USGS Publications Warehouse

    Arnett, Heather A.; Saros, Jasmine E.; Mast, M. Alisa

    2012-01-01

    Atmospheric deposition of reactive nitrogen (Nr) has enriched oligotrophic lakes with nitrogen (N) in many regions of the world and elicited dramatic changes in diatom community structure. The lakewater concentrations of nitrate that cause these community changes remain unclear, raising interest in the development of diatom-based transfer functions to infer nitrate. We developed a diatom calibration set using surface sediment samples from 46 high-elevation lakes across the Rocky Mountains of the western US, a region spanning an N deposition gradient from very low to moderate levels (<1 to 3.2 kg Nr ha−1 year−1 in wet deposition). Out of the fourteen measured environmental variables for these 46 lakes, ordination analysis identified that nitrate, specific conductance, total phosphorus, and hypolimnetic water temperature were related to diatom distributions. A transfer function was developed for nitrate and applied to a sedimentary diatom profile from Heart Lake in the central Rockies. The model coefficient of determination (bootstrapping validation) of 0.61 suggested potential for diatom-inferred reconstructions of lakewater nitrate concentrations over time, but a comparison of observed versus diatom-inferred nitrate values revealed the poor performance of this model at low nitrate concentrations. Resource physiology experiments revealed that nitrogen requirements of two key taxa were opposite to nitrate optima defined in the transfer function. Our data set reveals two underlying ecological constraints that impede the development of nitrate transfer functions in oligotrophic lakes: (1) even in lakes with nitrate concentrations below quantification (<1 μg L−1), diatom assemblages were already dominated by species indicative of moderate N enrichment; (2) N-limited oligotrophic lakes switch to P limitation after receiving only modest inputs of reactive N, shifting the controls on diatom species changes along the length of the nitrate gradient. These constraints suggest that quantitative inferences of nitrate from diatom assemblages will likely require experimental approaches.

  15. Emulation of reionization simulations for Bayesian inference of astrophysics parameters using neural networks

    NASA Astrophysics Data System (ADS)

    Schmit, C. J.; Pritchard, J. R.

    2018-03-01

    Next generation radio experiments such as LOFAR, HERA, and SKA are expected to probe the Epoch of Reionization (EoR) and claim a first direct detection of the cosmic 21cm signal within the next decade. Data volumes will be enormous and can thus potentially revolutionize our understanding of the early Universe and galaxy formation. However, numerical modelling of the EoR can be prohibitively expensive for Bayesian parameter inference and how to optimally extract information from incoming data is currently unclear. Emulation techniques for fast model evaluations have recently been proposed as a way to bypass costly simulations. We consider the use of artificial neural networks as a blind emulation technique. We study the impact of training duration and training set size on the quality of the network prediction and the resulting best-fitting values of a parameter search. A direct comparison is drawn between our emulation technique and an equivalent analysis using 21CMMC. We find good predictive capabilities of our network using training sets of as low as 100 model evaluations, which is within the capabilities of fully numerical radiative transfer codes.
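
    A toy version of the emulation idea, with an inexpensive stand-in simulator instead of a reionization code and scikit-learn's MLPRegressor as the network; the parameter ranges, summary dimension, and architecture are all illustrative assumptions rather than the paper's setup.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

# Stand-in "simulator": an expensive forward model mapping 2 astrophysical
# parameters to a 20-bin summary (a cheap analytic toy, for illustration only).
def simulator(theta):
    k = np.linspace(0.1, 1.0, 20)
    return theta[0] * np.exp(-k / theta[1]) + 0.01 * np.sin(10 * k * theta[0])

rng = np.random.default_rng(42)
theta_train = rng.uniform([0.5, 0.2], [2.0, 1.0], size=(100, 2))   # ~100 model evaluations
y_train = np.array([simulator(t) for t in theta_train])

# Train the emulator: a small fully connected network.
emu = MLPRegressor(hidden_layer_sizes=(64, 64), max_iter=5000, random_state=0)
emu.fit(theta_train, y_train)

# The trained network would replace the simulator inside a Bayesian sampler;
# here we simply check its accuracy on unseen parameters.
theta_test = rng.uniform([0.5, 0.2], [2.0, 1.0], size=(200, 2))
y_test = np.array([simulator(t) for t in theta_test])
rel_err = np.abs(emu.predict(theta_test) - y_test) / np.abs(y_test).max()
print("median relative emulation error: %.3f" % np.median(rel_err))
```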

  16. Inference of median difference based on the Box-Cox model in randomized clinical trials.

    PubMed

    Maruo, K; Isogawa, N; Gosho, M

    2015-05-10

    In randomized clinical trials, many medical and biological measurements are not normally distributed and are often skewed. The Box-Cox transformation is a powerful procedure for comparing two treatment groups for skewed continuous variables in terms of a statistical test. However, it is difficult to directly estimate and interpret the location difference between the two groups on the original scale of the measurement. We propose a helpful method that infers the difference of the treatment effect on the original scale in a more easily interpretable form. We also provide statistical analysis packages that consistently include an estimate of the treatment effect, covariance adjustments, standard errors, and statistical hypothesis tests. The simulation study that focuses on randomized parallel group clinical trials with two treatment groups indicates that the performance of the proposed method is equivalent to or better than that of the existing non-parametric approaches in terms of the type-I error rate and power. We illustrate our method with cluster of differentiation 4 data in an acquired immune deficiency syndrome clinical trial. Copyright © 2015 John Wiley & Sons, Ltd.
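
    A rough sketch of the back-transformation idea (not the authors' full procedure, which also provides covariance adjustment, standard errors, and hypothesis tests): estimate a common Box-Cox λ, compare group means on the transformed scale, and back-transform to obtain a median difference on the original scale, using the fact that under normality after transformation the transformed mean maps back to the original-scale median. The simulated data are hypothetical.

```python
import numpy as np
from scipy import stats

# Simulated skewed outcomes for two arms (lognormal-ish), purely illustrative numbers.
rng = np.random.default_rng(7)
control = rng.lognormal(mean=5.0, sigma=0.8, size=120)
treated = rng.lognormal(mean=5.3, sigma=0.8, size=120)

# Fit a common Box-Cox transformation on the pooled data.
pooled = np.concatenate([control, treated])
_, lam = stats.boxcox(pooled)

def bc(x, lam):
    return np.log(x) if abs(lam) < 1e-8 else (x ** lam - 1.0) / lam

def bc_inv(z, lam):
    return np.exp(z) if abs(lam) < 1e-8 else (lam * z + 1.0) ** (1.0 / lam)

# Back-transform the group means on the transformed scale to original-scale medians.
m_treated = bc_inv(bc(treated, lam).mean(), lam)
m_control = bc_inv(bc(control, lam).mean(), lam)
print("lambda = %.2f, estimated median difference = %.1f" % (lam, m_treated - m_control))
print("sample median difference for comparison: %.1f" % (np.median(treated) - np.median(control)))
```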

  17. What a speaker's choice of frame reveals: reference points, frame selection, and framing effects.

    PubMed

    McKenzie, Craig R M; Nelson, Jonathan D

    2003-09-01

    Framing effects are well established: Listeners' preferences depend on how outcomes are described to them, or framed. Less well understood is what determines how speakers choose frames. Two experiments revealed that reference points systematically influenced speakers' choices between logically equivalent frames. For example, speakers tended to describe a 4-ounce cup filled to the 2-ounce line as half full if it was previously empty but described it as half empty if it was previously full. Similar results were found when speakers could describe the outcome of a medical treatment in terms of either mortality or survival (e.g., 25% die vs. 75% survive). Two additional experiments showed that listeners made accurate inferences about speakers' reference points on the basis of the selected frame (e.g., if a speaker described a cup as half empty, listeners inferred that the cup used to be full). Taken together, the data suggest that frames reliably convey implicit information in addition to their explicit content, which helps explain why framing effects are so robust.

  18. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359

  19. Gene function prediction with gene interaction networks: a context graph kernel approach.

    PubMed

    Li, Xin; Chen, Hsinchun; Li, Jiexun; Zhang, Zhu

    2010-01-01

    Predicting gene functions is a challenge for biologists in the postgenomic era. Interactions among genes and their products compose networks that can be used to infer gene functions. Most previous studies adopt a linkage assumption, i.e., they assume that gene interactions indicate functional similarities between connected genes. In this study, we propose to use a gene's context graph, i.e., the gene interaction network associated with the focal gene, to infer its functions. In a kernel-based machine-learning framework, we design a context graph kernel to capture the information in context graphs. Our experimental study on a testbed of p53-related genes demonstrates the advantage of using indirect gene interactions and shows the empirical superiority of the proposed approach over linkage-assumption-based methods, such as the algorithm to minimize inconsistent connected genes and diffusion kernels.
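
    The proposed context graph kernel is more involved, but the diffusion-kernel baseline mentioned above can be written down directly; the adjacency matrix below is a hypothetical five-gene network, not data from the study.

```python
import numpy as np
from scipy.linalg import expm

# Toy gene-interaction network as an adjacency matrix (5 genes, invented edges).
A = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], float)

# Diffusion kernel: K = exp(-beta * L), with L the graph Laplacian and beta a decay rate.
L = np.diag(A.sum(axis=1)) - A
beta = 0.5
K = expm(-beta * L)

# K[i, j] measures similarity between genes i and j through all paths in the network;
# it can be plugged into a kernel machine (e.g., an SVM) for gene function prediction.
print(np.round(K, 3))
```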

  20. How Far Is "Near"? Inferring Distance from Spatial Descriptions

    ERIC Educational Resources Information Center

    Carlson, Laura A.; Covey, Eric S.

    2005-01-01

    A word may mean different things in different contexts. The current study explored the changing denotations of spatial terms, focusing on how the distance inferred from a spatial description varied as a function of the size of the objects being spatially related. We examined both terms that explicitly convey distance (i.e., topological terms such…

  1. Detective Questions: A Strategy for Improving Inference-Making in Children With Mild Disabilities

    ERIC Educational Resources Information Center

    Jiménez-Fernández, Gracia

    2015-01-01

    One of the most frequent problems in reading comprehension is the difficulty in making inferences from the text, especially for students with mild disabilities (i.e., children with learning disabilities or with high-functioning autism). It is essential, therefore, that educators include the teaching of reading strategies to improve their students'…

  2. cosmoabc: Likelihood-free inference for cosmology

    NASA Astrophysics Data System (ADS)

    Ishida, Emille E. O.; Vitenti, Sandro D. P.; Penna-Lima, Mariana; Trindade, Arlindo M.; Cisewski, Jessi; de Souza, Rafael; Cameron, Ewan; Busti, Vinicius C.

    2015-05-01

    Approximate Bayesian Computation (ABC) enables parameter inference for complex physical systems in cases where the true likelihood function is unknown, unavailable, or computationally too expensive. It relies on the forward simulation of mock data and comparison between observed and synthetic catalogs. cosmoabc is a Python Approximate Bayesian Computation (ABC) sampler featuring a Population Monte Carlo variation of the original ABC algorithm, which uses an adaptive importance sampling scheme. The code can be coupled to an external simulator to allow incorporation of arbitrary distance and prior functions. When coupled with the numcosmo library, it has been used to estimate posterior probability distributions over cosmological parameters based on measurements of galaxy clusters number counts without computing the likelihood function.
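
    The core ABC loop can be illustrated with a plain rejection sampler on a toy Gaussian-mean problem; cosmoabc itself uses a Population Monte Carlo variant with adaptive importance sampling and user-supplied simulator, distance, and prior functions, so the snippet below is only a conceptual sketch.

```python
import numpy as np

# Toy problem: infer the mean of a Gaussian from data using only forward simulation.
rng = np.random.default_rng(5)
observed = rng.normal(1.0, 1.0, size=50)

def simulate(theta):                      # forward simulator (stands in for a mock catalog)
    return rng.normal(theta, 1.0, size=observed.size)

def distance(sim, obs):                   # user-supplied distance between catalogs
    return abs(sim.mean() - obs.mean())

# Plain rejection ABC: keep prior draws whose simulations land close to the data.
n_draws, eps = 100_000, 0.05
theta_prior = rng.uniform(-5, 5, size=n_draws)          # flat prior
accepted = np.array([t for t in theta_prior if distance(simulate(t), observed) < eps])

print("accepted %d draws; posterior mean %.2f, sd %.2f"
      % (accepted.size, accepted.mean(), accepted.std()))
```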

  3. The Effect of Star Formation History on the Inferred Stellar Initial Mass Function

    NASA Astrophysics Data System (ADS)

    Elmegreen, Bruce G.; Scalo, John

    2006-01-01

    Peaks and lulls in the star formation rate (SFR) over the history of the Galaxy produce plateaus and declines in the present-day mass function (PDMF) where the main-sequence lifetime overlaps the age and duration of the SFR variation. These PDMF features can be misinterpreted as the form of the intrinsic stellar initial mass function (IMF) if the star formation rate is assumed to be constant or slowly varying with time. This effect applies to all regions that have formed stars for longer than the age of the most massive stars, including OB associations, star complexes, and especially galactic field stars. Related problems may apply to embedded clusters. Evidence is summarized for temporal SFR variations from parsec scales to entire galaxies, all of which should contribute to inferred IMF distortions. We give examples of various star formation histories to demonstrate the types of false IMF structures that might be seen. These include short-duration bursts, stochastic histories with lognormal amplitude distributions, and oscillating histories with various periods and phases. The inferred IMF should appear steeper than the intrinsic IMF over mass ranges where the stellar lifetimes correspond to times of decreasing SFRs; shallow portions of the inferred IMF correspond to times of increasing SFRs. If field regions are populated by dispersed clusters and defined by their low current SFRs, then they should have steeper inferred IMFs than the clusters. The SFRs required to give the steep field IMFs in the LMC and SMC are determined. Structure observed in several determinations of the Milky Way field star IMF can be accounted for by a stochastic and bursty star formation history.

  4. The anatomy of choice: dopamine and decision-making

    PubMed Central

    Friston, Karl; Schwartenbeck, Philipp; FitzGerald, Thomas; Moutoussis, Michael; Behrens, Timothy; Dolan, Raymond J.

    2014-01-01

    This paper considers goal-directed decision-making in terms of embodied or active inference. We associate bounded rationality with approximate Bayesian inference that optimizes a free energy bound on model evidence. Several constructs such as expected utility, exploration or novelty bonuses, softmax choice rules and optimism bias emerge as natural consequences of free energy minimization. Previous accounts of active inference have focused on predictive coding. In this paper, we consider variational Bayes as a scheme that the brain might use for approximate Bayesian inference. This scheme provides formal constraints on the computational anatomy of inference and action, which appear to be remarkably consistent with neuroanatomy. Active inference contextualizes optimal decision theory within embodied inference, where goals become prior beliefs. For example, expected utility theory emerges as a special case of free energy minimization, where the sensitivity or inverse temperature (associated with softmax functions and quantal response equilibria) has a unique and Bayes-optimal solution. Crucially, this sensitivity corresponds to the precision of beliefs about behaviour. The changes in precision during variational updates are remarkably reminiscent of empirical dopaminergic responses—and they may provide a new perspective on the role of dopamine in assimilating reward prediction errors to optimize decision-making. PMID:25267823

  5. The anatomy of choice: dopamine and decision-making.

    PubMed

    Friston, Karl; Schwartenbeck, Philipp; FitzGerald, Thomas; Moutoussis, Michael; Behrens, Timothy; Dolan, Raymond J

    2014-11-05

    This paper considers goal-directed decision-making in terms of embodied or active inference. We associate bounded rationality with approximate Bayesian inference that optimizes a free energy bound on model evidence. Several constructs such as expected utility, exploration or novelty bonuses, softmax choice rules and optimism bias emerge as natural consequences of free energy minimization. Previous accounts of active inference have focused on predictive coding. In this paper, we consider variational Bayes as a scheme that the brain might use for approximate Bayesian inference. This scheme provides formal constraints on the computational anatomy of inference and action, which appear to be remarkably consistent with neuroanatomy. Active inference contextualizes optimal decision theory within embodied inference, where goals become prior beliefs. For example, expected utility theory emerges as a special case of free energy minimization, where the sensitivity or inverse temperature (associated with softmax functions and quantal response equilibria) has a unique and Bayes-optimal solution. Crucially, this sensitivity corresponds to the precision of beliefs about behaviour. The changes in precision during variational updates are remarkably reminiscent of empirical dopaminergic responses-and they may provide a new perspective on the role of dopamine in assimilating reward prediction errors to optimize decision-making.
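
    The softmax (quantal response) choice rule with an inverse-temperature, or precision, parameter that the abstract refers to can be illustrated in a few lines; this is only the standard construct, not the full active-inference scheme, and the utilities are toy values.

```python
import numpy as np

def softmax_choice(utilities, precision):
    """Softmax (quantal response) choice probabilities with inverse temperature = precision."""
    z = precision * np.asarray(utilities, float)
    p = np.exp(z - z.max())          # subtract the max for numerical stability
    return p / p.sum()

utilities = [1.0, 0.5, 0.0]              # expected utilities of three options
for precision in (0.5, 2.0, 8.0):        # higher precision -> more deterministic choices
    print(precision, np.round(softmax_choice(utilities, precision), 3))
```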

  6. The inference from a single case: moral versus scientific inferences in implementing new biotechnologies.

    PubMed

    Hofmann, B

    2008-06-01

    Are there similarities between scientific and moral inference? This is the key question in this article. It takes as its point of departure an instance of one person's story in the media changing both Norwegian public opinion and a brand-new Norwegian law prohibiting the use of saviour siblings. The case appears to falsify existing norms and to establish new ones. The analysis of this case reveals similarities in the modes of inference in science and morals, inasmuch as (a) a single case functions as a counter-example to an existing rule; (b) there is a common presupposition of stability, similarity and order, which makes it possible to reason from a few cases to a general rule; and (c) this makes it possible to hold things together and retain order. In science, these modes of inference are referred to as falsification, induction and consistency. In morals, they have a variety of other names. Hence, even without abandoning the fact-value divide, there appear to be similarities between inference in science and inference in morals, which may encourage communication across the boundaries between "the two cultures" and which are relevant to medical humanities.

  7. Delayed Matching to Sample: Probability of Responding in Accord with Equivalence as a Function of Different Delays

    ERIC Educational Resources Information Center

    Arntzen, Erik

    2006-01-01

    The present series of 4 experiments investigated the probability of responding in accord with equivalence in adult human participants as a function of increasing or decreasing delays in a many-to-one (MTO) or comparison-as-node and one-to-many (OTM) or sample-as-node conditional discrimination procedure. In Experiment 1, 12 participants started…

  8. The Bacillus subtilis ywjI (glpX) gene encodes a class II fructose-1,6-bisphosphatase, functionally equivalent to the class III Fbp enzyme.

    PubMed

    Jules, Matthieu; Le Chat, Ludovic; Aymerich, Stéphane; Le Coq, Dominique

    2009-05-01

    We present here experimental evidence that the Bacillus subtilis ywjI gene encodes a class II fructose-1,6-bisphosphatase, functionally equivalent to the fbp-encoded class III enzyme, and constitutes with the upstream gene, murAB, an operon transcribed at the same level under glycolytic or gluconeogenic conditions.

  9. The Bacillus subtilis ywjI (glpX) Gene Encodes a Class II Fructose-1,6-Bisphosphatase, Functionally Equivalent to the Class III Fbp Enzyme▿

    PubMed Central

    Jules, Matthieu; Le Chat, Ludovic; Aymerich, Stéphane; Le Coq, Dominique

    2009-01-01

    We present here experimental evidence that the Bacillus subtilis ywjI gene encodes a class II fructose-1,6-bisphosphatase, functionally equivalent to the fbp-encoded class III enzyme, and constitutes with the upstream gene, murAB, an operon transcribed at the same level under glycolytic or gluconeogenic conditions. PMID:19270101

  10. Using CTX Image Features to Predict HiRISE-Equivalent Rock Density

    NASA Technical Reports Server (NTRS)

    Serrano, Navid; Huertas, Andres; McGuire, Patrick; Mayer, David; Ardvidson, Raymond

    2010-01-01

    Methods have been developed to quantitatively assess rock hazards at candidate landing sites with the aid of images from the HiRISE camera onboard NASA's Mars Reconnaissance Orbiter. HiRISE is able to resolve rocks as small as 1-m in diameter. Some sites of interest do not have adequate coverage with the highest resolution sensors and there is a need to infer relevant information (like site safety or underlying geomorphology). The proposed approach would make it possible to obtain rock density estimates at a level close to or equal to those obtained from high-resolution sensors where individual rocks are discernable.

  11. Super-resolution with a positive epsilon multi-quantum-well super-lens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bak, A. O.; Giannini, V.; Maier, S. A.

    2013-12-23

    We design an anisotropic and dichroic quantum metamaterial that is able to achieve super-resolution without the need for a negative permittivity. When exploring the parameters of the structure, we take into account the limits of semiconductor fabrication technology based on quantum well stacks. By heavily doping the structure with free electrons, we infer an anisotropic effective medium with a prolate ellipsoid dispersion curve which allows for near-diffractionless propagation of light (similar to an epsilon-near-zero hyperbolic lens). This, coupled with low absorption, allows us to resolve images at the sub-wavelength scale at distances 6 times greater than equivalent natural materials.

  12. Stochastic inference with spiking neurons in the high-conductance state

    NASA Astrophysics Data System (ADS)

    Petrovici, Mihai A.; Bill, Johannes; Bytschok, Ilja; Schemmel, Johannes; Meier, Karlheinz

    2016-10-01

    The highly variable dynamics of neocortical circuits observed in vivo have been hypothesized to represent a signature of ongoing stochastic inference but stand in apparent contrast to the deterministic response of neurons measured in vitro. Based on a propagation of the membrane autocorrelation across spike bursts, we provide an analytical derivation of the neural activation function that holds for a large parameter space, including the high-conductance state. On this basis, we show how an ensemble of leaky integrate-and-fire neurons with conductance-based synapses embedded in a spiking environment can attain the correct firing statistics for sampling from a well-defined target distribution. For recurrent networks, we examine convergence toward stationarity in computer simulations and demonstrate sample-based Bayesian inference in a mixed graphical model. This points to a new computational role of high-conductance states and establishes a rigorous link between deterministic neuron models and functional stochastic dynamics on the network level.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brewer, Brendon J.; Foreman-Mackey, Daniel; Hogg, David W., E-mail: bj.brewer@auckland.ac.nz

    We present and implement a probabilistic (Bayesian) method for producing catalogs from images of stellar fields. The method is capable of inferring the number of sources N in the image and can also handle the challenges introduced by noise, overlapping sources, and an unknown point-spread function. The luminosity function of the stars can also be inferred, even when the precise luminosity of each star is uncertain, via the use of a hierarchical Bayesian model. The computational feasibility of the method is demonstrated on two simulated images with different numbers of stars. We find that our method successfully recovers the input parameter values along with principled uncertainties even when the field is crowded. We also compare our results with those obtained from the SExtractor software. While the two approaches largely agree about the fluxes of the bright stars, the Bayesian approach provides more accurate inferences about the faint stars and the number of stars, particularly in the crowded case.

  14. Topics in inference and decision-making with partial knowledge

    NASA Technical Reports Server (NTRS)

    Safavian, S. Rasoul; Landgrebe, David

    1990-01-01

    Two essential elements needed in the process of inference and decision-making are prior probabilities and likelihood functions. When both of these components are known accurately and precisely, the Bayesian approach provides a consistent and coherent solution to the problems of inference and decision-making. In many situations, however, either one or both of the above components may not be known, or at least may not be known precisely. This problem of partial knowledge about prior probabilities and likelihood functions is addressed. There are at least two ways to cope with this lack of precise knowledge: robust methods, and interval-valued methods. First, ways of modeling imprecision and indeterminacies in prior probabilities and likelihood functions are examined; then how imprecision in the above components carries over to the posterior probabilities is examined. Finally, the problem of decision making with imprecise posterior probabilities and the consequences of such actions are addressed. Application areas where the above problems may occur are in statistical pattern recognition problems, for example, the problem of classification of high-dimensional multispectral remote sensing image data.

  15. Testing two mechanisms by which rational and irrational beliefs may affect the functionality of inferences.

    PubMed

    Bond, F W; Dryden, W; Briscoe, R

    1999-12-01

    This article describes a role playing experiment that examined the sufficiency hypothesis of Rational Emotive Behaviour Therapy (REBT). This proposition states that it is sufficient for rational and irrational beliefs to refer to preferences and musts, respectively, if those beliefs are to affect the functionality of inferences (FI). Consistent with the REBT literature (e.g. Dryden, 1994; Dryden & Ellis, 1988; Palmer, Dryden, Ellis & Yapp, 1995) results from this experiment showed that rational and irrational beliefs, as defined by REBT, do affect FI. Specifically, results showed that people who hold a rational belief form inferences that are significantly more functional than those that are formed by people who hold an irrational belief. Contrary to REBT theory, the sufficiency hypothesis was not supported. Thus, results indicated that it is not sufficient for rational and irrational beliefs to refer to preferences and musts, respectively, if those beliefs are to affect the FI. It appears, then, that preferences and musts are not sufficient mechanisms by which rational and irrational beliefs, respectively, affect the FI. Psychotherapeutic implications of these findings are considered.

  16. Experimental demonstration of a multi-wavelength distributed feedback semiconductor laser array with an equivalent chirped grating profile based on the equivalent chirp technology.

    PubMed

    Li, Wangzhe; Zhang, Xia; Yao, Jianping

    2013-08-26

    We report, to the best of our knowledge, the first realization of a multi-wavelength distributed feedback (DFB) semiconductor laser array with an equivalent chirped grating profile based on equivalent chirp technology. All the lasers in the laser array have an identical grating period with an equivalent chirped grating structure, which are realized by nonuniform sampling of the gratings. Different wavelengths are achieved by changing the sampling functions. A multi-wavelength DFB semiconductor laser array is fabricated and the lasing performance is evaluated. The results show that the equivalent chirp technology is an effective solution for monolithic integration of a multi-wavelength laser array with potential for large volume fabrication.

  17. I know why you voted for Trump: (Over)inferring motives based on choice.

    PubMed

    Barasz, Kate; Kim, Tami; Evangelidis, Ioannis

    2018-05-10

    People often speculate about why others make the choices they do. This paper investigates how such inferences are formed as a function of what is chosen. Specifically, when observers encounter someone else's choice (e.g., of political candidate), they use the chosen option's attribute values (e.g., a candidate's specific stance on a policy issue) to infer the importance of that attribute (e.g., the policy issue) to the decision-maker. Consequently, when a chosen option has an attribute whose value is extreme (e.g., an extreme policy stance), observers infer-sometimes incorrectly-that this attribute disproportionately motivated the decision-maker's choice. Seven studies demonstrate how observers use an attribute's value to infer its weight-the value-weight heuristic-and identify the role of perceived diagnosticity: more extreme attribute values give observers the subjective sense that they know more about a decision-maker's preferences, and in turn, increase the attribute's perceived importance. The paper explores how this heuristic can produce erroneous inferences and influence broader beliefs about decision-makers. Copyright © 2018 Elsevier B.V. All rights reserved.

  18. Inference Engine in an Intelligent Ship Course-Keeping System

    PubMed Central

    2017-01-01

    The article presents an original design of an expert system, whose function is to automatically stabilize ship's course. The focus is put on the inference engine, a mechanism that consists of two functional components. One is responsible for the construction of state space regions, implemented on the basis of properly processed signals recorded by sensors from the input and output of an object. The other component is responsible for generating a control decision based on the knowledge obtained in the first module. The computing experiments described herein prove the effective and correct operation of the proposed system. PMID:29317859

  19. Functional characterization of somatic mutations in cancer using network-based inference of protein activity | Office of Cancer Genomics

    Cancer.gov

    Identifying the multiple dysregulated oncoproteins that contribute to tumorigenesis in a given patient is crucial for developing personalized treatment plans. However, accurate inference of aberrant protein activity in biological samples is still challenging as genetic alterations are only partially predictive and direct measurements of protein activity are generally not feasible.

  20. Attachment comes of age: adolescents' narrative coherence and reflective functioning predict well-being in emerging adulthood.

    PubMed

    Borelli, Jessica L; Brugnera, Agostino; Zarbo, Cristina; Rabboni, Massimo; Bondi, Emi; Tasca, Giorgio A; Compare, Angelo

    2018-06-04

    This study investigated the effects of adolescents' attachment security and reflective functioning (RF) (assessed by the adult attachment interview [AAI]) in the prediction of well-being in adulthood. Adolescents (N = 79; M = 14.6 years old; SD = 3.5 years) completed the AAI at Time 1 (T1), which was subsequently coded for inferred attachment experiences, narrative coherence, and RF by three nonoverlapping teams of raters. Participants completed the Psychological General Well-being Index at T1 and 8 years later (Time 2, T2). Analyses showed that (a) both adolescent narrative coherence and RF were significant predictors of almost all indices of well-being at T2 in adulthood; (b) both narrative coherence and RF indirectly linked inferred loving parental care and T2 well-being; (c) when included in the same model, RF was a significant indirect effect linking inferred loving parental care and T2 well-being. These findings contribute to theory in suggesting that both RF and narrative coherence are predictive of subsequent psychological well-being and operate as links between inferred parental care and subsequent adjustment. Possible mechanisms underlying these findings are discussed.

  1. The Effects of Different Training Structures in the Establishment of Conditional Discriminations and Subsequent Performance on Tests for Stimulus Equivalence

    ERIC Educational Resources Information Center

    Arntzen, Erik; Grondahl, Terje; Eilifsen, Christoffer

    2010-01-01

    Previous studies comparing groups of subjects have indicated differential probabilities of stimulus equivalence outcome as a function of training structures. One-to-Many (OTM) and Many-to-One (MTO) training structures seem to produce positive outcomes on tests for stimulus equivalence more often than a Linear Series (LS) training structure does.…

  2. Development of an Algorithm for Automatic Analysis of the Impedance Spectrum Based on a Measurement Model

    NASA Astrophysics Data System (ADS)

    Kobayashi, Kiyoshi; Suzuki, Tohru S.

    2018-03-01

    A new algorithm for the automatic estimation of an equivalent circuit and the subsequent parameter optimization is developed by combining the data-mining concept and complex least-squares method. In this algorithm, the program generates an initial equivalent-circuit model based on the sampling data and then attempts to optimize the parameters. The basic hypothesis is that the measured impedance spectrum can be reproduced by the sum of the partial-impedance spectra presented by the resistor, inductor, resistor connected in parallel to a capacitor, and resistor connected in parallel to an inductor. The adequacy of the model is determined by using a simple artificial-intelligence function, which is applied to the output function of the Levenberg-Marquardt module. From the iteration of model modifications, the program finds an adequate equivalent-circuit model without any user input to the equivalent-circuit model.
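
    The building blocks of such an algorithm can be sketched with a fixed, hand-chosen equivalent circuit: a series resistor plus a resistor-capacitor parallel element, fitted to a synthetic spectrum by complex least squares. The automatic model construction and the adequacy check applied to the optimizer output are not reproduced here, and all component values are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

# Impedance of a series resistor R0 plus a resistor R1 in parallel with a capacitor C1.
def model(params, omega):
    r0, r1, c1 = params
    return r0 + r1 / (1.0 + 1j * omega * r1 * c1)

# Synthetic "measured" spectrum with a little multiplicative noise (values are invented).
omega = np.logspace(0, 6, 60)                       # angular frequencies, rad/s
true = (10.0, 100.0, 1e-6)
rng = np.random.default_rng(2)
z_meas = model(true, omega) * (1.0 + 0.01 * rng.normal(size=omega.size))

# Complex nonlinear least squares: stack real and imaginary residuals for the optimizer.
def residuals(params):
    diff = model(params, omega) - z_meas
    return np.concatenate([diff.real, diff.imag])

fit = least_squares(residuals, x0=[1.0, 10.0, 1e-7], x_scale=[10.0, 100.0, 1e-6])
print("fitted R0, R1, C1:", fit.x)
```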

  3. Ionospheric and Birkeland current distributions inferred from the MAGSAT magnetometer data

    NASA Technical Reports Server (NTRS)

    Zanetti, L. J.; Potemra, T. A.; Baumjohann, W.

    1983-01-01

    Ionospheric and field-aligned sheet current density distributions are inferred from MAGSAT vector magnetometer data together with an accurate magnetic field model. By comparing Hall current densities inferred from the MAGSAT data with those inferred from simultaneously recorded ground-based data acquired by the Scandinavian magnetometer array, it is determined that the former have previously been underestimated because magnetic variations with high spatial wave numbers are strongly damped between the ionosphere and the MAGSAT orbit. An important result of this study is that the Birkeland and electrojet current systems are colocated. The analyses show a tendency for triangular rather than constant electrojet current distributions as a function of latitude, consistent with the statistical, uniform region 1 and region 2 Birkeland current patterns.

  4. Inferring Markov chains: Bayesian estimation, model comparison, entropy rate, and out-of-class modeling.

    PubMed

    Strelioff, Christopher C; Crutchfield, James P; Hübler, Alfred W

    2007-07-01

    Markov chains are a natural and well understood tool for describing one-dimensional patterns in time or space. We show how to infer kth order Markov chains, for arbitrary k , from finite data by applying Bayesian methods to both parameter estimation and model-order selection. Extending existing results for multinomial models of discrete data, we connect inference to statistical mechanics through information-theoretic (type theory) techniques. We establish a direct relationship between Bayesian evidence and the partition function which allows for straightforward calculation of the expectation and variance of the conditional relative entropy and the source entropy rate. Finally, we introduce a method that uses finite data-size scaling with model-order comparison to infer the structure of out-of-class processes.
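
    For the simplest case (k = 1, two symbols, a uniform Dirichlet prior on each transition row), the Bayesian parameter estimate and the evidence used for model-order comparison reduce to closed forms, sketched below on an invented sequence; the connections to entropy rates and out-of-class modeling in the paper are not reproduced.

```python
import numpy as np
from scipy.special import gammaln

# Infer a first-order Markov chain over symbols {0, 1} from a short data sequence,
# with a Dirichlet(1, ..., 1) prior on each row of the transition matrix.
seq = np.array([0, 0, 1, 0, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1, 1, 0])
S = 2
counts = np.zeros((S, S))
for a, b in zip(seq[:-1], seq[1:]):
    counts[a, b] += 1

alpha = np.ones((S, S))                              # uniform Dirichlet prior per row
post = alpha + counts
T_mean = post / post.sum(axis=1, keepdims=True)      # posterior mean transition matrix

# Log marginal likelihood (Bayesian evidence) of the order-1 structure:
# product over rows of Dirichlet-multinomial normalizers.
def log_evidence(counts, alpha):
    a0, n = alpha.sum(axis=1), counts.sum(axis=1)
    return np.sum(gammaln(a0) - gammaln(a0 + n)
                  + np.sum(gammaln(alpha + counts) - gammaln(alpha), axis=1))

print("posterior mean transitions:\n", np.round(T_mean, 2))
print("log evidence (order-1 model): %.2f" % log_evidence(counts, alpha))
```

    Comparing such log evidences across candidate orders k implements the model-order selection described in the abstract.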

  5. With or without you: predictive coding and Bayesian inference in the brain

    PubMed Central

    Aitchison, Laurence; Lengyel, Máté

    2018-01-01

    Two theoretical ideas have emerged recently with the ambition to provide a unifying functional explanation of neural population coding and dynamics: predictive coding and Bayesian inference. Here, we describe the two theories and their combination into a single framework: Bayesian predictive coding. We clarify how the two theories can be distinguished, despite sharing core computational concepts and addressing an overlapping set of empirical phenomena. We argue that predictive coding is an algorithmic / representational motif that can serve several different computational goals of which Bayesian inference is but one. Conversely, while Bayesian inference can utilize predictive coding, it can also be realized by a variety of other representations. We critically evaluate the experimental evidence supporting Bayesian predictive coding and discuss how to test it more directly. PMID:28942084

  6. Obfuscation Framework Based on Functionally Equivalent Combinatorial Logic Families

    DTIC Science & Technology

    2008-03-01


  7. Massive optimal data compression and density estimation for scalable, likelihood-free inference in cosmology

    NASA Astrophysics Data System (ADS)

    Alsing, Justin; Wandelt, Benjamin; Feeney, Stephen

    2018-07-01

    Many statistical models in cosmology can be simulated forwards but have intractable likelihood functions. Likelihood-free inference methods allow us to perform Bayesian inference from these models using only forward simulations, free from any likelihood assumptions or approximations. Likelihood-free inference generically involves simulating mock data and comparing to the observed data; this comparison in data space suffers from the curse of dimensionality and requires compression of the data to a small number of summary statistics to be tractable. In this paper, we use massive asymptotically optimal data compression to reduce the dimensionality of the data space to just one number per parameter, providing a natural and optimal framework for summary statistic choice for likelihood-free inference. Secondly, we present the first cosmological application of Density Estimation Likelihood-Free Inference (DELFI), which learns a parametrized model for joint distribution of data and parameters, yielding both the parameter posterior and the model evidence. This approach is conceptually simple, requires less tuning than traditional Approximate Bayesian Computation approaches to likelihood-free inference and can give high-fidelity posteriors from orders of magnitude fewer forward simulations. As an additional bonus, it enables parameter inference and Bayesian model comparison simultaneously. We demonstrate DELFI with massive data compression on an analysis of the joint light-curve analysis supernova data, as a simple validation case study. We show that high-fidelity posterior inference is possible for full-scale cosmological data analyses with as few as ~10^4 simulations, with substantial scope for further improvement, demonstrating the scalability of likelihood-free inference to large and complex cosmological data sets.
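
    The "one number per parameter" compression can be illustrated with a score/MOPED-style linear compression on a toy linear-Gaussian model; this assumes a known, parameter-independent data covariance and sketches only the compression step, not the density-estimation stage of DELFI. All sizes and values are hypothetical.

```python
import numpy as np

# Score-style compression: t = (dmu/dtheta)^T C^{-1} (d - mu_fid), giving one
# compressed statistic per parameter. Toy linear model d = A theta + noise.
rng = np.random.default_rng(0)
n_data, n_par = 500, 2
A = rng.normal(size=(n_data, n_par))
C = 0.5 * np.eye(n_data)                          # data covariance (assumed known)
theta_true = np.array([1.0, -0.5])
d = A @ theta_true + rng.multivariate_normal(np.zeros(n_data), C)

theta_fid = np.zeros(n_par)                       # fiducial parameters for the expansion
mu_fid = A @ theta_fid
dmu = A                                           # derivatives of the mean w.r.t. parameters
Cinv = np.linalg.inv(C)

t = dmu.T @ Cinv @ (d - mu_fid)                   # 2 numbers summarize 500 data points
F = dmu.T @ Cinv @ dmu                            # Fisher matrix
theta_mle = theta_fid + np.linalg.solve(F, t)     # exact for this linear-Gaussian toy model
print("compressed summaries:", np.round(t, 2), "recovered parameters:", np.round(theta_mle, 3))
```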

  8. BOREAS HYD-2 Estimated Snow Water Equivalent (SWE) from Microwave Measurements

    NASA Technical Reports Server (NTRS)

    Powell, Hugh; Chang, Alfred T. C.; Hall, Forrest G. (Editor); Knapp, David E. (Editor); Smith, David E. (Technical Monitor)

    2000-01-01

    The surface meteorological data collected at the Boreal Ecosystem-Atmosphere Study (BOREAS) tower and ancillary sites are being used as inputs to an energy balance model to monitor the amount of snow storage in the boreal forest region. The BOREAS Hydrology (HYD)-2 team used Snow Water Equivalent (SWE) derived from an energy balance model and in situ observed SWE to compare the SWE inferred from airborne and spaceborne microwave data, and to assess the accuracy of microwave retrieval algorithms. The major external measurements that are needed are snowpack temperature profiles, in situ snow areal extent, and SWE data. The data in this data set were collected during February 1994 and cover portions of the Southern Study Area (SSA), Northern Study Area (NSA), and the transect areas. The data are available from BORIS as comma-delimited tabular ASCII files. The SWE data are available from the Earth Observing System Data and Information System (EOSDIS) Oak Ridge National Laboratory (ORNL) Distributed Active Archive Center (DAAC). The data files are available on a CD-ROM (see document number 20010000884).

  9. The space of ultrametric phylogenetic trees.

    PubMed

    Gavryushkin, Alex; Drummond, Alexei J

    2016-08-21

    The reliability of a phylogenetic inference method from genomic sequence data is ensured by its statistical consistency. Bayesian inference methods produce a sample of phylogenetic trees from the posterior distribution given sequence data. Hence the question of statistical consistency of such methods is equivalent to the consistency of the summary of the sample. More generally, statistical consistency is ensured by the tree space used to analyse the sample. In this paper, we consider two standard parameterisations of phylogenetic time-trees used in evolutionary models: inter-coalescent interval lengths and absolute times of divergence events. For each of these parameterisations we introduce a natural metric space on ultrametric phylogenetic trees. We compare the introduced spaces with existing models of tree space and formulate several formal requirements that a metric space on phylogenetic trees must possess in order to be a satisfactory space for statistical analysis, and justify them. We show that only a few known constructions of the space of phylogenetic trees satisfy these requirements. However, our results suggest that these basic requirements are not enough to distinguish between the two metric spaces we introduce and that the choice between metric spaces requires additional properties to be considered. Particularly, that the summary tree minimising the square distance to the trees from the sample might be different for different parameterisations. This suggests that further fundamental insight is needed into the problem of statistical consistency of phylogenetic inference methods. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  10. Flexible Retrieval: When True Inferences Produce False Memories

    PubMed Central

    Carpenter, Alexis C.; Schacter, Daniel L.

    2016-01-01

    Episodic memory involves flexible retrieval processes that allow us to link together distinct episodes, make novel inferences across overlapping events, and recombine elements of past experiences when imagining future events. However, the same flexible retrieval and recombination processes that underpin these adaptive functions may also leave memory prone to error or distortion, such as source misattributions in which details of one event are mistakenly attributed to another related event. To determine whether the same recombination-related retrieval mechanism supports both successful inference and source memory errors, we developed a modified version of an associative inference paradigm in which participants encoded everyday scenes comprised of people, objects, and other contextual details. These scenes contained overlapping elements (AB, BC) that could later be linked to support novel inferential retrieval regarding elements that had not appeared together previously (AC). Our critical experimental manipulation concerned whether contextual details were probed before or after the associative inference test, thereby allowing us to assess whether a) false memories increased for successful versus unsuccessful inferences, and b) any such effects were specific to after as compared to before participants received the inference test. In each of four experiments that used variants of this paradigm, participants were more susceptible to false memories for contextual details after successful than unsuccessful inferential retrieval, but only when contextual details were probed after the associative inference test. These results suggest that the retrieval-mediated recombination mechanism that underlies associative inference also contributes to source misattributions that result from combining elements of distinct episodes. PMID:27918169

  11. Inverse Function: Pre-Service Teachers' Techniques and Meanings

    ERIC Educational Resources Information Center

    Paoletti, Teo; Stevens, Irma E.; Hobson, Natalie L. F.; Moore, Kevin C.; LaForest, Kevin R.

    2018-01-01

    Researchers have argued teachers and students are not developing connected meanings for function inverse, thus calling for a closer examination of teachers' and students' inverse function meanings. Responding to this call, we characterize 25 pre-service teachers' inverse function meanings as inferred from our analysis of clinical interviews. After…

  12. Executive Functions in Adolescence: Inferences from Brain and Behavior

    ERIC Educational Resources Information Center

    Crone, Eveline A.

    2009-01-01

    Despite the advances in understanding cognitive improvements in executive function in adolescence, much less is known about the influence of affective and social modulators on executive function and the biological underpinnings of these functions and sensitivities. Here, recent behavioral and neuroscientific studies are summarized that have used…

  13. LANDSAT-D investigations in snow hydrology

    NASA Technical Reports Server (NTRS)

    Dozier, J. (Principal Investigator)

    1982-01-01

    Snow reflectance in all 6 TM reflective bands (i.e., 1, 2, 3, 4, 5, and 7) was simulated using a delta-Eddington model. Snow reflectance in bands 4, 5, and 7 appears sensitive to grain size. It appears that the TM filters resemble a "square-wave" closely enough that a square-wave can be assumed in calculations. Integrated band reflectance over the actual response functions was calculated using sensor data supplied by Santa Barbara Research Center. Differences between integrating over the actual response functions and the equivalent square wave were negligible. Tables are given which show (1) sensor saturation radiance as a percentage of the solar constant, integrated through the band response function; (2) comparisons of integrations through the sensor response function with integrations over the equivalent square wave; and (3) calculations of integrated reflectance for snow over all reflective TM bands, and water and ice clouds with thickness of 1 mm water equivalent over TM bands 5 and 7. These calculations look encouraging for snow/cloud discrimination with TM bands 5 and 7.
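
    The band integration compared in the abstract is a response-weighted average of the reflectance spectrum. The sketch below illustrates the comparison between an actual (bell-shaped) response and an equivalent square wave; the spectra and response curves are invented for illustration, not the Santa Barbara Research Center sensor data.

        import numpy as np

        def band_reflectance(wavelength, reflectance, response):
            # Band-integrated reflectance: integral of R(lambda)*S(lambda) dlambda
            # divided by the integral of S(lambda) dlambda.
            num = np.trapz(reflectance * response, wavelength)
            den = np.trapz(response, wavelength)
            return num / den

        # Illustrative single band (values are invented, roughly TM band 5 in micrometres)
        wl = np.linspace(1.55, 1.75, 200)
        refl = 0.10 + 0.05 * (wl - wl.min()) / (wl.max() - wl.min())
        actual_response = np.exp(-0.5 * ((wl - 1.65) / 0.05) ** 2)    # bell-shaped response
        square_wave = ((wl > 1.57) & (wl < 1.73)).astype(float)       # equivalent square wave

        print(band_reflectance(wl, refl, actual_response))
        print(band_reflectance(wl, refl, square_wave))   # difference should be small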

  14. Integrating hydrogeophysics and hydrological tracers to characterise the spatial structure of groundwater storage in the critical zone of montane environments

    NASA Astrophysics Data System (ADS)

    Dick, J.; Tetzlaff, D.; Bradford, J.; Soulsby, C.

    2015-12-01

    It is increasingly recognised that groundwater (GW) in montane watersheds has a major influence on the distribution of vegetation communities and ecosystem function, as well as sustaining downstream river flows. In glaciated landscapes, complex and heterogeneous drift deposits can have a dominant influence on GW stores and fluxes, and form a poorly understood component of the critical zone. Given the logistical problems and limitations of drilling observation wells in such terrain, hydrogeophysics has outstanding potential to help characterise aquifer structure and understand shallow GW in the critical zone of montane environments. We present the results of electrical resistivity tomography (ERT) surveys in an intensively monitored 3.2 km² watershed in the Scottish Highlands with a strong glacial past. We sought to characterise the structure and spatial organisation of GW stores in diverse Quaternary drift deposits. This utilized distributed ERT transects that provided a basis for spatial interpolation using geostatistical methods and high resolution LiDAR surveys. Some transects coincided with shallow observation wells that were used to "ground-truth" the inversion of resistivity data. The surveys showed that the drifts covered around 70% of the catchment and varied from 5 m deep on the hillslopes to 40 m in the valleys. The water table was within 0.2 m of the soil surface in the valley bottom areas and about 1.5 m deep on steeper hillslopes. The water content of drifts inferred by the ERT surveys and characterisation of the aquifer properties showed highest water content in the peat (~80%) and basal till (20-30%), and low storage in moraine deposits (10%). Upscaling these estimates of inferred storage to the catchment scale indicated around 2-3 m of GW storage, equivalent to around 4-6 years of effective precipitation. This generally compared well with independent storage estimates inferred from long-term stable isotope time series collected from the aquifers. Elucidating the importance of the critical zone for water storage in montane environments using ERT provides a basis for predicting their likely resistance and resilience to environmental change. This is of practical importance in the Scottish uplands where both climate and land use change are likely to have implications for water availability.
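
    Expressing the inferred storage in years of effective precipitation is a simple ratio. The back-of-envelope check below uses an assumed effective-precipitation value for illustration only, not a figure reported in the study.

        # Groundwater storage expressed in years of effective precipitation
        storage_m = 2.5                     # midpoint of the inferred ~2-3 m of storage
        effective_precip_m_per_yr = 0.55    # assumed illustrative effective precipitation
        print(storage_m / effective_precip_m_per_yr)   # ~4.5 years, consistent with 4-6 years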

  15. Systematic inference of functional phosphorylation events in yeast metabolism.

    PubMed

    Chen, Yu; Wang, Yonghong; Nielsen, Jens

    2017-07-01

    Protein phosphorylation is a post-translational modification that affects proteins by changing their structure and conformation in a rapid and reversible way, and it is an important mechanism for metabolic regulation in cells. Phosphoproteomics enables high-throughput identification of phosphorylation events on metabolic enzymes, but identifying functional phosphorylation events still requires more detailed biochemical characterization. Therefore, development of computational methods for investigating unknown functions of a large number of phosphorylation events identified by phosphoproteomics has received increased attention. We developed a mathematical framework that describes the relationship between the phosphorylation level of a metabolic enzyme and the corresponding flux through the enzyme. Using this framework, it is possible to quantitatively estimate the contribution of phosphorylation events to flux changes. We showed that phosphorylation regulation analysis, combined with a systematic workflow and correlation analysis, can be used for inference of functional phosphorylation events in steady and dynamic conditions, respectively. Using this analysis, we assigned functionality to phosphorylation events of 17 metabolic enzymes in the yeast Saccharomyces cerevisiae, among which 10 are novel. Phosphorylation regulation analysis can not only be extended for inference of other functional post-translational modifications but can also serve as a promising scaffold for multi-omics data integration in systems biology. Matlab codes for flux balance analysis in this study are available in Supplementary material. yhwang@ecust.edu.cn or nielsenj@chalmers.se. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com
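
    The correlation step mentioned above can be pictured as screening enzymes for phosphorylation changes that track flux changes across conditions. The sketch below is a conceptual Python stand-in, not the authors' Matlab framework, and the correlation threshold is an arbitrary illustrative choice.

        import numpy as np

        def flag_candidate_functional_sites(flux, phos, threshold=0.8):
            # Conceptual screen (not the authors' framework): correlate per-enzyme flux
            # changes with phosphorylation level changes across conditions and flag
            # enzymes with strong positive or negative correlation.
            # flux, phos : arrays of shape (n_enzymes, n_conditions)
            flagged = []
            for i in range(flux.shape[0]):
                r = np.corrcoef(flux[i], phos[i])[0, 1]
                if abs(r) >= threshold:
                    flagged.append((i, r))
            return flagged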

  16. Deficits in social perception in opioid maintenance patients, abstinent opioid users and non-opioid users.

    PubMed

    McDonald, Skye; Darke, Shane; Kaye, Sharlene; Torok, Michelle

    2013-03-01

    This study aimed to compare emotion perception and social inference in opioid maintenance patients with abstinent ex-users and non-heroin-using controls, and determine whether any deficits in these abilities could be accounted for by cognitive deficits and/or risk factors for brain damage. Case-control. Sydney, Australia. A total of 125 maintenance patients (MAIN), 50 abstinent opiate users (ABST) and 50 matched controls (CON). The Awareness of Social Inference Test (TASIT) was used to measure emotion perception and social inference. Measures were also taken of executive function, working memory, information processing speed, verbal/non-verbal learning and psychological distress. After adjusting for age, sex, pre-morbid IQ and psychological distress, the MAIN group was impaired relative to CON (β = -0.19, P < 0.05) and ABST (β = -0.19, P < 0.05) on emotion perception and relative to CON (β = -0.25, P < 0.001) and ABST (β = -0.24, P < 0.01) on social inference. In neither case did the CON and ABST groups differ. For both emotion perception (P < 0.001) and social inference (P < 0.001), pre-morbid IQ was a significant independent predictor. Cognitive function was a major predictor of poor emotion perception (β = -0.44, P < 0.001) and social inference (β = -0.48, P < 0.001). Poor emotion recognition was also predicted by number of heroin overdoses (β = -0.14, P < 0.05). Neither time in treatment nor type of maintenance medication (methadone or buprenorphine) was related to performance. People in opioid maintenance treatment may have an impaired capacity for emotion perception and ability to make inferences about social situations. © 2012 The Authors, Addiction © 2012 Society for the Study of Addiction.

  17. Impact of Prematurity and Perinatal Antibiotics on the Developing Intestinal Microbiota: A Functional Inference Study.

    PubMed

    Arboleya, Silvia; Sánchez, Borja; Solís, Gonzalo; Fernández, Nuria; Suárez, Marta; Hernández-Barranco, Ana M; Milani, Christian; Margolles, Abelardo; de Los Reyes-Gavilán, Clara G; Ventura, Marco; Gueimonde, Miguel

    2016-04-29

    The microbial colonization of the neonatal gut provides a critical stimulus for normal maturation and development. This process of early microbiota establishment, known to be affected by several factors, constitutes an important determinant for later health. We studied the establishment of the microbiota in preterm and full-term infants and the impact of perinatal antibiotics upon this process in premature babies. To this end, 16S rRNA gene sequence-based microbiota assessment was performed at phylum level and functional inference analyses were conducted. Moreover, the levels of the main intestinal microbial metabolites, the short-chain fatty acids (SCFA) acetate, propionate and butyrate, were measured by Gas-Chromatography Flame ionization/Mass spectrometry detection. Prematurity affects microbiota composition at phylum level, leading to increases of Proteobacteria and reduction of other intestinal microorganisms. Perinatal antibiotic use further affected the microbiota of the preterm infant. These changes involved a concomitant alteration in the levels of intestinal SCFA. Moreover, functional inference analyses allowed for identifying metabolic pathways potentially affected by prematurity and perinatal antibiotics use. A deficiency or delay in the establishment of normal microbiota function seems to be present in preterm infants. Perinatal antibiotic use, such as intrapartum prophylaxis, affected the early life microbiota establishment in preterm newborns, which may have consequences for later health.

  18. End of the Little Ice Age in the Alps forced by industrial black carbon

    PubMed Central

    Painter, Thomas H.; Flanner, Mark G.; Kaser, Georg; Marzeion, Ben; VanCuren, Richard A.; Abdalati, Waleed

    2013-01-01

    Glaciers in the European Alps began to retreat abruptly from their mid-19th century maximum, marking what appeared to be the end of the Little Ice Age. Alpine temperature and precipitation records suggest that glaciers should instead have continued to grow until circa 1910. Radiative forcing by increasing deposition of industrial black carbon to snow may represent the driver of the abrupt glacier retreats in the Alps that began in the mid-19th century. Ice cores indicate that black carbon concentrations increased abruptly in the mid-19th century and largely continued to increase into the 20th century, consistent with known increases in black carbon emissions from the industrialization of Western Europe. Inferred annual surface radiative forcings increased stepwise to 13–17 W⋅m⁻² between 1850 and 1880, and to 9–22 W⋅m⁻² in the early 1900s, with snowmelt season (April/May/June) forcings reaching greater than 35 W⋅m⁻² by the early 1900s. These snowmelt season radiative forcings would have resulted in additional annual snow melting of as much as 0.9 m water equivalent across the melt season. Simulations of glacier mass balances with radiative forcing-equivalent changes in atmospheric temperatures result in conservative estimates of accumulating negative mass balances of magnitude −15 m water equivalent by 1900 and −30 m water equivalent by 1930, magnitudes and timing consistent with the observed retreat. These results suggest a possible physical explanation for the abrupt retreat of glaciers in the Alps in the mid-19th century that is consistent with existing temperature and precipitation records and reconstructions. PMID:24003138

  19. End of the Little Ice Age in the Alps forced by industrial black carbon.

    PubMed

    Painter, Thomas H; Flanner, Mark G; Kaser, Georg; Marzeion, Ben; VanCuren, Richard A; Abdalati, Waleed

    2013-09-17

    Glaciers in the European Alps began to retreat abruptly from their mid-19th century maximum, marking what appeared to be the end of the Little Ice Age. Alpine temperature and precipitation records suggest that glaciers should instead have continued to grow until circa 1910. Radiative forcing by increasing deposition of industrial black carbon to snow may represent the driver of the abrupt glacier retreats in the Alps that began in the mid-19th century. Ice cores indicate that black carbon concentrations increased abruptly in the mid-19th century and largely continued to increase into the 20th century, consistent with known increases in black carbon emissions from the industrialization of Western Europe. Inferred annual surface radiative forcings increased stepwise to 13-17 W⋅m(-2) between 1850 and 1880, and to 9-22 W⋅m(-2) in the early 1900s, with snowmelt season (April/May/June) forcings reaching greater than 35 W⋅m(-2) by the early 1900s. These snowmelt season radiative forcings would have resulted in additional annual snow melting of as much as 0.9 m water equivalent across the melt season. Simulations of glacier mass balances with radiative forcing-equivalent changes in atmospheric temperatures result in conservative estimates of accumulating negative mass balances of magnitude -15 m water equivalent by 1900 and -30 m water equivalent by 1930, magnitudes and timing consistent with the observed retreat. These results suggest a possible physical explanation for the abrupt retreat of glaciers in the Alps in the mid-19th century that is consistent with existing temperature and precipitation records and reconstructions.
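
    The conversion from a snowmelt-season forcing to additional melt in the two abstracts above is, to first order, an energy-balance ratio. The back-of-envelope check below assumes, simplistically, that all of the extra absorbed energy goes into melting ripe snow over a roughly 91-day season; it is an order-of-magnitude check, not the authors' mass-balance simulation.

        # Order-of-magnitude check: radiative forcing over the melt season -> melt depth
        forcing_w_m2 = 35.0            # snowmelt-season forcing quoted in the abstract
        season_s = 91 * 86400          # April-June, ~91 days in seconds
        latent_heat_fusion = 3.34e5    # J per kg of ice melted
        melt_kg_m2 = forcing_w_m2 * season_s / latent_heat_fusion
        print(melt_kg_m2 / 1000.0)     # ~0.8 m water equivalent, of the order of the quoted 0.9 m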

  20. Magnetic Footpoint Velocities: A Combination Of Minimum Energy Fit And Local Correlation Tracking

    NASA Astrophysics Data System (ADS)

    Belur, Ravindra; Longcope, D.

    2006-06-01

    Many numerical and time-dependent MHD simulations of the solar atmosphere require underlying velocity fields that are consistent with the induction equation. Recently, Longcope (2004) introduced a new technique to infer the photospheric velocity field from a sequence of vector magnetograms in agreement with the induction equation. The method, the Minimum Energy Fit (MEF), determines a set of velocities and selects the one with the smallest overall flow speed by minimizing an energy functional. The inferred velocity can be further constrained by information about the velocity inferred from other techniques. With this adopted technique we would expect the inferred velocity to be close to the photospheric velocity of magnetic footpoints. Here, we demonstrate that the horizontal velocities inferred from LCT can be used to constrain the MEF velocities. We also apply this technique to actual vector magnetogram sequences and compare these velocities with velocities from LCT alone. This work is supported by DoD MURI and NSF SHINE programs.

  1. The influence of category coherence on inference about cross-classified entities.

    PubMed

    Patalano, Andrea L; Wengrovitz, Steven M; Sharpes, Kirsten M

    2009-01-01

    A critical function of categories is their use in property inference (Heit, 2000). However, one challenge to using categories in inference is that most entities in the world belong to multiple categories (e.g., Fido could be a dog, a pet, a mammal, or a security system). Building on Patalano, Chin-Parker, and Ross (2006), we tested the hypothesis that category coherence (the extent to which category features go together in light of prior knowledge) influences the selection of categories for use in property inference about cross-classified entities. In two experiments, we directly contrasted coherent and incoherent categories, both of which included cross-classified entities as members, and we found that the coherent categories were used more readily as the source of both property transfer and property extension. We conclude that category coherence, which has been found to be a potent influence on strength of inference for singly classified entities (Rehder & Hastie, 2004), is also central to category use in reasoning about novel cross-classified ones.

  2. Rethinking fast and slow based on a critique of reaction-time reverse inference

    PubMed Central

    Krajbich, Ian; Bartling, Björn; Hare, Todd; Fehr, Ernst

    2015-01-01

    Do people intuitively favour certain actions over others? In some dual-process research, reaction-time (RT) data have been used to infer that certain choices are intuitive. However, the use of behavioural or biological measures to infer mental function, popularly known as ‘reverse inference', is problematic because it does not take into account other sources of variability in the data, such as discriminability of the choice options. Here we use two example data sets obtained from value-based choice experiments to demonstrate that, after controlling for discriminability (that is, strength-of-preference), there is no evidence that one type of choice is systematically faster than the other. Moreover, using specific variations of a prominent value-based choice experiment, we are able to predictably replicate, eliminate or reverse previously reported correlations between RT and selfishness. Thus, our findings shed crucial light on the use of RT in inferring mental processes and strongly caution against using RT differences as evidence favouring dual-process accounts. PMID:26135809

  3. Affective expressions in groups and inferences about members' relational well-being: The effects of socially engaging and disengaging emotions.

    PubMed

    Rothman, Naomi B; Magee, Joe C

    2016-01-01

    Our findings draw attention to the interpersonal communication function of a relatively unexplored dimension of emotions-the level of social engagement versus disengagement. In four experiments, regardless of valence and target group gender, observers infer greater relational well-being (more cohesiveness and less conflict) between group members from socially engaging (sadness and appreciation) versus disengaging (anger and pride) emotion expressions. Supporting our argument that social (dis)engagement is a critical dimension communicated by these emotions, we demonstrate (1) that inferences about group members' self-interest mediate the effect of socially engaging emotions on cohesiveness and (2) that the influence of socially disengaging emotion expressions on inferences of conflict is attenuated when groups have collectivistic norms (i.e., members value a high level of social engagement). Furthermore, we show an important downstream consequence of these inferences of relational well-being: Groups that seem less cohesive because of their members' proud (versus appreciative) expressions are also expected to have worse task performance.

  4. Rethinking fast and slow based on a critique of reaction-time reverse inference.

    PubMed

    Krajbich, Ian; Bartling, Björn; Hare, Todd; Fehr, Ernst

    2015-07-02

    Do people intuitively favour certain actions over others? In some dual-process research, reaction-time (RT) data have been used to infer that certain choices are intuitive. However, the use of behavioural or biological measures to infer mental function, popularly known as 'reverse inference', is problematic because it does not take into account other sources of variability in the data, such as discriminability of the choice options. Here we use two example data sets obtained from value-based choice experiments to demonstrate that, after controlling for discriminability (that is, strength-of-preference), there is no evidence that one type of choice is systematically faster than the other. Moreover, using specific variations of a prominent value-based choice experiment, we are able to predictably replicate, eliminate or reverse previously reported correlations between RT and selfishness. Thus, our findings shed crucial light on the use of RT in inferring mental processes and strongly caution against using RT differences as evidence favouring dual-process accounts.

  5. minet: A R/Bioconductor package for inferring large transcriptional networks using mutual information.

    PubMed

    Meyer, Patrick E; Lafitte, Frédéric; Bontempi, Gianluca

    2008-10-29

    This paper presents the R/Bioconductor package minet (version 1.1.6), which provides a set of functions to infer mutual information networks from a dataset. Once fed with a microarray dataset, the package returns a network where nodes denote genes, edges model statistical dependencies between genes and the weight of an edge quantifies the statistical evidence of a specific (e.g. transcriptional) gene-to-gene interaction. Four different entropy estimators are made available in the package minet (empirical, Miller-Madow, Schurmann-Grassberger and shrink) as well as four different inference methods, namely relevance networks, ARACNE, CLR and MRNET. Also, the package integrates accuracy assessment tools, like F-scores, PR-curves and ROC-curves in order to compare the inferred network with a reference one. The package minet provides a series of tools for inferring transcriptional networks from microarray data. It is freely available from the Comprehensive R Archive Network (CRAN) as well as from the Bioconductor website.
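
    As a rough Python analogue of the relevance-network idea implemented in minet (this is not the package's R API), one can threshold a matrix of pairwise empirical mutual-information estimates, as sketched below with a simple histogram-based estimator.

        import numpy as np

        def mutual_information(x, y, bins=10):
            # Plug-in (empirical) mutual information estimate from a 2D histogram.
            pxy, _, _ = np.histogram2d(x, y, bins=bins)
            pxy = pxy / pxy.sum()
            px = pxy.sum(axis=1, keepdims=True)
            py = pxy.sum(axis=0, keepdims=True)
            nz = pxy > 0
            return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

        def relevance_network(expr, threshold):
            # expr: genes x samples matrix. Returns a symmetric MI-weighted adjacency
            # matrix with entries below the threshold zeroed out (relevance-network style).
            n = expr.shape[0]
            mim = np.zeros((n, n))
            for i in range(n):
                for j in range(i + 1, n):
                    mim[i, j] = mim[j, i] = mutual_information(expr[i], expr[j])
            return np.where(mim >= threshold, mim, 0.0)

    ARACNE, CLR and MRNET start from the same mutual-information matrix and differ in how they post-process it (data-processing-inequality pruning, background z-scoring, and mRMR-style selection, respectively).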

  6. Network inference from functional experimental data (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Desrosiers, Patrick; Labrecque, Simon; Tremblay, Maxime; Bélanger, Mathieu; De Dorlodot, Bertrand; Côté, Daniel C.

    2016-03-01

    Functional connectivity maps of neuronal networks are critical tools to understand how neurons form circuits, how information is encoded and processed by neurons, how memory is shaped, and how these basic processes are altered under pathological conditions. Current light microscopy allows us to observe calcium or electrical activity of thousands of neurons simultaneously, yet assessing comprehensive connectivity maps directly from such data remains a non-trivial analytical task. There exist simple statistical methods, such as cross-correlation and Granger causality, but they only detect linear interactions between neurons. Other more involved inference methods inspired by information theory, such as mutual information and transfer entropy, identify connections between neurons more accurately but also require more computational resources. We carried out a comparative study of common connectivity inference methods. The relative accuracy and computational cost of each method were determined via simulated fluorescence traces generated with realistic computational models of interacting neurons in networks of different topologies (clustered or non-clustered) and sizes (10-1000 neurons). To bridge the computational and experimental works, we observed the intracellular calcium activity of live hippocampal neuronal cultures infected with the fluorescent calcium marker GCaMP6f. The spontaneous activity of the networks, consisting of 50-100 neurons per field of view, was recorded at 20 to 50 Hz on a microscope controlled by homemade software. We implemented all connectivity inference methods in the software, which rapidly loads calcium fluorescence movies, segments the images, extracts the fluorescence traces, and assesses the functional connections (with strengths and directions) between each pair of neurons. We used this software to assess, in real time, the functional connectivity from real calcium imaging data in basal conditions, under plasticity protocols, and in epileptic conditions.
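
    Of the methods compared above, the simplest, cross-correlation, can be sketched as follows. This is an illustrative stand-in, not the homemade acquisition software described in the abstract; the lag search window is an arbitrary choice.

        import numpy as np

        def correlation_connectivity(traces, max_lag=5):
            # Lagged cross-correlation between fluorescence traces.
            # traces : array of shape (n_neurons, n_frames)
            # Returns (strength, lag) matrices; a positive lag for (i, j) means i leads j.
            z = (traces - traces.mean(axis=1, keepdims=True)) / traces.std(axis=1, keepdims=True)
            n, t = z.shape
            strength = np.zeros((n, n))
            lag = np.zeros((n, n), dtype=int)
            for i in range(n):
                for j in range(n):
                    if i == j:
                        continue
                    best_r, best_l = 0.0, 0
                    for l in range(-max_lag, max_lag + 1):
                        r = np.corrcoef(z[i, max_lag:t - max_lag],
                                        z[j, max_lag + l:t - max_lag + l])[0, 1]
                        if abs(r) > abs(best_r):
                            best_r, best_l = r, l
                    strength[i, j], lag[i, j] = best_r, best_l
            return strength, lag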

  7. Protective measurement of the wave function of a single squeezed harmonic-oscillator state

    NASA Astrophysics Data System (ADS)

    Alter, Orly; Yamamoto, Yoshihisa

    1996-05-01

    A scheme for the "protective measurement"

    [Phys. Rev. A 47, 4616 (1993)]
    of the wave function of a squeezed harmonic-oscillator state is described. This protective measurement is shown to be equivalent to a measurement of an ensemble of states. The protective measurement, therefore, allows for a definition of the quantum wave function on a single system. Yet, this equivalency also suggests that both measurement schemes account for the epistemological meaning of the wave function only. The protective measurement requires a full a priori knowledge of the measured state. The intermediate cases, in which only partial a priori information is given, are also discussed.

  8. Evaluating ecological equivalence of created marshes: comparing structural indicators with stable isotope indicators of blue crab trophic support

    USGS Publications Warehouse

    Llewellyn, Chris; LaPeyre, Megan K.

    2010-01-01

    This study sought to examine ecological equivalence of created marshes of different ages using traditional structural measures of equivalence, and tested a relatively novel approach using stable isotopes as a measure of functional equivalence. We compared soil properties, vegetation, nekton communities, and δ13C and δ15N isotope values of blue crab muscle and hepatopancreas tissue and primary producers at created (5-24 years old) and paired reference marshes in SW Louisiana. Paired contrasts indicated that created and reference marshes supported equivalent plant and nekton communities, but differed in soil characteristics. Stable isotope indicators examining blue crab food web support found that the older marshes (8 years+) were characterized by comparable trophic diversity and breadth compared to their reference marshes. Interpretation of results for the youngest site was confounded by the fact that the paired reference, which represented the desired end goal of restoration, contained a greater diversity of basal resources. Stable isotope techniques may give coastal managers an additional tool to assess functional equivalency of created marshes, as measured by trophic support, but may be limited to comparisons of marshes with similar vegetative communities and basal resources, or require the development of robust standardization techniques.

  9. Abnormal agency experiences in schizophrenia patients: Examining the role of psychotic symptoms and familial risk.

    PubMed

    Prikken, Merel; van der Weiden, Anouk; Renes, Robert A; Koevoets, Martijn G J C; Heering, Henriette D; Kahn, René S; Aarts, Henk; van Haren, Neeltje E M

    2017-04-01

    Experiencing self-agency over one's own action outcomes is essential for social functioning. Recent research revealed that patients with schizophrenia do not use implicitly available information about their action-outcomes (i.e., prime-based agency inference) to arrive at self-agency experiences. Here, we examined whether this is related to symptoms and/or familial risk to develop the disease. Fifty-four patients, 54 controls, and 19 unaffected (and unrelated) siblings performed an agency inference task, in which experienced agency was measured over action-outcomes that matched or mismatched outcome-primes that were presented before action performance. The Positive and Negative Syndrome Scale (PANSS) and Comprehensive Assessment of Symptoms and History (CASH) were administered to assess psychopathology. Impairments in prime-based inferences did not differ between patients with symptoms of over- and underattribution. However, patients with agency underattribution symptoms reported significantly lower overall self-agency experiences. Siblings displayed stronger prime-based agency inferences than patients, but weaker prime-based inferences than healthy controls. However, these differences were not statistically significant. Findings suggest that impairments in prime-based agency inferences may be a trait characteristic of schizophrenia. Moreover, this study may stimulate further research on the familial basis and the clinical relevance of impairments in implicit agency inferences. Copyright © 2017 Elsevier Ireland Ltd. All rights reserved.

  10. Annotation of gene function in citrus using gene expression information and co-expression networks

    PubMed Central

    2014-01-01

    Background The genus Citrus encompasses major cultivated plants such as sweet orange, mandarin, lemon and grapefruit, among the world’s most economically important fruit crops. With increasing volumes of transcriptomics data available for these species, Gene Co-expression Network (GCN) analysis is a viable option for predicting gene function at a genome-wide scale. GCN analysis is based on a “guilt-by-association” principle whereby genes encoding proteins involved in similar and/or related biological processes may exhibit similar expression patterns across diverse sets of experimental conditions. While bioinformatics resources such as GCN analysis are widely available for efficient gene function prediction in model plant species including Arabidopsis, soybean and rice, in citrus these tools are not yet developed. Results We have constructed a comprehensive GCN for citrus inferred from 297 publicly available Affymetrix Genechip Citrus Genome microarray datasets, providing gene co-expression relationships at a genome-wide scale (33,000 transcripts). The comprehensive citrus GCN consists of a global GCN (condition-independent) and four condition-dependent GCNs that survey the sweet orange species only, all citrus fruit tissues, all citrus leaf tissues, or stress-exposed plants. All of these GCNs are clustered using genome-wide, gene-centric (guide) and graph clustering algorithms for flexibility of gene function prediction. For each putative cluster, gene ontology (GO) enrichment and gene expression specificity analyses were performed to enhance gene function, expression and regulation pattern prediction. The guide-gene approach was used to infer novel roles of genes involved in disease susceptibility and vitamin C metabolism, and graph-clustering approaches were used to investigate isoprenoid/phenylpropanoid metabolism in citrus peel, and citric acid catabolism via the GABA shunt in citrus fruit. Conclusions Integration of citrus gene co-expression networks, functional enrichment analysis and gene expression information provide opportunities to infer gene function in citrus. We present a publicly accessible tool, Network Inference for Citrus Co-Expression (NICCE, http://citrus.adelaide.edu.au/nicce/home.aspx), for the gene co-expression analysis in citrus. PMID:25023870
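
    The guilt-by-association principle underlying GCN analysis can be reduced to a small sketch: rank genes by co-expression with a guide gene. This illustrates the idea only; it is not the NICCE tool or its clustering algorithms, and the gene identifiers are placeholders.

        import numpy as np

        def guide_gene_neighbours(expr, gene_ids, guide, top_k=20):
            # Guilt-by-association sketch: rank genes by Pearson correlation with a guide gene.
            # expr     : genes x samples expression matrix
            # gene_ids : list of gene identifiers, one per row of expr
            # guide    : identifier of the guide gene
            corr = np.corrcoef(expr)             # genes x genes correlation matrix
            g = gene_ids.index(guide)
            order = np.argsort(-corr[g])         # most positively co-expressed first
            return [(gene_ids[i], corr[g, i]) for i in order if i != g][:top_k]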

  11. PanFP: Pangenome-based functional profiles for microbial communities

    DOE PAGES

    Jun, Se -Ran; Hauser, Loren John; Schadt, Christopher Warren; ...

    2015-09-26

    For decades there has been increasing interest in understanding the relationships between microbial communities and ecosystem functions. Current DNA sequencing technologies allow for the exploration of microbial communities in two principal ways: targeted rRNA gene surveys and shotgun metagenomics. For large study designs, it is often still prohibitively expensive to sequence metagenomes at both the breadth and depth necessary to statistically capture the true functional diversity of a community. Although rRNA gene surveys provide no direct evidence of function, they do provide a reasonable estimation of microbial diversity, while being a very cost-effective way to screen samples of interest for later shotgun metagenomic analyses. However, there is a great deal of 16S rRNA gene survey data currently available from diverse environments, and thus a need for tools to infer functional composition of environmental samples based on 16S rRNA gene survey data. As a result, we present a computational method called pangenome-based functional profiles (PanFP), which infers functional profiles of microbial communities from 16S rRNA gene survey data for Bacteria and Archaea. PanFP is based on pangenome reconstruction of a 16S rRNA gene operational taxonomic unit (OTU) from known genes and genomes pooled from the OTU's taxonomic lineage. From this lineage, we derive an OTU functional profile by weighting a pangenome's functional profile with the OTU's abundance observed in a given sample. We validated our method by comparing PanFP to the functional profiles obtained from the direct shotgun metagenomic measurement of 65 diverse communities via Spearman correlation coefficients. These correlations improved with increasing sequencing depth, within the range of 0.8-0.9 for the most deeply sequenced Human Microbiome Project mock community samples. PanFP is very similar in performance to another recently released tool, PICRUSt, for almost all of the survey data analysed here. But our method is unique in that any OTU building method can be used, as opposed to being limited to closed-reference OTU picking strategies against specific reference sequence databases. In conclusion, we developed an automated computational method, which derives an inferred functional profile based on the 16S rRNA gene surveys of microbial communities. The inferred functional profile provides a cost-effective way to study complex ecosystems through predicted comparative functional metagenomes and metadata analysis. All PanFP source code and additional documentation are freely available online at GitHub.

  12. PanFP: pangenome-based functional profiles for microbial communities.

    PubMed

    Jun, Se-Ran; Robeson, Michael S; Hauser, Loren J; Schadt, Christopher W; Gorin, Andrey A

    2015-09-26

    For decades there has been increasing interest in understanding the relationships between microbial communities and ecosystem functions. Current DNA sequencing technologies allow for the exploration of microbial communities in two principal ways: targeted rRNA gene surveys and shotgun metagenomics. For large study designs, it is often still prohibitively expensive to sequence metagenomes at both the breadth and depth necessary to statistically capture the true functional diversity of a community. Although rRNA gene surveys provide no direct evidence of function, they do provide a reasonable estimation of microbial diversity, while being a very cost-effective way to screen samples of interest for later shotgun metagenomic analyses. However, there is a great deal of 16S rRNA gene survey data currently available from diverse environments, and thus a need for tools to infer functional composition of environmental samples based on 16S rRNA gene survey data. We present a computational method called pangenome-based functional profiles (PanFP), which infers functional profiles of microbial communities from 16S rRNA gene survey data for Bacteria and Archaea. PanFP is based on pangenome reconstruction of a 16S rRNA gene operational taxonomic unit (OTU) from known genes and genomes pooled from the OTU's taxonomic lineage. From this lineage, we derive an OTU functional profile by weighting a pangenome's functional profile with the OTU's abundance observed in a given sample. We validated our method by comparing PanFP to the functional profiles obtained from the direct shotgun metagenomic measurement of 65 diverse communities via Spearman correlation coefficients. These correlations improved with increasing sequencing depth, within the range of 0.8-0.9 for the most deeply sequenced Human Microbiome Project mock community samples. PanFP is very similar in performance to another recently released tool, PICRUSt, for almost all of the survey data analysed here. But our method is unique in that any OTU building method can be used, as opposed to being limited to closed-reference OTU picking strategies against specific reference sequence databases. We developed an automated computational method, which derives an inferred functional profile based on the 16S rRNA gene surveys of microbial communities. The inferred functional profile provides a cost-effective way to study complex ecosystems through predicted comparative functional metagenomes and metadata analysis. All PanFP source code and additional documentation are freely available online at GitHub (https://github.com/srjun/PanFP).
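
    The weighting step described in the abstract amounts to an abundance-weighted sum of per-OTU functional profiles. A minimal sketch (not PanFP's actual code) is shown below; PanFP builds the per-OTU profiles from pangenomes pooled over each OTU's taxonomic lineage, whereas here they are simply assumed as an input matrix.

        import numpy as np

        def sample_functional_profile(otu_abundance, otu_function_profiles):
            # Weight each OTU's (pangenome-derived) functional profile by its abundance
            # in the sample and sum.
            # otu_abundance         : length-n vector of OTU abundances in one sample
            # otu_function_profiles : n x m matrix; row i is OTU i's functional profile
            #                         (e.g. relative counts over m gene families)
            w = otu_abundance / otu_abundance.sum()   # normalise abundances
            return w @ otu_function_profiles          # length-m inferred profile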

  13. A Game-Theoretic Approach to Information-Flow Control via Protocol Composition

    NASA Astrophysics Data System (ADS)

    Alvim, Mário; Chatzikokolakis, Konstantinos; Kawamoto, Yusuke; Palamidessi, Catuscia

    2018-05-01

    In the inference attacks studied in Quantitative Information Flow (QIF), the attacker typically tries to interfere with the system in the attempt to increase its leakage of secret information. The defender, on the other hand, typically tries to decrease leakage by introducing some controlled noise. This noise introduction can be modeled as a type of protocol composition, i.e., a probabilistic choice among different protocols, and its effect on the amount of leakage depends heavily on whether or not this choice is visible to the attacker. In this work, we consider operators for modeling visible and hidden choice in protocol composition, and we study their algebraic properties. We then formalize the interplay between defender and attacker in a game-theoretic framework adapted to the specific issues of QIF, where the payoff is information leakage. We consider various kinds of leakage games, depending on whether players act simultaneously or sequentially, and on whether or not the choices of the defender are visible to the attacker. In the case of sequential games, the choice of the second player is generally a function of the choice of the first player, and his/her probabilistic choice can be either over the possible functions (mixed strategy) or it can be on the result of the function (behavioral strategy). We show that when the attacker moves first in a sequential game with a hidden choice, then behavioral strategies are more advantageous for the defender than mixed strategies. This contrasts with the standard game theory, where the two types of strategies are equivalent. Finally, we establish a hierarchy of these games in terms of their information leakage and provide methods for finding optimal strategies (at the points of equilibrium) for both attacker and defender in the various cases.

  14. De novo inference of protein function from coarse-grained dynamics.

    PubMed

    Bhadra, Pratiti; Pal, Debnath

    2014-10-01

    Inference of molecular function of proteins is the fundamental task in the quest for understanding cellular processes. The task is getting increasingly difficult with thousands of new proteins discovered each day. The difficulty arises primarily due to the lack of a high-throughput experimental technique for assessing protein molecular function, a lacuna that computational approaches are trying hard to fill. The latter too faces a major bottleneck in the absence of clear evidence based on evolutionary information. Here we propose a de novo approach to annotate protein molecular function through structural dynamics match for a pair of segments from two dissimilar proteins, which may share even <10% sequence identity. To screen these matches, corresponding 1 µs coarse-grained (CG) molecular dynamics trajectories were used to compute normalized root-mean-square-fluctuation graphs and select mobile segments, which were, thereafter, matched for all pairs using unweighted three-dimensional autocorrelation vectors. Our in-house custom-built forcefield (FF), extensively validated against dynamics information obtained from experimental nuclear magnetic resonance data, was specifically used to generate the CG dynamics trajectories. The test for correspondence of dynamics-signature of protein segments and function revealed 87% true positive rate and 93.5% true negative rate, on a dataset of 60 experimentally validated proteins, including moonlighting proteins and those with novel functional motifs. A random test against 315 unique fold/function proteins for a negative test gave >99% true recall. A blind prediction on a novel protein appears consistent with additional evidence retrieved therein. This is the first proof-of-principle of generalized use of structural dynamics for inferring protein molecular function leveraging our custom-made CG FF, useful to all. © 2014 Wiley Periodicals, Inc.

  15. Flexible retrieval: When true inferences produce false memories.

    PubMed

    Carpenter, Alexis C; Schacter, Daniel L

    2017-03-01

    Episodic memory involves flexible retrieval processes that allow us to link together distinct episodes, make novel inferences across overlapping events, and recombine elements of past experiences when imagining future events. However, the same flexible retrieval and recombination processes that underpin these adaptive functions may also leave memory prone to error or distortion, such as source misattributions in which details of one event are mistakenly attributed to another related event. To determine whether the same recombination-related retrieval mechanism supports both successful inference and source memory errors, we developed a modified version of an associative inference paradigm in which participants encoded everyday scenes comprised of people, objects, and other contextual details. These scenes contained overlapping elements (AB, BC) that could later be linked to support novel inferential retrieval regarding elements that had not appeared together previously (AC). Our critical experimental manipulation concerned whether contextual details were probed before or after the associative inference test, thereby allowing us to assess whether (a) false memories increased for successful versus unsuccessful inferences, and (b) any such effects were specific to after compared with before participants received the inference test. In each of 4 experiments that used variants of this paradigm, participants were more susceptible to false memories for contextual details after successful than unsuccessful inferential retrieval, but only when contextual details were probed after the associative inference test. These results suggest that the retrieval-mediated recombination mechanism that underlies associative inference also contributes to source misattributions that result from combining elements of distinct episodes. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  16. Inferential functioning in visually impaired children.

    PubMed

    Puche-Navarro, Rebeca; Millán, Rafael

    2007-01-01

    The current study explores the inferential abilities of visually impaired children in a task presented in two formats, manipulative and verbal. The results showed that in the group of visually impaired children, just as with children with normal sight, there was a wide range of inference types. It was found that the visually impaired children perform slightly better in the use of inductive and relational inferences in the verbal format, while in the manipulative format children with normal sight perform better. These results suggest that in inferential functioning of young children, and especially visually impaired children, the format of the task influences performance more than the child's visual ability.

  17. Participant characteristics and observed support in conversations involving people with communication disorders.

    PubMed

    Eriksson, Karin; Hartelius, Lena; Saldert, Charlotta

    2016-10-01

    Communication partner training is an increasingly common approach to improve the possibilities for people with communication disorders to participate in everyday interaction. So far, though, little is known about what conversation partner characteristics might influence the ability to be a supportive partner in conversation. The current study explored possible associations between the observed skill to support a person with communication difficulties in conversation and the following characteristics of the conversation partner; executive function, inference ability, age, education level and relationship to the person with communication disorder. The impact of the aetiology of the communication difficulties was also explored. Thirty-five dyads participated: 23 people with aphasia along with 18 significant others and five enrolled nurses and 12 people with Parkinson's disease along with 10 significant others and two enrolled nurses. Only tendencies of associations were found between observed skill to support conversation and executive function for the significant others and inference ability for the enrolled nurses. Although type of activity involved in the conversation may be a key factor, the results indicate that executive function and ability to make mental inferences may matter for the ability to support a person with communication disorder in conversation.

  18. Inferring Phylogenetic Networks Using PhyloNet.

    PubMed

    Wen, Dingqiao; Yu, Yun; Zhu, Jiafan; Nakhleh, Luay

    2018-07-01

    PhyloNet was released in 2008 as a software package for representing and analyzing phylogenetic networks. At the time of its release, the main functionalities in PhyloNet consisted of measures for comparing network topologies and a single heuristic for reconciling gene trees with a species tree. Since then, PhyloNet has grown significantly. The software package now includes a wide array of methods for inferring phylogenetic networks from data sets of unlinked loci while accounting for both reticulation (e.g., hybridization) and incomplete lineage sorting. In particular, PhyloNet now allows for maximum parsimony, maximum likelihood, and Bayesian inference of phylogenetic networks from gene tree estimates. Furthermore, Bayesian inference directly from sequence data (sequence alignments or biallelic markers) is implemented. Maximum parsimony is based on an extension of the "minimizing deep coalescences" criterion to phylogenetic networks, whereas maximum likelihood and Bayesian inference are based on the multispecies network coalescent. All methods allow for multiple individuals per species. As computing the likelihood of a phylogenetic network is computationally hard, PhyloNet allows for evaluation and inference of networks using a pseudolikelihood measure. PhyloNet summarizes the results of the various analyses and generates phylogenetic networks in the extended Newick format that is readily viewable by existing visualization software.

  19. Weight propagation and equivalent horsepower for alternate-engined cars

    NASA Technical Reports Server (NTRS)

    Klose, G. J.; Kurtz, D. W.

    1978-01-01

    In order to evaluate properly the consequences of replacing conventional Otto-cycle engines with alternate power systems, comparisons must be carried out at the vehicle level with functionally equivalent cars. This paper presents the development and application of a procedure for establishing equivalent vehicles. A systematic weight propagation methodology, based on detailed weight breakdowns and influence factors, yields the vehicle weight impacts due to changes in engine weight and power. Performance-matching criteria, utilizing a vehicle simulation program, are then employed to establish Otto-engine-equivalent vehicles, whose characteristics can form the basis for alternative engine evaluations.

  20. 47 CFR 51.903 - Definitions.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... equivalent of the incumbent local exchange carrier access service provided by a non-incumbent local exchange... or other customer provided by an incumbent local exchange carrier or any functional equivalent of the incumbent local exchange carrier access service provided by a non-incumbent local exchange carrier...

  1. 47 CFR 51.903 - Definitions.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... equivalent of the incumbent local exchange carrier access service provided by a non-incumbent local exchange... or other customer provided by an incumbent local exchange carrier or any functional equivalent of the incumbent local exchange carrier access service provided by a non-incumbent local exchange carrier...

  2. 47 CFR 51.903 - Definitions.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... equivalent of the incumbent local exchange carrier access service provided by a non-incumbent local exchange... or other customer provided by an incumbent local exchange carrier or any functional equivalent of the incumbent local exchange carrier access service provided by a non-incumbent local exchange carrier...

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jun, Se -Ran; Hauser, Loren John; Schadt, Christopher Warren

    For decades there has been increasing interest in understanding the relationships between microbial communities and ecosystem functions. Current DNA sequencing technologies allow for the exploration of microbial communities in two principal ways: targeted rRNA gene surveys and shotgun metagenomics. For large study designs, it is often still prohibitively expensive to sequence metagenomes at both the breadth and depth necessary to statistically capture the true functional diversity of a community. Although rRNA gene surveys provide no direct evidence of function, they do provide a reasonable estimation of microbial diversity, while being a very cost-effective way to screen samples of interest for later shotgun metagenomic analyses. However, there is a great deal of 16S rRNA gene survey data currently available from diverse environments, and thus a need for tools to infer functional composition of environmental samples based on 16S rRNA gene survey data. As a result, we present a computational method called pangenome-based functional profiles (PanFP), which infers functional profiles of microbial communities from 16S rRNA gene survey data for Bacteria and Archaea. PanFP is based on pangenome reconstruction of a 16S rRNA gene operational taxonomic unit (OTU) from known genes and genomes pooled from the OTU's taxonomic lineage. From this lineage, we derive an OTU functional profile by weighting a pangenome's functional profile with the OTU's abundance observed in a given sample. We validated our method by comparing PanFP to the functional profiles obtained from the direct shotgun metagenomic measurement of 65 diverse communities via Spearman correlation coefficients. These correlations improved with increasing sequencing depth, within the range of 0.8-0.9 for the most deeply sequenced Human Microbiome Project mock community samples. PanFP is very similar in performance to another recently released tool, PICRUSt, for almost all of the survey data analysed here. But our method is unique in that any OTU building method can be used, as opposed to being limited to closed-reference OTU picking strategies against specific reference sequence databases. In conclusion, we developed an automated computational method, which derives an inferred functional profile based on the 16S rRNA gene surveys of microbial communities. The inferred functional profile provides a cost-effective way to study complex ecosystems through predicted comparative functional metagenomes and metadata analysis. All PanFP source code and additional documentation are freely available online at GitHub.

  4. The transmission process: A combinatorial stochastic process for the evolution of transmission trees over networks.

    PubMed

    Sainudiin, Raazesh; Welch, David

    2016-12-07

    We derive a combinatorial stochastic process for the evolution of the transmission tree over the infected vertices of a host contact network in a susceptible-infected (SI) model of an epidemic. Models of transmission trees are crucial to understanding the evolution of pathogen populations. We provide an explicit description of the transmission process on the product state space of (rooted planar ranked labelled) binary transmission trees and labelled host contact networks with SI-tags as a discrete-state continuous-time Markov chain. We give the exact probability of any transmission tree when the host contact network is a complete, star or path network - three illustrative examples. We then develop a biparametric Beta-splitting model that directly generates transmission trees with exact probabilities as a function of the model parameters, but without explicitly modelling the underlying contact network, and show that for specific values of the parameters we can recover the exact probabilities for our three example networks through the Markov chain construction that explicitly models the underlying contact network. We use the maximum likelihood estimator (MLE) to consistently infer the two parameters driving the transmission process based on observations of the transmission trees and use the exact MLE to characterize equivalence classes over the space of contact networks with a single initial infection. An exploratory simulation study of the MLEs from transmission trees sampled from three other deterministic and four random families of classical contact networks is conducted to shed light on the relation between the MLEs of these families with some implications for statistical inference along with pointers to further extensions of our models. The insights developed here are also applicable to the simplest models of "meme" evolution in online social media networks through transmission events that can be distilled from observable actions such as "likes", "mentions", "retweets" and "+1s" along with any concomitant comments. Copyright © 2016 The Authors. Published by Elsevier Ltd.. All rights reserved.

  5. Are written and spoken recall of text equivalent?

    PubMed

    Kellogg, Ronald T

    2007-01-01

    Writing is less practiced than speaking, graphemic codes are activated only in writing, and the retrieved representations of the text must be maintained in working memory longer because handwritten output is slower than speech. These extra demands on working memory could result in less effort being given to retrieval during written compared with spoken text recall. To test this hypothesis, college students read or heard Bartlett's "War of the Ghosts" and then recalled the text in writing or speech. Spoken recall produced more accurately recalled propositions and more major distortions (e.g., inferences) than written recall. The results suggest that writing reduces the retrieval effort given to reconstructing the propositions of a text.

  6. Calculation of Dose, Dose Equivalent, and Relative Biological Effectiveness for High Charge and Energy Ion Beams

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Reginatto, M.; Hajnal, F.; Chun, S. Y.

    1995-01-01

    The Green's function for the transport of ions of high charge and energy is utilized with a nuclear fragmentation database to evaluate dose, dose equivalent, and RBE for C3H10T1/2 cell survival and neoplastic transformation as a function of depth in soft tissue. Such evaluations are useful for estimates of biological risk for high altitude aircraft, space operations, accelerator operations, and biomedical applications.

  7. Calculation of dose, dose equivalent, and relative biological effectiveness for high charge and energy ion beams

    NASA Technical Reports Server (NTRS)

    Wilson, J. W.; Chun, S. Y.; Reginatto, M.; Hajnal, F.

    1995-01-01

    The Green's function for the transport of ions of high charge and energy is utilized with a nuclear fragmentation database to evaluate dose, dose equivalent, and RBE for C3H10T1/2 cell survival and neoplastic transformation as a function of depth in soft tissue. Such evaluations are useful for estimates of biological risk for high altitude aircraft, space operations, accelerator operations, and biomedical applications.

  8. Equivalences between nonuniform exponential dichotomy and admissibility

    NASA Astrophysics Data System (ADS)

    Zhou, Linfeng; Lu, Kening; Zhang, Weinian

    2017-01-01

    Relationship between exponential dichotomies and admissibility of function classes is a significant problem for hyperbolic dynamical systems. It was proved that a nonuniform exponential dichotomy implies several admissible pairs of function classes and conversely some admissible pairs were found to imply a nonuniform exponential dichotomy. In this paper we find an appropriate admissible pair of classes of Lyapunov bounded functions which is equivalent to the existence of nonuniform exponential dichotomy on half-lines R± separately, on both half-lines R± simultaneously, and on the whole line R. Additionally, the maximal admissibility is proved in the case on both half-lines R± simultaneously.

  9. Modal Decomposition of TTV: Inferring Planet Masses and Eccentricities

    NASA Astrophysics Data System (ADS)

    Linial, Itai; Gilbaum, Shmuel; Sari, Re’em

    2018-06-01

    Transit timing variations (TTVs) are a powerful tool for characterizing the properties of transiting exoplanets. However, inferring planet properties from the observed timing variations is a challenging task, which is usually addressed by extensive numerical searches. We propose a new, computationally inexpensive method for inverting TTV signals in a planetary system of two transiting planets. To the lowest order in planetary masses and eccentricities, TTVs can be expressed as a linear combination of three functions, which we call the TTV modes. These functions depend only on the planets’ linear ephemerides, and can be either constructed analytically, or by performing three orbital integrations of the three-body system. Given a TTV signal, the underlying physical parameters are found by decomposing the data as a sum of the TTV modes. We demonstrate the use of this method by inferring the mass and eccentricity of six Kepler planets that were previously characterized in other studies. Finally we discuss the implications and future prospects of our new method.
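
    Once the three mode functions are in hand (computed analytically or from three orbital integrations, as described above), the inversion reduces to an ordinary least-squares fit. The sketch below illustrates that linear decomposition only; the mode arrays and coefficients are hypothetical placeholders, not the authors' code.

```python
# Decompose an observed TTV series into three precomputed TTV modes by least squares.
import numpy as np

def decompose_ttv(ttv_obs, mode1, mode2, mode3):
    """Best-fit coefficients (a1, a2, a3) with ttv_obs ~ a1*mode1 + a2*mode2 + a3*mode3."""
    A = np.column_stack([mode1, mode2, mode3])   # design matrix, one column per mode
    coeffs, *_ = np.linalg.lstsq(A, ttv_obs, rcond=None)
    return coeffs

# Toy example with synthetic modes evaluated at 20 transit epochs.
epochs = np.arange(20)
m1, m2, m3 = np.sin(0.3 * epochs), np.cos(0.3 * epochs), epochs - epochs.mean()
observed = 1.5 * m1 - 0.4 * m2 + 0.02 * m3
print(decompose_ttv(observed, m1, m2, m3))       # recovers approximately [1.5, -0.4, 0.02]
```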

  10. The link between social cognition and self-referential thought in the medial prefrontal cortex.

    PubMed

    Mitchell, Jason P; Banaji, Mahzarin R; Macrae, C Neil

    2005-08-01

    The medial prefrontal cortex (mPFC) has been implicated in seemingly disparate cognitive functions, such as understanding the minds of other people and processing information about the self. This functional overlap would be expected if humans use their own experiences to infer the mental states of others, a basic postulate of simulation theory. Neural activity was measured while participants attended to either the mental or physical aspects of a series of other people. To permit a test of simulation theory's prediction that inferences based on self-reflection should only be made for similar others, targets were subsequently rated for their degree of similarity to self. Parametric analyses revealed a region of the ventral mPFC--previously implicated in self-referencing tasks--in which activity correlated with perceived self/other similarity, but only for mentalizing trials. These results suggest that self-reflection may be used to infer the mental states of others when they are sufficiently similar to self.

  11. Inferring genome-wide interplay landscape between DNA methylation and transcriptional regulation.

    PubMed

    Tang, Binhua; Wang, Xin

    2015-01-01

    DNA methylation and transcriptional regulation play important roles in cancer cell development and differentiation processes. Based on the currently available cell line profiling information from the ENCODE Consortium, we propose a Bayesian inference model to infer and construct the genome-wide interaction landscape between DNA methylation and transcriptional regulation, which sheds light on the underlying complex functional mechanisms important within the human cancer and disease context. For the first time, we select profiling information for all currently available cell lines (≥20) and transcription factors (≥80) from the ENCODE Consortium portal. Through the integration of those genome-wide profiling sources, our genome-wide analysis detects multiple functional loci of interest, and indicates that DNA methylation is cell- and region-specific, due to the interplay mechanisms with transcription regulatory activities. We validate our analysis results with the corresponding RNA-sequencing technique for those detected genomic loci. Our results provide novel and meaningful insights for the interplay mechanisms of transcriptional regulation and gene expression for human cancer and disease studies.

  12. Genetic Network Inference: From Co-Expression Clustering to Reverse Engineering

    NASA Technical Reports Server (NTRS)

    Dhaeseleer, Patrik; Liang, Shoudan; Somogyi, Roland

    2000-01-01

    Advances in molecular biological, analytical, and computational technologies are enabling us to systematically investigate the complex molecular processes underlying biological systems. In particular, using high-throughput gene expression assays, we are able to measure the output of the gene regulatory network. We aim here to review data-mining and modeling approaches for conceptualizing and unraveling the functional relationships implicit in these datasets. Clustering of co-expression profiles allows us to infer shared regulatory inputs and functional pathways. We discuss various aspects of clustering, ranging from distance measures to clustering algorithms and multiple-cluster memberships. More advanced analysis aims to infer causal connections between genes directly, i.e., who is regulating whom and how. We discuss several approaches to the problem of reverse engineering of genetic networks, from discrete Boolean networks to continuous linear and non-linear models. We conclude that the combination of predictive modeling with systematic experimental verification will be required to gain a deeper insight into living organisms, therapeutic targeting, and bioengineering.

  13. The dorsal anterior cingulate cortex is selective for pain: Results from large-scale reverse inference

    PubMed Central

    Lieberman, Matthew D.; Eisenberger, Naomi I.

    2015-01-01

    Dorsal anterior cingulate cortex (dACC) activation is commonly observed in studies of pain, executive control, conflict monitoring, and salience processing, making it difficult to interpret the dACC's specific psychological function. Using Neurosynth, an automated brain-mapping database of over 10,000 functional MRI (fMRI) studies, we performed quantitative reverse inference analyses to explore the best general psychological account of dACC function, P(Ψ process | dACC activity). Results clearly indicated that the best psychological description of dACC function was related to pain processing, not executive, conflict, or salience processing. We conclude by considering that physical pain may be an instance of a broader class of survival-relevant goals monitored by the dACC, in contrast to more arbitrary temporary goals, which may be monitored by the supplementary motor area. PMID:26582792
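
    At its core, a reverse inference of this kind is an application of Bayes' rule over a study database: the probability of a psychological process given activation is computed from how often activation accompanies each process and from the processes' base rates. The numbers below are purely hypothetical and only illustrate the arithmetic; they are not Neurosynth values.

```python
# Hypothetical reverse-inference calculation: P(process | dACC activation) from
# forward probabilities P(activation | process) and process base rates (Bayes' rule).
def reverse_inference(p_act_given_proc, base_rates):
    evidence = sum(p_act_given_proc[p] * base_rates[p] for p in base_rates)
    return {p: p_act_given_proc[p] * base_rates[p] / evidence for p in base_rates}

p_act = {"pain": 0.60, "executive": 0.25, "conflict": 0.20, "salience": 0.15}  # hypothetical
base = {"pain": 0.10, "executive": 0.40, "conflict": 0.30, "salience": 0.20}   # hypothetical
print(reverse_inference(p_act, base))
```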

  14. SNPit: a federated data integration system for the purpose of functional SNP annotation.

    PubMed

    Shen, Terry H; Carlson, Christopher S; Tarczy-Hornoch, Peter

    2009-08-01

    Genome wide association studies can potentially identify the genetic causes behind the majority of human diseases. With the advent of more advanced genotyping techniques, there is now an explosion of data gathered on single nucleotide polymorphisms (SNPs). The need exists for an integrated system that can provide up-to-date functional annotation information on SNPs. We have developed the SNP Integration Tool (SNPit) system to address this need. Built upon a federated data integration system, SNPit provides current information on a comprehensive list of SNP data sources. Additional logical inference analysis was included through an inference engine plug in. The SNPit web servlet is available online for use. SNPit allows users to go to one source for up-to-date information on the functional annotation of SNPs. A tool that can help to integrate and analyze the potential functional significance of SNPs is important for understanding the results from genome wide association studies.

  15. Assessing the equivalence of Web-based and paper-and-pencil questionnaires using differential item and test functioning (DIF and DTF) analysis: a case of the Four-Dimensional Symptom Questionnaire (4DSQ).

    PubMed

    Terluin, Berend; Brouwers, Evelien P M; Marchand, Miquelle A G; de Vet, Henrica C W

    2018-05-01

    Many paper-and-pencil (P&P) questionnaires have been migrated to electronic platforms. Differential item and test functioning (DIF and DTF) analysis constitutes a superior research design to assess measurement equivalence across modes of administration. The purpose of this study was to demonstrate an item response theory (IRT)-based DIF and DTF analysis to assess the measurement equivalence of a Web-based version and the original P&P format of the Four-Dimensional Symptom Questionnaire (4DSQ), measuring distress, depression, anxiety, and somatization. The P&P group (n = 2031) and the Web group (n = 958) consisted of primary care psychology clients. Unidimensionality and local independence of the 4DSQ scales were examined using IRT and Yen's Q3. Bifactor modeling was used to assess the scales' essential unidimensionality. Measurement equivalence was assessed using IRT-based DIF analysis using a 3-stage approach: linking on the latent mean and variance, selection of anchor items, and DIF testing using the Wald test. DTF was evaluated by comparing expected scale scores as a function of the latent trait. The 4DSQ scales proved to be essentially unidimensional in both modalities. Five items, belonging to the distress and somatization scales, displayed small amounts of DIF. DTF analysis revealed that the impact of DIF on the scale level was negligible. IRT-based DIF and DTF analysis is demonstrated as a way to assess the equivalence of Web-based and P&P questionnaire modalities. Data obtained with the Web-based 4DSQ are equivalent to data obtained with the P&P version.

  16. Copper-transporting P-type ATPases use a unique ion-release pathway

    PubMed Central

    Andersson, Magnus; Mattle, Daniel; Sitsel, Oleg; Nielsen, Anna Marie; White, Stephen H.; Nissen, Poul; Gourdon, Pontus

    2014-01-01

    Heavy metals in cells are typically regulated by PIB-type ATPases such as the copper transporting Cu+-ATPases. The first crystal structure of a Cu+-ATPase (LpCopA) was trapped in a transition state of dephosphorylation (E2.Pi) and inferred to be occluded. The structure revealed a PIB-specific topology and suggested a copper transport pathway across the membrane. Here we show by molecular dynamics (MD) simulations that extracellular water solvates the transmembrane (TM) domain, indicative of a pathway for Cu+ release. Furthermore, a new LpCopA crystal structure determined at 2.8 Å resolution, trapped in the E2P state (which is associated with extracellular exchange in PII-type ATPases), delineates the same conduit as also further supported by site-directed mutagenesis. The E2P and E2.Pi states therefore appear equivalent and open to the extracellular side, in contrast to PII-type ATPases where the E2.Pi state is occluded. This indicates that Cu+-ATPases couple dephosphorylation differently to the conformational changes associated with ion extrusion. The ion pathway may explain why Menkes’ and Wilson’s disease mutations at the extracellular side impair protein function, and points to an accessible site for novel inhibitors targeting Cu+-ATPases of pathogens. PMID:24317491

  17. Maximum entropy principle for stationary states underpinned by stochastic thermodynamics.

    PubMed

    Ford, Ian J

    2015-11-01

    The selection of an equilibrium state by maximizing the entropy of a system, subject to certain constraints, is often powerfully motivated as an exercise in logical inference, a procedure where conclusions are reached on the basis of incomplete information. But such a framework can be more compelling if it is underpinned by dynamical arguments, and we show how this can be provided by stochastic thermodynamics, where an explicit link is made between the production of entropy and the stochastic dynamics of a system coupled to an environment. The separation of entropy production into three components allows us to select a stationary state by maximizing the change, averaged over all realizations of the motion, in the principal relaxational or nonadiabatic component, equivalent to requiring that this contribution to the entropy production should become time independent for all realizations. We show that this recovers the usual equilibrium probability density function (pdf) for a conservative system in an isothermal environment, as well as the stationary nonequilibrium pdf for a particle confined to a potential under nonisothermal conditions, and a particle subject to a constant nonconservative force under isothermal conditions. The two remaining components of entropy production account for a recently discussed thermodynamic anomaly between over- and underdamped treatments of the dynamics in the nonisothermal stationary state.
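
    For reference, the constrained maximization underlying the maximum entropy principle can be written in its textbook form as follows; this is the generic statement of the principle, not the paper's stochastic-thermodynamic derivation.

```latex
% Textbook maximum-entropy setup: maximize the Gibbs-Shannon entropy subject to
% normalization and expectation constraints; lambda_i are Lagrange multipliers.
\begin{aligned}
\max_{p}\quad & S[p] = -\int p(x)\,\ln p(x)\,\mathrm{d}x \\
\text{s.t.}\quad & \int p(x)\,\mathrm{d}x = 1,
\qquad \int f_i(x)\,p(x)\,\mathrm{d}x = \langle f_i \rangle ,
\end{aligned}
\qquad\Longrightarrow\qquad
p(x) = \frac{1}{Z(\lambda)}\exp\!\Big(-\sum_i \lambda_i f_i(x)\Big),
\quad Z(\lambda) = \int \exp\!\Big(-\sum_i \lambda_i f_i(x)\Big)\,\mathrm{d}x .
```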

  18. Convergent evolution as natural experiment: the tape of life reconsidered.

    PubMed

    Powell, Russell; Mariscal, Carlos

    2015-12-06

    Stephen Jay Gould argued that replaying the 'tape of life' would result in radically different evolutionary outcomes. Recently, biologists and philosophers of science have paid increasing attention to the theoretical importance of convergent evolution (the independent origination of similar biological forms and functions), which many interpret as evidence against Gould's thesis. In this paper, we examine the evidentiary relevance of convergent evolution for the radical contingency debate. We show that under the right conditions, episodes of convergent evolution can constitute valid natural experiments that support inferences regarding the deep counterfactual stability of macroevolutionary outcomes. However, we argue that proponents of convergence have problematically lumped causally heterogeneous phenomena into a single evidentiary basket, in effect treating all convergent events as if they are of equivalent theoretical import. As a result, the 'critique from convergent evolution' fails to engage with key claims of the radical contingency thesis. To remedy this, we develop ways to break down the heterogeneous set of convergent events based on the nature of the generalizations they support. Adopting this more nuanced approach to convergent evolution allows us to differentiate iterated evolutionary outcomes that are probably common among alternative evolutionary histories and subject to law-like generalizations, from those that do little to undermine, and may even support, the Gouldian view of life.

  19. The effect of a paraffin screen on the neutron dose at the maze door of a 15 MV linear accelerator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Krmar, M.; Kuzmanović, A.; Nikolić, D.

    2013-08-15

    Purpose: The purpose of this study was to explore the effects of a paraffin screen located at various positions in the maze on the neutron dose equivalent at the maze door. Methods: The neutron dose equivalent was measured at the maze door of a room containing a 15 MV linear accelerator for x-ray therapy. Measurements were performed for several positions of the paraffin screen, which covered only 27.5% of the cross-sectional area of the maze. The neutron dose equivalent was also measured at all screen positions. Two simple models of the neutron source were considered: the first assumed that the source was the cross-sectional area at the inner entrance of the maze, radiating neutrons in an isotropic manner. In the second model, the reduction in the neutron dose equivalent at the maze door due to the paraffin screen was considered to be a function of the mean values of the neutron fluence and energy at the screen. Results: The results of this study indicate that the equivalent dose at the maze door was reduced by a factor of 3 through the use of a paraffin screen placed inside the maze. It was also determined that the contributions to the dose from areas that were not covered by the paraffin screen, as viewed from the dosimeter, were 2.5 times higher than the contributions from the covered areas. This study also concluded that the contributions of the maze walls, ceiling, and floor to the total neutron dose equivalent were an order of magnitude lower than those from the surface at the far end of the maze. Conclusions: This study demonstrated that a paraffin screen could be used to reduce the neutron dose equivalent at the maze door by a factor of 3. This paper also found that the reduction of the neutron dose equivalent was a linear function of the area covered by the maze screen and that the decrease in the dose at the maze door could be modeled as an exponential function of the product φ·E at the screen.
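
    The two empirical relations stated in the conclusions can be written compactly as below. The constants a and k are hypothetical fit parameters introduced only for illustration; they are not values reported in the study.

```latex
% Schematic forms of the reported dependencies (a, k are hypothetical fit constants):
% the reduction in dose equivalent scales linearly with the covered area, and the dose
% decrease at the maze door varies exponentially with the product of the mean neutron
% fluence and mean energy at the screen.
\frac{\Delta H}{H_{0}} \;=\; a\,A_{\mathrm{covered}},
\qquad
\frac{H_{\mathrm{door}}}{H_{0}} \;\approx\; \exp\!\left(-k\,\bar{\varphi}\,\bar{E}\right).
```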

  20. Extending local canonical correlation analysis to handle general linear contrasts for FMRI data.

    PubMed

    Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar

    2012-01-01

    Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic.

  1. Inferred UV Fluence Focal-Spot Profiles from Soft X-Ray Pinhole Camera Measurements on OMEGA

    NASA Astrophysics Data System (ADS)

    Theobald, W.; Sorce, C.; Epstein, R.; Keck, R. L.; Kellogg, C.; Kessler, T. J.; Kwiatkowski, J.; Marshall, F. J.; Seka, W.; Shvydky, A.; Stoeckl, C.

    2017-10-01

    The drive uniformity of OMEGA cryogenic implosions is affected by UV beam-fluence variations on target, which require careful monitoring at full laser power. This is routinely performed with multiple pinhole cameras equipped with charge-injection devices (CIDs) that record the x-ray emission in the 3- to 7-keV photon energy range from an Au-coated target. The technique relies on knowledge of the relation between x-ray fluence Fx and UV fluence FUV, Fx ∝ FUV^γ, with a measured γ = 3.42 for the CID-based diagnostic and a 1-ns laser pulse. It is demonstrated here that using a back-thinned charge-coupled-device camera with softer filtration for x-rays with photon energies <2 keV and a well-calibrated pinhole provides a lower γ ≈ 2 and a larger dynamic range in the measured UV fluence. Inferred UV fluence profiles were measured for 100-ps and 1-ns laser pulses and were compared to directly measured profiles from a UV equivalent-target-plane diagnostic. Good agreement between both techniques is reported for selected beams. This material is based upon work supported by the Department of Energy National Nuclear Security Administration under Award Number DE-NA0001944.

  2. Extending Local Canonical Correlation Analysis to Handle General Linear Contrasts for fMRI Data

    PubMed Central

    Jin, Mingwu; Nandy, Rajesh; Curran, Tim; Cordes, Dietmar

    2012-01-01

    Local canonical correlation analysis (CCA) is a multivariate method that has been proposed to more accurately determine activation patterns in fMRI data. In its conventional formulation, CCA has several drawbacks that limit its usefulness in fMRI. A major drawback is that, unlike the general linear model (GLM), a test of general linear contrasts of the temporal regressors has not been incorporated into the CCA formalism. To overcome this drawback, a novel directional test statistic was derived using the equivalence of multivariate multiple regression (MVMR) and CCA. This extension will allow CCA to be used for inference of general linear contrasts in more complicated fMRI designs without reparameterization of the design matrix and without reestimating the CCA solutions for each particular contrast of interest. With the proper constraints on the spatial coefficients of CCA, this test statistic can yield a more powerful test on the inference of evoked brain regional activations from noisy fMRI data than the conventional t-test in the GLM. The quantitative results from simulated and pseudoreal data and activation maps from fMRI data were used to demonstrate the advantage of this novel test statistic. PMID:22461786

  3. Active inference and epistemic value.

    PubMed

    Friston, Karl; Rigoli, Francesco; Ognibene, Dimitri; Mathys, Christoph; Fitzgerald, Thomas; Pezzulo, Giovanni

    2015-01-01

    We offer a formal treatment of choice behavior based on the premise that agents minimize the expected free energy of future outcomes. Crucially, the negative free energy or quality of a policy can be decomposed into extrinsic and epistemic (or intrinsic) value. Minimizing expected free energy is therefore equivalent to maximizing extrinsic value or expected utility (defined in terms of prior preferences or goals), while maximizing information gain or intrinsic value (or reducing uncertainty about the causes of valuable outcomes). The resulting scheme resolves the exploration-exploitation dilemma: Epistemic value is maximized until there is no further information gain, after which exploitation is assured through maximization of extrinsic value. This is formally consistent with the Infomax principle, generalizing formulations of active vision based upon salience (Bayesian surprise) and optimal decisions based on expected utility and risk-sensitive (Kullback-Leibler) control. Furthermore, as with previous active inference formulations of discrete (Markovian) problems, ad hoc softmax parameters become the expected (Bayes-optimal) precision of beliefs about, or confidence in, policies. This article focuses on the basic theory, illustrating the ideas with simulations. A key aspect of these simulations is the similarity between precision updates and dopaminergic discharges observed in conditioning paradigms.
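
    One schematic way to write the decomposition described above, following the general form used in the active-inference literature (exact notation varies between papers), is:

```latex
% Schematic decomposition of negative expected free energy for a policy pi at a future
% time tau; Q denotes the predictive distribution over outcomes o and hidden states s under pi.
-G(\pi,\tau)
= \underbrace{\mathbb{E}_{Q(o_\tau\mid\pi)}\!\big[\ln P(o_\tau)\big]}_{\text{extrinsic value (expected utility)}}
\;+\;
\underbrace{\mathbb{E}_{Q(o_\tau\mid\pi)}\!\Big[D_{\mathrm{KL}}\!\big[Q(s_\tau\mid o_\tau,\pi)\,\big\|\,Q(s_\tau\mid\pi)\big]\Big]}_{\text{epistemic value (expected information gain)}}
```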

  4. Implications of the Utopia Gravity Anomaly for the Resurfacing of the Northern Plains of Mars

    NASA Technical Reports Server (NTRS)

    Banerdt, W. B.

    2004-01-01

    Whereas the surface units of the northern plains of Mars generally exhibit ages ranging from late Hesperian to Amazonian, interpretation of precise topographic measurements indicates that the age of the underlying "basement" is early Noachian, or almost as old as the southern highlands. This suggests that widespread but relatively superficial resurfacing has occurred throughout the northern plains since the end of early heavy bombardment. In this abstract I examine what the subsurface structure inferred for the Utopia basin from gravity data may imply about the nature of this resurfacing. The large, shallow, circular depression in Utopia Planitia has been identified as a huge impact basin, based on both geological evidence and detailed analysis of MOLA topography. Its diameter (approx. 3000 km) is equivalent to that of the Hellas basin, as is its inferred age (early Noachian). However, whereas Hellas is extremely deep with rough terrain and large slopes, the Utopia basin is a smooth, shallow, almost imperceptible bowl. Conversely, Utopia displays one of the largest (non-Tharsis-related) positive geoid anomalies on Mars, in contrast to a much more subdued negative anomaly over Hellas.

  5. A Comparison of Fuzzy Models in Similarity Assessment of Misregistered Area Class Maps

    NASA Astrophysics Data System (ADS)

    Brown, Scott

    Spatial uncertainty refers to unknown error and vagueness in geographic data. It is relevant to land change and urban growth modelers, soil and biome scientists, geological surveyors and others, who must assess thematic maps for similarity, or categorical agreement. In this paper I build upon prior map comparison research, testing the effectiveness of similarity measures on misregistered data. Though several methods compare uncertain thematic maps, few methods have been tested on misregistration. My objective is to test five map comparison methods for sensitivity to misregistration, including sub-pixel errors in both position and rotation. Methods included four fuzzy categorical models: fuzzy kappa's model, fuzzy inference, cell aggregation, and the epsilon band. The fifth method used conventional crisp classification. I applied these methods to a case study map and simulated data in two sets: a test set with misregistration error, and a control set with equivalent uniform random error. For all five methods, I used raw accuracy or the kappa statistic to measure similarity. Rough-set epsilon bands report the most similarity increase in test maps relative to control data. Conversely, the fuzzy inference model reports a decrease in test map similarity.

  6. Equivalence principle and bound kinetic energy.

    PubMed

    Hohensee, Michael A; Müller, Holger; Wiringa, R B

    2013-10-11

    We consider the role of the internal kinetic energy of bound systems of matter in tests of the Einstein equivalence principle. Using the gravitational sector of the standard model extension, we show that stringent limits on equivalence principle violations in antimatter can be indirectly obtained from tests using bound systems of normal matter. We estimate the bound kinetic energy of nucleons in a range of light atomic species using Green's function Monte Carlo calculations, and for heavier species using a Woods-Saxon model. We survey the sensitivities of existing and planned experimental tests of the equivalence principle, and report new constraints at the level of between a few parts in 10^6 and parts in 10^8 on violations of the equivalence principle for matter and antimatter.

  7. The new physician as unwitting quantum mechanic: is adapting Dirac's inference system best practice for personalized medicine, genomics, and proteomics?

    PubMed

    Robson, Barry

    2007-08-01

    What is the Best Practice for automated inference in Medical Decision Support for personalized medicine? A known system already exists as Dirac's inference system from quantum mechanics (QM), using bra-kets ⟨A|B⟩ and bras ⟨A|, where A and B are states, events, or measurements representing, say, clinical and biomedical rules. Dirac's system should theoretically be the universal best practice for all inference, though QM is notorious for sometimes leading to bizarre conclusions that appear not to be applicable to the macroscopic world of everyday human experience and medical practice. It is here argued that this apparent difficulty vanishes if QM is assigned one new multiplication function @, which conserves conditionality appropriately, making QM applicable to classical inference including a quantitative form of the predicate calculus. An alternative interpretation with the same consequences is that every i = √−1 in Dirac's QM is replaced by h, an entity distinct from 1 and i and arguably a hidden root of 1 such that h² = 1. With that exception, this paper is thus primarily a review of the application of Dirac's system, by application of linear algebra in the complex domain, to help manipulate information about associations and ontology in complicated data. Any combined bra-ket can be shown to be composed only of the sum of QM-like bra and ket weights c(), times an exponential function of Fano's mutual information measure I(A; B) about the association between A and B, that is, an association rule from data mining. With the weights and Fano measure re-expressed as expectations on finite data using Riemann's Incomplete (i.e., Generalized) Zeta Functions, actual counts of observations for real-world sparse data can be readily utilized. Finally, the paper compares identical character, distinguishability of states, events, or measurements, correlation, mutual information, and orthogonal character, important issues in data mining and biomedical analytics, as in QM.

  8. DEFINING THE PLAYERS IN HIGHER-ORDER NETWORKS: PREDICTIVE MODELING FOR REVERSE ENGINEERING FUNCTIONAL INFLUENCE NETWORKS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDermott, Jason E.; Costa, Michelle N.; Stevens, S.L.

    A difficult problem that is currently growing rapidly, due to the sharp increase in the amount of high-throughput data available for many systems, is that of determining useful and informative causative influence networks. These networks can be used to predict behavior given observation of a small number of components, predict behavior at a future time point, or identify components that are critical to the functioning of the system under particular conditions. In these endeavors, incorporating observations of systems from a wide variety of viewpoints can be particularly beneficial, but has often been undertaken with the objective of inferring networks that are generally applicable. The focus of the current work is to integrate both general observations and measurements taken for a particular pathology, that of ischemic stroke, to provide improved ability to produce useful predictions of systems behavior. A number of hybrid approaches have recently been proposed for network generation in which the Gene Ontology is used to filter or enrich network links inferred from gene expression data through reverse engineering methods. These approaches have been shown to improve the biological plausibility of the inferred relationships, but still treat knowledge-based and machine-learning inferences as incommensurable inputs. In this paper, we explore how further improvements may be achieved through a full integration of network inference insights obtained from the Gene Ontology and from reverse engineering methods, with specific reference to the construction of dynamic models of transcriptional regulatory networks. We show that probabilistically integrating approaches to network construction (one based on reverse-engineering from conditional transcriptional data, one based on reverse-engineering from in situ hybridization data, and another based on functional associations derived from the Gene Ontology) can improve clustering results as evaluated by a predictive model of transcriptional expression levels.

  9. 14 CFR 25.1329 - Flight guidance system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... (or equivalent). The autothrust quick disengagement controls must be located on the thrust control... wheel (or equivalent) and thrust control levers. (b) The effects of a failure of the system to disengage... guidance system. (a) Quick disengagement controls for the autopilot and autothrust functions must be...

  10. 14 CFR 25.1329 - Flight guidance system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... (or equivalent). The autothrust quick disengagement controls must be located on the thrust control... wheel (or equivalent) and thrust control levers. (b) The effects of a failure of the system to disengage... guidance system. (a) Quick disengagement controls for the autopilot and autothrust functions must be...

  11. 14 CFR 25.1329 - Flight guidance system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... (or equivalent). The autothrust quick disengagement controls must be located on the thrust control... wheel (or equivalent) and thrust control levers. (b) The effects of a failure of the system to disengage... guidance system. (a) Quick disengagement controls for the autopilot and autothrust functions must be...

  12. 14 CFR 25.1329 - Flight guidance system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... (or equivalent). The autothrust quick disengagement controls must be located on the thrust control... wheel (or equivalent) and thrust control levers. (b) The effects of a failure of the system to disengage... guidance system. (a) Quick disengagement controls for the autopilot and autothrust functions must be...

  13. 14 CFR 25.1329 - Flight guidance system.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... (or equivalent). The autothrust quick disengagement controls must be located on the thrust control... wheel (or equivalent) and thrust control levers. (b) The effects of a failure of the system to disengage... guidance system. (a) Quick disengagement controls for the autopilot and autothrust functions must be...

  14. Membrane voltage changes in passive dendritic trees: a tapering equivalent cylinder model.

    PubMed

    Poznański, R R

    1988-01-01

    An exponentially tapering equivalent cylinder model is employed in order to approximate the loss of the dendritic trunk parameter observed from anatomical data on apical and basilar dendrites of CA1 and CA3 hippocampal pyramidal neurons. This model allows dendritic trees with a relative paucity of branching to be treated. In particular, terminal branches are not required to end at the same electrotonic distance. The Laplace transform method is used to obtain analytic expressions for the Green's function corresponding to an instantaneous pulse of current injected at a single point along a tapering equivalent cylinder with sealed ends. The time course of the voltage in response to an arbitrary input is computed using the Green's function in a convolution integral. Examples of current input considered are (1) an infinitesimally brief (Dirac delta function) pulse and (2) a step pulse. It is demonstrated that inputs located on a tapering equivalent cylinder are more effective at the soma than identically placed inputs on a nontapering equivalent cylinder. Asymptotic solutions are derived to enable the voltage response behaviour over both relatively short and long time periods to be analysed. Semilogarithmic plots of these solutions provide a basis for estimating the membrane time constant tau m from experimental transients. Transient voltage decrement from a clamped soma reveals that tapering tends to reduce the error associated with inadequate voltage clamping of the dendritic membrane. A formula is derived which shows that tapering tends to increase the estimate of the electrotonic length parameter L.
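
    The response computation described above has the standard convolution form; the expression below is schematic, using common cable-theory notation rather than the paper's exact symbols.

```latex
% Voltage at electrotonic position x and time t for a current I(t) injected at x0,
% given the impulse response (Green's function) G of the tapering equivalent cylinder:
V(x,t) \;=\; \int_{0}^{t} G(x, x_{0};\, t-s)\, I(s)\,\mathrm{d}s .
```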

  15. Synthetic resistivity calculations for the canonical depth-to-bedrock problem: A critical examination of the thin interbed problem and electrical equivalence theories

    NASA Astrophysics Data System (ADS)

    Weiss, C. J.; Knight, R.

    2009-05-01

    One of the key factors in the sensible inference of subsurface geologic properties from both field and laboratory experiments is the ability to quantify the linkages between the inherently fine-scale structures, such as bedding planes and fracture sets, and their macroscopic expression through geophysical interrogation. Central to this idea is the concept of a "minimal sampling volume" over which a given geophysical method responds to an effective medium property whose value is dictated by the geometry and distribution of sub-volume heterogeneities as well as the experiment design. In this contribution we explore the concept of effective resistivity volumes for the canonical depth-to-bedrock problem subject to industry-standard DC resistivity survey designs. Four models representing a sedimentary overburden and flat bedrock interface were analyzed through numerical experiments with six different resistivity arrays. In each of the four models, the sedimentary overburden consists of thinly interbedded resistive and conductive laminations, with equivalent volume-averaged resistivity but differing lamination thickness, geometry, and layering sequence. The numerical experiments show striking differences in the apparent resistivity pseudo-sections which belie the volume-averaged equivalence of the models. These models constitute the synthetic data set offered for inversion in this Back to Basics Resistivity Modeling session and offer the promise to further our understanding of how the sampling volume, as affected by survey design, can be constrained by joint-array inversion of resistivity data.

  16. Social representations and contextual adjustments as two distinct components of the Theory of Mind brain network: Evidence from the REMICS task.

    PubMed

    Lavoie, Marie-Audrey; Vistoli, Damien; Sutliff, Stephanie; Jackson, Philip L; Achim, Amélie M

    2016-08-01

    Theory of mind (ToM) refers to the ability to infer the mental states of others. Behavioral measures of ToM usually present information about both a character and the context in which this character is placed, and these different pieces of information can be used to infer the character's mental states. A set of brain regions designated as the ToM brain network is recognized to support (ToM) inferences. Different brain regions within that network could however support different ToM processes. This functional magnetic resonance imaging (fMRI) study aimed to distinguish the brain regions supporting two aspects inherent to many ToM tasks, i.e., the ability to infer or represent mental states and the ability to use the context to adjust these inferences. Nineteen healthy subjects were scanned during the REMICS task, a novel task designed to orthogonally manipulate mental state inferences (as opposed to physical inferences) and contextual adjustments of inferences (as opposed to inferences that do not require contextual adjustments). We observed that mental state inferences and contextual adjustments, which are important aspects of most behavioral ToM tasks, rely on distinct brain regions or subregions within the classical brain network activated in previous ToM research. Notably, an interesting dissociation emerged within the medial prefrontal cortex (mPFC) and temporo-parietal junctions (TPJ) such that the inferior part of these brain regions responded to mental state inferences while the superior part of these brain regions responded to the requirement for contextual adjustments. This study provides evidence that the overall set of brain regions activated during ToM tasks supports different processes, and highlights that cognitive processes related to contextual adjustments have an important role in ToM and should be further studied. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Automatic physical inference with information maximizing neural networks

    NASA Astrophysics Data System (ADS)

    Charnock, Tom; Lavaux, Guilhem; Wandelt, Benjamin D.

    2018-04-01

    Compressing large data sets to a manageable number of summaries that are informative about the underlying parameters vastly simplifies both frequentist and Bayesian inference. When only simulations are available, these summaries are typically chosen heuristically, so they may inadvertently miss important information. We introduce a simulation-based machine learning technique that trains artificial neural networks to find nonlinear functionals of data that maximize Fisher information: information maximizing neural networks (IMNNs). In test cases where the posterior can be derived exactly, likelihood-free inference based on automatically derived IMNN summaries produces nearly exact posteriors, showing that these summaries are good approximations to sufficient statistics. In a series of numerical examples of increasing complexity and astrophysical relevance we show that IMNNs are robustly capable of automatically finding optimal, nonlinear summaries of the data even in cases where linear compression fails: inferring the variance of Gaussian signal in the presence of noise, inferring cosmological parameters from mock simulations of the Lyman-α forest in quasar spectra, and inferring frequency-domain parameters from LISA-like detections of gravitational waveforms. In this final case, the IMNN summary outperforms linear data compression by avoiding the introduction of spurious likelihood maxima. We anticipate that the automatic physical inference method described in this paper will be essential to obtain both accurate and precise cosmological parameter estimates from complex and large astronomical data sets, including those from LSST and Euclid.

  18. Causal learning and inference as a rational process: the new synthesis.

    PubMed

    Holyoak, Keith J; Cheng, Patricia W

    2011-01-01

    Over the past decade, an active line of research within the field of human causal learning and inference has converged on a general representational framework: causal models integrated with bayesian probabilistic inference. We describe this new synthesis, which views causal learning and inference as a fundamentally rational process, and review a sample of the empirical findings that support the causal framework over associative alternatives. Causal events, like all events in the distal world as opposed to our proximal perceptual input, are inherently unobservable. A central assumption of the causal approach is that humans (and potentially nonhuman animals) have been designed in such a way as to infer the most invariant causal relations for achieving their goals based on observed events. In contrast, the associative approach assumes that learners only acquire associations among important observed events, omitting the representation of the distal relations. By incorporating bayesian inference over distributions of causal strength and causal structures, along with noisy-logical (i.e., causal) functions for integrating the influences of multiple causes on a single effect, human judgments about causal strength and structure can be predicted accurately for relatively simple causal structures. Dynamic models of learning based on the causal framework can explain patterns of acquisition observed with serial presentation of contingency data and are consistent with available neuroimaging data. The approach has been extended to a diverse range of inductive tasks, including category-based and analogical inferences.
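
    The "noisy-logical" integration mentioned above is commonly instantiated, for generative causes, as a noisy-OR function: each present cause has an independent chance of producing the effect, and the effect occurs if any cause succeeds. The sketch below shows that standard form; the causal strengths are hypothetical.

```python
# Standard noisy-OR likelihood: P(effect | causes) = 1 - prod_i (1 - w_i)^(c_i),
# where c_i indicates whether cause i is present and w_i is its causal strength.
import numpy as np

def noisy_or(causes_present, strengths):
    causes_present = np.asarray(causes_present, dtype=float)
    strengths = np.asarray(strengths, dtype=float)
    return 1.0 - np.prod((1.0 - strengths) ** causes_present)

print(noisy_or([1, 0], [0.8, 0.3]))   # only cause 1 present -> 0.80
print(noisy_or([1, 1], [0.8, 0.3]))   # both present -> 1 - 0.2*0.7 = 0.86
```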

  19. About probabilistic integration of ill-posed geophysical tomography and logging data: A knowledge discovery approach versus petrophysical transfer function concepts illustrated using cross-borehole radar-, P- and S-wave traveltime tomography in combination with cone penetration and dielectric logging data

    NASA Astrophysics Data System (ADS)

    Paasche, Hendrik

    2018-01-01

    Site characterization requires detailed and ideally spatially continuous information about the subsurface. Geophysical tomographic experiments allow for spatially continuous imaging of physical parameter variations, e.g., seismic wave propagation velocities. Such physical parameters are often related to typical geotechnical or hydrological target parameters, e.g., as obtained from 1D direct-push or borehole logging. Here, the probabilistic inference of 2D tip resistance, sleeve friction, and relative dielectric permittivity distributions in near-surface sediments is constrained by ill-posed cross-borehole seismic P- and S-wave and radar wave traveltime tomography. In doing so, we follow a discovery science strategy employing a fully data-driven approach capable of accounting for tomographic ambiguity and differences in spatial resolution between the geophysical tomograms and the geotechnical logging data used for calibration. We compare the outcome to results achieved employing classical hypothesis-driven approaches, i.e., deterministic transfer functions derived empirically for the inference of 2D sleeve friction from S-wave velocity tomograms and theoretically for the inference of 2D dielectric permittivity from radar wave velocity tomograms. The data-driven approach offers maximal flexibility in combination with very relaxed considerations about the character of the expected links. This makes it a versatile tool applicable to almost any combination of data sets. However, error propagation may be critical and may justify a hypothesis-driven pre-selection of an optimal database, at the risk of excluding relevant information from the analyses. Results achieved with transfer functions rely on information about the nature of the link and optimal calibration settings drawn as retrospective hypotheses by other authors. Applying such transfer functions at other sites turns them into a priori valid hypotheses, which can, particularly for empirically derived transfer functions, result in poor predictions. However, a mindful utilization and critical evaluation of the consequences of turning a retrospectively drawn hypothesis into an a priori valid hypothesis can also yield good results for inference and prediction problems when using classical transfer function concepts.

  20. Sparse Bayesian Inference and the Temperature Structure of the Solar Corona

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Warren, Harry P.; Byers, Jeff M.; Crump, Nicholas A.

    Measuring the temperature structure of the solar atmosphere is critical to understanding how it is heated to high temperatures. Unfortunately, the temperature of the upper atmosphere cannot be observed directly, but must be inferred from spectrally resolved observations of individual emission lines that span a wide range of temperatures. Such observations are "inverted" to determine the distribution of plasma temperatures along the line of sight. This inversion is ill posed and, in the absence of regularization, tends to produce wildly oscillatory solutions. We introduce the application of sparse Bayesian inference to the problem of inferring the temperature structure of the solar corona. Within a Bayesian framework a preference for solutions that utilize a minimum number of basis functions can be encoded into the prior and many ad hoc assumptions can be avoided. We demonstrate the efficacy of the Bayesian approach by considering a test library of 40 assumed temperature distributions.

  1. Reviving and Refining Psychodynamic Interpretation of the Wechsler Intelligence Tests: The Verbal Comprehension Subtests.

    PubMed

    Bram, Anthony D

    2017-01-01

    The Wechsler intelligence tests (currently Wechsler, 2008 , 2014) have traditionally been part of the multimethod test battery favored by psychodynamically oriented assessors. In this tradition, assessors have used Wechsler data to make inferences about personality that transcend cognition. Recent trends in clinical psychology, however, have deemphasized this psychodynamic way of working. In this article, I make a conceptual and clinical case for reviving and refining a psychodynamic approach to inference making about personality using the Wechsler Verbal Comprehension subtests. Specifically, I (a) describe the psychological and environmental conditions sampled by the Wechsler tests, (b) discuss the Wechsler tests conceptually in terms of assessing vulnerability to breakdowns in adaptive defensive functioning, (c) review a general framework for inference making, and (d) offer considerations for and illustrate pragmatic application of the Verbal Comprehension subtests data to make inferences that help answer referral questions and have important treatment implications.

  2. Inferring time derivatives including cell growth rates using Gaussian processes

    NASA Astrophysics Data System (ADS)

    Swain, Peter S.; Stevenson, Keiran; Leary, Allen; Montano-Gutierrez, Luis F.; Clark, Ivan B. N.; Vogel, Jackie; Pilizota, Teuta

    2016-12-01

    Often the time derivative of a measured variable is of as much interest as the variable itself. For a growing population of biological cells, for example, the population's growth rate is typically more important than its size. Here we introduce a non-parametric method to infer first and second time derivatives as a function of time from time-series data. Our approach is based on Gaussian processes and applies to a wide range of data. In tests, the method is at least as accurate as others, but has several advantages: it estimates errors both in the inference and in any summary statistics, such as lag times, and allows interpolation with the corresponding error estimation. As illustrations, we infer growth rates of microbial cells, the rate of assembly of an amyloid fibril and both the speed and acceleration of two separating spindle pole bodies. Our algorithm should thus be broadly applicable.
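
    The key property such a method exploits is that differentiation is a linear operation, so the derivative of a Gaussian-process posterior is itself Gaussian and its mean is obtained by differentiating the kernel. The sketch below is a minimal from-scratch illustration with a squared-exponential kernel and fixed hyperparameters; it is not the authors' published implementation, and the toy growth-curve data are invented.

```python
# Minimal sketch: infer a time derivative from noisy time-series data with a
# Gaussian process (squared-exponential kernel, fixed hyperparameters).
import numpy as np

def sqexp(t1, t2, amp=1.0, ell=1.0):
    d = t1[:, None] - t2[None, :]
    return amp**2 * np.exp(-0.5 * (d / ell) ** 2)

def dsqexp_dt1(t1, t2, amp=1.0, ell=1.0):
    # Derivative of the kernel with respect to its first argument.
    d = t1[:, None] - t2[None, :]
    return -(d / ell**2) * sqexp(t1, t2, amp, ell)

def gp_mean_and_derivative(t, y, t_star, noise=0.1, amp=1.0, ell=1.0):
    K = sqexp(t, t, amp, ell) + noise**2 * np.eye(len(t))
    alpha = np.linalg.solve(K, y)
    mean = sqexp(t_star, t, amp, ell) @ alpha        # posterior mean of f(t*)
    dmean = dsqexp_dt1(t_star, t, amp, ell) @ alpha  # posterior mean of f'(t*)
    return mean, dmean

# Toy data: noisy log-abundance curve; its time derivative is the specific growth rate.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 10.0, 40)
y = np.log(1.0 / (1.0 + np.exp(-(t - 5.0)))) + 0.02 * rng.normal(size=t.size)
f, df = gp_mean_and_derivative(t, y, t, noise=0.02, ell=1.5)
print(df[:5])   # inferred growth rate at the first few time points
```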

  3. Single board system for fuzzy inference

    NASA Technical Reports Server (NTRS)

    Symon, James R.; Watanabe, Hiroyuki

    1991-01-01

    The very large scale integration (VLSI) implementation of a fuzzy logic inference mechanism allows the use of rule-based control and decision making in demanding real-time applications. Researchers designed a full custom VLSI inference engine. The chip was fabricated using CMOS technology. The chip consists of 688,000 transistors of which 476,000 are used for RAM memory. The fuzzy logic inference engine board system incorporates the custom designed integrated circuit into a standard VMEbus environment. The Fuzzy Logic system uses Transistor-Transistor Logic (TTL) parts to provide the interface between the Fuzzy chip and a standard, double height VMEbus backplane, allowing the chip to perform application process control through the VMEbus host. High level C language functions hide details of the hardware system interface from the applications level programmer. The first version of the board was installed on a robot at Oak Ridge National Laboratory in January of 1990.

  4. Game Theory of Mind

    PubMed Central

    Yoshida, Wako; Dolan, Ray J.; Friston, Karl J.

    2008-01-01

    This paper introduces a model of ‘theory of mind’, namely, how we represent the intentions and goals of others to optimise our mutual interactions. We draw on ideas from optimum control and game theory to provide a ‘game theory of mind’. First, we consider the representations of goals in terms of value functions that are prescribed by utility or rewards. Critically, the joint value functions and ensuing behaviour are optimised recursively, under the assumption that I represent your value function, your representation of mine, your representation of my representation of yours, and so on ad infinitum. However, if we assume that the degree of recursion is bounded, then players need to estimate the opponent's degree of recursion (i.e., sophistication) to respond optimally. This induces a problem of inferring the opponent's sophistication, given behavioural exchanges. We show it is possible to deduce whether players make inferences about each other and quantify their sophistication on the basis of choices in sequential games. This rests on comparing generative models of choices with, and without, inference. Model comparison is demonstrated using simulated and real data from a ‘stag-hunt’. Finally, we note that exactly the same sophisticated behaviour can be achieved by optimising the utility function itself (through prosocial utility), producing unsophisticated but apparently altruistic agents. This may be relevant ethologically in hierarchal game theory and coevolution. PMID:19112488

  5. Graded-index fibers, Wigner-distribution functions, and the fractional Fourier transform.

    PubMed

    Mendlovic, D; Ozaktas, H M; Lohmann, A W

    1994-09-10

    Two definitions of a fractional Fourier transform have been proposed previously. One is based on the propagation of a wave field through a graded-index medium, and the other is based on rotating a function's Wigner distribution. It is shown that both definitions are equivalent. An important result of this equivalency is that the Wigner distribution of a wave field rotates as the wave field propagates through a quadratic graded-index medium. The relation with ray-optics phase space is discussed.
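
    For reference, one common convention for the fractional Fourier transform of order a (equivalently, a rotation of the Wigner distribution by φ = aπ/2) is the integral kernel below; normalization conventions vary across the literature.

```latex
% Fractional Fourier transform of order a (rotation angle phi = a*pi/2), one common convention:
\mathcal{F}^{a}[f](u) \;=\; \int_{-\infty}^{\infty} K_{a}(u,u')\, f(u')\,\mathrm{d}u',
\qquad
K_{a}(u,u') \;=\; A_{\phi}\,
\exp\!\Big[i\pi\big(u^{2}\cot\phi - 2\,u\,u'\csc\phi + u'^{2}\cot\phi\big)\Big],
\qquad
A_{\phi} = \sqrt{1 - i\cot\phi},\quad \phi = \tfrac{a\pi}{2}.
```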

  6. CA1 subfield contributions to memory integration and inference

    PubMed Central

    Schlichting, Margaret L.; Zeithamova, Dagmar; Preston, Alison R.

    2014-01-01

    The ability to combine information acquired at different times to make novel inferences is a powerful function of episodic memory. One perspective suggests that by retrieving related knowledge during new experiences, existing memories can be linked to the new, overlapping information as it is encoded. The resulting memory traces would thus incorporate content across event boundaries, representing important relationships among items encountered during separate experiences. While prior work suggests that the hippocampus is involved in linking memories experienced at different times, the involvement of specific subfields in this process remains unknown. Using both univariate and multivariate analyses of high-resolution functional magnetic resonance imaging (fMRI) data, we localized this specialized encoding mechanism to human CA1. Specifically, right CA1 responses during encoding of events that overlapped with prior experience predicted subsequent success on a test requiring inferences about the relationships among events. Furthermore, we employed neural pattern similarity analysis to show that patterns of activation evoked during overlapping event encoding were later reinstated in CA1 during successful inference. The reinstatement of CA1 patterns during inference was specific to those trials that were performed quickly and accurately, consistent with the notion that linking memories during learning facilitates novel judgments. These analyses provide converging evidence that CA1 plays a unique role in encoding overlapping events and highlight the dynamic interactions between hippocampal-mediated encoding and retrieval processes. More broadly, our data reflect the adaptive nature of episodic memories, in which representations are derived across events in anticipation of future judgments. PMID:24888442

  7. The Universe Is Reionizing at z ∼ 7: Bayesian Inference of the IGM Neutral Fraction Using Lyα Emission from Galaxies

    NASA Astrophysics Data System (ADS)

    Mason, Charlotte A.; Treu, Tommaso; Dijkstra, Mark; Mesinger, Andrei; Trenti, Michele; Pentericci, Laura; de Barros, Stephane; Vanzella, Eros

    2018-03-01

    We present a new flexible Bayesian framework for directly inferring the fraction of neutral hydrogen in the intergalactic medium (IGM) during the Epoch of Reionization (EoR, z ∼ 6–10) from detections and non-detections of Lyman Alpha (Lyα) emission from Lyman Break galaxies (LBGs). Our framework combines sophisticated reionization simulations with empirical models of the interstellar medium (ISM) radiative transfer effects on Lyα. We assert that the Lyα line profile emerging from the ISM has an important impact on the resulting transmission of photons through the IGM, and that these line profiles depend on galaxy properties. We model this effect by considering the peak velocity offset of Lyα lines from host galaxies’ systemic redshifts, which are empirically correlated with UV luminosity and redshift (or halo mass at fixed redshift). We use our framework on the sample of LBGs presented in Pentericci et al. and infer a global neutral fraction at z ∼ 7 of x̄_HI = 0.59 (+0.11, −0.15), consistent with other robust probes of the EoR and confirming that reionization is ongoing ∼700 Myr after the Big Bang. We show that using the full distribution of Lyα equivalent width detections and upper limits from LBGs places tighter constraints on the evolving IGM than the standard Lyα emitter fraction, and that larger samples are within reach of deep spectroscopic surveys of gravitationally lensed fields and James Webb Space Telescope NIRSpec.

  8. Earthquake fracture energy inferred from kinematic rupture models on extended faults

    USGS Publications Warehouse

    Tinti, E.; Spudich, P.; Cocco, M.

    2005-01-01

    We estimate fracture energy on extended faults for several recent earthquakes by retrieving dynamic traction evolution at each point on the fault plane from slip history imaged by inverting ground motion waveforms. We define the breakdown work (Wb) as the excess of work over some minimum traction level achieved during slip. Wb is equivalent to "seismological" fracture energy (G) in previous investigations. Our numerical approach uses slip velocity as a boundary condition on the fault. We employ a three-dimensional finite difference algorithm to compute the dynamic traction evolution in the time domain during the earthquake rupture. We estimate Wb by calculating the scalar product between dynamic traction and slip velocity vectors. This approach requires neither specifying a constitutive law nor assuming dynamic traction to be collinear with slip velocity. If these vectors are not collinear, the inferred breakdown work depends on the initial traction level. We show that breakdown work depends on the square of slip. The spatial distribution of breakdown work in a single earthquake is strongly correlated with the slip distribution. Breakdown work density and its integral over the fault, breakdown energy, scale with seismic moment according to a power law (with exponents 0.59 and 1.18, respectively). Our estimates of breakdown work range between 4 × 10^5 and 2 × 10^7 J/m^2 for earthquakes having moment magnitudes between 5.6 and 7.2. We also compare our inferred values with geologic surface energies. This comparison might suggest that breakdown work for large earthquakes goes primarily into heat production. Copyright 2005 by the American Geophysical Union.
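
    As a point of reference, the verbal definition above corresponds to a breakdown work density of the schematic form (notation ours, not the authors'):

      $$ W_b = \int_0^{T_r} \left[ \boldsymbol{\tau}(t) - \boldsymbol{\tau}_{\min} \right] \cdot \dot{\mathbf{u}}(t)\, dt $$

    where τ(t) is the dynamic traction vector on the fault, τ_min the minimum traction level achieved during slip, u̇(t) the slip-velocity vector, and T_r the local slip duration; integrating W_b over the fault surface gives the breakdown energy discussed above.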

  9. Bayesian Model Selection in Geophysics: The evidence

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2016-12-01

    Bayesian inference has found widespread application and use in science and engineering to reconcile Earth system models with data, including prediction in space (interpolation), prediction in time (forecasting), assimilation of observations and deterministic/stochastic model output, and inference of the model parameters. Per Bayes' theorem, the posterior probability, P(H|D), of a hypothesis, H, given the data D, is equivalent to the product of its prior probability, P(H), and likelihood, L(H|D), divided by a normalization constant, P(D). In geophysics, the hypothesis, H, often constitutes a description (parameterization) of the subsurface for some entity of interest (e.g. porosity, moisture content). The normalization constant, P(D), is not required for inference of the subsurface structure, yet it is of great value for model selection. Unfortunately, it is not particularly easy to estimate P(D) in practice. Here, I will introduce the various building blocks of a general purpose method which provides robust and unbiased estimates of the evidence, P(D). This method uses multi-dimensional numerical integration of the posterior (parameter) distribution. I will then illustrate this new estimator by application to three competing subsurface models (hypotheses) using GPR travel time data from the South Oyster Bacterial Transport Site in Virginia, USA. The three subsurface models differ in their treatment of the porosity distribution and use (a) horizontal layering with fixed layer thicknesses, (b) vertical layering with fixed layer thicknesses and (c) a multi-Gaussian field. The results of the new estimator are compared against the brute force Monte Carlo method and the Laplace-Metropolis method.
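
    For context, the baseline against which the new estimator is compared, brute-force Monte Carlo integration of the evidence, simply averages the likelihood over draws from the prior, P(D) = E_prior[L(H|D)]. A minimal sketch (function names and the toy model are illustrative, not from the work described):

      import numpy as np

      def log_evidence_brute_force(log_likelihood, sample_prior, n=100_000, seed=0):
          # Simple Monte Carlo estimate of log P(D) = log E_prior[ L(H|D) ].
          rng = np.random.default_rng(seed)
          theta = sample_prior(n, rng)                     # n draws from the prior P(H)
          logl = np.array([log_likelihood(t) for t in theta])
          m = logl.max()                                   # log-sum-exp trick for stability
          return m + np.log(np.exp(logl - m).mean())

      # Toy check: standard-normal prior, Gaussian likelihood for one datum y = 0.3
      y, sigma = 0.3, 0.5
      loglike = lambda th: -0.5 * ((y - th) / sigma) ** 2 - np.log(sigma * np.sqrt(2 * np.pi))
      prior = lambda n, rng: rng.standard_normal(n)
      print(log_evidence_brute_force(loglike, prior))      # analytic value is about -1.07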

  10. An integrative method for testing form–function linkages and reconstructed evolutionary pathways of masticatory specialization

    PubMed Central

    Tseng, Z. Jack; Flynn, John J.

    2015-01-01

    Morphology serves as a ubiquitous proxy in macroevolutionary studies to identify potential adaptive processes and patterns. Inferences of functional significance of phenotypes or their evolution are overwhelmingly based on data from living taxa. Yet, correspondence between form and function has been tested in only a few model species, and those linkages are highly complex. The lack of explicit methodologies to integrate form and function analyses within a deep-time and phylogenetic context weakens inferences of adaptive morphological evolution, by invoking but not testing form–function linkages. Here, we provide a novel approach to test mechanical properties at reconstructed ancestral nodes/taxa and the strength and direction of evolutionary pathways in feeding biomechanics, in a case study of carnivorous mammals. Using biomechanical profile comparisons that provide functional signals for the separation of feeding morphologies, we demonstrate, using experimental optimization criteria on estimation of strength and direction of functional changes on a phylogeny, that convergence in mechanical properties and degree of evolutionary optimization can be decoupled. This integrative approach is broadly applicable to other clades, by using quantitative data and model-based tests to evaluate interpretations of function from morphology and functional explanations for observed macroevolutionary pathways. PMID:25994295

  11. Degree of Ice Particle Surface Roughness Inferred from Polarimetric Observations

    NASA Technical Reports Server (NTRS)

    Hioki, Souichiro; Yang, Ping; Baum, Bryan A.; Platnick, Steven; Meyer, Kerry G.; King, Michael D.; Riedi, Jerome

    2016-01-01

    The degree of surface roughness of ice particles within thick, cold ice clouds is inferred from multidirectional, multi-spectral satellite polarimetric observations over oceans, assuming a column-aggregate particle habit. An improved roughness inference scheme is employed that provides a more noise-resilient roughness estimate than the conventional best-fit approach. The improvements include the introduction of a quantitative roughness parameter based on empirical orthogonal function analysis and proper treatment of polarization due to atmospheric scattering above clouds. A global 1-month data sample supports the use of a severely roughened ice habit to simulate the polarized reflectivity associated with ice clouds over ocean. The density distribution of the roughness parameter inferred from the global 1-month data sample and further analyses of a few case studies demonstrate the significant variability of ice cloud single-scattering properties. However, the present theoretical results do not agree with observations in the tropics. In the extra-tropics, the roughness parameter is inferred but 74% of the sample is out of the expected parameter range. Potential improvements are discussed to enhance the depiction of the natural variability on a global scale.

  12. Impact of Bayesian Priors on the Characterization of Binary Black Hole Coalescences

    NASA Astrophysics Data System (ADS)

    Vitale, Salvatore; Gerosa, Davide; Haster, Carl-Johan; Chatziioannou, Katerina; Zimmerman, Aaron

    2017-12-01

    In a regime where data are only mildly informative, prior choices can play a significant role in Bayesian statistical inference, potentially affecting the inferred physics. We show this is indeed the case for some of the parameters inferred from current gravitational-wave measurements of binary black hole coalescences. We reanalyze the first detections performed by the twin LIGO interferometers using alternative (and astrophysically motivated) prior assumptions. We find different prior distributions can introduce deviations in the resulting posteriors that impact the physical interpretation of these systems. For instance, (i) limits on the 90% credible interval on the effective black hole spin χ_eff are subject to variations of ∼10% if a prior with black hole spins mostly aligned to the binary's angular momentum is considered instead of the standard choice of isotropic spin directions, and (ii) under priors motivated by the initial stellar mass function, we infer tighter constraints on the black hole masses, and in particular, we find no support for any of the inferred masses within the putative mass gap M ≲ 5 M_⊙.

  13. Impact of Bayesian Priors on the Characterization of Binary Black Hole Coalescences.

    PubMed

    Vitale, Salvatore; Gerosa, Davide; Haster, Carl-Johan; Chatziioannou, Katerina; Zimmerman, Aaron

    2017-12-22

    In a regime where data are only mildly informative, prior choices can play a significant role in Bayesian statistical inference, potentially affecting the inferred physics. We show this is indeed the case for some of the parameters inferred from current gravitational-wave measurements of binary black hole coalescences. We reanalyze the first detections performed by the twin LIGO interferometers using alternative (and astrophysically motivated) prior assumptions. We find different prior distributions can introduce deviations in the resulting posteriors that impact the physical interpretation of these systems. For instance, (i) limits on the 90% credible interval on the effective black hole spin χ_{eff} are subject to variations of ∼10% if a prior with black hole spins mostly aligned to the binary's angular momentum is considered instead of the standard choice of isotropic spin directions, and (ii) under priors motivated by the initial stellar mass function, we infer tighter constraints on the black hole masses, and in particular, we find no support for any of the inferred masses within the putative mass gap M≲5  M_{⊙}.

  14. Skin integrated with perfusable vascular channels on a chip.

    PubMed

    Mori, Nobuhito; Morimoto, Yuya; Takeuchi, Shoji

    2017-02-01

    This paper describes a method for fabricating perfusable vascular channels coated with endothelial cells within a cultured skin-equivalent by fixing it to a culture device connected to an external pump and tubes. A histological analysis showed that vascular channels were constructed in the skin-equivalent, which showed a conventional dermal/epidermal morphology, and the endothelial cells formed tight junctions on the vascular channel wall. The barrier function of the skin-equivalent was also confirmed. Cell distribution analysis indicated that the vascular channels supplied nutrition to the skin-equivalent. Moreover, the feasibility of a skin-equivalent containing vascular channels as a model for studying vascular absorption was demonstrated by measuring test molecule permeation from the epidermal layer into the vascular channels. The results suggested that this skin-equivalent can be used for skin-on-a-chip applications including drug development, cosmetics testing, and studying skin biology. Copyright © 2016 Elsevier Ltd. All rights reserved.

  15. Hawking radiation, Unruh radiation, and the equivalence principle.

    PubMed

    Singleton, Douglas; Wilburn, Steve

    2011-08-19

    We compare the response function of an Unruh-DeWitt detector for different space-times and different vacua and show that there is a detailed violation of the equivalence principle. In particular comparing the response of an accelerating detector to a detector at rest in a Schwarzschild space-time we find that both detectors register thermal radiation, but for a given, equivalent acceleration the fixed detector in the Schwarzschild space-time measures a higher temperature. This allows one to locally distinguish the two cases. As one approaches the horizon the two temperatures have the same limit so that the equivalence principle is restored at the horizon. © 2011 American Physical Society
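
    For reference, the two standard temperatures being compared are the Unruh temperature for a detector with proper acceleration a and the Hawking temperature of a Schwarzschild black hole of mass M (textbook expressions with constants restored, not results of the paper):

      $$ T_U = \frac{\hbar a}{2\pi c k_B}, \qquad T_H = \frac{\hbar c^3}{8\pi G M k_B} $$

    The comparison in the abstract is made at equal proper acceleration, with the static Schwarzschild detector registering the higher temperature away from the horizon and agreement recovered in the near-horizon limit.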

  16. Generalized serial search code acquisition - The equivalent circular state diagram approach

    NASA Technical Reports Server (NTRS)

    Polydoros, A.; Simon, M. K.

    1984-01-01

    A transform-domain method for deriving the generating function of the acquisition process resulting from an arbitrary serial search strategy is presented. The method relies on equivalent circular state diagrams, uses Mason's formula from flow-graph theory, and employs a minimum number of required parameters. The transform-domain approach is briefly described and the concept of equivalent circular state diagrams is introduced and exploited to derive the generating function and resulting mean acquisition time for three particular cases of interest, the continuous/center Z search, the broken/center Z search, and the expanding window search. An optimization of the latter technique is performed whereby the number of partial windows which minimizes the mean acquisition time is determined. The numerical results satisfy certain intuitive predictions and provide useful design guidelines for such systems.

  17. Comparison of physically- and economically-based CO2-equivalences for methane

    NASA Astrophysics Data System (ADS)

    Boucher, O.

    2012-01-01

    There is a controversy on the role methane (and other short-lived species) should play in climate mitigation policies and no consensus on what an optimal methane CO2-equivalence should be. We revisit this question by discussing the relative merits of physically-based (i.e. Global Warming Potential or GWP and Global Temperature change Potential or GTP) and socio-economically-based climate metrics. To this effect we use a simplified Global Damage Potential (GDP) that was introduced by earlier authors and investigate the uncertainties in the methane CO2-equivalence that arise from physical and socio-economic factors. The median value of the methane GDP comes out very close to the widely used methane 100-year GWP because of various compensating effects. However there is a large spread in possible methane CO2-equivalences (1-99% interval: 10.0-42.5; 5-95% interval: 12.5-38.0) that is essentially due to the choice in some socio-economic parameters (i.e. the damage cost function and the discount rate). The methane 100-year GTP falls outside these ranges. It is legitimate to increase the methane CO2-equivalence in the future as global warming unfolds. While changes in biogeochemical cycles and radiative efficiencies cause some small changes to physically-based metrics, a systematic increase in the methane CO2-equivalence can only be achieved by some ad-hoc shortening of the time horizon. In contrast using a convex damage cost function provides a natural increase in the methane CO2-equivalence for the socio-economically-based metrics. We also show that a methane CO2-equivalence based on a pulse emission is sufficient to inform multi-year climate policies and emissions reductions as long as there is some degree of visibility on CO2 prices and CO2-equivalences.
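
    For orientation, the physically based metric referred to above, the widely used 100-year GWP, is conventionally defined as the ratio of time-integrated radiative forcing from a pulse emission of methane to that from an equal mass of CO2 over a time horizon H (standard definition, included here only for context):

      $$ {\rm GWP}_{\rm CH_4}(H) = \frac{\int_0^H {\rm RF}_{\rm CH_4}(t)\, dt}{\int_0^H {\rm RF}_{\rm CO_2}(t)\, dt}, \qquad H = 100\ {\rm yr} $$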

  18. Quasars Probing Quasars. VII. The Pinnacle of the Cool Circumgalactic Medium Surrounds Massive z ~ 2 Galaxies

    NASA Astrophysics Data System (ADS)

    Prochaska, J. Xavier; Lau, Marie Wingyee; Hennawi, Joseph F.

    2014-12-01

    We survey the incidence and absorption strength of the metal-line transitions C II 1334 and C IV 1548 from the circumgalactic medium (CGM) surrounding z ~ 2 quasars, which act as signposts for massive dark matter halos with M_halo ≈ 10^12.5 M_⊙. On scales of the virial radius (r_vir ≈ 160 kpc), we measure a high covering fraction f_C = 0.73 ± 0.10 for strong C II 1334 absorption (rest equivalent width W_1334 ≥ 0.2 Å), implying a massive reservoir of cool (T ~ 10^4 K) metal-enriched gas. We conservatively estimate a metal mass exceeding 10^8 M_⊙. We propose that these metals trace enrichment of the incipient intragroup/intracluster medium that these halos eventually inhabit. This cool CGM around quasars is the pinnacle among galaxies observed at all epochs, as regards the covering fraction and average equivalent width of H I Lyα and low-ion metal absorption. We argue that the properties of this cool CGM primarily reflect the halo mass, and that other factors such as feedback, star-formation rate, and accretion from the intergalactic medium are secondary. We further estimate that the CGM of massive, z ~ 2 galaxies accounts for the majority of strong Mg II absorption along random quasar sightlines. Last, we detect an excess of strong C IV 1548 absorption (W_1548 ≥ 0.3 Å) over random incidence out to a physical impact parameter of 1 Mpc and measure the quasar-C IV cross-correlation function ξ_CIV-Q(r) = (r/r_0)^(-γ) with r_0 = 7.5^{+2.8}_{-1.4} h^{-1} Mpc and γ = 1.7^{+0.1}_{-0.2}. Consistent with previous work on larger scales, we infer that this highly ionized C IV gas traces massive (10^12 M_⊙) halos.

  19. Analytic continuation of quantum Monte Carlo data by stochastic analytical inference.

    PubMed

    Fuchs, Sebastian; Pruschke, Thomas; Jarrell, Mark

    2010-05-01

    We present an algorithm for the analytic continuation of imaginary-time quantum Monte Carlo data which is strictly based on principles of Bayesian statistical inference. Within this framework we are able to obtain an explicit expression for the calculation of a weighted average over possible energy spectra, which can be evaluated by standard Monte Carlo simulations, yielding as a by-product the distribution function as a function of the regularization parameter. Our algorithm thus avoids the usual ad hoc assumptions introduced in similar algorithms to fix the regularization parameter. We apply the algorithm to imaginary-time quantum Monte Carlo data and compare the resulting energy spectra with those from a standard maximum-entropy calculation.

  20. Tissue Equivalents Based on Cell-Seeded Biodegradable Microfluidic Constructs

    PubMed Central

    Borenstein, Jeffrey T.; Megley, Katie; Wall, Kimberly; Pritchard, Eleanor M.; Truong, David; Kaplan, David L.; Tao, Sarah L.; Herman, Ira M.

    2010-01-01

    One of the principal challenges in the field of tissue engineering and regenerative medicine is the formation of functional microvascular networks capable of sustaining tissue constructs. Complex tissues and vital organs require a means to support oxygen and nutrient transport during the development of constructs both prior to and after host integration, and current approaches have not demonstrated robust solutions to this challenge. Here, we present a technology platform encompassing the design, construction, cell seeding and functional evaluation of tissue equivalents for wound healing and other clinical applications. These tissue equivalents are comprised of biodegradable microfluidic scaffolds lined with microvascular cells and designed to replicate microenvironmental cues necessary to generate and sustain cell populations to replace dermal and/or epidermal tissues lost due to trauma or disease. Initial results demonstrate that these biodegradable microfluidic devices promote cell adherence and support basic cell functions. These systems represent a promising pathway towards highly integrated three-dimensional engineered tissue constructs for a wide range of clinical applications.

  1. Functional Equivalence of Spatial Images from Touch and Vision: Evidence from Spatial Updating in Blind and Sighted Individuals

    PubMed Central

    Giudice, Nicholas A.; Betty, Maryann R.; Loomis, Jack M.

    2012-01-01

    This research examines whether visual and haptic map learning yield functionally equivalent spatial images in working memory, as evidenced by similar encoding bias and updating performance. In three experiments, participants learned four-point routes either by seeing or feeling the maps. At test, blindfolded participants made spatial judgments about the maps from imagined perspectives that were either aligned or misaligned with the maps as represented in working memory. Results from Experiments 1 and 2 revealed a highly similar pattern of latencies and errors between visual and haptic conditions. These findings extend the well known alignment biases for visual map learning to haptic map learning, provide further evidence of haptic updating, and most importantly, show that learning from the two modalities yields very similar performance across all conditions. Experiment 3 found the same encoding biases and updating performance with blind individuals, demonstrating that functional equivalence cannot be due to visual recoding and is consistent with an amodal hypothesis of spatial images. PMID:21299331

  2. Understanding advanced theory of mind and empathy in high-functioning adults with autism spectrum disorder.

    PubMed

    Mathersul, Danielle; McDonald, Skye; Rushby, Jacqueline A

    2013-01-01

    It has been argued that higher functioning individuals with autism spectrum disorders (ASDs) have specific deficits in advanced but not simple theory of mind (ToM), yet the questionable ecological validity of some tasks reduces the strength of this assumption. The present study employed The Awareness of Social Inference Test (TASIT), which uses video vignettes to assess comprehension of subtle conversational inferences (sarcasm, lies/deception). Given the proposed relationships between advanced ToM and cognitive and affective empathy, these associations were also investigated. As expected, the high-functioning adults with ASDs demonstrated specific deficits in comprehending the beliefs, intentions, and meaning of nonliteral expressions. They also had significantly lower cognitive and affective empathy. Cognitive empathy was related to ToM and group membership whereas affective empathy was only related to group membership.

  3. 78 FR 21633 - International Mail Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-11

    ... of United States Postal Service Filing of a Functionally Equivalent International Business Reply...); Attachment 3--a copy of Governors' Decision No. 08-24; and Attachment 4--an application for non-public... equivalent to the baseline agreement filed in Docket No. CP2011-59 because it shares similar cost and market...

  4. 78 FR 21632 - International Mail Product

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-11

    ... of United States Postal Service Filing of a Functionally Equivalent International Business Reply...' Decision No. 08-24; and Attachment 4--an application for non-public treatment of materials filed under seal... equivalent to the baseline agreement filed in Docket No. CP2011-59 because it shares similar cost and market...

  5. Seeing through Symbols: The Case of Equivalent Expressions.

    ERIC Educational Resources Information Center

    Kieran, Carolyn; Sfard, Anna

    1999-01-01

    Presents a teaching experiment to turn students from external observers into active participants in a game of algebra learning where students use graphs to build meaning for equivalence of algebraic expressions. Concludes that the graphic-functional approach seems to make the introduction to algebra much more meaningful for the learner. (ASK)

  6. Inferring distinct mechanisms in the absence of subjective differences: Placebo and centrally acting analgesic underlie unique brain adaptations.

    PubMed

    Tétreault, Pascal; Baliki, Marwan N; Baria, Alexis T; Bauer, William R; Schnitzer, Thomas J; Apkarian, A Vania

    2018-05-01

    Development and maintenance of chronic pain is associated with structural and functional brain reorganization. However, few studies have explored the impact of drug treatments on such changes. The extent to which long-term analgesia is related to brain adaptations and its effects on the reversibility of brain reorganization remain unclear. In a randomized placebo-controlled clinical trial, we contrasted pain relief (3-month treatment period), and anatomical (gray matter density [GMD], assessed by voxel-based morphometry) and functional connectivity (resting state fMRI nodal degree count [DC]) adaptations, in 39 knee osteoarthritis (OA) patients (22 females), randomized to duloxetine (DLX, 60 mg once daily) or placebo. Pain relief was equivalent between treatment types. However, distinct circuitry (GMD and DC) could explain pain relief in each group: up to 85% of variance for placebo analgesia and 49% of variance for DLX analgesia. No behavioral measures (collected at entry into the study) could independently explain observed analgesia. Identified circuitry were outside of nociceptive circuitry and minimally overlapped with OA-abnormal or placebo response predictive brain regions. Mediation analysis revealed that changes in GMD and DC can influence each other across remote brain regions to explain observed analgesia. Therefore, we can conclude that distinct brain mechanisms underlie DLX and placebo analgesia in OA. The results demonstrate that even in the absence of differences in subjective pain relief, pharmacological treatments can be differentiated from placebo based on objective brain biomarkers. This is a crucial step to untangling mechanisms and advancing personalized therapy approaches for chronic pain. © 2018 Wiley Periodicals, Inc.

  7. Evaluation of Second-Level Inference in fMRI Analysis

    PubMed Central

    Roels, Sanne P.; Loeys, Tom; Moerkerke, Beatrijs

    2016-01-01

    We investigate the impact of decisions in the second-level (i.e., over subjects) inferential process in functional magnetic resonance imaging on (1) the balance between false positives and false negatives and on (2) the data-analytical stability, both proxies for the reproducibility of results. Second-level analysis based on a mass univariate approach typically consists of 3 phases. First, one proceeds via a general linear model for a test image that consists of pooled information from different subjects. We evaluate models that take into account first-level (within-subjects) variability and models that do not take into account this variability. Second, one proceeds via inference based on parametrical assumptions or via permutation-based inference. Third, we evaluate 3 commonly used procedures to address the multiple testing problem: familywise error rate correction, False Discovery Rate (FDR) correction, and a two-step procedure with minimal cluster size. Based on a simulation study and real data we find that the two-step procedure with minimal cluster size results in most stable results, followed by the familywise error rate correction. The FDR results in most variable results, for both permutation-based inference and parametrical inference. Modeling the subject-specific variability yields a better balance between false positives and false negatives when using parametric inference. PMID:26819578
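
    Of the three multiple-testing procedures compared above, the FDR correction is typically implemented with the Benjamini-Hochberg step-up rule. A minimal sketch of that standard rule applied to a vector of voxel-wise p-values (illustrative only; not the authors' pipeline):

      import numpy as np

      def benjamini_hochberg(pvals, q=0.05):
          # Benjamini-Hochberg step-up: find the largest rank i with p_(i) <= (i/m)*q
          # and reject all hypotheses with p-values up to and including p_(i).
          p = np.asarray(pvals, dtype=float)
          m = p.size
          order = np.argsort(p)                        # indices of p sorted ascending
          thresholds = q * np.arange(1, m + 1) / m
          passed = p[order] <= thresholds
          rejected = np.zeros(m, dtype=bool)
          if passed.any():
              k = np.nonzero(passed)[0].max()          # last sorted position passing the bound
              rejected[order[: k + 1]] = True
          return rejected                              # boolean mask over the input tests

      # Example: five p-values, FDR controlled at q = 0.05
      print(benjamini_hochberg([0.001, 0.008, 0.039, 0.041, 0.60]))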

  8. Reward inference by primate prefrontal and striatal neurons.

    PubMed

    Pan, Xiaochuan; Fan, Hongwei; Sawa, Kosuke; Tsuda, Ichiro; Tsukada, Minoru; Sakagami, Masamichi

    2014-01-22

    The brain contains multiple yet distinct systems involved in reward prediction. To understand the nature of these processes, we recorded single-unit activity from the lateral prefrontal cortex (LPFC) and the striatum in monkeys performing a reward inference task using an asymmetric reward schedule. We found that neurons both in the LPFC and in the striatum predicted reward values for stimuli that had been previously well experienced with set reward quantities in the asymmetric reward task. Importantly, these LPFC neurons could predict the reward value of a stimulus using transitive inference even when the monkeys had not yet learned the stimulus-reward association directly; whereas these striatal neurons did not show such an ability. Nevertheless, because there were two set amounts of reward (large and small), the selected striatal neurons were able to exclusively infer the reward value (e.g., large) of one novel stimulus from a pair after directly experiencing the alternative stimulus with the other reward value (e.g., small). Our results suggest that although neurons that predict reward value for old stimuli in the LPFC could also do so for new stimuli via transitive inference, those in the striatum could only predict reward for new stimuli via exclusive inference. Moreover, the striatum showed more complex functions than was surmised previously for model-free learning.

  9. Reward Inference by Primate Prefrontal and Striatal Neurons

    PubMed Central

    Pan, Xiaochuan; Fan, Hongwei; Sawa, Kosuke; Tsuda, Ichiro; Tsukada, Minoru

    2014-01-01

    The brain contains multiple yet distinct systems involved in reward prediction. To understand the nature of these processes, we recorded single-unit activity from the lateral prefrontal cortex (LPFC) and the striatum in monkeys performing a reward inference task using an asymmetric reward schedule. We found that neurons both in the LPFC and in the striatum predicted reward values for stimuli that had been previously well experienced with set reward quantities in the asymmetric reward task. Importantly, these LPFC neurons could predict the reward value of a stimulus using transitive inference even when the monkeys had not yet learned the stimulus–reward association directly; whereas these striatal neurons did not show such an ability. Nevertheless, because there were two set amounts of reward (large and small), the selected striatal neurons were able to exclusively infer the reward value (e.g., large) of one novel stimulus from a pair after directly experiencing the alternative stimulus with the other reward value (e.g., small). Our results suggest that although neurons that predict reward value for old stimuli in the LPFC could also do so for new stimuli via transitive inference, those in the striatum could only predict reward for new stimuli via exclusive inference. Moreover, the striatum showed more complex functions than was surmised previously for model-free learning. PMID:24453328

  10. Derivative expansion of wave function equivalent potentials

    NASA Astrophysics Data System (ADS)

    Sugiura, Takuya; Ishii, Noriyoshi; Oka, Makoto

    2017-04-01

    Properties of the wave function equivalent potentials introduced by the HAL QCD collaboration are studied in a nonrelativistic coupled-channel model. The derivative expansion is generalized, and then applied to the energy-independent and nonlocal potentials. The expansion coefficients are determined from analytic solutions to the Nambu-Bethe-Salpeter wave functions. The scattering phase shifts computed from these potentials are compared with the exact values to examine the convergence of the expansion. It is confirmed that the generalized derivative expansion converges in terms of the scattering phase shift rather than the functional structure of the non-local potentials. It is also found that the convergence can be improved by tuning either the choice of interpolating fields or expansion scale in the generalized derivative expansion.

  11. Creators' Intentions Bias Judgments of Function Independently from Causal Inferences

    ERIC Educational Resources Information Center

    Chaigneau, Sergio E.; Castillo, Ramon D.; Martinez, Luis

    2008-01-01

    Participants learned about novel artifacts that were created for function X, but later used for function Y. When asked to rate the extent to which X and Y were a given artifact's function, participants consistently rated X higher than Y. In Experiments 1 and 2, participants were also asked to rate artifacts' efficiency to perform X and Y. This…

  12. Differences between selection on sex versus recombination in red queen models with diploid hosts.

    PubMed

    Agrawal, Aneil F

    2009-08-01

    The Red Queen hypothesis argues that parasites generate selection for genetic mixing (sex and recombination) in their hosts. A number of recent papers have examined this hypothesis using models with haploid hosts. In these haploid models, sex and recombination are selectively equivalent. However, sex and recombination are not equivalent in diploids because selection on sex depends on the consequences of segregation as well as recombination. Here I compare how parasites select on modifiers of sexual reproduction and modifiers of recombination rate. Across a wide set of parameters, parasites tend to select against both sex and recombination, though recombination is favored more often than is sex. There is little correspondence between the conditions favoring sex and those favoring recombination, indicating that the direction of selection on sex is often determined by the effects of segregation, not recombination. Moreover, when sex was favored it is usually due to a long-term advantage whereas short-term effects are often responsible for selection favoring recombination. These results strongly indicate that Red Queen models focusing exclusively on the effects of recombination cannot be used to infer the type of selection on sex that is generated by parasites on diploid hosts.

  13. Regulatory Considerations of Bioequivalence Studies for Oral Solid Dosage Forms in Japan.

    PubMed

    Kuribayashi, Ryosuke; Takishita, Tomoko; Mikami, Kenichi

    2016-08-01

    Bioequivalence (BE) studies are used to infer the therapeutic equivalence of generic drug products to original drug products throughout the world. In BE studies, bioavailability (BA) should be compared between the original and generic drug products, with BA defined as the rate and extent of absorption of active pharmaceutical ingredients or active metabolites from a product into the systemic circulation. For most of BE studies conducted during generic drug development, BA comparisons are performed in single-dose studies. In Japan, the revised "Guideline for Bioequivalence Studies of Generic Products" was made available in 2012 by the Ministry of Health, Labour, and Welfare, and generic drug development is currently conducted based on this guideline. Similarly, the U.S. Food and Drug Administration and European Medicines Agency have published guidance and guideline on generic drug development. This article introduces the guideline on Japanese BE studies for oral solid dosage forms and the dissolution tests for the similarity and equivalence evaluation between the original and generic drug products. Additionally, we discuss some of the similarities and differences in guideline between Japan, the United States, and the European Union. Copyright © 2016 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  14. Tevatron Run II Combination of the Effective Leptonic Electroweak Mixing Angle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Aaltonen, Timo Antero; et al.

    Drell-Yan lepton pairs produced in the process $$p\bar{p} \rightarrow \ell^+\ell^- + X$$ through an intermediate $$\gamma^*/Z$$ boson have an asymmetry in their angular distribution related to the spontaneous symmetry breaking of the electroweak force and the associated mixing of its neutral gauge bosons. The CDF and D0 experiments have measured the effective leptonic electroweak mixing parameter $$\sin^2\theta^{\rm lept}_{\rm eff}$$ using electron and muon pairs selected from the full Tevatron proton-antiproton data sets collected in 2001-2011, corresponding to 9-10 fb$$^{-1}$$ of integrated luminosity. The combination of these measurements yields the most precise result from hadron colliders, $$\sin^2\theta^{\rm lept}_{\rm eff} = 0.23148 \pm 0.00033$$. This result is consistent with, and approaches in precision, the best measurements from electron-positron colliders. The standard model inference of the on-shell electroweak mixing parameter $$\sin^2\theta_W$$, or equivalently the $W$-boson mass $$M_W$$, using the ZFITTER software package yields $$\sin^2\theta_W = 0.22324 \pm 0.00033$$ or equivalently, $$M_W = 80.367 \pm 0.017 \;{\rm GeV}/c^2$$.
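
    The equivalence between sin^2 θ_W and M_W invoked in the last sentence is the on-shell definition of the weak mixing angle (a standard relation, not specific to this analysis):

      $$ \sin^2\theta_W \equiv 1 - \frac{M_W^2}{M_Z^2} $$

    so with M_Z ≃ 91.19 GeV/c^2, the quoted sin^2 θ_W = 0.22324 and M_W = 80.367 GeV/c^2 are two expressions of the same constraint.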

  15. Measuring the distance between multiple sequence alignments.

    PubMed

    Blackburne, Benjamin P; Whelan, Simon

    2012-02-15

    Multiple sequence alignment (MSA) is a core method in bioinformatics. The accuracy of such alignments may influence the success of downstream analyses such as phylogenetic inference, protein structure prediction, and functional prediction. The importance of MSA has led to the proliferation of MSA methods, with different objective functions and heuristics to search for the optimal MSA. Different methods of inferring MSAs produce different results in all but the most trivial cases. By measuring the differences between inferred alignments, we may be able to develop an understanding of how these differences (i) relate to the objective functions and heuristics used in MSA methods, and (ii) affect downstream analyses. We introduce four metrics to compare MSAs, which include the position in a sequence where a gap occurs or the location on a phylogenetic tree where an insertion or deletion (indel) event occurs. We use both real and synthetic data to explore the information given by these metrics and demonstrate how the different metrics in combination can yield more information about MSA methods and the differences between them. MetAl is a free software implementation of these metrics in Haskell. Source and binaries for Windows, Linux and Mac OS X are available from http://kumiho.smith.man.ac.uk/whelan/software/metal/.

  16. Influence function for robust phylogenetic reconstructions.

    PubMed

    Bar-Hen, Avner; Mariadassou, Mahendra; Poursat, Marie-Anne; Vandenkoornhuyse, Philippe

    2008-05-01

    Based on the computation of the influence function, a tool to measure the impact of each piece of sampled data on the statistical inference of a parameter, we propose to analyze the support of the maximum-likelihood (ML) tree for each site. We provide a new tool for filtering data sets (nucleotides, amino acids, and others) in the context of ML phylogenetic reconstructions. Because different sites support different phylogenic topologies in different ways, outlier sites, that is, sites with a very negative influence value, are important: they can drastically change the topology resulting from the statistical inference. Therefore, these outlier sites must be clearly identified and their effects accounted for before drawing biological conclusions from the inferred tree. A matrix containing 158 fungal terminals all belonging to Chytridiomycota, Zygomycota, and Glomeromycota is analyzed. We show that removing the strongest outlier from the analysis strikingly modifies the ML topology, with a loss of as many as 20% of the internal nodes. As a result, estimating the topology on the filtered data set results in a topology with enhanced bootstrap support. From this analysis, the polyphyletic status of the fungal phyla Chytridiomycota and Zygomycota is reinforced, suggesting the necessity of revisiting the systematics of these fungal groups. We show the ability of influence function to produce new evolution hypotheses.
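
    The influence function referred to above has the standard definition from robust statistics (included for context; the per-site usage in the paper applies this idea to each alignment site's contribution to the inference, as described above):

      $$ {\rm IF}(x;\, T, F) = \lim_{\varepsilon \to 0} \frac{T\big((1-\varepsilon)F + \varepsilon\,\delta_x\big) - T(F)}{\varepsilon} $$

    where T is the statistical functional being estimated (here tied to the ML tree), F the data-generating distribution, and δ_x a point mass at observation x; sites with strongly negative influence values are the outliers whose removal is shown to reshape the topology.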

  17. Functional mechanisms of probabilistic inference in feature- and space-based attentional systems.

    PubMed

    Dombert, Pascasie L; Kuhns, Anna; Mengotti, Paola; Fink, Gereon R; Vossel, Simone

    2016-11-15

    Humans flexibly attend to features or locations and these processes are influenced by the probability of sensory events. We combined computational modeling of response times with fMRI to compare the functional correlates of (re-)orienting, and the modulation by probabilistic inference in spatial and feature-based attention systems. Twenty-four volunteers performed two task versions with spatial or color cues. Percentage of cue validity changed unpredictably. A hierarchical Bayesian model was used to derive trial-wise estimates of probability-dependent attention, entering the fMRI analysis as parametric regressors. Attentional orienting activated a dorsal frontoparietal network in both tasks, without significant parametric modulation. Spatially invalid trials activated a bilateral frontoparietal network and the precuneus, while invalid feature trials activated the left intraparietal sulcus (IPS). Probability-dependent attention modulated activity in the precuneus, left posterior IPS, middle occipital gyrus, and right temporoparietal junction for spatial attention, and in the left anterior IPS for feature-based and spatial attention. These findings provide novel insights into the generality and specificity of the functional basis of attentional control. They suggest that probabilistic inference can distinctively affect each attentional subsystem, but that there is an overlap in the left IPS, which responds to both spatial and feature-based expectancy violations. Copyright © 2016 Elsevier Inc. All rights reserved.

  18. Bounded-Influence Inference in Regression.

    DTIC Science & Technology

    1984-02-01

    be viewed as generalization of the classical F-test. By means of the influence function their robustness properties are investigated and optimally...robust tests that maximize the asymptotic power within each class, under the side condition of a bounded influence function , are constructed. Finally, an

  19. The "Instructional Leader" Must Go.

    ERIC Educational Resources Information Center

    Evans, Dennis L.

    Using some dictionary definitions, one might easily infer that supervision of teaching is a managerial/administrative function closely related to evaluation and control, implying hierarchical connotations. However, Guthrie and Reed (1991) describe teacher supervision as "a function of leadership concerned with improving, enhancing, and reinforcing…

  20. Counting Craters on MOC Images: Production Functions and Other Complications

    NASA Technical Reports Server (NTRS)

    Plaut, J. J.

    2001-01-01

    New crater counts on MOC images and associated Viking Orbiter images are used to address the issue of the crater production function at Mars, and to infer aspects of resurfacing processes. Additional information is contained in the original extended abstract.
