Sample records for prior information based

  1. Sensitivity analyses for sparse-data problems—using weakly informative Bayesian priors.

    PubMed

    Hamra, Ghassan B; MacLehose, Richard F; Cole, Stephen R

    2013-03-01

    Sparse-data problems are common, and approaches are needed to evaluate the sensitivity of parameter estimates based on sparse data. We propose a Bayesian approach that uses weakly informative priors to quantify sensitivity of parameters to sparse data. The weakly informative prior is based on accumulated evidence regarding the expected magnitude of relationships using relative measures of disease association. We illustrate the use of weakly informative priors with an example of the association of lifetime alcohol consumption and head and neck cancer. When data are sparse and the observed information is weak, a weakly informative prior will shrink parameter estimates toward the prior mean. Additionally, the example shows that when data are not sparse and the observed information is not weak, a weakly informative prior is not influential. Advancements in implementation of Markov Chain Monte Carlo simulation make this sensitivity analysis easily accessible to the practicing epidemiologist.
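
The shrinkage behavior described above can be sketched with a normal-normal approximation: combine the maximum-likelihood estimate of a log odds ratio with a weakly informative normal prior by precision weighting. This is an illustrative stand-in for the authors' MCMC analysis, and all numbers are hypothetical.

```python
import numpy as np

def posterior_log_or(beta_hat, se, prior_mean=0.0, prior_sd=1.0):
    """Approximate posterior for a log odds ratio: precision-weighted
    average of the data estimate (beta_hat, se) and a weakly
    informative Normal(prior_mean, prior_sd^2) prior."""
    w_data = 1.0 / se**2         # information in the data
    w_prior = 1.0 / prior_sd**2  # information in the prior
    post_mean = (w_data * beta_hat + w_prior * prior_mean) / (w_data + w_prior)
    post_sd = (w_data + w_prior) ** -0.5
    return post_mean, post_sd

# Sparse data (wide standard error): the estimate shrinks toward the prior mean.
sparse_mean, _ = posterior_log_or(beta_hat=2.0, se=1.5)
# Abundant data (tight standard error): the prior is not influential.
rich_mean, _ = posterior_log_or(beta_hat=2.0, se=0.1)
```

With se = 1.5 the posterior mean falls well below the MLE of 2.0, while with se = 0.1 it stays essentially at 2.0, mirroring the sparse/non-sparse contrast in the abstract.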

  2. Sensitivity Analyses for Sparse-Data Problems—Using Weakly Informative Bayesian Priors

    PubMed Central

    Hamra, Ghassan B.; MacLehose, Richard F.; Cole, Stephen R.

    2013-01-01

    Sparse-data problems are common, and approaches are needed to evaluate the sensitivity of parameter estimates based on sparse data. We propose a Bayesian approach that uses weakly informative priors to quantify sensitivity of parameters to sparse data. The weakly informative prior is based on accumulated evidence regarding the expected magnitude of relationships using relative measures of disease association. We illustrate the use of weakly informative priors with an example of the association of lifetime alcohol consumption and head and neck cancer. When data are sparse and the observed information is weak, a weakly informative prior will shrink parameter estimates toward the prior mean. Additionally, the example shows that when data are not sparse and the observed information is not weak, a weakly informative prior is not influential. Advancements in implementation of Markov Chain Monte Carlo simulation make this sensitivity analysis easily accessible to the practicing epidemiologist. PMID:23337241

  3. Investigating the impact of spatial priors on the performance of model-based IVUS elastography

    PubMed Central

    Richards, M S; Doyley, M M

    2012-01-01

    This paper describes methods that provide prerequisite information for computing circumferential stress in modulus elastograms recovered from vascular tissue—information that could help cardiologists detect life-threatening plaques and predict their propensity to rupture. The modulus recovery process is an ill-posed problem; therefore, additional information is needed to provide useful elastograms. In this work, prior geometrical information was used to impose hard or soft constraints on the reconstruction process. We conducted simulation and phantom studies to evaluate and compare modulus elastograms computed with soft and hard constraints versus those computed without any prior information. The results revealed that (1) the contrast-to-noise ratio of modulus elastograms achieved using the soft prior and hard prior reconstruction methods exceeded those computed without any prior information; (2) the soft prior and hard prior reconstruction methods could tolerate up to 8% measurement noise; and (3) the performance of soft and hard prior modulus elastograms degraded when incomplete spatial priors were employed. This work demonstrates that including spatial priors in the reconstruction process should improve the performance of model-based elastography, and the soft prior approach should enhance the robustness of the reconstruction process to errors in the geometrical information. PMID:22037648

  4. Hippocampus segmentation using locally weighted prior based level set

    NASA Astrophysics Data System (ADS)

    Achuthan, Anusha; Rajeswari, Mandava

    2015-12-01

    Segmentation of the hippocampus is one of the major challenges in medical image segmentation because of its imaging characteristics: its intensity is almost identical to that of adjacent gray matter structures such as the amygdala, leaving the hippocampus with weak or fuzzy boundaries. Given this challenge, a segmentation method that relies on image information alone may not produce accurate results. Prior information, such as shape and spatial information, therefore needs to be assimilated into existing segmentation methods to produce the expected segmentation. Previous studies have widely integrated prior information into segmentation methods, but the prior information has been integrated globally, which does not reflect how delineation is actually performed in the clinic. This paper therefore presents a level set model into which prior information is integrated locally. A mean shape model provides automatic initialization for the level set evolution and is integrated into the level set model as prior information. The local integration of edge-based information and prior information is implemented through an edge weighting map that decides, at the voxel level, which information should be observed during the level set evolution; the map indicates which voxels carry sufficient edge information. Experiments show that the proposed local integration of prior information into a conventional edge-based level set model, the geodesic active contour, improves the averaged Dice coefficient by 9%.

  5. Investigating different approaches to develop informative priors in hierarchical Bayesian safety performance functions.

    PubMed

    Yu, Rongjie; Abdel-Aty, Mohamed

    2013-07-01

    The Bayesian inference method has been frequently adopted to develop safety performance functions. One advantage of the Bayesian inference is that prior information for the independent variables can be included in the inference procedures. However, there are few studies that discussed how to formulate informative priors for the independent variables and evaluated the effects of incorporating informative priors in developing safety performance functions. This paper addresses this deficiency by introducing four approaches of developing informative priors for the independent variables based on historical data and expert experience. Merits of these informative priors have been tested along with two types of Bayesian hierarchical models (Poisson-gamma and Poisson-lognormal models). Deviance information criterion (DIC), R-square values, and coefficients of variance for the estimations were utilized as evaluation measures to select the best model(s). Comparison across the models indicated that the Poisson-gamma model is superior with a better model fit and it is much more robust with the informative priors. Moreover, the two-stage Bayesian updating informative priors provided the best goodness-of-fit and coefficient estimation accuracies. Furthermore, informative priors for the inverse dispersion parameter have also been introduced and tested. The effects of the different types of informative priors on model estimation and goodness-of-fit have been compared, and conclusions drawn. Finally, based on the results, recommendations for future research topics and study applications have been made. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Determining informative priors for cognitive models.

    PubMed

    Lee, Michael D; Vanpaemel, Wolf

    2018-02-01

    The development of cognitive models involves the creative scientific formalization of assumptions, based on theory, observation, and other relevant information. In the Bayesian approach to implementing, testing, and using cognitive models, assumptions can influence both the likelihood function of the model, usually corresponding to assumptions about psychological processes, and the prior distribution over model parameters, usually corresponding to assumptions about the psychological variables that influence those processes. The specification of the prior is unique to the Bayesian context, but often raises concerns that lead to the use of vague or non-informative priors in cognitive modeling. Sometimes the concerns stem from philosophical objections, but more often practical difficulties with how priors should be determined are the stumbling block. We survey several sources of information that can help to specify priors for cognitive models, discuss some of the methods by which this information can be formalized in a prior distribution, and identify a number of benefits of including informative priors in cognitive modeling. Our discussion is based on three illustrative cognitive models, involving memory retention, categorization, and decision making.

  7. SU-E-J-71: Spatially Preserving Prior Knowledge-Based Treatment Planning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, H; Xing, L

    2015-06-15

    Purpose: Prior knowledge-based treatment planning is impeded by the use of a single dose volume histogram (DVH) curve. Critical spatial information is lost by collapsing the dose distribution into a histogram; even similar patients have geometric variations that become inaccessible in the form of a single DVH. We propose a simple prior knowledge-based planning scheme that extracts features from a prior dose distribution while still preserving the spatial information. Methods: A prior patient plan is not used as a mere starting point for a new patient; rather, stopping criteria are constructed from it. Each structure from the prior patient is partitioned into multiple shells. For instance, the PTV is partitioned into an inner, middle, and outer shell. Prior dose statistics are then extracted for each shell and translated into the appropriate Dmin and Dmax parameters for the new patient. Results: The partitioned dose information from a prior case was applied to 14 2-D prostate cases. Using the prior case yielded final DVHs comparable to manual planning, even though the DVH for the prior case differed from the DVHs for the 14 cases. Using a single DVH for the entire organ was also tested for comparison but performed much more poorly. Different ways of translating the prior dose statistics into parameters for the new patient were also tested. Conclusion: Prior knowledge-based treatment planning needs to salvage the spatial information without transforming the patients on a voxel-to-voxel basis. An efficient balance between the anatomy and dose domains is gained by partitioning the organs into multiple shells. The prior knowledge not only serves as a starting point for a new case; the information extracted from the partitioned shells is also translated into stopping criteria for the optimization problem at hand.

  8. Tree Biomass Estimation of Chinese fir (Cunninghamia lanceolata) Based on Bayesian Method

    PubMed Central

    Zhang, Jianguo

    2013-01-01

    Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) is the most important conifer species for timber production in southern China, where it has a huge distribution area. Accurate estimation of biomass is required for accounting and monitoring of Chinese forest carbon stocking. In this study, an allometric equation was used to analyze tree biomass of Chinese fir. The common methods for estimating allometric models take the classical approach based on the frequency interpretation of probability. However, many different biotic and abiotic factors introduce variability into the Chinese fir biomass model, suggesting that its parameters are better represented by probability distributions than by the fixed values of the classical method. To deal with this problem, a Bayesian method was used to estimate the Chinese fir biomass model. In the Bayesian framework, two kinds of priors were introduced: non-informative and informative. For the informative priors, 32 biomass equations of Chinese fir were collected from the published literature, and the parameter distributions from that literature were used as prior distributions in the Bayesian model. The Bayesian method with informative priors performed better than both the method with non-informative priors and the classical method, providing a reasonable approach for estimating Chinese fir biomass. PMID:24278198

  9. Tree biomass estimation of Chinese fir (Cunninghamia lanceolata) based on Bayesian method.

    PubMed

    Zhang, Xiongqing; Duan, Aiguo; Zhang, Jianguo

    2013-01-01

    Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) is the most important conifer species for timber production in southern China, where it has a huge distribution area. Accurate estimation of biomass is required for accounting and monitoring of Chinese forest carbon stocking. In this study, the allometric equation W = a(D²H)^b was used to analyze tree biomass of Chinese fir. The common methods for estimating allometric models take the classical approach based on the frequency interpretation of probability. However, many different biotic and abiotic factors introduce variability into the Chinese fir biomass model, suggesting that its parameters are better represented by probability distributions than by the fixed values of the classical method. To deal with this problem, a Bayesian method was used to estimate the Chinese fir biomass model. In the Bayesian framework, two kinds of priors were introduced: non-informative and informative. For the informative priors, 32 biomass equations of Chinese fir were collected from the published literature, and the parameter distributions from that literature were used as prior distributions in the Bayesian model. The Bayesian method with informative priors performed better than both the method with non-informative priors and the classical method, providing a reasonable approach for estimating Chinese fir biomass.
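
On the log scale the allometric model ln W = ln a + b·ln(D²H) + ε is linear, so the informative-prior fit can be sketched as conjugate Bayesian linear regression. All numbers below (prior means, prior precisions, noise level, synthetic data) are hypothetical placeholders for values that would be pooled from the 32 published equations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical allometric data: W = a * (D^2 H)^b with lognormal noise.
a_true, b_true = 0.05, 0.9
x = rng.uniform(3.0, 9.0, size=40)          # values of log(D^2 H)
log_w = np.log(a_true) + b_true * x + rng.normal(0.0, 0.3, size=40)

# Bayesian linear regression on the log scale with an informative
# Normal prior on (log a, b).  With known noise variance s2, the
# posterior mean is (X'X/s2 + P0)^-1 (X'y/s2 + P0 m0).
X = np.column_stack([np.ones_like(x), x])
s2 = 0.3**2
m0 = np.array([np.log(0.05), 0.9])          # assumed prior means
P0 = np.diag([1.0 / 0.5**2, 1.0 / 0.1**2])  # assumed prior precisions
A = X.T @ X / s2 + P0
post = np.linalg.solve(A, X.T @ log_w / s2 + P0 @ m0)
log_a_hat, b_hat = post
```

The posterior mean blends the least-squares fit with the literature-based prior; when field data are scarce, the prior dominates, which is the practical advantage the abstract reports.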

  10. Gradient-based reliability maps for ACM-based segmentation of hippocampus.

    PubMed

    Zarpalas, Dimitrios; Gkontra, Polyxeni; Daras, Petros; Maglaveras, Nicos

    2014-04-01

    Automatic segmentation of deep brain structures, such as the hippocampus (HC), in MR images has attracted considerable scientific attention due to the widespread use of MRI and to the principal role of some structures in various mental disorders. In the literature, a substantial amount of work relies on deformable models incorporating prior knowledge about structures' anatomy and shape information. However, shape priors capture global shape characteristics and thus fail to model boundaries of varying properties; HC boundaries present rich, poor, and missing gradient regions. On top of that, shape prior knowledge is blended with image information in the evolution process, through global weighting of the two terms, again neglecting the spatially varying boundary properties, causing segmentation faults. An innovative method is hereby presented that aims to achieve highly accurate HC segmentation in MR images, based on the modeling of boundary properties at each anatomical location and the inclusion of appropriate image information for each of those, within an active contour model framework. Hence, blending of image information and prior knowledge is based on a local weighting map, which mixes gradient information, regional and whole brain statistical information with a multi-atlas-based spatial distribution map of the structure's labels. Experimental results on three different datasets demonstrate the efficacy and accuracy of the proposed method.

  11. Effects of prior information on decoding degraded speech: an fMRI study.

    PubMed

    Clos, Mareike; Langner, Robert; Meyer, Martin; Oechslin, Mathias S; Zilles, Karl; Eickhoff, Simon B

    2014-01-01

    Expectations and prior knowledge are thought to support the perceptual analysis of incoming sensory stimuli, as proposed by the predictive-coding framework. The current fMRI study investigated the effect of prior information on brain activity during the decoding of degraded speech stimuli. When prior information enabled the comprehension of the degraded sentences, the left middle temporal gyrus and the left angular gyrus were activated, highlighting a role of these areas in meaning extraction. In contrast, the activation of the left inferior frontal gyrus (area 44/45) appeared to reflect the search for meaningful information in degraded speech material that could not be decoded because of mismatches with the prior information. Our results show that degraded sentences evoke instantaneously different percepts and activation patterns depending on the type of prior information, in line with prediction-based accounts of perception. Copyright © 2012 Wiley Periodicals, Inc.

  12. Integrating biological knowledge into variable selection: an empirical Bayes approach with an application in cancer biology

    PubMed Central

    2012-01-01

    Background An important question in the analysis of biochemical data is that of identifying subsets of molecular variables that may jointly influence a biological response. Statistical variable selection methods have been widely used for this purpose. In many settings, it may be important to incorporate ancillary biological information concerning the variables of interest. Pathway and network maps are one example of a source of such information. However, although ancillary information is increasingly available, it is not always clear how it should be used nor how it should be weighted in relation to primary data. Results We put forward an approach in which biological knowledge is incorporated using informative prior distributions over variable subsets, with prior information selected and weighted in an automated, objective manner using an empirical Bayes formulation. We employ continuous, linear models with interaction terms and exploit biochemically-motivated sparsity constraints to permit exact inference. We show an example of priors for pathway- and network-based information and illustrate our proposed method on both synthetic response data and by an application to cancer drug response data. Comparisons are also made to alternative Bayesian and frequentist penalised-likelihood methods for incorporating network-based information. Conclusions The empirical Bayes method proposed here can aid prior elicitation for Bayesian variable selection studies and help to guard against mis-specification of priors. Empirical Bayes, together with the proposed pathway-based priors, results in an approach with a competitive variable selection performance. In addition, the overall procedure is fast, deterministic, and has very few user-set parameters, yet is capable of capturing interplay between molecular players. The approach presented is general and readily applicable in any setting with multiple sources of biological prior knowledge. PMID:22578440

  13. Marginally specified priors for non-parametric Bayesian estimation

    PubMed Central

    Kessler, David C.; Hoff, Peter D.; Dunson, David B.

    2014-01-01

    Summary Prior specification for non-parametric Bayesian inference involves the difficult task of quantifying prior knowledge about a parameter of high, often infinite, dimension. A statistician is unlikely to have informed opinions about all aspects of such a parameter but will have real information about functionals of the parameter, such as the population mean or variance. The paper proposes a new framework for non-parametric Bayes inference in which the prior distribution for a possibly infinite dimensional parameter is decomposed into two parts: an informative prior on a finite set of functionals, and a non-parametric conditional prior for the parameter given the functionals. Such priors can be easily constructed from standard non-parametric prior distributions in common use and inherit the large support of the standard priors on which they are based. Additionally, posterior approximations under these informative priors can generally be made via minor adjustments to existing Markov chain approximation algorithms for standard non-parametric prior distributions. We illustrate the use of such priors in the context of multivariate density estimation using Dirichlet process mixture models, and in the modelling of high dimensional sparse contingency tables. PMID:25663813

  14. Identification of subsurface structures using electromagnetic data and shape priors

    NASA Astrophysics Data System (ADS)

    Tveit, Svenn; Bakr, Shaaban A.; Lien, Martha; Mannseth, Trond

    2015-03-01

    We consider the inverse problem of identifying large-scale subsurface structures using the controlled source electromagnetic method. To identify structures in the subsurface where the contrast in electric conductivity can be small, regularization is needed to bias the solution towards preserving structural information. We propose to combine two approaches for regularization of the inverse problem. In the first approach we utilize a model-based, reduced, composite representation of the electric conductivity that is highly flexible, even for a moderate number of degrees of freedom. With a low number of parameters, the inverse problem is efficiently solved using a standard, second-order gradient-based optimization algorithm. Further regularization is obtained using structural prior information, available, e.g., from interpreted seismic data. The reduced conductivity representation is suitable for incorporation of structural prior information. Such prior information cannot, however, be accurately modeled with a Gaussian distribution. To alleviate this, we incorporate the structural information using shape priors. The shape prior technique requires the choice of kernel function, which is application dependent. We argue for using the conditionally positive definite kernel, which is shown to have computational advantages over the commonly applied Gaussian kernel for our problem. Numerical experiments on various test cases show that the methodology is able to identify fairly complex subsurface electric conductivity distributions while preserving structural prior information during the inversion.

  15. [Inferential evaluation of intimacy based on observation of interpersonal communication].

    PubMed

    Kimura, Masanori

    2015-06-01

    How do people inferentially evaluate others' levels of intimacy with friends? We examined the inferential evaluation of intimacy based on the observation of interpersonal communication. In Experiment 1, participants (N = 41) responded to questions after observing conversations between friends. Results indicated that participants inferentially evaluated not only goodness of communication, but also intimacy between friends, using an expressivity heuristic approach. In Experiment 2, we investigated how inferential evaluation of intimacy was affected by prior information about relationships and by individual differences in face-to-face interactional ability. Participants (N = 64) were divided into prior- and no-prior-information groups and all performed the same task as in Experiment 1. Additionally, their interactional ability was assessed. In the prior-information group, individual differences had no effect on inferential evaluation of intimacy. On the other hand, in the no-prior-information group, face-to-face interactional ability partially influenced evaluations of intimacy. Finally, we discuss the fact that to understand one's social environment, it is important to observe others' interpersonal communications.

  16. Discovering mutated driver genes through a robust and sparse co-regularized matrix factorization framework with prior information from mRNA expression patterns and interaction network.

    PubMed

    Xi, Jianing; Wang, Minghui; Li, Ao

    2018-06-05

    Discovery of mutated driver genes is one of the primary objective for studying tumorigenesis. To discover some relatively low frequently mutated driver genes from somatic mutation data, many existing methods incorporate interaction network as prior information. However, the prior information of mRNA expression patterns are not exploited by these existing network-based methods, which is also proven to be highly informative of cancer progressions. To incorporate prior information from both interaction network and mRNA expressions, we propose a robust and sparse co-regularized nonnegative matrix factorization to discover driver genes from mutation data. Furthermore, our framework also conducts Frobenius norm regularization to overcome overfitting issue. Sparsity-inducing penalty is employed to obtain sparse scores in gene representations, of which the top scored genes are selected as driver candidates. Evaluation experiments by known benchmarking genes indicate that the performance of our method benefits from the two type of prior information. Our method also outperforms the existing network-based methods, and detect some driver genes that are not predicted by the competing methods. In summary, our proposed method can improve the performance of driver gene discovery by effectively incorporating prior information from interaction network and mRNA expression patterns into a robust and sparse co-regularized matrix factorization framework.

  17. Neural Mechanisms for Integrating Prior Knowledge and Likelihood in Value-Based Probabilistic Inference

    PubMed Central

    Ting, Chih-Chung; Yu, Chia-Chen; Maloney, Laurence T.

    2015-01-01

    In Bayesian decision theory, knowledge about the probabilities of possible outcomes is captured by a prior distribution and a likelihood function. The prior reflects past knowledge and the likelihood summarizes current sensory information. The two combined (integrated) form a posterior distribution that allows estimation of the probability of different possible outcomes. In this study, we investigated the neural mechanisms underlying Bayesian integration using a novel lottery decision task in which both prior knowledge and likelihood information about reward probability were systematically manipulated on a trial-by-trial basis. Consistent with Bayesian integration, as sample size increased, subjects tended to weigh likelihood information more compared with prior information. Using fMRI in humans, we found that the medial prefrontal cortex (mPFC) correlated with the mean of the posterior distribution, a statistic that reflects the integration of prior knowledge and likelihood of reward probability. Subsequent analysis revealed that both prior and likelihood information were represented in mPFC and that the neural representations of prior and likelihood in mPFC reflected changes in the behaviorally estimated weights assigned to these different sources of information in response to changes in the environment. Together, these results establish the role of mPFC in prior-likelihood integration and highlight its involvement in representing and integrating these distinct sources of information. PMID:25632152
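
The weighting of likelihood against prior as sample size grows can be illustrated with the conjugate Beta-Binomial update, a textbook analogue of the lottery task above (the prior and the observed counts are made up for illustration):

```python
from fractions import Fraction

def beta_posterior_mean(a, b, successes, n):
    """Posterior mean of a reward probability under a Beta(a, b) prior
    after observing `successes` wins in `n` draws (conjugate
    Beta-Binomial update)."""
    return Fraction(a + successes, a + b + n)

prior_mean = Fraction(2, 2 + 6)            # Beta(2, 6) prior: mean 1/4
small = beta_posterior_mean(2, 6, 8, 10)   # few samples observed
large = beta_posterior_mean(2, 6, 80, 100) # many samples observed
# As sample size grows, the posterior mean moves away from the prior
# mean (1/4) toward the observed win rate (4/5).
```

This is exactly the behavioral pattern the abstract reports: with larger samples, likelihood information receives more weight than the prior.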

  18. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    NASA Astrophysics Data System (ADS)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions: the prior and the posterior, where the posterior is influenced by the choice of prior. Jeffreys' prior is a non-informative prior distribution, used when no information about the parameters is available. The non-informative Jeffreys' prior is combined with the sample information to yield the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of a multivariate regression model using the Bayesian method with the non-informative Jeffreys' prior. The estimates of β and Σ are obtained as the expected values of their marginal posterior distributions, which are multivariate normal and inverse Wishart, respectively. However, calculating these expected values involves integrals that are difficult to evaluate in closed form. Therefore, random samples are instead generated according to the posterior distribution of each parameter using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
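
A minimal univariate analogue of this Gibbs-sampling scheme (normal data with the Jeffreys prior p(mu, sigma²) ∝ 1/sigma², instead of the full multivariate normal / inverse-Wishart setting) can be sketched as follows; the data are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
y = rng.normal(5.0, 2.0, size=200)   # synthetic observations
n, ybar = len(y), y.mean()

# Gibbs sampler alternating between the two full conditionals under
# the Jeffreys prior:
#   mu      | sigma^2, y  ~  Normal(ybar, sigma^2 / n)
#   sigma^2 | mu,      y  ~  Inv-Gamma(n/2, sum((y - mu)^2) / 2)
draws_mu, draws_s2 = [], []
mu, s2 = ybar, y.var()
for it in range(2000):
    mu = rng.normal(ybar, np.sqrt(s2 / n))
    # Inv-Gamma(a, b) draw via 1 / Gamma(a, scale=1/b)
    s2 = 1.0 / rng.gamma(n / 2.0, 2.0 / np.sum((y - mu) ** 2))
    if it >= 500:                    # discard burn-in
        draws_mu.append(mu)
        draws_s2.append(s2)

mu_hat = np.mean(draws_mu)           # posterior mean estimate of mu
```

The posterior means computed from the retained draws approximate the expected values that are intractable in closed form in the multivariate case.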

  19. Bayesian Phase II optimization for time-to-event data based on historical information.

    PubMed

    Bertsche, Anja; Fleischer, Frank; Beyersmann, Jan; Nehmiz, Gerhard

    2017-01-01

    After exploratory drug development, companies face the decision whether to initiate confirmatory trials based on limited efficacy information. This proof-of-concept decision is typically performed after a Phase II trial studying a novel treatment versus either placebo or an active comparator. The article aims to optimize the design of such a proof-of-concept trial with respect to decision making. We incorporate historical information and develop pre-specified decision criteria accounting for the uncertainty of the observed treatment effect. We optimize these criteria based on sensitivity and specificity, given the historical information. Specifically, time-to-event data are considered in a randomized 2-arm trial with additional prior information on the control treatment. The proof-of-concept criterion uses treatment effect size, rather than significance. Criteria are defined on the posterior distribution of the hazard ratio given the Phase II data and the historical control information. Event times are exponentially modeled within groups, allowing for group-specific conjugate prior-to-posterior calculation. While a non-informative prior is placed on the investigational treatment, the control prior is constructed via the meta-analytic-predictive approach. The design parameters including sample size and allocation ratio are then optimized, maximizing the probability of taking the right decision. The approach is illustrated with an example in lung cancer.
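
The group-specific conjugate prior-to-posterior calculation mentioned above can be sketched with the gamma-exponential model: with exponential event times, a Gamma(a, b) prior on a hazard rate updates to Gamma(a + events, b + total exposure time). The prior parameters and trial counts below are hypothetical, and the decision criterion shown is a simplified stand-in for the paper's effect-size criteria.

```python
import numpy as np

def update_hazard(a, b, events, total_time):
    """Conjugate gamma update of an exponential hazard rate."""
    return a + events, b + total_time

# Control arm: informative (e.g. meta-analytic-predictive) prior.
a_c, b_c = update_hazard(20.0, 100.0, events=30, total_time=180.0)
# Treatment arm: nearly non-informative prior.
a_t, b_t = update_hazard(0.1, 0.1, events=18, total_time=200.0)

# Posterior probability that the hazard ratio lambda_t / lambda_c is
# below 0.8, estimated by sampling the two independent posteriors.
rng = np.random.default_rng(3)
lam_c = rng.gamma(a_c, 1.0 / b_c, 100_000)
lam_t = rng.gamma(a_t, 1.0 / b_t, 100_000)
p_effect = np.mean(lam_t / lam_c < 0.8)
```

A proof-of-concept rule could then declare success when p_effect exceeds a pre-specified threshold, and the design parameters would be tuned to maximize the probability of a correct decision.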

  20. Bayesian hierarchical functional data analysis via contaminated informative priors.

    PubMed

    Scarpa, Bruno; Dunson, David B

    2009-09-01

    A variety of flexible approaches have been proposed for functional data analysis, allowing both the mean curve and the distribution about the mean to be unknown. Such methods are most useful when there is limited prior information. Motivated by applications to modeling of temperature curves in the menstrual cycle, this article proposes a flexible approach for incorporating prior information in semiparametric Bayesian analyses of hierarchical functional data. The proposed approach is based on specifying the distribution of functions as a mixture of a parametric hierarchical model and a nonparametric contamination. The parametric component is chosen based on prior knowledge, while the contamination is characterized as a functional Dirichlet process. In the motivating application, the contamination component allows unanticipated curve shapes in unhealthy menstrual cycles. Methods are developed for posterior computation, and the approach is applied to data from a European fecundability study.

  1. Bayes factors for testing inequality constrained hypotheses: Issues with prior specification.

    PubMed

    Mulder, Joris

    2014-02-01

    Several issues are discussed when testing inequality constrained hypotheses using a Bayesian approach. First, the complexity (or size) of the inequality constrained parameter spaces can be ignored. This is the case when using the posterior probability that the inequality constraints of a hypothesis hold, Bayes factors based on non-informative improper priors, and partial Bayes factors based on posterior priors. Second, the Bayes factor may not be invariant for linear one-to-one transformations of the data. This can be observed when using balanced priors which are centred on the boundary of the constrained parameter space with a diagonal covariance structure. Third, the information paradox can be observed. When testing inequality constrained hypotheses, the information paradox occurs when the Bayes factor of an inequality constrained hypothesis against its complement converges to a constant as the evidence for the first hypothesis accumulates while keeping the sample size fixed. This paradox occurs when using Zellner's g prior as a result of too much prior shrinkage. Therefore, two new methods are proposed that avoid these issues. First, partial Bayes factors are proposed based on transformed minimal training samples. These training samples result in posterior priors that are centred on the boundary of the constrained parameter space with the same covariance structure as in the sample. Second, a g prior approach is proposed by letting g go to infinity. This is possible because the Jeffreys-Lindley paradox is not an issue when testing inequality constrained hypotheses. A simulation study indicated that the Bayes factor based on this g prior approach converges fastest to the true inequality constrained hypothesis. © 2013 The British Psychological Society.
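
    The first issue can be made concrete with the standard encompassing-prior construction used in this literature (a toy sketch, not necessarily the article's computation; the two-group setup and all numbers are hypothetical): the Bayes factor of an inequality-constrained hypothesis against the unconstrained model is the posterior fraction of draws satisfying the constraint divided by the prior fraction, so the size of the constrained region enters explicitly through the denominator.

```python
import random

random.seed(2)

def frac_satisfying(draws, constraint):
    # Proportion of Monte Carlo draws that satisfy the inequality constraint.
    return sum(constraint(d) for d in draws) / len(draws)

# Hypothetical two-group example: H1: mu1 > mu2 against the encompassing
# (unconstrained) model. Prior draws come from a diffuse symmetric prior;
# "posterior" draws are normal approximations around hypothetical estimates.
prior = [(random.gauss(0, 10), random.gauss(0, 10)) for _ in range(50000)]
post = [(random.gauss(1.2, 0.3), random.gauss(0.4, 0.3)) for _ in range(50000)]

c = lambda d: d[0] > d[1]
bf_1u = frac_satisfying(post, c) / frac_satisfying(prior, c)
print("BF(H1 vs unconstrained) ~", round(bf_1u, 2))
```

    Ignoring the prior fraction (here about one half) amounts to using only the posterior probability that the constraint holds, which is exactly the complexity-blind behavior the abstract warns about.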

  2. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    PubMed

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory. Here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects: overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
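
    One minimal way to see how a prior on probabilities produces the inverse-S weighting pattern (a toy sketch under stated assumptions, not the authors' model): treat a stated probability as if it summarized k Bernoulli trials and combine it with a symmetric Beta(a, a) prior of ignorance; the posterior-mean probability then overweights small p and underweights large p.

```python
def weight(p, a=1.0, k=8.0):
    # Treat stated probability p as if it summarized k Bernoulli trials
    # (k*p successes, k*(1-p) failures) combined with a Beta(a, a) prior.
    # Posterior mean = (successes + a) / (trials + 2a). Values of a and k
    # are hypothetical choices for illustration.
    return (k * p + a) / (k + 2 * a)

for p in (0.01, 0.1, 0.5, 0.9, 0.99):
    print(p, "->", round(weight(p), 3))
```

    With these settings, weight(0.01) exceeds 0.01 and weight(0.99) falls below 0.99, while weight(0.5) stays at 0.5, reproducing the qualitative shape of the empirical weighting function.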

  3. Automatic Bayes Factors for Testing Equality- and Inequality-Constrained Hypotheses on Variances.

    PubMed

    Böing-Messing, Florian; Mulder, Joris

    2018-05-03

    In comparing characteristics of independent populations, researchers frequently expect a certain structure of the population variances. These expectations can be formulated as hypotheses with equality and/or inequality constraints on the variances. In this article, we consider the Bayes factor for testing such (in)equality-constrained hypotheses on variances. Application of Bayes factors requires specification of a prior under every hypothesis to be tested. However, specifying subjective priors for variances based on prior information is a difficult task. We therefore consider so-called automatic or default Bayes factors. These methods avoid the need for the user to specify priors by using information from the sample data. We present three automatic Bayes factors for testing variances. The first is a Bayes factor with equal priors on all variances, where the priors are specified automatically using a small share of the information in the sample data. The second is the fractional Bayes factor, where a fraction of the likelihood is used for automatic prior specification. The third is an adjustment of the fractional Bayes factor such that the parsimony of inequality-constrained hypotheses is properly taken into account. The Bayes factors are evaluated by investigating different properties such as information consistency and large sample consistency. Based on this evaluation, it is concluded that the adjusted fractional Bayes factor is generally recommendable for testing equality- and inequality-constrained hypotheses on variances.

  4. Feasibility of Providing Web-Based Information to Breast Cancer Patients Prior to a Surgical Consult.

    PubMed

    Bruce, Jordan G; Tucholka, Jennifer L; Steffens, Nicole M; Mahoney, Jane E; Neuman, Heather B

    2017-03-30

    Patients facing decisions for breast cancer surgery commonly search the internet. Directing patients to high-quality websites prior to the surgeon consultation may be one way of supporting patients' informational needs. The objective was to test an approach for delivering web-based information to breast cancer patients. The implementation strategy was developed using the Replicating Effective Programs framework. Pilot testing measured the proportion that accepted the web-based information. A pre-consultation survey assessed whether the information was reviewed and the acceptability to stakeholders. Reasons for declining guided refinement to the implementation package. Eighty-two percent (309/377) accepted the web-based information. Of the 309 that accepted, 244 completed the pre-consultation survey. Participants were a median 59 years, white (98%), and highly educated (>50% with a college degree). Most patients who completed the questionnaire reported reviewing the website (85%), and nearly all found it helpful. Surgeons thought implementation increased visit efficiency (5/6) and would result in patients making more informed decisions (6/6). The most common reasons patients declined information were limited internet comfort or access (n = 36), emotional distress (n = 14), and preference to receive information directly from the surgeon (n = 7). Routine delivery of web-based information to breast cancer patients prior to the surgeon consultation is feasible. High stakeholder acceptability combined with the low implementation burden means that these findings have immediate relevance for improving care quality.

  5. Prospective regularization design in prior-image-based reconstruction

    NASA Astrophysics Data System (ADS)

    Dang, Hao; Siewerdsen, Jeffrey H.; Webster Stayman, J.

    2015-12-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. 
Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in phantoms where the optimal parameters vary spatially by an order of magnitude or more. In a series of studies designed to explore potential unknowns associated with accurate PIBR, optimal prior image strength was found to vary with attenuation differences associated with anatomical change but exhibited only small variations as a function of the shape and size of the change. The results suggest that, given a target change attenuation, prospective patient-, change-, and data-specific customization of the prior image strength can be performed to ensure reliable reconstruction of specific anatomical changes.

  6. A Simple Method for Estimating Informative Node Age Priors for the Fossil Calibration of Molecular Divergence Time Analyses

    PubMed Central

    Nowak, Michael D.; Smith, Andrew B.; Simpson, Carl; Zwickl, Derrick J.

    2013-01-01

    Molecular divergence time analyses often rely on the age of fossil lineages to calibrate node age estimates. Most divergence time analyses are now performed in a Bayesian framework, where fossil calibrations are incorporated as parametric prior probabilities on node ages. It is widely accepted that an ideal parameterization of such node age prior probabilities should be based on a comprehensive analysis of the fossil record of the clade of interest, but there is currently no generally applicable approach for calculating such informative priors. We provide here a simple and easily implemented method that employs fossil data to estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade, which can be used to fit an informative parametric prior probability distribution on a node age. Specifically, our method uses the extant diversity and the stratigraphic distribution of fossil lineages confidently assigned to a clade to fit a branching model of lineage diversification. Conditioning this on a simple model of fossil preservation, we estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade. The likelihood surface of missing history can then be translated into a parametric prior probability distribution on the age of the clade of interest. We show that the method performs well with simulated fossil distribution data, but that the likelihood surface of missing history can at times be too complex for the distribution-fitting algorithm employed by our software tool. An empirical example of the application of our method is performed to estimate echinoid node ages. A simulation-based sensitivity analysis using the echinoid data set shows that node age prior distributions estimated under poor preservation rates are significantly less informative than those estimated under high preservation rates. PMID:23755303

  7. 78 FR 32359 - Information Required in Prior Notice of Imported Food

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-30

    ... or animal food based on food safety reasons, such as intentional or unintentional contamination of an... DEPARTMENT OF HEALTH AND HUMAN SERVICES Food and Drug Administration 21 CFR Part 1 [Docket No. FDA-2011-N-0179] RIN 0910-AG65 Information Required in Prior Notice of Imported Food AGENCY: Food and Drug...

  8. Integrating prior information into microwave tomography part 2: Impact of errors in prior information on microwave tomography image quality.

    PubMed

    Kurrant, Douglas; Fear, Elise; Baran, Anastasia; LoVetri, Joe

    2017-12-01

    The authors have developed a method to combine a patient-specific map of tissue structure and average dielectric properties with microwave tomography. The patient-specific map is acquired with radar-based techniques and serves as prior information for microwave tomography. The impact that the degree of structural detail included in this prior information has on image quality was reported in a previous investigation. The aim of the present study is to extend this previous work by identifying and quantifying the impact that errors in the prior information have on image quality, including the reconstruction of internal structures and lesions embedded in fibroglandular tissue. This study also extends the work of others reported in literature by emulating a clinical setting with a set of experiments that incorporate heterogeneity into both the breast interior and glandular region, as well as prior information related to both fat and glandular structures. Patient-specific structural information is acquired using radar-based methods that form a regional map of the breast. Errors are introduced to create a discrepancy in the geometry and electrical properties between the regional map and the model used to generate the data. This permits the impact that errors in the prior information have on image quality to be evaluated. Image quality is quantitatively assessed by measuring the ability of the algorithm to reconstruct both internal structures and lesions embedded in fibroglandular tissue. The study is conducted using both 2D and 3D numerical breast models constructed from MRI scans. The reconstruction results demonstrate robustness of the method relative to errors in the dielectric properties of the background regional map, and to misalignment errors. These errors do not significantly influence the reconstruction accuracy of the underlying structures, or the ability of the algorithm to reconstruct malignant tissue. 
Although misalignment errors do not significantly impact the quality of the reconstructed fat and glandular structures for the 3D scenarios, the dielectric properties are reconstructed less accurately within the glandular structure for these cases relative to the 2D cases. However, general agreement between the 2D and 3D results was found. A key contribution of this paper is the detailed analysis of the impact of prior information errors on the reconstruction accuracy and ability to detect tumors. The results support the utility of acquiring patient-specific information with radar-based techniques and incorporating this information into MWT. The method is robust to errors in the dielectric properties of the background regional map, and to misalignment errors. Completion of this analysis is an important step toward developing the method into a practical diagnostic tool. © 2017 American Association of Physicists in Medicine.

  9. Bayesian and “Anti-Bayesian” Biases in Sensory Integration for Action and Perception in the Size–Weight Illusion

    PubMed Central

    Brayanov, Jordan B.

    2010-01-01

    Which is heavier: a pound of lead or a pound of feathers? This classic trick question belies a simple but surprising truth: when lifted, the pound of lead feels heavier—a phenomenon known as the size–weight illusion. To estimate the weight of an object, our CNS combines two imperfect sources of information: a prior expectation, based on the object's appearance, and direct sensory information from lifting it. Bayes' theorem (or Bayes' law) defines the statistically optimal way to combine multiple information sources for maximally accurate estimation. Here we asked whether the mechanisms for combining these information sources produce statistically optimal weight estimates for both perceptions and actions. We first studied the ability of subjects to hold one hand steady when the other removed an object from it, under conditions in which sensory information about the object's weight sometimes conflicted with prior expectations based on its size. Since the ability to steady the supporting hand depends on the generation of a motor command that accounts for lift timing and object weight, hand motion can be used to gauge biases in weight estimation by the motor system. We found that these motor system weight estimates reflected the integration of prior expectations with real-time proprioceptive information in a Bayesian, statistically optimal fashion that discounted unexpected sensory information. This produces a motor size–weight illusion that consistently biases weight estimates toward prior expectations. In contrast, when subjects compared the weights of two objects, their perceptions defied Bayes' law, exaggerating the value of unexpected sensory information. This produces a perceptual size–weight illusion that biases weight perceptions away from prior expectations. We term this effect “anti-Bayesian” because the bias is opposite that seen in Bayesian integration. 
Our findings suggest that two fundamentally different strategies for the integration of prior expectations with sensory information coexist in the nervous system for weight estimation. PMID:20089821

  10. Adaptive allocation for binary outcomes using decreasingly informative priors.

    PubMed

    Sabo, Roy T

    2014-01-01

    A method of outcome-adaptive allocation is presented using Bayes methods, where a natural lead-in is incorporated through the use of informative yet skeptical prior distributions for each treatment group. These prior distributions are modeled on unobserved data in such a way that their influence on the allocation scheme decreases as the trial progresses. Simulation studies show this method to behave comparably to the Bayesian adaptive allocation method described by Thall and Wathen (2007), who incorporate a natural lead-in through sample-size-based exponents.
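
    The idea of a skeptical prior whose influence fades can be sketched as follows (a hedged illustration with Beta priors, hypothetical response rates, and a linearly decaying prior weight rather than the authors' exact scheme): each arm's pseudo-counts shrink as enrollment progresses, so allocation starts near 1:1 and is driven increasingly by the observed outcomes.

```python
import random

random.seed(3)

def post_prob_A_better(sA, fA, sB, fB, skeptic, n_draws=4000):
    """P(pA > pB) with skeptical Beta pseudo-counts added to each arm."""
    a, b = skeptic
    pa = [random.betavariate(sA + a, fA + b) for _ in range(n_draws)]
    pb = [random.betavariate(sB + a, fB + b) for _ in range(n_draws)]
    return sum(x > y for x, y in zip(pa, pb)) / n_draws

N, m = 100, 10.0          # planned sample size; initial prior weight (hypothetical)
sA = fA = sB = fB = 0      # observed successes/failures per arm
for n in range(N):
    # Skeptical prior centered at 0.5 whose weight decays over the trial,
    # giving a natural lead-in that fades as data accumulate.
    w = m * (N - n) / N
    skeptic = (1 + w / 2, 1 + w / 2)
    p = post_prob_A_better(sA, fA, sB, fB, skeptic)
    arm = "A" if random.random() < p else "B"   # allocate by posterior prob.
    true_rate = 0.7 if arm == "A" else 0.4      # hypothetical true rates
    success = random.random() < true_rate
    if arm == "A":
        sA, fA = sA + success, fA + (not success)
    else:
        sB, fB = sB + success, fB + (not success)
print("A allocations:", sA + fA, "B allocations:", sB + fB)
```

    In this toy run the better arm tends to attract more patients over time, while the early skeptical prior keeps the initial allocations close to balanced.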

  11. Using expert knowledge for test linking.

    PubMed

    Bolsinova, Maria; Hoijtink, Herbert; Vermeulen, Jorine Adinda; Béguin, Anton

    2017-12-01

    Linking and equating procedures are used to make the results of different test forms comparable. In cases where no assumption of randomly equivalent groups can be made, some form of linking design is used. In practice, the amount of data available to link the two tests is often very limited for logistical and security reasons, which affects the precision of linking procedures. This study proposes to enhance the quality of linking procedures based on sparse data by using Bayesian methods, which combine the information in the linking data with background information captured in informative prior distributions. We propose two methods for eliciting prior knowledge about the difference in difficulty of two tests from subject-matter experts and explain how these results can be used in the specification of priors. To illustrate the proposed methods and evaluate the quality of linking with and without informative priors, an empirical example of linking primary school mathematics tests is presented. The results suggest that informative priors can increase the precision of linking without decreasing the accuracy. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  12. An Optimization-Based State Estimation Framework for Large-Scale Natural Gas Networks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jalving, Jordan; Zavala, Victor M.

    We propose an optimization-based state estimation framework to track internal space-time flow and pressure profiles of natural gas networks during dynamic transients. We find that the estimation problem is ill-posed (because of the infinite-dimensional nature of the states) and that this leads to instability of the estimator when short estimation horizons are used. To circumvent this issue, we propose moving horizon strategies that incorporate prior information. In particular, we propose a strategy that initializes the prior using steady-state information and compare its performance against a strategy that does not initialize the prior. We find that both strategies are capable of tracking the state profiles, but superior performance is obtained with steady-state prior initialization. We also find that, under the proposed framework, pressure sensor information at junctions is sufficient to track the state profiles. We also derive approximate transport models and show that some of these can be used to achieve significant computational speed-ups without sacrificing estimation performance. We show that the estimator can be easily implemented in the graph-based modeling framework Plasmo.jl and use a multi-pipeline network study to demonstrate the developments.

  13. Improving phylogenetic analyses by incorporating additional information from genetic sequence databases.

    PubMed

    Liang, Li-Jung; Weiss, Robert E; Redelings, Benjamin; Suchard, Marc A

    2009-10-01

    Statistical analyses of phylogenetic data culminate in uncertain estimates of underlying model parameters. Lack of additional data hinders the ability to reduce this uncertainty, as the original phylogenetic dataset is often complete, containing the entire gene or genome information available for the given set of taxa. Informative priors in a Bayesian analysis can reduce posterior uncertainty; however, publicly available phylogenetic software specifies vague priors for model parameters by default. We build objective and informative priors using hierarchical random effect models that combine additional datasets whose parameters are not of direct interest but are similar to the analysis of interest. We propose principled statistical methods that permit more precise parameter estimates in phylogenetic analyses by creating informative priors for parameters of interest. Using additional sequence datasets from our lab or public databases, we construct a fully Bayesian semiparametric hierarchical model to combine datasets. A dynamic iteratively reweighted Markov chain Monte Carlo algorithm conveniently recycles posterior samples from the individual analyses. We demonstrate the value of our approach by examining the insertion-deletion (indel) process in the enolase gene across the Tree of Life using the phylogenetic software BALI-PHY; we incorporate prior information about indels from 82 curated alignments downloaded from the BAliBASE database.

  14. Novel joint TOA/RSSI-based WCE location tracking method without prior knowledge of biological human body tissues.

    PubMed

    Ito, Takahiro; Anzai, Daisuke; Jianqing Wang

    2014-01-01

    This paper proposes a novel joint time of arrival (TOA)/received signal strength indicator (RSSI)-based wireless capsule endoscope (WCE) location tracking method that requires no prior knowledge of biological human tissues. Generally, TOA-based localization can achieve much higher accuracy than other radio-frequency-based localization techniques, but wireless signals transmitted from a WCE pass through various kinds of human body tissues; as a result, the propagation velocity inside a human body differs from that in free space. Because the variation in propagation velocity is mainly determined by the relative permittivity of the human body tissues, instead of measuring the relative permittivity in advance, we simultaneously estimate not only the WCE location but also the relative permittivity. For this purpose, this paper first derives a relative permittivity estimation model from measured RSSI information. Then, we apply a particle filter algorithm that combines TOA-based localization with RSSI-based relative permittivity estimation. Our computer simulation results demonstrate that the proposed tracking method with the particle filter can achieve an excellent localization accuracy of around 2 mm without prior information on the relative permittivity of the human body tissues.
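
    A heavily simplified one-dimensional sketch of the joint estimation idea (hypothetical noise levels and priors; the paper's method is a full sequential tracking filter, whereas this is a single-snapshot importance-weighted version): particles over both depth and the square root of the unknown relative permittivity are weighted by a TOA likelihood and an RSSI-derived permittivity likelihood, so neither quantity needs to be known in advance.

```python
import math
import random

random.seed(4)

C = 3e8  # free-space propagation velocity (m/s)

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Simplified 1D setting: jointly estimate capsule depth d and s = sqrt of the
# (unknown) relative permittivity, from a TOA measurement t = d * s / C and a
# noisy RSSI-derived estimate of s. True values and noise are hypothetical.
true_d, true_s = 0.10, 7.0
toa = true_d * true_s / C + random.gauss(0, 2e-11)
rssi_s = true_s + random.gauss(0, 0.5)

# Particles over (d, s) drawn from broad priors (no tissue knowledge assumed).
parts = [(random.uniform(0.01, 0.25), random.uniform(4.0, 10.0))
         for _ in range(5000)]
weights = [gauss_pdf(toa, d * s / C, 2e-11) * gauss_pdf(rssi_s, s, 0.5)
           for d, s in parts]
total = sum(weights)
d_est = sum(w * d for (d, _), w in zip(parts, weights)) / total
s_est = sum(w * s for (_, s), w in zip(parts, weights)) / total
print("depth estimate (m):", round(d_est, 3), "sqrt-permittivity:", round(s_est, 2))
```

    The TOA term alone only constrains the product d * s; the RSSI term breaks that ambiguity, which is the role the permittivity estimation model plays in the proposed method.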

  15. Exploring the Transformative Potential of Recognition of Prior Informal Learning for Learners: A Case Study in Scotland

    ERIC Educational Resources Information Center

    Brown, Julie

    2017-01-01

    This article presents an overview of the findings of a recently completed study exploring the potentially transformative impact upon learners of recognition of prior informal learning (RPL). The specific transformative dimension being reported is learner identity. In addition to providing a starting point for an evidence base within Scotland, the…

  16. Corpus callosum segmentation using deep neural networks with prior information from multi-atlas images

    NASA Astrophysics Data System (ADS)

    Park, Gilsoon; Hong, Jinwoo; Lee, Jong-Min

    2018-03-01

    In the human brain, the corpus callosum (CC) is the largest white matter structure, connecting the right and left hemispheres. Structural features such as the shape and size of the CC in the midsagittal plane are of great significance for analyzing various neurological diseases, for example Alzheimer's disease, autism, and epilepsy. For quantitative and qualitative studies of the CC in brain MR images, robust segmentation of the CC is important. In this paper, we present a novel method for CC segmentation. Our approach is based on deep neural networks and prior information generated from multi-atlas images. Convolutional neural networks (CNNs) have recently shown outstanding performance for classification and segmentation in medical imaging, and we use them for CC segmentation. Multi-atlas-based segmentation models are also widely used in medical image segmentation, because an atlas carries powerful information about the target structure, consisting of MR images and corresponding manual segmentations of that structure. We therefore incorporated prior information derived from the multi-atlas images, such as the location and intensity distribution of the target structure (i.e., the CC), into the CNN training process. The CNN trained with this prior information showed better segmentation performance than the CNN without it.

  17. Incorporating Functional Genomic Information in Genetic Association Studies Using an Empirical Bayes Approach.

    PubMed

    Spencer, Amy V; Cox, Angela; Lin, Wei-Yu; Easton, Douglas F; Michailidou, Kyriaki; Walters, Kevin

    2016-04-01

    There is a large amount of functional genetic data available, which can be used to inform fine-mapping association studies (in diseases with well-characterised disease pathways). Single nucleotide polymorphism (SNP) prioritization via Bayes factors is attractive because prior information can inform the effect size or the prior probability of causal association. This approach requires the specification of the effect size. If the information needed to estimate a priori the probability density for the effect sizes for causal SNPs in a genomic region isn't consistent or isn't available, then specifying a prior variance for the effect sizes is challenging. We propose both an empirical method to estimate this prior variance, and a coherent approach to using SNP-level functional data, to inform the prior probability of causal association. Through simulation we show that when ranking SNPs by our empirical Bayes factor in a fine-mapping study, the causal SNP rank is generally as high or higher than the rank using Bayes factors with other plausible values of the prior variance. Importantly, we also show that assigning SNP-specific prior probabilities of association based on expert prior functional knowledge of the disease mechanism can lead to improved causal SNPs ranks compared to ranking with identical prior probabilities of association. We demonstrate the use of our methods by applying the methods to the fine mapping of the CASP8 region of chromosome 2 using genotype data from the Collaborative Oncological Gene-Environment Study (COGS) Consortium. The data we analysed included approximately 46,000 breast cancer case and 43,000 healthy control samples. © 2016 The Authors. *Genetic Epidemiology published by Wiley Periodicals, Inc.
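
    The kind of Bayes factor involved can be sketched with the standard normal-approximation form (a ratio of two normal marginal likelihoods for a SNP effect estimate; the prior variance W and the summary statistics below are hypothetical): with estimate beta_hat ~ N(theta, V) and prior theta ~ N(0, W), the marginal likelihood under association is N(beta_hat; 0, V + W), and the role of the prior variance W is explicit.

```python
import math

def abf_10(beta_hat, se, W):
    """Approximate Bayes factor for association (H1) vs the null (H0):
    the ratio N(beta_hat; 0, V + W) / N(beta_hat; 0, V), with V = se**2
    and W the prior variance of the effect size."""
    V = se ** 2
    z2 = (beta_hat / se) ** 2
    # log BF_01 = 0.5*log((V+W)/V) - z^2 * W / (2*(V+W)); invert for BF_10.
    log_bf01 = 0.5 * math.log((V + W) / V) - z2 * W / (2 * (V + W))
    return math.exp(-log_bf01)

# Hypothetical SNP summary statistics; W = 0.04 corresponds to a prior 95%
# interval of roughly +/-0.4 on the log-odds scale.
for beta, se in ((0.12, 0.03), (0.02, 0.03)):
    print(beta, se, "-> BF10 =", round(abf_10(beta, se, W=0.04), 1))
```

    SNP-specific prior probabilities of causality based on functional annotation would then be combined with these Bayes factors to rank SNPs, which is the step the abstract argues improves causal SNP ranks.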

  18. Main Geomagnetic Field Models from Oersted and Magsat Data Via a Rigorous General Inverse Theory with Error Bounds

    NASA Technical Reports Server (NTRS)

    Backus, George E.

    1999-01-01

    The purpose of the grant was to study how prior information about the geomagnetic field can be used to interpret surface and satellite magnetic measurements, to generate quantitative descriptions of prior information that might be so used, and to use this prior information to obtain from satellite data a model of the core field with statistically justifiable error estimates. The need for prior information in geophysical inversion has long been recognized. Data sets are finite, and faithful descriptions of aspects of the earth almost always require infinite-dimensional model spaces. By themselves, the data can confine the correct earth model only to an infinite-dimensional subset of the model space. Earth properties other than direct functions of the observed data cannot be estimated from those data without prior information about the earth. Prior information is based on what the observer already knows before the data become available. Such information can be "hard" or "soft". Hard information is a belief that the real earth must lie in some known region of model space. For example, the total ohmic dissipation in the core is probably less than the total observed geothermal heat flow out of the earth's surface. (In principle, ohmic heat in the core can be recaptured to help drive the dynamo, but this effect is probably small.) "Soft" information is a probability distribution on the model space, a distribution that the observer accepts as a quantitative description of her/his beliefs about the earth. The probability distribution can be a subjective prior in the sense of Bayes or the objective result of a statistical study of previous data or relevant theories.

  19. How Judgments Change Following Comparison of Current and Prior Information

    PubMed Central

    Albarracin, Dolores; Wallace, Harry M.; Hart, William; Brown, Rick D.

    2013-01-01

    Although much observed judgment change is superficial and occurs without considering prior information, other forms of change also occur. Comparison between prior and new information about an issue may trigger change by influencing either or both the perceived strength and direction of the new information. In four experiments, participants formed and reported initial judgments of a policy based on favorable written information about it. Later, these participants read a second passage containing strong favorable or unfavorable information on the policy. Compared to control conditions, subtle and direct prompts to compare the initial and new information led to more judgment change in the direction of a second passage perceived to be strong. Mediation analyses indicated that comparison yielded greater perceived strength of the second passage, which in turn correlated positively with judgment change. Moreover, self-reports of comparison mediated the judgment change resulting from comparison prompts. PMID:23599557

  20. Weighted integration of short-term memory and sensory signals in the oculomotor system.

    PubMed

    Deravet, Nicolas; Blohm, Gunnar; de Xivry, Jean-Jacques Orban; Lefèvre, Philippe

    2018-05-01

    Oculomotor behaviors integrate sensory and prior information to overcome sensory-motor delays and noise. After much debate about this process, reliability-based integration has recently been proposed and several models of smooth pursuit now include recurrent Bayesian integration or Kalman filtering. However, there is a lack of behavioral evidence in humans supporting these theoretical predictions. Here, we independently manipulated the reliability of visual and prior information in a smooth pursuit task. Our results show that both smooth pursuit eye velocity and catch-up saccade amplitude were modulated by visual and prior information reliability. We interpret these findings as the continuous reliability-based integration of a short-term memory of target motion with visual information, which support modeling work. Furthermore, we suggest that saccadic and pursuit systems share this short-term memory. We propose that this short-term memory of target motion is quickly built and continuously updated, and constitutes a general building block present in all sensorimotor systems.
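
    In its simplest static form, the reliability-based integration described above reduces to precision-weighted averaging of a remembered prior and a visual measurement. This is a minimal illustrative sketch, not the authors' recurrent model; all numbers are invented.

```python
# Precision-weighted fusion of a remembered prior and a visual measurement:
# each source is weighted by its reliability (inverse variance). A static
# sketch only; the numbers below are invented, not the paper's data.
def integrate(prior_mean, prior_var, sensory_mean, sensory_var):
    w_prior = (1 / prior_var) / (1 / prior_var + 1 / sensory_var)
    mean = w_prior * prior_mean + (1 - w_prior) * sensory_mean
    var = 1 / (1 / prior_var + 1 / sensory_var)  # fused estimate is more reliable
    return mean, var

# Reliable vision (low variance) pulls the estimate toward the measurement,
# while noisy vision leaves it near the remembered target motion.
m_reliable, v_reliable = integrate(10.0, 4.0, 16.0, 1.0)
m_noisy, v_noisy = integrate(10.0, 4.0, 16.0, 25.0)
print(round(m_reliable, 2), round(m_noisy, 2))
```

    The fused variance is always smaller than either input variance, which is why combining a short-term memory with vision pays off even when vision is good.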

  1. Mitigating Information Overload: The Impact of Context-Based Approach to the Design of Tools for Intelligence Analysts

    DTIC Science & Technology

    2008-03-01

    amount of arriving data, extract actionable information, and integrate it with prior knowledge. Add to that the pressures of today’s fusion center climate and it becomes clear that analysts, police... fusion centers, including specifics about how these problems manifest at the Illinois State Police (ISP) Statewide Terrorism and Intelligence Center

  2. Selected aspects of prior and likelihood information for a Bayesian classifier in a road safety analysis.

    PubMed

    Nowakowska, Marzena

    2017-04-01

    The development of the Bayesian logistic regression model classifying the road accident severity is discussed. The already exploited informative priors (method of moments, maximum likelihood estimation, and two-stage Bayesian updating), along with the original idea of a Boot prior proposal, are investigated when no expert opinion has been available. In addition, two possible approaches to updating the priors, in the form of unbalanced and balanced training data sets, are presented. The obtained logistic Bayesian models are assessed on the basis of a deviance information criterion (DIC), highest probability density (HPD) intervals, and coefficients of variation estimated for the model parameters. The verification of the model accuracy has been based on sensitivity, specificity and the harmonic mean of sensitivity and specificity, all calculated from a test data set. The models obtained from the balanced training data set have a better classification quality than the ones obtained from the unbalanced training data set. The two-stage Bayesian updating prior model and the Boot prior model, both identified with the use of the balanced training data set, outperform the non-informative, method of moments, and maximum likelihood estimation prior models. It is important to note that one should be careful when interpreting the parameters since different priors can lead to different models. Copyright © 2017 Elsevier Ltd. All rights reserved.
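
    As a toy illustration of fitting a logistic model under an informative prior, the sketch below computes a MAP estimate with a Gaussian prior on the coefficients. It is a simplified stand-in for the paper's fully Bayesian models (not its method-of-moments, two-stage, or Boot-prior procedures); the data and prior settings are invented.

```python
import numpy as np
from scipy.optimize import minimize

# MAP fit of a logistic regression with an informative Gaussian prior on the
# coefficients. Simulated data; true coefficients and prior SD are invented.
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 2))
beta_true = np.array([1.5, -1.0])
y = (rng.random(200) < 1 / (1 + np.exp(-(X @ beta_true)))).astype(float)

prior_mean = np.zeros(2)
prior_sd = 1.0                    # informative: coefficients expected near 0

def neg_log_posterior(beta):
    logits = X @ beta
    log_lik = np.sum(y * logits - np.logaddexp(0.0, logits))
    log_prior = -0.5 * np.sum(((beta - prior_mean) / prior_sd) ** 2)
    return -(log_lik + log_prior)

beta_map = minimize(neg_log_posterior, np.zeros(2)).x
print(np.round(beta_map, 2))      # shrunk toward the prior mean relative to the MLE
```

    Changing `prior_sd` changes the fitted coefficients, which is the abstract's caution in miniature: different priors can lead to different models.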

  3. Nudging toward Inquiry: Strategies for Searching for and Finding Great Information

    ERIC Educational Resources Information Center

    Fontichiaro, Kristin, Comp.

    2010-01-01

    Inquiry does not replace information literacy; rather, it encompasses it. Inquiry-based learning invites school librarians to step into all aspects of instructional planning, from activating prior knowledge straight through to reflection. Libraries pursuing inquiry-based instruction are building on the bedrock of information literacy, not starting…

  4. Abdominal multi-organ segmentation from CT images using conditional shape–location and unsupervised intensity priors

    PubMed Central

    Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki

    2015-01-01

    This paper addresses the automated segmentation of multiple organs in upper abdominal computed tomography (CT) data. The aim of our study is to develop methods to effectively construct the conditional priors and use their prediction power for more accurate segmentation as well as easy adaptation to various imaging conditions in CT images, as observed in clinical practice. We propose a general framework of multi-organ segmentation which effectively incorporates interrelations among multiple organs and easily adapts to various imaging conditions without the need for supervised intensity information. The features of the framework are as follows: (1) A method for modeling conditional shape and location (shape–location) priors, which we call prediction-based priors, is developed to derive accurate priors specific to each subject, which enables the estimation of intensity priors without the need for supervised intensity information. (2) Organ correlation graph is introduced, which defines how the conditional priors are constructed and segmentation processes of multiple organs are executed. In our framework, predictor organs, whose segmentation is sufficiently accurate by using conventional single-organ segmentation methods, are pre-segmented, and the remaining organs are hierarchically segmented using conditional shape–location priors. The proposed framework was evaluated through the segmentation of eight abdominal organs (liver, spleen, left and right kidneys, pancreas, gallbladder, aorta, and inferior vena cava) from 134 CT data from 86 patients obtained under six imaging conditions at two hospitals. The experimental results show the effectiveness of the proposed prediction-based priors and the applicability to various imaging conditions without the need for supervised intensity information. Average Dice coefficients for the liver, spleen, and kidneys were more than 92%, and were around 73% and 67% for the pancreas and gallbladder, respectively. PMID:26277022

  5. Abdominal multi-organ segmentation from CT images using conditional shape-location and unsupervised intensity priors.

    PubMed

    Okada, Toshiyuki; Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki; Sato, Yoshinobu

    2015-12-01

    This paper addresses the automated segmentation of multiple organs in upper abdominal computed tomography (CT) data. The aim of our study is to develop methods to effectively construct the conditional priors and use their prediction power for more accurate segmentation as well as easy adaptation to various imaging conditions in CT images, as observed in clinical practice. We propose a general framework of multi-organ segmentation which effectively incorporates interrelations among multiple organs and easily adapts to various imaging conditions without the need for supervised intensity information. The features of the framework are as follows: (1) A method for modeling conditional shape and location (shape-location) priors, which we call prediction-based priors, is developed to derive accurate priors specific to each subject, which enables the estimation of intensity priors without the need for supervised intensity information. (2) Organ correlation graph is introduced, which defines how the conditional priors are constructed and segmentation processes of multiple organs are executed. In our framework, predictor organs, whose segmentation is sufficiently accurate by using conventional single-organ segmentation methods, are pre-segmented, and the remaining organs are hierarchically segmented using conditional shape-location priors. The proposed framework was evaluated through the segmentation of eight abdominal organs (liver, spleen, left and right kidneys, pancreas, gallbladder, aorta, and inferior vena cava) from 134 CT data from 86 patients obtained under six imaging conditions at two hospitals. The experimental results show the effectiveness of the proposed prediction-based priors and the applicability to various imaging conditions without the need for supervised intensity information. Average Dice coefficients for the liver, spleen, and kidneys were more than 92%, and were around 73% and 67% for the pancreas and gallbladder, respectively. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Metal Artifact Reduction in X-ray Computed Tomography Using Computer-Aided Design Data of Implants as Prior Information.

    PubMed

    Ruth, Veikko; Kolditz, Daniel; Steiding, Christian; Kalender, Willi A

    2017-06-01

    The performance of metal artifact reduction (MAR) methods in x-ray computed tomography (CT) suffers from incorrect identification of metallic implants in the artifact-affected volumetric images. The aim of this study was to investigate potential improvements of state-of-the-art MAR methods by using prior information on geometry and material of the implant. The influence of a novel prior knowledge-based segmentation (PS) compared with threshold-based segmentation (TS) on 2 MAR methods (linear interpolation [LI] and normalized-MAR [NORMAR]) was investigated. The segmentation is the initial step of both MAR methods. Prior knowledge-based segmentation uses 3-dimensional registered computer-aided design (CAD) data as prior knowledge to estimate the correct position and orientation of the metallic objects. Threshold-based segmentation uses an adaptive threshold to identify metal. Subsequently, for LI and NORMAR, the selected voxels are projected into the raw data domain to mark metal areas. Attenuation values in these areas are replaced by different interpolation schemes followed by a second reconstruction. Finally, the previously selected metal voxels are replaced by the metal voxels determined by PS or TS in the initial reconstruction. First, we investigated in an elaborate phantom study if the knowledge of the exact implant shape extracted from the CAD data provided by the manufacturer of the implant can improve the MAR result. Second, the leg of a human cadaver was scanned using a clinical CT system before and after the implantation of an artificial knee joint. The results were compared regarding segmentation accuracy, CT number accuracy, and the restoration of distorted structures. The use of PS improved the efficacy of LI and NORMAR compared with TS. Artifacts caused by insufficient segmentation were reduced, and additional information was made available within the projection data. The estimation of the implant shape was more exact and not dependent on a threshold value. Consequently, the visibility of structures was improved when comparing the new approach to the standard method. This was further confirmed by improved CT value accuracy and reduced image noise. The PS approach based on prior implant information provides image quality which is superior to TS-based MAR, especially when the shape of the metallic implant is complex. The new approach can be useful for improving MAR methods and dose calculations within radiation therapy based on the MAR corrected CT images.

  7. Subject-Specific Sparse Dictionary Learning for Atlas-Based Brain MRI Segmentation.

    PubMed

    Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S; Prince, Jerry L; Pham, Dzung L

    2015-09-01

    Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method from MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consists of an atlas MR image, prior information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods that require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. Then the subject patches are modeled as sparse combinations of learned atlas patches leading to tissue memberships at each voxel. The combination of prior information in an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on the application of whole-brain tissue segmentation in subjects with healthy anatomy and normal pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available state-of-the-art approaches.

  8. Self-prior strategy for organ reconstruction in fluorescence molecular tomography

    PubMed Central

    Zhou, Yuan; Chen, Maomao; Su, Han; Luo, Jianwen

    2017-01-01

    The purpose of this study is to propose a strategy for organ reconstruction in fluorescence molecular tomography (FMT) without prior information from other imaging modalities, and to overcome the high cost and ionizing radiation caused by the traditional structural prior strategy. The proposed strategy is designed as an iterative architecture to solve the inverse problem of FMT. In each iteration, a short-time Fourier transform (STFT)-based algorithm is used to extract the self-prior information in the space-frequency energy spectrum with the assumption that the regions with higher fluorescence concentration have larger energy intensity, then the cost function of the inverse problem is modified by the self-prior information, and lastly an iterative Laplacian regularization algorithm is conducted to solve the updated inverse problem and obtain the reconstruction results. Simulations and in vivo experiments on liver reconstruction are carried out to test the performance of the self-prior strategy on organ reconstruction. The organ reconstruction results obtained by the proposed self-prior strategy are closer to the ground truth than those obtained by the iterative Tikhonov regularization (ITKR) method (traditional non-prior strategy). Significant improvements are shown in the evaluation indexes of relative locational error (RLE), relative error (RE) and contrast-to-noise ratio (CNR). The self-prior strategy improves the organ reconstruction results compared with the non-prior strategy and also overcomes the shortcomings of the traditional structural prior strategy. Various applications such as metabolic imaging and pharmacokinetic study can be aided by this strategy. PMID:29082094

  9. Self-prior strategy for organ reconstruction in fluorescence molecular tomography.

    PubMed

    Zhou, Yuan; Chen, Maomao; Su, Han; Luo, Jianwen

    2017-10-01

    The purpose of this study is to propose a strategy for organ reconstruction in fluorescence molecular tomography (FMT) without prior information from other imaging modalities, and to overcome the high cost and ionizing radiation caused by the traditional structural prior strategy. The proposed strategy is designed as an iterative architecture to solve the inverse problem of FMT. In each iteration, a short-time Fourier transform (STFT)-based algorithm is used to extract the self-prior information in the space-frequency energy spectrum with the assumption that the regions with higher fluorescence concentration have larger energy intensity, then the cost function of the inverse problem is modified by the self-prior information, and lastly an iterative Laplacian regularization algorithm is conducted to solve the updated inverse problem and obtain the reconstruction results. Simulations and in vivo experiments on liver reconstruction are carried out to test the performance of the self-prior strategy on organ reconstruction. The organ reconstruction results obtained by the proposed self-prior strategy are closer to the ground truth than those obtained by the iterative Tikhonov regularization (ITKR) method (traditional non-prior strategy). Significant improvements are shown in the evaluation indexes of relative locational error (RLE), relative error (RE) and contrast-to-noise ratio (CNR). The self-prior strategy improves the organ reconstruction results compared with the non-prior strategy and also overcomes the shortcomings of the traditional structural prior strategy. Various applications such as metabolic imaging and pharmacokinetic study can be aided by this strategy.

  10. Engaging underserved audiences in informal science education through community-based partnerships

    NASA Astrophysics Data System (ADS)

    Bouzo, Suzanne

    This thesis explores the impact of the Science Education and Engagement of Denver (SEED) Partnership on three of its participant families. The partnership, consisting of large informal science organizations, as well as small community-based organizations, created its programming based on prior research identifying barriers to minority participation in informal science education programs. SEED aims to engage youth and families of emerging populations in science and nature. Three families were examined as a case study to provide an in-depth investigation of their involvement in the programs sponsored by the partnership. Findings suggest a positive impact on participant feelings and engagement in science and nature. Future recommendations are made for furthering programming as well as conducting a larger-scale, more comprehensive program evaluation. This research addresses prior studies that have identified several barriers toward participation of underserved audiences in informal science education programs and how the SEED partnership has addressed specific identified barriers.

  11. The neglected tool in the Bayesian ecologist's shed: a case study testing informative priors' effect on model accuracy

    PubMed Central

    Morris, William K; Vesk, Peter A; McCarthy, Michael A; Bunyavejchewin, Sarayudh; Baker, Patrick J

    2015-01-01

    Despite benefits for precision, ecologists rarely use informative priors. One reason that ecologists may prefer vague priors is the perception that informative priors reduce accuracy. To date, no ecological study has empirically evaluated data-derived informative priors' effects on precision and accuracy. To determine the impacts of priors, we evaluated mortality models for tree species using data from a forest dynamics plot in Thailand. Half the models used vague priors, and the remaining half had informative priors. We found precision was greater when using informative priors, but effects on accuracy were more variable. In some cases, prior information improved accuracy, while in others, it was reduced. On average, models with informative priors were no more or less accurate than models without. Our analyses provide a detailed case study on the simultaneous effect of prior information on precision and accuracy and demonstrate that when priors are specified appropriately, they lead to greater precision without systematically reducing model accuracy. PMID:25628867

  12. The neglected tool in the Bayesian ecologist's shed: a case study testing informative priors' effect on model accuracy.

    PubMed

    Morris, William K; Vesk, Peter A; McCarthy, Michael A; Bunyavejchewin, Sarayudh; Baker, Patrick J

    2015-01-01

    Despite benefits for precision, ecologists rarely use informative priors. One reason that ecologists may prefer vague priors is the perception that informative priors reduce accuracy. To date, no ecological study has empirically evaluated data-derived informative priors' effects on precision and accuracy. To determine the impacts of priors, we evaluated mortality models for tree species using data from a forest dynamics plot in Thailand. Half the models used vague priors, and the remaining half had informative priors. We found precision was greater when using informative priors, but effects on accuracy were more variable. In some cases, prior information improved accuracy, while in others, it was reduced. On average, models with informative priors were no more or less accurate than models without. Our analyses provide a detailed case study on the simultaneous effect of prior information on precision and accuracy and demonstrate that when priors are specified appropriately, they lead to greater precision without systematically reducing model accuracy.
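
    The precision-versus-accuracy point above can be seen in a conjugate normal-normal toy example: the informative prior always tightens the posterior, while its effect on accuracy depends on where its mean sits relative to the truth. The numbers are illustrative, not the Thai forest data.

```python
import numpy as np

# Conjugate normal-normal toy model: the informative prior yields a tighter
# (more precise) posterior; accuracy depends on how close its mean is to the
# truth. All numbers are illustrative, not the study's data.
def posterior(prior_mean, prior_var, data, data_var):
    post_prec = 1 / prior_var + len(data) / data_var
    post_mean = (prior_mean / prior_var + np.sum(data) / data_var) / post_prec
    return post_mean, 1 / post_prec

rng = np.random.default_rng(2)
truth = 0.5
data = rng.normal(truth, 1.0, size=10)

m_vague, v_vague = posterior(0.0, 100.0, data, 1.0)  # near-flat prior
m_info, v_info = posterior(0.4, 0.25, data, 1.0)     # informative prior near truth
print(v_vague, v_info)                               # informative prior is tighter
```

    Replacing the prior mean 0.4 with one far from `truth` would trade that precision gain for a biased posterior mean, which is exactly the variability in accuracy the abstract reports.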

  13. Filtering genetic variants and placing informative priors based on putative biological function.

    PubMed

    Friedrichs, Stefanie; Malzahn, Dörthe; Pugh, Elizabeth W; Almeida, Marcio; Liu, Xiao Qing; Bailey, Julia N

    2016-02-03

    High-density genetic marker data, especially sequence data, imply an immense multiple testing burden. This can be ameliorated by filtering genetic variants, exploiting or accounting for correlations between variants, jointly testing variants, and by incorporating informative priors. Priors can be based on biological knowledge or predicted variant function, or even be used to integrate gene expression or other omics data. Based on Genetic Analysis Workshop (GAW) 19 data, this article discusses diversity and usefulness of functional variant scores provided, for example, by PolyPhen2, SIFT, or RegulomeDB annotations. Incorporating functional scores into variant filters or weights and adjusting the significance level for correlations between variants yielded significant associations with blood pressure traits in a large family study of Mexican Americans (GAW19 data set). Marker rs218966 in gene PHF14 and rs9836027 in MAP4 significantly associated with hypertension; additionally, rare variants in SNUPN significantly associated with systolic blood pressure. Variant weights strongly influenced the power of kernel methods and burden tests. Apart from variant weights in test statistics, prior weights may also be used when combining test statistics or to informatively weight p values while controlling false discovery rate (FDR). Indeed, power improved when gene expression data for FDR-controlled informative weighting of association test p values of genes was used. Finally, approaches exploiting variant correlations included identity-by-descent mapping and the optimal strategy for joint testing rare and common variants, which was observed to depend on linkage disequilibrium structure.
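
    The informatively weighted p-value idea above can be sketched as a weighted Benjamini-Hochberg step. This is a generic sketch with invented p-values and weights, not the GAW19 analysis itself.

```python
import numpy as np

# Weighted Benjamini-Hochberg sketch: p-values are divided by prior weights
# (normalized to mean 1) before the usual step-up comparison, so tests with
# stronger prior evidence need less data to reject. Invented numbers only.
def weighted_bh(pvals, weights, alpha=0.05):
    w = np.asarray(weights, dtype=float)
    w = w / w.mean()                      # weights must average to 1
    q = np.asarray(pvals, dtype=float) / w
    order = np.argsort(q)
    m = len(q)
    passed = q[order] <= alpha * np.arange(1, m + 1) / m
    k = np.max(np.nonzero(passed)[0]) + 1 if passed.any() else 0
    rejected = np.zeros(m, dtype=bool)
    rejected[order[:k]] = True
    return rejected

pvals = [0.002, 0.025, 0.2, 0.3, 0.5]
weights = [3.0, 3.0, 0.5, 0.5, 1.0]       # prior evidence favors the first two tests
print(weighted_bh(pvals, weights))
```

    With these numbers the unweighted BH procedure rejects only the first test, while the prior weighting also pushes the second test below its step-up threshold, illustrating the power gain from informative weighting.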

  14. How the prior information shapes couplings in neural fields performing optimal multisensory integration

    NASA Astrophysics Data System (ADS)

    Wang, He; Zhang, Wen-Hao; Wong, K. Y. Michael; Wu, Si

    Extensive studies suggest that the brain integrates multisensory signals in a Bayesian optimal way. However, it remains largely unknown how the sensory reliability and the prior information shape the neural architecture. In this work, we propose a biologically plausible neural field model, which can perform optimal multisensory integration and encode the whole profile of the posterior. Our model is composed of two modules, each for one modality. The crosstalk between the two modules can be carried out through feedforward cross-links and reciprocal connections. We found that the reciprocal couplings are crucial to optimal multisensory integration in that the reciprocal coupling pattern is shaped by the correlation in the joint prior distribution of the sensory stimuli. A perturbative approach is developed to illustrate the relation between the prior information and features in coupling patterns quantitatively. Our results show that a decentralized architecture based on reciprocal connections is able to accommodate complex correlation structures across modalities and utilize this prior information in optimal multisensory integration. This work is supported by the Research Grants Council of Hong Kong (N_HKUST606/12 and 605813) and National Basic Research Program of China (2014CB846101) and the Natural Science Foundation of China (31261160495).

  15. A pseudo-discrete algebraic reconstruction technique (PDART) prior image-based suppression of high density artifacts in computed tomography

    NASA Astrophysics Data System (ADS)

    Pua, Rizza; Park, Miran; Wi, Sunhee; Cho, Seungryong

    2016-12-01

    We propose a hybrid metal artifact reduction (MAR) approach for computed tomography (CT) that is computationally more efficient than a fully iterative reconstruction method, but at the same time achieves superior image quality to the interpolation-based in-painting techniques. Our proposed MAR method, an image-based artifact subtraction approach, utilizes an intermediate prior image reconstructed via PDART to recover the background information underlying the high density objects. For comparison, prior images generated by total-variation minimization (TVM) algorithm, as a realization of fully iterative approach, were also utilized as intermediate images. From the simulation and real experimental results, it has been shown that PDART drastically accelerates the reconstruction to an acceptable quality of prior images. Incorporating PDART-reconstructed prior images in the proposed MAR scheme achieved higher quality images than those by a conventional in-painting method. Furthermore, the results were comparable to the fully iterative MAR that uses high-quality TVM prior images.

  16. Cortical plasticity as a mechanism for storing Bayesian priors in sensory perception.

    PubMed

    Köver, Hania; Bao, Shaowen

    2010-05-05

    Human perception of ambiguous sensory signals is biased by prior experiences. It is not known how such prior information is encoded, retrieved and combined with sensory information by neurons. Previous authors have suggested dynamic encoding mechanisms for prior information, whereby top-down modulation of firing patterns on a trial-by-trial basis creates short-term representations of priors. Although such a mechanism may well account for perceptual bias arising in the short-term, it does not account for the often irreversible and robust changes in perception that result from long-term, developmental experience. Based on the finding that more frequently experienced stimuli gain greater representations in sensory cortices during development, we reasoned that prior information could be stored in the size of cortical sensory representations. For the case of auditory perception, we use a computational model to show that prior information about sound frequency distributions may be stored in the size of primary auditory cortex frequency representations, read-out by elevated baseline activity in all neurons and combined with sensory-evoked activity to generate a perception that conforms to Bayesian integration theory. Our results suggest an alternative neural mechanism for experience-induced long-term perceptual bias in the context of auditory perception. They make the testable prediction that the extent of such perceptual prior bias is modulated by both the degree of cortical reorganization and the magnitude of spontaneous activity in primary auditory cortex. Given that cortical over-representation of frequently experienced stimuli, as well as perceptual bias towards such stimuli is a common phenomenon across sensory modalities, our model may generalize to sensory perception, rather than being specific to auditory perception.
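
    The proposed storage mechanism can be caricatured in a few lines: the prior over sound frequency is encoded as the number of cortical units per frequency bin, and the percept is the posterior mean of that prior times a Gaussian sensory likelihood. This is a loose sketch of the idea, not the paper's computational model; all numbers are invented.

```python
import numpy as np

# Toy version of the proposed mechanism: the prior over sound frequency is
# stored in how many cortical units represent each frequency bin, and the
# percept combines this with noisy sensory evidence. Numbers are invented.
freqs = np.array([1.0, 2.0, 4.0, 8.0])       # frequency bins (kHz)
units_per_bin = np.array([10, 40, 40, 10])   # enlarged mid-frequency representations
prior = units_per_bin / units_per_bin.sum()

def perceive(stimulus_khz, sensory_sd=2.0):
    # Gaussian likelihood of the noisy sensory evidence, combined with the prior
    likelihood = np.exp(-0.5 * ((freqs - stimulus_khz) / sensory_sd) ** 2)
    posterior = prior * likelihood
    posterior /= posterior.sum()
    return freqs @ posterior                 # posterior-mean percept

# Tones at the edges are biased toward the over-represented 2-4 kHz range
print(round(perceive(1.0), 2), round(perceive(8.0), 2))
```

    The bias toward over-represented frequencies grows with `sensory_sd`, matching the intuition that priors matter most when the sensory signal is ambiguous.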

  17. Adjustment of prior constraints for an improved crop monitoring with the Earth Observation Land Data Assimilation System (EO-LDAS)

    NASA Astrophysics Data System (ADS)

    Truckenbrodt, Sina C.; Gómez-Dans, José; Stelmaszczuk-Górska, Martyna A.; Chernetskiy, Maxim; Schmullius, Christiane C.

    2017-04-01

    Throughout the past decades various satellite sensors have been launched that record reflectance in the optical domain and facilitate comprehensive monitoring of the vegetation-covered land surface from space. The interaction of photons with the canopy, leaves and soil that determines the spectrum of reflected sunlight can be simulated with radiative transfer models (RTMs). The inversion of RTMs permits the derivation of state variables such as leaf area index (LAI) and leaf chlorophyll content from top-of-canopy reflectance. Space-borne data are, however, insufficient for an unambiguous derivation of state variables and additional constraints are required to resolve this ill-posed problem. Data assimilation techniques permit the conflation of various information with due allowance for associated uncertainties. The Earth Observation Land Data Assimilation System (EO-LDAS) integrates RTMs into a dynamic process model that describes the temporal evolution of state variables. In addition, prior information is included to further constrain the inversion and enhance the state variable derivation. In previous studies on EO-LDAS, prior information was represented by temporally constant values for all investigated state variables, while information about their phenological evolution was neglected. Here, we examine to what extent the implementation of prior information reflecting the phenological variability improves the performance of EO-LDAS with respect to the monitoring of crops on the agricultural Gebesee test site (Central Germany). Various routines for the generation of prior information are tested. This involves the usage of data on state variables that was acquired in previous years as well as the application of phenological models. The performance of EO-LDAS with the newly implemented prior information is tested based on medium resolution satellite imagery (e.g., RapidEye REIS, Sentinel-2 MSI, Landsat-7 ETM+ and Landsat-8 OLI). The predicted state variables are validated against in situ data from the Gebesee test site that were acquired with a weekly to fortnightly resolution throughout the growing seasons of 2010, 2013, 2014 and 2016. Furthermore, the results are compared with the outcome of using constant values as prior information. In this presentation, the EO-LDAS scheme and results obtained from different prior information are presented.

  18. Classroom Action Research on Formative Assessment in a Context-Based Chemistry Course

    ERIC Educational Resources Information Center

    Vogelzang, Johannes; Admiraal, Wilfried F.

    2017-01-01

    Context-based science courses stimulate students to reconstruct the information presented by connecting to their prior knowledge and experiences. However, students need support. Formative assessments inform both teacher and students about students' knowledge deficiencies and misconceptions and how students can be supported. Research on formative…

  19. Through Increasing "Information Literacy" Capital and Habitus (Agency): The Complementary Impact on Composition Skills When Appropriately Sequenced

    ERIC Educational Resources Information Center

    Karas, Timothy

    2017-01-01

    Through a case study approach of a cohort of community college students at a single community college, the impact on success rates in composition courses was analyzed based on the sequence of completing an information literacy course. Two student cohorts were sampled based on completing an information literacy course prior to, or concurrently with…

  20. Internet Use for Prediagnosis Symptom Appraisal by Colorectal Cancer Patients

    ERIC Educational Resources Information Center

    Thomson, Maria D.; Siminoff, Laura A.; Longo, Daniel R.

    2012-01-01

    Background: This study explored the characteristics of colorectal cancer (CRC) patients who accessed Internet-based health information as part of their symptom appraisal process prior to consulting a health care provider. Method: Newly diagnosed CRC patients who experienced symptoms prior to diagnosis were interviewed. Brief COPE was used to…

  1. Item Memory, Context Memory and the Hippocampus: fMRI Evidence

    ERIC Educational Resources Information Center

    Rugg, Michael D.; Vilberg, Kaia L.; Mattson, Julia T.; Yu, Sarah S.; Johnson, Jeffrey D.; Suzuki, Maki

    2012-01-01

    Dual-process models of recognition memory distinguish between the retrieval of qualitative information about a prior event (recollection), and judgments of prior occurrence based on an acontextual sense of familiarity. fMRI studies investigating the neural correlates of memory encoding and retrieval conducted within the dual-process framework have…

  2. Pig Data and Bayesian Inference on Multinomial Probabilities

    ERIC Educational Resources Information Center

    Kern, John C.

    2006-01-01

    Bayesian inference on multinomial probabilities is conducted based on data collected from the game Pass the Pigs[R]. Prior information on these probabilities is readily available from the instruction manual, and is easily incorporated in a Dirichlet prior. Posterior analysis of the scoring probabilities quantifies the discrepancy between empirical…
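The conjugacy this abstract relies on, a Dirichlet prior combined with multinomial counts, can be sketched in a few lines. The prior and observed counts below are hypothetical stand-ins, not the values from the Pass the Pigs instruction manual:

```python
import numpy as np

# Hypothetical prior counts for the pig landing positions
# (dot up, dot down, trotter, razorback, snouter, leaning jowler).
prior_alpha = np.array([35.0, 30.0, 20.0, 10.0, 4.0, 1.0])

# Hypothetical observed counts from played games.
counts = np.array([40.0, 28.0, 22.0, 6.0, 3.0, 1.0])

# Conjugacy: Dirichlet(alpha) prior + multinomial counts
# gives a Dirichlet(alpha + counts) posterior.
posterior_alpha = prior_alpha + counts
posterior_mean = posterior_alpha / posterior_alpha.sum()

print(posterior_mean.round(3))
```

The posterior mean is a weighted compromise between the prior proportions and the empirical frequencies, which is exactly the discrepancy the posterior analysis in the paper quantifies.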

  3. Experiences with using information and communication technology to build a multi-municipal support network for informal carers.

    PubMed

    Torp, Steffen; Bing-Jonsson, Pia C; Hanson, Elizabeth

    2013-09-01

This multi-municipal intervention study explored whether informal carers of frail older people and disabled children living at home made use of information and communication technology (ICT) to gain knowledge about caring and to form informal support networks, thereby improving their health. Seventy-nine informal carers accessed web-based information about caring and an e-based discussion forum via their personal computers. They were able to maintain contact with each other using a web camera and via normal group meetings. After the first 12 months, 17 informal carers participated in focus group interviews and completed a short questionnaire. Four staff members were also interviewed. Participant carers who had prior experience with a similar ICT-based support network reported greater satisfaction and more extensive use of the network than did participants with no such prior experience. Infrequent use of the service may be explained by there being too few other carers to identify with and by inappropriate recruitment procedures. Nevertheless, carers of disabled children reported that the intervention had resulted in improved services across the participating municipalities. To achieve the optimal effect of an ICT-based support network, due attention must be given to recruitment processes and to building the social environment, for which care practitioners require training and support.

  4. ℓ1-Regularized full-waveform inversion with prior model information based on orthant-wise limited memory quasi-Newton method

    NASA Astrophysics Data System (ADS)

    Dai, Meng-Xue; Chen, Jing-Bo; Cao, Jian

    2017-07-01

Full-waveform inversion (FWI) is an ill-posed optimization problem that is sensitive to noise and to the initial model. To alleviate the ill-posedness of the problem, regularization techniques are usually adopted. The ℓ1-norm penalty is a robust regularization method that preserves contrasts and edges. The Orthant-Wise Limited-Memory Quasi-Newton (OWL-QN) method extends the widely used limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method to ℓ1-regularized optimization problems and inherits the efficiency of L-BFGS. To take advantage of the ℓ1-regularized method and of the prior model information obtained from sonic logs and geological knowledge, we implement the OWL-QN algorithm in ℓ1-regularized FWI with prior model information in this paper. Numerical experiments show that this method not only improves the inversion results but is also strongly robust to noise.
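OWL-QN itself maintains L-BFGS curvature information restricted to an orthant, which is beyond a short sketch, but the core ℓ1 mechanism it exploits, shrinking a gradient step by soft-thresholding, can be illustrated with plain proximal gradient descent (ISTA) on a toy linear inverse problem. Everything here is synthetic; the FWI misfit is nonlinear, so this is only an analogy for how the ℓ1 penalty promotes sparse, edge-preserving model updates:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear inverse problem A m = d with a sparse true model,
# standing in for the (nonlinear) FWI data misfit.
A = rng.normal(size=(60, 100))
m_true = np.zeros(100)
m_true[[5, 37, 80]] = [1.0, -2.0, 1.5]
d = A @ m_true + 0.01 * rng.normal(size=60)

lam = 0.5                              # l1 regularization weight
step = 1.0 / np.linalg.norm(A, 2)**2   # step size from the spectral norm

def soft_threshold(x, t):
    """Proximal operator of t*||x||_1, the key l1 ingredient."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

m = np.zeros(100)
for _ in range(500):
    grad = A.T @ (A @ m - d)           # gradient of the data misfit
    m = soft_threshold(m - step * grad, step * lam)

print("nonzeros in recovered model:", np.count_nonzero(np.abs(m) > 1e-3))
```

The soft-thresholding step is what zeroes out coordinates with weak gradient support, which is the same effect the orthant restriction achieves inside OWL-QN.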

  5. 47 CFR 27.70 - Information exchange.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 2 2012-10-01 2012-10-01 false Information exchange. 27.70 Section 27.70... COMMUNICATIONS SERVICES Technical Standards § 27.70 Information exchange. (a) Prior notification. Public safety... information to the public safety licensee at least 10 business days before a new base or fixed station is...

  6. 47 CFR 27.70 - Information exchange.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 2 2013-10-01 2013-10-01 false Information exchange. 27.70 Section 27.70... COMMUNICATIONS SERVICES Technical Standards § 27.70 Information exchange. (a) Prior notification. Public safety... information to the public safety licensee at least 10 business days before a new base or fixed station is...

  7. 47 CFR 27.70 - Information exchange.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 2 2011-10-01 2011-10-01 false Information exchange. 27.70 Section 27.70... COMMUNICATIONS SERVICES Technical Standards § 27.70 Information exchange. (a) Prior notification. Public safety... information to the public safety licensee at least 10 business days before a new base or fixed station is...

  8. Estimating Tree Height-Diameter Models with the Bayesian Method

    PubMed Central

    Duan, Aiguo; Zhang, Jianguo; Xiang, Congwei

    2014-01-01

Six candidate height-diameter models were used to analyze the height-diameter relationships. The common methods for estimating height-diameter models take the classical (frequentist) approach based on the frequency interpretation of probability, for example the nonlinear least squares (NLS) and maximum likelihood (ML) methods. The Bayesian method has a distinct advantage over the classical approach in that the parameters to be estimated are regarded as random variables. In this study, the classical and Bayesian methods were used to estimate the six height-diameter models. Both methods identified the Weibull model as the “best” model using data1. In addition, based on the Weibull model, data2 was used to compare the Bayesian method with informative priors against the Bayesian method with uninformative priors and the classical method. The results showed that the improved prediction accuracy of the Bayesian method led to narrower confidence bands of the predicted values than those of the classical method, and the credible bands of the parameters with informative priors were also narrower than those with uninformative priors or from the classical method. The estimated posterior distributions of the parameters can be set as new priors when estimating the parameters using data2. PMID:24711733

  9. Estimating tree height-diameter models with the Bayesian method.

    PubMed

    Zhang, Xiongqing; Duan, Aiguo; Zhang, Jianguo; Xiang, Congwei

    2014-01-01

Six candidate height-diameter models were used to analyze the height-diameter relationships. The common methods for estimating height-diameter models take the classical (frequentist) approach based on the frequency interpretation of probability, for example the nonlinear least squares (NLS) and maximum likelihood (ML) methods. The Bayesian method has a distinct advantage over the classical approach in that the parameters to be estimated are regarded as random variables. In this study, the classical and Bayesian methods were used to estimate the six height-diameter models. Both methods identified the Weibull model as the "best" model using data1. In addition, based on the Weibull model, data2 was used to compare the Bayesian method with informative priors against the Bayesian method with uninformative priors and the classical method. The results showed that the improved prediction accuracy of the Bayesian method led to narrower confidence bands of the predicted values than those of the classical method, and the credible bands of the parameters with informative priors were also narrower than those with uninformative priors or from the classical method. The estimated posterior distributions of the parameters can be set as new priors when estimating the parameters using data2.
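The central point of the two records above, that informative priors yield narrower credible bands than uninformative ones, can be reproduced in miniature with a conjugate normal model. This is a stand-in for the paper's Weibull height-diameter model, with hypothetical numbers, not the authors' data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy data standing in for tree heights in metres.
y = rng.normal(loc=20.0, scale=3.0, size=15)
sigma2 = 9.0                      # data variance, assumed known

def posterior(mu0, tau2, y, sigma2):
    """Conjugate normal-normal update: posterior mean and sd
    for the mean, given prior N(mu0, tau2)."""
    n = len(y)
    prec = 1.0 / tau2 + n / sigma2
    mean = (mu0 / tau2 + y.sum() / sigma2) / prec
    return mean, np.sqrt(1.0 / prec)

m_inf, s_inf = posterior(mu0=20.0, tau2=1.0, y=y, sigma2=sigma2)   # informative
m_vag, s_vag = posterior(mu0=0.0, tau2=1e6, y=y, sigma2=sigma2)    # vague

print(f"informative prior: {m_inf:.2f} +/- {s_inf:.2f}")
print(f"vague prior:       {m_vag:.2f} +/- {s_vag:.2f}")
```

With the vague prior the posterior essentially reproduces the sample mean and its frequentist standard error; the informative prior adds precision, narrowing the credible band, which is the pattern the paper reports for its Weibull parameters.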

  10. Hierarchical Commensurate and Power Prior Models for Adaptive Incorporation of Historical Information in Clinical Trials

    PubMed Central

    Hobbs, Brian P.; Carlin, Bradley P.; Mandrekar, Sumithra J.; Sargent, Daniel J.

    2011-01-01

    Summary Bayesian clinical trial designs offer the possibility of a substantially reduced sample size, increased statistical power, and reductions in cost and ethical hazard. However when prior and current information conflict, Bayesian methods can lead to higher than expected Type I error, as well as the possibility of a costlier and lengthier trial. This motivates an investigation of the feasibility of hierarchical Bayesian methods for incorporating historical data that are adaptively robust to prior information that reveals itself to be inconsistent with the accumulating experimental data. In this paper, we present several models that allow for the commensurability of the information in the historical and current data to determine how much historical information is used. A primary tool is elaborating the traditional power prior approach based upon a measure of commensurability for Gaussian data. We compare the frequentist performance of several methods using simulations, and close with an example of a colon cancer trial that illustrates a linear models extension of our adaptive borrowing approach. Our proposed methods produce more precise estimates of the model parameters, in particular conferring statistical significance to the observed reduction in tumor size for the experimental regimen as compared to the control regimen. PMID:21361892
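A minimal sketch of the power prior idea that the paper elaborates: for a Gaussian mean with known variance and a flat initial prior, raising the historical likelihood to a power a0 in [0, 1] is equivalent to downweighting the historical sample size. All data below are synthetic, and the commensurability machinery of the paper (which chooses a0 adaptively) is not modeled here:

```python
import numpy as np

def power_prior_posterior(y_hist, y_curr, sigma2, a0):
    """Posterior mean and sd for a Gaussian mean (known variance),
    flat initial prior, historical likelihood raised to the power a0."""
    n0, n = len(y_hist), len(y_curr)
    eff = a0 * n0 + n                       # effective sample size
    mean = (a0 * np.sum(y_hist) + np.sum(y_curr)) / eff
    return mean, np.sqrt(sigma2 / eff)

rng = np.random.default_rng(2)
y_hist = rng.normal(1.0, 1.0, size=50)   # historical data (hypothetical)
y_curr = rng.normal(0.0, 1.0, size=20)   # current data, in conflict with history

for a0 in (0.0, 0.5, 1.0):
    m, s = power_prior_posterior(y_hist, y_curr, sigma2=1.0, a0=a0)
    print(f"a0={a0}: mean={m:.2f}, sd={s:.2f}")
```

At a0 = 0 the historical data are discarded; at a0 = 1 they are pooled at full weight, dragging the estimate toward the (conflicting) historical mean while shrinking its standard deviation. The trade-off between borrowed precision and bias under prior-data conflict is exactly what motivates the adaptive commensurate priors in the paper.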

  11. Investigating local controls on temporal stability of soil water content using sensor network data and an inverse modeling approach

    NASA Astrophysics Data System (ADS)

    Qu, W.; Bogena, H. R.; Huisman, J. A.; Martinez, G.; Pachepsky, Y. A.; Vereecken, H.

    2013-12-01

Soil water content is a key variable in the soil-vegetation-atmosphere continuum, with high spatial and temporal variability. Temporal stability of soil water content (SWC) has been observed in multiple monitoring studies, and quantifying the controls on soil moisture variability and temporal stability is of substantial interest. The objective of this work was to assess the effect of soil hydraulic parameters on temporal stability. Inverse modeling based on long SWC time series observed with an in-situ sensor network was used to estimate the van Genuchten-Mualem (VGM) soil hydraulic parameters in a small grassland catchment located in western Germany. For the inverse modeling, the shuffled complex evolution (SCE) optimization algorithm was coupled with the HYDRUS-1D code. We considered two cases: without and with prior information about the correlation between VGM parameters. The temporal stability of observed SWC was well pronounced at all observation depths. Both the spatial variability of SWC and the robustness of temporal stability increased with depth. Models calibrated both with and without prior information provided reasonable correspondence between simulated and measured SWC time series. Furthermore, we found a linear relationship between the mean relative difference (MRD) of SWC and the saturated SWC (θs). Also, the logarithm of saturated hydraulic conductivity (Ks), the VGM parameter n and the logarithm of α were strongly correlated with the MRD of saturation degree for the prior-information case, but no correlation was found for the case without prior information except at the 50 cm depth. Based on these results, we propose that establishing relationships between temporal stability and spatial variability of soil properties is a promising research avenue for better understanding the controls on soil moisture variability.
Figure: Correlation between the mean relative difference of soil water content (or saturation degree) and inversely estimated soil hydraulic parameters (log10(Ks), log10(α), n, and θs) at 5-cm, 20-cm and 50-cm depths. Solid circles represent parameters estimated using prior information; open circles represent parameters estimated without prior information.

  12. A cautionary note on Bayesian estimation of population size by removal sampling with diffuse priors.

    PubMed

    Bord, Séverine; Bioche, Christèle; Druilhet, Pierre

    2018-05-01

We consider the problem of estimating a population size by removal sampling when the sampling rate is unknown. Bayesian methods are now widespread and allow prior knowledge to be included in the analysis. However, we show that Bayes estimates based on default improper priors lead to improper posteriors or infinite estimates. Similarly, weakly informative priors give unstable estimators that are sensitive to the choice of hyperparameters. By examining the likelihood, we show that population size estimates can be stabilized by penalizing small values of the sampling rate or large values of the population size. Based on theoretical results and simulation studies, we propose some recommendations on the choice of the prior. Finally, we apply our results to real datasets. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Extraction of microseismic waveforms characteristics prior to rock burst using Hilbert-Huang transform

    NASA Astrophysics Data System (ADS)

    Li, Xuelong; Li, Zhonghui; Wang, Enyuan; Feng, Junjun; Chen, Liang; Li, Nan; Kong, Xiangguo

    2016-09-01

This study provides a new research idea concerning rock burst prediction. The characteristics of microseismic (MS) waveforms prior to and during rock bursts were studied with the Hilbert-Huang transform (HHT). To demonstrate the advantage of MS feature extraction based on HHT, the conventional analysis method (the Fourier transform) was also used for comparison. The results show that HHT is simple and reliable, and that it extracts in-depth information about the characteristics of MS waveforms. About 10 days prior to the rock burst, the main frequency of the MS waveforms shifts from high to low frequencies. Moreover, the waveform energy accumulates over the same period. Based on these results, it can be concluded that MS signal analysis through HHT can provide valuable information about coal and rock deformation and fracture.

  14. Influence of prior information on pain involves biased perceptual decision-making.

    PubMed

    Wiech, Katja; Vandekerckhove, Joachim; Zaman, Jonas; Tuerlinckx, Francis; Vlaeyen, Johan W S; Tracey, Irene

    2014-08-04

Prior information about features of a stimulus is a strong modulator of perception. For instance, the prospect of more intense pain leads to an increased perception of pain, whereas the expectation of analgesia reduces pain, as shown in placebo analgesia and expectancy modulations during drug administration. This influence is commonly assumed to be rooted in altered sensory processing, and expectancy-related modulations in the spinal cord are often taken as evidence for this notion. Contemporary models of perception, however, suggest that prior information can also modulate perception by biasing perceptual decision-making, the inferential process underlying perception in which prior information is used to interpret sensory information. In this type of bias, the information is already present in the system before the stimulus is observed. Computational models can distinguish between changes in sensory processing and altered decision-making because the two result in different response times for incorrect choices in a perceptual decision-making task (Figure S1A,B). Using a drift-diffusion model, we investigated the influence of both processes in two independent experiments. The results of both experiments strongly suggest that these changes in pain perception are predominantly based on altered perceptual decision-making. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
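The distinction drawn above, prior information acting as a pre-stimulus decision bias versus altered sensory evidence, can be illustrated by simulating a two-boundary diffusion process. Both manipulations favour the same response, but they shape the incorrect-response times differently, which is the signature the drift-diffusion analysis exploits. The parameters below are illustrative, not those fitted in the study:

```python
import numpy as np

rng = np.random.default_rng(3)

def simulate_ddm(drift, start, a=1.0, dt=2e-3, sigma=1.0, n=400):
    """Simulate first-passage times of a drift-diffusion process
    between absorbing boundaries at 0 and a.
    Returns boolean choices (True = upper boundary) and RTs."""
    choices, rts = [], []
    sd_step = sigma * np.sqrt(dt)
    for _ in range(n):
        x, t = start, 0.0
        while 0.0 < x < a:
            x += drift * dt + sd_step * rng.standard_normal()
            t += dt
        choices.append(x >= a)
        rts.append(t)
    return np.array(choices), np.array(rts)

# (1) Decision bias: the starting point is shifted toward the upper
#     boundary before any evidence arrives.
# (2) Sensory change: the drift rate toward the upper boundary grows.
c_bias, rt_bias = simulate_ddm(drift=1.0, start=0.65)   # starting-point bias
c_drift, rt_drift = simulate_ddm(drift=2.0, start=0.5)  # drift-rate change

for name, c, rt in [("start bias", c_bias, rt_bias),
                    ("drift change", c_drift, rt_drift)]:
    print(f"{name}: P(upper)={c.mean():.2f}, "
          f"mean error RT={rt[~c].mean():.3f} s")
```

Both manipulations raise the probability of the upper response, but the error response-time distributions they produce differ, which is how model fitting can attribute an expectancy effect to one mechanism or the other.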

  15. RC-MAPS: Bridging the Comprehension Gap in EAP Reading

    ERIC Educational Resources Information Center

    Sterzik, Angela Meyer; Fraser, Carol

    2012-01-01

    In academic environments, reading is assigned not simply to transmit information; students are required to take the information, and based on the task set by the instructor, assess, analyze, and critique it on the basis of personal experiences, prior knowledge, and other readings (Grabe, 2009). Thus text-based comprehension (Kintsch, 1998) alone…

  16. Improved patch-based learning for image deblurring

    NASA Astrophysics Data System (ADS)

    Dong, Bo; Jiang, Zhiguo; Zhang, Haopeng

    2015-05-01

Most recent image deblurring methods use only the valid information found in the input image as the clue for restoring the blurred region. These methods usually suffer from insufficient prior information and relatively poor adaptiveness. The patch-based method uses not only the valid information of the input image itself but also the prior information of sample images to improve adaptiveness. However, the cost function of this method is quite time-consuming, and the method may also produce ringing artifacts. In this paper, we propose an improved non-blind deblurring algorithm based on learning patch likelihoods. On one hand, we consider the effect of the Gaussian mixture model components with different weights and normalize the weight values, which optimizes the cost function and reduces the running time. On the other hand, a post-processing method is proposed to remove the ringing artifacts produced by the traditional patch-based method. Extensive experiments were performed. The experimental results verify that our method effectively reduces the execution time, suppresses ringing artifacts, and preserves the quality of the deblurred image.

  17. Incorporation of prior information on parameters into nonlinear regression groundwater flow models: 1. Theory

    USGS Publications Warehouse

    Cooley, Richard L.

    1982-01-01

    Prior information on the parameters of a groundwater flow model can be used to improve parameter estimates obtained from nonlinear regression solution of a modeling problem. Two scales of prior information can be available: (1) prior information having known reliability (that is, bias and random error structure) and (2) prior information consisting of best available estimates of unknown reliability. A regression method that incorporates the second scale of prior information assumes the prior information to be fixed for any particular analysis to produce improved, although biased, parameter estimates. Approximate optimization of two auxiliary parameters of the formulation is used to help minimize the bias, which is almost always much smaller than that resulting from standard ridge regression. It is shown that if both scales of prior information are available, then a combined regression analysis may be made.

  18. A Web Browser Interface to Manage the Searching and Organizing of Information on the Web by Learners

    ERIC Educational Resources Information Center

    Li, Liang-Yi; Chen, Gwo-Dong

    2010-01-01

    Information Gathering is a knowledge construction process. Web learners make a plan for their Information Gathering task based on their prior knowledge. The plan is evolved with new information encountered and their mental model is constructed through continuously assimilating and accommodating new information gathered from different Web pages. In…

  19. SciRide Finder: a citation-based paradigm in biomedical literature search.

    PubMed

    Volanakis, Adam; Krawczyk, Konrad

    2018-04-18

There are more than 26 million peer-reviewed biomedical research items according to Medline/PubMed. This breadth of information is indicative of the progress in biomedical sciences on one hand, but an overload for scientists performing literature searches on the other. A major portion of scientific literature search is to find statements, numbers and protocols that can be cited to build an evidence-based narrative for a new manuscript. Because science builds on prior knowledge, such information has likely been written out and cited in an older manuscript. Thus, Cited Statements, pieces of text from scientific literature supported by citing other peer-reviewed publications, carry a significant amount of condensed information on prior art. Based on this principle, we propose a literature search service, SciRide Finder (finder.sciride.org), which constrains the search corpus to such Cited Statements only. We demonstrate that Cited Statements can carry different information from that found in titles/abstracts and full text, giving access to alternative literature search results than those of traditional search engines. We further show how presenting search results as a list of Cited Statements allows researchers to easily find information to build an evidence-based narrative for their own manuscripts.

  20. Inferring metabolic networks using the Bayesian adaptive graphical lasso with informative priors.

    PubMed

    Peterson, Christine; Vannucci, Marina; Karakas, Cemal; Choi, William; Ma, Lihua; Maletić-Savatić, Mirjana

    2013-10-01

    Metabolic processes are essential for cellular function and survival. We are interested in inferring a metabolic network in activated microglia, a major neuroimmune cell in the brain responsible for the neuroinflammation associated with neurological diseases, based on a set of quantified metabolites. To achieve this, we apply the Bayesian adaptive graphical lasso with informative priors that incorporate known relationships between covariates. To encourage sparsity, the Bayesian graphical lasso places double exponential priors on the off-diagonal entries of the precision matrix. The Bayesian adaptive graphical lasso allows each double exponential prior to have a unique shrinkage parameter. These shrinkage parameters share a common gamma hyperprior. We extend this model to create an informative prior structure by formulating tailored hyperpriors on the shrinkage parameters. By choosing parameter values for each hyperprior that shift probability mass toward zero for nodes that are close together in a reference network, we encourage edges between covariates with known relationships. This approach can improve the reliability of network inference when the sample size is small relative to the number of parameters to be estimated. When applied to the data on activated microglia, the inferred network includes both known relationships and associations of potential interest for further investigation.

  1. Inferring metabolic networks using the Bayesian adaptive graphical lasso with informative priors

    PubMed Central

    PETERSON, CHRISTINE; VANNUCCI, MARINA; KARAKAS, CEMAL; CHOI, WILLIAM; MA, LIHUA; MALETIĆ-SAVATIĆ, MIRJANA

    2014-01-01

    Metabolic processes are essential for cellular function and survival. We are interested in inferring a metabolic network in activated microglia, a major neuroimmune cell in the brain responsible for the neuroinflammation associated with neurological diseases, based on a set of quantified metabolites. To achieve this, we apply the Bayesian adaptive graphical lasso with informative priors that incorporate known relationships between covariates. To encourage sparsity, the Bayesian graphical lasso places double exponential priors on the off-diagonal entries of the precision matrix. The Bayesian adaptive graphical lasso allows each double exponential prior to have a unique shrinkage parameter. These shrinkage parameters share a common gamma hyperprior. We extend this model to create an informative prior structure by formulating tailored hyperpriors on the shrinkage parameters. By choosing parameter values for each hyperprior that shift probability mass toward zero for nodes that are close together in a reference network, we encourage edges between covariates with known relationships. This approach can improve the reliability of network inference when the sample size is small relative to the number of parameters to be estimated. When applied to the data on activated microglia, the inferred network includes both known relationships and associations of potential interest for further investigation. PMID:24533172

  2. 78 FR 9391 - Agency Information Collection Activities; Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-02-08

    ... extend the existing PRA clearance for the information collection requirements associated with the... burden of the FCLCA and Rule based on its knowledge of, and information from, the eye care industry... party prescriber. No substantive provisions in the Rule have been amended or changed since staff's prior...

  3. Imaging performance of a hybrid x-ray computed tomography-fluorescence molecular tomography system using priors.

    PubMed

    Ale, Angelique; Schulz, Ralf B; Sarantopoulos, Athanasios; Ntziachristos, Vasilis

    2010-05-01

We study the performance of two newly introduced and previously suggested methods that incorporate priors into the inversion schemes associated with data from a recently developed hybrid x-ray computed tomography and fluorescence molecular tomography system, the latter based on CCD camera photon detection. The unique data set studied comprises accurately registered, highly spatially sampled photon fields propagating through tissue along 360-degree projections. Structural prior information was included in the inverse problem by adding a penalty term to the minimization function used for image reconstruction. Results from simulated and experimental data from a lung inflammation animal model were compared with each other and against inversions performed without priors. The importance of using priors over stand-alone inversions is also showcased with highly spatially sampled simulated and experimental data, and the approach offering optimal performance in resolving fluorescent biodistribution in small animals is discussed. Inclusion of prior information from x-ray CT data in the reconstruction of the fluorescence biodistribution leads to improved agreement between the reconstruction and validation images for both simulated and experimental data.

  4. On selecting a prior for the precision parameter of Dirichlet process mixture models

    USGS Publications Warehouse

    Dorazio, R.M.

    2009-01-01

In hierarchical mixture models the Dirichlet process is used to specify latent patterns of heterogeneity, particularly when the distribution of latent parameters is thought to be clustered (multimodal). The parameters of a Dirichlet process include a precision parameter α and a base probability measure G0. In problems where α is unknown and must be estimated, inferences about the level of clustering can be sensitive to the choice of prior assumed for α. In this paper an approach is developed for computing a prior for the precision parameter α that can be used in the presence or absence of prior information about the level of clustering. This approach is illustrated in an analysis of counts of stream fishes. The results of this fully Bayesian analysis are compared with an empirical Bayes analysis of the same data and with a Bayesian analysis based on an alternative commonly used prior.

  5. Minimally Informative Prior Distributions for PSA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dana L. Kelly; Robert W. Youngblood; Kurt G. Vedros

    2010-06-01

A salient feature of Bayesian inference is its ability to incorporate information from a variety of sources into the inference model, via the prior distribution (hereafter simply “the prior”). However, over-reliance on old information can lead to priors that dominate new data. Some analysts seek to avoid this by trying to work with a minimally informative prior distribution. Another reason for choosing a minimally informative prior is to avoid the often-voiced criticism of subjectivity in the choice of prior. Minimally informative priors fall into two broad classes: 1) so-called noninformative priors, which attempt to be completely objective, in that the posterior distribution is determined as completely as possible by the observed data, the most well known example in this class being the Jeffreys prior, and 2) priors that are diffuse over the region where the likelihood function is nonnegligible, but that incorporate some information about the parameters being estimated, such as a mean value. In this paper, we compare four approaches in the second class, with respect to their practical implications for Bayesian inference in Probabilistic Safety Assessment (PSA). The most commonly used such prior, the so-called constrained noninformative prior, is a special case of the maximum entropy prior. This is formulated as a conjugate distribution for the most commonly encountered aleatory models in PSA, and is correspondingly mathematically convenient; however, it has a relatively light tail and this can cause the posterior mean to be overly influenced by the prior in updates with sparse data. A more informative prior that is capable, in principle, of dealing more effectively with sparse data is a mixture of conjugate priors. A particular diffuse nonconjugate prior, the logistic-normal, is shown to behave similarly for some purposes. Finally, we review the so-called robust prior. Rather than relying on the mathematical abstraction of entropy, as does the constrained noninformative prior, the robust prior places a heavy-tailed Cauchy prior on the canonical parameter of the aleatory model.
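For the binomial aleatory model common in PSA, the Jeffreys prior mentioned above is Beta(1/2, 1/2), and the posterior is available in closed form. A short sketch with hypothetical sparse data (the specific counts are illustrative, not from the paper):

```python
# Jeffreys prior for a binomial failure probability p is Beta(1/2, 1/2).
# With x failures in n demands, the posterior is Beta(x + 1/2, n - x + 1/2).
def jeffreys_posterior(x, n):
    """Posterior mean and variance of p under the Jeffreys prior."""
    a, b = x + 0.5, n - x + 0.5
    mean = a / (a + b)
    var = a * b / ((a + b) ** 2 * (a + b + 1))
    return mean, var

# Sparse PSA-style data (hypothetical): no failures in 10 demands.
mean, var = jeffreys_posterior(x=0, n=10)
print(f"posterior mean = {mean:.4f}, sd = {var ** 0.5:.4f}")
```

Even with zero observed failures the posterior mean stays strictly positive (0.5 / 11 here), illustrating why the choice of tail behavior in such priors matters when data are sparse.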

  6. Generalized multiple kernel learning with data-dependent priors.

    PubMed

    Mao, Qi; Tsang, Ivor W; Gao, Shenghua; Wang, Li

    2015-06-01

    Multiple kernel learning (MKL) and classifier ensemble are two mainstream methods for solving learning problems in which some sets of features/views are more informative than others, or the features/views within a given set are inconsistent. In this paper, we first present a novel probabilistic interpretation of MKL such that maximum entropy discrimination with a noninformative prior over multiple views is equivalent to the formulation of MKL. Instead of using the noninformative prior, we introduce a novel data-dependent prior based on an ensemble of kernel predictors, which enhances the prediction performance of MKL by leveraging the merits of the classifier ensemble. With the proposed probabilistic framework of MKL, we propose a hierarchical Bayesian model to learn the proposed data-dependent prior and classification model simultaneously. The resultant problem is convex and other information (e.g., instances with either missing views or missing labels) can be seamlessly incorporated into the data-dependent priors. Furthermore, a variety of existing MKL models can be recovered under the proposed MKL framework and can be readily extended to incorporate these priors. Extensive experiments demonstrate the benefits of our proposed framework in supervised and semisupervised settings, as well as in tasks with partial correspondence among multiple views.

  7. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  8. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  9. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  10. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  11. 6 CFR 37.45 - Background checks for covered employees.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ..., the validation of references from prior employment, a name-based and fingerprint-based criminal.... States must conduct a name-based and fingerprint-based criminal history records check (CHRC) using, at a minimum, the FBI's National Crime Information Center (NCIC) and the Integrated Automated Fingerprint...

  12. XID+: Next generation XID development

    NASA Astrophysics Data System (ADS)

    Hurley, Peter

    2017-04-01

    XID+ is a prior-based source extraction tool which carries out photometry in the Herschel SPIRE (Spectral and Photometric Imaging Receiver) maps at the positions of known sources. It uses a probabilistic Bayesian framework that provides a natural framework in which to include prior information, and uses the Bayesian inference tool Stan to obtain the full posterior probability distribution on flux estimates.

  13. Low dose CBCT reconstruction via prior contour based total variation (PCTV) regularization: a feasibility study

    NASA Astrophysics Data System (ADS)

    Chen, Yingxuan; Yin, Fang-Fang; Zhang, Yawei; Zhang, You; Ren, Lei

    2018-04-01

    Purpose: Compressed sensing reconstruction using total variation (TV) tends to over-smooth edge information by uniformly penalizing the image gradient. The goal of this study is to develop a novel prior contour based TV (PCTV) method to enhance the edge information in compressed sensing reconstruction for CBCT. Methods: The edge information is extracted from the prior planning-CT via edge detection. The prior CT is first registered with on-board CBCT reconstructed with the TV method through rigid or deformable registration. The edge contours in the prior CT are then mapped to CBCT and used as the weight map for TV regularization to enhance edge information in CBCT reconstruction. The PCTV method was evaluated using the extended-cardiac-torso (XCAT) phantom, the physical CatPhan phantom and brain patient data. Results were compared with both the TV and edge-preserving TV (EPTV) methods, which are commonly used for limited-projection CBCT reconstruction. In the quantitative evaluation, relative error was used to calculate pixel value differences, and edge cross correlation was defined as the similarity of edge information between reconstructed images and ground truth. Results: Compared to TV and EPTV, PCTV enhanced the edge information of bone, lung vessels and tumor in the XCAT reconstruction and of complex bony structures in the brain patient CBCT. In the XCAT study using 45 half-fan CBCT projections, compared with ground truth, relative errors were 1.5%, 0.7% and 0.3% and edge cross correlations were 0.66, 0.72 and 0.78 for TV, EPTV and PCTV, respectively. PCTV is more robust to projection number reduction. Edge enhancement was reduced slightly with noisy projections, but PCTV was still superior to the other methods. PCTV can maintain resolution while reducing noise in the low-mAs CatPhan reconstruction. Low-contrast edges were preserved better with PCTV than with TV and EPTV. 
Conclusion: PCTV preserved edge information as well as reduced streak artifacts and noise in low dose CBCT reconstruction. PCTV is superior to TV and EPTV methods in edge enhancement, which can potentially improve the localization accuracy in radiation therapy.
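At its core, the PCTV idea is to spatially weight the TV penalty so that gradients across prior-identified edge contours are penalized less than elsewhere. A one-dimensional sketch with hypothetical weights (not the paper's reconstruction algorithm):

```python
# Weighted total-variation sketch: down-weight the TV penalty where the
# prior CT indicates an edge, so known edges are penalized less than in
# plain TV. The signal and weights below are illustrative only.

def weighted_tv(signal, weights):
    # Sum of weighted absolute gradients between neighbouring samples.
    return sum(w * abs(b - a) for a, b, w in zip(signal, signal[1:], weights))

signal = [0.0, 0.0, 0.0, 1.0, 1.0, 1.0]   # a clean edge between indices 2 and 3
uniform = [1.0] * 5                        # plain TV: every gradient weighted equally
edge_aware = [1.0, 1.0, 0.1, 1.0, 1.0]    # low weight across the known edge

plain_penalty = weighted_tv(signal, uniform)       # penalizes the edge fully
pctv_penalty = weighted_tv(signal, edge_aware)     # penalizes it far less
```

Minimizing a data-fit term plus `pctv_penalty`-style regularization preserves the mapped edges that plain TV would smooth away.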

  14. Exploring patterns in resource utilization prior to the formal identification of homelessness in recently returned veterans.

    PubMed

    Gundlapalli, Adi V; Redd, Andrew; Carter, Marjorie E; Palmer, Miland; Peterson, Rachel; Samore, Matthew H

    2014-01-01

    There are limited data on resources utilized by US Veterans prior to their identification as being homeless. We performed visual analytics on longitudinal medical encounter data prior to the official recognition of homelessness in a large cohort of OEF/OIF Veterans. A statistically significant increase in numbers of several categories of visits in the immediate 30 days prior to the recognition of homelessness was noted as compared to an earlier period. This finding has the potential to inform prediction algorithms based on structured data with a view to intervention and mitigation of homelessness among Veterans.

  15. Effects of model complexity and priors on estimation using sequential importance sampling/resampling for species conservation

    USGS Publications Warehouse

    Dunham, Kylee; Grand, James B.

    2016-01-01

    We examined the effects of complexity and priors on the accuracy of models used to estimate ecological and observational processes, and to make predictions regarding population size and structure. State-space models are useful for estimating complex, unobservable population processes and making predictions about future populations based on limited data. To better understand the utility of state space models in evaluating population dynamics, we used them in a Bayesian framework and compared the accuracy of models with differing complexity, with and without informative priors using sequential importance sampling/resampling (SISR). Count data were simulated for 25 years using known parameters and observation process for each model. We used kernel smoothing to reduce the effect of particle depletion, which is common when estimating both states and parameters with SISR. Models using informative priors estimated parameter values and population size with greater accuracy than their non-informative counterparts. While the estimates of population size and trend did not suffer greatly in models using non-informative priors, the algorithm was unable to accurately estimate demographic parameters. This model framework provides reasonable estimates of population size when little to no information is available; however, when information on some vital rates is available, SISR can be used to obtain more precise estimates of population size and process. Incorporating model complexity such as that required by structured populations with stage-specific vital rates affects precision and accuracy when estimating latent population variables and predicting population dynamics. These results are important to consider when designing monitoring programs and conservation efforts requiring management of specific population segments.
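The SISR scheme described above can be sketched for a toy scalar population model. Everything here (the model, noise level, priors, and jitter scale) is an illustrative assumption rather than the authors' implementation; the small post-resampling jitter plays the role of the kernel smoothing mentioned in the abstract:

```python
import math
import random
import statistics

random.seed(1)

# SISR sketch on a toy state-space model N_t = lam * N_{t-1} with
# Gaussian count noise, comparing an informative and a diffuse prior
# on the unknown growth rate lam.

TRUE_LAM, N0, OBS_SD = 1.05, 100.0, 5.0

counts, n = [], N0
for _ in range(25):
    n *= TRUE_LAM
    counts.append(random.gauss(n, OBS_SD))

def normal_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def sisr(prior_draw, n_particles=2000):
    parts = [{"lam": prior_draw(), "n": N0} for _ in range(n_particles)]
    for y in counts:
        for p in parts:
            p["n"] *= p["lam"]                        # propagate the state
        weights = [normal_pdf(y, p["n"], OBS_SD) for p in parts]
        parts = random.choices(parts, weights=weights, k=n_particles)
        parts = [{"lam": p["lam"] + random.gauss(0, 0.001), "n": p["n"]}
                 for p in parts]                      # jitter against depletion
    return statistics.mean(p["lam"] for p in parts)

est_informative = sisr(lambda: random.gauss(1.05, 0.02))
est_diffuse = sisr(lambda: random.uniform(0.5, 1.5))
```

With 25 informative observations both priors recover the growth rate here; the abstract's point, which this scalar toy cannot show, is that diffuse priors degrade the estimation of the underlying demographic parameters in richer structured models.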

  16. Robust Bayesian hypocentre and uncertainty region estimation: the effect of heavy-tailed distributions and prior information in cases with poor, inconsistent and insufficient arrival times

    NASA Astrophysics Data System (ADS)

    Martinsson, J.

    2013-03-01

    We propose methods for robust Bayesian inference of the hypocentre in presence of poor, inconsistent and insufficient phase arrival times. The objectives are to increase the robustness, the accuracy and the precision by introducing heavy-tailed distributions and an informative prior distribution of the seismicity. The effects of the proposed distributions are studied under real measurement conditions in two underground mine networks and validated using 53 blasts with known hypocentres. To increase the robustness against poor, inconsistent or insufficient arrivals, a Gaussian Mixture Model is used as a hypocentre prior distribution to describe the seismically active areas, where the parameters are estimated based on previously located events in the region. The prior is truncated to constrain the solution to valid geometries, for example below the ground surface, excluding known cavities, voids and fractured zones. To reduce the sensitivity to outliers, different heavy-tailed distributions are evaluated to model the likelihood distribution of the arrivals given the hypocentre and the origin time. Among these distributions, the multivariate t-distribution is shown to produce the overall best performance, where the tail-mass adapts to the observed data. Hypocentre and uncertainty region estimates are based on simulations from the posterior distribution using Markov Chain Monte Carlo techniques. Velocity graphs (equivalent to traveltime graphs) are estimated using blasts from known locations, and applied to reduce the main uncertainties and thereby the final estimation error. To focus on the behaviour and the performance of the proposed distributions, a basic single-event Bayesian procedure is considered in this study for clarity. 
Estimation results are shown with different distributions, with and without prior distribution of seismicity, with wrong prior distribution, with and without error compensation, with and without error description, with insufficient arrival times and in presence of significant outliers. A particular focus is on visual results and comparisons to give a better understanding of the Bayesian advantage and to show the effects of heavy-tailed distributions and informative prior information on real data.

  17. Implicit Priors in Galaxy Cluster Mass and Scaling Relation Determinations

    NASA Technical Reports Server (NTRS)

    Mantz, A.; Allen, S. W.

    2011-01-01

    Deriving the total masses of galaxy clusters from observations of the intracluster medium (ICM) generally requires some prior information, in addition to the assumptions of hydrostatic equilibrium and spherical symmetry. Often, this information takes the form of particular parametrized functions used to describe the cluster gas density and temperature profiles. In this paper, we investigate the implicit priors on hydrostatic masses that result from this fully parametric approach, and the implications of such priors for scaling relations formed from those masses. We show that the application of such fully parametric models of the ICM naturally imposes a prior on the slopes of the derived scaling relations, favoring the self-similar model, and argue that this prior may be influential in practice. In contrast, this bias does not exist for techniques which adopt an explicit prior on the form of the mass profile but describe the ICM non-parametrically. Constraints on the slope of the cluster mass-temperature relation in the literature show a separation based on the approach employed, with the results from fully parametric ICM modeling clustering nearer the self-similar value. Given that a primary goal of scaling relation analyses is to test the self-similar model, the application of methods subject to strong, implicit priors should be avoided. Alternative methods and best practices are discussed.

  18. Estimation of the limit of detection using information theory measures.

    PubMed

    Fonollosa, Jordi; Vergara, Alexander; Huerta, Ramón; Marco, Santiago

    2014-01-31

    Definitions of the limit of detection (LOD) based on the probability of false positive and/or false negative errors have been proposed over the past years. Although such definitions are straightforward and valid for any kind of analytical system, proposed methodologies to estimate the LOD are usually simplified to signals with Gaussian noise. Additionally, there is a general misconception that two systems with the same LOD provide the same amount of information on the source regardless of the prior probability of presenting a blank/analyte sample. Based upon an analogy between an analytical system and a binary communication channel, in this paper we show that the amount of information that can be extracted from an analytical system depends on the probability of presenting the two different possible states. We propose a new definition of LOD utilizing information theory tools that deals with noise of any kind and allows the introduction of prior knowledge easily. Unlike most traditional LOD estimation approaches, the proposed definition is based on the amount of information that the chemical instrumentation system provides on the chemical information source. Our findings indicate that the benchmark of analytical systems based on the ability to provide information about the presence/absence of the analyte (our proposed approach) is a more general and proper framework, while converging to the usual values when dealing with Gaussian noise. Copyright © 2013 Elsevier B.V. All rights reserved.
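The binary-channel analogy above is easy to make concrete: treat the source as a Bernoulli variable with prior p(analyte present), the detector output as a channel corrupted by false-positive rate alpha and false-negative rate beta, and compute the mutual information between them. This is a sketch of the analogy only, not the paper's full LOD definition:

```python
import math

# Mutual information of an analytical system modeled as a binary channel:
# X = analyte present (prior p), Y = detector fires, with false-positive
# rate alpha and false-negative rate beta.

def h2(p):
    # Binary entropy in bits.
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(p, alpha, beta):
    py1 = p * (1 - beta) + (1 - p) * alpha      # P(Y = 1)
    # I(X;Y) = H(Y) - H(Y|X)
    return h2(py1) - (p * h2(beta) + (1 - p) * h2(alpha))

# Same error rates, different priors: the extractable information differs,
# which is exactly the misconception the abstract targets.
balanced = mutual_information(0.5, 0.05, 0.05)
rare = mutual_information(0.01, 0.05, 0.05)
```

Two systems with identical alpha and beta (and hence, under classical definitions, the same LOD) convey very different amounts of information when blanks vastly outnumber analyte samples.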

  19. The Role of Exploratory Talk in Classroom Search Engine Tasks

    ERIC Educational Resources Information Center

    Knight, Simon; Mercer, Neil

    2015-01-01

    While search engines are commonly used by children to find information, and in classroom-based activities, children are not adept in their information seeking or evaluation of information sources. Prior work has explored such activities in isolated, individual contexts, failing to account for the collaborative, discourse-mediated nature of search…

  20. Comparison of three instructional strategies in food and nutrition education: developing a diet plan for a diabetic case

    NASA Astrophysics Data System (ADS)

    Darabi, Aubteen; Pourafshar, Shirin; Suryavanshi, Rinki; 'Logan' Arrington, Thomas

    2016-05-01

    This study examines the performance of dietitians-in-training on developing a diet plan for a diabetic patient either independently or after peer discussion. Participants (n = 58) from an undergraduate program in food and nutrition were divided into two groups based on their prior knowledge before being randomly assigned into three conditions: (1) peer discussion with just-in-time information (JIT information), (2) peer discussion without JIT information, and (3) independent performers. The learners' performance in the three conditions was analyzed. The results presented here describe the role of prior knowledge and JIT information across the conditions and the interaction of the two factors as well as the instructional implications of the findings.

  1. Adaptive Markov Random Fields for Example-Based Super-resolution of Faces

    NASA Astrophysics Data System (ADS)

    Stephenson, Todd A.; Chen, Tsuhan

    2006-12-01

    Image enhancement of low-resolution images can be done through methods such as interpolation, super-resolution using multiple video frames, and example-based super-resolution. Example-based super-resolution, in particular, is suited to images that have a strong prior (for those frameworks that work on only a single image, it is more like image restoration than traditional, multiframe super-resolution). For example, hallucination and Markov random field (MRF) methods use examples drawn from the same domain as the image being enhanced to determine what the missing high-frequency information is likely to be. We propose to use even stronger prior information by extending MRF-based super-resolution to use adaptive observation and transition functions, that is, to make these functions region-dependent. We show with face images how we can adapt the modeling for each image patch so as to improve the resolution.

  2. Balancing the Role of Priors in Multi-Observer Segmentation Evaluation

    PubMed Central

    Huang, Xiaolei; Wang, Wei; Lopresti, Daniel; Long, Rodney; Antani, Sameer; Xue, Zhiyun; Thoma, George

    2009-01-01

    Comparison of a group of multiple observer segmentations is known to be a challenging problem. A good segmentation evaluation method would allow different segmentations not only to be compared, but to be combined to generate a “true” segmentation with higher consensus. Numerous multi-observer segmentation evaluation approaches have been proposed in the literature; STAPLE, in particular, probabilistically estimates the true segmentation by optimally combining observed segmentations with a prior model of the truth. As an Expectation–Maximization (EM) algorithm, STAPLE’s convergence to the desired local minimum depends on good initializations for the truth prior and the observer-performance prior. However, accurate modeling of the initial truth prior is nontrivial. Moreover, of the two priors, the truth prior always dominates, so in certain scenarios where meaningful observer-performance priors are available, STAPLE cannot take advantage of that information. In this paper, we propose a Bayesian decision formulation of the problem that permits the two types of prior knowledge to be integrated in a complementary manner in four cases with differing application purposes: (1) with known truth prior; (2) with observer prior; (3) with neither truth prior nor observer prior; and (4) with both truth prior and observer prior. The third and fourth cases are not discussed (or are effectively ignored) by STAPLE, and we propose a new method to combine multiple-observer segmentations based on the maximum a posteriori (MAP) principle, which respects the observer prior regardless of the availability of the truth prior. Based on the four scenarios, we have developed a web-based software application that implements the flexible segmentation evaluation framework for digitized uterine cervix images. 
Experimental results show that our framework has flexibility in effectively integrating different priors for multi-observer segmentation evaluation, and it generates results that compare favorably with those of the STAPLE algorithm and the Majority Vote Rule. PMID:20523759
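A minimal sketch of the MAP fusion idea in case (4) above, where both a truth prior and observer-performance priors are available: per pixel, the fused label is the truth value with the larger posterior. The performance numbers below are hypothetical, not from the paper:

```python
# MAP fusion of multi-observer binary labels: combine per-pixel observer
# votes using observer-performance priors (sensitivity, specificity) and
# a truth prior. A reliable observer can legitimately outvote a majority
# of unreliable ones, unlike the plain Majority Vote Rule.

def map_label(votes, perf, truth_prior=0.5):
    # votes: list of 0/1 labels; perf: list of (sensitivity, specificity)
    p1 = truth_prior        # running joint probability for truth = 1
    p0 = 1.0 - truth_prior  # running joint probability for truth = 0
    for d, (sens, spec) in zip(votes, perf):
        p1 *= sens if d == 1 else (1 - sens)
        p0 *= (1 - spec) if d == 1 else spec
    return 1 if p1 >= p0 else 0

# Two unreliable observers say 1; one highly reliable observer says 0.
perf = [(0.6, 0.6), (0.6, 0.6), (0.99, 0.99)]
fused = map_label([1, 1, 0], perf)  # the reliable observer wins
```

Here majority voting would output 1, while the MAP rule, respecting the observer prior, outputs 0.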

  3. Influences of Source-Item Contingency and Schematic Knowledge on Source Monitoring: Tests of the Probability-Matching Account

    PubMed Central

    Bayen, Ute J.; Kuhlmann, Beatrice G.

    2010-01-01

    The authors investigated conditions under which judgments in source-monitoring tasks are influenced by prior schematic knowledge. According to a probability-matching account of source guessing (Spaniol & Bayen, 2002), when people do not remember the source of information, they match source guessing probabilities to the perceived contingency between sources and item types. When they do not have a representation of a contingency, they base their guesses on prior schematic knowledge. The authors provide support for this account in two experiments with sources presenting information that was expected for one source and somewhat unexpected for another. Schema-relevant information about the sources was provided at the time of encoding. When contingency perception was impeded by dividing attention, participants showed schema-based guessing (Experiment 1). Manipulating source-item contingency also affected guessing (Experiment 2). When this contingency was schema-inconsistent, it superseded schema-based expectations and led to schema-inconsistent guessing. PMID:21603251
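The expected-accuracy arithmetic behind the probability-matching account is easy to make concrete: if source A truly produced a fraction p of the items, matching yields p^2 + (1-p)^2 correct guesses on average, while always guessing the majority source yields max(p, 1-p). A small sketch of that logic (the experiments themselves measure guessing behaviour, not accuracy):

```python
# Expected guessing accuracy when the true source-item contingency is p:
# matching the contingency vs. always guessing the more likely source.

def matching_accuracy(p):
    # Guess "A" with probability p, while "A" is correct with probability p.
    return p * p + (1 - p) * (1 - p)

def maximizing_accuracy(p):
    return max(p, 1 - p)

rows = [(p, matching_accuracy(p), maximizing_accuracy(p))
        for p in (0.5, 0.7, 0.9)]
```

For any p other than 0.5 (or 0 or 1), matching is strictly less accurate than maximizing, which is why probability matching is a behavioural signature rather than an optimal strategy.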

  4. Addressing potential prior-data conflict when using informative priors in proof-of-concept studies.

    PubMed

    Mutsvari, Timothy; Tytgat, Dominique; Walley, Rosalind

    2016-01-01

    Bayesian methods are increasingly used in proof-of-concept studies. An important benefit of these methods is the potential to use informative priors, thereby reducing sample size. This is particularly relevant for treatment arms where there is a substantial amount of historical information such as placebo and active comparators. One issue with using an informative prior is the possibility of a mismatch between the informative prior and the observed data, referred to as prior-data conflict. We focus on two methods for dealing with this: a testing approach and a mixture prior approach. The testing approach assesses prior-data conflict by comparing the observed data to the prior predictive distribution and resorting to a non-informative prior if prior-data conflict is declared. The mixture prior approach uses a prior with a precise and diffuse component. We assess these approaches for the normal case via simulation and show they have some attractive features as compared with the standard one-component informative prior. For example, when the discrepancy between the prior and the data is sufficiently marked, and intuitively, one feels less certain about the results, both the testing and mixture approaches typically yield wider posterior credible intervals than when there is no discrepancy. In contrast, when there is no discrepancy, the results of these approaches are typically similar to the standard approach. Whilst for any specific study the operating characteristics of any selected approach should be assessed and agreed at the design stage, we believe these two approaches are each worthy of consideration. Copyright © 2015 John Wiley & Sons, Ltd.
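For a normal mean with known sampling variance, the mixture-prior approach is fully conjugate and fits in a few lines: each component's posterior weight updates by that component's marginal likelihood of the observed mean, so prior-data conflict automatically shifts weight onto the diffuse component. The numbers below are made up for illustration, not taken from the paper:

```python
import math

# Two-component mixture prior for a normal mean theta with known sampling
# variance se2: prior = w * N(m_inf, v_inf) + (1 - w) * N(m_dif, v_dif).
# Conjugacy gives a mixture posterior with re-weighted components.

def norm_pdf(x, mu, var):
    return math.exp(-0.5 * (x - mu) ** 2 / var) / math.sqrt(2 * math.pi * var)

def mixture_posterior(ybar, se2, comps):
    # comps: list of (weight, prior_mean, prior_var)
    post = []
    for w, m, v in comps:
        evidence = w * norm_pdf(ybar, m, v + se2)   # marginal likelihood of ybar
        post_var = 1.0 / (1.0 / v + 1.0 / se2)
        post_mean = post_var * (m / v + ybar / se2)
        post.append((evidence, post_mean, post_var))
    total = sum(e for e, _, _ in post)
    return [(e / total, m, v) for e, m, v in post]

prior = [(0.8, 0.0, 0.5), (0.2, 0.0, 50.0)]  # informative + diffuse component

consistent = mixture_posterior(0.3, 0.25, prior)  # data agree with the prior
conflict = mixture_posterior(5.0, 0.25, prior)    # marked prior-data conflict
```

When the data agree with the informative component, it keeps nearly all the posterior weight; under marked conflict, the diffuse component dominates and the posterior widens, which is exactly the behaviour the abstract describes.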

  5. HELP: XID+, the probabilistic de-blender for Herschel SPIRE maps

    NASA Astrophysics Data System (ADS)

    Hurley, P. D.; Oliver, S.; Betancourt, M.; Clarke, C.; Cowley, W. I.; Duivenvoorden, S.; Farrah, D.; Griffin, M.; Lacey, C.; Le Floc'h, E.; Papadopoulos, A.; Sargent, M.; Scudder, J. M.; Vaccari, M.; Valtchanov, I.; Wang, L.

    2017-01-01

    We have developed a new prior-based source extraction tool, XID+, to carry out photometry in the Herschel SPIRE (Spectral and Photometric Imaging Receiver) maps at the positions of known sources. XID+ is developed using a probabilistic Bayesian framework that provides a natural framework in which to include prior information, and uses the Bayesian inference tool Stan to obtain the full posterior probability distribution on flux estimates. In this paper, we discuss the details of XID+ and demonstrate the basic capabilities and performance by running it on simulated SPIRE maps resembling the COSMOS field, and comparing to the current prior-based source extraction tool DESPHOT. Not only do we show that XID+ performs better on metrics such as flux accuracy and flux uncertainty accuracy, but we also illustrate how obtaining the posterior probability distribution can help overcome some of the issues inherent with maximum-likelihood-based source extraction routines. We run XID+ on the COSMOS SPIRE maps from Herschel Multi-Tiered Extragalactic Survey using a 24-μm catalogue as a positional prior, and a uniform flux prior ranging from 0.01 to 1000 mJy. We show the marginalized SPIRE colour-colour plot and marginalized contribution to the cosmic infrared background at the SPIRE wavelengths. XID+ is a core tool arising from the Herschel Extragalactic Legacy Project (HELP) and we discuss how additional work within HELP providing prior information on fluxes can and will be utilized. The software is available at https://github.com/H-E-L-P/XID_plus. We also provide the data product for COSMOS. We believe this is the first time that the full posterior probability of galaxy photometry has been provided as a data product.

  6. Spatially adapted augmentation of age-specific atlas-based segmentation using patch-based priors

    NASA Astrophysics Data System (ADS)

    Liu, Mengyuan; Seshamani, Sharmishtaa; Harrylock, Lisa; Kitsch, Averi; Miller, Steven; Chau, Van; Poskitt, Kenneth; Rousseau, Francois; Studholme, Colin

    2014-03-01

    One of the most common approaches to MRI brain tissue segmentation is to employ an atlas prior to initialize an Expectation-Maximization (EM) image labeling scheme using a statistical model of MRI intensities. This prior is commonly derived from a set of manually segmented training data from the population of interest. However, in cases where subject anatomy varies significantly from the prior anatomical average model (for example in the case where extreme developmental abnormalities or brain injuries occur), the prior tissue map does not provide adequate information about the observed MRI intensities to ensure the EM algorithm converges to an anatomically accurate labeling of the MRI. In this paper, we present a novel approach for automatic segmentation of such cases. This approach augments the atlas-based EM segmentation by exploring methods to build a hybrid tissue segmentation scheme that seeks to learn where an atlas prior fails (due to inadequate representation of anatomical variation in the statistical atlas) and utilize an alternative prior derived from a patch driven search of the atlas data. We describe a framework for incorporating this patch-based augmentation of EM (PBAEM) into a 4D age-specific atlas-based segmentation of developing brain anatomy. The proposed approach was evaluated on a set of MRI brain scans of premature neonates with ages ranging from 27.29 to 46.43 gestational weeks (GWs). Results indicated superior performance compared to the conventional atlas-based segmentation method, providing improved segmentation accuracy for gray matter, white matter, ventricles and sulcal CSF regions.

  7. Uncertainty Quantification using Exponential Epi-Splines

    DTIC Science & Technology

    2013-06-01

    Leibler divergence. The choice of κ in applications can be informed by the fact that the Kullback-Leibler divergence between two normal densities, ϕ1... of random output quantities of interest. The framework systematically incorporates hard information derived from physics-based sensors, field test ... information, and determines the ‘best’ estimate within that family. Bayesian estimation makes use of prior soft information
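The fragment above references the Kullback-Leibler divergence between two normal densities. For the univariate case this has the standard closed form, which is presumably the quantity used to calibrate κ:

```python
import math

# Closed-form Kullback-Leibler divergence between two univariate normals:
#   KL(N(mu1, s1^2) || N(mu2, s2^2))
#     = log(s2/s1) + (s1^2 + (mu1 - mu2)^2) / (2 * s2^2) - 1/2

def kl_normal(mu1, s1, mu2, s2):
    return math.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

d_same = kl_normal(0.0, 1.0, 0.0, 1.0)   # identical densities: 0.0
d_shift = kl_normal(0.0, 1.0, 1.0, 1.0)  # unit mean shift, same sd: 0.5
```

The divergence is zero only for identical densities and grows with mean separation and scale mismatch, making it a natural yardstick for how far an estimated density may sit from a reference.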

  8. Multichannel Speech Enhancement Based on Generalized Gamma Prior Distribution with Its Online Adaptive Estimation

    NASA Astrophysics Data System (ADS)

    Dat, Tran Huy; Takeda, Kazuya; Itakura, Fumitada

    We present a multichannel speech enhancement method based on MAP speech spectral magnitude estimation using a generalized gamma model of the speech prior distribution, where the model parameters are adapted from actual noisy speech in a frame-by-frame manner. The utilization of a more general prior distribution with its online adaptive estimation is shown to be effective for speech spectral estimation in noisy environments. Furthermore, the multichannel information in terms of cross-channel statistics is shown to be useful for better adapting the prior distribution parameters to the actual observation, resulting in better performance of the speech enhancement algorithm. We tested the proposed algorithm on an in-car speech database and obtained significant improvements in speech recognition performance, particularly under non-stationary noise conditions such as music, air-conditioner and open window.

  9. A hierarchical Bayesian approach to adaptive vision testing: A case study with the contrast sensitivity function.

    PubMed

    Gu, Hairong; Kim, Woojae; Hou, Fang; Lesmes, Luis Andres; Pitt, Mark A; Lu, Zhong-Lin; Myung, Jay I

    2016-01-01

    Measurement efficiency is of concern when a large number of observations are required to obtain reliable estimates for parametric models of vision. The standard entropy-based Bayesian adaptive testing procedures addressed the issue by selecting the most informative stimulus in sequential experimental trials. Noninformative, diffuse priors were commonly used in those tests. Hierarchical adaptive design optimization (HADO; Kim, Pitt, Lu, Steyvers, & Myung, 2014) further improves the efficiency of the standard Bayesian adaptive testing procedures by constructing an informative prior using data from observers who have already participated in the experiment. The present study represents an empirical validation of HADO in estimating the human contrast sensitivity function. The results show that HADO significantly improves the accuracy and precision of parameter estimates, and therefore requires many fewer observations to obtain reliable inference about contrast sensitivity, compared to the method of quick contrast sensitivity function (Lesmes, Lu, Baek, & Albright, 2010), which uses the standard Bayesian procedure. The improvement with HADO was maintained even when the prior was constructed from heterogeneous populations or a relatively small number of observers. These results of this case study support the conclusion that HADO can be used in Bayesian adaptive testing by replacing noninformative, diffuse priors with statistically justified informative priors without introducing unwanted bias.

  10. A hierarchical Bayesian approach to adaptive vision testing: A case study with the contrast sensitivity function

    PubMed Central

    Gu, Hairong; Kim, Woojae; Hou, Fang; Lesmes, Luis Andres; Pitt, Mark A.; Lu, Zhong-Lin; Myung, Jay I.

    2016-01-01

    Measurement efficiency is of concern when a large number of observations are required to obtain reliable estimates for parametric models of vision. The standard entropy-based Bayesian adaptive testing procedures addressed the issue by selecting the most informative stimulus in sequential experimental trials. Noninformative, diffuse priors were commonly used in those tests. Hierarchical adaptive design optimization (HADO; Kim, Pitt, Lu, Steyvers, & Myung, 2014) further improves the efficiency of the standard Bayesian adaptive testing procedures by constructing an informative prior using data from observers who have already participated in the experiment. The present study represents an empirical validation of HADO in estimating the human contrast sensitivity function. The results show that HADO significantly improves the accuracy and precision of parameter estimates, and therefore requires many fewer observations to obtain reliable inference about contrast sensitivity, compared to the method of quick contrast sensitivity function (Lesmes, Lu, Baek, & Albright, 2010), which uses the standard Bayesian procedure. The improvement with HADO was maintained even when the prior was constructed from heterogeneous populations or a relatively small number of observers. These results of this case study support the conclusion that HADO can be used in Bayesian adaptive testing by replacing noninformative, diffuse priors with statistically justified informative priors without introducing unwanted bias. PMID:27105061

  11. Improving semantic scene understanding using prior information

    NASA Astrophysics Data System (ADS)

    Laddha, Ankit; Hebert, Martial

    2016-05-01

    Perception for ground robot mobility requires automatic generation of descriptions of the robot's surroundings from sensor input (cameras, LADARs, etc.). Effective techniques for scene understanding have been developed, but they are generally purely bottom-up in that they rely entirely on classifying features from the input data based on learned models. In fact, perception systems for ground robots have a lot of information at their disposal from knowledge about the domain and the task. For example, a robot in urban environments might have access to approximate maps that can guide the scene interpretation process. In this paper, we explore practical ways to combine such prior information with state-of-the-art scene understanding approaches.

  12. Clustering and Bayesian hierarchical modeling for the definition of informative prior distributions in hydrogeology

    NASA Astrophysics Data System (ADS)

    Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.

    2017-12-01

    In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all data available. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information coming from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and combine them with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists in two steps, both relying on statistical and machine learning methods. The first step is data selection; it consists in selecting sites similar to the target site. We use clustering methods for selecting similar sites based on observable hydrogeological features. The second step is data assimilation; it consists in assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of these methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R-package and therefore facilitate easy use by other practitioners.
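The two steps (select similar sites, then assimilate their data into a prior) can be caricatured as follows. Site names, features, and parameter values are invented, and the real method uses clustering plus a Bayesian hierarchical model rather than this nearest-neighbour pooling:

```python
import statistics

# Sketch of the two-step idea: (1) pick previously studied sites whose
# observable features are closest to the target site, (2) pool their
# parameter values (say, log hydraulic conductivity) into an empirical
# normal prior summarized by mean and sd. All data are made up.

sites = {
    "A": {"feature": 1.0, "log_K": [-4.1, -4.3]},
    "B": {"feature": 1.2, "log_K": [-4.0, -4.2, -4.4]},
    "C": {"feature": 9.0, "log_K": [-7.9, -8.1]},
}

def similar_sites(target_feature, k=2):
    # Nearest-neighbour site selection on one observable feature.
    return sorted(sites, key=lambda s: abs(sites[s]["feature"] - target_feature))[:k]

def empirical_prior(target_feature):
    pooled = []
    for s in similar_sites(target_feature):
        pooled.extend(sites[s]["log_K"])
    return statistics.mean(pooled), statistics.stdev(pooled)

mu, sd = empirical_prior(1.1)  # target resembles sites A and B, not C
```

The hierarchical model in the paper improves on this pooling by separating within-site from between-site variability, so a site with many measurements does not simply swamp the prior.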

  13. In Search of Social Movement Learning: The Growing Jobs for Living Project. NALL Working Paper.

    ERIC Educational Resources Information Center

    Clover, Darlene E.; Hall, Budd L.

    The New Approaches to Lifelong Learning (NALL) project is a Canada-wide 5-year research initiative during which more than 70 academic and community members are working collaboratively within a framework of informal learning to address the following issues: informal computer-based learning, recognition of prior learning, informal learning in a…

  14. Internet Usage by Parents Prior to Seeking Care at a Pediatric Emergency Department: Observational Study

    PubMed Central

    2017-01-01

    Background Little is known about how parents utilize medical information on the Internet prior to an emergency department (ED) visit. Objective The objective of the study was to determine the proportion of parents who accessed the Internet for medical information related to their child’s illness in the 24 hours prior to an ED visit (IPED), to identify the websites used, and to understand how the content contributed to the decision to visit the ED. Methods A 40-question interview was conducted with parents presenting to an ED within a freestanding children’s hospital. If parents reported IPED, the number and names of websites were documented. Parents indicated the helpfulness of Web-based content using a 100-mm visual analog scale and the degree to which it contributed to the decision to visit the ED using 5-point Likert-type responses. Results About 11.8% (31/262) reported IPED (95% CI 7.3-5.3). Parents who reported IPED were more likely to have at least some college education (P=.04), higher annual household income (P=.001), and older children (P=.04) than those who did not report IPED. About 35% (11/31) could not name any websites used. The mean helpfulness rating of Web-based content was 62 mm (SD 25 mm). After Internet use, some parents (29%, 9/31) were more certain they needed to visit the ED, whereas 19% (6/31) were less certain. A majority (87%, 195/224) of parents who used the Internet stated that they would be somewhat or very likely to visit a website recommended by a physician. Conclusions Nearly 1 out of 8 parents presenting to an urban pediatric ED reported using the Internet in the 24 hours prior to the ED visit. Among the privately insured, at least 1 in 5 parents reported using the Internet prior to visiting the ED. Web-based medical information often influences decision making regarding ED utilization. Pediatric providers should give parents recommendations for high-quality sources of health information available on the Internet. PMID:28958988

  15. Interactive lesion segmentation with shape priors from offline and online learning.

    PubMed

    Shepherd, Tony; Prince, Simon J D; Alexander, Daniel C

    2012-09-01

    In medical image segmentation, tumors and other lesions demand the highest levels of accuracy but still call for the highest levels of manual delineation. One factor holding back automatic segmentation is the exemption of pathological regions from shape modelling techniques that rely on high-level shape information not offered by lesions. This paper introduces two new statistical shape models (SSMs) that combine radial shape parameterization with machine learning techniques from the field of nonlinear time series analysis. We then develop two dynamic contour models (DCMs) using the new SSMs as shape priors for tumor and lesion segmentation. From training data, the SSMs learn the lower level shape information of boundary fluctuations, which we prove to be nevertheless highly discriminant. One of the new DCMs also uses online learning to refine the shape prior for the lesion of interest based on user interactions. Classification experiments reveal superior sensitivity and specificity of the new shape priors over those previously used to constrain DCMs. User trials with the new interactive algorithms show that the shape priors are directly responsible for improvements in accuracy and reductions in user demand.

  16. The pharmacokinetics of dexmedetomidine during long-term infusion in critically ill pediatric patients. A Bayesian approach with informative priors.

    PubMed

    Wiczling, Paweł; Bartkowska-Śniatkowska, Alicja; Szerkus, Oliwia; Siluk, Danuta; Rosada-Kurasińska, Jowita; Warzybok, Justyna; Borsuk, Agnieszka; Kaliszan, Roman; Grześkowiak, Edmund; Bienert, Agnieszka

    2016-06-01

    The purpose of this study was to assess the pharmacokinetics of dexmedetomidine in the ICU setting during prolonged infusion and to compare it with existing literature data using Bayesian population modeling with literature-based informative priors. Thirty-eight patients were included in the analysis, with concentration measurements obtained on two occasions: first from 0 to 24 h after infusion initiation and second from 0 to 8 h after infusion end. Data analysis was conducted using WinBUGS software. The prior information on dexmedetomidine pharmacokinetics was elicited from a literature study pooling results from a relatively large group of 95 children. A two-compartment PK model, with allometrically scaled parameters, maturation of clearance, and a Student's t residual distribution on a log scale, was used to describe the data. The incorporation of time-dependent (different between the two occasions) PK parameters improved the model. It was observed that the volume of distribution was 1.5-fold higher during the second occasion. There was also evidence of increased (1.3-fold) clearance for the second occasion, with posterior probability equal to 62%. This work demonstrates the usefulness of Bayesian modeling with informative priors in analyzing pharmacokinetic data and comparing it with existing literature knowledge.

  17. Formal and Informal Measures of Reading and Math Achievement as a Function of Early Childhood Program Participation among Kindergarten through Eighth Grade Students

    ERIC Educational Resources Information Center

    Haas, Lory E.

    2011-01-01

    Three main purposes provided the foundation for this study. The first purpose was to investigate academic achievement through analyses of data obtained through formal and informal assessments among kindergarten through eighth grade students who participated in a Head Start program, center-based care program, or home-based care prior to school…

  18. An Examination of Web-Based Information on the Transition to School for Children Who Are Deaf or Hard of Hearing

    ERIC Educational Resources Information Center

    Curle, Deirdre M.

    2015-01-01

    Both prior to and during the transition from early intervention (EI) to school, parents of children who are deaf or hard of hearing (d/hh) need crucial information about the transition process and school services. Given the ubiquitous nature of computers and Internet access, it is reasonable to assume that web-based dissemination of information…

  19. Visual Perception of Force: Comment on White (2012)

    ERIC Educational Resources Information Center

    Hubbard, Timothy L.

    2012-01-01

    White (2012) proposed that kinematic features in a visual percept are matched to stored representations containing information regarding forces (based on prior haptic experience) and that information in the matched, stored representations regarding forces is then incorporated into visual perception. Although some elements of White's (2012) account…

  20. 40 CFR 745.326 - Renovation: State and Tribal program requirements.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) TOXIC SUBSTANCES CONTROL ACT LEAD-BASED PAINT POISONING PREVENTION IN CERTAIN RESIDENTIAL STRUCTURES... distribution of lead hazard information to owners and occupants of target housing and child-occupied facilities... distributing the lead hazard information to owners and occupants of housing and child-occupied facilities prior...

  1. A Method for Constructing Informative Priors for Bayesian Modeling of Occupational Hygiene Data.

    PubMed

    Quick, Harrison; Huynh, Tran; Ramachandran, Gurumurthy

    2017-01-01

    In many occupational hygiene settings, the demand for more accurate, more precise results is at odds with limited resources. To combat this, practitioners have begun using Bayesian methods to incorporate prior information into their statistical models in order to obtain more refined inference from their data. This is not without risk, however, as incorporating prior information that disagrees with the information contained in the data can lead to spurious conclusions, particularly if the prior is too informative. In this article, we propose a method for constructing informative prior distributions for normal and lognormal data that are intuitive to specify and robust to bias. To demonstrate the use of these priors, we walk practitioners through a step-by-step implementation of our priors using an illustrative example. We then conclude with recommendations for general use. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
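The core mechanics, combining an informative prior with sparse lognormal data, can be illustrated with a standard conjugate normal update on the log scale; the hyperparameters and measurements below are hypothetical, and this is not the authors' exact prior construction.

```python
import numpy as np

# Conjugate sketch on the log scale (occupational exposures are roughly
# lognormal). Prior on the mean log-exposure mu ~ N(m0, s0^2); the
# log-scale SD sigma is assumed known. All numbers are hypothetical.
m0, s0 = np.log(0.05), 0.5              # prior guess: GM near 0.05 mg/m^3
sigma = 1.0

x = np.array([0.08, 0.12, 0.05, 0.09])  # a sparse set of measurements
y = np.log(x)
n, ybar = len(y), y.mean()

# Standard normal-normal posterior: a precision-weighted average of the
# prior mean and the sample mean.
post_var = 1.0 / (1.0 / s0**2 + n / sigma**2)
post_mean = post_var * (m0 / s0**2 + n * ybar / sigma**2)

print(f"posterior GM estimate: {np.exp(post_mean):.3f}, "
      f"posterior SD (log scale): {np.sqrt(post_var):.3f}")
```

With few measurements the prior pulls the estimate toward the prior geometric mean; as n grows the data term dominates, which is the behaviour an informative-but-not-overbearing prior should exhibit.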

  2. Geometric Methods for Controlled Active Vision

    DTIC Science & Technology

    2012-02-07

    information-based criteria, such as the Kullback-Leibler divergence, have been employed. Returning to the problem of segmentation, one can think of a data... Transactions on Information Technology in Biomedicine, 2012. 32. “3D automatic segmentation of the hippocampus using wavelets with applications to... used to induce shape information to the estimated curve without the need for explicit incorporation of shape information into the motion prior. In

  3. Segmentation and tracking of lung nodules via graph-cuts incorporating shape prior and motion from 4D CT.

    PubMed

    Cha, Jungwon; Farhangi, Mohammad Mehdi; Dunlap, Neal; Amini, Amir A

    2018-01-01

    We have developed a robust tool for performing volumetric and temporal analysis of nodules from respiratory-gated four-dimensional (4D) CT. The method could prove useful in IMRT of lung cancer. We modified the conventional graph-cuts method by adding an adaptive shape prior as well as motion information within a signed distance function representation to permit more accurate and automated segmentation and tracking of lung nodules in 4D CT data. Active shape models (ASM) with a signed distance function were used to capture the shape prior information, preventing unwanted surrounding tissues from becoming part of the segmented object. The optical flow method was used to estimate the local motion and to extend three-dimensional (3D) segmentation to 4D by warping a prior shape model through time. The algorithm has been applied to segmentation of well-circumscribed, vascularized, and juxtapleural lung nodules from respiratory-gated CT data. In all cases, 4D segmentation and tracking for five phases of high-resolution CT data took approximately 10 min on a PC workstation with an AMD Phenom II and 32 GB of memory. The method was trained on 500 breath-held 3D CT datasets from the LIDC database and was tested on 17 4D lung nodule CT datasets consisting of 85 volumetric frames. The validation tests resulted in an average Dice Similarity Coefficient (DSC) of 0.68 for all test data. An important by-product of the method is quantitative volume measurement from 4D CT from end-inspiration to end-expiration, which will also have important diagnostic value. The algorithm performs robust segmentation of lung nodules from 4D CT data. The signed distance ASM provides the shape prior information, which, within the iterative graph-cuts framework, is adaptively refined to best fit the input data, preventing unwanted surrounding tissue from merging with the segmented object. © 2017 American Association of Physicists in Medicine.

  4. dPIRPLE: a joint estimation framework for deformable registration and penalized-likelihood CT image reconstruction using prior images

    NASA Astrophysics Data System (ADS)

    Dang, H.; Wang, A. S.; Sussman, Marc S.; Siewerdsen, J. H.; Stayman, J. W.

    2014-09-01

    Sequential imaging studies are conducted in many clinical scenarios. Prior images from previous studies contain a great deal of patient-specific anatomical information and can be used in conjunction with subsequent imaging acquisitions to maintain image quality while enabling radiation dose reduction (e.g., through sparse angular sampling, reduction in fluence, etc.). However, patient motion between images in such sequences results in misregistration between the prior image and the current anatomy. Existing prior-image-based approaches often include only a simple rigid registration step that can be insufficient for capturing complex anatomical motion, introducing detrimental effects in subsequent image reconstruction. In this work, we propose a joint framework that estimates the 3D deformation between an unregistered prior image and the current anatomy (based on a subsequent data acquisition) and reconstructs the current anatomical image using a model-based reconstruction approach that includes regularization based on the deformed prior image. This framework is referred to as deformable prior image registration, penalized-likelihood estimation (dPIRPLE). Central to this framework is the inclusion of a 3D B-spline-based free-form-deformation model in the joint registration-reconstruction objective function. The proposed framework is solved using a maximization strategy whereby alternating updates to the registration parameters and image estimates are applied, allowing for improvements in both the registration and the reconstruction throughout the optimization process. Cadaver experiments were conducted on a cone-beam CT testbench emulating a lung nodule surveillance scenario. 
Superior reconstruction accuracy and image quality were demonstrated using the dPIRPLE algorithm as compared to more traditional reconstruction methods including filtered backprojection, penalized-likelihood estimation (PLE), prior image penalized-likelihood estimation (PIPLE) without registration, and prior image penalized-likelihood estimation with rigid registration of a prior image (PIRPLE) over a wide range of sampling sparsity and exposure levels.

  5. "I Felt That I Could Be Whatever I Wanted": Pre-Service Drama Teachers' Prior Experiences and Beliefs about Teaching Drama

    ERIC Educational Resources Information Center

    Gray, Christina; Pascoe, Robin; Wright, Peter

    2018-01-01

    Pre-service drama teachers enter teacher training with established ideas and beliefs about teaching. These beliefs, based on experience, are informed by many hours spent in schools, and the pedagogies--both effective and ineffective--utilised by their teachers. This research explores the influence of some of these prior experiences on pre-service…

  6. Understanding the Factors That Shape Dispositions toward Students with Disabilities: A Case Study of Three General Education Pre-Service Teachers

    ERIC Educational Resources Information Center

    Bialka, Christa S.

    2017-01-01

    Presently, there is limited research that explores how the nature of pre-service teachers' prior experience informs their dispositional development, and it has yet to be determined whether the pedagogical needs of these prospective teachers vary based on their level of prior exposure. This qualitative case study examines the dispositions of three…

  7. Intrinsic Bayesian Active Contours for Extraction of Object Boundaries in Images

    PubMed Central

    Srivastava, Anuj

    2010-01-01

    We present a framework for incorporating prior information about high-probability shapes in the process of contour extraction and object recognition in images. Here one studies shapes as elements of an infinite-dimensional, non-linear quotient space, and statistics of shapes are defined and computed intrinsically using differential geometry of this shape space. Prior models on shapes are constructed using probability distributions on tangent bundles of shape spaces. Similar to the past work on active contours, where curves are driven by vector fields based on image gradients and roughness penalties, we incorporate the prior shape knowledge in the form of vector fields on curves. Through experimental results, we demonstrate the use of prior shape models in the estimation of object boundaries, and their success in handling partial obscuration and missing data. Furthermore, we describe the use of this framework in shape-based object recognition or classification. PMID:21076692

  8. Main Road Extraction from ZY-3 Grayscale Imagery Based on Directional Mathematical Morphology and VGI Prior Knowledge in Urban Areas

    PubMed Central

    Liu, Bo; Wu, Huayi; Wang, Yandong; Liu, Wenming

    2015-01-01

    Main road features extracted from remotely sensed imagery play an important role in many civilian and military applications, such as updating Geographic Information System (GIS) databases, urban structure analysis, spatial data matching and road navigation. Current methods for road feature extraction from high-resolution imagery are typically based on threshold value segmentation. It is difficult however, to completely separate road features from the background. We present a new method for extracting main roads from high-resolution grayscale imagery based on directional mathematical morphology and prior knowledge obtained from the Volunteered Geographic Information found in the OpenStreetMap. The two salient steps in this strategy are: (1) using directional mathematical morphology to enhance the contrast between roads and non-roads; (2) using OpenStreetMap roads as prior knowledge to segment the remotely sensed imagery. Experiments were conducted on two ZiYuan-3 images and one QuickBird high-resolution grayscale image to compare our proposed method to other commonly used techniques for road feature extraction. The results demonstrated the validity and better performance of the proposed method for urban main road feature extraction. PMID:26397832

  9. A feature-based inference model of numerical estimation: the split-seed effect.

    PubMed

    Murray, Kyle B; Brown, Norman R

    2009-07-01

    Prior research has identified two modes of quantitative estimation: numerical retrieval and ordinal conversion. In this paper we introduce a third mode, which operates by a feature-based inference process. In contrast to prior research, the results of three experiments demonstrate that people estimate automobile prices by combining metric information associated with two critical features: product class and brand status. In addition, Experiments 2 and 3 demonstrated that when participants are seeded with the actual current base price of one of the to-be-estimated vehicles, they respond by revising the general metric and splitting the information carried by the seed between the two critical features. As a result, the degree of post-seeding revision is directly related to the number of these features that the seed and the transfer items have in common. The paper concludes with a general discussion of the practical and theoretical implications of our findings.

  10. Cone beam x-ray luminescence computed tomography reconstruction with a priori anatomical information

    NASA Astrophysics Data System (ADS)

    Lo, Pei-An; Lin, Meng-Lung; Jin, Shih-Chun; Chen, Jyh-Cheng; Lin, Syue-Liang; Chang, C. Allen; Chiang, Huihua Kenny

    2014-09-01

    X-ray luminescence computed tomography (XLCT) is a novel molecular imaging modality that reconstructs the optical distribution of x-ray-excited phosphor particles using prior information from the anatomical CT image. The prior information improves the accuracy of image reconstruction, and the system can also present the anatomical CT image. The optical subsystem, based on a highly sensitive charge-coupled device (CCD), is mounted perpendicular to the CT system. In the XLCT system, the x-ray beam excites the phosphor in the sample, and the CCD camera acquires the luminescence emitted from the sample in free space over 360 degrees of projection. In this study, a fluorescence diffuse optical tomography (FDOT)-like algorithm was used for image reconstruction; the structural prior information was incorporated in the reconstruction by adding a penalty term to the minimization function. The phosphor used in this study was Gd2O2S:Tb. For the simulations and experiments, data were collected from 16 projections. The cylinder phantom was 40 mm in diameter and contained an 8 mm diameter inclusion; the phosphor in the in vivo study was 5 mm in diameter at a depth of 3 mm. In both cases, errors were no more than 5%. Based on the results of these simulation and experimental studies, the novel XLCT method has demonstrated feasibility for in vivo animal model studies.
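The idea of adding a penalty term carrying structural prior information to the minimization function can be illustrated with a toy penalized least-squares reconstruction; the system matrix, region labels, and penalty weight below are invented for illustration, and this is not the FDOT-like algorithm itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy reconstruction with a structural-prior penalty:
#   minimize ||A x - b||^2 + lam * ||D x||^2,
# where D penalizes differences only between neighbouring unknowns in
# the SAME anatomical region (a stand-in for the CT-derived prior).
n = 20
labels = np.array([0] * 10 + [1] * 10)   # two "anatomical" regions
A = rng.normal(size=(12, n))             # underdetermined system (12 < 20)
x_true = np.where(labels == 0, 1.0, 3.0)
b = A @ x_true + 0.01 * rng.normal(size=12)

# Difference operator restricted to within-region neighbours.
rows = [np.eye(n)[i] - np.eye(n)[i + 1]
        for i in range(n - 1) if labels[i] == labels[i + 1]]
D = np.array(rows)

# Normal equations of the penalized objective.
lam = 1.0
x_hat = np.linalg.solve(A.T @ A + lam * D.T @ D, A.T @ b)
print(np.round(x_hat[:3], 2), np.round(x_hat[-3:], 2))
```

Without the penalty the 12-equation system cannot pin down 20 unknowns; the anatomical prior supplies the missing constraints while leaving the boundary between regions free.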

  11. Evolution of an Information Competency Requirement for Undergraduates

    ERIC Educational Resources Information Center

    Walsh, Tiffany R.

    2011-01-01

    University at Buffalo undergraduate students are required to complete a non-credit-bearing information competency assessment prior to graduation, preferably within their first year of study. Called the "Library Skills Workbook," this assessment has evolved from a short, print-based quiz into a sophisticated, multi-module tutorial and…

  12. Interactive and Collaborative Professional Development for In-Service History Teachers

    ERIC Educational Resources Information Center

    Callahan, Cory; Saye, John; Brush, Thomas

    2016-01-01

    This article advances a continuing line of inquiry into an innovative teacher-support program intended to help in-service history teachers develop professional teaching knowledge for inquiry-based history instruction. Two prior iterations informed our design and use of professional development materials; they also informed the implementation…

  13. 76 FR 1183 - Agency Information Collection Activities: Submission for OMB Review; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-07

    ... Information Collection Activities: Submission for OMB Review; Comment Request Periodically, the Health... further divided by the RSR component. Estimates for grantees and providers are based on prior experience... ``attention of the desk officer for HRSA.'' Dated: January 3, 2011. Robert Hendricks, Director, Division of...

  14. Nudging toward Inquiry: Developing Questions and a Sense of Wonder

    ERIC Educational Resources Information Center

    Fontichiaro, Kristin, Comp.

    2010-01-01

    Inquiry does not replace information literacy; it encompasses it. It encourages librarians to consider instructional design beyond information search, retrieval, citation, and use. Inquiry-based learning invites school librarians to step into all aspects of instructional planning, from activating prior knowledge straight through to reflection.…

  15. The impact of the rate prior on Bayesian estimation of divergence times with multiple Loci.

    PubMed

    Dos Reis, Mario; Zhu, Tianqi; Yang, Ziheng

    2014-07-01

    Bayesian methods provide a powerful way to estimate species divergence times by combining information from molecular sequences with information from the fossil record. With the explosive increase of genomic data, divergence time estimation increasingly uses data from multiple loci (genes or site partitions). Widely used computer programs for estimating divergence times place independent and identically distributed (i.i.d.) priors on the substitution rates of the different loci. The i.i.d. prior is problematic. As the number of loci (L) increases, the prior variance of the average rate across all loci goes to zero at the rate 1/L. As a consequence, the rate prior dominates posterior time estimates when many loci are analyzed, and if the rate prior is misspecified, the estimated divergence times will converge to wrong values with very narrow credibility intervals. Here we develop a new prior on the locus rates based on the Dirichlet distribution that corrects the problematic behavior of the i.i.d. prior. We use computer simulation and real data analysis to highlight the differences between the old and new priors. For a dataset of six primate species, we show that with the old i.i.d. prior, if the prior rate is too high (or too low), the estimated divergence times are too young (or too old), outside the bounds imposed by the fossil calibrations. In contrast, with the new Dirichlet prior, posterior time estimates are insensitive to the rate prior and are compatible with the fossil calibrations. We re-analyzed a phylogenomic dataset of 36 mammal species and show that using many fossil calibrations can alleviate the adverse impact of a misspecified rate prior to some extent. We recommend the use of the new Dirichlet prior in Bayesian divergence time estimation. [Bayesian inference, divergence time, relaxed clock, rate prior, partition analysis.] © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.
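The problematic 1/L behaviour of the i.i.d. rate prior, and the way a Dirichlet-partitioned prior avoids it, can be checked with a quick Monte Carlo sketch (the gamma hyperparameters are arbitrary, not those used in the paper).

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20_000  # Monte Carlo prior draws

def prior_sd_of_mean_rate_iid(L):
    # i.i.d. gamma prior on each locus rate: Var(mean) shrinks like 1/L.
    rates = rng.gamma(2.0, 0.5, size=(N, L))
    return rates.mean(axis=1).std()

def prior_sd_of_mean_rate_dirichlet(L):
    # Gamma prior on the overall mean rate, partitioned among loci by a
    # Dirichlet: the mean rate keeps its full prior variance for any L.
    mu = rng.gamma(2.0, 0.5, size=N)
    rates = mu[:, None] * L * rng.dirichlet(np.ones(L), size=N)
    return rates.mean(axis=1).std()

for L in (2, 10, 100):
    print(L,
          round(prior_sd_of_mean_rate_iid(L), 3),
          round(prior_sd_of_mean_rate_dirichlet(L), 3))
```

Under the i.i.d. prior the printed SD of the mean rate falls roughly as 1/√L, while the Dirichlet construction leaves it unchanged, so posterior times are no longer dominated by a possibly misspecified rate prior as loci accumulate.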

  16. Test Expectancy and Memory for Important Information

    PubMed Central

    Middlebrooks, Catherine D.; Murayama, Kou; Castel, Alan D.

    2016-01-01

    Prior research suggests that learners study and remember information differently depending upon the type of test they expect to later receive. The current experiments investigate how testing expectations impact the study of and memory for valuable information. Participants studied lists of words ranging in value from 1–10 points with the goal being to maximize their score on a later memory test. Half of the participants were told to expect a recognition test after each list, while the other half were told to expect a recall test. After several lists of receiving tests congruent with expectations, participants studying for a recognition test instead received an unexpected recall test. In Experiment 1, participants who had studied for a recognition test recalled less of the valuable information than participants anticipating the recall format. These participants continued to attend less to item value on future (expected) recall tests than participants who had only ever experienced recall testing. When the recognition tests were made more demanding in Experiment 2, value-based recall improved relative to Experiment 1: though memory for the valuable information remained superior when participants studied with the expectation of having to recall the information, there were no longer significant differences after accounting for recall testing experience. Thus, recall-based testing encouraged strategic, value-based encoding and enhanced retrieval of important information, while recognition testing in some cases limited value-based study and memory. These results extend prior work concerning the impact of testing expectations on memory, offering further insight into how people study important information. PMID:28095010

  17. Matrix-Inversion-Free Compressed Sensing With Variable Orthogonal Multi-Matching Pursuit Based on Prior Information for ECG Signals.

    PubMed

    Cheng, Yih-Chun; Tsai, Pei-Yun; Huang, Ming-Hao

    2016-05-19

    Low-complexity compressed sensing (CS) techniques for monitoring electrocardiogram (ECG) signals in wireless body sensor networks (WBSN) are presented. The prior probability of ECG sparsity in the wavelet domain is first exploited. Then, a variable orthogonal multi-matching pursuit (vOMMP) algorithm that consists of two phases is proposed. In the first phase, the orthogonal matching pursuit (OMP) algorithm is adopted to effectively augment the support set with reliable indices, and in the second phase, orthogonal multi-matching pursuit (OMMP) is employed to rescue the missing indices. The reconstruction performance is thus enhanced with the prior information and the vOMMP algorithm. Furthermore, the computation-intensive pseudo-inverse operation is simplified by a matrix-inversion-free (MIF) technique based on QR decomposition. The vOMMP-MIF CS decoder is then implemented in 90 nm CMOS technology. The QR decomposition is accomplished by two systolic arrays working in parallel. The implementation supports three settings for obtaining 40, 44, and 48 coefficients in the sparse vector. From the measurement results, the power consumption is 11.7 mW at 0.9 V and 12 MHz. Compared to prior chip implementations, our design shows good hardware efficiency and is suitable for low-energy applications.
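A plain OMP, the building block of the first vOMMP phase, can be sketched in a few lines; this omits the OMMP rescue phase, the wavelet-domain sparsity prior, and the QR-based matrix-inversion-free solve, and the dimensions are illustrative only.

```python
import numpy as np

rng = np.random.default_rng(3)

def omp(Phi, y, k):
    """Greedy orthogonal matching pursuit: grow the support one index at
    a time, re-solving a least-squares problem on the selected columns."""
    residual, support = y.copy(), []
    for _ in range(k):
        corr = np.abs(Phi.T @ residual)
        corr[support] = 0.0                      # don't re-pick an index
        support.append(int(np.argmax(corr)))
        coef, *_ = np.linalg.lstsq(Phi[:, support], y, rcond=None)
        residual = y - Phi[:, support] @ coef
    x = np.zeros(Phi.shape[1])
    x[support] = coef
    return x

n, m, k = 128, 48, 5                 # signal length, measurements, sparsity
x_true = np.zeros(n)
idx = rng.choice(n, size=k, replace=False)
x_true[idx] = rng.uniform(1.0, 2.0, size=k) * rng.choice([-1.0, 1.0], size=k)
Phi = rng.normal(size=(m, n)) / np.sqrt(m)
y = Phi @ x_true                     # noiseless measurements

x_hat = omp(Phi, y, k)
print("max reconstruction error:", np.max(np.abs(x_hat - x_true)))
```

In the chip described above, the pseudo-inverse inside the loop is replaced by QR updates on systolic arrays; the NumPy `lstsq` call here stands in for that step.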

  18. Incorporating prior information into differential network analysis using non-paranormal graphical models.

    PubMed

    Zhang, Xiao-Fei; Ou-Yang, Le; Yan, Hong

    2017-08-15

    Understanding how gene regulatory networks change under different cellular states is important for revealing insights into network dynamics. Gaussian graphical models, which assume that the data follow a joint normal distribution, have been used recently to infer differential networks. However, the distributions of omics data are non-normal in general. Furthermore, although much biological knowledge (or prior information) has been accumulated, most existing methods ignore this valuable prior information. Therefore, new statistical methods are needed to relax the normality assumption and make full use of prior information. We propose a new differential network analysis method to address the above challenges. Instead of using Gaussian graphical models, we employ a non-paranormal graphical model that can relax the normality assumption. We develop a principled model to take into account the following prior information: (i) a differential edge is less likely to exist between two genes that do not participate together in the same pathway; (ii) changes in the networks are driven by certain regulator genes that are perturbed across different cellular states; and (iii) the differential networks estimated from multi-view gene expression data likely share common structures. Simulation studies demonstrate that our method outperforms other graphical model-based algorithms. We apply our method to identify the differential networks between platinum-sensitive and platinum-resistant ovarian tumors, and the differential networks between the proneural and mesenchymal subtypes of glioblastoma. Hub nodes in the estimated differential networks rediscover known cancer-related regulator genes and contain interesting predictions. The source code is at https://github.com/Zhangxf-ccnu/pDNA. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved.
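The non-paranormal relaxation of the normality assumption can be illustrated with the rank-based latent-correlation estimate commonly used with nonparanormal models (a generic sketch, not the pDNA method itself): Spearman's rho is invariant under monotone transforms, and 2·sin(πρ_s/6) maps it back to the latent Gaussian correlation.

```python
import numpy as np

rng = np.random.default_rng(4)

# Latent Gaussian pair with correlation 0.6, observed through a monotone
# transform (exp), so the observed data are non-normal.
n = 2000
z = rng.multivariate_normal([0.0, 0.0], [[1.0, 0.6], [0.6, 1.0]], size=n)
x = np.exp(z)

def spearman(a, b):
    # Spearman's rho = Pearson correlation of the ranks (no-ties case).
    ra = np.argsort(np.argsort(a)).astype(float)
    rb = np.argsort(np.argsort(b)).astype(float)
    return np.corrcoef(ra, rb)[0, 1]

rho_s = spearman(x[:, 0], x[:, 1])
rho_latent = 2.0 * np.sin(np.pi / 6.0 * rho_s)
print(round(rho_latent, 2))   # recovers roughly 0.6 despite non-normality
```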

  19. Exploiting Genome Structure in Association Analysis

    PubMed Central

    Kim, Seyoung

    2014-01-01

    A genome-wide association study involves examining a large number of single-nucleotide polymorphisms (SNPs) to identify SNPs that are significantly associated with the given phenotype, while trying to reduce the false positive rate. Although haplotype-based association methods have been proposed to accommodate correlation information across nearby SNPs that are in linkage disequilibrium, none of these methods directly incorporates structural information such as recombination events along the chromosome. In this paper, we propose a new approach called stochastic block lasso for association mapping that exploits prior knowledge on the linkage disequilibrium structure in the genome, such as recombination rates and distances between adjacent SNPs, in order to increase the power of detecting true associations while reducing false positives. Following a typical linear regression framework with the genotypes as inputs and the phenotype as output, our proposed method employs a sparsity-enforcing Laplacian prior for the regression coefficients, augmented by a first-order Markov process along the sequence of SNPs that incorporates the prior information on the linkage disequilibrium structure. The Markov-chain prior models the structural dependencies between a pair of adjacent SNPs, and allows us to look for association SNPs in a coupled manner, combining strength from multiple nearby SNPs. Our results on HapMap-simulated datasets and mouse datasets show that there is a significant advantage in incorporating the prior knowledge on linkage disequilibrium structure for marker identification under whole-genome association. PMID:21548809

  20. A Knowledge-Based Arrangement of Prototypical Neural Representation Prior to Experience Contributes to Selectivity in Upcoming Knowledge Acquisition.

    PubMed

    Kurashige, Hiroki; Yamashita, Yuichi; Hanakawa, Takashi; Honda, Manabu

    2018-01-01

    Knowledge acquisition is a process in which one actively selects a piece of information from the environment and assimilates it with prior knowledge. However, little is known about the neural mechanism underlying selectivity in knowledge acquisition. Here we executed a 2-day human experiment to investigate the involvement of characteristic spontaneous activity resembling a so-called "preplay" in selectivity in sentence comprehension, an instance of knowledge acquisition. On day 1, we presented 10 sentences (prior sentences) that were difficult to understand on their own. On the following day, we first measured the resting-state functional magnetic resonance imaging (fMRI). Then, we administered a sentence comprehension task using 20 new sentences (posterior sentences). The posterior sentences were also difficult to understand on their own, but some could be associated with prior sentences to facilitate their understanding. Next, we measured the posterior sentence-induced fMRI to identify the neural representation. From the resting-state fMRI, we extracted the appearances of activity patterns similar to the neural representations for posterior sentences. Importantly, the resting-state fMRI was measured before giving the posterior sentences, and thus such appearances could be considered as preplay-like or prototypical neural representations. We compared the intensities of such appearances with the understanding of posterior sentences. This gave a positive correlation between these two variables, but only if posterior sentences were associated with prior sentences. Additional analysis showed the contribution of the entorhinal cortex, rather than the hippocampus, to the correlation. The present study suggests that prior knowledge-based arrangement of neural activity before an experience contributes to the active selection of information to be learned. Such arrangement prior to an experience resembles preplay activity observed in the rodent brain. 
In terms of knowledge acquisition, the present study leads to a new view of the brain (or more precisely of the brain's knowledge) as an autopoietic system in which the brain (or knowledge) selects what it should learn by itself, arranges preplay-like activity as a position for the new information in advance, and actively reorganizes itself.

  1. A Knowledge-Based Arrangement of Prototypical Neural Representation Prior to Experience Contributes to Selectivity in Upcoming Knowledge Acquisition

    PubMed Central

    Kurashige, Hiroki; Yamashita, Yuichi; Hanakawa, Takashi; Honda, Manabu

    2018-01-01

    Knowledge acquisition is a process in which one actively selects a piece of information from the environment and assimilates it with prior knowledge. However, little is known about the neural mechanism underlying selectivity in knowledge acquisition. Here we executed a 2-day human experiment to investigate the involvement of characteristic spontaneous activity resembling a so-called “preplay” in selectivity in sentence comprehension, an instance of knowledge acquisition. On day 1, we presented 10 sentences (prior sentences) that were difficult to understand on their own. On the following day, we first measured the resting-state functional magnetic resonance imaging (fMRI). Then, we administered a sentence comprehension task using 20 new sentences (posterior sentences). The posterior sentences were also difficult to understand on their own, but some could be associated with prior sentences to facilitate their understanding. Next, we measured the posterior sentence-induced fMRI to identify the neural representation. From the resting-state fMRI, we extracted the appearances of activity patterns similar to the neural representations for posterior sentences. Importantly, the resting-state fMRI was measured before giving the posterior sentences, and thus such appearances could be considered as preplay-like or prototypical neural representations. We compared the intensities of such appearances with the understanding of posterior sentences. This gave a positive correlation between these two variables, but only if posterior sentences were associated with prior sentences. Additional analysis showed the contribution of the entorhinal cortex, rather than the hippocampus, to the correlation. The present study suggests that prior knowledge-based arrangement of neural activity before an experience contributes to the active selection of information to be learned. Such arrangement prior to an experience resembles preplay activity observed in the rodent brain. 
In terms of knowledge acquisition, the present study leads to a new view of the brain (or more precisely of the brain’s knowledge) as an autopoietic system in which the brain (or knowledge) selects what it should learn by itself, arranges preplay-like activity as a position for the new information in advance, and actively reorganizes itself. PMID:29662446

  2. Consider the source: Children link the accuracy of text-based sources to the accuracy of the author.

    PubMed

    Vanderbilt, Kimberly E; Ochoa, Karlena D; Heilbrun, Jayd

    2018-05-06

    The present research investigated whether young children link the accuracy of text-based information to the accuracy of its author. Across three experiments, three- and four-year-olds (N = 231) received information about object labels from accurate and inaccurate sources who provided information both in text and verbally. Of primary interest was whether young children would selectively rely on information provided by more accurate sources, regardless of the form in which the information was communicated. Experiment 1 tested children's trust in text-based information (e.g., books) written by an author with a history of either accurate or inaccurate verbal testimony and found that children showed greater trust in books written by accurate authors. Experiment 2 replicated the findings of Experiment 1 and extended them by showing that children's selective trust in more accurate text-based sources was not dependent on experience trusting or distrusting the author's verbal testimony. Experiment 3 investigated this understanding in reverse by testing children's trust in verbal testimony communicated by an individual who had authored either accurate or inaccurate text-based information. Experiment 3 revealed that children showed greater trust in individuals who had authored accurate rather than inaccurate books. Experiment 3 also demonstrated that children used the accuracy of text-based sources to make inferences about the mental states of the authors. Taken together, these results suggest children do indeed link the reliability of text-based sources to the reliability of the author. Statement of Contribution. Existing knowledge: Children use sources' prior accuracy to predict future accuracy in face-to-face verbal interactions. Children who are just learning to read show increased trust in text-based (vs. verbal) information. It is unknown whether children consider authors' prior accuracy when judging the accuracy of text-based information. 
New knowledge added by this article: Preschool children track sources' accuracy across communication media, from verbal to text-based modalities and vice versa. Children link the reliability of text-based sources to the reliability of the author. © 2018 The British Psychological Society.

  3. Prior health expenditures and risk sharing with insurers competing on quality.

    PubMed

    Marchand, Maurice; Sato, Motohiro; Schokkaert, Erik

    2003-01-01

    Insurers can exploit the heterogeneity within risk-adjustment classes to select the good risks because they have more information than the regulator on the expected expenditures of individual insurees. To counteract this cream skimming, mixed systems combining capitation and cost-based payments have been adopted that do not, however, generally use the past expenditures of insurees as a risk adjuster. In this article, two symmetric insurers compete for clients by differentiating the quality of service offered to them according to some private information about their risk. In our setting it is always welfare-improving to use prior expenditures as a risk adjuster.

  4. Bayesian road safety analysis: incorporation of past evidence and effect of hyper-prior choice.

    PubMed

    Miranda-Moreno, Luis F; Heydari, Shahram; Lord, Dominique; Fu, Liping

    2013-09-01

    This paper aims to address two related issues when applying hierarchical Bayesian models for road safety analysis, namely: (a) how to incorporate available information from previous studies or past experiences in the (hyper) prior distributions for model parameters and (b) what are the potential benefits of incorporating past evidence on the results of a road safety analysis when working with scarce accident data (i.e., when calibrating models with crash datasets characterized by a very low average number of accidents and a small number of sites). A simulation framework was developed to evaluate the performance of alternative hyper-priors including informative and non-informative Gamma, Pareto, as well as Uniform distributions. Based on this simulation framework, different data scenarios (i.e., number of observations and years of data) were defined and tested using crash data collected at 3-legged rural intersections in California and crash data collected for rural 4-lane highway segments in Texas. This study shows how the accuracy of model parameter estimates (inverse dispersion parameter) is considerably improved when incorporating past evidence, in particular when working with a small number of observations and crash data with low mean. The results also illustrate that when the sample size (more than 100 sites) and the number of years of crash data are relatively large, neither the incorporation of past experience nor the choice of the hyper-prior distribution may affect the final results of a traffic safety analysis. As a potential solution to the problem of low sample mean and small sample size, this paper suggests some practical guidance on how to incorporate past evidence into informative hyper-priors. By combining evidence from past studies and data available, the model parameter estimates can be significantly improved. The effect of prior choice seems to be less important on the hotspot identification. 
The results show the benefits of incorporating prior information when working with limited crash data in road safety studies. Copyright © 2013 National Safety Council and Elsevier Ltd. All rights reserved.

  5. English Perceptive Teaching of Middle School in China--Based on an Empirical Study

    ERIC Educational Resources Information Center

    Lifen, He; Junying, Yong

    2016-01-01

    Perception is the reconstruction of, and interaction between, new information and prior knowledge in the mind during the internalization of new information. It has three teaching procedures: First, teachers guide learners to acquire the text meaning. Second, teachers create situations in practical teaching. Third, learners comprehend…

  6. Epistemic Metacognition in Context: Evaluating and Learning Online Information

    ERIC Educational Resources Information Center

    Mason, Lucia; Boldrin, Angela; Ariasi, Nicola

    2010-01-01

    This study examined epistemic metacognition as a reflective activity about knowledge and knowing in the context of online information searching on the Web, and whether it was related to prior knowledge on the topic, study approach, and domain-specific beliefs about science. In addition, we investigated whether Internet-based learning was…

  7. Bayesian generalized linear mixed modeling of Tuberculosis using informative priors.

    PubMed

    Ojo, Oluwatobi Blessing; Lougue, Siaka; Woldegerima, Woldegebriel Assefa

    2017-01-01

    TB is rated as one of the world's deadliest diseases, and South Africa ranks 9th among the 22 countries hardest hit by TB. Although much research has been carried out on this subject, this paper goes a step further by incorporating past knowledge into the model, using a Bayesian approach with an informative prior. The Bayesian approach is becoming popular in data analysis, but most applications of Bayesian inference are limited to situations with non-informative priors, where there is no solid external information about the distribution of the parameter of interest. The main aim of this study is to profile people living with TB in South Africa. In this paper, identical regression models are fitted under the classical approach and under Bayesian approaches with both non-informative and informative priors, using South Africa General Household Survey (GHS) data for the year 2014. For the Bayesian model with an informative prior, the South Africa General Household Survey datasets for the years 2011 to 2013 are used to set up priors for the 2014 model.
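
    As a hedged sketch of the prior-construction step described above (illustrative only; the paper fits full generalized linear mixed models, and all numbers and function names here are hypothetical): prevalence estimates from earlier survey years can be moment-matched to a Beta prior, which a binomial likelihood for the new year then updates.

```python
def beta_from_history(rates):
    """Moment-match a Beta(a, b) prior to historical prevalence estimates."""
    m = sum(rates) / len(rates)
    v = sum((r - m) ** 2 for r in rates) / (len(rates) - 1)
    k = m * (1 - m) / v - 1  # implied prior "sample size" a + b
    return m * k, (1 - m) * k

def update(a, b, cases, n):
    """Conjugate Beta-binomial update with the new survey year's counts."""
    return a + cases, b + n - cases
```

    Tight historical estimates yield a large implied prior sample size, so the 2014-style update moves the posterior only modestly toward the new data.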

  8. Bayesian statistical ionospheric tomography improved by incorporating ionosonde measurements

    NASA Astrophysics Data System (ADS)

    Norberg, Johannes; Virtanen, Ilkka I.; Roininen, Lassi; Vierinen, Juha; Orispää, Mikko; Kauristie, Kirsti; Lehtinen, Markku S.

    2016-04-01

    We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters and use the Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient tomographic inversion algorithm with clear probabilistic interpretation. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero-mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT ultra-high-frequency incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that in comparison to the alternative prior information sources, ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the altitude distribution of electron density. With an ionosonde at continuous disposal, the presented method enhances stand-alone near-real-time ionospheric tomography for the given conditions significantly.

  9. The Development and Evaluation of a Computer-Based System for Managing the Design and Pilot-Testing of Interactive Videodisc Programs. Training and Development Research Center, Project Number Forty-Three.

    ERIC Educational Resources Information Center

    Sayre, Scott Alan

    The purpose of this study was to develop and validate a computer-based system that would allow interactive video developers to integrate and manage the design components prior to production. These components of an interactive videodisc (IVD) program include visual information in a variety of formats, audio information, and instructional techniques,…

  10. Mixture class recovery in GMM under varying degrees of class separation: frequentist versus Bayesian estimation.

    PubMed

    Depaoli, Sarah

    2013-06-01

    Growth mixture modeling (GMM) represents a technique that is designed to capture change over time for unobserved subgroups (or latent classes) that exhibit qualitatively different patterns of growth. The aim of the current article was to explore the impact of latent class separation (i.e., how similar growth trajectories are across latent classes) on GMM performance. Several estimation conditions were compared: maximum likelihood via the expectation maximization (EM) algorithm and the Bayesian framework implementing diffuse priors, "accurate" informative priors, weakly informative priors, data-driven informative priors, priors reflecting partial-knowledge of parameters, and "inaccurate" (but informative) priors. The main goal was to provide insight about the optimal estimation condition under different degrees of latent class separation for GMM. Results indicated that optimal parameter recovery was obtained through the Bayesian approach using "accurate" informative priors, and partial-knowledge priors showed promise for the recovery of the growth trajectory parameters. Maximum likelihood and the remaining Bayesian estimation conditions yielded poor parameter recovery for the latent class proportions and the growth trajectories. (PsycINFO Database Record (c) 2013 APA, all rights reserved).

  11. Health-based risk adjustment: is inpatient and outpatient diagnostic information sufficient?

    PubMed

    Lamers, L M

    Adequate risk adjustment is critical to the success of market-oriented health care reforms in many countries. Currently used risk adjusters based on demographic and diagnostic cost groups (DCGs) do not reflect expected costs accurately. This study examines the simultaneous predictive accuracy of inpatient and outpatient morbidity measures and prior costs. DCGs, pharmacy cost groups (PCGs), and prior year's costs improve the predictive accuracy of the demographic model substantially. DCGs and PCGs seem complementary in their ability to predict future costs. However, this study shows that the combination of DCGs and PCGs still leaves room for cream skimming.

  12. Bayesian inference with historical data-based informative priors improves detection of differentially expressed genes

    PubMed Central

    Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S.

    2016-01-01

    Motivation: Modern high-throughput biotechnologies such as microarray are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment, only a limited number of samples are assayed, giving rise to the classical ‘large p, small n’ problem. On the other hand, rapid proliferation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Results: Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors and used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods including the popular and state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the ‘large p, small n’ problem. Availability and implementation: Our method is implemented in R package IPBT, which is freely available from https://github.com/benliemory/IPBT. Contact: yuzhu@purdue.edu; zhaohui.qin@emory.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26519502
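
    The borrowing-of-strength idea, deriving an informative prior from historical experiments and blending it with new data, can be sketched in miniature with the classic precision-weighted variance shrinkage; the actual IPBT model is richer (and is an R package), so the function below is only an illustrative analogue:

```python
def shrunken_variance(sample_var, df, prior_var, prior_df):
    """Empirical-Bayes posterior variance for one gene: a degrees-of-freedom-
    weighted blend of a historical (prior) variance and the new experiment's
    sample variance. With prior_df = 0 there is no historical information
    and the estimate reduces to the raw sample variance."""
    return (prior_df * prior_var + df * sample_var) / (prior_df + df)
```

    A noisy per-gene variance from a small new experiment is pulled toward the historical value, stabilizing downstream test statistics, which is one reason historical priors help in the 'large p, small n' setting.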

  13. Confidence of compliance: a Bayesian approach for percentile standards.

    PubMed

    McBride, G B; Ellis, J C

    2001-04-01

    Rules for assessing compliance with percentile standards commonly limit the number of exceedances permitted in a batch of samples taken over a defined assessment period. Such rules are commonly developed using classical statistical methods. Results from alternative Bayesian methods are presented (using beta-distributed prior information and a binomial likelihood), resulting in "confidence of compliance" graphs. These allow simple reading of the consumer's risk and the supplier's risk for any proposed rule. The influence of the prior assumptions required by the Bayesian technique on the confidence results is demonstrated, using two reference priors (uniform and Jeffreys') and also using optimistic and pessimistic user-defined priors. All four give less pessimistic results than does the classical technique, because interpreting classical results as "confidence of compliance" actually invokes a Bayesian approach with an extreme prior distribution. Jeffreys' prior is shown to be the most generally appropriate choice of prior distribution. Cost savings can be expected using rules based on this approach.
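
    The beta-binomial machinery in this abstract can be sketched directly: with a Beta(a, b) prior and k exceedances in n samples, the posterior is Beta(a + k, b + n - k), and the "confidence of compliance" is that posterior's CDF at the permitted exceedance rate. A minimal sketch using Jeffreys' prior by default (midpoint-rule integration stands in for a library beta CDF; names are illustrative):

```python
import math

def log_beta(a, b):
    """log of the Beta function B(a, b)."""
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def confidence_of_compliance(exceedances, n, p_limit, a=0.5, b=0.5, steps=20000):
    """Posterior P(true exceedance rate <= p_limit) under a Beta(a, b) prior
    (Jeffreys' prior by default) and a binomial likelihood. The posterior is
    Beta(a + k, b + n - k); its density is integrated by the midpoint rule."""
    ap, bp = a + exceedances, b + n - exceedances
    lb = log_beta(ap, bp)
    h = p_limit / steps
    total = 0.0
    for i in range(steps):
        x = (i + 0.5) * h
        total += math.exp((ap - 1) * math.log(x) + (bp - 1) * math.log(1 - x) - lb)
    return total * h
```

    Reading such a curve for each candidate rule (maximum exceedances allowed) gives the consumer's and supplier's risks the abstract refers to.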

  14. Integration of prior knowledge into dense image matching for video surveillance

    NASA Astrophysics Data System (ADS)

    Menze, M.; Heipke, C.

    2014-08-01

    Three-dimensional information from dense image matching is a valuable input for a broad range of vision applications. While reliable approaches exist for dedicated stereo setups, they do not easily generalize to more challenging camera configurations. In the context of video surveillance the typically large spatial extent of the region of interest and repetitive structures in the scene render the application of dense image matching a challenging task. In this paper we present an approach that derives strong prior knowledge from a planar approximation of the scene. This information is integrated into a graph-cut-based image matching framework that treats the assignment of optimal disparity values as a labelling task. Introducing the planar prior heavily reduces ambiguities together with the search space and increases computational efficiency. The results provide a proof of concept of the proposed approach. It allows the reconstruction of dense point clouds in more general surveillance camera setups with wider stereo baselines.

  15. Written Informed-Consent Statutes and HIV Testing

    PubMed Central

    Ehrenkranz, Peter D.; Pagán, José A.; Begier, Elizabeth M.; Linas, Benjamin; Madison, Kristin; Armstrong, Katrina

    2009-01-01

    Background Almost 1 million Americans are infected with HIV, yet it is estimated that as many as 250,000 of them do not know their serostatus. This study examined whether people residing in states with statutes requiring written informed consent prior to HIV testing were less likely to report a recent HIV test. Methods The study is based on survey data from the 2004 Behavioral Risk Factor Surveillance System. Logistic regression was used to assess the association between residence in a state with a pre-test written informed-consent requirement and individual self-report of recent HIV testing. The regression analyses controlled for potential state- and individual-level confounders. Results Almost 17% of respondents reported that they had been tested for HIV in the prior 12 months. Ten states had statutes requiring written informed consent prior to routine HIV testing; nine of those were analyzed in this study. After adjusting for other state- and individual-level factors, people who resided in these nine states were less likely to report a recent history of HIV testing (OR=0.85; 95% CI=0.80, 0.90). The average marginal effect was −0.02 (p<0.001, 95%CI= −0.03, −0.01); thus, written informed-consent statutes are associated with a 12% reduction in HIV testing from the baseline testing level of 17%. The association between a consent requirement and lack of testing was greatest among respondents who denied HIV risk factors, were non-Hispanic whites, or who had higher levels of education. Conclusions This study’s findings suggest that the removal of written informed-consent requirements might promote the non–risk-based routine-testing approach that the CDC advocates in its new testing guidelines. PMID:19423271

  16. Extracting Prior Distributions from a Large Dataset of In-Situ Measurements to Support SWOT-based Estimation of River Discharge

    NASA Astrophysics Data System (ADS)

    Hagemann, M.; Gleason, C. J.

    2017-12-01

    The upcoming (2021) Surface Water and Ocean Topography (SWOT) NASA satellite mission aims, in part, to estimate discharge on major rivers worldwide using reach-scale measurements of stream width, slope, and height. Current formalizations of channel and floodplain hydraulics are insufficient to fully constrain this problem mathematically, resulting in an infinitely large solution set for any set of satellite observations. Recent work has reformulated this problem in a Bayesian statistical setting, in which the likelihood distributions derive directly from hydraulic flow-law equations. When coupled with prior distributions on unknown flow-law parameters, this formulation probabilistically constrains the parameter space, and results in a computationally tractable description of discharge. Using a curated dataset of over 200,000 in-situ acoustic Doppler current profiler (ADCP) discharge measurements from over 10,000 USGS gaging stations throughout the United States, we developed empirical prior distributions for flow-law parameters that are not observable by SWOT, but that are required in order to estimate discharge. This analysis quantified prior uncertainties on quantities including cross-sectional area, at-a-station hydraulic geometry width exponent, and discharge variability, that are dependent on SWOT-observable variables including reach-scale statistics of width and height. When compared against discharge estimation approaches that do not use this prior information, the Bayesian approach using ADCP-derived priors demonstrated consistently improved performance across a range of performance metrics. This Bayesian approach formally transfers information from in-situ gaging stations to remote-sensed estimation of discharge, in which the desired quantities are not directly observable. Further investigation using large in-situ datasets is therefore a promising way forward in improving satellite-based estimates of river discharge.

  17. Fuzzy-based propagation of prior knowledge to improve large-scale image analysis pipelines

    PubMed Central

    Mikut, Ralf

    2017-01-01

    Many automatically analyzable scientific questions are well-posed and a variety of information about expected outcomes is available a priori. Although often neglected, this prior knowledge can be systematically exploited to make automated analysis operations sensitive to a desired phenomenon or to evaluate extracted content with respect to this prior knowledge. For instance, the performance of processing operators can be greatly enhanced by a more focused detection strategy and by direct information about the ambiguity inherent in the extracted data. We present a new concept that increases the result quality awareness of image analysis operators by estimating and distributing the degree of uncertainty involved in their output based on prior knowledge. This allows the use of simple processing operators that are suitable for analyzing large-scale spatiotemporal (3D+t) microscopy images without compromising result quality. On the foundation of fuzzy set theory, we transform available prior knowledge into a mathematical representation and extensively use it to enhance the result quality of various processing operators. These concepts are illustrated on a typical bioimage analysis pipeline comprised of seed point detection, segmentation, multiview fusion and tracking. The functionality of the proposed approach is further validated on a comprehensive simulated 3D+t benchmark data set that mimics embryonic development and on large-scale light-sheet microscopy data of a zebrafish embryo. The general concept introduced in this contribution represents a new approach to efficiently exploit prior knowledge to improve the result quality of image analysis pipelines. The generality of the concept makes it applicable to practically any field with processing strategies that are arranged as linear pipelines. The automated analysis of terabyte-scale microscopy data will especially benefit from sophisticated and efficient algorithms that enable a quantitative and fast readout. 
PMID:29095927

  18. 21 CFR 1.282 - What must you do if information changes after you have received confirmation of a prior notice...

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...

  19. 21 CFR 1.282 - What must you do if information changes after you have received confirmation of a prior notice...

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...

  20. 21 CFR 1.282 - What must you do if information changes after you have received confirmation of a prior notice...

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...

  1. 21 CFR 1.282 - What must you do if information changes after you have received confirmation of a prior notice...

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...

  2. 21 CFR 1.282 - What must you do if information changes after you have received confirmation of a prior notice...

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... have received confirmation of a prior notice from FDA? 1.282 Section 1.282 Food and Drugs FOOD AND DRUG... changes after you have received confirmation of a prior notice from FDA? (a)(1) If any of the information... information), changes after you receive notice that FDA has confirmed your prior notice submission for review...

  3. Generation Psy: Student Characteristics and Academic Achievement in a Three-Year Problem-Based Learning Bachelor Program

    ERIC Educational Resources Information Center

    de Koning, Bjorn B.; Loyens, Sofie M. M.; Rikers, Remy M. J. P.; Smeets, Guus; van der Molen, Henk T.

    2012-01-01

    This study investigated the simultaneous impact of demographic, personality, intelligence, and (prior) study performance factors on students' academic achievement in a three-year academic problem-based psychology program. Information regarding students' gender, age, nationality, pre-university education, high school grades, Big Five personality…

  4. Propagation of population pharmacokinetic information using a Bayesian approach: comparison with meta-analysis.

    PubMed

    Dokoumetzidis, Aristides; Aarons, Leon

    2005-08-01

    We investigated the propagation of population pharmacokinetic information across clinical studies by applying Bayesian techniques. The aim was to summarize the population pharmacokinetic estimates of a study in appropriate statistical distributions in order to use them as Bayesian priors in subsequent population pharmacokinetic analyses. Various data sets of simulated and real clinical data were fitted with WinBUGS, with and without informative priors. The posterior estimates from fittings with non-informative priors were used to build parametric informative priors, and the whole procedure was carried out in a consecutive manner. The posterior distributions of the fittings with informative priors were compared to those of the meta-analysis fittings of the respective combinations of data sets. Good agreement was found for the simulated and experimental datasets when the populations were exchangeable, with the posterior distributions from the fittings with the prior nearly identical to those estimated with meta-analysis. However, when populations were not exchangeable, an alternative parametric form for the prior, the natural conjugate prior, had to be used in order to obtain consistent results. In conclusion, the results of a population pharmacokinetic analysis may be summarized in Bayesian prior distributions that can be used consecutively with other analyses. The procedure is an alternative to meta-analysis and gives comparable results. It has the advantage that it is faster than meta-analysis, owing to the large datasets used with the latter, and can be performed when the data included in the prior are not actually available.
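
    The carry-forward idea above, summarizing one fit's posterior as a parametric distribution and using it as the prior for the next fit, can be illustrated with a toy conjugate normal model. This is a minimal sketch, not the paper's WinBUGS population pharmacokinetic models:

```python
import numpy as np

def posterior_normal(prior_mean, prior_var, data, sigma2):
    """Conjugate normal update for a mean with known sampling variance."""
    post_var = 1.0 / (1.0 / prior_var + len(data) / sigma2)
    post_mean = post_var * (prior_mean / prior_var + np.sum(data) / sigma2)
    return post_mean, post_var

rng = np.random.default_rng(0)
sigma2 = 1.0
study1 = rng.normal(2.0, 1.0, size=50)
study2 = rng.normal(2.0, 1.0, size=50)

# Step 1: fit study 1 with a vague prior, then summarise the posterior
# in a parametric (normal) distribution.
m1, v1 = posterior_normal(0.0, 1e6, study1, sigma2)
# Step 2: carry that summary forward as the informative prior for study 2.
m2, v2 = posterior_normal(m1, v1, study2, sigma2)

# "Meta-analysis": fit the pooled data in one step with the vague prior.
m_pool, v_pool = posterior_normal(0.0, 1e6,
                                  np.concatenate([study1, study2]), sigma2)
# For a conjugate model the sequential and pooled posteriors coincide.
```

    For exchangeable conjugate models the two routes agree exactly; the paper's point is that with non-conjugate parametric summaries and non-exchangeable populations the agreement can break down.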

  5. A New Approach for Progressive Dense Reconstruction from Consecutive Images Based on Prior Low-Density 3D Point Clouds

    NASA Astrophysics Data System (ADS)

    Lari, Z.; El-Sheimy, N.

    2017-09-01

    In recent years, the increasing incidence of climate-related disasters has tremendously affected our environment. In order to effectively manage and reduce the dramatic impacts of such events, the development of timely disaster management plans is essential. Since these disasters are spatial phenomena, timely provision of geospatial information is crucial for effective development of response and management plans. Due to the inaccessibility of the affected areas and the limited budgets of first responders, timely acquisition of the required geospatial data for these applications is usually possible only using low-cost imaging and georeferencing sensors mounted on unmanned platforms. Despite rapid collection of the required data using these systems, available processing techniques are not yet capable of delivering geospatial information to responders and decision makers in a timely manner. To address this issue, this paper introduces a new technique for dense 3D reconstruction of the affected scenes which can deliver and improve the needed geospatial information incrementally. This approach is implemented based on prior 3D knowledge of the scene and employs computationally efficient 2D triangulation, feature descriptor, feature matching and point verification techniques to optimize and speed up the 3D dense scene reconstruction procedure. To verify the feasibility and computational efficiency of the proposed approach, an experiment using a set of consecutive images collected onboard a UAV platform and prior low-density airborne laser scanning data over the same area is conducted, and step-by-step results are provided. A comparative analysis of the proposed approach and an available image-based dense reconstruction technique is also conducted to demonstrate the computational efficiency and competency of this technique for delivering geospatial information with pre-specified accuracy.

  6. Parallelized Bayesian inversion for three-dimensional dental X-ray imaging.

    PubMed

    Kolehmainen, Ville; Vanne, Antti; Siltanen, Samuli; Järvenpää, Seppo; Kaipio, Jari P; Lassas, Matti; Kalke, Martti

    2006-02-01

    Diagnostic and operational tasks based on dental radiology often require three-dimensional (3-D) information that is not available in a single X-ray projection image. Comprehensive 3-D information about tissues can be obtained by computerized tomography (CT) imaging. However, in dental imaging a conventional CT scan may not be available or practical because of high radiation dose, low resolution, or the cost of the CT scanner equipment. In this paper, we consider a novel type of 3-D imaging modality for dental radiology. We consider situations in which projection images of the teeth are taken from a few sparsely distributed projection directions using the dentist's regular (digital) X-ray equipment and the 3-D X-ray attenuation function is reconstructed. A complication in these experiments is that the reconstruction of the 3-D structure based on a few projection images becomes an ill-posed inverse problem. Bayesian inversion is a well-suited framework for reconstruction from such incomplete data. In Bayesian inversion, the ill-posed reconstruction problem is formulated in a well-posed probabilistic form in which a priori information is used to compensate for the incomplete information of the projection data. In this paper we propose a Bayesian method for 3-D reconstruction in dental radiology. The method is partially based on Kolehmainen et al. 2003. The prior model for dental structures consists of a weighted l1 and total variation (TV) prior together with a positivity prior. The inverse problem is stated as finding the maximum a posteriori (MAP) estimate. To make the 3-D reconstruction computationally feasible, a parallelized version of an optimization algorithm is implemented for a Beowulf cluster computer. The method is tested with projection data from dental specimens and patient data. Tomosynthetic reconstructions are given as a reference for the proposed method.
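
    The MAP formulation described above, a data-fit term plus TV and positivity priors compensating for few projections, can be illustrated in one dimension. This is a minimal sketch, not the paper's parallelized 3-D algorithm: a random matrix stands in for the sparse-view projection operator, the TV term is smoothed so a generic quasi-Newton solver applies, and positivity is enforced through box bounds:

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
n, m = 40, 15                 # 40 unknowns, only 15 measurements: ill-posed
x_true = np.zeros(n)
x_true[10:20] = 1.0           # piecewise-constant object (bone-like block)
A = rng.normal(size=(m, n))   # stand-in for a sparse-view projection operator
y = A @ x_true + 0.01 * rng.normal(size=m)

lam, eps = 0.1, 1e-6

def objective(x):
    # Data fit + smoothed total-variation prior; positivity via bounds below.
    resid = A @ x - y
    tv = np.sum(np.sqrt(np.diff(x) ** 2 + eps))
    return resid @ resid + lam * tv

res = minimize(objective, np.zeros(n), method="L-BFGS-B",
               bounds=[(0.0, None)] * n)
x_map = res.x   # MAP estimate under the TV + positivity prior
```

    Despite having fewer measurements than unknowns, the TV and positivity priors recover the blocky profile, which is the role a priori information plays in the well-posed probabilistic formulation.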

  7. Estimating Bayesian Phylogenetic Information Content

    PubMed Central

    Lewis, Paul O.; Chen, Ming-Hui; Kuo, Lynn; Lewis, Louise A.; Fučíková, Karolina; Neupane, Suman; Wang, Yu-Bo; Shi, Daoyuan

    2016-01-01

    Measuring the phylogenetic information content of data has a long history in systematics. Here we explore a Bayesian approach to information content estimation. The entropy of the posterior distribution compared with the entropy of the prior distribution provides a natural way to measure information content. If the data have no information relevant to ranking tree topologies beyond the information supplied by the prior, the posterior and prior will be identical. Information in data discourages consideration of some hypotheses allowed by the prior, resulting in a posterior distribution that is more concentrated (has lower entropy) than the prior. We focus on measuring information about tree topology using marginal posterior distributions of tree topologies. We show that both the accuracy and the computational efficiency of topological information content estimation improve with use of the conditional clade distribution, which also allows topological information content to be partitioned by clade. We explore two important applications of our method: providing a compelling definition of saturation and detecting conflict among data partitions that can negatively affect analyses of concatenated data. [Bayesian; concatenation; conditional clade distribution; entropy; information; phylogenetics; saturation.] PMID:27155008
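
    The entropy comparison described above is straightforward to compute for a marginal posterior over tree topologies. A minimal sketch, where the posterior probabilities are hypothetical numbers standing in for MCMC output (five taxa give 15 unrooted topologies):

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Discrete uniform prior over the 15 unrooted topologies of 5 taxa.
n_topologies = 15
prior = np.full(n_topologies, 1.0 / n_topologies)

# Hypothetical marginal posterior over topologies (e.g. from MCMC samples).
posterior = np.array([0.70, 0.15, 0.05, 0.04, 0.03, 0.02, 0.01] + [0.0] * 8)

# Information content = reduction in entropy from prior to posterior.
info = entropy(prior) - entropy(posterior)
print(f"prior H = {entropy(prior):.3f} bits, "
      f"posterior H = {entropy(posterior):.3f} bits, info = {info:.3f} bits")
```

    Data with no information about topology leave the posterior equal to the prior (info = 0); a fully saturated alignment behaves the same way, which is the basis of the paper's definition of saturation.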

  8. Bayesian generalized linear mixed modeling of Tuberculosis using informative priors

    PubMed Central

    Woldegerima, Woldegebriel Assefa

    2017-01-01

    TB is rated as one of the world's deadliest diseases, and South Africa ranks 9th among the 22 countries hardest hit by TB. Although many studies have been carried out on this subject, this paper goes a step further by incorporating past knowledge into the model, using a Bayesian approach with informative priors. The Bayesian approach is gaining popularity in data analysis, but most applications of Bayesian inference are limited to situations with non-informative priors, where there is no solid external information about the distribution of the parameter of interest. The main aim of this study is to profile people living with TB in South Africa. In this paper, identical regression models are fitted using the classical approach and the Bayesian approach with both non-informative and informative priors, using South Africa General Household Survey (GHS) data for the year 2014. For the Bayesian model with informative priors, the South Africa General Household Survey datasets for the years 2011 to 2013 are used to set up priors for the 2014 model. PMID:28257437

  9. Comprehending and Learning from Internet Sources: Processing Patterns of Better and Poorer Learners

    ERIC Educational Resources Information Center

    Goldman, Susan R.; Braasch, Jason L. G.; Wiley, Jennifer; Graesser, Arthur C.; Brodowinska, Kamila

    2012-01-01

    Readers increasingly attempt to understand and learn from information sources they find on the Internet. Doing so highlights the crucial role that evaluative processes play in selecting and making sense of the information. In a prior study, Wiley et al. (2009, Experiment 1) asked undergraduates to perform a web-based inquiry task about volcanoes…

  10. Improved compressed sensing-based cone-beam CT reconstruction using adaptive prior image constraints

    NASA Astrophysics Data System (ADS)

    Lee, Ho; Xing, Lei; Davidi, Ran; Li, Ruijiang; Qian, Jianguo; Lee, Rena

    2012-04-01

    Volumetric cone-beam CT (CBCT) images are acquired repeatedly during a course of radiation therapy and a natural question to ask is whether CBCT images obtained earlier in the process can be utilized as prior knowledge to reduce patient imaging dose in subsequent scans. The purpose of this work is to develop an adaptive prior image constrained compressed sensing (APICCS) method to solve this problem. Reconstructed images using full projections are taken on the first day of radiation therapy treatment and are used as prior images. The subsequent scans are acquired using a protocol of sparse projections. In the proposed APICCS algorithm, the prior images are utilized as an initial guess and are incorporated into the objective function in the compressed sensing (CS)-based iterative reconstruction process. Furthermore, the prior information is employed to detect any possible mismatched regions between the prior and current images for improved reconstruction. For this purpose, the prior images and the reconstructed images are classified into three anatomical regions: air, soft tissue and bone. Mismatched regions are identified by local differences of the corresponding groups in the two classified sets of images. A distance transformation is then introduced to convert the information into an adaptive voxel-dependent relaxation map. In constructing the relaxation map, the matched regions (unchanged anatomy) between the prior and current images are assigned with smaller weight values, which are translated into less influence on the CS iterative reconstruction process. On the other hand, the mismatched regions (changed anatomy) are associated with larger values and the regions are updated more by the new projection data, thus avoiding any possible adverse effects of prior images. The APICCS approach was systematically assessed by using patient data acquired under standard and low-dose protocols for qualitative and quantitative comparisons. 
The APICCS method effectively enhances image quality in the matched regions between the prior and current images compared with the existing PICCS algorithm. Compared to current CBCT imaging protocols, the APICCS algorithm allows an imaging dose reduction of 10-40 times, owing to the greatly reduced number of projections and the lower X-ray tube current of the low-dose protocol.
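
    The relaxation-map construction described above (classify the images, find mismatched regions, apply a distance transformation, convert to voxel weights) can be sketched on a toy 2D slice. The exponential distance-to-weight mapping and its scale are illustrative assumptions, not the paper's exact function:

```python
import numpy as np
from scipy import ndimage

# Toy classified slices: 0 = air, 1 = soft tissue, 2 = bone.
prior_img = np.zeros((8, 8), dtype=int)
current_img = np.zeros((8, 8), dtype=int)
prior_img[2:6, 2:6] = 1
current_img[2:6, 2:6] = 1
current_img[4:6, 4:6] = 2            # anatomy changed here (new "bone")

mismatch = prior_img != current_img  # changed-anatomy mask

# Euclidean distance from each voxel to the nearest mismatched voxel.
dist = ndimage.distance_transform_edt(~mismatch)

# Map distance to a relaxation weight in (0, 1]: mismatched regions get
# weight 1 (rely on the new projections), distant matched regions get
# weights near 0 (trust the prior image).
relax = np.exp(-dist / 2.0)
```

    In the APICCS objective, small weights keep matched voxels close to the prior image, while weights near 1 let the sparse new projections dominate the update, avoiding adverse effects of an outdated prior.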

  11. Infrared Instrument for Detecting Hydrogen Fires

    NASA Technical Reports Server (NTRS)

    Youngquist, Robert; Ihlefeld, Curtis; Immer, Christopher; Oostdyk, Rebecca; Cox, Robert; Taylor, John

    2006-01-01

    The figure shows an instrument incorporating an infrared camera for detecting small hydrogen fires. The instrument has been developed as an improved replacement for prior infrared and ultraviolet instruments used to detect hydrogen fires. The need for this or any such instrument arises because hydrogen fires (e.g., those associated with leaks from tanks, valves, and ducts) pose a great danger, yet they emit so little visible light that they are mostly undetectable by the unaided human eye. The main performance advantage offered by the present instrument over prior hydrogen-fire-detecting instruments lies in its greater ability to avoid false alarms by discriminating against reflected infrared light, including that originating in (1) the Sun, (2) welding torches, and (3) deliberately ignited hydrogen flames (e.g., ullage-burn-off flames) that are nearby but outside the field of view intended to be monitored by the instrument. Like prior such instruments, this instrument is based mostly on the principle of detecting infrared emission above a threshold level. However, in addition, this instrument utilizes information on the spatial distribution of infrared light from a source that it detects. Because the combination of spatial and threshold information about a flame tends to constitute a unique signature that differs from that of reflected infrared light originating in a source not in the field of view, the incidence of false alarms is reduced substantially below that of related prior threshold-based instruments.

  12. Variable Selection with Prior Information for Generalized Linear Models via the Prior LASSO Method.

    PubMed

    Jiang, Yuan; He, Yunxiao; Zhang, Heping

    LASSO is a popular statistical tool often used in conjunction with generalized linear models that can simultaneously select variables and estimate parameters. When there are many variables of interest, as in current biological and biomedical studies, the power of LASSO can be limited. Fortunately, so much biological and biomedical data have been collected and they may contain useful information about the importance of certain variables. This paper proposes an extension of LASSO, namely, prior LASSO (pLASSO), to incorporate that prior information into penalized generalized linear models. The goal is achieved by adding in the LASSO criterion function an additional measure of the discrepancy between the prior information and the model. For linear regression, the whole solution path of the pLASSO estimator can be found with a procedure similar to the Least Angle Regression (LARS). Asymptotic theories and simulation results show that pLASSO provides significant improvement over LASSO when the prior information is relatively accurate. When the prior information is less reliable, pLASSO shows great robustness to the misspecification. We illustrate the application of pLASSO using a real data set from a genome-wide association study.
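
    The pLASSO idea, augmenting the LASSO criterion with a measure of discrepancy between the model and prior information, can be sketched for linear regression. This toy version is an illustration rather than the paper's exact criterion: the discrepancy is a squared-error term against responses predicted from a hypothetical prior coefficient vector, which lets the combined objective be solved as an ordinary LASSO on augmented data:

```python
import numpy as np

def soft(z, t):
    """Soft-thresholding operator."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=500):
    """Coordinate-descent LASSO: min (1/2)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = np.sum(X ** 2, axis=0)
    r = y - X @ b
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]                    # remove coordinate j
            b[j] = soft(X[:, j] @ r, lam) / col_sq[j]
            r -= X[:, j] * b[j]                    # put it back
    return b

rng = np.random.default_rng(2)
n, p = 60, 20
beta_true = np.zeros(p); beta_true[:3] = [2.0, -1.5, 1.0]
X = rng.normal(size=(n, p))
y = X @ beta_true + 0.5 * rng.normal(size=n)

# Hypothetical prior guess at the coefficients (e.g. from earlier studies).
beta_prior = np.zeros(p); beta_prior[:3] = [1.8, -1.2, 0.8]
eta = 1.0    # weight on the prior-discrepancy term

# (1/2)||y - Xb||^2 + (eta/2)||Xb - X beta_prior||^2 + lam*||b||_1
# is an ordinary LASSO on stacked (augmented) data:
X_aug = np.vstack([X, np.sqrt(eta) * X])
y_aug = np.concatenate([y, np.sqrt(eta) * (X @ beta_prior)])

beta_plasso = lasso_cd(X_aug, y_aug, lam=5.0)
beta_lasso = lasso_cd(X, y, lam=5.0)
```

    By construction the prior discrepancy of the pLASSO-style fit is no larger than that of the plain LASSO fit; when the prior is accurate this pull toward `beta_prior` is the source of the efficiency gain, and `eta` controls the robustness trade-off when it is not.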

  13. Prior knowledge guided active modules identification: an integrated multi-objective approach.

    PubMed

    Chen, Weiqi; Liu, Jing; He, Shan

    2017-03-14

    Active module, defined as an area in a biological network that shows striking changes in molecular activity or phenotypic signatures, is important for revealing dynamic and process-specific information that is correlated with cellular or disease states. A prior information guided active module identification approach is proposed to detect modules that are both active and enriched in prior knowledge. We formulate the active module identification problem as a multi-objective optimisation problem, which consists of two conflicting objective functions: maximising the coverage of known biological pathways and maximising the activity of the active module. The network is constructed from a protein-protein interaction database. A beta-uniform-mixture model is used to estimate the distribution of p-values and to generate scores for activity measurement from microarray data. A multi-objective evolutionary algorithm is used to search for Pareto optimal solutions. We also incorporate a novel constraint based on algebraic connectivity to ensure the connectedness of the identified active modules. Application of the proposed algorithm to a small yeast molecular network shows that it can identify modules with high activities and with more cross-talk nodes between related functional groups. The Pareto solutions generated by the algorithm provide different trade-offs between prior knowledge and novel information from the data. The approach is then applied to microarray data from diclofenac-treated yeast cells to build a network and identify modules that elucidate the molecular mechanisms of diclofenac toxicity and resistance. Gene ontology analysis is applied to the identified modules for biological interpretation. Integrating knowledge of functional groups into the identification of active modules is an effective method and provides flexible control of the balance between a purely data-driven method and prior information guidance.

  14. In favor of general probability distributions: lateral prefrontal and insular cortices respond to stimulus inherent, but irrelevant differences.

    PubMed

    Mestres-Missé, Anna; Trampel, Robert; Turner, Robert; Kotz, Sonja A

    2016-04-01

    A key aspect of optimal behavior is the ability to predict what will come next. To achieve this, we must have a fairly good idea of the probability of occurrence of possible outcomes. This is based both on prior knowledge about a particular or similar situation and on immediately relevant new information. One question that arises is: when considering converging prior probability and external evidence, is the most probable outcome selected or does the brain represent degrees of uncertainty, even highly improbable ones? Using functional magnetic resonance imaging, the current study explored these possibilities by contrasting words that differ in their probability of occurrence, namely, unbalanced ambiguous words and unambiguous words. Unbalanced ambiguous words have a strong frequency-based bias towards one meaning, while unambiguous words have only one meaning. The current results reveal larger activation in lateral prefrontal and insular cortices in response to dominant ambiguous compared to unambiguous words even when prior and contextual information biases one interpretation only. These results suggest a probability distribution, whereby all outcomes and their associated probabilities of occurrence--even if very low--are represented and maintained.

  15. Bayes factors based on robust TDT-type tests for family trio design.

    PubMed

    Yuan, Min; Pan, Xiaoqing; Yang, Yaning

    2015-06-01

    Adaptive transmission disequilibrium test (aTDT) and the MAX3 test are two robust and efficient association tests for case-parent family trio data. Both tests incorporate information from the common genetic models (recessive, additive and dominant) and are efficient in power and robust to genetic model specification. The aTDT uses information on departure from Hardy-Weinberg disequilibrium to identify the potential genetic model underlying the data and then applies the corresponding TDT-type test, while the MAX3 test is defined as the maximum of the absolute values of the three TDT-type tests under the three common genetic models. In this article, we propose three robust Bayes procedures for association analysis with the case-parent trio design: the aTDT-based Bayes factor, the MAX3-based Bayes factor and Bayes model averaging (BMA). The asymptotic distributions of the aTDT under the null and alternative hypotheses are derived in order to calculate its Bayes factor. Extensive simulations show that the Bayes factors and the p-values of the corresponding tests are generally consistent, and that the Bayes factors are robust to genetic model specification, especially when the priors on the genetic models are equal. When equal priors are used for the underlying genetic models, the Bayes factor method based on aTDT is more powerful than those based on MAX3 and Bayes model averaging. When the prior places a small (large) probability on the true model, the Bayes factor based on aTDT (BMA) is more powerful. Analysis of simulated rheumatoid arthritis data from GAW15 is presented to illustrate applications of the proposed methods.
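
    The Bayes model averaging step can be sketched directly: given per-model Bayes factors and prior probabilities on the genetic models, the averaged Bayes factor is the prior-weighted sum. The numbers below are hypothetical, for illustration only:

```python
# Hypothetical Bayes factors (H1 vs H0) under each genetic model,
# and equal prior model probabilities as in the equal-priors scenario.
bf = {"recessive": 1.8, "additive": 25.0, "dominant": 6.5}
prior = {"recessive": 1 / 3, "additive": 1 / 3, "dominant": 1 / 3}

# Bayes model averaging: prior-weighted average of per-model Bayes factors.
bf_bma = sum(prior[m] * bf[m] for m in bf)
print(bf_bma)
```

    Placing a larger prior probability on the true model inflates its contribution to the sum, which is why BMA gains power when the prior favours the correct model.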

  16. 40 CFR 60.2953 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... and Reporting § 60.2953 What information must I submit prior to initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s...

  17. 40 CFR 60.2195 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... What information must I submit prior to initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned...

  18. 40 CFR 60.2953 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... and Reporting § 60.2953 What information must I submit prior to initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s...

  19. 40 CFR 60.2195 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... What information must I submit prior to initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned...

  20. DNA-testing for BRCA1/2 prior to genetic counselling in patients with breast cancer: design of an intervention study, DNA-direct.

    PubMed

    Sie, Aisha S; Spruijt, Liesbeth; van Zelst-Stams, Wendy A G; Mensenkamp, Arjen R; Ligtenberg, Marjolijn J; Brunner, Han G; Prins, Judith B; Hoogerbrugge, Nicoline

    2012-05-08

    Current practice for patients with breast cancer referred for genetic counseling includes face-to-face consultations with a genetic counselor prior to and following DNA-testing. This is based on guidelines regarding Huntington's disease, in anticipation of the high psychosocial impact of DNA-testing for mutations in BRCA1/2 genes. The initial consultation covers generic information regarding hereditary breast cancer and the (im)possibilities of DNA-testing, prior to such testing. Patients with breast cancer may see this information as irrelevant or unnecessary because individual genetic advice depends on DNA-test results. Also, verbal information is not always remembered well by patients. A different format for this information prior to DNA-testing is possible: replacing the initial face-to-face genetic counseling (DNA-intake procedure) with telephone, written and digital information sent to patients' homes (DNA-direct procedure). In this intervention study, 150 patients with breast cancer referred to the department of Clinical Genetics of the Radboud University Nijmegen Medical Centre are given the choice between two procedures, DNA-direct (intervention group) or DNA-intake (usual care, control group). During a triage telephone call, patients are excluded if they have problems with Dutch text, family communication, or of a psychological or psychiatric nature. Primary outcome measures are satisfaction and psychological distress. Secondary outcome measures are determinants for the participant's choice of procedure, waiting and processing times, and family characteristics. Data are collected by self-report questionnaires at baseline and following completion of genetic counseling. A minority of participants will receive an invitation for a 30 min semi-structured telephone interview, e.g. confirmed carriers of a BRCA1/2 mutation, and those who report problems with the procedure. 
This study compares current practice of an intake consultation (DNA-intake) to a home informational package of telephone, written and digital information (DNA-direct) prior to DNA-testing in patients with breast cancer. The aim is to determine whether DNA-direct is an acceptable procedure for BRCA1/2 testing, in order to provide customized care to patients with breast cancer, cutting down on the period of uncertainty during this diagnostic process.

  1. Notes on the birth-death prior with fossil calibrations for Bayesian estimation of species divergence times.

    PubMed

    Dos Reis, Mario

    2016-07-19

    Constructing a multi-dimensional prior on the times of divergence (the node ages) of species in a phylogeny is not a trivial task, in particular if the prior density is the result of combining different sources of information, such as a speciation process with fossil calibration densities. Yang & Rannala (2006 Mol. Biol. Evol. 23, 212-226; doi:10.1093/molbev/msj024) laid out the general approach to combine the birth-death process with arbitrary fossil-based densities to construct a prior on divergence times. They achieved this by calculating the density of node ages without calibrations conditioned on the ages of the calibrated nodes. Here, I show that the conditional density obtained by Yang & Rannala is misspecified. The misspecified density can sometimes be quite strange-looking and can lead to unintentionally informative priors on node ages without fossil calibrations. I derive the correct density and provide a few illustrative examples. Calculation of the density involves a sum over a large set of labelled histories, and so obtaining the density in a computer program seems hard at the moment. A general algorithm that may provide a way forward is given. This article is part of the themed issue 'Dating species divergences using rocks and clocks'.

  2. Organic geochemistry data of Alaska

    USGS Publications Warehouse

    compiled by Threlkeld, Charles N.; Obuch, Raymond C.; Gunther, G.L.

    2000-01-01

    In order to archive the results of various petroleum geochemical analyses of the Alaska resource assessment, the USGS developed an Alaskan Organic Geochemical Data Base (AOGDB) in 1978 to house the data generated from USGS and subcontracted laboratories. Prior to the AOGDB, the accumulated data resided in a flat data file entitled 'PGS' that was maintained by Petroleum Information Corporation with technical input from the USGS. The information herein is a breakout of the master flat file format into a relational data base table format (akdata).

  3. JANIS-2: An Improved Version of the NEA Java-based Nuclear Data Information System

    NASA Astrophysics Data System (ADS)

    Soppera, N.; Henriksson, H.; Nouri, A.; Nagel, P.; Dupont, E.

    2005-05-01

    JANIS (JAva-based Nuclear Information Software) is a display program designed to facilitate the visualisation and manipulation of nuclear data. Its objective is to allow the user of nuclear data to access numerical and graphical representations without prior knowledge of the storage format. It offers maximum flexibility for the comparison of different nuclear data sets. Features included in the latest release are described such as direct access to centralised databases through JAVA Servlet technology.

  4. JANIS-2: An Improved Version of the NEA Java-based Nuclear Data Information System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soppera, N.; Henriksson, H.; Nagel, P.

    2005-05-24

    JANIS (JAva-based Nuclear Information Software) is a display program designed to facilitate the visualisation and manipulation of nuclear data. Its objective is to allow the user of nuclear data to access numerical and graphical representations without prior knowledge of the storage format. It offers maximum flexibility for the comparison of different nuclear data sets. Features included in the latest release are described such as direct access to centralised databases through JAVA Servlet technology.

  5. Performance of informative priors skeptical of large treatment effects in clinical trials: A simulation study.

    PubMed

    Pedroza, Claudia; Han, Weilu; Thanh Truong, Van Thi; Green, Charles; Tyson, Jon E

    2018-01-01

    One of the main advantages of Bayesian analyses of clinical trials is their ability to formally incorporate skepticism about large treatment effects through the use of informative priors. We conducted a simulation study to assess the performance of informative normal, Student-t, and beta distributions in estimating relative risk (RR) or odds ratio (OR) for binary outcomes. Simulation scenarios varied the prior standard deviation (SD; level of skepticism of large treatment effects), outcome rate in the control group, true treatment effect, and sample size. We compared the priors with regard to bias, mean squared error (MSE), and coverage of 95% credible intervals. Simulation results show that the prior SD influenced the posterior to a greater degree than the particular distributional form of the prior. For RR, priors with a 95% interval of 0.50-2.0 performed well in terms of bias, MSE, and coverage under most scenarios. For OR, priors with a wider 95% interval of 0.23-4.35 had good performance. We recommend the use of informative priors that exclude implausibly large treatment effects in analyses of clinical trials, particularly for major outcomes such as mortality.
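
    The interval-to-prior construction above is easy to reproduce: a normal prior on the log scale whose 95% interval maps to the stated RR or OR interval, followed by normal-approximation shrinkage of a trial estimate. A minimal sketch, where the trial estimate and its standard error are hypothetical:

```python
import numpy as np

# A prior "skeptical of large effects": on the log-RR scale, a mean-zero
# normal prior whose 95% interval maps to RR in (0.50, 2.0) has
# sd = log(2) / 1.96.
sd_rr = np.log(2.0) / 1.96

# Same construction for the wider OR interval (0.23, 4.35):
sd_or = np.log(4.35) / 1.96

# Normal-approximation shrinkage: combine a hypothetical trial estimate
# of log RR = log(3.0) with standard error 0.5 and the skeptical prior.
est, se = np.log(3.0), 0.5
w_prior, w_data = 1 / sd_rr ** 2, 1 / se ** 2
post_mean = (w_data * est) / (w_prior + w_data)  # prior mean is 0
post_rr = np.exp(post_mean)                      # shrunk toward RR = 1
```

    With a weak (large-SD) prior the data dominate and shrinkage is negligible, which matches the paper's finding that the prior SD matters more than the prior's distributional form.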

  6. MR-Consistent Simultaneous Reconstruction of Attenuation and Activity for Non-TOF PET/MR

    NASA Astrophysics Data System (ADS)

    Heußer, Thorsten; Rank, Christopher M.; Freitag, Martin T.; Dimitrakopoulou-Strauss, Antonia; Schlemmer, Heinz-Peter; Beyer, Thomas; Kachelrieß, Marc

    2016-10-01

    Attenuation correction (AC) is required for accurate quantification of the reconstructed activity distribution in positron emission tomography (PET). For simultaneous PET/magnetic resonance (MR), however, AC is challenging, since the MR images do not provide direct information on the attenuating properties of the underlying tissue. Standard MR-based AC does not account for the presence of bone and thus leads to an underestimation of the activity distribution. To improve quantification for non-time-of-flight PET/MR, we propose an algorithm which simultaneously reconstructs activity and attenuation distribution from the PET emission data using available MR images as anatomical prior information. The MR information is used to derive voxel-dependent expectations on the attenuation coefficients. The expectations are modeled using Gaussian-like probability functions. An iterative reconstruction scheme incorporating the prior information on the attenuation coefficients is used to update attenuation and activity distribution in an alternating manner. We tested and evaluated the proposed algorithm for simulated 3D PET data of the head and the pelvis region. Activity deviations were below 5% in soft tissue and lesions compared to the ground truth whereas standard MR-based AC resulted in activity underestimation values of up to 12%.

  7. Application of Bayesian informative priors to enhance the transferability of safety performance functions.

    PubMed

    Farid, Ahmed; Abdel-Aty, Mohamed; Lee, Jaeyoung; Eluru, Naveen

    2017-09-01

    Safety performance functions (SPFs) are essential tools for highway agencies to predict crashes, identify hotspots and assess safety countermeasures. In the Highway Safety Manual (HSM), a variety of SPFs are provided for different types of roadway facilities, crash types and severity levels. Agencies lacking the necessary resources to develop their own localized SPFs may opt to apply the HSM's SPFs to their jurisdictions. Yet, municipalities that want to develop and maintain their regional SPFs might encounter the issue of small-sample bias. Bayesian inference can address this issue by combining the current data with prior information to achieve reliable results. The essence of this approach is the application of informative priors, obtained from other SPFs or experts' experience. In this study, we investigate the applicability of informative priors for Bayesian negative binomial SPFs for rural divided multilane highway segments in Florida and California. An SPF with non-informative priors is developed for each state, and its parameters' distributions are assigned to the other state's SPF as informative priors. The performance of the SPFs is evaluated by applying each state's SPFs to the other state. The analysis is conducted for both total (KABCO) and severe (KAB) crashes. As per the results, applying one state's SPF with informative priors (the other state's SPF independent-variable estimates) to the latter state's conditions yields better goodness-of-fit (GOF) values than applying the former state's SPF with non-informative priors, for both total and severe crash SPFs. Hence, for localities that prefer to adopt SPFs from elsewhere rather than develop their own, the application of informative priors is shown to facilitate the process. Copyright © 2017 National Safety Council and Elsevier Ltd. All rights reserved.
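    The transfer step can be sketched as a normal-normal update in which one state's coefficient estimate serves as the informative prior for the other state's sparse data. The numbers below are hypothetical, and the paper's actual model is a negative binomial SPF fit by MCMC; this is only the conjugate-update intuition:

```python
def transfer_prior_update(prior_mean, prior_sd, local_est, local_se):
    """Precision-weighted combination of a borrowed (informative) prior
    and a local estimate: a normal-normal conjugate update."""
    w_prior = 1.0 / prior_sd ** 2
    w_local = 1.0 / local_se ** 2
    post_var = 1.0 / (w_prior + w_local)
    post_mean = post_var * (w_prior * prior_mean + w_local * local_est)
    return post_mean, post_var ** 0.5

# Hypothetical traffic-volume coefficient: one state's estimate as the prior,
# the other state's sparse data as the local likelihood.
m, s = transfer_prior_update(prior_mean=0.80, prior_sd=0.10,
                             local_est=1.00, local_se=0.30)
```

The posterior lands between the borrowed prior and the noisy local estimate, with uncertainty smaller than either source alone, which is the benefit informative priors offer small jurisdictions.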

  8. Risk-adjusted capitation based on the Diagnostic Cost Group Model: an empirical evaluation with health survey information.

    PubMed Central

    Lamers, L M

    1999-01-01

    OBJECTIVE: To evaluate the predictive accuracy of the Diagnostic Cost Group (DCG) model using health survey information. DATA SOURCES/STUDY SETTING: Longitudinal data collected for a sample of members of a Dutch sickness fund. In the Netherlands the sickness funds provide compulsory health insurance coverage for the 60 percent of the population in the lowest income brackets. STUDY DESIGN: A demographic model and DCG capitation models are estimated by means of ordinary least squares, with an individual's annual healthcare expenditures in 1994 as the dependent variable. For subgroups based on health survey information, costs predicted by the models are compared with actual costs. Using stepwise regression procedures a subset of relevant survey variables that could improve the predictive accuracy of the three-year DCG model was identified. Capitation models were extended with these variables. DATA COLLECTION/EXTRACTION METHODS: For the empirical analysis, panel data of sickness fund members were used that contained demographic information, annual healthcare expenditures, and diagnostic information from hospitalizations for each member. In 1993, a mailed health survey was conducted among a random sample of 15,000 persons in the panel data set, with a 70 percent response rate. PRINCIPAL FINDINGS: The predictive accuracy of the demographic model improves when it is extended with diagnostic information from prior hospitalizations (DCGs). A subset of survey variables further improves the predictive accuracy of the DCG capitation models. The predictable profits and losses based on survey information for the DCG models are smaller than for the demographic model. Most persons with predictable losses based on health survey information were not hospitalized in the preceding year. CONCLUSIONS: The use of diagnostic information from prior hospitalizations is a promising option for improving the demographic capitation payment formula. This study suggests that diagnostic information from outpatient utilization is complementary to DCGs in predicting future costs. PMID:10029506
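    The model comparison described above can be sketched with ordinary least squares on simulated data. The cost-generating numbers are invented purely to show that adding a DCG-style indicator to a demographic model improves predictive accuracy:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
# Hypothetical covariates: age (the demographic model) plus a DCG indicator
# flagging a costly diagnostic group from a prior-year hospitalization.
age = rng.uniform(20, 80, n)
dcg = rng.binomial(1, 0.1, n)
cost = 200 + 15 * age + 4000 * dcg + rng.normal(0, 500, n)

X_demo = np.column_stack([np.ones(n), age])           # demographic model
X_dcg = np.column_stack([np.ones(n), age, dcg])       # demographic + DCG

beta_demo, *_ = np.linalg.lstsq(X_demo, cost, rcond=None)
beta_dcg, *_ = np.linalg.lstsq(X_dcg, cost, rcond=None)

def r2(X, b):
    """In-sample variance explained by the fitted model."""
    resid = cost - X @ b
    return 1 - resid.var() / cost.var()

r2_demo, r2_dcg = r2(X_demo, beta_demo), r2(X_dcg, beta_dcg)
```

On this toy data the DCG-augmented model explains far more of the cost variance, mirroring the paper's finding that prior-hospitalization information improves on the purely demographic formula.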

  9. Filter-based multiscale entropy analysis of complex physiological time series.

    PubMed

    Xu, Yuesheng; Zhao, Liang

    2013-08-01

    Multiscale entropy (MSE) has been widely and successfully used in analyzing the complexity of physiological time series. We reinterpret the averaging process in MSE as filtering a time series by a filter of a piecewise constant type. From this viewpoint, we introduce filter-based multiscale entropy (FME), which filters a time series to generate multiple frequency components, and then we compute the blockwise entropy of the resulting components. By choosing filters adapted to the feature of a given time series, FME is able to better capture its multiscale information and to provide more flexibility for studying its complexity. Motivated by the heart rate turbulence theory, which suggests that the human heartbeat interval time series can be described in piecewise linear patterns, we propose piecewise linear filter multiscale entropy (PLFME) for the complexity analysis of the time series. Numerical results from PLFME are more robust to data of various lengths than those from MSE. The numerical performance of the adaptive piecewise constant filter multiscale entropy without prior information is comparable to that of PLFME, whose design takes prior information into account.
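    MSE's averaging step, viewed as a piecewise-constant filter, and the entropy computation can be sketched as follows. This is a plain MSE implementation, not the paper's PLFME; following the usual convention, the tolerance r is fixed from the original series:

```python
import numpy as np

def coarse_grain(x, scale):
    """MSE's averaging step: a piecewise-constant filter followed by
    downsampling (non-overlapping window means)."""
    n = len(x) // scale
    return x[: n * scale].reshape(n, scale).mean(axis=1)

def sample_entropy(x, m=2, r=0.2):
    """-log of the ratio of (m+1)-point to m-point template matches
    within tolerance r (Chebyshev distance, self-matches excluded)."""
    x = np.asarray(x, float)
    def matches(mm):
        t = np.array([x[i:i + mm] for i in range(len(x) - mm)])
        d = np.abs(t[:, None, :] - t[None, :, :]).max(axis=2)
        return (d <= r).sum() - len(t)      # drop the diagonal self-matches
    a, b = matches(m + 1), matches(m)
    return -np.log(a / b)

rng = np.random.default_rng(1)
noise = rng.normal(size=1000)
r = 0.15 * noise.std()                      # tolerance fixed at scale 1
entropy_scale1 = sample_entropy(coarse_grain(noise, 1), r=r)
entropy_scale5 = sample_entropy(coarse_grain(noise, 5), r=r)
```

For white noise the entropy declines with scale, the classic MSE signature that distinguishes uncorrelated noise from genuinely complex physiological signals.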

  10. 34 CFR 99.30 - Under what conditions is prior consent required to disclose information?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 34 Education 1 2010-07-01 2010-07-01 false Under what conditions is prior consent required to disclose information? 99.30 Section 99.30 Education Office of the Secretary, Department of Education FAMILY... Information From Education Records? § 99.30 Under what conditions is prior consent required to disclose...

  11. MO-C-18A-01: Advances in Model-Based 3D Image Reconstruction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, G; Pan, X; Stayman, J

    2014-06-15

    Recent years have seen the emergence of CT image reconstruction techniques that exploit physical models of the imaging system, photon statistics, and even the patient to achieve improved 3D image quality and/or reduction of radiation dose. With numerous advantages in comparison to conventional 3D filtered backprojection, such techniques bring a variety of challenges as well, including: a demanding computational load associated with sophisticated forward models and iterative optimization methods; nonlinearity and nonstationarity in image quality characteristics; a complex dependency on multiple free parameters; and the need to understand how best to incorporate prior information (including patient-specific prior images) within the reconstruction process. The advantages, however, are even greater: for example, improved image quality; reduced dose; robustness to noise and artifacts; task-specific reconstruction protocols; suitability to novel CT imaging platforms and noncircular orbits; and incorporation of known characteristics of the imager and patient that are conventionally discarded. This symposium features experts in 3D image reconstruction, image quality assessment, and the translation of such methods to emerging clinical applications. Dr. Chen will address novel methods for the incorporation of prior information in 3D and 4D CT reconstruction techniques. Dr. Pan will show recent advances in optimization-based reconstruction that enable potential reduction of dose and sampling requirements. Dr. Stayman will describe a “task-based imaging” approach that leverages models of the imaging system and patient in combination with a specification of the imaging task to optimize both the acquisition and reconstruction process. Dr. Samei will describe the development of methods for image quality assessment in such nonlinear reconstruction techniques and the use of these methods to characterize and optimize image quality and dose in a spectrum of clinical applications. Learning Objectives: Learn the general methodologies associated with model-based 3D image reconstruction. Learn the potential advantages in image quality and dose associated with model-based image reconstruction. Learn the challenges associated with computational load and image quality assessment for such reconstruction methods. Learn how imaging task can be incorporated as a means to drive optimal image acquisition and reconstruction techniques. Learn how model-based reconstruction methods can incorporate prior information to improve image quality, ease sampling requirements, and reduce dose.

  12. Use of Elaborative Interrogation to Help Students Acquire Information Consistent with Prior Knowledge and Information Inconsistent with Prior Knowledge.

    ERIC Educational Resources Information Center

    Woloshyn, Vera E.; And Others

    1994-01-01

    Thirty-two factual statements, half consistent and half not consistent with subjects' prior knowledge, were processed by 140 sixth and seventh graders. Half were directed to use elaborative interrogation (using prior knowledge) to answer why each statement was true. Across all memory measures, elaborative interrogation subjects performed better…

  13. When generating answers benefits arithmetic skill: the importance of prior knowledge.

    PubMed

    Rittle-Johnson, Bethany; Kmicikewycz, Alexander Oleksij

    2008-09-01

    People remember information better if they generate the information while studying rather than read the information. However, prior research has not investigated whether this generation effect extends to related but unstudied items and has not been conducted in classroom settings. We compared third graders' success on studied and unstudied multiplication problems after they spent a class period generating answers to problems or reading the answers from a calculator. The effect of condition interacted with prior knowledge. Students with low prior knowledge had higher accuracy in the generate condition, but as prior knowledge increased, the advantage of generating answers decreased. The benefits of generating answers may extend to unstudied items and to classroom settings, but only for learners with low prior knowledge.

  14. Mendelian randomization with Egger pleiotropy correction and weakly informative Bayesian priors.

    PubMed

    Schmidt, A F; Dudbridge, F

    2017-12-15

    The MR-Egger (MRE) estimator has been proposed to correct for directional pleiotropic effects of genetic instruments in an instrumental variable (IV) analysis. The power of this method is considerably lower than that of conventional estimators, limiting its applicability. Here we propose a novel Bayesian implementation of the MR-Egger estimator (BMRE) and explore the utility of applying weakly informative priors on the intercept term (the pleiotropy estimate) to increase power of the IV (slope) estimate. This was a simulation study to compare the performance of different IV estimators. Scenarios differed in the presence of a causal effect, the presence of pleiotropy, the proportion of pleiotropic instruments and degree of 'Instrument Strength Independent of Direct Effect' (InSIDE) assumption violation. Based on empirical plasma urate data, we present an approach to elucidate a prior distribution for the amount of pleiotropy. A weakly informative prior on the intercept term increased power of the slope estimate while maintaining type 1 error rates close to the nominal value of 0.05. Under the InSIDE assumption, performance was unaffected by the presence or absence of pleiotropy. Violation of the InSIDE assumption biased all estimators, affecting the BMRE more than the MRE method. Depending on the prior distribution, the BMRE estimator has more power at the cost of an increased susceptibility to InSIDE assumption violations. As such the BMRE method is a compromise between the MRE and conventional IV estimators, and may be an especially useful approach to account for observed pleiotropy. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
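    The core MR-Egger regression that the Bayesian version (BMRE) builds on fits the SNP-outcome associations against the SNP-exposure associations with an unconstrained intercept: the intercept estimates directional pleiotropy and the slope the causal effect. The simulated numbers below are illustrative, not the paper's urate data:

```python
import numpy as np

rng = np.random.default_rng(2)
n_snp = 50
bx = rng.uniform(0.05, 0.5, n_snp)           # SNP-exposure associations
alpha = 0.05                                  # directional pleiotropy
beta_true = 0.3                               # true causal effect
by = alpha + beta_true * bx + rng.normal(0, 0.01, n_snp)
se_by = np.full(n_snp, 0.01)                  # SEs of SNP-outcome associations

# MR-Egger: inverse-variance-weighted regression of by on bx WITH intercept.
w = 1 / se_by ** 2
X = np.column_stack([np.ones(n_snp), bx])
W = np.diag(w)
coef = np.linalg.solve(X.T @ W @ X, X.T @ W @ by)
egger_intercept, egger_slope = coef
```

Both the pleiotropy intercept and the causal slope are recovered here; the paper's contribution is to place a weakly informative prior on that intercept to regain the power the unconstrained version loses.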

  15. Health literacy: a study of internet-based information on advance directives.

    PubMed

    Stuart, Peter

    2017-11-28

    The aim of this study was to evaluate the quality and value of web-based information on advance directives. Internet-based information on advance directives was selected because, if it is inaccurate or difficult to understand, patients risk making decisions about their care that may not be followed in practice. Two validated health information evaluation tools, the Suitability Assessment of Materials and DISCERN, and a focus group were used to assess credibility, user orientation and effectiveness. Only one of the 34 internet-based information items on advance directives reviewed fulfilled the study criteria and 30% of the sites were classed as unreadable. In terms of learning and informing, 79% of the sites were considered unsuitable. Using health literacy tools to evaluate internet-based health information highlights that often it is not at a functional literacy level and neither informs nor empowers users to make independent and valid healthcare decisions. ©2017 RCN Publishing Company Ltd. All rights reserved. Not to be copied, transmitted or recorded in any way, in whole or part, without prior permission of the publishers.

  16. A Blended Approach to Active Learning in a Physiology Laboratory-Based Subject Facilitated by an e-Learning Component

    ERIC Educational Resources Information Center

    Dantas, Arianne M.; Kemm, Robert E.

    2008-01-01

    Learning via online activities (e-learning) was introduced to facilitate existing face-to-face teaching to encourage more effective student preparation and then informed participation in an undergraduate physiology laboratory-based course. Active learning was encouraged by hypothesis formation and predictions prior to classes, with opportunities…

  17. Individual Differences in Learning Entrepreneurship and Their Implications for Web-Based Instruction in E-Business and E-Commerce.

    ERIC Educational Resources Information Center

    Foster, Jonathan; Lin, Angela

    2003-01-01

    Discusses results from a survey of graduates following a module in e-business and e-commerce at the University of Sheffield that suggest differences in prior knowledge and cultural background impact students' acquisition of domain knowledge and intellectual and information research skills. Considers implications for Web-based instruction.…

  18. Finding Financial Resources for Adult Learners: Profiles for Practice.

    ERIC Educational Resources Information Center

    College Entrance Examination Board, New York, NY.

    A variety of special financial aid practices that colleges have created to meet the needs of adult students are described, based on a 1983 survey of financial aid directors from more than 100 colleges. Information is provided on campus-based sources of financial aid such as: credit for prior learning programs, financial and career information…

  19. Internet use for prediagnosis symptom appraisal by colorectal cancer patients.

    PubMed

    Thomson, Maria D; Siminoff, Laura A; Longo, Daniel R

    2012-10-01

    This study explored the characteristics of colorectal cancer (CRC) patients who accessed Internet-based health information as part of their symptom appraisal process prior to consulting a health care provider. Newly diagnosed CRC patients who experienced symptoms prior to diagnosis were interviewed. Brief COPE was used to measure patient coping. Logistic and linear regressions were used to assess Internet use and appraisal delay. Twenty-five percent of the sample (61/242) consulted the Internet prior to visiting a health care provider. Internet use was associated with having private health insurance (odds ratio [OR] = 2.55; 95% confidence interval [CI] = 1.20-5.43) and experiencing elimination symptoms (OR = 1.43; 95% CI = 1.14-1.80) and was marginally associated with age (OR = 0.96; 95% CI = 0.93-0.99). Internet use was not related to delayed medical care seeking. Internet use did not influence decisions to seek medical care. The Internet provided a preliminary information resource for individuals who experienced embarrassing CRC symptoms, had private health insurance, and were younger.

  20. Shape priors for segmentation of the cervix region within uterine cervix images

    NASA Astrophysics Data System (ADS)

    Lotenberg, Shelly; Gordon, Shiri; Greenspan, Hayit

    2008-03-01

    The work focuses on a unique medical repository of digital Uterine Cervix images ("Cervigrams") collected by the National Cancer Institute (NCI), National Institute of Health, in longitudinal multi-year studies. NCI together with the National Library of Medicine is developing a unique web-based database of the digitized cervix images to study the evolution of lesions related to cervical cancer. Tools are needed for the automated analysis of the cervigram content to support the cancer research. In recent works, a multi-stage automated system for segmenting and labeling regions of medical and anatomical interest within the cervigrams was developed. The current paper concentrates on incorporating prior-shape information in the cervix region segmentation task. In accordance with the fact that human experts mark the cervix region as circular or elliptical, two shape models (and corresponding methods) are suggested. The shape models are embedded within an active contour framework that relies on image features. Experiments indicate that incorporation of the prior shape information augments previous results.
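    The circular shape model can be sketched with an algebraic (Kasa) least-squares circle fit whose residual acts as a shape-prior energy penalizing non-circular contours. This is a simplified stand-in for the paper's active-contour formulation, with function names of our own choosing:

```python
import numpy as np

def fit_circle(points):
    """Kasa algebraic least-squares circle fit: solve the linear system
    x^2 + y^2 = 2*a*x + 2*b*y + c for center (a, b) and radius."""
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([2 * x, 2 * y, np.ones(len(x))])
    a, b, c = np.linalg.lstsq(A, x ** 2 + y ** 2, rcond=None)[0]
    return (a, b), np.sqrt(c + a ** 2 + b ** 2)

def shape_prior_energy(points):
    """Mean squared radial deviation of the contour from its best-fit circle;
    low for circular contours, high for ragged ones."""
    (a, b), r = fit_circle(points)
    d = np.hypot(points[:, 0] - a, points[:, 1] - b)
    return np.mean((d - r) ** 2)

theta = np.linspace(0, 2 * np.pi, 100, endpoint=False)
circle = np.column_stack([3 + 2 * np.cos(theta), 1 + 2 * np.sin(theta)])
ragged = circle + np.random.default_rng(3).normal(0, 0.3, circle.shape)
e_circle, e_ragged = shape_prior_energy(circle), shape_prior_energy(ragged)
```

Added to an image-feature energy, such a term pulls the evolving contour toward the circular/elliptical shapes human experts mark.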

  1. A Bayesian Alternative for Multi-objective Ecohydrological Model Specification

    NASA Astrophysics Data System (ADS)

    Tang, Y.; Marshall, L. A.; Sharma, A.; Ajami, H.

    2015-12-01

    Process-based ecohydrological models combine the study of hydrological, physical, biogeochemical and ecological processes of catchments, and are usually more complex and parametric than conceptual hydrological models. Thus, appropriate calibration objectives and model uncertainty analysis are essential for ecohydrological modeling. In recent years, Bayesian inference has become one of the most popular tools for quantifying the uncertainties in hydrological modeling, driven by the development of Markov Chain Monte Carlo (MCMC) techniques. Our study aims to develop appropriate prior distributions and likelihood functions that minimize the model uncertainties and bias within a Bayesian ecohydrological framework. In our study, a formal Bayesian approach is implemented in an ecohydrological model which combines a hydrological model (HyMOD) and a dynamic vegetation model (DVM). Simulations based on a single-objective likelihood (streamflow or LAI) and on multi-objective likelihoods (streamflow and LAI) with different weights are compared. Uniform, weakly informative and strongly informative prior distributions are used in different simulations. The Kullback-Leibler divergence (KLD) is used to measure the (dis)similarity between different priors and the corresponding posterior distributions to examine parameter sensitivity. Results show that different prior distributions can strongly influence posterior distributions for parameters, especially when the available data are limited or parameters are insensitive to the available data. We demonstrate differences in optimized parameters and uncertainty limits in different cases based on multi-objective versus single-objective likelihoods. We also demonstrate the importance of appropriately defining the weights of objectives in multi-objective calibration according to different data types.
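    The KLD-based sensitivity check can be sketched for the univariate normal case, where KL(p || q) has a closed form. The parameter values below are invented to contrast a data-sensitive parameter with an insensitive one:

```python
import math

def kl_normal(mu_p, sd_p, mu_q, sd_q):
    """KL(p || q) for two univariate normal distributions."""
    return (math.log(sd_q / sd_p)
            + (sd_p ** 2 + (mu_p - mu_q) ** 2) / (2 * sd_q ** 2) - 0.5)

# Sensitive parameter: the data shift and tighten the posterior
# relative to a standard-normal prior.
kld_sensitive = kl_normal(0.9, 0.1, 0.0, 1.0)
# Insensitive parameter: the posterior barely departs from the prior.
kld_insensitive = kl_normal(0.05, 0.95, 0.0, 1.0)
```

A large prior-to-posterior divergence signals that the data informed the parameter; a near-zero value signals that the posterior is still essentially the prior, which is exactly when the choice of prior matters most.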

  2. A new prior for bayesian anomaly detection: application to biosurveillance.

    PubMed

    Shen, Y; Cooper, G F

    2010-01-01

    Bayesian anomaly detection computes posterior probabilities of anomalous events by combining prior beliefs and evidence from data. However, the specification of prior probabilities can be challenging. This paper describes a Bayesian prior in the context of disease outbreak detection. The goal is to provide a meaningful, easy-to-use prior that yields a posterior probability of an outbreak that performs at least as well as a standard frequentist approach. If this goal is achieved, the resulting posterior could be usefully incorporated into a decision analysis about how to act in light of a possible disease outbreak. This paper describes a Bayesian method for anomaly detection that combines learning from data with a semi-informative prior probability over patterns of anomalous events. A univariate version of the algorithm is presented here for ease of illustration of the essential ideas. The paper describes the algorithm in the context of disease-outbreak detection, but it is general and can be used in other anomaly detection applications. For this application, the semi-informative prior specifies that an increased count over baseline is expected for the variable being monitored, such as the number of respiratory chief complaints per day at a given emergency department. The semi-informative prior is derived from the baseline prior, which is estimated from historical data. The evaluation reported here used semi-synthetic data to evaluate the detection performance of the proposed Bayesian method and a control chart method, a standard frequentist algorithm that is closest to the Bayesian method in terms of the type of data it uses. The disease-outbreak detection performance of the Bayesian method was statistically significantly better than that of the control chart method when proper baseline periods were used to estimate the baseline behavior and avoid seasonal effects. When using longer baseline periods, the Bayesian method performed as well as the control chart method. The time complexity of the Bayesian algorithm is linear in the number of observed events being monitored, due to a novel, closed-form derivation introduced in the paper. This paper introduces a novel prior probability for Bayesian outbreak detection that is expressive, easy to apply, computationally efficient, and performs as well as or better than a standard frequentist method.
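    The flavor of a semi-informative prior that expects an increased count over baseline can be sketched with a toy two-hypothesis Poisson model. All rates and prior weights here are invented for illustration; the paper's method is more general and closed-form:

```python
import math

def poisson_pmf(k, lam):
    return math.exp(-lam) * lam ** k / math.factorial(k)

def outbreak_posterior(count, baseline_rate, outbreak_rates,
                       prior_outbreak=0.05):
    """Posterior probability of an outbreak given today's count. The
    semi-informative prior places outbreak mass only on rates ABOVE
    baseline, spread uniformly over `outbreak_rates`."""
    like_h0 = poisson_pmf(count, baseline_rate)
    like_h1 = sum(poisson_pmf(count, lam)
                  for lam in outbreak_rates) / len(outbreak_rates)
    num = prior_outbreak * like_h1
    return num / (num + (1 - prior_outbreak) * like_h0)

elevated = [12, 16, 20, 24]   # hypothetical elevated daily-count rates
p_quiet = outbreak_posterior(9, baseline_rate=10, outbreak_rates=elevated)
p_spike = outbreak_posterior(25, baseline_rate=10, outbreak_rates=elevated)
```

A count near baseline leaves the outbreak posterior small, while a clear spike drives it close to one, which is the behavior a decision analysis can act on directly.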

  3. A comment on priors for Bayesian occupancy models.

    PubMed

    Northrup, Joseph M; Gerber, Brian D

    2018-01-01

    Understanding patterns of species occurrence and the processes underlying these patterns is fundamental to the study of ecology. One of the more commonly used approaches to investigate species occurrence patterns is occupancy modeling, which can account for imperfect detection of a species during surveys. In recent years, there has been a proliferation of Bayesian modeling in ecology, which includes fitting Bayesian occupancy models. The Bayesian framework is appealing to ecologists for many reasons, including the ability to incorporate prior information through the specification of prior distributions on parameters. While ecologists almost exclusively intend to choose priors so that they are "uninformative" or "vague", such priors can easily be unintentionally highly informative. Here we report on how the specification of a "vague" normally distributed (i.e., Gaussian) prior on coefficients in Bayesian occupancy models can unintentionally influence parameter estimation. Using both simulated data and empirical examples, we illustrate how this issue likely compromises inference about species-habitat relationships. While the extent to which these informative priors influence inference depends on the data set, researchers fitting Bayesian occupancy models should conduct sensitivity analyses to ensure intended inference, or employ less commonly used priors that are less informative (e.g., logistic or t prior distributions). We provide suggestions for addressing this issue in occupancy studies, and an online tool for exploring this issue under different contexts.
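    The effect is easy to demonstrate by simulation: a seemingly vague normal prior on a logit-scale coefficient (an SD of 5 is used here for illustration) induces a U-shaped prior on the occupancy probability, piling most of its mass near 0 and 1:

```python
import numpy as np

rng = np.random.default_rng(4)
# A "vague" Normal(0, sd=5) prior on an occupancy-model intercept...
beta = rng.normal(0, 5, 100_000)
# ...induces this prior on the occupancy probability after inverse-logit.
psi = 1 / (1 + np.exp(-beta))

extreme = np.mean((psi < 0.05) | (psi > 0.95))   # mass piled at the edges
middle = np.mean((0.4 < psi) & (psi < 0.6))      # mass near psi = 0.5
```

More than half of the supposedly uninformative prior's mass lands on occupancy probabilities below 0.05 or above 0.95, illustrating why the authors recommend sensitivity analyses or flatter alternatives such as logistic or t priors.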

  4. Accommodating Uncertainty in Prior Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Vander Wiel, Scott Alan

    2017-01-19

    A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially those involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions, and present graphical methods for dealing with them.

  5. The Latest Information on Fort Detrick Gate Access Procedures | Poster

    Cancer.gov

    As of Jan. 5, all visitors to Fort Detrick are required to undergo a National Crime Information Center background check prior to entering base. The background checks are conducted at Old Farm Gate. The new access procedures may cause delays at all Fort Detrick gates, but especially at Old Farm Gate. Access requirements have not changed for employees and personnel with a

  6. Development of schemas revealed by prior experience and NMDA receptor knock-out

    PubMed Central

    Dragoi, George; Tonegawa, Susumu

    2013-01-01

    Prior experience accelerates acquisition of novel, related information through processes like assimilation into mental schemas, but the underlying neuronal mechanisms are poorly understood. We investigated the roles that prior experience and hippocampal CA3 N-Methyl-D-aspartate receptor (NMDAR)-dependent synaptic plasticity play in CA1 place cell sequence encoding and learning during novel spatial experiences. We found that specific representations of de novo experiences on linear environments were formed on a framework of preconfigured network activity expressed in the preceding sleep and were rapidly, flexibly adjusted via NMDAR-dependent activity. This prior experience accelerated encoding of subsequent experiences on contiguous or isolated novel tracks, significantly decreasing their NMDAR-dependence. Similarly, de novo learning of an alternation task was facilitated by CA3 NMDARs; this experience accelerated subsequent learning of related tasks, independent of CA3 NMDARs, consistent with schema-based learning. These results reveal the existence of distinct neuronal encoding schemes which could explain why hippocampal dysfunction results in anterograde amnesia while sparing recollection of old, schema-based memories. DOI: http://dx.doi.org/10.7554/eLife.01326.001 PMID:24327561

  7. Estimating haplotype frequencies by combining data from large DNA pools with database information.

    PubMed

    Gasbarra, Dario; Kulathinal, Sangita; Pirinen, Matti; Sillanpää, Mikko J

    2011-01-01

    We assume that allele frequency data have been extracted from several large DNA pools, each containing genetic material of up to hundreds of sampled individuals. Our goal is to estimate the haplotype frequencies among the sampled individuals by combining the pooled allele frequency data with prior knowledge about the set of possible haplotypes. Such prior information can be obtained, for example, from a database such as HapMap. We present a Bayesian haplotyping method for pooled DNA based on a continuous approximation of the multinomial distribution. The proposed method is applicable when the sizes of the DNA pools and/or the number of considered loci exceed the limits of several earlier methods. In the example analyses, the proposed model clearly outperforms a deterministic greedy algorithm on real data from the HapMap database. With a small number of loci, the performance of the proposed method is similar to that of an EM-algorithm, which uses a multinormal approximation for the pooled allele frequencies, but which does not utilize prior information about the haplotypes. The method has been implemented using Matlab and the code is available upon request from the authors.
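    The pooled-data setup can be sketched in its simplest linear form: given a known candidate haplotype set (e.g. from HapMap), the per-locus pool allele frequencies are linear in the haplotype frequencies, which a crude least-squares estimator can invert. The haplotypes and frequencies below are invented, and the paper's Bayesian method models the pooled counts properly; this is only a sketch:

```python
import numpy as np

# Candidate haplotypes over 3 biallelic loci (rows: haplotypes, 0/1 alleles),
# as might be drawn from a reference panel such as HapMap.
H = np.array([[0, 0, 0],
              [1, 0, 0],
              [1, 1, 0],
              [1, 1, 1]], float)
f_true = np.array([0.4, 0.3, 0.2, 0.1])
pool_freqs = H.T @ f_true        # per-locus allele frequencies in the pool

# Crude estimator: least squares on the allele-frequency equations plus a
# sum-to-one constraint, projected back onto the simplex.
A = np.vstack([H.T, np.ones(len(H))])
b = np.append(pool_freqs, 1.0)
f_hat, *_ = np.linalg.lstsq(A, b, rcond=None)
f_hat = np.clip(f_hat, 0, None)
f_hat /= f_hat.sum()
```

With this candidate set the system is identifiable and the haplotype frequencies are recovered exactly; in realistic settings (many loci, many candidate haplotypes, sampling noise in the pools) the problem is underdetermined, which is where the prior database information earns its keep.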

  8. 25 CFR 167.6 - Carrying capacities.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Grazing Committee, and the Navajo Tribal Council for review and recommendations prior to presentation to...; recommendations for future adjustments to the established carrying capacities shall be made by Range Technicians based on the best information available through annual utilization studies and range condition studies...

  9. 25 CFR 167.6 - Carrying capacities.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Grazing Committee, and the Navajo Tribal Council for review and recommendations prior to presentation to...; recommendations for future adjustments to the established carrying capacities shall be made by Range Technicians based on the best information available through annual utilization studies and range condition studies...

  10. Dynamic PET Image reconstruction for parametric imaging using the HYPR kernel method

    NASA Astrophysics Data System (ADS)

    Spencer, Benjamin; Qi, Jinyi; Badawi, Ramsey D.; Wang, Guobao

    2017-03-01

    Dynamic PET image reconstruction is a challenging problem because of the ill-conditioned nature of PET and the low counting statistics resulting from the short time-frames in dynamic imaging. The kernel method for image reconstruction has been developed to improve reconstruction of low-count PET data by incorporating prior information derived from high-count composite data. In contrast to most of the existing regularization-based methods, the kernel method embeds image prior information in the forward projection model and does not require an explicit regularization term in the reconstruction formula. Inspired by the existing highly constrained back-projection (HYPR) algorithm for dynamic PET image denoising, we propose in this work a new type of kernel that is simpler to implement and further improves kernel-based dynamic PET image reconstruction. Our evaluation study, using a physical phantom scan with synthetic FDG tracer kinetics, demonstrated that the new HYPR kernel-based reconstruction can achieve a better region-of-interest (ROI) bias versus standard deviation trade-off for dynamic PET parametric imaging than the post-reconstruction HYPR denoising method and the previously used nonlocal-means kernel.
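    The underlying HYPR idea (a high-count composite modulated by a low-pass-filtered frame-to-composite ratio) can be sketched in 1D. The boxcar filter width and the simulated counts are our illustrative choices, not the paper's settings:

```python
import numpy as np

def smooth(x, width=11):
    """Boxcar low-pass filter F used in the HYPR ratio."""
    kernel = np.ones(width) / width
    return np.convolve(x, kernel, mode="same")

def hypr_denoise(frame, composite):
    """HYPR: the high-SNR composite modulated by the smoothed ratio of the
    low-count frame to the composite: I = C * F(frame) / F(C)."""
    return composite * smooth(frame) / smooth(composite)

rng = np.random.default_rng(5)
truth = 100 + 50 * np.exp(-np.linspace(0, 4, 200))  # smooth activity profile
composite = truth * 10                               # high-count composite
frame = rng.poisson(truth).astype(float)             # one noisy low-count frame

denoised = hypr_denoise(frame, composite)
err_raw = np.mean((frame - truth) ** 2)
err_hypr = np.mean((denoised - truth) ** 2)
```

Because the spatial detail comes from the composite and only the slowly varying ratio is taken from the noisy frame, the denoised frame tracks the truth far more closely than the raw frame; the paper's contribution is to move this idea from post-reconstruction denoising into the kernelized forward model.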

  11. Advanced Oxidation Processes: Process Mechanisms, Affecting Parameters and Landfill Leachate Treatment.

    PubMed

    Su-Huan, Kow; Fahmi, Muhammad Ridwan; Abidin, Che Zulzikrami Azner; Soon-An, Ong

    2016-11-01

    Advanced oxidation processes (AOPs) are of special interest in treating landfill leachate as they are the most promising procedures for degrading recalcitrant compounds and improving the biodegradability of wastewater. This paper aims to refresh the information base of AOPs and to identify the research gaps of AOPs in landfill leachate treatment. A brief overview of the mechanisms involved in AOPs, including ozone-based, hydrogen peroxide-based and persulfate-based AOPs, is presented, and the parameters affecting AOPs are elaborated. In particular, the advancement of AOPs in landfill leachate treatment is compared and discussed. Landfill leachate characterization prior to method selection, and method optimization prior to treatment, are necessary, as the performance and practicability of AOPs are influenced by leachate matrices and treatment cost. More studies concerning the scavenging effects of leachate matrices towards AOPs, as well as persulfate-based AOPs in landfill leachate treatment, are needed in the future.

  12. The influence of prior reputation and reciprocity on dynamic trust-building in adults with and without autism spectrum disorder.

    PubMed

    Maurer, Cornelius; Chambon, Valerian; Bourgeois-Gironde, Sacha; Leboyer, Marion; Zalla, Tiziana

    2018-03-01

    The present study was designed to investigate the effects of reputational priors and direct reciprocity on the dynamics of trust building in adults with (N = 17) and without (N = 25) autism spectrum disorder (ASD) using a multi-round Trust Game (MTG). On each round, participants, who played as investors, were required to maximize their benefits by updating their prior expectations (the partner's positive or negative reputation) on the basis of the partner's direct reciprocity, and adjusting their own investment decisions accordingly. Results showed that reputational priors strongly oriented the initial decision to trust, operationalized as the amount of investment the investor shares with the counterpart. However, while typically developed participants were mainly guided by direct reciprocity, and rapidly adopted the optimal Tit-for-Tat strategy, participants with ASD continued to rely on reputational priors throughout the game, even when experience of the counterpart's actual behavior contradicted their prior-based expectations. In participants with ASD, the effect of the reputational prior never disappeared, and it affected judgments of the partner's trustworthiness and reciprocity even after completion of the game. Moreover, the weight of prior reputation correlated positively with the severity of ASD participants' social impairments, while the reciprocity score correlated negatively with the severity of repetitive and stereotyped behaviors, as measured by the Autism Diagnostic Interview-Revised (ADI-R). In line with Bayesian theoretical accounts, the present findings indicate that individuals with ASD have difficulties encoding incoming social information and using it to revise and flexibly update prior social expectations, and that this deficit might severely hinder social learning and everyday life interactions. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Diffusion-based neuromodulation can eliminate catastrophic forgetting in simple neural networks

    PubMed Central

    Clune, Jeff

    2017-01-01

    A long-term goal of AI is to produce agents that can learn a diversity of skills throughout their lifetimes and continuously improve those skills via experience. A longstanding obstacle towards that goal is catastrophic forgetting, which is when learning new information erases previously learned information. Catastrophic forgetting occurs in artificial neural networks (ANNs), which have fueled most recent advances in AI. A recent paper proposed that catastrophic forgetting in ANNs can be reduced by promoting modularity, which can limit forgetting by isolating task information to specific clusters of nodes and connections (functional modules). While the prior work did show that modular ANNs suffered less from catastrophic forgetting, it was not able to produce ANNs that possessed task-specific functional modules, thereby leaving the main theory regarding modularity and forgetting untested. We introduce diffusion-based neuromodulation, which simulates the release of diffusing, neuromodulatory chemicals within an ANN that can modulate (i.e. up or down regulate) learning in a spatial region. On the simple diagnostic problem from the prior work, diffusion-based neuromodulation 1) induces task-specific learning in groups of nodes and connections (task-specific localized learning), which 2) produces functional modules for each subtask, and 3) yields higher performance by eliminating catastrophic forgetting. Overall, our results suggest that diffusion-based neuromodulation promotes task-specific localized learning and functional modularity, which can help solve the challenging, but important problem of catastrophic forgetting. PMID:29145413

  14. Prototyping an institutional IAIMS/UMLS information environment for an academic medical center.

    PubMed

    Miller, P L; Paton, J A; Clyman, J I; Powsner, S M

    1992-07-01

    The paper describes a prototype information environment designed to link network-based information resources in an integrated fashion and thus enhance the information capabilities of an academic medical center. The prototype was implemented on a single Macintosh computer to permit exploration of the overall "information architecture" and to demonstrate the various desired capabilities prior to full-scale network-based implementation. At the heart of the prototype are two components: a diverse set of information resources available over an institutional computer network and an information sources map designed to assist users in finding and accessing information resources relevant to their needs. The paper describes these and other components of the prototype and presents a scenario illustrating its use. The prototype illustrates the link between the goals of two National Library of Medicine initiatives, the Integrated Academic Information Management System (IAIMS) and the Unified Medical Language System (UMLS).

  15. Model weights and the foundations of multimodel inference

    USGS Publications Warehouse

    Link, W.A.; Barker, R.J.

    2006-01-01

    Statistical thinking in wildlife biology and ecology has been profoundly influenced by the introduction of AIC (Akaike's information criterion) as a tool for model selection and as a basis for model averaging. In this paper, we advocate the Bayesian paradigm as a broader framework for multimodel inference, one in which model averaging and model selection are naturally linked, and in which the performance of AIC-based tools is naturally evaluated. Prior model weights implicitly associated with the use of AIC are seen to highly favor complex models: in some cases, all but the most highly parameterized models in the model set are virtually ignored a priori. We suggest the usefulness of the weighted BIC (Bayesian information criterion) as a computationally simple alternative to AIC, based on explicit selection of prior model probabilities rather than acceptance of the default priors associated with AIC. We note, however, that both procedures are only approximations to the use of exact Bayes factors. We discuss and illustrate technical difficulties associated with Bayes factors, and suggest approaches to avoiding these difficulties in the context of model selection for a logistic regression. Our example highlights the predisposition of AIC weighting to favor complex models and suggests a need for caution in using the BIC for computing approximate posterior model weights.
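The model weights discussed in the record above come from the standard information-criterion weighting w_i = exp(-Δ_i/2) / Σ_j exp(-Δ_j/2), where Δ_i is each model's AIC (or BIC) minus the minimum in the set. A generic sketch with hypothetical AIC values, not taken from the paper:

```python
import math

def ic_weights(ic_values):
    """Convert information-criterion values (AIC or BIC) to normalized
    model weights: w_i = exp(-0.5 * delta_i) / sum_j exp(-0.5 * delta_j),
    where delta_i = IC_i - min(IC)."""
    best = min(ic_values)
    rel = [math.exp(-0.5 * (ic - best)) for ic in ic_values]
    total = sum(rel)
    return [r / total for r in rel]

# Three candidate models with hypothetical AIC values
aic = [100.0, 102.0, 110.0]
weights = ic_weights(aic)
print([round(w, 3) for w in weights])  # → [0.727, 0.268, 0.005]
```

A difference of 10 AIC units already reduces a model's weight to well under one percent, which is why the choice between AIC- and BIC-style penalties (and their implicit priors) matters for model averaging.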

  16. Can we improve top-down GHG inverse methods through informed prior and better representations of atmospheric transport? Insights from the Atmospheric Carbon and Transport (ACT) - America Aircraft Mission

    NASA Astrophysics Data System (ADS)

    Feng, S.; Lauvaux, T.; Keller, K.; Davis, K. J.

    2016-12-01

    Current estimates of biogenic carbon fluxes over North America based on top-down atmospheric inversions are subject to considerable uncertainty. This uncertainty stems in large part from the uncertain prior flux estimates with their associated error covariances, and from approximations in the atmospheric transport models that link observed carbon dioxide mixing ratios with surface fluxes. Specifically, approximations in the representation of vertical mixing associated with atmospheric turbulence or convective transport, together with largely under-determined prior fluxes and their error structures, significantly hamper our capacity to reliably estimate regional carbon fluxes. The Atmospheric Carbon and Transport - America (ACT-America) mission aims at reducing the uncertainties in inverse fluxes at the regional scale by deploying airborne and ground-based platforms to characterize atmospheric GHG mixing ratios and the concurrent atmospheric dynamics. Two aircraft measure the 3-dimensional distribution of greenhouse gases at synoptic scales, focusing on the atmospheric boundary layer and the free troposphere during both fair and stormy weather conditions. Here we analyze two main questions: (i) What level of information can we expect from the currently planned observations? (ii) How might ACT-America reduce the hindcast and predictive uncertainty of carbon estimates over North America?

  17. LORAKS Makes Better SENSE: Phase-Constrained Partial Fourier SENSE Reconstruction without Phase Calibration

    PubMed Central

    Kim, Tae Hyung; Setsompop, Kawin; Haldar, Justin P.

    2016-01-01

    Purpose: Parallel imaging and partial Fourier acquisition are two classical approaches for accelerated MRI. Methods that combine these approaches often rely on prior knowledge of the image phase, but the need to obtain this prior information can place practical restrictions on the data acquisition strategy. In this work, we propose and evaluate SENSE-LORAKS, which enables combined parallel imaging and partial Fourier reconstruction without requiring prior phase information. Theory and Methods: The proposed formulation is based on combining the classical SENSE model for parallel imaging data with the more recent LORAKS framework for MR image reconstruction using low-rank matrix modeling. Previous LORAKS-based methods have successfully enabled calibrationless partial Fourier parallel MRI reconstruction, but have been most successful with nonuniform sampling strategies that may be hard to implement for certain applications. By combining LORAKS with SENSE, we enable highly-accelerated partial Fourier MRI reconstruction for a broader range of sampling trajectories, including widely-used calibrationless uniformly-undersampled trajectories. Results: Our empirical results with retrospectively undersampled datasets indicate that when SENSE-LORAKS reconstruction is combined with an appropriate k-space sampling trajectory, it can provide substantially better image quality at high acceleration rates relative to existing state-of-the-art reconstruction approaches. Conclusion: The SENSE-LORAKS framework provides promising new opportunities for highly-accelerated MRI. PMID:27037836

  18. A Voice-Based E-Examination Framework for Visually Impaired Students in Open and Distance Learning

    ERIC Educational Resources Information Center

    Azeta, Ambrose A.; Inam, Itorobong A.; Daramola, Olawande

    2018-01-01

    Voice-based systems allow users access to information on the internet over a voice interface. Prior studies on Open and Distance Learning (ODL) e-examination systems that make use of a voice interface do not sufficiently exhibit an intelligent form of assessment, which diminishes the rigor of examination. The objective of this paper is to improve on…

  19. Bayesian inference with historical data-based informative priors improves detection of differentially expressed genes.

    PubMed

    Li, Ben; Sun, Zhaonan; He, Qing; Zhu, Yu; Qin, Zhaohui S

    2016-03-01

    Modern high-throughput biotechnologies such as microarray are capable of producing a massive amount of information for each sample. However, in a typical high-throughput experiment, only a limited number of samples are assayed, giving rise to the classical 'large p, small n' problem. On the other hand, rapid propagation of these high-throughput technologies has resulted in a substantial collection of data, often carried out on the same platform and using the same protocol. It is highly desirable to utilize the existing data when performing analysis and inference on a new dataset. Utilizing existing data can be carried out in a straightforward fashion under the Bayesian framework, in which the repository of historical data can be exploited to build informative priors and used in new data analysis. In this work, using microarray data, we investigate the feasibility and effectiveness of deriving informative priors from historical data and using them in the problem of detecting differentially expressed genes. Through simulation and real data analysis, we show that the proposed strategy significantly outperforms existing methods, including the popular and state-of-the-art Bayesian hierarchical model-based approaches. Our work illustrates the feasibility and benefits of exploiting the increasingly available genomics big data in statistical inference and presents a promising practical strategy for dealing with the 'large p, small n' problem. Our method is implemented in the R package IPBT, which is freely available from https://github.com/benliemory/IPBT. CONTACT: yuzhu@purdue.edu; zhaohui.qin@emory.edu. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
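The general idea of building an informative prior from historical data and updating it with a small new sample can be sketched with a conjugate normal-normal model. This is a generic illustration with made-up expression values and a fixed measurement variance; the IPBT model in the record above is more elaborate.

```python
import statistics

def posterior_normal(prior_mean, prior_var, data, data_var):
    """Conjugate normal-normal update: combine an informative prior
    (here, estimated from historical data) with a small new sample.
    Posterior precision is the sum of prior and data precisions."""
    n = len(data)
    xbar = sum(data) / n
    post_prec = 1.0 / prior_var + n / data_var
    post_mean = (prior_mean / prior_var + n * xbar / data_var) / post_prec
    return post_mean, 1.0 / post_prec

# Hypothetical historical replicates of one gene's expression level
historical = [2.1, 1.9, 2.0, 2.2, 1.8]
prior_mean = statistics.mean(historical)      # 2.0
prior_var = statistics.variance(historical)   # 0.025 (sample variance)

# A new, small dataset: the 'large p, small n' setting, few samples per gene
new_data = [2.6, 2.4]
post_mean, post_var = posterior_normal(prior_mean, prior_var, new_data,
                                       data_var=0.25)
print(round(post_mean, 3), post_var < prior_var)  # → 2.083 True
```

The posterior mean (2.083) is shrunk from the new-data mean (2.5) toward the historical prior mean (2.0), and the posterior variance is smaller than either source alone would give — the borrowing of strength that improves detection of differential expression.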

  20. A comment on priors for Bayesian occupancy models

    PubMed Central

    Gerber, Brian D.

    2018-01-01

    Understanding patterns of species occurrence and the processes underlying these patterns is fundamental to the study of ecology. One of the more commonly used approaches to investigate species occurrence patterns is occupancy modeling, which can account for imperfect detection of a species during surveys. In recent years, there has been a proliferation of Bayesian modeling in ecology, which includes fitting Bayesian occupancy models. The Bayesian framework is appealing to ecologists for many reasons, including the ability to incorporate prior information through the specification of prior distributions on parameters. While ecologists almost exclusively intend to choose priors so that they are “uninformative” or “vague”, such priors can easily be unintentionally highly informative. Here we report on how the specification of a “vague” normally distributed (i.e., Gaussian) prior on coefficients in Bayesian occupancy models can unintentionally influence parameter estimation. Using both simulated data and empirical examples, we illustrate how this issue likely compromises inference about species-habitat relationships. While the extent to which these informative priors influence inference depends on the data set, researchers fitting Bayesian occupancy models should conduct sensitivity analyses to ensure intended inference, or employ less commonly used priors that are less informative (e.g., logistic or t prior distributions). We provide suggestions for addressing this issue in occupancy studies, and an online tool for exploring this issue under different contexts. PMID:29481554
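The phenomenon described in the record above — a "vague" Gaussian prior on the logit scale becoming unintentionally informative on the probability scale — is easy to demonstrate by simulation. The sketch below assumes a Normal(0, sd = 10) prior on an occupancy-model intercept, a common "uninformative" choice:

```python
import math
import random

random.seed(1)

# Draw occupancy-model intercepts from a "vague" Normal(0, sd = 10) prior,
# then map each draw through the inverse-logit link to the probability scale.
draws = [random.gauss(0.0, 10.0) for _ in range(100_000)]
probs = [1.0 / (1.0 + math.exp(-b)) for b in draws]

# Fraction of the induced prior mass piled near 0 or 1 vs. near the middle
extreme = sum(1 for p in probs if p < 0.05 or p > 0.95) / len(probs)
middle = sum(1 for p in probs if 0.4 < p < 0.6) / len(probs)
print(round(extreme, 2), round(middle, 2))
```

Roughly three quarters of the induced prior mass sits within 0.05 of the boundaries and only a few percent lies near 0.5, so the "vague" prior actually expresses a strong belief that occupancy is near 0 or 1 — exactly the unintended informativeness the record warns about.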

  1. Quantum information and the problem of mechanisms of biological evolution.

    PubMed

    Melkikh, Alexey V

    2014-01-01

    One of the most important conditions for replication in early evolution is the de facto elimination of the conformational degrees of freedom of the replicators, the mechanisms of which remain unclear. In addition, realistic evolutionary timescales can be established based only on partially directed evolution, further complicating this issue. A division of the various evolutionary theories into two classes has been proposed based on the presence or absence of a priori information about the evolving system. A priori information plays a key role in solving problems in evolution. Here, a model of partially directed evolution, based on the learning automata theory, which includes a priori information about the fitness space, is proposed. A potential repository of such prior information is the states of biologically important molecules. Thus, the need for extended evolutionary synthesis is discussed. Experiments to test the hypothesis of partially directed evolution are proposed. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.

  2. How effective are messages and their characteristics in changing behavioural intentions to substitute plant-based foods for red meat? The mediating role of prior beliefs.

    PubMed

    Vainio, Annukka; Irz, Xavier; Hartikainen, Hanna

    2018-06-01

    By means of a population-based survey experiment, we analysed the effectiveness of two message characteristics - message framing and the refutation of misinformation - in persuading respondents to reduce their consumption of red meat and increase that of plant-based alternatives. We also tested whether the effects of those two message characteristics were moderated by prior beliefs about the health and climate impacts of red meat consumption. The data were collected with an online survey of the adult population living in Finland (N = 1279). We found that messages had a small but desired effect on intentions when the effect of prior beliefs was taken into account, but this effect was strongly moderated by prior beliefs. In particular, messages changed behavioural intentions among the "meat-sceptics" (i.e., those believing relatively strongly in the negative health and climate effects of meat consumption) but not among the "meat believers" (defined symmetrically). Combining frames with the refutation of misinformation was not found to be a more effective strategy than the provision of information through single-framed, one-sided messages. Overall, we found limited evidence that the way a message is formulated determines its effectiveness in changing behaviours. Copyright © 2018 Elsevier Ltd. All rights reserved.

  3. Towards a general theory of neural computation based on prediction by single neurons.

    PubMed

    Fiorillo, Christopher D

    2008-10-01

    Although there has been tremendous progress in understanding the mechanics of the nervous system, there has not been a general theory of its computational function. Here I present a theory that relates the established biophysical properties of single generic neurons to principles of Bayesian probability theory, reinforcement learning and efficient coding. I suggest that this theory addresses the general computational problem facing the nervous system. Each neuron is proposed to mirror the function of the whole system in learning to predict aspects of the world related to future reward. According to the model, a typical neuron receives current information about the state of the world from a subset of its excitatory synaptic inputs, and prior information from its other inputs. Prior information would be contributed by synaptic inputs representing distinct regions of space, and by different types of non-synaptic, voltage-regulated channels representing distinct periods of the past. The neuron's membrane voltage is proposed to signal the difference between current and prior information ("prediction error" or "surprise"). A neuron would apply a Hebbian plasticity rule to select those excitatory inputs that are the most closely correlated with reward but are the least predictable, since unpredictable inputs provide the neuron with the most "new" information about future reward. To minimize the error in its predictions and to respond only when excitation is "new and surprising," the neuron selects amongst its prior information sources through an anti-Hebbian rule. The unique inputs of a mature neuron would therefore result from learning about spatial and temporal patterns in its local environment, and by extension, the external world. 
Thus the theory describes how the structure of the mature nervous system could reflect the structure of the external world, and how the complexity and intelligence of the system might develop from a population of undifferentiated neurons, each implementing similar learning algorithms.

  4. The neural basis of belief updating and rational decision making

    PubMed Central

    Achtziger, Anja; Hügelschäfer, Sabine; Steinhauser, Marco

    2014-01-01

    Rational decision making under uncertainty requires forming beliefs that integrate prior and new information through Bayes’ rule. Human decision makers typically deviate from Bayesian updating by either overweighting the prior (conservatism) or overweighting new information (e.g. the representativeness heuristic). We investigated these deviations through measurements of electrocortical activity in the human brain during incentivized probability-updating tasks and found evidence of extremely early commitment to boundedly rational heuristics. Participants who overweight new information display a lower sensibility to conflict detection, captured by an event-related potential (the N2) observed around 260 ms after the presentation of new information. Conservative decision makers (who overweight prior probabilities) make up their mind before new information is presented, as indicated by the lateralized readiness potential in the brain. That is, they do not inhibit the processing of new information but rather immediately rely on the prior for making a decision. PMID:22956673

  5. The neural basis of belief updating and rational decision making.

    PubMed

    Achtziger, Anja; Alós-Ferrer, Carlos; Hügelschäfer, Sabine; Steinhauser, Marco

    2014-01-01

    Rational decision making under uncertainty requires forming beliefs that integrate prior and new information through Bayes' rule. Human decision makers typically deviate from Bayesian updating by either overweighting the prior (conservatism) or overweighting new information (e.g. the representativeness heuristic). We investigated these deviations through measurements of electrocortical activity in the human brain during incentivized probability-updating tasks and found evidence of extremely early commitment to boundedly rational heuristics. Participants who overweight new information display a lower sensibility to conflict detection, captured by an event-related potential (the N2) observed around 260 ms after the presentation of new information. Conservative decision makers (who overweight prior probabilities) make up their mind before new information is presented, as indicated by the lateralized readiness potential in the brain. That is, they do not inhibit the processing of new information but rather immediately rely on the prior for making a decision.
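The contrast drawn in the two records above, between exact Bayesian updating and conservatism, can be sketched for a binary hypothesis. The damped-likelihood form of conservatism below is a common stylization in the judgment literature, not the specific model estimated in the study:

```python
def bayes_update(prior, likelihood_ratio):
    """Exact Bayes' rule for a binary hypothesis:
    posterior odds = prior odds * likelihood ratio."""
    prior_odds = prior / (1.0 - prior)
    post_odds = prior_odds * likelihood_ratio
    return post_odds / (1.0 + post_odds)

def conservative_update(prior, likelihood_ratio, damping=0.5):
    """Stylized conservatism: the likelihood ratio is damped (raised to a
    power < 1) before updating, so the prior is effectively overweighted.
    The damping exponent is an illustrative assumption, not an estimate."""
    return bayes_update(prior, likelihood_ratio ** damping)

prior = 0.5
lr = 4.0  # new evidence favours hypothesis A over B at 4:1

print(round(bayes_update(prior, lr), 2))         # → 0.8  (rational posterior)
print(round(conservative_update(prior, lr), 2))  # → 0.67 (under-updated)
```

A conservative decision maker ends up closer to the prior (0.67 vs. the Bayesian 0.8); the representativeness heuristic corresponds to the opposite case, a damping exponent above one.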

  6. Quantifying the sensitivity of aerosol optical depths retrieved from MSG SEVIRI to a priori data

    NASA Astrophysics Data System (ADS)

    Bulgin, C. E.; Palmer, P. I.; Merchant, C. J.; Siddans, R.; Poulsen, C.; Grainger, R. G.; Thomas, G.; Carboni, E.; McConnell, C.; Highwood, E.

    2009-12-01

    Radiative forcing contributions from aerosol direct and indirect effects remain one of the most uncertain components of the climate system. Satellite observations of aerosol optical properties offer important constraints on atmospheric aerosols but their sensitivity to prior assumptions must be better characterized before they are used effectively to reduce uncertainty in aerosol radiative forcing. We assess the sensitivity of the Oxford-RAL Aerosol and Cloud (ORAC) optimal estimation retrieval of aerosol optical depth (AOD) from the Spinning Enhanced Visible and InfraRed Imager (SEVIRI) to a priori aerosol data. SEVIRI is a geostationary satellite instrument centred over Africa and the neighbouring Atlantic Ocean, routinely sampling desert dust and biomass burning outflow from Africa. We quantify the uncertainty in SEVIRI AOD retrievals in the presence of desert dust by comparing retrievals that use prior information from the Optical Properties of Aerosol and Cloud (OPAC) database, with those that use measured aerosol properties during the Dust Outflow and Deposition to the Ocean (DODO) aircraft campaign (August, 2006). We also assess the sensitivity of retrieved AODs to changes in solar zenith angle, and the vertical profile of aerosol effective radius and extinction coefficient input into the retrieval forward model. Currently the ORAC retrieval scheme retrieves AODs for five aerosol types (desert dust, biomass burning, maritime, urban and continental) and chooses the most appropriate AOD based on the cost functions. We generate an improved prior aerosol speciation database for SEVIRI based on a statistical analysis of a Saharan Dust Index (SDI) determined using variances of different brightness temperatures, and organic and black carbon tracers from the GEOS-Chem chemistry transport model. This database is described as a function of season and time of day. 
We quantify the difference in AODs between those chosen based on prior information from the SDI and GEOS-Chem and those chosen based on the smallest cost function.

  7. A predictive model of avian natal dispersal distance provides prior information for investigating response to landscape change.

    PubMed

    Garrard, Georgia E; McCarthy, Michael A; Vesk, Peter A; Radford, James Q; Bennett, Andrew F

    2012-01-01

    1. Informative Bayesian priors can improve the precision of estimates in ecological studies or estimate parameters for which little or no information is available. While Bayesian analyses are becoming more popular in ecology, the use of strongly informative priors remains rare, perhaps because examples of informative priors are not readily available in the published literature. 2. Dispersal distance is an important ecological parameter, but is difficult to measure and estimates are scarce. General models that provide informative prior estimates of dispersal distances will therefore be valuable. 3. Using a world-wide data set on birds, we develop a predictive model of median natal dispersal distance that includes body mass, wingspan, sex and feeding guild. This model predicts median dispersal distance well when using the fitted data and an independent test data set, explaining up to 53% of the variation. 4. Using this model, we predict a priori estimates of median dispersal distance for 57 woodland-dependent bird species in northern Victoria, Australia. These estimates are then used to investigate the relationship between dispersal ability and vulnerability to landscape-scale changes in habitat cover and fragmentation. 5. We find evidence that woodland bird species with poor predicted dispersal ability are more vulnerable to habitat fragmentation than those species with longer predicted dispersal distances, thus improving the understanding of this important phenomenon. 6. The value of constructing informative priors from existing information is also demonstrated. When used as informative priors for four example species, predicted dispersal distances reduced the 95% credible intervals of posterior estimates of dispersal distance by 8-19%. 
Further, should we have wished to collect information on avian dispersal distances and relate it to species' responses to habitat loss and fragmentation, data from 221 individuals across 57 species would have been required to obtain estimates with the same precision as those provided by the general model. © 2011 The Authors. Journal of Animal Ecology © 2011 British Ecological Society.
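The narrowing of credible intervals by an informative prior, reported in the record above, can be illustrated with a conjugate normal model. All numbers below are hypothetical and chosen only to produce an interval reduction of the same order as the 8-19% reported:

```python
import math

def posterior_interval(prior_mean, prior_sd, xbar, se, z=1.96):
    """95% credible interval for a normal mean under a conjugate normal
    prior, with the sampling standard error of the estimate assumed known."""
    prior_prec = 1.0 / prior_sd ** 2
    data_prec = 1.0 / se ** 2
    post_var = 1.0 / (prior_prec + data_prec)
    post_mean = post_var * (prior_prec * prior_mean + data_prec * xbar)
    half = z * math.sqrt(post_var)
    return post_mean - half, post_mean + half

# Hypothetical field estimate of log median dispersal distance
xbar, se = 1.2, 0.30

vague = posterior_interval(0.0, 100.0, xbar, se)       # near-flat prior
informative = posterior_interval(1.0, 0.50, xbar, se)  # model-based prior

width = lambda iv: iv[1] - iv[0]
reduction = 1.0 - width(informative) / width(vague)
print(round(100 * reduction, 1))  # percent narrowing of the 95% interval
```

With these assumed values the informative prior narrows the 95% interval by about 14%, comfortably within the 8-19% range the authors observed for their four example species.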

  8. Integrating prior information into microwave tomography Part 1: Impact of detail on image quality.

    PubMed

    Kurrant, Douglas; Baran, Anastasia; LoVetri, Joe; Fear, Elise

    2017-12-01

    The authors investigate the impact that incremental increases in the level of detail of patient-specific prior information have on image quality and the convergence behavior of an inversion algorithm in the context of near-field microwave breast imaging. A methodology is presented that uses image quality measures to characterize the ability of the algorithm to reconstruct both internal structures and lesions embedded in fibroglandular tissue. The approach permits key aspects that impact the quality of reconstruction of these structures to be identified and quantified. This provides insight into opportunities to improve image reconstruction performance. Patient-specific information is acquired using radar-based methods that form a regional map of the breast. This map is then incorporated into a microwave tomography algorithm. Previous investigations have demonstrated the effectiveness of this approach to improve image quality when applied to data generated with two-dimensional (2D) numerical models. The present study extends this work by generating prior information that is customized to vary the degree of structural detail to facilitate the investigation of the role of prior information in image formation. Numerical 2D breast models constructed from magnetic resonance (MR) scans, and reconstructions formed with a three-dimensional (3D) numerical breast model are used to assess if trends observed for the 2D results can be extended to 3D scenarios. For the blind reconstruction scenario (i.e., no prior information), the breast surface is not accurately identified and internal structures are not clearly resolved. A substantial improvement in image quality is achieved by incorporating the skin surface map and constraining the imaging domain to the breast. Internal features within the breast appear in the reconstructed image. 
However, it is challenging to discriminate between adipose and glandular regions and there are inaccuracies in both the structural properties of the glandular region and the dielectric properties reconstructed within this structure. Using a regional map with a skin layer only marginally improves this situation. Increasing the structural detail in the prior information to include internal features leads to reconstructions for which the interface that delineates the fat and gland regions can be inferred. Different features within the glandular region corresponding to tissues with varying relative permittivity values, such as a lesion embedded within glandular structure, emerge in the reconstructed images. Including knowledge of the breast surface and skin layer leads to a substantial improvement in image quality compared to the blind case, but the images have limited diagnostic utility for applications such as tumor response tracking. The diagnostic utility of the reconstruction technique is improved considerably when patient-specific structural information is used. This qualitative observation is supported quantitatively with image metrics. © 2017 American Association of Physicists in Medicine.

  9. Prior robust empirical Bayes inference for large-scale data by conditioning on rank with application to microarray data

    PubMed Central

    Liao, J. G.; Mcmurry, Timothy; Berg, Arthur

    2014-01-01

    Empirical Bayes methods have been extensively used for microarray data analysis by modeling the large number of unknown parameters as random effects. Empirical Bayes allows borrowing information across genes and can automatically adjust for multiple testing and selection bias. However, the standard empirical Bayes model can perform poorly if the assumed working prior deviates from the true prior. This paper proposes a new rank-conditioned inference in which the shrinkage and confidence intervals are based on the distribution of the error conditioned on rank of the data. Our approach is in contrast to a Bayesian posterior, which conditions on the data themselves. The new method is almost as efficient as standard Bayesian methods when the working prior is close to the true prior, and it is much more robust when the working prior is not close. In addition, it allows a more accurate (but also more complex) non-parametric estimate of the prior to be easily incorporated, resulting in improved inference. The new method’s prior robustness is demonstrated via simulation experiments. Application to a breast cancer gene expression microarray dataset is presented. Our R package rank.Shrinkage provides a ready-to-use implementation of the proposed methodology. PMID:23934072

  10. Calibration of Magnetometers with GNSS Receivers and Magnetometer-Aided GNSS Ambiguity Fixing

    PubMed Central

    Henkel, Patrick

    2017-01-01

    Magnetometers provide compass information, and are widely used for navigation, orientation and alignment of objects. As magnetometers are affected by sensor biases and possibly by systematic distortions of the Earth's magnetic field, a calibration is needed. In this paper, a method for calibration of magnetometers with three Global Navigation Satellite System (GNSS) receivers is presented. We perform a least-squares estimation of the magnetic flux and sensor biases using GNSS-based attitude information. The attitude is obtained from the relative positions between the GNSS receivers in the North-East-Down coordinate frame and prior knowledge of these relative positions in the platform’s coordinate frame. The relative positions and integer ambiguities of the periodic carrier phase measurements are determined with an integer least-squares estimation using an integer decorrelation and sequential tree search. Prior knowledge on the relative positions is used to increase the success rate of ambiguity fixing. We have validated the proposed method with low-cost magnetometers and GNSS receivers on a vehicle in a test drive. The calibration enabled consistent heading determination with an accuracy of five degrees. This precise magnetometer-based attitude information allows instantaneous GNSS integer ambiguity fixing. PMID:28594369

  11. Calibration of Magnetometers with GNSS Receivers and Magnetometer-Aided GNSS Ambiguity Fixing.

    PubMed

    Henkel, Patrick

    2017-06-08

    Magnetometers provide compass information and are widely used for navigation, orientation and alignment of objects. As magnetometers are affected by sensor biases and possibly by systematic distortions of the Earth's magnetic field, a calibration is needed. In this paper, a method for calibration of magnetometers with three Global Navigation Satellite System (GNSS) receivers is presented. We perform a least-squares estimation of the magnetic flux and sensor biases using GNSS-based attitude information. The attitude is obtained from the relative positions between the GNSS receivers in the North-East-Down coordinate frame and prior knowledge of these relative positions in the platform's coordinate frame. The relative positions and integer ambiguities of the periodic carrier phase measurements are determined with an integer least-squares estimation using an integer decorrelation and sequential tree search. Prior knowledge of the relative positions is used to increase the success rate of ambiguity fixing. We have validated the proposed method with low-cost magnetometers and GNSS receivers on a vehicle in a test drive. The calibration enabled a consistent heading determination with an accuracy of five degrees. This precise magnetometer-based attitude information allows an instantaneous GNSS integer ambiguity fixing.
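
Setting the integer ambiguity fixing aside, the flux-and-bias least-squares step described in the two records above can be sketched as follows. The attitude sequence, field vector and bias values are invented for illustration; the paper estimates attitude from three GNSS receivers rather than assuming it known:

```python
import numpy as np

def euler_to_dcm(yaw, pitch, roll):
    """Direction cosine matrix rotating NED vectors into the body frame."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, sy, 0], [-sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, -sp], [0, 1, 0], [sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, sr], [0, -sr, cr]])
    return Rx @ Ry @ Rz

def calibrate(attitudes, meas):
    """Least-squares fit of meas_k = R_k @ B_ned + bias for the six
    unknowns: the magnetic flux vector B_ned and the per-axis sensor bias."""
    A = np.vstack([np.hstack([euler_to_dcm(*a), np.eye(3)]) for a in attitudes])
    y = np.concatenate(meas)
    x, *_ = np.linalg.lstsq(A, y, rcond=None)
    return x[:3], x[3:]

B_true = np.array([20.0, 1.0, 45.0])      # local field, microtesla (NED)
bias_true = np.array([3.0, -2.0, 0.5])
# Attitudes must vary in more than yaw, otherwise the down component of the
# field and the z-axis bias are not separable.
attitudes = [(0, 0, 0), (np.pi / 2, 0, 0), (0, np.pi / 2, 0),
             (np.pi, 0, np.pi / 4), (np.pi / 3, np.pi / 6, 0)]
meas = [euler_to_dcm(*a) @ B_true + bias_true for a in attitudes]
B_est, bias_est = calibrate(attitudes, meas)
print(np.allclose(B_est, B_true), np.allclose(bias_est, bias_true))  # True True
```

With noisy measurements the same stacked system is simply solved in the least-squares sense instead of exactly.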

  12. Conceptual Influences on Category-Based Induction

    ERIC Educational Resources Information Center

    Gelman, Susan A.; Davidson, Natalie S.

    2013-01-01

    One important function of categories is to permit rich inductive inferences. Prior work shows that children use category labels to guide their inductive inferences. However, there are competing theories to explain this phenomenon, differing in the roles attributed to conceptual information vs. perceptual similarity. Seven experiments with 4- to…

  13. Bricklayer's Helper. Coordinator's Guide. Individualized Study Guide.

    ERIC Educational Resources Information Center

    Barnes, Bill

    This individualized, competency-based study guide is designed to assist teacher-coordinators supervising cooperative education programs for bricklayer's helpers in providing students with general information for immediate reinforcement on the job and developing an understanding of the job prior to employment. A progress chart is provided to allow…

  14. Knowledge Structures of Entering Computer Networking Students and Their Instructors

    ERIC Educational Resources Information Center

    DiCerbo, Kristen E.

    2007-01-01

    Students bring prior knowledge to their learning experiences. This prior knowledge is known to affect how students encode and later retrieve new information learned. Teachers and content developers can use information about students' prior knowledge to create more effective lessons and materials. In many content areas, particularly the sciences,…

  15. Nudging toward Inquiry: Awakening and Building upon Prior Knowledge

    ERIC Educational Resources Information Center

    Fontichiaro, Kristin, Comp.

    2010-01-01

    "Prior knowledge" (sometimes called schema or background knowledge) is information one already knows that helps him/her make sense of new information. New learning builds on existing prior knowledge. In traditional reporting-style research projects, students bypass this crucial step and plow right into answer-finding. It's no wonder that many…

  16. A model to systematically employ professional judgment in the Bayesian Decision Analysis for a semiconductor industry exposure assessment.

    PubMed

    Torres, Craig; Jones, Rachael; Boelter, Fred; Poole, James; Dell, Linda; Harper, Paul

    2014-01-01

    Bayesian Decision Analysis (BDA) uses Bayesian statistics to integrate multiple types of exposure information and classify exposures within the exposure rating categorization scheme promoted in American Industrial Hygiene Association (AIHA) publications. Prior distributions for BDA may be developed from existing monitoring data, mathematical models, or professional judgment. Professional judgments may misclassify exposures. We suggest that a structured qualitative risk assessment (QLRA) method can provide consistency and transparency in professional judgments. In this analysis, we use a structured QLRA method to define prior distributions (priors) for BDA. We applied this approach at three semiconductor facilities in South Korea, and present an evaluation of the performance of structured QLRA for determination of priors, together with an evaluation of occupational exposures using BDA. Specifically, the structured QLRA was applied to chemical agents in similar exposure groups (SEGs) to identify provisional risk ratings. Standard priors were developed for each risk rating before review of historical monitoring data. Newly collected monitoring data were used to update priors informed by QLRA or historical monitoring data and to determine the posterior distribution. Exposure ratings were defined by the rating category with the highest probability, i.e., the most likely rating. We found the most likely exposure rating in the QLRA-informed priors to be consistent with historical and newly collected monitoring data, and the posterior exposure ratings developed with QLRA-informed priors to be equal to or greater than those developed with data-informed priors in 94% of comparisons. Overall, exposures at these facilities are consistent with well-controlled work environments: the 95th percentiles of the exposure distributions are ≤50% of the occupational exposure limit (OEL) for all chemical-SEG combinations evaluated, and ≤10% of the limit for 94% of chemical-SEG combinations evaluated.
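
A toy version of the BDA updating step might look like the following: a QLRA-informed prior over discrete exposure-rating categories is updated with new monitoring samples scored under a representative lognormal distribution per category, and the posterior's mode is read off as the "most likely" rating. The category geometric means, GSD and sample values below are hypothetical illustrations, not the AIHA scheme itself:

```python
import numpy as np

def lognorm_logpdf(x, gm, gsd):
    """Log-density of a lognormal with geometric mean gm and geometric sd gsd."""
    mu, sigma = np.log(gm), np.log(gsd)
    z = (np.log(x) - mu) / sigma
    return -0.5 * z ** 2 - np.log(x * sigma * np.sqrt(2 * np.pi))

# Hypothetical rating categories 0-4, each summarised by a representative
# geometric mean expressed as a fraction of the OEL, with a common GSD.
gms = {0: 0.005, 1: 0.05, 2: 0.25, 3: 0.75, 4: 1.5}
gsd = 2.5

def update(prior, samples):
    """Posterior over rating categories given new monitoring samples."""
    logpost = {c: np.log(p) + lognorm_logpdf(np.asarray(samples), gms[c], gsd).sum()
               for c, p in prior.items()}
    m = max(logpost.values())
    w = {c: np.exp(v - m) for c, v in logpost.items()}   # avoid underflow
    z = sum(w.values())
    return {c: v / z for c, v in w.items()}

# QLRA-informed prior favouring a low rating, updated with two low samples
prior = {0: 0.1, 1: 0.5, 2: 0.3, 3: 0.07, 4: 0.03}
post = update(prior, [0.03, 0.06])     # measured fractions of the OEL
best = max(post, key=post.get)         # the "most likely" exposure rating
print(best)
```

Here both the prior and the data point to rating 1, so the posterior mode agrees with the judgment-based rating, mirroring the consistency result reported above.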

  17. Hilbert-Schmidt and Sobol sensitivity indices for static and time series Wnt signaling measurements in colorectal cancer - part A.

    PubMed

    Sinha, Shriprakash

    2017-12-04

    Ever since the accidental discovery of Wingless [Sharma R.P., Drosophila Information Service, 1973, 50, p. 134], research in the field of the Wnt signaling pathway has taken significant strides in wet lab experiments and various cancer clinical trials, augmented by recent developments in advanced computational modeling of the pathway. Information-rich gene expression profiles reveal various aspects of the signaling pathway and help in studying different issues simultaneously. Hitherto, not many computational studies exist which incorporate the simultaneous study of these issues. This manuscript (1) explores the strength of contributing factors in the signaling pathway, (2) analyzes the existing causal relations among the inter/extracellular factors affecting the pathway based on prior biological knowledge, and (3) investigates the deviations in fold changes in the recently found prevalence of psychophysical laws working in the pathway. To achieve this goal, local and global sensitivity analysis is conducted on the (non)linear responses between the factors obtained from static and time series expression profiles using the density-based (Hilbert-Schmidt Independence Criterion) and variance-based (Sobol) sensitivity indices. The results show the advantage of using density-based indices over variance-based indices, mainly due to the former's use of distance measures and the kernel trick via a reproducing kernel Hilbert space (RKHS), which capture nonlinear relations among various intra/extracellular factors of the pathway in a higher-dimensional space. In time series data, these indices make it possible to observe where in time, and to what extent, each factor is influenced and contributes to the pathway as the concentrations of the other factors change. This synergy of prior biological knowledge, sensitivity analysis, and representations in higher-dimensional spaces can facilitate time-based administration of targeted therapeutic drugs and reveal hidden biological information within colorectal cancer samples.
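
The density-based index in question, the Hilbert-Schmidt Independence Criterion (HSIC), has a compact biased empirical estimator: trace(KHLH)/(n-1)^2 with RBF Gram matrices K, L and centring matrix H. A minimal sketch on synthetic data (the bandwidth and sample size are arbitrary choices, not those used in the paper):

```python
import numpy as np

def rbf(x, sigma):
    """RBF Gram matrix for a 1-D sample."""
    d = x[:, None] - x[None, :]
    return np.exp(-d ** 2 / (2 * sigma ** 2))

def hsic(x, y, sigma=1.0):
    """Biased empirical Hilbert-Schmidt Independence Criterion:
    HSIC = trace(K H L H) / (n - 1)^2, with centring matrix H = I - 11^T/n."""
    n = len(x)
    H = np.eye(n) - np.ones((n, n)) / n
    K, L = rbf(x, sigma), rbf(y, sigma)
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

rng = np.random.default_rng(0)
x = rng.normal(size=200)
dep = hsic(x, np.sin(2 * x) + 0.1 * rng.normal(size=200))  # nonlinear dependence
indep = hsic(x, rng.normal(size=200))                      # independent noise
print(dep > indep)
```

The sin(2x) relationship has near-zero linear correlation, which is exactly the kind of dependence the kernel-based index detects and a variance-based index built on linear responses can miss.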

  18. A Web-based patient information system--identification of patients' information needs.

    PubMed

    Hassling, Linda; Babic, Ankica; Lönn, Urban; Casimir-Ahn, Henrik

    2003-06-01

    Research described here was carried out to explore possibilities of creating a web-based patient information system within the area of thoracic surgery. Data were collected to distinguish and assess the actual information needs of patients (1) prior to surgical treatment, (2) before discharge, and (3) 8 months after the hospitalization using a follow-up questionnaire. Interviews were performed with patients undergoing heart surgery. The study included material from 19 consecutive patients undergoing coronary artery bypass surgery (12) and valve replacement (7), aged 35-74, 13 males and 6 females with nonacademic backgrounds. Patient satisfaction with the given information was high. Analysis of the interviews held at the hospital resulted in seven different categories describing and giving a picture of the patients' information needs and their apprehension of received care. The results found in this study can be used as guidance for developers in their design and development process of a health information system.

  19. Improving Bayesian credibility intervals for classifier error rates using maximum entropy empirical priors.

    PubMed

    Gustafsson, Mats G; Wallman, Mikael; Wickenberg Bolin, Ulrika; Göransson, Hanna; Fryknäs, M; Andersson, Claes R; Isaksson, Anders

    2010-06-01

    Successful use of classifiers that learn to make decisions from a set of patient examples requires robust methods for performance estimation. Recently many promising approaches for determination of an upper bound for the error rate of a single classifier have been reported, but the Bayesian credibility interval (CI) obtained from a conventional holdout test still delivers one of the tightest bounds. The conventional Bayesian CI becomes unacceptably large in real-world applications where the test set sizes are less than a few hundred. The source of this problem is the fact that the CI is determined exclusively by the result on the test examples; the uniform prior density distribution employed provides no information at all, reflecting complete lack of prior knowledge about the unknown error rate. Therefore, the aim of the study reported here was to study a maximum entropy (ME) based approach to improved prior knowledge and Bayesian CIs, demonstrating its relevance for biomedical research and clinical practice. It is demonstrated how a refined non-uniform prior density distribution can be obtained by means of the ME principle using empirical results from a few designs and tests using non-overlapping sets of examples. Experimental results show that ME-based priors improve the CIs when applied to four quite different simulated data sets and two real-world data sets. An empirically derived ME prior seems promising for improving the Bayesian CI for the unknown error rate of a designed classifier. Copyright 2010 Elsevier B.V. All rights reserved.
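
The contrast drawn here, a uniform Beta(1, 1) prior versus an informative prior for the unknown error rate, is easy to reproduce with conjugate Beta posteriors. The informative prior's pseudo-counts below are invented stand-ins for the paper's ME-derived priors:

```python
import numpy as np

def beta_interval(a, b, mass=0.95, grid=200001):
    """Equal-tailed credible interval of a Beta(a, b), computed on a grid."""
    p = np.linspace(1e-9, 1 - 1e-9, grid)
    logpdf = (a - 1) * np.log(p) + (b - 1) * np.log(1 - p)
    pdf = np.exp(logpdf - logpdf.max())
    cdf = np.cumsum(pdf)
    cdf /= cdf[-1]
    lo = p[np.searchsorted(cdf, (1 - mass) / 2)]
    hi = p[np.searchsorted(cdf, 1 - (1 - mass) / 2)]
    return lo, hi

k, n = 7, 40                                  # 7 errors on a 40-example test set
flat = beta_interval(1 + k, 1 + (n - k))      # uniform Beta(1, 1) prior
# Hypothetical informative prior, e.g. distilled from earlier design/test
# splits: roughly "20 pseudo-examples at a 15% error rate".
informative = beta_interval(3 + k, 17 + (n - k))
print(flat[1] - flat[0] > informative[1] - informative[0])  # prior tightens CI
```

The pseudo-counts act like extra test examples, which is why the informative-prior interval is tighter for the same small holdout set.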

  20. Predictive top-down integration of prior knowledge during speech perception.

    PubMed

    Sohoglu, Ediz; Peelle, Jonathan E; Carlyon, Robert P; Davis, Matthew H

    2012-06-20

    A striking feature of human perception is that our subjective experience depends not only on sensory information from the environment but also on our prior knowledge or expectations. The precise mechanisms by which sensory information and prior knowledge are integrated remain unclear, with longstanding disagreement concerning whether integration is strictly feedforward or whether higher-level knowledge influences sensory processing through feedback connections. Here we used concurrent EEG and MEG recordings to determine how sensory information and prior knowledge are integrated in the brain during speech perception. We manipulated listeners' prior knowledge of speech content by presenting matching, mismatching, or neutral written text before a degraded (noise-vocoded) spoken word. When speech conformed to prior knowledge, subjective perceptual clarity was enhanced. This enhancement in clarity was associated with a spatiotemporal profile of brain activity uniquely consistent with a feedback process: activity in the inferior frontal gyrus was modulated by prior knowledge before activity in lower-level sensory regions of the superior temporal gyrus. In parallel, we parametrically varied the level of speech degradation, and therefore the amount of sensory detail, so that changes in neural responses attributable to sensory information and prior knowledge could be directly compared. Although sensory detail and prior knowledge both enhanced speech clarity, they had an opposite influence on the evoked response in the superior temporal gyrus. We argue that these data are best explained within the framework of predictive coding, in which sensory activity is compared with top-down predictions and only unexplained activity is propagated through the cortical hierarchy.

  1. Experimentally Derived δ¹³C and δ¹⁵N Discrimination Factors for Gray Wolves and the Impact of Prior Information in Bayesian Mixing Models

    PubMed Central

    Derbridge, Jonathan J.; Merkle, Jerod A.; Bucci, Melanie E.; Callahan, Peggy; Koprowski, John L.; Polfus, Jean L.; Krausman, Paul R.

    2015-01-01

    Stable isotope analysis of diet has become a common tool in conservation research. However, the multiple sources of uncertainty inherent in this analysis framework involve consequences that have not been thoroughly addressed. Uncertainty arises from the choice of trophic discrimination factors, and for Bayesian stable isotope mixing models (SIMMs), the specification of prior information; the combined effect of these aspects has not been explicitly tested. We used a captive feeding study of gray wolves (Canis lupus) to determine the first experimentally-derived trophic discrimination factors of C and N for this large carnivore of broad conservation interest. Using the estimated diet in our controlled system and data from a published study on wild wolves and their prey in Montana, USA, we then investigated the simultaneous effect of discrimination factors and prior information on diet reconstruction with Bayesian SIMMs. Discrimination factors for gray wolves and their prey were 1.97‰ for δ13C and 3.04‰ for δ15N. Specifying wolf discrimination factors, as opposed to the commonly used red fox (Vulpes vulpes) factors, made little practical difference to estimates of wolf diet, but prior information had a strong effect on bias, precision, and accuracy of posterior estimates. Without specifying prior information in our Bayesian SIMM, it was not possible to produce SIMM posteriors statistically similar to the estimated diet in our controlled study or the diet of wild wolves. Our study demonstrates the critical effect of prior information on estimates of animal diets using Bayesian SIMMs, and suggests species-specific trophic discrimination factors are of secondary importance. When using stable isotope analysis to inform conservation decisions researchers should understand the limits of their data. It may be difficult to obtain useful information from SIMMs if informative priors are omitted and species-specific discrimination factors are unavailable. PMID:25803664

  2. Experimentally derived δ¹³C and δ¹⁵N discrimination factors for gray wolves and the impact of prior information in Bayesian mixing models.

    PubMed

    Derbridge, Jonathan J; Merkle, Jerod A; Bucci, Melanie E; Callahan, Peggy; Koprowski, John L; Polfus, Jean L; Krausman, Paul R

    2015-01-01

    Stable isotope analysis of diet has become a common tool in conservation research. However, the multiple sources of uncertainty inherent in this analysis framework involve consequences that have not been thoroughly addressed. Uncertainty arises from the choice of trophic discrimination factors, and for Bayesian stable isotope mixing models (SIMMs), the specification of prior information; the combined effect of these aspects has not been explicitly tested. We used a captive feeding study of gray wolves (Canis lupus) to determine the first experimentally-derived trophic discrimination factors of C and N for this large carnivore of broad conservation interest. Using the estimated diet in our controlled system and data from a published study on wild wolves and their prey in Montana, USA, we then investigated the simultaneous effect of discrimination factors and prior information on diet reconstruction with Bayesian SIMMs. Discrimination factors for gray wolves and their prey were 1.97‰ for δ13C and 3.04‰ for δ15N. Specifying wolf discrimination factors, as opposed to the commonly used red fox (Vulpes vulpes) factors, made little practical difference to estimates of wolf diet, but prior information had a strong effect on bias, precision, and accuracy of posterior estimates. Without specifying prior information in our Bayesian SIMM, it was not possible to produce SIMM posteriors statistically similar to the estimated diet in our controlled study or the diet of wild wolves. Our study demonstrates the critical effect of prior information on estimates of animal diets using Bayesian SIMMs, and suggests species-specific trophic discrimination factors are of secondary importance. When using stable isotope analysis to inform conservation decisions researchers should understand the limits of their data. It may be difficult to obtain useful information from SIMMs if informative priors are omitted and species-specific discrimination factors are unavailable.
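
A minimal grid-based sketch of a two-source Bayesian SIMM, using the wolf δ15N discrimination factor of 3.04‰ reported above, shows how the prior on the diet proportion interacts with the isotope data. All source, consumer and prior values below are hypothetical:

```python
import numpy as np

def simm_posterior(consumers, s1, s2, disc, sd, prior):
    """Grid posterior for the diet proportion p of source 1 in a two-source
    stable isotope mixing model: mu(p) = p*(s1 + disc) + (1 - p)*(s2 + disc)."""
    p = np.linspace(0.0, 1.0, 1001)
    mu = p * (s1 + disc) + (1 - p) * (s2 + disc)
    loglik = sum(-0.5 * ((c - mu) / sd) ** 2 for c in consumers)
    post = prior(p) * np.exp(loglik - loglik.max())
    return p, post / (post.sum() * (p[1] - p[0]))   # normalise on the grid

consumers = [9.8, 10.1, 9.5]                    # wolf delta-15N, hypothetical
flat = lambda p: np.ones_like(p)                # uninformative prior on p
informed = lambda p: p ** 8 * (1 - p) ** 2      # Beta(9, 3): "mostly source 1"
means = []
for prior in (flat, informed):
    p, post = simm_posterior(consumers, s1=6.0, s2=12.0, disc=3.04,
                             sd=0.5, prior=prior)
    means.append((p * post).sum() * (p[1] - p[0]))  # posterior mean of p
print(means)
```

Even this toy posterior shifts visibly with the prior, the qualitative point the wolf study makes about the strong effect of prior specification in SIMMs.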

  3. Uncertainty analysis of gross primary production partitioned from net ecosystem exchange measurements

    NASA Astrophysics Data System (ADS)

    Raj, R.; Hamm, N. A. S.; van der Tol, C.; Stein, A.

    2015-08-01

    Gross primary production (GPP), separated from flux tower measurements of net ecosystem exchange (NEE) of CO2, is used increasingly to validate process-based simulators and remote sensing-derived estimates of simulated GPP at various time steps. Proper validation should include the uncertainty associated with this separation at different time steps, which can be achieved within a Bayesian framework. In this study, we estimated the uncertainty in GPP at half-hourly time steps. We used a non-rectangular hyperbola (NRH) model to separate GPP from flux tower measurements of NEE at the Speulderbos forest site, The Netherlands. The NRH model included the variables that influence GPP, in particular radiation and temperature, and provided a robust empirical relationship between radiation and GPP by including the degree of curvature of the light response curve. Parameters of the NRH model were fitted to the measured NEE data for every 10-day period during the growing season (April to October) in 2009. Adopting a Bayesian approach, we defined the prior distribution of each NRH parameter and updated it using Markov chain Monte Carlo (MCMC) simulation. This yielded the posterior distribution of GPP at each half hour, quantifying the uncertainty in the separated GPP; the time series of posterior distributions thus obtained also allowed us to estimate the uncertainty at daily time steps. We compared informative with non-informative prior distributions of the NRH parameters, and the results showed that both choices of prior produced similar posterior distributions of GPP. This will provide relevant and important information for the validation of process-based simulators in the future. Furthermore, the obtained posterior distributions of NEE and the NRH parameters are of interest for a range of applications.
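
The NRH light-response curve at the heart of this partitioning is the smaller root of a quadratic in GPP, with the curvature parameter theta controlling the transition between the initial slope and the plateau. A sketch with made-up parameter values (alpha, pmax, theta below are illustrative, not the fitted Speulderbos values):

```python
import numpy as np

def nrh(par, alpha, pmax, theta):
    """Non-rectangular hyperbola light response: the smaller root of
    theta*G^2 - (alpha*I + pmax)*G + alpha*I*pmax = 0, evaluated at I = par."""
    s = alpha * par + pmax
    return (s - np.sqrt(s ** 2 - 4 * theta * alpha * par * pmax)) / (2 * theta)

par = np.linspace(0, 2000, 5)            # PAR, umol m-2 s-1
g = nrh(par, alpha=0.05, pmax=25.0, theta=0.9)
print(g[0], g[-1] < 25.0)                # zero in the dark, saturates below pmax
```

In the Bayesian setup described above, alpha, pmax and theta each receive a prior and are sampled by MCMC, and the posterior draws of this curve propagate the parameter uncertainty into half-hourly GPP.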

  4. PET image reconstruction using multi-parametric anato-functional priors

    NASA Astrophysics Data System (ADS)

    Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.

    2017-08-01

    In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, in comparison to standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors, including the Kaipio prior and the non-local Tikhonov prior with Bowsher and Gaussian similarity kernels, are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For the simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET-unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET-unique features and also an improved bias-variance performance. In agreement with the simulation results, the clinical results also showed that the Gaussian prior with voxel-based feature vectors, the Bowsher prior, and the joint Burg entropy prior were the best performing priors, and for the FDG dataset with simulated tumours, the TV and proposed priors were capable of preserving the PET-unique tumours. Finally, an important outcome was the demonstration that the MAP reconstruction of a low-count FDG PET dataset using the proposed joint entropy prior can lead to comparable image quality to a conventional ML reconstruction with up to 5 times more counts. In conclusion, multi-parametric anato-functional priors provide a solution to address the pitfalls of the conventional priors and are therefore likely to increase the diagnostic confidence in MR-guided PET image reconstructions.

  5. Random walks with shape prior for cochlea segmentation in ex vivo μCT.

    PubMed

    Ruiz Pujadas, Esmeralda; Kjer, Hans Martin; Piella, Gemma; Ceresa, Mario; González Ballester, Miguel Angel

    2016-09-01

    Cochlear implantation is a safe and effective surgical procedure to restore hearing in deaf patients. However, the level of restoration achieved may vary due to differences in anatomy, implant type and surgical access. In order to reduce the variability of the surgical outcomes, we previously proposed the use of a high-resolution model built from μCT images and then adapted to patient-specific clinical CT scans. As the accuracy of the model is dependent on the precision of the original segmentation, it is extremely important to have accurate μCT segmentation algorithms. We propose a new framework for cochlea segmentation in ex vivo μCT images using random walks where a distance-based shape prior is combined with a region term estimated by a Gaussian mixture model. The prior is also weighted by a confidence map to adjust its influence according to the strength of the image contour. Random walks is performed iteratively, and the prior mask is aligned in every iteration. We tested the proposed approach in ten μCT data sets and compared it with other random walks-based segmentation techniques such as guided random walks (Eslami et al. in Med Image Anal 17(2):236-253, 2013) and constrained random walks (Li et al. in Advances in image and video technology. Springer, Berlin, pp 215-226, 2012). Our approach demonstrated higher accuracy results due to the probability density model constituted by the region term and shape prior information weighted by a confidence map. The weighted combination of the distance-based shape prior with a region term into random walks provides accurate segmentations of the cochlea. The experiments suggest that the proposed approach is robust for cochlea segmentation.

  6. Exploring Encoding and Retrieval Effects of Background Information on Text Memory

    ERIC Educational Resources Information Center

    Rawson, Katherine A.; Kintsch, Walter

    2004-01-01

    Two experiments were conducted (a) to evaluate how providing background information at test may benefit retrieval and (b) to further examine how providing background information prior to study influences encoding. Half of the participants read background information prior to study, and the other half did not. In each group, half were presented…

  7. Adaptive local thresholding for robust nucleus segmentation utilizing shape priors

    NASA Astrophysics Data System (ADS)

    Wang, Xiuzhong; Srinivas, Chukka

    2016-03-01

    This paper describes a novel local thresholding method for foreground detection. First, a Canny edge detection method is used for initial edge detection. Then, tensor voting is applied on the initial edge pixels, using a nonsymmetric tensor field tailored to encode prior information about nucleus size, shape, and intensity spatial distribution. Tensor analysis is then performed to generate the saliency image and, based on that, the refined edge. Next, the image domain is divided into blocks. In each block, at least one foreground and one background pixel are sampled for each refined edge pixel. The saliency weighted foreground histogram and background histogram are then created. These two histograms are used to calculate a threshold by minimizing the background and foreground pixel classification error. The block-wise thresholds are then used to generate the threshold for each pixel via interpolation. Finally, the foreground is obtained by comparing the original image with the threshold image. The effective use of prior information, combined with robust techniques, results in far more reliable foreground detection, which leads to robust nucleus segmentation.
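
A stripped-down version of the block-wise thresholding idea can be sketched with a plain per-block Otsu threshold. Note this deliberately omits the paper's tensor voting and saliency-weighted histograms, which are exactly what keeps pure-background blocks from being split arbitrarily; the synthetic image and block size are invented for illustration:

```python
import numpy as np

def otsu(vals, bins=64):
    """Threshold minimising intra-class variance for one block, i.e. the same
    background/foreground misclassification trade-off as in the paper."""
    hist, edges = np.histogram(vals, bins=bins)
    p = hist / hist.sum()
    w = np.cumsum(p)                       # class-0 mass up to each bin
    m = np.cumsum(p * edges[:-1])          # class-0 first moment
    mt = m[-1]
    with np.errstate(divide="ignore", invalid="ignore"):
        between = (mt * w - m) ** 2 / (w * (1 - w))
    return edges[np.nanargmax(between)]

def local_threshold(img, block=32):
    """Block-wise thresholds expanded to a per-pixel threshold image
    (nearest-block expansion here, instead of the paper's interpolation)."""
    t = np.zeros_like(img, dtype=float)
    for i in range(0, img.shape[0], block):
        for j in range(0, img.shape[1], block):
            t[i:i + block, j:j + block] = otsu(img[i:i + block, j:j + block].ravel())
    return img > t                         # foreground mask

# Synthetic test: a bright "nucleus" on a background illumination gradient
rng = np.random.default_rng(0)
img = np.linspace(0, 60, 128)[None, :] + rng.normal(0, 2, (128, 128))
img[40:60, 40:60] += 40
mask = local_threshold(img)
print(bool(mask[50, 50]))
```

Because the gradient varies across blocks, a single global threshold would fail here, which is the motivation for the local scheme; a saliency weighting would additionally suppress the spurious splits that per-block Otsu produces in blocks containing no nuclei.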

  8. Feasibility of reusing time-matched controls in an overlapping cohort.

    PubMed

    Delcoigne, Bénédicte; Hagenbuch, Niels; Schelin, Maria Ec; Salim, Agus; Lindström, Linda S; Bergh, Jonas; Czene, Kamila; Reilly, Marie

    2018-06-01

    The methods developed for secondary analysis of nested case-control data have been illustrated only in simplified settings within a common cohort and have not found their way into biostatistical practice. This paper demonstrates the feasibility of reusing prior nested case-control data in a realistic setting: a new outcome is available in an overlapping cohort, no new controls were gathered, and all data have been anonymised. Using basic information about the background cohort and sampling criteria, the new cases and prior data are "aligned" to identify the common underlying study base. With this study base, a Kaplan-Meier table of the prior outcome extracts the risk sets required to calculate the weights to assign to the controls to remove the sampling bias. A weighted Cox regression, implemented in standard statistical software, provides unbiased hazard ratios. Using the method to compare cases of contralateral breast cancer with available controls from a prior study of metastases, we identified a multifocal tumor as a risk factor that has not been reported previously. We examine the sensitivity of the method to an imperfect weighting scheme and discuss its merits and pitfalls to provide guidance for its use in medical research studies.

  9. Authorship Attribution with Function Word N-Grams

    ERIC Educational Resources Information Center

    Johnson, Rusty

    2013-01-01

    Prior research has considered the sequential order of function words, after the contextual words of the text have been removed, as a stylistic indicator of authorship. This research describes an effort to enhance authorship attribution accuracy based on this same information source with alternate classifiers, alternate n-gram construction methods,…

  10. Plumber's Helper. Coordinator's Guide. Individualized Study Guide.

    ERIC Educational Resources Information Center

    Traylor, Charles R.

    This individualized, competency-based study guide is designed to assist teacher-coordinators supervising cooperative education programs for plumber's helpers in providing students with general information for immediate reinforcement on the job and developing an understanding of the job prior to employment. A progress chart is provided to allow the…

  11. 47 CFR 64.2010 - Safeguards on the disclosure of customer proprietary network information.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... authenticate a customer prior to disclosing CPNI based on customer-initiated telephone contact, online account... customer. (c) Online access to CPNI. A telecommunications carrier must authenticate a customer without the... customer online access to CPNI related to a telecommunications service account. Once authenticated, the...

  12. 47 CFR 64.2010 - Safeguards on the disclosure of customer proprietary network information.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... authenticate a customer prior to disclosing CPNI based on customer-initiated telephone contact, online account... customer. (c) Online access to CPNI. A telecommunications carrier must authenticate a customer without the... customer online access to CPNI related to a telecommunications service account. Once authenticated, the...

  13. 47 CFR 64.2010 - Safeguards on the disclosure of customer proprietary network information.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... authenticate a customer prior to disclosing CPNI based on customer-initiated telephone contact, online account... customer. (c) Online access to CPNI. A telecommunications carrier must authenticate a customer without the... customer online access to CPNI related to a telecommunications service account. Once authenticated, the...

  14. Semantic Similarity of Labels and Inductive Generalization: Taking a Second Look

    ERIC Educational Resources Information Center

    Fisher, Anna V.; Matlen, Bryan J.; Godwin, Karrie E.

    2011-01-01

    Prior research suggests that preschoolers can generalize object properties based on category information conveyed by semantically-similar labels. However, previous research did not control for co-occurrence probability of labels in natural speech. The current studies re-assessed children's generalization with semantically-similar labels.…

  15. A Data Base for Curriculum Design in Medical Ethics.

    ERIC Educational Resources Information Center

    Tiberius, Richard G.; Cleave-Hogg, Doreen

    1984-01-01

    A study to provide information about medical students' prior knowledge of and attitudes toward medical ethics is reported. A questionnaire was administered to 845 entering medical students at the University of Toronto. The results support the need for a course that requires thinking rather than rote memory. (Author/MLW)

  16. The impact of using informative priors in a Bayesian cost-effectiveness analysis: an application of endovascular versus open surgical repair for abdominal aortic aneurysms in high-risk patients.

    PubMed

    McCarron, C Elizabeth; Pullenayegum, Eleanor M; Thabane, Lehana; Goeree, Ron; Tarride, Jean-Eric

    2013-04-01

    Bayesian methods have been proposed as a way of synthesizing all available evidence to inform decision making. However, few practical applications of the use of Bayesian methods for combining patient-level data (i.e., trial) with additional evidence (e.g., literature) exist in the cost-effectiveness literature. The objective of this study was to compare a Bayesian cost-effectiveness analysis using informative priors to a standard non-Bayesian nonparametric method to assess the impact of incorporating additional information into a cost-effectiveness analysis. Patient-level data from a previously published nonrandomized study were analyzed using traditional nonparametric bootstrap techniques and bivariate normal Bayesian models with vague and informative priors. Two different types of informative priors were considered to reflect different valuations of the additional evidence relative to the patient-level data (i.e., "face value" and "skeptical"). The impact of using different distributions and valuations was assessed in a sensitivity analysis. Models were compared in terms of incremental net monetary benefit (INMB) and cost-effectiveness acceptability frontiers (CEAFs). The bootstrapping and Bayesian analyses using vague priors provided similar results. The most pronounced impact of incorporating the informative priors was the increase in estimated life years in the control arm relative to what was observed in the patient-level data alone. Consequently, the incremental difference in life years originally observed in the patient-level data was reduced, and the INMB and CEAF changed accordingly. The results of this study demonstrate the potential impact and importance of incorporating additional information into an analysis of patient-level data, suggesting this could alter decisions as to whether a treatment should be adopted and whether more information should be acquired.
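
    The incremental net monetary benefit (INMB) compared in this record has the standard form INMB = λ·ΔE − ΔC. A minimal sketch of the nonparametric bootstrap side of such a comparison, using invented patient-level costs and effects and an invented willingness-to-pay λ (none of these are the study's figures):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical per-patient costs and effects for two arms;
# the distributions and numbers are illustrative only.
cost_new, eff_new = rng.normal(12000, 3000, 200), rng.normal(1.9, 0.5, 200)
cost_old, eff_old = rng.normal(9000, 2500, 200), rng.normal(1.7, 0.5, 200)
wtp = 50000  # assumed willingness-to-pay per unit of effect (lambda)

def inmb(c1, e1, c0, e0, lam=wtp):
    # incremental net monetary benefit: lambda * delta-effect - delta-cost
    return lam * (e1.mean() - e0.mean()) - (c1.mean() - c0.mean())

# Nonparametric bootstrap of the INMB
boot = []
for _ in range(2000):
    i = rng.integers(0, 200, 200)
    j = rng.integers(0, 200, 200)
    boot.append(inmb(cost_new[i], eff_new[i], cost_old[j], eff_old[j]))
boot = np.array(boot)
print(f"INMB point estimate: {inmb(cost_new, eff_new, cost_old, eff_old):.0f}")
print(f"P(cost-effective at lambda={wtp}): {(boot > 0).mean():.2f}")
```

    The Bayesian variant in the paper replaces the bootstrap distribution with a posterior, which is where vague versus informative priors come into play.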

  17. Bayesian bivariate meta-analysis of diagnostic test studies with interpretable priors.

    PubMed

    Guo, Jingyi; Riebler, Andrea; Rue, Håvard

    2017-08-30

    In a bivariate meta-analysis, the number of diagnostic studies involved is often very low so that frequentist methods may result in problems. Using Bayesian inference is particularly attractive as informative priors that add a small amount of information can stabilise the analysis without overwhelming the data. However, Bayesian analysis is often computationally demanding and the selection of the prior for the covariance matrix of the bivariate structure is crucial with little data. The integrated nested Laplace approximations method provides an efficient solution to the computational issues by avoiding any sampling, but the important question of priors remains. We explore the penalised complexity (PC) prior framework for specifying informative priors for the variance parameters and the correlation parameter. PC priors facilitate model interpretation and hyperparameter specification as expert knowledge can be incorporated intuitively. We conduct a simulation study to compare the properties and behaviour of differently defined PC priors to currently used priors in the field. The simulation study shows that the PC prior seems beneficial for the variance parameters. The use of PC priors for the correlation parameter results in more precise estimates when specified in a sensible neighbourhood around the truth. To investigate the usage of PC priors in practice, we reanalyse a meta-analysis using the telomerase marker for the diagnosis of bladder cancer and compare the results with those obtained by other commonly used modelling approaches. Copyright © 2017 John Wiley & Sons, Ltd.
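
    For a standard deviation parameter, the penalised complexity prior mentioned in this record reduces to an exponential density whose rate is fixed by a tail statement P(σ > U) = α, which gives λ = −ln(α)/U. A small numeric illustration (the choices U = 1 and α = 0.01 are arbitrary, not values from the paper):

```python
import numpy as np

def pc_prior_rate(U, alpha):
    # rate lambda such that P(sigma > U) = alpha under Exp(lambda)
    return -np.log(alpha) / U

def pc_prior_density(sigma, U, alpha):
    # exponential density on the standard deviation sigma
    lam = pc_prior_rate(U, alpha)
    return lam * np.exp(-lam * sigma)

# Encode the belief P(sigma > 1) = 0.01
lam = pc_prior_rate(U=1.0, alpha=0.01)
sigmas = np.linspace(0, 2, 5)
print(lam)  # rate = -ln(0.01), roughly 4.605
print(pc_prior_density(sigmas, 1.0, 0.01))
```

    The intuitive hyperparameter specification the abstract refers to is exactly this: the analyst states an upper bound U and a tail probability α rather than abstract shape parameters.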

  18. Explaining the forgetting bias effect on value judgments: The influence of memory for a past test.

    PubMed

    Rhodes, Matthew G; Witherby, Amber E; Castel, Alan D; Murayama, Kou

    2017-04-01

    People often feel that information that was forgotten is less important than remembered information. Prior work has shown that participants assign higher importance to remembered information while undervaluing forgotten information. The current study examined two possible accounts of this finding. In three experiments, participants studied lists of words in which each word was randomly assigned a point value denoting the value of remembering the word. Following the presentation of each list, participants engaged in a free recall test. After the presentation of all lists, participants were shown each of the words they had studied and asked to recall the point value that was initially paired with each word. Experiment 1 tested a fluency-based account by presenting items for value judgments in a low-fluency or high-fluency format. Experiment 2 examined whether value judgments reflect attributions based on the familiarity of an item when value judgments are made. Finally, in Experiment 3, we evaluated whether participants believe that forgotten words are less important by having them judge whether an item was initially recalled or forgotten prior to making a value judgment. Manipulating the fluency of an item presented for judgment had no influence on value ratings (Experiment 1), and familiarity exerted a limited influence on value judgments (Experiment 2). More importantly, participants' value judgments appeared to reflect a theory that remembered information is more valuable than forgotten information (Experiment 3). Overall, the present work suggests that individuals may apply a theory about remembering and forgetting to retrospectively assess the value of information.

  19. Diagnostic labeling of COPD in five Latin American cities.

    PubMed

    Tálamo, Carlos; de Oca, Maria Montes; Halbert, Ron; Perez-Padilla, Rogelio; Jardim, José Roberto B; Muiño, Adriana; Lopez, Maria Victorina; Valdivia, Gonzalo; Pertuzé, Julio; Moreno, Dolores; Menezes, Ana Maria B

    2007-01-01

    COPD is a major worldwide problem with a rising prevalence. Despite its importance, there is a lack of information regarding underdiagnosis and misdiagnosis of COPD in different countries. As part of the Proyecto Latinoamericano de Investigación en Obstrucción Pulmonar study, we examined the relationship between prior diagnostic label and airway obstruction in the metropolitan areas of five Latin American cities (São Paulo, Santiago, Mexico City, Montevideo, and Caracas). A two-stage sampling strategy was used in each of the five areas to obtain probability samples of adults aged ≥ 40 years. Participants completed a questionnaire that included questions on prior diagnoses, and prebronchodilator and postbronchodilator spirometry. A study diagnosis of COPD was based on airway obstruction, defined as a postbronchodilator FEV1/FVC < 0.70. Valid spirometry and prior diagnosis information was obtained for 5,303 participants; 758 subjects had a study diagnosis of COPD, of which 672 cases (88.7%) had not been previously diagnosed. The prevalence of undiagnosed COPD was 12.7%, ranging from 6.9% in Mexico City to 18.2% in Montevideo. Among 237 subjects with a prior COPD diagnosis, only 86 subjects (36.3%) had postbronchodilator FEV1/FVC < 0.70, while 151 subjects (63.7%) had normal spirometric values. In the same group of 237 subjects, only 34% reported ever undergoing spirometry prior to our study. Inaccurate diagnostic labeling of COPD represents an important health problem in Latin America. One possible explanation is the low rate of spirometry for COPD diagnosis.

  20. Dimensionality of the 9-item Utrecht Work Engagement Scale revisited: A Bayesian structural equation modeling approach.

    PubMed

    Fong, Ted C T; Ho, Rainbow T H

    2015-01-01

    The aim of this study was to reexamine the dimensionality of the widely used 9-item Utrecht Work Engagement Scale using the maximum likelihood (ML) approach and Bayesian structural equation modeling (BSEM) approach. Three measurement models (1-factor, 3-factor, and bi-factor models) were evaluated in two split samples of 1,112 health-care workers using confirmatory factor analysis and BSEM, which specified small-variance informative priors for cross-loadings and residual covariances. Model fit and comparisons were evaluated by posterior predictive p-value (PPP), deviance information criterion, and Bayesian information criterion (BIC). None of the three ML-based models showed an adequate fit to the data. The use of informative priors for cross-loadings did not improve the PPP for the models. The 1-factor BSEM model with approximately zero residual covariances displayed a good fit (PPP>0.10) to both samples and a substantially lower BIC than its 3-factor and bi-factor counterparts. The BSEM results demonstrate empirical support for the 1-factor model as a parsimonious and reasonable representation of work engagement.

  1. Perceptual constancy in auditory perception of distance to railway tracks.

    PubMed

    De Coensel, Bert; Nilsson, Mats E; Berglund, Birgitta; Brown, A L

    2013-07-01

    Distance to a sound source can be accurately estimated solely from auditory information. With a sound source such as a train that is passing by at a relatively large distance, the most important auditory information for the listener for estimating its distance consists of the intensity of the sound, spectral changes in the sound caused by air absorption, and the motion-induced rate of change of intensity. However, these cues are relative because prior information/experience of the sound source-its source power, its spectrum and the typical speed at which it moves-is required for such distance estimates. This paper describes two listening experiments that allow investigation of further prior contextual information taken into account by listeners-viz., whether they are indoors or outdoors. Asked to estimate the distance to the track of a railway, it is shown that listeners assessing sounds heard inside the dwelling based their distance estimates on the expected train passby sound level outdoors rather than on the passby sound level actually experienced indoors. This form of perceptual constancy may have consequences for the assessment of annoyance caused by railway noise.

  2. A modified non-binary LDPC scheme based on watermark symbols in high speed optical transmission systems

    NASA Astrophysics Data System (ADS)

    Wang, Liming; Qiao, Yaojun; Yu, Qian; Zhang, Wenbo

    2016-04-01

    We introduce a watermark non-binary low-density parity-check (NB-LDPC) coding scheme, which can estimate the time-varying noise variance by using prior information from watermark symbols, to improve the performance of NB-LDPC codes. Compared with the prior-art counterpart, the watermark scheme brings about a 0.25 dB improvement in net coding gain (NCG) at a bit error rate (BER) of 1e-6 and a 36.8-81% reduction in the number of decoding iterations. The proposed scheme thus shows great potential in terms of error-correction performance and decoding efficiency.
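
    The core idea, estimating noise variance from symbols the receiver already knows, can be sketched as a simple pilot-based estimate. The symbol alphabet, block length, and noise level below are illustrative, not the paper's system parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Known pilot ("watermark") symbols interleaved into the transmitted
# stream let the receiver estimate the noise variance directly.
watermark = rng.choice([-1.0, 1.0], size=64)          # known at the receiver
true_sigma = 0.3
received = watermark + rng.normal(0.0, true_sigma, 64)

# Maximum-likelihood estimate of the noise variance from the pilots;
# feeding this into the decoder's channel LLRs is what the scheme exploits.
noise_var_hat = np.mean((received - watermark) ** 2)
print(f"estimated sigma: {np.sqrt(noise_var_hat):.3f} (true {true_sigma})")
```

    In a time-varying channel this estimate would be recomputed per block, tracking the variance as it drifts.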

  3. NASA's Human Mission to a Near-Earth Asteroid: Landing on a Moving Target

    NASA Technical Reports Server (NTRS)

    Smith, Jeffrey H.; Lincoln, William P.; Weisbin, Charles R.

    2011-01-01

    This paper describes a Bayesian approach for comparing the productivity and cost-risk tradeoffs of sending versus not sending one or more robotic surveyor missions prior to a human mission to land on an asteroid. The expected value of sample information based on productivity, combined with parametric variations in the prior probability that an asteroid would be found suitable for landing, was used to assess the optimal number of spacecraft and asteroids to survey. The analysis supports the value of surveyor missions to asteroids and indicates that one launch with two spacecraft traveling simultaneously to two independent asteroids appears optimal.
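
    The value-of-information reasoning in this record can be illustrated with a toy pre-posterior calculation. All probabilities and payoffs below are invented, and the survey is idealised as revealing suitability perfectly, which is an assumption, not the paper's model:

```python
# Toy expected-value-of-sample-information calculation.
p_suitable = 0.6                       # prior probability of a suitable asteroid
v_success, v_failure = 100.0, -40.0    # payoff of an attempted landing
survey_cost = 5.0

# Without a survey: attempt the landing only if its expected payoff is positive.
ev_no_survey = max(p_suitable * v_success + (1 - p_suitable) * v_failure, 0.0)

# With a (perfect) survey: attempt only when the asteroid is suitable.
ev_survey = p_suitable * v_success + (1 - p_suitable) * 0.0 - survey_cost

evsi = ev_survey - ev_no_survey
print(f"EV without survey: {ev_no_survey}, with survey: {ev_survey}, EVSI: {evsi}")
```

    Varying `p_suitable` parametrically, as the paper does with the prior, shows where the survey stops paying for itself.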

  4. The integration of probabilistic information during sensorimotor estimation is unimpaired in children with Cerebral Palsy

    PubMed Central

    Sokhey, Taegh; Gaebler-Spira, Deborah; Kording, Konrad P.

    2017-01-01

    Background It is important to understand the motor deficits of children with Cerebral Palsy (CP). Our understanding of this motor disorder can be enriched by computational models of motor control. One crucial stage in generating movement involves combining uncertain information from different sources, and deficits in this process could contribute to reduced motor function in children with CP. Healthy adults can integrate previously-learned information (prior) with incoming sensory information (likelihood) in a close-to-optimal way when estimating object location, consistent with the use of Bayesian statistics. However, there are few studies investigating how children with CP perform sensorimotor integration. We compare sensorimotor estimation in children with CP and age-matched controls using a model-based analysis to understand the process. Methods and findings We examined Bayesian sensorimotor integration in children with CP, aged between 5 and 12 years old, with Gross Motor Function Classification System (GMFCS) levels 1–3 and compared their estimation behavior with age-matched typically-developing (TD) children. We used a simple sensorimotor estimation task which requires participants to combine probabilistic information from different sources: a likelihood distribution (current sensory information) with a prior distribution (learned target information). In order to examine sensorimotor integration, we quantified how participants weighed statistical information from the two sources (prior and likelihood) and compared this to the statistical optimal weighting. We found that the weighing of statistical information in children with CP was as statistically efficient as that of TD children. Conclusions We conclude that Bayesian sensorimotor integration is not impaired in children with CP and therefore, does not contribute to their motor deficits. 
Future research has the potential to enrich our understanding of motor disorders by investigating the stages of motor processing set out by computational models. Therapeutic interventions should exploit the ability of children with CP to use statistical information. PMID:29186196
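
    The close-to-optimal integration described in this record is usually formalised as precision-weighted averaging of a Gaussian prior and likelihood: each source is weighted by its inverse variance. A minimal numeric sketch (the means and variances are arbitrary illustration values):

```python
# Optimal (Bayesian) combination of a learned prior with current
# sensory evidence, both modelled as Gaussians.
mu_prior, var_prior = 0.0, 4.0   # learned target distribution (prior)
mu_like, var_like = 3.0, 1.0     # current sensory estimate (likelihood)

# Weight on the likelihood is its share of the total precision.
w_like = (1 / var_like) / (1 / var_like + 1 / var_prior)
mu_post = w_like * mu_like + (1 - w_like) * mu_prior
var_post = 1 / (1 / var_like + 1 / var_prior)

print(w_like)    # 0.8: the sharper likelihood dominates
print(mu_post)   # 2.4: estimate pulled partway toward the prior
```

    Comparing participants' empirical weighting to `w_like` is exactly the statistical-efficiency test the study applies.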

  5. Objective Bayesian analysis of neutrino masses and hierarchy

    NASA Astrophysics Data System (ADS)

    Heavens, Alan F.; Sellentin, Elena

    2018-04-01

    Given the precision of current neutrino data, priors still noticeably impact the constraints on neutrino masses and their hierarchy. To avoid our understanding of neutrinos being driven by prior assumptions, we construct a prior that is mathematically minimally informative. Using the constructed uninformative prior, we find that the normal hierarchy is favoured but with inconclusive posterior odds of 5.1:1. Better data are hence needed before the neutrino masses and their hierarchy can be well constrained. We find that the next decade of cosmological data should provide conclusive evidence if the normal hierarchy with negligible minimum mass is correct, and if the uncertainty in the sum of neutrino masses drops below 0.025 eV. On the other hand, if neutrinos obey the inverted hierarchy, achieving strong evidence will be difficult with the same uncertainties. Our uninformative prior was constructed from principles of the Objective Bayesian approach. The prior is called a reference prior and is minimally informative in the specific sense that the information gain after collection of data is maximised. The prior is computed for the combination of neutrino oscillation data and cosmological data and still applies if the data improve.

  6. Short-term memory affects color perception in context.

    PubMed

    Olkkonen, Maria; Allred, Sarah R

    2014-01-01

    Color-based object selection - for instance, looking for ripe tomatoes in the market - places demands on both perceptual and memory processes: it is necessary to form a stable perceptual estimate of surface color from a variable visual signal, as well as to retain multiple perceptual estimates in memory while comparing objects. Nevertheless, perceptual and memory processes in the color domain are generally studied in separate research programs with the assumption that they are independent. Here, we demonstrate a strong failure of independence between color perception and memory: the effect of context on color appearance is substantially weakened by a short retention interval between a reference and test stimulus. This somewhat counterintuitive result is consistent with Bayesian estimation: as the precision of the representation of the reference surface and its context decays in memory, prior information gains more weight, causing the retained percepts to be drawn toward prior information about surface and context color. This interaction implies that to fully understand information processing in real-world color tasks, perception and memory need to be considered jointly.

  7. A 3-Component Mixture of Rayleigh Distributions: Properties and Estimation in Bayesian Framework

    PubMed Central

    Aslam, Muhammad; Tahir, Muhammad; Hussain, Zawar; Al-Zahrani, Bander

    2015-01-01

    To study lifetimes of certain engineering processes, a lifetime model which can accommodate the nature of such processes is desired. The mixture models of underlying lifetime distributions are intuitively more appropriate and appealing to model the heterogeneous nature of a process as compared to simple models. This paper studies a 3-component mixture of Rayleigh distributions in a Bayesian perspective. The censored sampling environment is considered due to its popularity in reliability theory and survival analysis. The expressions for the Bayes estimators and their posterior risks are derived under different scenarios. In the case that no or little prior information is available, elicitation of hyperparameters is given. To examine, numerically, the performance of the Bayes estimators using non-informative and informative priors under different loss functions, we have simulated their statistical properties for different sample sizes and test termination times. In addition, to highlight the practical significance, an illustrative example based on real-life engineering data is also given. PMID:25993475
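
    Simulating from a 3-component Rayleigh mixture of the kind studied here is straightforward; the mixing weights and scale parameters below are arbitrary choices, not elicited values from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 3-component Rayleigh mixture: pick a component by its
# mixing weight, then draw from that component's Rayleigh distribution.
weights = np.array([0.5, 0.3, 0.2])
scales = np.array([1.0, 2.0, 4.0])

comp = rng.choice(3, size=10000, p=weights)
samples = rng.rayleigh(scale=scales[comp])

# The mixture mean is sqrt(pi/2) times the weighted mean of the scales.
print(samples.mean())  # close to sqrt(pi/2) * 1.9, about 2.38
```

    Such draws are the starting point for checking simulated Bayes estimators against the known generating parameters, as the paper's numerical study does.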

  8. Breast cancer patients' information seeking during surgical consultations: A qualitative, videotape-based analysis of patients' questions.

    PubMed

    Robinson, Jeffrey D; Venetis, Maria; Street, Richard L; Kearney, Thomas

    2016-12-01

    Despite data on breast cancer patients' information needs and their association with patient outcomes, there are currently no data on what U.S. patients actually ask surgeons during primary consultations. Working from transcripts of videotaped, treatment decision-making consultations between breast cancer patients and surgeons, we identify all questions (by patients and companions) and then use grounded theory techniques to determine the most recurrent question-asking themes. The sample includes 132 recently diagnosed (M = 8.9 days), late-middle-aged (M = 61.2 years), female patients with predominantly early stage (0-1; 78%), first-time breast cancer (92.4%) consulting with one of nine surgeons in community-based offices. Transcripts contained 2,781 questions (1,929 by patients, 852 by companions; Cohen's Kappa = 0.90), which generated 15 patient question-asking themes that were represented (i.e., asked about) at least once in >20% of all consultations. Question-asking themes are a concrete index of what patients want to know more about prior to treatment. Identified themes specify, modify, and extend prior findings based on self-report data. Findings potentially increase surgeons' levels of patient-centered care by improving surgeons' abilities to satisfactorily address patients' information needs, which has the potential to improve both patient outcomes and clinical practice guidelines. J. Surg. Oncol. 2016;114:922-929. © 2016 Wiley Periodicals, Inc.

  9. Army women's sexual health information needs.

    PubMed

    von Sadovszky, Victoria; Ryan-Wenger, Nancy

    2007-01-01

    To ascertain Army women's specific sexual health information needs prior to developing a theoretically based, self-administered intervention to promote safer sexual practices during deployment. An exploratory design was employed to address the research questions. Participants (N = 131) were Army women recruited from Army posts around the United States. The women ranged in age from 18 to 68 years (M = 30.8, SD = 10.5), were of varied ethnicity, and had an average time in service of 8.0 years (SD = 6.6). Desire for knowledge about sexual health and safer sexual practices was measured with forced-choice responses based upon DiIorio's Safer Sex Questionnaire (DiIorio, Parsons, Lehr, Adame, & Carlone, 1992) and open-ended questions to assess past information received, quality of that information, and information desired. Participants had moderate levels of sexual risk behaviors. Forced-choice responses yielded little desire for information regarding safer sexual practices. Women identified different sexual health and safer sexual information needs based upon whether they were at a normal duty station or during deployment. Participants did not identify many information needs; however, their sexual behaviors indicate the need for interventions.

  10. LORAKS makes better SENSE: Phase-constrained partial fourier SENSE reconstruction without phase calibration.

    PubMed

    Kim, Tae Hyung; Setsompop, Kawin; Haldar, Justin P

    2017-03-01

    Parallel imaging and partial Fourier acquisition are two classical approaches for accelerated MRI. Methods that combine these approaches often rely on prior knowledge of the image phase, but the need to obtain this prior information can place practical restrictions on the data acquisition strategy. In this work, we propose and evaluate SENSE-LORAKS, which enables combined parallel imaging and partial Fourier reconstruction without requiring prior phase information. The proposed formulation is based on combining the classical SENSE model for parallel imaging data with the more recent LORAKS framework for MR image reconstruction using low-rank matrix modeling. Previous LORAKS-based methods have successfully enabled calibrationless partial Fourier parallel MRI reconstruction, but have been most successful with nonuniform sampling strategies that may be hard to implement for certain applications. By combining LORAKS with SENSE, we enable highly accelerated partial Fourier MRI reconstruction for a broader range of sampling trajectories, including widely used calibrationless uniformly undersampled trajectories. Our empirical results with retrospectively undersampled datasets indicate that when SENSE-LORAKS reconstruction is combined with an appropriate k-space sampling trajectory, it can provide substantially better image quality at high-acceleration rates relative to existing state-of-the-art reconstruction approaches. The SENSE-LORAKS framework provides promising new opportunities for highly accelerated MRI. Magn Reson Med 77:1021-1035, 2017. © 2016 International Society for Magnetic Resonance in Medicine.

  11. A randomized comparative trial of two decision tools for pregnant women with prior cesareans.

    PubMed

    Eden, Karen B; Perrin, Nancy A; Vesco, Kimberly K; Guise, Jeanne-Marie

    2014-01-01

    Evaluate tools to help pregnant women with prior cesareans make informed decisions about having trials of labor. Randomized comparative trial. A research assistant with a laptop met the women in quiet locations at clinics and at health fairs. Pregnant women (N = 131) who had one prior cesarean and were eligible for vaginal birth after cesarean (VBAC) participated one time between 2005 and 2007. Women were randomized to receive either an evidence-based, interactive decision aid or two evidence-based educational brochures about cesarean delivery and VBAC. Effect on the decision-making process was assessed before and after the interventions. Compared to baseline, women in both groups felt more informed (F = 23.8, p < .001), were more clear about their birth priorities (F = 9.7, p = .002), felt more supported (F = 9.8, p = .002), and overall reported less conflict (F = 18.1, p < .001) after receiving either intervention. Women in their third trimesters reported greater clarity around birth priorities after using the interactive decision aid than women given brochures (F = 9.8, p = .003). Although both decision tools significantly reduced conflict around the birth decision compared to baseline, more work is needed to understand which format, the interactive decision aid or paper brochures, is more effective early and late in pregnancy. © 2014 AWHONN, the Association of Women's Health, Obstetric and Neonatal Nurses.

  12. The Critical Role of Retrieval Processes in Release from Proactive Interference

    ERIC Educational Resources Information Center

    Bauml, Karl-Heinz T.; Kliegl, Oliver

    2013-01-01

    Proactive interference (PI) refers to the finding that memory for recently studied (target) information can be vastly impaired by the previous study of other (nontarget) information. PI can be reduced in a number of ways, for instance, by directed forgetting of the prior nontarget information, the testing of the prior nontarget information, or an…

  13. Optimal Multiple Surface Segmentation With Shape and Context Priors

    PubMed Central

    Bai, Junjie; Garvin, Mona K.; Sonka, Milan; Buatti, John M.; Wu, Xiaodong

    2014-01-01

    Segmentation of multiple surfaces in medical images is a challenging problem, further complicated by the frequent presence of weak boundary evidence, large object deformations, and mutual influence between adjacent objects. This paper reports a novel approach to multi-object segmentation that incorporates both shape and context prior knowledge in a 3-D graph-theoretic framework to help overcome the stated challenges. We employ an arc-based graph representation to incorporate a wide spectrum of prior information through pair-wise energy terms. In particular, a shape-prior term is used to penalize local shape changes and a context-prior term is used to penalize local surface-distance changes from a model of the expected shape and surface distances, respectively. The globally optimal solution for multiple surfaces is obtained by computing a maximum flow in a low-order polynomial time. The proposed method was validated on intraretinal layer segmentation of optical coherence tomography images and demonstrated statistically significant improvement of segmentation accuracy compared to our earlier graph-search method that was not utilizing shape and context priors. The mean unsigned surface positioning error obtained by the conventional graph-search approach (6.30 ± 1.58 μm) improved to 5.14 ± 0.99 μm when employing our new method with shape and context priors. PMID:23193309

  14. Autistic traits, but not schizotypy, predict increased weighting of sensory information in Bayesian visual integration.

    PubMed

    Karvelis, Povilas; Seitz, Aaron R; Lawrie, Stephen M; Seriès, Peggy

    2018-05-14

    Recent theories propose that schizophrenia/schizotypy and autistic spectrum disorder (ASD) are related to impairments in Bayesian inference, that is, in how the brain integrates sensory information (likelihoods) with prior knowledge. However, existing accounts fail to clarify (i) how the proposed theories differ in their accounts of ASD vs. schizophrenia and (ii) whether the impairments result from weaker priors or enhanced likelihoods. Here, we directly address these issues by characterizing how 91 healthy participants, scored for autistic and schizotypal traits, implicitly learned and combined priors with sensory information. This was accomplished through a visual statistical learning paradigm designed to quantitatively assess variations in individuals' likelihoods and priors. The acquisition of the priors was found to be intact along both trait spectra. However, autistic traits were associated with more veridical perception and a weaker influence of expectations. Bayesian modeling revealed that this was due, not to weaker prior expectations, but to more precise sensory representations. © 2018, Karvelis et al.

  15. Whatever the cost? Information integration in memory-based inferences depends on cognitive effort.

    PubMed

    Hilbig, Benjamin E; Michalkiewicz, Martha; Castela, Marta; Pohl, Rüdiger F; Erdfelder, Edgar

    2015-05-01

    One of the most prominent models of probabilistic inferences from memory is the simple recognition heuristic (RH). The RH theory assumes that judgments are based on recognition in isolation, such that other information is ignored. However, some prior research has shown that available knowledge is not generally ignored. In line with the notion of adaptive strategy selection--and, thus, a trade-off between accuracy and effort--we hypothesized that information integration crucially depends on how easily accessible information beyond recognition is, how much confidence decision makers have in this information, and how (cognitively) costly it is to acquire it. In three experiments, we thus manipulated (a) the availability of information beyond recognition, (b) the subjective usefulness of this information, and (c) the cognitive costs associated with acquiring this information. In line with the predictions, we found that RH use decreased substantially, the more easily and confidently information beyond recognition could be integrated, and increased substantially with increasing cognitive costs.

  16. Informative priors based on transcription factor structural class improve de novo motif discovery.

    PubMed

    Narlikar, Leelavati; Gordân, Raluca; Ohler, Uwe; Hartemink, Alexander J

    2006-07-15

    An important problem in molecular biology is to identify the locations at which a transcription factor (TF) binds to DNA, given a set of DNA sequences believed to be bound by that TF. In previous work, we showed that information in the DNA sequence of a binding site is sufficient to predict the structural class of the TF that binds it. In particular, this suggests that we can predict which locations in any DNA sequence are more likely to be bound by certain classes of TFs than others. Here, we argue that traditional methods for de novo motif finding can be significantly improved by adopting an informative prior probability that a TF binding site occurs at each sequence location. To demonstrate the utility of such an approach, we present priority, a powerful new de novo motif finding algorithm. Using data from TRANSFAC, we train three classifiers to recognize binding sites of basic leucine zipper, forkhead, and basic helix loop helix TFs. These classifiers are used to equip priority with three class-specific priors, in addition to a default prior to handle TFs of other classes. We apply priority and a number of popular motif finding programs to sets of yeast intergenic regions that are reported by ChIP-chip to be bound by particular TFs. priority identifies motifs the other methods fail to identify, and correctly predicts the structural class of the TF recognizing the identified binding sites. Supplementary material and code can be found at http://www.cs.duke.edu/~amink/.
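
    The idea of equipping a motif finder with an informative positional prior can be sketched as adding log prior odds to a position weight matrix (PWM) log-likelihood ratio at each candidate site. The PWM, sequence, and classifier-derived prior below are all invented for illustration; this is a sketch of the general idea, not the priority algorithm itself:

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical 6-bp motif model, background, and per-position prior.
seq = "ACGTACGTTGACTCATACGT"
base = {"A": 0, "C": 1, "G": 2, "T": 3}
pwm = rng.dirichlet(np.ones(4), size=6)     # rows: positions, cols: A,C,G,T
background = np.full(4, 0.25)
prior = rng.uniform(0.05, 0.5, len(seq))    # stand-in for a classifier's prior

def log_odds(i):
    # PWM log-likelihood ratio for a site starting at i ...
    idx = [base[c] for c in seq[i:i + 6]]
    lr = sum(np.log(pwm[j, b] / background[b]) for j, b in enumerate(idx))
    # ... plus the log prior odds that a site starts here.
    return lr + np.log(prior[i] / (1 - prior[i]))

scores = [log_odds(i) for i in range(len(seq) - 5)]
best = int(np.argmax(scores))
print(f"best-scoring start: {best}")
```

    Swapping in a class-specific prior (e.g., one trained on bZIP sites) simply changes the `prior` array, which is how class information steers the search.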

  17. Managing Engineering Design Information

    DTIC Science & Technology

    1989-10-01

    aerospace industry, and design operations cannot be delayed until a prior task is completed [Ref. 9]. ... [Figure 4: Translator Interface Between Application Tools] ... 2. Directory Data Base Approach. The directory approach uses a data base with the traditional... Technologies, 1985, pp. 313-320. 17. Bray, O.H., "Computer-Integrated Manufacturing: The Data Management Strategy," Digital Press, Bedford, MA, 1988. 18. Atre

  18. GeneCOST: a novel scoring-based prioritization framework for identifying disease causing genes.

    PubMed

    Ozer, Bugra; Sağıroğlu, Mahmut; Demirci, Hüseyin

    2015-11-15

    Due to the big data produced by next-generation sequencing studies, there is an evident need for methods to extract the valuable information gathered from these experiments. In this work, we propose GeneCOST, a novel scoring-based method to evaluate every gene for its disease association. Without any prior filtering and without any prior knowledge, we assign a disease likelihood score to each gene in correspondence with its variations. We then rank all genes based on frequency, conservation, pedigree and detailed variation information to find the causative reason for the disease state. We demonstrate the use of GeneCOST with public and real-life Mendelian disease cases, including recessive, dominant, compound-heterozygous and sporadic models. As a result, we were able to identify the causative reason behind the disease state in the top rankings of our list, showing that this novel prioritization framework provides a powerful environment for analysis in genetic disease studies as an alternative to filtering-based approaches. GeneCOST software is freely available at www.igbam.bilgem.tubitak.gov.tr/en/softwares/genecost-en/index.html. buozer@gmail.com Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  19. Constant time worker thread allocation via configuration caching

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Eichenberger, Alexandre E; O'Brien, John K. P.

    Mechanisms are provided for allocating threads for execution of a parallel region of code. A request for allocation of worker threads to execute the parallel region of code is received from a master thread. Cached thread allocation information, identifying prior thread allocations that have been performed for the master thread, is accessed. Worker threads are allocated to the master thread based on the cached thread allocation information. The parallel region of code is executed using the allocated worker threads.
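
    A minimal sketch of the caching idea described above, with all class names, the cache key, and the allocation policy invented for illustration (the abstract does not specify them): a master thread's first allocation request is computed and cached, and subsequent requests are served from the cache in constant time.

```python
from concurrent.futures import ThreadPoolExecutor

class ThreadAllocator:
    """Illustrative worker-thread allocator with configuration caching."""

    def __init__(self, pool_size=8):
        self.pool_size = pool_size
        self.pool = ThreadPoolExecutor(max_workers=pool_size)
        self._cache = {}  # master-thread id -> worker count previously granted

    def allocate(self, master_id, requested):
        # Constant-time path: reuse the allocation cached for this master.
        if master_id in self._cache:
            return self._cache[master_id]
        granted = min(requested, self.pool_size)  # placeholder policy
        self._cache[master_id] = granted
        return granted

    def run_parallel_region(self, master_id, fn, items):
        # Allocation is resolved (from cache, after the first call), then the
        # parallel region is executed on the pool standing in for the workers.
        self.allocate(master_id, len(items))
        return list(self.pool.map(fn, items))

alloc = ThreadAllocator()
first = alloc.allocate("master-1", 4)    # computed and cached
second = alloc.allocate("master-1", 16)  # served from the cache
results = alloc.run_parallel_region("master-1", lambda x: x * x, [1, 2, 3])
```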

  20. Incorporation of prior information on parameters into nonlinear regression groundwater flow models: 2. Applications

    USGS Publications Warehouse

    Cooley, Richard L.

    1983-01-01

    This paper investigates factors influencing the degree of improvement in estimates of parameters of a nonlinear regression groundwater flow model by incorporating prior information of unknown reliability. Consideration of expected behavior of the regression solutions and results of a hypothetical modeling problem lead to several general conclusions. First, if the parameters are properly scaled, linearized expressions for the mean square error (MSE) in parameter estimates of a nonlinear model will often behave very nearly as if the model were linear. Second, by using prior information, the MSE in properly scaled parameters can be reduced greatly over the MSE of ordinary least squares estimates of parameters. Third, plots of estimated MSE and the estimated standard deviation of MSE versus an auxiliary parameter (the ridge parameter) specifying the degree of influence of the prior information on regression results can help determine the potential for improvement of parameter estimates. Fourth, proposed criteria can be used to make appropriate choices for the ridge parameter and another parameter expressing degree of overall bias in the prior information. Results of a case study of Truckee Meadows, Reno-Sparks area, Washoe County, Nevada, conform closely to the results of the hypothetical problem. In the Truckee Meadows case, incorporation of prior information did not greatly change the parameter estimates from those obtained by ordinary least squares. However, the analysis showed that both sets of estimates are more reliable than suggested by the standard errors from ordinary least squares.
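
    The third conclusion above (examining estimated MSE as a function of the ridge parameter) can be illustrated on a synthetic linear problem; the design, noise level, and biased prior values below are placeholders, not from the paper's groundwater model.

```python
import numpy as np

rng = np.random.default_rng(4)
n, p = 30, 3
X = rng.normal(size=(n, p))
beta_true = np.array([2.0, -1.0, 0.5])
prior = np.array([1.8, -0.8, 0.4])  # informative but somewhat biased prior

def mse_at(lam, trials=200):
    """Empirical MSE of the ridge-type estimator that shrinks toward the
    prior values with strength lam (lam = 0 is ordinary least squares)."""
    errs = []
    for _ in range(trials):
        y = X @ beta_true + rng.normal(scale=2.0, size=n)
        A = X.T @ X + lam * np.eye(p)
        b_hat = np.linalg.solve(A, X.T @ y + lam * prior)
        errs.append(np.sum((b_hat - beta_true) ** 2))
    return float(np.mean(errs))

# Sweeping the ridge parameter traces out the MSE curve the paper inspects.
mse_curve = {lam: mse_at(lam) for lam in [0.0, 1.0, 5.0, 20.0, 1e6]}
```

With a reasonably accurate prior, intermediate ridge values reduce MSE below the ordinary least squares value at `lam = 0`, mirroring the paper's second conclusion.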

  1. Heuristics as Bayesian inference under extreme priors.

    PubMed

    Parpart, Paula; Jones, Matt; Love, Bradley C

    2018-05-01

    Simple heuristics are often regarded as tractable decision strategies because they ignore a great deal of information in the input data. One puzzle is why heuristics can outperform full-information models, such as linear regression, which make full use of the available information. These "less-is-more" effects, in which a relatively simpler model outperforms a more complex model, are prevalent throughout cognitive science, and are frequently argued to demonstrate an inherent advantage of simplifying computation or ignoring information. In contrast, we show at the computational level (where algorithmic restrictions are set aside) that it is never optimal to discard information. Through a formal Bayesian analysis, we prove that popular heuristics, such as tallying and take-the-best, are formally equivalent to Bayesian inference under the limit of infinitely strong priors. Varying the strength of the prior yields a continuum of Bayesian models with the heuristics at one end and ordinary regression at the other. Critically, intermediate models perform better across all our simulations, suggesting that down-weighting information with the appropriate prior is preferable to entirely ignoring it. Rather than because of their simplicity, our analyses suggest heuristics perform well because they implement strong priors that approximate the actual structure of the environment. We end by considering how new heuristics could be derived by infinitely strengthening the priors of other Bayesian models. These formal results have implications for work in psychology, machine learning and economics. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
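
    The continuum described above can be sketched as penalized regression whose prior mean is an equal-weight (tallying-like) solution: zero penalty gives ordinary regression, and an infinitely strong penalty collapses onto the prior. The data and the equal-weight prior mean below are illustrative, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 5
X = rng.normal(size=(n, p))
true_w = np.array([0.9, 0.6, 0.4, 0.2, 0.1])
y = X @ true_w + rng.normal(scale=0.5, size=n)

def ridge_toward_tallying(X, y, lam):
    """Penalized least squares shrinking toward an equal-weight prior mean.

    lam = 0 recovers ordinary regression; lam -> infinity recovers the
    equal-weight (tallying-like) solution; intermediate lam interpolates.
    """
    p = X.shape[1]
    w0 = np.full(p, y.std() / p)  # illustrative equal-weight prior mean
    A = X.T @ X + lam * np.eye(p)
    return np.linalg.solve(A, X.T @ y + lam * w0)

w_ols = ridge_toward_tallying(X, y, 0.0)     # full-information regression
w_tally = ridge_toward_tallying(X, y, 1e8)   # limit of an extreme prior
```

Sweeping `lam` between these extremes yields the continuum of models in which, per the paper, intermediate prior strengths tend to predict best out of sample.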

  2. [Patient information prior to sterilization].

    PubMed

    Rasmussen, O V; Henriksen, L O; Baldur, B; Hansen, T

    1992-09-14

    The law in Denmark prescribes that the patient and the general practitioner to whom the patient directs his or her request for sterilization are obliged to confirm by their signatures that the patient has received information about sterilization and its risks and consequences. We asked 97 men and 96 women whether they had received this information prior to their sterilization. They were also asked about their knowledge of sterilization. 54% of the women and 35% of the men indicated that they had not received information. Only a few of these wished further information from the hospital doctor. Knowledge about sterilization was good. It is concluded that the information given to the patient prior to sterilization is far from optimal. The patient's signature confirming verbal information is not a sufficient safeguard. We recommend, among other things, that the patient should receive written information and that both the general practitioner and the hospital responsible for the operation should ensure that optimal information is received by the patient.

  3. An Intervention and Assessment to Improve Information Literacy

    ERIC Educational Resources Information Center

    Scharf, Davida

    2013-01-01

    Purpose: The goal of the study was to test an intervention using a brief essay as an instrument for evaluating higher-order information literacy skills in college students, while accounting for prior conditions such as socioeconomic status and prior academic achievement, and identify other predictors of information literacy through an evaluation…

  4. Filtering observations without the initial guess

    NASA Astrophysics Data System (ADS)

    Chin, T. M.; Abbondanza, C.; Gross, R. S.; Heflin, M. B.; Parker, J. W.; Soja, B.; Wu, X.

    2017-12-01

    Noisy geophysical observations sampled irregularly over space and time are often numerically "analyzed" or "filtered" before scientific usage. The standard analysis and filtering techniques based on the Bayesian principle require an "a priori" joint distribution of all the geophysical parameters of interest. However, such prior distributions are seldom fully known in practice, and best-guess mean values (e.g., "climatology" or "background" data if available) accompanied by arbitrarily set covariance values are often used in their place. It is therefore desirable to be able to exploit efficient (time-sequential) Bayesian algorithms like the Kalman filter without being forced to provide a prior distribution (i.e., an initial mean and covariance). An example of this is the estimation of the terrestrial reference frame (TRF), where the requirement for numerical precision is such that any use of a priori constraints on the observation data needs to be minimized. We present the Information Filter algorithm, a variant of the Kalman filter that does not require an initial distribution, and apply the algorithm (and an accompanying smoothing algorithm) to the TRF estimation problem. We show that the information filter allows temporal propagation of partial information on the distribution (the marginal distribution of a transformed version of the state vector), instead of the full distribution (mean and covariance) required by the standard Kalman filter. The information filter appears to be a natural choice for filtering observational data in general cases where a prior assumption on the initial estimate is not available and/or desirable. For application to data assimilation problems, reduced-order approximations of both the information filter and the square-root information filter (SRIF) have been published, and the former has previously been applied to a regional configuration of the HYCOM ocean general circulation model. Such approximation approaches are also briefly described in the presentation.
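
    The information-form measurement update can be sketched as follows: the filter carries the information matrix Y = P^-1 and information vector y = Y x instead of the mean and covariance, so Y = 0 encodes "no initial guess" exactly, which a finite covariance cannot. The observation model H, noise R, and measurements are toy values.

```python
import numpy as np

def information_update(Y, y, H, R, z):
    """One measurement update of the information filter for z = H x + noise."""
    Rinv = np.linalg.inv(R)
    Y_new = Y + H.T @ Rinv @ H   # information matrix accumulates evidence
    y_new = y + H.T @ Rinv @ z   # information vector accumulates weighted data
    return Y_new, y_new

dim = 2
Y = np.zeros((dim, dim))  # zero information: no prior distribution required
y = np.zeros(dim)

H = np.eye(dim)           # toy direct observation of the state
R = 0.1 * np.eye(dim)     # toy observation-noise covariance
for z in [np.array([1.0, 2.0]), np.array([1.2, 1.8])]:
    Y, y = information_update(Y, y, H, R, z)

# The state estimate is recoverable once Y becomes invertible, i.e. once the
# data alone have pinned down the (transformed) state.
x_hat = np.linalg.solve(Y, y)
```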

  5. Evaluating the Impact of Genomic Data and Priors on Bayesian Estimates of the Angiosperm Evolutionary Timescale.

    PubMed

    Foster, Charles S P; Sauquet, Hervé; van der Merwe, Marlien; McPherson, Hannah; Rossetto, Maurizio; Ho, Simon Y W

    2017-05-01

    The evolutionary timescale of angiosperms has long been a key question in biology. Molecular estimates of this timescale have shown considerable variation, being influenced by differences in taxon sampling, gene sampling, fossil calibrations, evolutionary models, and choices of priors. Here, we analyze a data set comprising 76 protein-coding genes from the chloroplast genomes of 195 taxa spanning 86 families, including novel genome sequences for 11 taxa, to evaluate the impact of models, priors, and gene sampling on Bayesian estimates of the angiosperm evolutionary timescale. Using a Bayesian relaxed molecular-clock method, with a core set of 35 minimum and two maximum fossil constraints, we estimated that crown angiosperms arose 221 (251-192) Ma during the Triassic. Based on a range of additional sensitivity and subsampling analyses, we found that our date estimates were generally robust to large changes in the parameters of the birth-death tree prior and of the model of rate variation across branches. We found an exception to this when we implemented fossil calibrations in the form of highly informative gamma priors rather than as uniform priors on node ages. Under all other calibration schemes, including trials of seven maximum age constraints, we consistently found that the earliest divergences of angiosperm clades substantially predate the oldest fossils that can be assigned unequivocally to their crown group. Overall, our results and experiments with genome-scale data suggest that reliable estimates of the angiosperm crown age will require increased taxon sampling, significant methodological changes, and new information from the fossil record. [Angiospermae, chloroplast, genome, molecular dating, Triassic.]. © The Author(s) 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Voxel inversion of airborne electromagnetic data

    NASA Astrophysics Data System (ADS)

    Auken, E.; Fiandaca, G.; Kirkegaard, C.; Vest Christiansen, A.

    2013-12-01

    Inversion of electromagnetic data usually refers to a model space linked to the actual observation points, and for airborne surveys the spatial discretization of the model space reflects the flight lines. By contrast, geological and groundwater models most often refer to a regular voxel grid that is not correlated with the geophysical model space. This means that incorporating the geophysical data into the geological and/or hydrological modelling grids involves a spatial relocation of the models, which in itself is a subtle process in which valuable information is easily lost. The integration of prior information, e.g. from boreholes, is likewise difficult when the observation points do not coincide with the position of the prior information, as is the joint inversion of airborne and ground-based surveys. We developed a geophysical inversion algorithm working directly in a voxel grid disconnected from the actual measuring points, which allows geological/hydrogeological models to be informed directly, prior information to be incorporated more easily, and different data types to be integrated straightforwardly in joint inversion. The new voxel model space defines the soil properties (like resistivity) on a set of nodes, and the distribution of the properties is computed everywhere by means of an interpolation function f (e.g. inverse distance or kriging). The position of the nodes is fixed during the inversion and is chosen to sample the soil taking into account topography and inversion resolution. Given this definition of the voxel model space, both 1D and 2D/3D forward responses can be computed. The 1D forward responses are computed as follows: A) a 1D model subdivision, in terms of model thicknesses and direction of the "virtual" horizontal stratification, is defined for each 1D data set. For EM soundings, the "virtual" horizontal stratification is set up parallel to the topography at the sounding position. B) the "virtual" 1D models are constructed by interpolating the soil properties at the midpoint of the "virtual" layers. For 2D/3D forward responses the algorithm operates similarly, simply filling the 2D/3D meshes of the forward responses by computing the interpolation values in the centres of the mesh cells. The new definition of the voxel model space allows the geophysical information to be incorporated straightforwardly into geological and/or hydrological models, simply by defining the geophysical model space on a voxel (hydro)geological grid. This also simplifies the propagation of the uncertainty of geophysical parameters into the (hydro)geological models. Furthermore, prior information from boreholes, such as resistivity logs, can be applied directly to the voxel model space, even if the borehole positions do not coincide with the actual observation points. In fact, the prior information is constrained to the model parameters through the interpolation function at the borehole locations. The presented algorithm is a further development of the AarhusInv program package developed at Aarhus University (formerly em1dinv), which handles both large-scale AEM surveys and ground-based data. This work has been carried out as part of the HyGEM project, supported by the Danish Council of Strategic Research under grant number DSF 11-116763.
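
    The node-plus-interpolation construction in step B can be sketched with inverse-distance weighting as the interpolation function f. The node positions, resistivity values, and virtual-layer depths below are illustrative, not from an actual survey.

```python
import numpy as np

# Fixed model-space nodes (x, depth) with log-resistivity values: a toy
# two-layer earth, conductive near the surface and resistive at depth.
nodes = np.array([[0.0, 0.0], [100.0, 0.0], [0.0, 50.0], [100.0, 50.0]])
log_res = np.log10(np.array([30.0, 30.0, 300.0, 300.0]))  # ohm-m

def idw(points, values, query, power=2.0, eps=1e-12):
    """Inverse-distance-weighted interpolation of node values at a query point."""
    d = np.linalg.norm(points - query, axis=1)
    if d.min() < eps:                 # query coincides with a node
        return float(values[np.argmin(d)])
    w = 1.0 / d ** power
    return float(np.sum(w * values) / np.sum(w))

# "Virtual" 1D model under a sounding at x = 50 m: evaluate f at the midpoint
# depth of each virtual layer, then map back from log-resistivity.
layer_mids = np.array([5.0, 15.0, 30.0, 45.0])
virtual_model = [10 ** idw(nodes, log_res, np.array([50.0, z]))
                 for z in layer_mids]
```

Replacing `idw` with kriging (or any other interpolant) changes f without changing the construction, which is the point of decoupling the model space from the observation points.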

  7. The neural correlates of gist-based true and false recognition

    PubMed Central

    Gutchess, Angela H.; Schacter, Daniel L.

    2012-01-01

    When information is thematically related to previously studied information, gist-based processes contribute to false recognition. Using functional MRI, we examined the neural correlates of gist-based recognition as a function of increasing numbers of studied exemplars. Sixteen participants incidentally encoded small, medium, and large sets of pictures, and we compared the neural response at recognition using parametric modulation analyses. For hits, regions in middle occipital, middle temporal, and posterior parietal cortex linearly modulated their activity according to the number of related encoded items. For false alarms, visual, parietal, and hippocampal regions were modulated as a function of the encoded set size. The present results are consistent with prior work in that the neural regions supporting veridical memory also contribute to false memory for related information. The results also reveal that these regions respond to the degree of relatedness among similar items, and implicate perceptual and constructive processes in gist-based false memory. PMID:22155331

  8. Barriers and facilitators to ED physician use of the test and treatment for BPPV

    PubMed Central

    Forman, Jane; Damschroder, Laura; Telian, Steven A.; Fagerlin, Angela; Johnson, Patricia; Brown, Devin L.; An, Lawrence C.; Morgenstern, Lewis B.; Meurer, William J.

    2017-01-01

    Abstract Background: The test and treatment for benign paroxysmal positional vertigo (BPPV) are evidence-based practices supported by clinical guideline statements. Yet these practices are underutilized in the emergency department (ED) and interventions to promote their use are needed. To inform the development of an intervention, we interviewed ED physicians to explore barriers and facilitators to the current use of the Dix-Hallpike test (DHT) and the canalith repositioning maneuver (CRM). Methods: We conducted semi-structured in-person interviews with ED physicians who were recruited at annual ED society meetings in the United States. We analyzed data thematically using qualitative content analysis methods. Results: Based on 50 interviews with ED physicians, barriers that contributed to infrequent use of DHT/CRM that emerged were (1) prior negative experiences or forgetting how to perform them and (2) reliance on the history of present illness to identify BPPV, or using the DHT but misattributing patterns of nystagmus. Based on participants' responses, the principal facilitator of DHT/CRM use was prior positive experiences using these, even if infrequent. When asked which clinical supports would facilitate more frequent use of DHT/CRM, participants agreed supports needed to be brief, readily accessible, and easy to use, and to include well-annotated video examples. Conclusions: Interventions to promote the use of the DHT/CRM in the ED need to overcome prior negative experiences with the DHT/CRM, overreliance on the history of present illness, and the underuse and misattribution of patterns of nystagmus. Future resources need to be sensitive to provider preferences for succinct information and video examples. PMID:28680765

  9. A web-based intervention to promote applications for rehabilitation: a study protocol for a randomized controlled trial.

    PubMed

    Spanier, Katja; Streibelt, Marco; Ünalan, Firat; Bethge, Matthias

    2015-09-29

    The German welfare system follows the principle of "rehabilitation rather than pension," but more than half of all disability pensioners did not utilize medical rehabilitation before their early retirement. A major barrier is the application procedure. Lack of information about the opportunity to utilize rehabilitation services restricts the chance to improve work ability and to prevent health-related early retirement through rehabilitation programs. The establishment of new access paths to medical rehabilitation services was therefore identified as a major challenge for rehabilitation research in a recent expert report. Thus, a web-based information guide was developed to support the application for a medical rehabilitation program. For this study, the development of the web-based information guide was based on the health action process approach. Four modules were established. Three modules support forming an intention by strengthening risk perception (module 1), positive outcome expectancies (module 2) and self-efficacy (module 3). A fourth module aims at the realization of actual behavior by offering instructions on how to plan and pursue the application process. The study on the effectiveness of the web-based information guide will be performed as a randomized controlled trial. Persons aged 40 to 59 years with prior sick leave benefits during the preceding year will be included. A sample of 16,000 persons will be randomly drawn from the registers of 3 pension insurance agencies. These persons will receive a questionnaire to determine baseline characteristics. Respondents to this first survey will be randomly allocated either to the intervention or the control group. Both study groups will then receive letters with general information about rehabilitation. The intervention group will additionally receive a link to the web-based information guide. After 1 year, a second survey will be conducted. Additionally, administrative data will be used to determine whether participants apply for rehabilitation and finally start a rehabilitation program. The primary outcomes are the proportions of applied-for and utilized medical rehabilitation services. Secondary outcomes are cognitions on rehabilitation, self-rated work ability, health-related quality of life and perceived disability, as well as days with sick leave benefits and days of regular employment. The randomized controlled trial will provide highest-ranked evidence to clarify whether theory-driven web-based information supports access to rehabilitation services for people with prior sickness benefits. German Clinical Trials Register (Identifier: DRKS00005658, 16 January 2014).

  10. Kurtosis based weighted sparse model with convex optimization technique for bearing fault diagnosis

    NASA Astrophysics Data System (ADS)

    Zhang, Han; Chen, Xuefeng; Du, Zhaohui; Yan, Ruqiang

    2016-12-01

    The bearing failure, generating harmful vibrations, is one of the most frequent reasons for machine breakdowns. Thus, performing bearing fault diagnosis is an essential procedure for improving the reliability of the mechanical system and reducing its operating expenses. Most previous studies of rolling bearing fault diagnosis can be categorized into two main families: kurtosis-based filter methods and wavelet-based shrinkage methods. Although tremendous progress has been made, their effectiveness suffers from three potential drawbacks: firstly, fault information is often decomposed into proximal frequency bands, resulting in the impulsive feature frequency band splitting (IFFBS) phenomenon, which significantly degrades the performance of capturing the optimal information band; secondly, noise energy spreads throughout all frequency bins and contaminates fault information in the information band, especially under heavily noisy circumstances; thirdly, wavelet coefficients are shrunk equally to satisfy the sparsity constraints, and most of the feature information energy is thus eliminated unreasonably. Therefore, exploiting two pieces of prior information (i.e., that the coefficient sequence of fault information in the wavelet basis is sparse, and that the kurtosis of the envelope spectrum can accurately evaluate the information capacity of rolling bearing faults), a novel weighted sparse model and its corresponding framework for bearing fault diagnosis, coined KurWSD, is proposed in this paper. KurWSD formulates the prior information into weighted sparse regularization terms and then obtains a nonsmooth convex optimization problem. The alternating direction method of multipliers (ADMM) is employed to solve this problem, and the fault information is extracted through the estimated wavelet coefficients. Compared with state-of-the-art methods, KurWSD overcomes the three drawbacks and combines the advantages of both families of tools. KurWSD has three main advantages: firstly, all the characteristic information scattered in proximal sub-bands is gathered by synthesizing the impulse-dominant sub-band signals, which eliminates the dilemma of the IFFBS phenomenon. Secondly, the noise in the focused sub-bands can be alleviated efficiently by shrinking or removing the dense wavelet coefficients of Gaussian noise. Lastly, wavelet coefficients with fault information are reliably detected and preserved by manipulating wavelet coefficients discriminatively based on their contribution to the impulsive components. Moreover, the reliability and effectiveness of KurWSD are demonstrated with simulated and experimental signals.
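
    The two ingredients the abstract combines, sparsity via soft-thresholding and kurtosis as a per-sub-band measure of impulsive fault content, can be sketched as below. The function names and the simple "divide the threshold by the kurtosis weight" rule are illustrative assumptions, not the paper's exact KurWSD formulation.

```python
import numpy as np

def kurtosis(x):
    """Sample kurtosis: ~3 for Gaussian noise, large for impulsive signals."""
    x = x - x.mean()
    return float(np.mean(x ** 4) / (np.mean(x ** 2) ** 2 + 1e-12))

def weighted_soft_threshold(coeff_bands, base_lambda, weights):
    """Shrink each sub-band's coefficients with a weight-dependent threshold.

    High-kurtosis (impulsive, fault-related) sub-bands get a smaller
    effective threshold, so their coefficients are preserved, unlike the
    equal-shrinkage scheme the paper criticizes.
    """
    out = []
    for c, w in zip(coeff_bands, weights):
        lam = base_lambda / (w + 1e-12)
        out.append(np.sign(c) * np.maximum(np.abs(c) - lam, 0.0))
    return out

rng = np.random.default_rng(2)
noise_band = rng.normal(scale=1.0, size=256)       # Gaussian-noise sub-band
impulse_band = rng.normal(scale=0.1, size=256)
impulse_band[::32] += 5.0                          # periodic fault impulses

bands = [noise_band, impulse_band]
weights = [kurtosis(b) for b in bands]             # ~3 vs. much larger
denoised = weighted_soft_threshold(bands, base_lambda=6.0, weights=weights)
```

The noise band is shrunk almost to zero while the impulses survive, which is the qualitative behavior the weighted regularization terms are designed to achieve.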

  11. Uninformative contexts support word learning for high-skill spellers.

    PubMed

    Eskenazi, Michael A; Swischuk, Natascha K; Folk, Jocelyn R; Abraham, Ashley N

    2018-04-30

    The current study investigated how high-skill spellers and low-skill spellers incidentally learn words during reading. The purpose of the study was to determine whether readers can use uninformative contexts to support word learning after forming a lexical representation for a novel word, consistent with instance-based resonance processes. Previous research has found that uninformative contexts damage word learning; however, there may have been insufficient exposure to informative contexts (only one) prior to exposure to uninformative contexts (Webb, 2007; Webb, 2008). In Experiment 1, participants read sentences with one novel word (i.e., blaph, clurge) embedded in them in three different conditions: Informative (six informative contexts to support word learning), Mixed (three informative contexts followed by three uninformative contexts), and Uninformative (six uninformative contexts). Experiment 2 added a new condition with only three informative contexts to further clarify the conclusions of Experiment 1. Results indicated that uninformative contexts can support word learning, but only for high-skill spellers. Further, when participants learned the spelling of the novel word, they were more likely to learn the meaning of that word. This effect was much larger for high-skill spellers than for low-skill spellers. Results are consistent with the Lexical Quality Hypothesis (LQH) in that high-skill spellers form stronger orthographic representations which support word learning (Perfetti, 2007). Results also support an instance-based resonance process of word learning in that prior informative contexts can be reactivated to support word learning in future contexts (Bolger, Balass, Landen, & Perfetti, 2008; Balass, Nelson, & Perfetti, 2010; Reichle & Perfetti, 2003). (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  12. High-resolution atmospheric inversion of urban CO2 emissions during the dormant season of the Indianapolis Flux Experiment (INFLUX)

    NASA Astrophysics Data System (ADS)

    Lauvaux, Thomas; Miles, Natasha L.; Deng, Aijun; Richardson, Scott J.; Cambaliza, Maria O.; Davis, Kenneth J.; Gaudet, Brian; Gurney, Kevin R.; Huang, Jianhua; O'Keefe, Darragh; Song, Yang; Karion, Anna; Oda, Tomohiro; Patarasuk, Risa; Razlivanov, Igor; Sarmiento, Daniel; Shepson, Paul; Sweeney, Colm; Turnbull, Jocelyn; Wu, Kai

    2016-05-01

    Based on a uniquely dense network of surface towers continuously measuring the atmospheric concentrations of greenhouse gases (GHGs), we developed the first comprehensive monitoring system of CO2 emissions at high resolution over the city of Indianapolis. The urban inversion evaluated over the 2012-2013 dormant season showed a statistically significant increase of about 20% (from 4.5 to 5.7 MtC ± 0.23 MtC) compared to the Hestia CO2 emission estimate, a state-of-the-art building-level emission product. Spatial structures in prior emission errors, mostly undetermined, appeared to affect the spatial pattern in the inverse solution and the total carbon budget over the entire area by up to 15%, while the inverse solution remains fairly insensitive to the CO2 boundary inflow and to the different prior emissions (i.e., ODIAC). Prior to the surface emission optimization, we improved the atmospheric simulations using a meteorological data assimilation system, which also informed our Bayesian inversion system through updated observation error variances. Finally, we estimated the uncertainties associated with undetermined parameters using an ensemble of inversions. The total CO2 emissions based on the ensemble mean and quartiles (5.26-5.91 MtC) were statistically different from the prior total emissions (4.1 to 4.5 MtC). Considering the relatively small sensitivity to the different parameters, we conclude that atmospheric inversions can potentially constrain the carbon budget of the city, assuming sufficient data to measure the inflow of GHGs over the city, but additional information on prior emission error structures is required to determine the spatial structures of urban emissions at high resolution.
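
    The Bayesian flux inversion underlying such systems can be sketched in toy form: prior emissions x_b with error covariance B are updated by concentration observations y through a transport ("footprint") operator H. All dimensions and numbers below are illustrative placeholders, not INFLUX values.

```python
import numpy as np

rng = np.random.default_rng(3)
n_cells, n_obs = 4, 6
x_b = np.array([1.0, 1.0, 1.0, 1.0])               # prior emissions per cell
B = 0.25 * np.eye(n_cells)                          # prior error covariance
H = rng.uniform(0.1, 1.0, size=(n_obs, n_cells))    # toy transport footprints
R = 0.01 * np.eye(n_obs)                            # obs. error covariance

# Synthetic "true" emissions, somewhat higher than the prior, observed
# through the transport operator with noise.
x_true = np.array([1.2, 1.1, 1.3, 1.2])
y = H @ x_true + rng.normal(scale=0.1, size=n_obs)

# Standard linear-Gaussian analysis update.
K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)        # gain matrix
x_a = x_b + K @ (y - H @ x_b)                       # posterior emissions
A = (np.eye(n_cells) - K @ H) @ B                   # posterior covariance
```

The drop from `B` to `A` quantifies how much the tower data constrain the budget; the role of the prior error structure, the abstract's key caveat, enters through the off-diagonal structure assumed in `B`.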

  13. Supporting Multimedia Learning with Visual Signalling and Animated Pedagogical Agent: Moderating Effects of Prior Knowledge

    ERIC Educational Resources Information Center

    Johnson, A. M.; Ozogul, G.; Reisslein, M.

    2015-01-01

    An experiment examined the effects of visual signalling to relevant information in multiple external representations and the visual presence of an animated pedagogical agent (APA). Students learned electric circuit analysis using a computer-based learning environment that included Cartesian graphs, equations and electric circuit diagrams. The…

  14. The Role of Structure in Learning Non-Euclidean Geometry

    ERIC Educational Resources Information Center

    Asmuth, Jennifer A.

    2009-01-01

    How do people learn novel mathematical information that contradicts prior knowledge? The focus of this thesis is the role of structure in the acquisition of knowledge about hyperbolic geometry, a non-Euclidean geometry. In a series of three experiments, I contrast a more holistic structure--training based on closed figures--with a mathematically…

  15. Understanding Practitioners' Characteristics and Perspectives Prior to the Dissemination of an Evidence-Based Intervention

    ERIC Educational Resources Information Center

    Baumann, Barbara L.; Kolko, David J.; Collins, Kathryn; Herschell, Amy D.

    2006-01-01

    Objectives: To describe the characteristics and repertoires of community practitioners serving families involved in child physical abuse that may inform training and treatment dissemination efforts. The aims are to: (a) describe the background characteristics of these clinicians; (b) document their most common intervention techniques; (c) examine…

  16. 37 CFR 11.10 - Restrictions on practice in patent matters.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... or peripheral issue. A finding of substantiality should be based not only on the effort devoted to a... sufficient to suggest the relationship of the prior matter to his or her former office, e.g., technology... Department of Commerce for information concerning applicable post-employment restrictions. (d) An employee of...

  17. Knowledge of Algebra for Teaching: A Framework of Knowledge and Practices

    ERIC Educational Resources Information Center

    McCrory, Raven; Floden, Robert; Ferrini-Mundy, Joan; Reckase, Mark D.; Senk, Sharon L.

    2012-01-01

    Defining what teachers need to know to teach algebra successfully is important for informing teacher preparation and professional development efforts. Based on prior research, analysis of video, interviews with teachers, and analysis of textbooks, we define categories of knowledge and practices of teaching for understanding and assessing teachers'…

  18. Bayesian Structural Equation Modeling: A More Flexible Representation of Substantive Theory

    ERIC Educational Resources Information Center

    Muthen, Bengt; Asparouhov, Tihomir

    2012-01-01

    This article proposes a new approach to factor analysis and structural equation modeling using Bayesian analysis. The new approach replaces parameter specifications of exact zeros with approximate zeros based on informative, small-variance priors. It is argued that this produces an analysis that better reflects substantive theories. The proposed…

  19. The New Philanthropist: Eric Schnell--Ohio State University

    ERIC Educational Resources Information Center

    Library Journal, 2005

    2005-01-01

    As head of information technology at the Prior Health Sciences Library, Eric Schnell likes to improve products that don't fully meet his library's purposes. His first major software product, the award-winning Prospero Electronic Delivery Project, is a web-based document delivery system designed to complement Ariel[R] by converting documents to a…

  20. New formulae for estimating stature in the Balkans.

    PubMed

    Ross, Ann H; Konigsberg, Lyle W

    2002-01-01

Recent studies of secular change and allometry have observed differential limb proportions between the sexes, among and within populations. These studies suggest that stature prediction formulae developed from American Whites may be inappropriate for European populations. The purpose of this investigation is to present more appropriate stature prediction equations for use in the Balkans to aid present-day identifications of the victims of genocide. The reference sample totals 545 white males obtained from World War II data. The Eastern European sample totals 177 males and includes both Bosnian and Croatian victims of the recent war. Mean stature for Eastern Europeans was obtained from the literature. Results show that formulae based on Trotter and Gleser systematically underestimate stature in the Balkans. Because Eastern Europeans are taller than American Whites, it is appropriate to use this as an "informative prior" that can be applied to future cases. This informative prior can be used in predictive formulae, since it is probably similar to the sample from which the Balkan forensic cases were drawn. Based on Bayes' Theorem, new predictive stature formulae are presented for Eastern Europeans.
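As a concrete illustration of how such an informative prior operates, the conjugate normal-normal update below combines a prior on stature with a regression-based estimate by precision weighting. All numbers are illustrative, not the paper's Balkan data:

```python
import math

def posterior_stature(prior_mean, prior_sd, est_mean, est_sd):
    """Combine an informative prior on stature with a regression-based
    estimate via the conjugate normal-normal update (precision weighting)."""
    w_prior = 1.0 / prior_sd**2          # prior precision
    w_est = 1.0 / est_sd**2              # likelihood precision
    post_var = 1.0 / (w_prior + w_est)
    post_mean = post_var * (w_prior * prior_mean + w_est * est_mean)
    return post_mean, math.sqrt(post_var)

# Illustrative numbers only: a prior centered on a taller population mean
# pulls a femur-based regression estimate upward, with reduced uncertainty.
mean, sd = posterior_stature(prior_mean=178.0, prior_sd=6.0,
                             est_mean=172.0, est_sd=4.0)
```

The posterior lands between the prior mean and the data-based estimate, closer to whichever is more precise.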

  1. An information-based approach to change-point analysis with applications to biophysics and cell biology.

    PubMed

    Wiggins, Paul A

    2015-07-21

    This article describes the application of a change-point algorithm to the analysis of stochastic signals in biological systems whose underlying state dynamics consist of transitions between discrete states. Applications of this analysis include molecular-motor stepping, fluorophore bleaching, electrophysiology, particle and cell tracking, detection of copy number variation by sequencing, tethered-particle motion, etc. We present a unified approach to the analysis of processes whose noise can be modeled by Gaussian, Wiener, or Ornstein-Uhlenbeck processes. To fit the model, we exploit explicit, closed-form algebraic expressions for maximum-likelihood estimators of model parameters and estimated information loss of the generalized noise model, which can be computed extremely efficiently. We implement change-point detection using the frequentist information criterion (which, to our knowledge, is a new information criterion). The frequentist information criterion specifies a single, information-based statistical test that is free from ad hoc parameters and requires no prior probability distribution. We demonstrate this information-based approach in the analysis of simulated and experimental tethered-particle-motion data. Copyright © 2015 Biophysical Society. Published by Elsevier Inc. All rights reserved.
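The core of change-point detection by penalized maximum likelihood can be sketched as follows. This toy assumes a single mean shift with known Gaussian noise and uses a BIC-style penalty as a stand-in for the paper's frequentist information criterion:

```python
import math

def best_change_point(x, sigma=1.0):
    """Scan a 1-D signal for a single mean shift. Returns (index, gain):
    the split maximizing the log-likelihood improvement over a no-change
    model, for Gaussian noise with known sigma. A BIC-style penalty is
    used here in place of the paper's frequentist information criterion."""
    n = len(x)
    def sse(seg):
        m = sum(seg) / len(seg)
        return sum((v - m) ** 2 for v in seg)
    sse0 = sse(x)
    best_k, best_gain = None, 0.0
    for k in range(1, n):
        gain = (sse0 - sse(x[:k]) - sse(x[k:])) / (2 * sigma**2)
        if gain > best_gain:
            best_k, best_gain = k, gain
    penalty = 0.5 * math.log(n)   # cost of the extra mean parameter
    return (best_k, best_gain) if best_gain > penalty else (None, best_gain)

# A clear mean shift between samples 3 and 4 is detected at index 4.
signal = [0.1, -0.2, 0.0, 0.2, 5.1, 4.9, 5.2, 5.0]
k, gain = best_change_point(signal)
```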

  2. Dissecting effects of complex mixtures: who's afraid of informative priors?

    PubMed

    Thomas, Duncan C; Witte, John S; Greenland, Sander

    2007-03-01

    Epidemiologic studies commonly investigate multiple correlated exposures, which are difficult to analyze appropriately. Hierarchical modeling provides a promising approach for analyzing such data by adding a higher-level structure or prior model for the exposure effects. This prior model can incorporate additional information on similarities among the correlated exposures and can be parametric, semiparametric, or nonparametric. We discuss the implications of applying these models and argue for their expanded use in epidemiology. While a prior model adds assumptions to the conventional (first-stage) model, all statistical methods (including conventional methods) make strong intrinsic assumptions about the processes that generated the data. One should thus balance prior modeling assumptions against assumptions of validity, and use sensitivity analyses to understand their implications. In doing so - and by directly incorporating into our analyses information from other studies or allied fields - we can improve our ability to distinguish true causes of disease from noise and bias.
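A minimal sketch of the shrinkage such a second-stage prior induces, using semi-Bayes precision weighting; the prior mean and variance here are illustrative, not from any particular study:

```python
def semi_bayes_shrink(beta_hat, se, prior_mean=0.0, prior_var=0.5):
    """Semi-Bayes shrinkage for a log relative risk: weight the
    conventional estimate and the second-stage prior mean by their
    precisions. prior_var encodes how similar the effects of correlated
    exposures are believed to be (illustrative value)."""
    w = prior_var / (prior_var + se**2)
    return w * beta_hat + (1 - w) * prior_mean

# A noisy estimate (log RR = 1.2, SE = 1.0) is pulled strongly toward the
# prior mean of 0; a precise one (SE = 0.2) barely moves.
noisy = semi_bayes_shrink(1.2, 1.0)
precise = semi_bayes_shrink(1.2, 0.2)
```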

  3. Conceptual issues in Bayesian divergence time estimation

    PubMed Central

    2016-01-01

    Bayesian inference of species divergence times is an unusual statistical problem, because the divergence time parameters are not identifiable unless both fossil calibrations and sequence data are available. Commonly used marginal priors on divergence times derived from fossil calibrations may conflict with node order on the phylogenetic tree causing a change in the prior on divergence times for a particular topology. Care should be taken to avoid confusing this effect with changes due to informative sequence data. This effect is illustrated with examples. A topology-consistent prior that preserves the marginal priors is defined and examples are constructed. Conflicts between fossil calibrations and relative branch lengths (based on sequence data) can cause estimates of divergence times that are grossly incorrect, yet have a narrow posterior distribution. An example of this effect is given; it is recommended that overly narrow posterior distributions of divergence times should be carefully scrutinized. This article is part of the themed issue ‘Dating species divergences using rocks and clocks’. PMID:27325831

  4. Conceptual issues in Bayesian divergence time estimation.

    PubMed

    Rannala, Bruce

    2016-07-19

Bayesian inference of species divergence times is an unusual statistical problem, because the divergence time parameters are not identifiable unless both fossil calibrations and sequence data are available. Commonly used marginal priors on divergence times derived from fossil calibrations may conflict with node order on the phylogenetic tree causing a change in the prior on divergence times for a particular topology. Care should be taken to avoid confusing this effect with changes due to informative sequence data. This effect is illustrated with examples. A topology-consistent prior that preserves the marginal priors is defined and examples are constructed. Conflicts between fossil calibrations and relative branch lengths (based on sequence data) can cause estimates of divergence times that are grossly incorrect, yet have a narrow posterior distribution. An example of this effect is given; it is recommended that overly narrow posterior distributions of divergence times should be carefully scrutinized. This article is part of the themed issue 'Dating species divergences using rocks and clocks'. © 2016 The Author(s).

  5. Does prior domain-specific content knowledge influence students' recall of arguments surrounding interdisciplinary topics?

    PubMed

    Schmidt, Hiemke K; Rothgangel, Martin; Grube, Dietmar

    2017-12-01

    Awareness of various arguments can help interactants present opinions, stress points, and build counterarguments during discussions. At school, some topics are taught in a way that students learn to accumulate knowledge and gather arguments, and later employ them during debates. Prior knowledge may facilitate recalling information on well structured, fact-based topics, but does it facilitate recalling arguments during discussions on complex, interdisciplinary topics? We assessed the prior knowledge in domains related to a bioethical topic of 277 students from Germany (approximately 15 years old), their interest in the topic, and their general knowledge. The students read a text with arguments for and against prenatal diagnostics and tried to recall the arguments one week later and again six weeks later. Prior knowledge in various domains related to the topic individually and separately helped students recall the arguments. These relationships were independent of students' interest in the topic and their general knowledge. Copyright © 2017 The Foundation for Professionals in Services for Adolescents. Published by Elsevier Ltd. All rights reserved.

  6. Abdominal multi-organ CT segmentation using organ correlation graph and prediction-based shape and location priors.

    PubMed

    Okada, Toshiyuki; Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki; Sato, Yoshinobu

    2013-01-01

    The paper addresses the automated segmentation of multiple organs in upper abdominal CT data. We propose a framework of multi-organ segmentation which is adaptable to any imaging conditions without using intensity information in manually traced training data. The features of the framework are as follows: (1) the organ correlation graph (OCG) is introduced, which encodes the spatial correlations among organs inherent in human anatomy; (2) the patient-specific organ shape and location priors obtained using OCG enable the estimation of intensity priors from only target data and optionally a number of untraced CT data of the same imaging condition as the target data. The proposed methods were evaluated through segmentation of eight abdominal organs (liver, spleen, left and right kidney, pancreas, gallbladder, aorta, and inferior vena cava) from 86 CT data obtained by four imaging conditions at two hospitals. The performance was comparable to the state-of-the-art method using intensity priors constructed from manually traced data.

  7. A new approach for reducing beam hardening artifacts in polychromatic X-ray computed tomography using more accurate prior image.

    PubMed

    Wang, Hui; Xu, Yanan; Shi, Hongli

    2018-03-15

Metal artifacts severely degrade CT image quality in clinical diagnosis and are difficult to remove, especially beam hardening artifacts. Metal artifact reduction (MAR) methods based on prior images are the most frequently used. However, most prior images contain considerable misclassification caused by the absence of prior information, such as the spectrum distribution of the X-ray beam source, especially when multiple or large metal objects are included. This work aims to identify a more accurate prior image to improve image quality. The proposed method includes four steps. First, the metal image is segmented by thresholding an initial image, and the metal traces are identified in the initial projection data using the forward projection of the metal image. Second, an accurate absorbent model of the metal image is calculated according to the spectrum distribution of the X-ray beam source and the energy-dependent attenuation coefficients of the metal. Third, a new metal image is reconstructed by a general analytical reconstruction algorithm such as filtered back projection (FBP). The prior image is obtained by segmenting the difference image between the initial image and the new metal image into air, tissue, and bone. Fourth, the initial projection data are normalized by dividing them, pixel by pixel, by the projection data of the prior image. The final corrected image is obtained by interpolation, denormalization, and reconstruction. Several clinical images with dental fillings and knee prostheses were used to compare the proposed algorithm against the normalized metal artifact reduction (NMAR) and linear interpolation (LI) methods. The results demonstrate that the artifacts are reduced efficiently by the proposed method, which obtains a more exact prior image by exploiting prior information about the X-ray beam source and the energy-dependent attenuation coefficients of the metal. As a result, better reduction of beam hardening artifacts can be achieved. Moreover, the process is simple, adds little computational burden, and outperforms the other algorithms when multiple and/or large implants are included.
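The normalization step described above can be sketched on a toy 1-D projection; the arrays and the simple linear interpolation across the metal trace are illustrative stand-ins for real sinogram processing:

```python
import numpy as np

def normalize_and_inpaint(proj, proj_prior, metal_trace):
    """Toy 1-D illustration of prior-image normalization: divide the
    measured projection by the forward projection of the prior image,
    linearly interpolate across the metal trace in the flattened domain,
    then multiply the prior projection back (denormalization)."""
    norm = proj / proj_prior                     # normalization
    idx = np.arange(len(proj))
    good = ~metal_trace
    norm_filled = norm.copy()
    norm_filled[metal_trace] = np.interp(idx[metal_trace], idx[good], norm[good])
    return norm_filled * proj_prior              # denormalization

proj = np.array([2.0, 2.2, 9.0, 9.5, 2.1, 2.0])   # metal corrupts bins 2-3
proj_prior = np.array([2.0, 2.1, 2.2, 2.1, 2.0, 2.0])
metal = np.array([False, False, True, True, False, False])
corrected = normalize_and_inpaint(proj, proj_prior, metal)
```

Outside the metal trace the data are untouched; inside it, the inpainted values follow the prior image's anatomy rather than the corrupted measurements.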

  8. Can Bayesian models play a role in dental caries epidemiology? Evidence from an application to the BELCAP data set.

    PubMed

    Matranga, Domenica; Firenze, Alberto; Vullo, Angela

    2013-10-01

The aim of this study was to show the potential of Bayesian analysis in the statistical modelling of dental caries data. Because of the bounded nature of the dmft (DMFT) index, zero-inflated binomial (ZIB) and zero-inflated beta-binomial (ZIBB) models were considered. The effects of incorporating prior information available about the parameters of the models were also shown. The data set used in this study was the Belo Horizonte Caries Prevention (BELCAP) study (Böhning et al. (1999)), consisting of five variables collected among 797 Brazilian school children and designed to evaluate four programmes for reducing caries. Only the eight primary molar teeth were considered in the data set. A data augmentation algorithm was used for estimation. First, noninformative priors were used to express our lack of knowledge about the regression parameters. Second, prior information was incorporated about the probability of being a structural zero dmft and the probability of being caries affected in the subpopulation of susceptible children. With noninformative priors, the best-fitting model was the ZIBB. Education (OR = 0.76, 95% CrI: 0.59, 0.99), all interventions (OR = 0.46, 95% CrI: 0.35, 0.62), rinsing (OR = 0.61, 95% CrI: 0.47, 0.80) and hygiene (OR = 0.65, 95% CrI: 0.49, 0.86) were demonstrated to be factors protecting children from being caries affected. Being male increased the probability of being caries affected (OR = 1.19, 95% CrI: 1.01, 1.42). However, after incorporating informative priors, the ZIB model's estimates were not influenced, while the ZIBB model reduced deviance and confirmed the association with all interventions and rinsing only. In our application, Bayesian estimates showed similar accuracy and precision to likelihood-based estimates, although they offered many computational advantages and the possibility of expressing all forms of uncertainty in terms of probability. The overdispersion parameter could explain why the introduction of prior information had significant effects on the parameters of the ZIBB model while the ZIB estimates remained unchanged. Finally, the ZIBB model outperformed the ZIB model in capturing overdispersion in the data. © 2013 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
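A zero-inflated binomial likelihood of the kind fitted here can be written down compactly; the dmft counts and parameter values below are illustrative, not the BELCAP estimates:

```python
from math import comb, log

def zib_loglik(pi0, p, data, n=8):
    """Log-likelihood of a zero-inflated binomial: with probability pi0 a
    child is a structural zero (dmft = 0 regardless), otherwise the dmft
    count over the n = 8 primary molars is Binomial(n, p)."""
    ll = 0.0
    for y in data:
        binom = comb(n, y) * p**y * (1 - p)**(n - y)
        if y == 0:
            # a zero can arise from either mixture component
            ll += log(pi0 + (1 - pi0) * binom)
        else:
            ll += log((1 - pi0) * binom)
    return ll

dmft = [0, 0, 0, 1, 2, 4, 0, 3]          # illustrative counts
ll = zib_loglik(pi0=0.3, p=0.2, data=dmft)
```

With this many zeros, allowing some zero inflation (pi0 > 0) yields a higher likelihood than the plain binomial.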

  9. The Power Prior: Theory and Applications

    PubMed Central

    Ibrahim, Joseph G.; Chen, Ming-Hui; Gwon, Yeongjin; Chen, Fang

    2015-01-01

The power prior has been widely used in many applications covering a large number of disciplines. The power prior is intended to be an informative prior constructed from historical data. It has been used in clinical trials, genetics, health care, psychology, environmental health, engineering, economics, and business. It has also been applied for a wide variety of models and settings, both in the experimental design and analysis contexts. In this review article, we give an A to Z exposition of the power prior and its applications to date. We review its theoretical properties, variations in its formulation, statistical contexts for which it has been used, applications, and its advantages over other informative priors. We review models for which it has been used, including generalized linear models, survival models, and random effects models. Statistical areas where the power prior has been used include model selection, experimental design, hierarchical modeling, and conjugate priors. Frequentist properties of power priors in posterior inference are established and a simulation study is conducted to further examine the empirical performance of the posterior estimates with power priors. Real data analyses are given illustrating the power prior as well as the use of the power prior in the Bayesian design of clinical trials. PMID:26346180
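In the conjugate binomial case the power prior has a closed form, which the sketch below illustrates; the trial counts are hypothetical:

```python
def power_prior_beta(x0, n0, a0, x, n, a=1.0, b=1.0):
    """Posterior Beta parameters under a power prior: the historical
    binomial likelihood (x0 successes in n0 trials) is raised to
    a0 in [0, 1] before being combined with an initial Beta(a, b) prior
    and the current data (x successes in n trials). a0 = 0 ignores the
    history; a0 = 1 pools it fully."""
    post_a = a + a0 * x0 + x
    post_b = b + a0 * (n0 - x0) + (n - x)
    return post_a, post_b

# Illustrative: discount 200 historical patients to an effective 100.
pa, pb = power_prior_beta(x0=120, n0=200, a0=0.5, x=30, n=50)
post_mean = pa / (pa + pb)
```

Raising the historical likelihood to a0 = 0.5 halves the effective historical sample size, so the posterior is anchored between the historical and current response rates.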

  10. Informed walks: whispering hints to gene hunters inside networks' jungle.

    PubMed

    Bourdakou, Marilena M; Spyrou, George M

    2017-10-11

Systemic approaches offer a different point of view on the analysis of several types of molecular associations, as well as on the identification of specific gene communities in several cancer types. However, due to the lack of sufficient data needed to construct networks based on experimental evidence, statistical gene co-expression networks are widely used instead. Many efforts have been made to exploit the information hidden in these networks. However, these approaches still need to capitalize comprehensively on the prior knowledge encoded in molecular pathway associations and to improve their efficiency in discovering both exclusive subnetworks as candidate biomarkers and conserved subnetworks that may uncover common origins of several cancer types. In this study we present the development of the Informed Walks model, based on random walks that incorporate information from molecular pathways to mine candidate genes and gene-gene links. The proposed model has been applied to TCGA (The Cancer Genome Atlas) datasets from seven different cancer types, exploring the reconstructed co-expression networks of the whole set of genes and leading to highlighted subnetworks for each cancer type. We then elucidated the impact of each subnetwork on the indication of underlying exclusive and common molecular mechanisms, as well as on the short-listing of drugs with the potential to suppress the corresponding cancer type through a drug-repurposing pipeline. We have developed a method of gene subnetwork highlighting based on prior knowledge, capable of giving fruitful insights into the underlying molecular mechanisms and providing valuable input to drug-repurposing pipelines for a variety of cancer types.
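A random walk with restart, where the restart vector carries pathway-derived prior relevance, is one common way to realize such informed walks. The tiny network and prior below are hypothetical, and this is a generic sketch, not the authors' implementation:

```python
def random_walk_with_restart(adj, prior, restart=0.3, iters=200):
    """Power iteration for a random walk with restart on a small gene
    network: adj is a row-stochastic matrix of transition weights, and
    the restart vector injects pathway-derived prior relevance. Returns
    the stationary visiting probabilities."""
    n = len(prior)
    p = [1.0 / n] * n
    for _ in range(iters):
        p = [restart * prior[j] +
             (1 - restart) * sum(p[i] * adj[i][j] for i in range(n))
             for j in range(n)]
    return p

# Toy 3-gene network; the prior favors gene 0 (e.g. pathway membership).
adj = [[0.0, 0.5, 0.5],
       [0.5, 0.0, 0.5],
       [0.5, 0.5, 0.0]]
prior = [0.8, 0.1, 0.1]
scores = random_walk_with_restart(adj, prior)
```

Without the prior the symmetric walk would score all genes equally; the informed restart elevates the pathway-favored gene.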

  11. Knowledge-based IMRT treatment planning for prostate cancer.

    PubMed

    Chanyavanich, Vorakarn; Das, Shiva K; Lee, William R; Lo, Joseph Y

    2011-05-01

To demonstrate the feasibility of using a knowledge base of prior treatment plans to generate new prostate intensity modulated radiation therapy (IMRT) plans. Each new case is matched against others in the knowledge base; once the best match is identified, that clinically approved plan is used to generate the new plan. A database of 100 prostate IMRT treatment plans was assembled into an information-theoretic system. An algorithm based on mutual information was implemented to identify similar patient cases by matching 2D beam's eye view projections of contours. Ten randomly selected query cases were each matched with the most similar case from the database of prior clinically approved plans. Treatment parameters from the matched case were used to develop new treatment plans. A comparison of the differences in the dose-volume histograms between the new and the original treatment plans was performed. On average, the new knowledge-based plan achieves planning target volume coverage very comparable to that of the original plan, to within 2% as evaluated for D98, D95, and D1. Similarly, the doses to the rectum and bladder are also comparable to the original plan. For the rectum, the mean and standard deviation of the dose percentage differences for D20, D30, and D50 are 1.8% +/- 8.5%, -2.5% +/- 13.9%, and -13.9% +/- 23.6%, respectively. For the bladder, the mean and standard deviation of the dose percentage differences for D20, D30, and D50 are -5.9% +/- 10.8%, -12.2% +/- 14.6%, and -24.9% +/- 21.2%, respectively. A negative percentage difference indicates that the new plan has greater dose sparing than the original plan. The authors demonstrate a knowledge-based approach that uses prior clinically approved treatment plans to generate clinically acceptable new plans of high quality. This semiautomated approach has the potential to improve the efficiency of the treatment planning process while ensuring that high-quality plans are developed.
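Mutual information between two contour projections, the matching criterion named above, can be computed directly from joint counts; the binary masks below are toy stand-ins for beam's-eye-view projections:

```python
import math
from collections import Counter

def mutual_information(a, b):
    """Mutual information (in nats) between two equal-length binary
    masks, e.g. flattened beam's-eye-view contour projections."""
    n = len(a)
    joint = Counter(zip(a, b))
    pa = Counter(a)
    pb = Counter(b)
    mi = 0.0
    for (x, y), c in joint.items():
        # p(x,y) * log( p(x,y) / (p(x) p(y)) ), expressed with raw counts
        mi += (c / n) * math.log(c * n / (pa[x] * pb[y]))
    return mi

# Identical masks share maximal information; a shifted mask shares less.
mask = [0, 0, 1, 1, 1, 0, 0, 0]
shifted = [0, 0, 0, 1, 1, 1, 0, 0]
mi_same = mutual_information(mask, mask)
mi_shift = mutual_information(mask, shifted)
```

For identical masks the mutual information equals the mask's entropy, giving a natural upper bound for ranking candidate matches.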

  12. Lateral orbitofrontal cortex anticipates choices and integrates prior with current information

    PubMed Central

    Nogueira, Ramon; Abolafia, Juan M.; Drugowitsch, Jan; Balaguer-Ballester, Emili; Sanchez-Vives, Maria V.; Moreno-Bote, Rubén

    2017-01-01

    Adaptive behavior requires integrating prior with current information to anticipate upcoming events. Brain structures related to this computation should bring relevant signals from the recent past into the present. Here we report that rats can integrate the most recent prior information with sensory information, thereby improving behavior on a perceptual decision-making task with outcome-dependent past trial history. We find that anticipatory signals in the orbitofrontal cortex about upcoming choice increase over time and are even present before stimulus onset. These neuronal signals also represent the stimulus and relevant second-order combinations of past state variables. The encoding of choice, stimulus and second-order past state variables resides, up to movement onset, in overlapping populations. The neuronal representation of choice before stimulus onset and its build-up once the stimulus is presented suggest that orbitofrontal cortex plays a role in transforming immediate prior and stimulus information into choices using a compact state-space representation. PMID:28337990

  13. Superposing pure quantum states with partial prior information

    NASA Astrophysics Data System (ADS)

    Dogra, Shruti; Thomas, George; Ghosh, Sibasish; Suter, Dieter

    2018-05-01

The principle of superposition is an intriguing feature of quantum mechanics, which is regularly exploited in many different circumstances. A recent work [M. Oszmaniec et al., Phys. Rev. Lett. 116, 110403 (2016), 10.1103/PhysRevLett.116.110403] shows that the fundamentals of quantum mechanics restrict the process of superimposing two unknown pure states, even though it is possible to superimpose two quantum states with partial prior knowledge. The prior knowledge imposes geometrical constraints on the choice of input states. We discuss an experimentally feasible protocol to superimpose multiple pure states of a d-dimensional quantum system and carry out an explicit experimental realization for two single-qubit pure states with partial prior information on a two-qubit NMR quantum information processor.

  14. Prior medical conditions and medication use and risk of non-Hodgkin lymphoma in Connecticut United States women.

    PubMed

    Zhang, Yawei; Holford, Theodore R; Leaderer, Brian; Zahm, Shelia Hoar; Boyle, Peter; Morton, Lindsay McOmber; Zhang, Bing; Zou, Kaiyong; Flynn, Stuart; Tallini, Giovanni; Owens, Patricia H; Zheng, Tongzhang

    2004-05-01

To further investigate the role of prior medical conditions and medication use in the etiology of non-Hodgkin lymphoma (NHL), we analyzed the data from a population-based case-control study of NHL in Connecticut women. A total of 601 histologically confirmed incident cases of NHL and 717 population-based controls were included in this study. In-person interviews were administered using standardized, structured questionnaires to collect information on medical conditions and medication use. An increased risk was found among women who had a history of autoimmune disorders (such as rheumatoid arthritis, lupus erythematosus, Sjogren's syndrome, and multiple sclerosis), anemia, eczema, or psoriasis. An increased risk was also observed among women who had used steroidal anti-inflammatory drugs and tranquilizers. A reduced risk was found for women who had scarlet fever or who had used estrogen replacement therapy, aspirin, medications for non-insulin dependent diabetes, HMG-CoA reductase inhibitors, or beta-adrenergic blocking agents. Risk associated with past medical history appeared to vary by NHL subtype, but the results were based on small numbers of exposed subjects. A relationship between certain prior medical conditions and medication use and risk of NHL was observed in this study. Further studies are warranted to confirm our findings.

  15. The Best of Both Worlds

    PubMed Central

    Ter Wal, Anne L.J.; Alexy, Oliver; Block, Jörn; Sandner, Philipp G.

    2016-01-01

    Open networks give actors non-redundant information that is diverse, while closed networks offer redundant information that is easier to interpret. Integrating arguments about network structure and the similarity of actors’ knowledge, we propose two types of network configurations that combine diversity and ease of interpretation. Closed-diverse networks offer diversity in actors’ knowledge domains and shared third-party ties to help in interpreting that knowledge. In open-specialized networks, structural holes offer diversity, while shared interpretive schema and overlap between received information and actors’ prior knowledge help in interpreting new information without the help of third parties. In contrast, actors in open-diverse networks suffer from information overload due to the lack of shared schema or overlapping prior knowledge for the interpretation of diverse information, and actors in closed-specialized networks suffer from overembeddedness because they cannot access diverse information. Using CrunchBase data on early-stage venture capital investments in the U.S. information technology sector, we test the effect of investors’ social capital on the success of their portfolio ventures. We find that ventures have the highest chances of success if their syndicating investors have either open-specialized or closed-diverse networks. These effects are manifested beyond the direct effects of ventures’ or investors’ quality and are robust to controlling for the possibility that certain investors could have chosen more promising ventures at the time of first funding. PMID:27499546

  16. Bias in diet determination: incorporating traditional methods in Bayesian mixing models.

    PubMed

    Franco-Trecu, Valentina; Drago, Massimiliano; Riet-Sapriza, Federico G; Parnell, Andrew; Frau, Rosina; Inchausti, Pablo

    2013-01-01

There are no "universal methods" to determine the diet composition of predators. Most traditional methods are biased because of their reliance on differential digestibility and the recovery of hard items. By relying on assimilated food, stable isotope and Bayesian mixing models (SIMMs) resolve many biases of traditional methods. SIMMs can incorporate prior information (i.e. proportional diet composition) that may improve the precision of the estimated dietary composition. However, few studies have assessed the performance of traditional methods and SIMMs with and without informative priors in studying predators' diets. Here we compare the diet compositions of the South American fur seal and sea lions obtained by scat analysis and by SIMMs with uninformative priors (SIMMs-UP), and assess whether informative priors from the scat analysis (SIMMs-IP) improved the estimated diet composition compared to SIMMs-UP. According to the SIMM-UP, pelagic species dominated the fur seal's diet, whereas the sea lion's diet showed no clear dominance of any prey. In contrast, SIMM-IP diet compositions were dominated by the same prey as in the scat analyses. When prior information influenced SIMM estimates, incorporating informative priors improved the precision of the estimated diet composition at the risk of inducing biases in the estimates. If prey isotopic data allow discriminating prey contributions to diets, informative priors should lead to more precise but unbiased estimates of diet composition. Just as estimates of diet composition obtained from traditional methods are interpreted critically because of their biases, care must be exercised when interpreting diet compositions obtained by SIMMs-IP. The best approach to obtaining a near-complete view of a predator's diet composition should involve the simultaneous consideration of different sources of partial evidence (traditional methods, SIMM-UP and SIMM-IP) in light of the natural history of the predator species, so as to reliably ascertain and weight the information yielded by each method.
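Underlying any SIMM is a linear mixing model; in the simplest two-source, one-isotope case the source proportion even has a closed form (the delta values below are illustrative):

```python
def two_source_mixing(d_mix, d_a, d_b):
    """Closed-form proportion of source A in a two-source, one-isotope
    linear mixing model: d_mix = p * d_a + (1 - p) * d_b."""
    return (d_mix - d_b) / (d_a - d_b)

# Illustrative delta-13C values: a consumer at -17 per mil between a
# pelagic source at -19 and a benthic source at -14.
p_pelagic = two_source_mixing(d_mix=-17.0, d_a=-19.0, d_b=-14.0)
```

Full SIMMs generalize this to many sources and isotopes, placing a (possibly informative) Dirichlet prior on the proportions and propagating isotopic uncertainty, which is why they require MCMC rather than a closed form.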

  17. Bias in Diet Determination: Incorporating Traditional Methods in Bayesian Mixing Models

    PubMed Central

    Franco-Trecu, Valentina; Drago, Massimiliano; Riet-Sapriza, Federico G.; Parnell, Andrew; Frau, Rosina; Inchausti, Pablo

    2013-01-01

There are no “universal methods” to determine the diet composition of predators. Most traditional methods are biased because of their reliance on differential digestibility and the recovery of hard items. By relying on assimilated food, stable isotope and Bayesian mixing models (SIMMs) resolve many biases of traditional methods. SIMMs can incorporate prior information (i.e. proportional diet composition) that may improve the precision of the estimated dietary composition. However, few studies have assessed the performance of traditional methods and SIMMs with and without informative priors in studying predators’ diets. Here we compare the diet compositions of the South American fur seal and sea lions obtained by scat analysis and by SIMMs with uninformative priors (SIMMs-UP), and assess whether informative priors from the scat analysis (SIMMs-IP) improved the estimated diet composition compared to SIMMs-UP. According to the SIMM-UP, pelagic species dominated the fur seal’s diet, whereas the sea lion’s diet showed no clear dominance of any prey. In contrast, SIMM-IP diet compositions were dominated by the same prey as in the scat analyses. When prior information influenced SIMM estimates, incorporating informative priors improved the precision of the estimated diet composition at the risk of inducing biases in the estimates. If prey isotopic data allow discriminating prey contributions to diets, informative priors should lead to more precise but unbiased estimates of diet composition. Just as estimates of diet composition obtained from traditional methods are interpreted critically because of their biases, care must be exercised when interpreting diet compositions obtained by SIMMs-IP. The best approach to obtaining a near-complete view of a predator’s diet composition should involve the simultaneous consideration of different sources of partial evidence (traditional methods, SIMM-UP and SIMM-IP) in light of the natural history of the predator species, so as to reliably ascertain and weight the information yielded by each method. PMID:24224031

  18. A Calibrated Power Prior Approach to Borrow Information from Historical Data with Application to Biosimilar Clinical Trials.

    PubMed

    Pan, Haitao; Yuan, Ying; Xia, Jielai

    2017-11-01

A biosimilar refers to a follow-on biologic intended to be approved for marketing based on biosimilarity to an existing patented biological product (i.e., the reference product). To develop a biosimilar product, it is essential to demonstrate biosimilarity between the follow-on biologic and the reference product, typically through two-arm randomized trials. We propose a Bayesian adaptive design for trials to evaluate biosimilar products. To take advantage of the abundant historical data on the efficacy of the reference product that is typically available at the time a biosimilar product is developed, we propose the calibrated power prior, which allows our design to adaptively borrow information from the historical data according to the congruence between the historical data and the new data collected from the current trial. We propose a new measure, the Bayesian biosimilarity index, to measure the similarity between the biosimilar and the reference product. During the trial, we evaluate the Bayesian biosimilarity index in a group sequential fashion based on the accumulating interim data, and stop the trial early once there is enough information to conclude or reject the similarity. Extensive simulation studies show that the proposed design has higher power than traditional designs. We applied the proposed design to a biosimilar trial for treating rheumatoid arthritis.
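One way to make the power-prior discount a0 depend on the congruence between historical and current data is sketched below; the exponential-decay calibration and the tuning constant gamma are hypothetical stand-ins for the paper's calibrated power prior:

```python
import math

def calibrated_a0(x0, n0, x, n, gamma=4.0):
    """Illustrative calibration of the power-prior discount a0: measure
    the congruence between historical and current response rates and map
    a large standardized difference to a small a0. The functional form
    (exponential decay with constant gamma) is a hypothetical stand-in
    for the calibration proposed in the paper."""
    p0, p = x0 / n0, x / n
    se = math.sqrt(p0 * (1 - p0) / n0 + p * (1 - p) / n + 1e-12)
    z = abs(p0 - p) / se                  # standardized incongruence
    return math.exp(-gamma * z**2 / (1 + z**2))

a0_close = calibrated_a0(120, 200, 31, 50)   # similar rates -> large a0
a0_far = calibrated_a0(120, 200, 15, 50)     # conflicting rates -> small a0
```

Congruent data are borrowed almost fully, while conflicting historical data are discounted toward a0 near zero, which is the adaptive-borrowing behavior the design relies on.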

  19. When Generating Answers Benefits Arithmetic Skill: The Importance of Prior Knowledge

    ERIC Educational Resources Information Center

    Rittle-Johnson, Bethany; Kmicikewycz, Alexander Oleksij

    2008-01-01

    People remember information better if they generate it while studying rather than simply reading it. However, prior research has not investigated whether this generation effect extends to related but unstudied items and has not been conducted in classroom settings. We compared third graders' success on studied and unstudied…

  20. 76 FR 11256 - Notice of Proposed Information Collection: Comment Request Loan Sales Bidder Qualification Statement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-01

    ... techniques of other forms of information technology, e.g., permitting electronic submission of responses..., Equity Size, Prior History with HUD Loans and prior sales participation. By executing the Qualification...

  1. Integrating informative priors from experimental research with Bayesian methods: an example from radiation epidemiology.

    PubMed

    Hamra, Ghassan; Richardson, David; Maclehose, Richard; Wing, Steve

    2013-01-01

    Informative priors can be a useful tool for epidemiologists to handle problems of sparse data in regression modeling. It is sometimes the case that an investigator is studying a population exposed to two agents, X and Y, where Y is the agent of primary interest. Previous research may suggest that the exposures have different effects on the health outcome of interest, one being more harmful than the other. Such information may be derived from epidemiologic analyses; however, in the case where such evidence is unavailable, knowledge can be drawn from toxicologic studies or other experimental research. Unfortunately, using toxicologic findings to develop informative priors in epidemiologic analyses requires strong assumptions, with no established method for its utilization. We present a method to help bridge the gap between animal and cellular studies and epidemiologic research by specification of an order-constrained prior. We illustrate this approach using an example from radiation epidemiology.

  2. Integrating Informative Priors from Experimental Research with Bayesian Methods

    PubMed Central

    Hamra, Ghassan; Richardson, David; MacLehose, Richard; Wing, Steve

    2013-01-01

    Informative priors can be a useful tool for epidemiologists to handle problems of sparse data in regression modeling. It is sometimes the case that an investigator is studying a population exposed to two agents, X and Y, where Y is the agent of primary interest. Previous research may suggest that the exposures have different effects on the health outcome of interest, one being more harmful than the other. Such information may be derived from epidemiologic analyses; however, in the case where such evidence is unavailable, knowledge can be drawn from toxicologic studies or other experimental research. Unfortunately, using toxicologic findings to develop informative priors in epidemiologic analyses requires strong assumptions, with no established method for its utilization. We present a method to help bridge the gap between animal and cellular studies and epidemiologic research by specification of an order-constrained prior. We illustrate this approach using an example from radiation epidemiology. PMID:23222512

  3. Meta-analysis of few small studies in orphan diseases.

    PubMed

    Friede, Tim; Röver, Christian; Wandel, Simon; Neuenschwander, Beat

    2017-03-01

    Meta-analyses in orphan diseases and small populations generally face particular problems, including small numbers of studies, small study sizes and heterogeneity of results. However, the heterogeneity is difficult to estimate if only very few studies are included. Motivated by a systematic review in immunosuppression following liver transplantation in children, we investigate the properties of a range of commonly used frequentist and Bayesian procedures in simulation studies. Furthermore, the consequences for interval estimation of the common treatment effect in random-effects meta-analysis are assessed. The Bayesian credibility intervals using weakly informative priors for the between-trial heterogeneity exhibited coverage probabilities in excess of the nominal level for a range of scenarios considered. However, they tended to be shorter than those obtained by the Knapp-Hartung method, which were also conservative. In contrast, methods based on normal quantiles exhibited coverages well below the nominal levels in many scenarios. With very few studies, the performance of the Bayesian credibility intervals is of course sensitive to the specification of the prior for the between-trial heterogeneity. In conclusion, the use of weakly informative priors as exemplified by half-normal priors (with a scale of 0.5 or 1.0) for log odds ratios is recommended for applications in rare diseases. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
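The recommended half-normal prior is easy to inspect directly. A small sketch, using only the standard library, computes quantiles of a half-normal distribution to show what a scale of 0.5 implies about between-trial heterogeneity on the log odds ratio scale.

```python
from statistics import NormalDist

def half_normal_quantile(p, scale):
    """Quantile of a half-normal(scale) distribution: |Z| * scale for
    standard normal Z, so the p-quantile is scale * Phi^-1((1 + p) / 2)."""
    return scale * NormalDist().inv_cdf((1.0 + p) / 2.0)

# With scale 0.5, half the prior mass on the heterogeneity SD tau sits
# below about 0.34 on the log-OR scale, and 95% below about 0.98:
# small-to-moderate heterogeneity, without ruling larger values out.
median_tau = half_normal_quantile(0.5, scale=0.5)
upper_tau = half_normal_quantile(0.95, scale=0.5)
```

Doubling the scale to 1.0 doubles every quantile, which is why the two scales bracket a reasonable range of weakly informative choices.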

  4. Influence of the Size of Cohorts in Adaptive Design for Nonlinear Mixed Effects Models: An Evaluation by Simulation for a Pharmacokinetic and Pharmacodynamic Model for a Biomarker in Oncology

    PubMed Central

    Lestini, Giulia; Dumont, Cyrielle; Mentré, France

    2015-01-01

    Purpose: In this study we aimed to evaluate adaptive designs (ADs) by clinical trial simulation for a pharmacokinetic-pharmacodynamic model in oncology and to compare them with one-stage designs, i.e. when no adaptation is performed, using wrong prior parameters. Methods: We evaluated two one-stage designs, ξ0 and ξ*, optimised for prior and true population parameters, Ψ0 and Ψ*, and several ADs (two-, three- and five-stage). All designs had 50 patients. For ADs, the first cohort design was ξ0. The next cohort design was optimised using prior information updated from the previous cohort. Optimal design was based on the determinant of the Fisher information matrix using PFIM. Design evaluation was performed by clinical trial simulations using data simulated from Ψ*. Results: Estimation results of two-stage ADs and ξ* were close and much better than those obtained with ξ0. The balanced two-stage AD performed better than two-stage ADs with different cohort sizes. Three- and five-stage ADs were better than two-stage with small first cohort, but not better than the balanced two-stage design. Conclusions: Two-stage ADs are useful when prior parameters are unreliable. In case of a small first cohort, more adaptations are needed, but these designs are complex to implement. PMID:26123680

  5. Influence of the Size of Cohorts in Adaptive Design for Nonlinear Mixed Effects Models: An Evaluation by Simulation for a Pharmacokinetic and Pharmacodynamic Model for a Biomarker in Oncology.

    PubMed

    Lestini, Giulia; Dumont, Cyrielle; Mentré, France

    2015-10-01

    In this study we aimed to evaluate adaptive designs (ADs) by clinical trial simulation for a pharmacokinetic-pharmacodynamic model in oncology and to compare them with one-stage designs, i.e., when no adaptation is performed, using wrong prior parameters. We evaluated two one-stage designs, ξ0 and ξ*, optimised for prior and true population parameters, Ψ0 and Ψ*, and several ADs (two-, three- and five-stage). All designs had 50 patients. For ADs, the first cohort design was ξ0. The next cohort design was optimised using prior information updated from the previous cohort. Optimal design was based on the determinant of the Fisher information matrix using PFIM. Design evaluation was performed by clinical trial simulations using data simulated from Ψ*. Estimation results of two-stage ADs and ξ* were close and much better than those obtained with ξ0. The balanced two-stage AD performed better than two-stage ADs with different cohort sizes. Three- and five-stage ADs were better than two-stage with small first cohort, but not better than the balanced two-stage design. Two-stage ADs are useful when prior parameters are unreliable. In case of a small first cohort, more adaptations are needed, but these designs are complex to implement.

  6. Knowledge Modeling in Prior Art Search

    NASA Astrophysics Data System (ADS)

    Graf, Erik; Frommholz, Ingo; Lalmas, Mounia; van Rijsbergen, Keith

    This study explores the benefits of integrating knowledge representations in prior art patent retrieval. Key to the introduced approach is the utilization of human judgment available in the form of classifications assigned to patent documents. The paper first outlines in detail how a methodology for the extraction of knowledge from such a hierarchical classification system can be established. Further, potential ways of integrating this knowledge with existing Information Retrieval paradigms in a scalable and flexible manner are investigated. Finally, based on these integration strategies, the effectiveness in terms of recall and precision is evaluated in the context of a prior art search task for European patents. As a result of this evaluation it can be established that in general the proposed knowledge expansion techniques are particularly beneficial to recall and, with respect to optimizing field retrieval settings, further result in significant precision gains.

  7. Molecular bases for unity and diversity in organic evolution

    NASA Technical Reports Server (NTRS)

    Fox, S. W.; Ruecknagel, P.; Braunitzer, G.

    1991-01-01

    The origin of biological information has been ascribed at various times to DNA, RNA, or protein. The origin of nucleic acids without the action of prior informed protein has not been supported by plausible experiments, although such possibilities have been examined. The behavior of thermal proteins and of the microspheres self-assembled therefrom explains the origin of the first cells, the first membrane, the first reproduction cycle, ancient metabolism including ATP-aided synthesis of peptides and polynucleotides, growth, bioelectricity, and polybiofunctionality in general.

  8. Rescue karyotyping: a case series of array-based comparative genomic hybridization evaluation of archival conceptual tissue

    PubMed Central

    2014-01-01

    Background: Determination of fetal aneuploidy is central to the evaluation of recurrent pregnancy loss (RPL). However, obtaining this information at the time of a miscarriage is not always possible or may not have been ordered. Here we report on “rescue karyotyping”, wherein DNA extracted from archived paraffin-embedded pregnancy loss tissue from a prior dilation and curettage (D&C) is evaluated by array-based comparative genomic hybridization (aCGH). Methods: A retrospective case series was conducted at an academic medical center. Patients included had unexplained RPL and a prior pregnancy loss for which karyotype information would be clinically informative but was unavailable. After extracting DNA from slides of archived tissue, aCGH with a reduced stringency approach was performed, allowing for analysis of partially degraded DNA. Statistics were computed using STATA v12.1 (College Station, TX). Results: Rescue karyotyping was attempted on 20 specimens from 17 women. DNA was successfully extracted in 16 samples (80.0%), enabling analysis at either high or low resolution. The longest interval from tissue collection to DNA extraction was 4.2 years. There was no significant difference in specimen sufficiency for analysis by collection-to-extraction interval (p = 0.14) or gestational age at pregnancy loss (p = 0.32). Eight specimens showed copy number variants: 3 trisomies, 2 partial chromosomal deletions, 1 mosaic abnormality and 2 unclassified variants. Conclusions: Rescue karyotyping using aCGH on DNA extracted from paraffin-embedded tissue provides the opportunity to obtain critical fetal cytogenetic information from a prior loss, even if it occurred years earlier. Given the ubiquitous archiving of paraffin-embedded tissue obtained during a D&C and the ease of obtaining results despite long loss-to-testing intervals or early gestational age at the time of fetal demise, this may provide a useful technique in the evaluation of couples with recurrent pregnancy loss. PMID:24589081

  9. Automatic information extraction from unstructured mammography reports using distributed semantics.

    PubMed

    Gupta, Anupama; Banerjee, Imon; Rubin, Daniel L

    2018-02-01

    To date, the methods developed for automated extraction of information from radiology reports are mainly rule-based or dictionary-based, and, therefore, require substantial manual effort to build these systems. Recent efforts to develop automated systems for entity detection have been undertaken, but little work has been done to automatically extract relations and their associated named entities in narrative radiology reports with accuracy comparable to rule-based methods. Our goal is to extract relations in an unsupervised way from radiology reports without specifying prior domain knowledge. We propose a hybrid approach for information extraction that combines a dependency-based parse tree with distributed semantics for generating structured information frames about particular findings/abnormalities from free-text mammography reports. The proposed IE system obtains an F1-score of 0.94 in terms of completeness of the content in the information frames, which outperforms a state-of-the-art rule-based system in this domain by a significant margin. The proposed system can be leveraged in a variety of applications, such as decision support and information retrieval, and may also easily scale to other radiology domains, since there is no need to tune the system with hand-crafted information extraction rules. Copyright © 2018 Elsevier Inc. All rights reserved.

  10. Track Everything: Limiting Prior Knowledge in Online Multi-Object Recognition.

    PubMed

    Wong, Sebastien C; Stamatescu, Victor; Gatt, Adam; Kearney, David; Lee, Ivan; McDonnell, Mark D

    2017-10-01

    This paper addresses the problem of online tracking and classification of multiple objects in an image sequence. Our proposed solution is to first track all objects in the scene without relying on object-specific prior knowledge, which in other systems can take the form of hand-crafted features or user-based track initialization. We then classify the tracked objects with a fast-learning image classifier based on a shallow convolutional neural network architecture, and demonstrate that object recognition improves when this is combined with object state information from the tracking algorithm. We argue that by transferring the use of prior knowledge from the detection and tracking stages to the classification stage, we can design a robust, general purpose object recognition system with the ability to detect and track a variety of object types. We describe our biologically inspired implementation, which adaptively learns the shape and motion of tracked objects, and apply it to the Neovision2 Tower benchmark data set, which contains multiple object types. An experimental evaluation demonstrates that our approach is competitive with state-of-the-art video object recognition systems that do make use of object-specific prior knowledge in detection and tracking, while providing additional practical advantages by virtue of its generality.

  11. MC3: Multi-core Markov-chain Monte Carlo code

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Harrington, Joseph; Lust, Nate; Foster, AJ; Stemm, Madison; Loredo, Tom; Stevenson, Kevin; Campo, Chris; Hardin, Matt; Hardy, Ryan

    2016-10-01

    MC3 (Multi-core Markov-chain Monte Carlo) is a Bayesian statistics tool that can be executed from the shell prompt or interactively through the Python interpreter with single- or multiple-CPU parallel computing. It offers Markov-chain Monte Carlo (MCMC) posterior-distribution sampling for several algorithms, Levenberg-Marquardt least-squares optimization, and uniform non-informative, Jeffreys non-informative, or Gaussian-informative priors. MC3 can share the same value among multiple parameters and fix the value of parameters to constant values, and offers Gelman-Rubin convergence testing and correlated-noise estimation with time-averaging or wavelet-based likelihood estimation methods.
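The kind of posterior sampling MC3 automates can be illustrated with a bare-bones Metropolis sampler. This is a generic sketch, not MC3's actual API, combining a Gaussian-informative prior with a Gaussian likelihood so the true posterior is known in closed form.

```python
import math
import random

def metropolis(log_post, start, step, n, seed=1):
    """Plain Metropolis sampler: symmetric Gaussian proposals, accepted
    with probability min(1, exp(log-posterior difference))."""
    rng = random.Random(seed)
    x, lp = start, log_post(start)
    chain = []
    for _ in range(n):
        prop = x + rng.gauss(0.0, step)
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:
            x, lp = prop, lp_prop
        chain.append(x)
    return chain

# Gaussian-informative prior N(0, 1) times a Gaussian likelihood N(2, 1):
# the exact posterior is N(1, 0.5), so the chain should centre near 1.
log_post = lambda x: -0.5 * x**2 - 0.5 * (x - 2.0)**2
chain = metropolis(log_post, start=0.0, step=1.0, n=20000)
mean = sum(chain) / len(chain)
```

MC3 layers algorithm choices, multi-core chains, convergence testing (Gelman-Rubin), and correlated-noise estimation on top of this basic idea.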

  12. The Influence of Prior Knowledge on the Retrieval-Directed Function of Note Taking in Prior Knowledge Activation

    ERIC Educational Resources Information Center

    Wetzels, Sandra A. J.; Kester, Liesbeth; van Merrienboer, Jeroen J. G.; Broers, Nick J.

    2011-01-01

    Background: Prior knowledge activation facilitates learning. Note taking during prior knowledge activation (i.e., note taking directed at retrieving information from memory) might facilitate the activation process by enabling learners to build an external representation of their prior knowledge. However, taking notes might be less effective in…

  13. Incorporation of stochastic engineering models as prior information in Bayesian medical device trials.

    PubMed

    Haddad, Tarek; Himes, Adam; Thompson, Laura; Irony, Telba; Nair, Rajesh

    2017-01-01

    Evaluation of medical devices via clinical trial is often a necessary step in the process of bringing a new product to market. In recent years, device manufacturers are increasingly using stochastic engineering models during the product development process. These models have the capability to simulate virtual patient outcomes. This article presents a novel method based on the power prior for augmenting a clinical trial using virtual patient data. To properly inform clinical evaluation, the virtual patient model must simulate the clinical outcome of interest, incorporating patient variability, as well as the uncertainty in the engineering model and in its input parameters. The number of virtual patients is controlled by a discount function which uses the similarity between modeled and observed data. This method is illustrated by a case study of cardiac lead fracture. Different discount functions are used to cover a wide range of scenarios in which the type I error rates and power vary for the same number of enrolled patients. Incorporation of engineering models as prior knowledge in a Bayesian clinical trial design can provide benefits of decreased sample size and trial length while still controlling type I error rate and power.
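The role of the discount function can be sketched as a map from the congruence between observed and virtual-patient data to a borrowing weight. The Gaussian form and all numbers below are illustrative assumptions, not the functional form used in the paper.

```python
import math

def discount_weight(obs_mean, sim_mean, pooled_se, scale=1.0):
    """Map the standardized distance between observed and virtual-patient
    means to a borrowing weight in (0, 1]: identical data give weight 1,
    and the weight decays as congruence worsens (assumed Gaussian decay)."""
    z = (obs_mean - sim_mean) / pooled_se
    return math.exp(-0.5 * (z / scale) ** 2)

# Hypothetical fracture rates: congruent vs discrepant engineering model.
w_close = discount_weight(obs_mean=0.10, sim_mean=0.11, pooled_se=0.05)
w_far = discount_weight(obs_mean=0.10, sim_mean=0.30, pooled_se=0.05)
```

The weight then scales the effective number of virtual patients entering the power prior, which is how such designs trade sample-size savings against type I error control.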

  14. The detection of faked identity using unexpected questions and mouse dynamics.

    PubMed

    Monaro, Merylin; Gamberini, Luciano; Sartori, Giuseppe

    2017-01-01

    The detection of faked identities is a major problem in security. Current memory-detection techniques cannot be used as they require prior knowledge of the respondent's true identity. Here, we report a novel technique for detecting faked identities based on the use of unexpected questions that may be used to check the respondent identity without any prior autobiographical information. While truth-tellers respond automatically to unexpected questions, liars have to "build" and verify their responses. This lack of automaticity is reflected in the mouse movements used to record the responses as well as in the number of errors. Responses to unexpected questions are compared to responses to expected and control questions (i.e., questions to which a liar also must respond truthfully). Parameters that encode mouse movement were analyzed using machine learning classifiers and the results indicate that the mouse trajectories and errors on unexpected questions efficiently distinguish liars from truth-tellers. Furthermore, we showed that liars may be identified also when they are responding truthfully. Unexpected questions combined with the analysis of mouse movement may efficiently spot participants with faked identities without the need for any prior information on the examinee.

  15. Selection of the effect size for sample size determination for a continuous response in a superiority clinical trial using a hybrid classical and Bayesian procedure.

    PubMed

    Ciarleglio, Maria M; Arendt, Christopher D; Peduzzi, Peter N

    2016-06-01

    When designing studies that have a continuous outcome as the primary endpoint, the hypothesized effect size (ES), that is, the hypothesized difference in means (δ) relative to the assumed variability of the endpoint (σ), plays an important role in sample size and power calculations. Point estimates for δ and σ are often calculated using historical data. However, the uncertainty in these estimates is rarely addressed. This article presents a hybrid classical and Bayesian procedure that formally integrates prior information on the distributions of δ and σ into the study's power calculation. Conditional expected power, which averages the traditional power curve using the prior distributions of δ and σ as the averaging weight, is used, and the value of ES is found that equates the prespecified frequentist power (1 − β) and the conditional expected power of the trial. This hypothesized effect size is then used in traditional sample size calculations when determining sample size for the study. The value of ES found using this method may be expressed as a function of the prior means of δ and σ and of their prior standard deviations. We show that the "naïve" estimate of the effect size, that is, the ratio of prior means, should be down-weighted to account for the variability in the parameters. An example is presented for designing a placebo-controlled clinical trial testing the antidepressant effect of alprazolam as monotherapy for major depression. Through this method, we are able to formally integrate prior information on the uncertainty and variability of both the treatment effect and the common standard deviation into the design of the study while maintaining a frequentist framework for the final analysis. Solving for the effect size that the study has a high probability of correctly detecting, based on the available prior information on the difference δ and the standard deviation σ, provides a valuable, substantiated estimate that can form the basis for discussion of the study's feasibility during the design phase. © The Author(s) 2016.
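Conditional expected power as described above can be approximated by Monte Carlo: draw the difference in means and the common standard deviation from their priors, evaluate the classical power curve at each draw, and average. The normal priors and the sample size below are hypothetical, and a two-sample z-approximation stands in for the exact power formula.

```python
import math
import random
from statistics import NormalDist

def classical_power(delta, sigma, n_per_arm, alpha=0.05):
    """Two-sample z-approximation power for a difference in means delta."""
    z_alpha = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    ncp = delta / (sigma * math.sqrt(2.0 / n_per_arm))
    return 1.0 - NormalDist().cdf(z_alpha - ncp)

def conditional_expected_power(n_per_arm, draws=20000, seed=7):
    """Average the classical power curve over hypothetical normal priors
    on delta and sigma (negative sigma draws are clipped)."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(draws):
        delta = rng.gauss(0.5, 0.1)                # prior on mean difference
        sigma = max(rng.gauss(1.0, 0.1), 1e-6)     # prior on common SD
        total += classical_power(delta, sigma, n_per_arm)
    return total / draws

cep = conditional_expected_power(n_per_arm=64)
```

Because the power curve is averaged over parameter uncertainty, the conditional expected power at a given sample size sits below the power evaluated at the prior means alone, which is the sense in which the naïve effect-size estimate must be down-weighted.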

  16. A Comparison of the Incremental Difference between the Beginning and Ending Heart Rate When Shorthand Writers Are Informed and Not Informed of Speeds of Dictation.

    ERIC Educational Resources Information Center

    Dickey, Patsy A.

    1980-01-01

    Forty female students were used to compare the incremental difference in heart rate of shorthand writers when they were informed and not informed of shorthand speeds prior to dictation. It was concluded that students' performances were enhanced by receiving instructions as to speed of dictation prior to the take. (Author/CT)

  17. Elapsed decision time affects the weighting of prior probability in a perceptual decision task

    PubMed Central

    Hanks, Timothy D.; Mazurek, Mark E.; Kiani, Roozbeh; Hopp, Elizabeth; Shadlen, Michael N.

    2012-01-01

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (i) decisions that linger tend to arise from less reliable evidence, and (ii) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal cortex (LIP) of rhesus monkeys performing this task. PMID:21525274

  18. Elapsed decision time affects the weighting of prior probability in a perceptual decision task.

    PubMed

    Hanks, Timothy D; Mazurek, Mark E; Kiani, Roozbeh; Hopp, Elisabeth; Shadlen, Michael N

    2011-04-27

    Decisions are often based on a combination of new evidence with prior knowledge of the probable best choice. Optimal combination requires knowledge about the reliability of evidence, but in many realistic situations, this is unknown. Here we propose and test a novel theory: the brain exploits elapsed time during decision formation to combine sensory evidence with prior probability. Elapsed time is useful because (1) decisions that linger tend to arise from less reliable evidence, and (2) the expected accuracy at a given decision time depends on the reliability of the evidence gathered up to that point. These regularities allow the brain to combine prior information with sensory evidence by weighting the latter in accordance with reliability. To test this theory, we manipulated the prior probability of the rewarded choice while subjects performed a reaction-time discrimination of motion direction using a range of stimulus reliabilities that varied from trial to trial. The theory explains the effect of prior probability on choice and reaction time over a wide range of stimulus strengths. We found that prior probability was incorporated into the decision process as a dynamic bias signal that increases as a function of decision time. This bias signal depends on the speed-accuracy setting of human subjects, and it is reflected in the firing rates of neurons in the lateral intraparietal area (LIP) of rhesus monkeys performing this task.

  19. Evaluation of two methods for using MR information in PET reconstruction

    NASA Astrophysics Data System (ADS)

    Caldeira, L.; Scheins, J.; Almeida, P.; Herzog, H.

    2013-02-01

    Using magnetic resonance (MR) information in maximum a posteriori (MAP) algorithms for positron emission tomography (PET) image reconstruction has been investigated in recent years. Recently, three methods to introduce this information were evaluated, and the Bowsher prior was considered the best. Its main advantage is that it does not require image segmentation. Another method that has been widely used for incorporating MR information is the use of boundaries obtained by segmentation. This method has also shown improvements in image quality. In this paper, two methods for incorporating MR information in PET reconstruction are compared. After a Bayes parameter optimization, the reconstructed images were compared using the mean squared error (MSE) and the coefficient of variation (CV). MSE values are 3% lower with the Bowsher prior than with boundaries, and CV values are 10% lower. Both methods performed better in terms of MSE and CV than using no prior, that is, maximum likelihood expectation maximization (MLEM) or MAP without anatomical information. In conclusion, incorporating MR information using the Bowsher prior gives better results in terms of MSE and CV than using boundaries. MAP algorithms were again shown to be effective in noise reduction and convergence, especially when MR information is incorporated. The robustness of the priors with respect to noise and inhomogeneities in the MR image, however, still needs to be assessed.
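The defining step of the Bowsher prior is to couple each PET voxel only to the anatomical neighbors whose MR values are most similar, so smoothing does not cross tissue boundaries. A minimal 1-D sketch of that neighbor-selection step, with hypothetical MR values:

```python
def bowsher_neighbors(mr, index, candidates, b):
    """Pick the b candidate neighbors whose MR values are closest to the
    MR value at `index`; the MAP penalty then couples only these voxels."""
    ranked = sorted(candidates, key=lambda j: abs(mr[j] - mr[index]))
    return ranked[:b]

# Hypothetical 1-D MR profile with a tissue edge between index 2 and 3:
mr = [10.0, 10.5, 10.2, 40.0, 41.0]
# For voxel 2, only same-tissue neighbors are selected, so the smoothing
# penalty never reaches across the anatomical boundary.
sel = bowsher_neighbors(mr, index=2, candidates=[0, 1, 3, 4], b=2)
```

This is why the method needs no explicit segmentation: the edge is respected implicitly through the similarity ranking.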

  20. Predictable information in neural signals during resting state is reduced in autism spectrum disorder.

    PubMed

    Brodski-Guerniero, Alla; Naumer, Marcus J; Moliadze, Vera; Chan, Jason; Althen, Heike; Ferreira-Santos, Fernando; Lizier, Joseph T; Schlitt, Sabine; Kitzerow, Janina; Schütz, Magdalena; Langer, Anne; Kaiser, Jochen; Freitag, Christine M; Wibral, Michael

    2018-04-04

    The neurophysiological underpinnings of the nonsocial symptoms of autism spectrum disorder (ASD) which include sensory and perceptual atypicalities remain poorly understood. Well-known accounts of less dominant top-down influences and more dominant bottom-up processes compete to explain these characteristics. These accounts have been recently embedded in the popular framework of predictive coding theory. To differentiate between competing accounts, we studied altered information dynamics in ASD by quantifying predictable information in neural signals. Predictable information in neural signals measures the amount of stored information that is used for the next time step of a neural process. Thus, predictable information limits the (prior) information which might be available for other brain areas, for example, to build predictions for upcoming sensory information. We studied predictable information in neural signals based on resting-state magnetoencephalography (MEG) recordings of 19 ASD patients and 19 neurotypical controls aged between 14 and 27 years. Using whole-brain beamformer source analysis, we found reduced predictable information in ASD patients across the whole brain, but in particular in posterior regions of the default mode network. In these regions, epoch-by-epoch predictable information was positively correlated with source power in the alpha and beta frequency range as well as autocorrelation decay time. Predictable information in precuneus and cerebellum was negatively associated with nonsocial symptom severity, indicating a relevance of the analysis of predictable information for clinical research in ASD. Our findings are compatible with the assumption that use or precision of prior knowledge is reduced in ASD patients. © 2018 Wiley Periodicals, Inc.

  1. What are they up to? The role of sensory evidence and prior knowledge in action understanding.

    PubMed

    Chambon, Valerian; Domenech, Philippe; Pacherie, Elisabeth; Koechlin, Etienne; Baraduc, Pierre; Farrer, Chlöé

    2011-02-18

    Explaining or predicting the behaviour of our conspecifics requires the ability to infer the intentions that motivate it. Such inferences are assumed to rely on two types of information: (1) the sensory information conveyed by movement kinematics and (2) the observer's prior expectations, acquired from past experience or derived from prior knowledge. However, the respective contribution of these two sources of information is still controversial. This controversy stems in part from the fact that "intention" is an umbrella term that may embrace various sub-types, each being assigned different scopes and targets. We hypothesized that variations in the scope and target of intentions may account for variations in the contribution of visual kinematics and prior knowledge to the intention inference process. To test this hypothesis, we conducted four behavioural experiments in which participants were instructed to identify different types of intention: basic intentions (i.e. simple goal of a motor act), superordinate intentions (i.e. general goal of a sequence of motor acts), or social intentions (i.e. intentions accomplished in a context of reciprocal interaction). For each of the above-mentioned intentions, we varied (1) the amount of visual information available from the action scene and (2) participants' prior expectations concerning the intention that was more likely to be accomplished. First, we showed that intentional judgments depend on a consistent interaction between visual information and participants' prior expectations. Moreover, we demonstrated that this interaction varied according to the type of intention to be inferred, with participants' priors rather than perceptual evidence exerting a greater effect on the inference of social and superordinate intentions. The results are discussed by appealing to the specific properties of each type of intention considered and further interpreted in the light of a hierarchical model of action representation.

  2. Using the Web as a Strategic Resource: An Applied Classroom Exercise.

    ERIC Educational Resources Information Center

    Wright, Kathleen M.; Granger, Mary J.

    This paper reports the findings of an experiment designed to test extensions of the Technology Acceptance Model (TAM) within the context of using the World Wide Web to gather and analyze financial information. The proposed extensions are three-fold. Based on prior research, cognitive absorption variables are posited as predeterminants of ease of…

  3. Exploring Teachers' Process of Change in Incorporating Problem Solving into the Mathematics Classroom

    ERIC Educational Resources Information Center

    Rutherford, Vanessa

    2012-01-01

    This study explores how a problem-solving based professional learning community (PLC) affects the beliefs, knowledge, and instructional practices of two sixth-grade mathematics teachers. An interview and two observations were conducted prior to beginning the year-long PLC in order to gather information about the participants' beliefs,…

  4. An Alumni Assessment of MIS Related Job Skill Importance and Skill Gaps

    ERIC Educational Resources Information Center

    Wilkerson, Jerod W.

    2012-01-01

    This paper presents the results of a job skill survey of Management Information Systems (MIS) alumni from a Northeastern U.S. university. The study assesses job skill importance and skill gaps associated with 104 technical and non-technical skill items. Survey items were grouped into 6 categories based on prior research. Skill importance and skill…

  5. Student Connections with Academic Texts: A Phenomenographic Study of Reading

    ERIC Educational Resources Information Center

    MacMillan, Margy

    2014-01-01

    Concerns about the ability of post-secondary students to read scholarly materials are well documented in the literature. A key aspect of reading at the deeper level expected of these students is connecting new information to prior knowledge. This study is based on an activity where students were explicitly required to make such connections as part…

  6. Teaching Assistants' Perceptions of a Training to Support an Inquiry-Based General Chemistry Laboratory Course

    ERIC Educational Resources Information Center

    Wheeler, Lindsay B.; Maeng, Jennifer L.; Whitworth, Brooke A.

    2015-01-01

    The purpose of this qualitative investigation was to better understand teaching assistants' (TAs') perceptions of training in a guided inquiry undergraduate general chemistry laboratory context. The training was developed using existing TA training literature and informed by situated learning theory. TAs engaged in training prior to teaching (~25…

  7. 21 CFR 181.30 - Substances used in the manufacture of paper and paperboard products used in food packaging.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN CONSUMPTION (CONTINUED) PRIOR-SANCTIONED..., based on available scientific information and data. 1-Alkyl (C6-C18)3-amino-3-aminopropane monoacetate.* Borax or boric acid for use in adhesives, sizes, and coatings.* Butadiene-styrene copolymer. Chromium...

  8. A Primer on Decision Analysis for Individually Prescribed Instruction. ACT Technical Bulletin No. 17.

    ERIC Educational Resources Information Center

    Davis, Charles E.; And Others

    A coherent system of decision making is described that may be incorporated into an instructional sequence to provide a supplement to the experience-based judgment of the classroom teacher. The elements of this decision process incorporate prior information such as a teacher's past experience, experimental results such as a test score, and…

  9. Embeddedness and New Idea Discussion in Professional Networks: The Mediating Role of Affect-Based Trust

    ERIC Educational Resources Information Center

    Chua, Roy Y. J.; Morris, Michael W.; Ingram, Paul

    2010-01-01

    This article examines how managers' tendency to discuss new ideas with others in their professional networks depends on the density of shared ties surrounding a given relationship. Consistent with prior research which found that embeddedness enhances information flow, an egocentric network survey of mid-level executives shows that managers tend to…

  10. 75 FR 43917 - Proposed Information Collection; Comment Request; Implementation of Tariff Rate Quota Established...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-27

    ...'s and boys' worsted wool suits and suit-like jackets and trousers in the United States and who apply for an allocation based on the amount of such suits cut and sewn during the prior calendar year... be reallocated. II. Method of Collection Forms are available on the Internet and by mail to...

  11. How Recent History Affects Perception: The Normative Approach and Its Heuristic Approximation

    PubMed Central

    Raviv, Ofri; Ahissar, Merav; Loewenstein, Yonatan

    2012-01-01

    There is accumulating evidence that prior knowledge about expectations plays an important role in perception. The Bayesian framework is the standard computational approach to explain how prior knowledge about the distribution of expected stimuli is incorporated with noisy observations in order to improve performance. However, it is unclear what information about the prior distribution is acquired by the perceptual system over short periods of time and how this information is utilized in the process of perceptual decision making. Here we address this question using a simple two-tone discrimination task. We find that the “contraction bias”, in which small magnitudes are overestimated and large magnitudes are underestimated, dominates the pattern of responses of human participants. This contraction bias is consistent with the Bayesian hypothesis in which the true prior information is available to the decision-maker. However, a trial-by-trial analysis of the pattern of responses reveals that the contribution of most recent trials to performance is overweighted compared with the predictions of a standard Bayesian model. Moreover, we study participants' performance in atypical distributions of stimuli and demonstrate substantial deviations from the ideal Bayesian detector, suggesting that the brain utilizes a heuristic approximation of the Bayesian inference. We propose a biologically plausible model, in which decision in the two-tone discrimination task is based on a comparison between the second tone and an exponentially-decaying average of the first tone and past tones. We show that this model accounts for both the contraction bias and the deviations from the ideal Bayesian detector hypothesis. These findings demonstrate the power of Bayesian-like heuristics in the brain, as well as their limitations, namely their failure to fully adapt to novel environments. PMID:23133343
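    The proposed heuristic, comparing the second tone to an exponentially decaying average of the first tone and past tones, takes only a few lines to sketch (the decay rate and stimulus values below are illustrative assumptions, not parameters fitted in the study):

```python
def exp_decay_model(trials, eta=0.3):
    """Heuristic two-tone discrimination: the reference is an exponentially
    decaying average of the current first tone and all past tones; the
    response reports whether the second tone exceeds that reference."""
    reference = None
    responses = []
    for f1, f2 in trials:
        reference = f1 if reference is None else eta * f1 + (1 - eta) * reference
        responses.append(f2 > reference)
    return responses

# Contraction-bias demo: a long run of mid-range tones pulls the reference
# toward the mean, so a low pair is misjudged even though f2 > f1.
history = [(1000.0, 1010.0)] * 20   # mid-range pairs shape the implicit prior
probe = (800.0, 810.0)              # low pair; the correct answer is True
preds = exp_decay_model(history + [probe])
print(preds[-1])   # False: the decayed reference (940.0) exceeds 810.0
```

    The same probe judged in isolation comes out correct, which is the signature of the bias: errors arise only when recent history drags the reference away from the current stimulus.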

  12. Stochastic, goal-oriented rapid impact modeling of uncertainty and environmental impacts in poorly-sampled sites using ex-situ priors

    NASA Astrophysics Data System (ADS)

    Li, Xiaojun; Li, Yandong; Chang, Ching-Fu; Tan, Benjamin; Chen, Ziyang; Sege, Jon; Wang, Changhong; Rubin, Yoram

    2018-01-01

    Modeling of uncertainty associated with subsurface dynamics has long been a major research topic. Its significance is widely recognized for real-life applications. Despite the huge effort invested in the area, major obstacles still remain on the way from theory to applications. Particularly problematic here is the confusion between modeling uncertainty and modeling spatial variability, which translates into a (mis)conception, in fact an inconsistency, in that it suggests that modeling of uncertainty and modeling of spatial variability are equivalent and, as such, require a lot of data. This paper investigates this challenge against the backdrop of a 7 km, deep underground tunnel in China, where environmental impacts are of major concern. We approach the data challenge by pursuing a new concept for Rapid Impact Modeling (RIM), which bypasses altogether the need to estimate posterior distributions of model parameters, focusing instead on detailed stochastic modeling of impacts, conditional to all information available, including prior, ex-situ information and in-situ measurements as well. A foundational element of RIM is the construction of informative priors for target parameters using ex-situ data, relying on ensembles of well-documented sites, pre-screened for geological and hydrological similarity to the target site. The ensembles are built around two sets of similarity criteria: a physically-based set of criteria and an additional set covering epistemic criteria. In another variation to common Bayesian practice, we update the priors to obtain conditional distributions of the target (environmental impact) dependent variables and not the hydrological variables. This recognizes that goal-oriented site characterization is in many cases more useful in applications compared to parameter-oriented characterization.

  13. Applying Sequential Analytic Methods to Self-Reported Information to Anticipate Care Needs.

    PubMed

    Bayliss, Elizabeth A; Powers, J David; Ellis, Jennifer L; Barrow, Jennifer C; Strobel, MaryJo; Beck, Arne

    2016-01-01

    Identifying care needs for newly enrolled or newly insured individuals is important under the Affordable Care Act. Systematically collected patient-reported information can potentially identify subgroups with specific care needs prior to service use. We conducted a retrospective cohort investigation of 6,047 individuals who completed a 10-question needs assessment upon initial enrollment in Kaiser Permanente Colorado (KPCO), a not-for-profit integrated delivery system, through the Colorado State Individual Exchange. We used responses from the Brief Health Questionnaire (BHQ) to develop a predictive model for receiving care in the top 25 percent of cost, then applied cluster analytic techniques to identify different high-cost subpopulations. Per-member, per-month cost was measured from 6 to 12 months following BHQ response. BHQ responses significantly predictive of high-cost care included self-reported health status, functional limitations, medication use, presence of 0-4 chronic conditions, self-reported emergency department (ED) use during the prior year, and lack of prior insurance. Age, gender, and deductible-based insurance product were also predictive. The largest possible range of predicted probabilities of being in the top 25 percent of cost was 3.5 percent to 96.4 percent. Within the top cost quartile, examples of potentially actionable clusters of patients included those with high morbidity, prior utilization, depression risk, and financial constraints; high-morbidity, previously uninsured individuals with few financial constraints; and relatively healthy, previously insured individuals with medication needs. Applying sequential predictive modeling and cluster analytic techniques to patient-reported information can identify subgroups of individuals within heterogeneous populations who may benefit from specific interventions to optimize initial care delivery.

  14. Development Of An Educational Video To Improve Patient Knowledge And Communication With Their Healthcare Providers About Colorectal Cancer Screening

    PubMed Central

    Katz, Mira L.; Heaner, Sarah; Reiter, Paul; van Putten, Julie; Murray, Lee; McDougle, Leon; Cegala, Donald J.; Post, Douglas; David, Prabu; Slater, Michael; Paskett, Electra D.

    2009-01-01

    Background Low rates of colorectal cancer (CRC) screening persist due to individual, provider and system level barriers. Purpose To develop and obtain initial feedback about a CRC screening educational video from community members and medical professionals. Methods Focus groups of patients were conducted prior to the development of an educational video and focus groups of patients provided initial feedback about the developed CRC screening educational video. Medical personnel reviewed the video and made recommendations prior to final editing of the video. Results Patients identified CRC screening barriers and made suggestions about the information to include in the educational video. Their suggestions included using a healthcare provider to state the importance of completing CRC screening, demonstrate how to complete the fecal occult blood test, and that men and women from diverse ethnic groups and races could be included in the same video. Participants reviewed the developed video and mentioned that their suggestions were portrayed correctly, the video was culturally appropriate, and the information presented in the video was easy to understand. Medical personnel made suggestions on ways to improve the content and the delivery of the medical information prior to final editing of the video. Discussion Participants provided valuable information in the development of an educational video to improve patient knowledge and patient-provider communication about CRC screening. The educational video developed was based on the Protection Motivation Theory and addressed the colon cancer screening barriers identified in this mostly minority and low-income patient population. Future research will determine if CRC screening increases among patients who watch the educational video. Translation to Health Education Practice Educational videos can provide important information about CRC and CRC screening to average-risk adults. PMID:20209024

  15. Pilot-testing an adverse drug event reporting form prior to its implementation in an electronic health record.

    PubMed

    Chruscicki, Adam; Badke, Katherin; Peddie, David; Small, Serena; Balka, Ellen; Hohl, Corinne M

    2016-01-01

    Adverse drug events (ADEs), harmful unintended consequences of medication use, are a leading cause of hospital admissions, yet are rarely documented in a structured format between care providers. We describe pilot-testing structured ADE documentation fields prior to integration into an electronic medical record (EMR). We completed a qualitative study at two Canadian hospitals. Using data derived from a systematic review of the literature, we developed screen mock-ups for an ADE reporting platform, iteratively revised in participatory workshops with diverse end-user groups. We designed a paper-based form reflecting the data elements contained in the mock-ups. We distributed them to a convenience sample of clinical pharmacists, and completed ethnographic workplace observations while the forms were used. We reviewed completed forms, collected feedback from pharmacists using semi-structured interviews, and coded the data in NVivo for themes related to the ADE form. We completed 25 h of clinical observations, and 24 ADEs were documented. Pharmacists perceived the form as simple and clear, with sufficient detail to capture ADEs. They identified fields for omission, and others requiring more detail. Pharmacists encountered barriers to documenting ADEs including uncertainty about what constituted a reportable ADE, inability to complete patient follow-up, the need for inter-professional communication to rule out alternative diagnoses, and concern about creating a permanent record. Paper-based pilot-testing allowed planning for important modifications in an ADE documentation form prior to implementation in an EMR. While paper-based piloting is rarely reported prior to EMR implementations, it can inform design and enhance functionality. Piloting with other groups of care providers and in different healthcare settings will likely lead to further revisions prior to broader implementations.

  16. The power prior: theory and applications.

    PubMed

    Ibrahim, Joseph G; Chen, Ming-Hui; Gwon, Yeongjin; Chen, Fang

    2015-12-10

    The power prior has been widely used in many applications covering a large number of disciplines. The power prior is intended to be an informative prior constructed from historical data. It has been used in clinical trials, genetics, health care, psychology, environmental health, engineering, economics, and business. It has also been applied for a wide variety of models and settings, both in the experimental design and analysis contexts. In this review article, we give an A-to-Z exposition of the power prior and its applications to date. We review its theoretical properties, variations in its formulation, statistical contexts for which it has been used, applications, and its advantages over other informative priors. We review models for which it has been used, including generalized linear models, survival models, and random effects models. Statistical areas where the power prior has been used include model selection, experimental design, hierarchical modeling, and conjugate priors. Frequentist properties of power priors in posterior inference are established, and a simulation study is conducted to further examine the empirical performance of the posterior estimates with power priors. Real data analyses are given illustrating the power prior as well as the use of the power prior in the Bayesian design of clinical trials. Copyright © 2015 John Wiley & Sons, Ltd.
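    The core construction reviewed here has a closed form in the conjugate beta-binomial case: raising the historical likelihood to the power a0 in [0, 1] simply downweights the historical counts before they join the current data. A minimal sketch with made-up counts:

```python
def power_prior_beta(x, n, x0, n0, a0, alpha0=1.0, beta0=1.0):
    """Power prior in the conjugate beta-binomial case: the historical
    binomial likelihood (x0 successes of n0) raised to a0 scales the
    historical counts added to the initial Beta(alpha0, beta0) prior."""
    a = alpha0 + x + a0 * x0
    b = beta0 + (n - x) + a0 * (n0 - x0)
    return a, b, a / (a + b)   # posterior Beta(a, b) and its mean

# a0 = 0 discards history, a0 = 1 pools it fully, 0 < a0 < 1 borrows partially
_, _, m_none = power_prior_beta(12, 40, 30, 50, a0=0.0)
_, _, m_half = power_prior_beta(12, 40, 30, 50, a0=0.5)
_, _, m_full = power_prior_beta(12, 40, 30, 50, a0=1.0)
print(m_none < m_half < m_full)   # True: the mean moves toward the history
```

    Sweeping a0 from 0 to 1 traces the full path between ignoring and pooling the historical data, which is what makes the power prior useful as a sensitivity-analysis device.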

  17. The value of prior knowledge in machine learning of complex network systems.

    PubMed

    Ferranti, Dana; Krane, David; Craft, David

    2017-11-15

    Our overall goal is to develop machine-learning approaches based on genomics and other relevant accessible information for use in predicting how a patient will respond to a given proposed drug or treatment. Given the complexity of this problem, we begin by developing, testing and analyzing learning methods using data from simulated systems, which allows us access to a known ground truth. We examine the benefits of using prior system knowledge and investigate how learning accuracy depends on various system parameters as well as the amount of training data available. The simulations are based on Boolean networks-directed graphs with 0/1 node states and logical node update rules-which are the simplest computational systems that can mimic the dynamic behavior of cellular systems. Boolean networks can be generated and simulated at scale, have complex yet cyclical dynamics and as such provide a useful framework for developing machine-learning algorithms for modular and hierarchical networks such as biological systems in general and cancer in particular. We demonstrate that utilizing prior knowledge (in the form of network connectivity information), without detailed state equations, greatly increases the power of machine-learning algorithms to predict network steady-state node values ('phenotypes') and perturbation responses ('drug effects'). Links to codes and datasets here: https://gray.mgh.harvard.edu/people-directory/71-david-craft-phd. dcraft@broadinstitute.org. Supplementary data are available at Bioinformatics online. © The Author 2017. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com
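    A random Boolean network of the kind described, n nodes each reading k inputs through a random truth table, can be generated and run to an attractor in plain Python (network size, connectivity, and seed below are arbitrary choices for illustration):

```python
import random

def make_boolean_network(n, k, seed=0):
    """Random Boolean network: each of n nodes reads k randomly chosen
    nodes through a random 2**k-entry truth table."""
    rng = random.Random(seed)
    inputs = [rng.sample(range(n), k) for _ in range(n)]
    tables = [[rng.randint(0, 1) for _ in range(2 ** k)] for _ in range(n)]
    return inputs, tables

def step(state, inputs, tables):
    """Synchronous update: every node applies its truth table to its inputs."""
    new = []
    for i in range(len(state)):
        idx = 0
        for j in inputs[i]:
            idx = (idx << 1) | state[j]
        new.append(tables[i][idx])
    return new

def run_to_attractor(state, inputs, tables, max_steps=1000):
    """Iterate until a state repeats; the dynamics are deterministic on a
    finite state space, so every trajectory ends on a cycle (attractor)."""
    seen = {}
    for t in range(max_steps):
        key = tuple(state)
        if key in seen:
            return state, t - seen[key]   # a state on the cycle, cycle length
        seen[key] = t
        state = step(state, inputs, tables)
    return state, None

inputs, tables = make_boolean_network(n=8, k=2, seed=42)
attractor_state, cycle_len = run_to_attractor([0] * 8, inputs, tables)
print(cycle_len >= 1)
```

    In the paper's setting, "prior knowledge" corresponds to revealing `inputs` (the wiring) but not `tables` (the update rules) to the learner; the steady-state node values play the role of phenotypes.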

  18. Collaborative learning in networks.

    PubMed

    Mason, Winter; Watts, Duncan J

    2012-01-17

    Complex problems in science, business, and engineering typically require some tradeoff between exploitation of known solutions and exploration for novel ones, where, in many cases, information about known solutions can also disseminate among individual problem solvers through formal or informal networks. Prior research on complex problem solving by collectives has found the counterintuitive result that inefficient networks, meaning networks that disseminate information relatively slowly, can perform better than efficient networks for problems that require extended exploration. In this paper, we report on a series of 256 Web-based experiments in which groups of 16 individuals collectively solved a complex problem and shared information through different communication networks. As expected, we found that collective exploration improved average success over independent exploration because good solutions could diffuse through the network. In contrast to prior work, however, we found that efficient networks outperformed inefficient networks, even in a problem space with qualitative properties thought to favor inefficient networks. We explain this result in terms of individual-level explore-exploit decisions, which we find were influenced by the network structure as well as by strategic considerations and the relative payoff between maxima. We conclude by discussing implications for real-world problem solving and possible extensions.

  19. Collaborative learning in networks

    PubMed Central

    Mason, Winter; Watts, Duncan J.

    2012-01-01

    Complex problems in science, business, and engineering typically require some tradeoff between exploitation of known solutions and exploration for novel ones, where, in many cases, information about known solutions can also disseminate among individual problem solvers through formal or informal networks. Prior research on complex problem solving by collectives has found the counterintuitive result that inefficient networks, meaning networks that disseminate information relatively slowly, can perform better than efficient networks for problems that require extended exploration. In this paper, we report on a series of 256 Web-based experiments in which groups of 16 individuals collectively solved a complex problem and shared information through different communication networks. As expected, we found that collective exploration improved average success over independent exploration because good solutions could diffuse through the network. In contrast to prior work, however, we found that efficient networks outperformed inefficient networks, even in a problem space with qualitative properties thought to favor inefficient networks. We explain this result in terms of individual-level explore-exploit decisions, which we find were influenced by the network structure as well as by strategic considerations and the relative payoff between maxima. We conclude by discussing implications for real-world problem solving and possible extensions. PMID:22184216

  20. A blind deconvolution method based on L1/L2 regularization prior in the gradient space

    NASA Astrophysics Data System (ADS)

    Cai, Ying; Shi, Yu; Hua, Xia

    2018-02-01

    In image restoration, the restored result can differ greatly from the real image because of noise. To address this ill-posed problem, a blind deconvolution method based on an L1/L2 regularization prior in the gradient domain is proposed. The method first adds a function to the prior knowledge, namely the ratio of the L1 norm to the L2 norm, and takes this function as the penalty term in the high-frequency domain of the image. The function is then iteratively updated, and the iterative shrinkage-thresholding algorithm is applied to solve for the high-frequency image. Because the information in the gradient domain is better suited to estimating the blur kernel, the blur kernel is estimated in the gradient domain. This problem can be solved quickly in the frequency domain by the fast Fourier transform. In addition, to improve the effectiveness of the algorithm, a multi-scale iterative optimization method is added. The proposed blind deconvolution method based on L1/L2 regularization priors in the gradient space can obtain a unique and stable solution in the image restoration process, which not only preserves the edges and details of the image but also ensures the accuracy of the results.
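    The key property of the L1/L2 penalty, that blur raises the ratio on image gradients while sharp edges keep it low, can be checked on a toy 1-D gradient signal (the values are illustrative, not from the paper):

```python
import math

def l1_l2_ratio(grad):
    """The prior's penalty: ratio of the L1 norm to the L2 norm of the
    (flattened) gradient values; sparser gradients give a smaller ratio."""
    l2 = math.sqrt(sum(g * g for g in grad))
    return sum(abs(g) for g in grad) / l2 if l2 > 0 else 0.0

# A sharp edge concentrates gradient energy in one entry (low L1/L2);
# blurring spreads the same total out, raising the ratio the prior penalizes.
sharp = [0, 0, 10, 0, 0, 0]
blurred = [1, 3, 4, 3, 1, 0]
print(l1_l2_ratio(sharp), l1_l2_ratio(blurred))   # 1.0 2.0
```

    Because the L2 norm in the denominator is scale-invariant with the L1 numerator, the ratio, unlike a plain L1 penalty, does not shrink toward the trivial all-zero image, which is why it works as a blind-deconvolution prior.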

  1. A Regions of Confidence Based Approach to Enhance Segmentation with Shape Priors.

    PubMed

    Appia, Vikram V; Ganapathy, Balaji; Abufadel, Amer; Yezzi, Anthony; Faber, Tracy

    2010-01-18

    We propose an improved region based segmentation model with shape priors that uses labels of confidence/interest to exclude the influence of certain regions in the image that may not provide useful information for segmentation. These could be regions in the image which are expected to have weak, missing or corrupt edges, or they could be regions in the image which the user is not interested in segmenting but which are part of the object being segmented. In the training datasets, along with the manual segmentations we also generate an auxiliary map indicating these regions of low confidence/interest. Since all the training images are acquired under similar conditions, we can train our algorithm to estimate these regions as well. Based on this training we will generate a map which indicates the regions in the image that are likely to contain no useful information for segmentation. We then use a parametric model to represent the segmenting curve as a combination of shape priors obtained by representing the training data as a collection of signed distance functions. We minimize an objective energy functional to evolve the global parameters that are used to represent the curve. We vary the influence each pixel has on the evolution of these parameters based on the confidence/interest label. When we use these labels to indicate the regions with low confidence, the regions containing accurate edges will have a dominant role in the evolution of the curve and the segmentation in the low confidence regions will be approximated based on the training data. Since our model evolves global parameters, it improves the segmentation even in the regions with accurate edges. This is because we eliminate the influence of the low confidence regions which may mislead the final segmentation. Similarly, when we use the labels to indicate the regions which are not of importance, we will get a better segmentation of the object in the regions we are interested in.
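    The idea of letting confidence labels modulate each pixel's influence on the global shape parameters can be sketched as a confidence-weighted least-squares fit of combination weights over toy signed-distance "shapes" (all data below are hypothetical, and the flat 4-pixel image stands in for a real 2-D grid):

```python
def fit_shape_weights(shapes, target, confidence, lr=0.05, steps=500):
    """Fit global weights for a combination of training signed-distance
    'shapes' by gradient descent on a confidence-weighted squared error,
    so pixels labelled low-confidence barely influence the parameters."""
    k, n = len(shapes), len(shapes[0])
    w = [0.0] * k
    for _ in range(steps):
        combo = [sum(w[j] * shapes[j][i] for j in range(k)) for i in range(n)]
        grad = [sum(2.0 * confidence[i] * (combo[i] - target[i]) * shapes[j][i]
                    for i in range(n)) for j in range(k)]
        w = [wj - lr * g for wj, g in zip(w, grad)]
    return w

shapes = [[1.0, 0.0, -1.0, 0.0], [0.0, 1.0, 0.0, -1.0]]   # toy SDF basis
target = [0.5, 0.5, -0.5, -0.5]                           # observed shape
confidence = [1.0, 1.0, 1.0, 0.0]   # last pixel: corrupt edge, excluded
w = fit_shape_weights(shapes, target, confidence)
print(round(w[0], 3), round(w[1], 3))
```

    Setting a confidence entry to zero removes that pixel from the fit entirely, so the recovered weights are driven only by the reliable regions, which is the effect the auxiliary map is meant to achieve.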

  2. Quantum secret sharing with identity authentication based on Bell states

    NASA Astrophysics Data System (ADS)

    Abulkasim, Hussein; Hamad, Safwat; Khalifa, Amal; El Bahnasy, Khalid

    Quantum secret sharing techniques allow two or more parties to securely share a key, while the same number of parties or fewer can efficiently deduce the secret key. In this paper, we propose an authenticated quantum secret sharing protocol, where a quantum dialogue protocol is adopted to authenticate the identity of the parties. The participants simultaneously authenticate the identity of each other based on parts of a prior shared key. Moreover, the whole prior shared key can be reused for deducing the secret data. Although the proposed scheme does not significantly improve the efficiency performance, it is more secure compared to some existing quantum secret sharing schemes due to the identity authentication process. In addition, the proposed scheme can stand against participant attack, man-in-the-middle attack, impersonation attack, and Trojan-horse attack, as well as information leaks.

  3. Nonword repetition in lexical decision: support for two opposing processes.

    PubMed

    Wagenmakers, Eric-Jan; Zeelenberg, René; Steyvers, Mark; Shiffrin, Richard; Raaijmakers, Jeroen

    2004-10-01

    We tested and confirmed the hypothesis that the effect of prior presentation of nonwords in lexical decision is the net result of two opposing processes: (1) a relatively fast inhibitory process based on global familiarity; and (2) a relatively slow facilitatory process based on the retrieval of specific episodic information. In three studies, we manipulated speed-stress to influence the balance between the two processes. Experiment 1 showed item-specific improvement for repeated nonwords in a standard "respond-when-ready" lexical decision task. Experiment 2 used a 400-ms deadline procedure and showed performance for nonwords to be unaffected by up to four prior presentations. In Experiment 3 we used a signal-to-respond procedure with variable time intervals and found negative repetition priming for repeated nonwords. These results can be accounted for by dual-process models of lexical decision.

  4. Cognitive flexibility and undergraduate physiology students: increasing advanced knowledge acquisition within an ill-structured domain.

    PubMed

    Rhodes, Ashley E; Rozell, Timothy G

    2017-09-01

    Cognitive flexibility is defined as the ability to assimilate previously learned information and concepts to generate novel solutions to new problems. This skill is crucial for success within ill-structured domains such as biology, physiology, and medicine, where many concepts are simultaneously required for understanding a complex problem, yet the problem consists of patterns or combinations of concepts that are not consistently used or needed across all examples. To succeed within ill-structured domains, a student must possess a certain level of cognitive flexibility: rigid thought processes and prepackaged informational retrieval schemes relying on rote memorization will not suffice. In this study, we assessed the cognitive flexibility of undergraduate physiology students using a validated instrument entitled Student's Approaches to Learning (SAL). The SAL evaluates how deeply and in what way information is processed, as well as the investment of time and mental energy that a student is willing to expend by measuring constructs such as elaboration and memorization. Our results indicate that students who rely primarily on memorization when learning new information have a smaller knowledge base about physiological concepts, as measured by a prior knowledge assessment and unit exams. However, students who rely primarily on elaboration when learning new information have a more well-developed knowledge base about physiological concepts, which is displayed by higher scores on a prior knowledge assessment and increased performance on unit exams. Thus students with increased elaboration skills possibly possess a higher level of cognitive flexibility and are more likely to succeed within ill-structured domains. Copyright © 2017 the American Physiological Society.

  5. Evaluation of uncertainty in the adjustment of fundamental constants

    NASA Astrophysics Data System (ADS)

    Bodnar, Olha; Elster, Clemens; Fischer, Joachim; Possolo, Antonio; Toman, Blaza

    2016-02-01

    Combining multiple measurement results for the same quantity is an important task in metrology and in many other areas. Examples include the determination of fundamental constants, the calculation of reference values in interlaboratory comparisons, or the meta-analysis of clinical studies. However, neither the GUM nor its supplements give any guidance for this task. Various approaches are applied such as weighted least-squares in conjunction with the Birge ratio or random effects models. While the former approach, which is based on a location-scale model, is particularly popular in metrology, the latter represents a standard tool used in statistics for meta-analysis. We investigate the reliability and robustness of the location-scale model and the random effects model with particular focus on resulting coverage or credible intervals. The interval estimates are obtained by adopting a Bayesian point of view in conjunction with a non-informative prior that is determined by a currently favored principle for selecting non-informative priors. Both approaches are compared by applying them to simulated data as well as to data for the Planck constant and the Newtonian constant of gravitation. Our results suggest that the proposed Bayesian inference based on the random effects model is more reliable and less sensitive to model misspecifications than the approach based on the location-scale model.
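    The weighted least-squares approach with the Birge ratio, which the abstract contrasts with the random effects model, is easy to state concretely (the measurement values below are toy numbers):

```python
import math

def birge_adjusted_mean(values, uncertainties):
    """Weighted least-squares combination with the Birge ratio: when the
    scatter exceeds what the stated uncertainties allow (Birge ratio > 1),
    the uncertainty of the weighted mean is inflated by that ratio."""
    w = [1.0 / u ** 2 for u in uncertainties]
    mean = sum(wi * xi for wi, xi in zip(w, values)) / sum(w)
    u_mean = math.sqrt(1.0 / sum(w))
    birge = math.sqrt(sum(wi * (xi - mean) ** 2
                          for wi, xi in zip(w, values)) / (len(values) - 1))
    return mean, u_mean * max(birge, 1.0), birge

# Consistent data (Birge ratio < 1) leave the uncertainty untouched;
# over-dispersed data would inflate it.
mean, u, rb = birge_adjusted_mean([9.8, 10.0, 10.2], [0.5, 0.5, 0.5])
print(round(mean, 3), round(rb, 3))   # 10.0 0.4
```

    The random effects alternative instead adds a between-measurement variance term to each weight, which is one reason its interval estimates respond differently to model misspecification, as the paper investigates.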

  6. Pre-examination factors affecting molecular diagnostic test results and interpretation: A case-based approach.

    PubMed

    Payne, Deborah A; Baluchova, Katarina; Peoc'h, Katell H; van Schaik, Ron H N; Chan, K C Allen; Maekawa, Masato; Mamotte, Cyril; Russomando, Graciela; Rousseau, François; Ahmad-Nejad, Parviz

    2017-04-01

    Multiple organizations produce guidance documents that provide opportunities to harmonize quality practices for diagnostic testing. The International Organization for Standardization ISO 15189 standard addresses requirements for quality in management and technical aspects of the clinical laboratory. One technical aspect addresses the complexities of the pre-examination phase prior to diagnostic testing. The Committee for Molecular Diagnostics of the International Federation for Clinical Chemistry and Laboratory Medicine (also known as IFCC C-MD) conducted a survey of international molecular laboratories and determined ISO 15189 to be the most referenced guidance document. In this review, the IFCC C-MD provides case-based examples illustrating the value of select pre-examination processes as these processes relate to molecular diagnostic testing. Case-based examples in infectious disease, oncology, inherited disease and pharmacogenomics address the utility of: 1) providing information to patients and users, 2) designing requisition forms, 3) obtaining informed consent and 4) maintaining sample integrity prior to testing. The pre-examination phase requires extensive and consistent communication between the laboratory, the healthcare provider and the end user. The clinical vignettes presented in this paper illustrate the value of applying select ISO 15189 recommendations for the general laboratory to the more specialized area of molecular diagnostics. Copyright © 2016 Elsevier B.V. All rights reserved.

  7. Topic model-based mass spectrometric data analysis in cancer biomarker discovery studies.

    PubMed

    Wang, Minkun; Tsai, Tsung-Heng; Di Poto, Cristina; Ferrarini, Alessia; Yu, Guoqiang; Ressom, Habtom W

    2016-08-18

    A fundamental challenge in quantitation of biomolecules for cancer biomarker discovery stems from the heterogeneous nature of human biospecimens. Although this issue has been a subject of discussion in cancer genomic studies, it has not yet been rigorously investigated in mass spectrometry based proteomic and metabolomic studies. Purification of mass spectrometric data is highly desired prior to subsequent analysis, e.g., quantitative comparison of the abundance of biomolecules in biological samples. We investigated topic models to computationally analyze mass spectrometric data considering both integrated peak intensities and scan-level features, i.e., extracted ion chromatograms (EICs). Probabilistic generative models enable flexible representation in data structure and infer sample-specific pure resources. Scan-level modeling helps alleviate information loss during data preprocessing. We evaluated the capability of the proposed models in capturing mixture proportions of contaminants and cancer profiles on LC-MS based serum proteomic and GC-MS based tissue metabolomic datasets acquired from patients with hepatocellular carcinoma (HCC) and liver cirrhosis as well as synthetic data we generated based on the serum proteomic data. The results we obtained by analysis of the synthetic data demonstrated that both intensity-level and scan-level purification models can accurately infer the mixture proportions and the underlying true cancerous sources with small average error ratios (<7 %) between estimation and ground truth. By applying the topic model-based purification to mass spectrometric data, we found more proteins and metabolites with significant changes between HCC cases and cirrhotic controls. Candidate biomarkers selected after purification yielded biologically meaningful pathway analysis results and improved disease discrimination power in terms of the area under ROC curve compared to the results found prior to purification. 
We investigated topic model-based inference methods to computationally address the heterogeneity issue in samples analyzed by LC/GC-MS. We observed that incorporation of scan-level features has the potential to lead to more accurate purification results by alleviating the loss of information that results from integrating peaks. We believe cancer biomarker discovery studies that use mass spectrometric analysis of human biospecimens can greatly benefit from topic model-based purification of the data prior to statistical and pathway analyses.
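The core estimation task — inferring what fraction of an observed signal comes from a cancerous source versus a contaminant — can be illustrated with a much simpler toy than the paper's topic model. The sketch below assumes the two pure source profiles are already known and fits the mixture proportion by least squares; the actual topic model infers the pure sources jointly with the proportions, which this stand-in does not attempt.

```python
def mixture_proportion(observed, source_a, source_b):
    """Estimate the proportion p of source_a in a two-source mixture,
    observed ≈ p * source_a + (1 - p) * source_b, by least squares.

    A toy stand-in for the paper's topic-model inference: the pure
    source profiles are assumed known here, whereas the topic model
    infers them from the data.
    """
    num = sum((o - b) * (a - b) for o, a, b in zip(observed, source_a, source_b))
    den = sum((a - b) ** 2 for a, b in zip(source_a, source_b))
    p = num / den
    return min(1.0, max(0.0, p))  # clamp to a valid proportion
```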

  8. Optimal Duration of Conservative Management Prior to Surgery for Cervical and Lumbar Radiculopathy: A Literature Review

    PubMed Central

    Alentado, Vincent J.; Lubelski, Daniel; Steinmetz, Michael P.; Benzel, Edward C.; Mroz, Thomas E.

    2014-01-01

    Study Design Literature review. Objective Since the 1970s, spine surgeons have commonly required 6 weeks of failed conservative treatment prior to considering surgical intervention for various spinal pathologies. It is unclear, however, if this standard has been validated in the literature. The authors review the natural history, outcomes, and cost-effectiveness studies relating to the current standard of 6 weeks of nonoperative care prior to surgery for patients with spinal pathologies. Methods A systematic Medline search from 1953 to 2013 was performed to identify natural history, outcomes, and cost-effectiveness studies relating to the optimal period of conservative management prior to surgical intervention for both cervical and lumbar radiculopathy. Demographic information, operative indications, and clinical outcomes are reviewed for each study. Results A total of 5,719 studies were identified; of these, 13 studies were selected for inclusion. Natural history studies demonstrated that 88% of patients with cervical radiculopathy and 70% of patients with lumbar radiculopathy showed improvement within 4 weeks following onset of symptoms. Outcomes and cost-effectiveness studies supported surgical intervention within 8 weeks of symptom onset for both cervical and lumbar radiculopathy. Conclusions There are limited studies supporting any optimal duration of conservative treatment prior to surgery for cervical and lumbar radiculopathy. Therefore, evidence-based conclusions cannot be made. Based on the available literature, we suggest that an optimal timing for surgery following cervical radiculopathy is within 8 weeks of onset of symptoms. A shorter period of 4 weeks may be appropriate based on natural history studies. Additionally, we found that optimal timing for surgery following lumbar radiculopathy is between 4 and 8 weeks. 
A prospective study is needed to explicitly identify the optimal duration of conservative therapy prior to surgery so that costs may be reduced and patient outcomes improved. PMID:25396110

  9. Extended preoperative patient education using a multimedia DVD-impact on patients receiving a laparoscopic cholecystectomy: a randomised controlled trial.

    PubMed

    Wilhelm, D; Gillen, S; Wirnhier, H; Kranzfelder, M; Schneider, A; Schmidt, A; Friess, H; Feussner, H

    2009-03-01

    The informed consent is a legal requirement prior to surgery and should be based on an extensive preoperative interview. Multimedia productions can therefore be utilised as a supporting tool. In a prospective randomised trial, we evaluated the impact of an extended education on patients undergoing cholecystectomy. For extended patient information, a professionally built DVD was used. After randomisation to either the DVD or the control group, patients were informed with or without additional presentation of the DVD. The quality of education was evaluated using a purpose-built questionnaire. One hundred fourteen patients were included in the DVD group and 98 in the control group. Patient characteristics did not differ significantly despite a higher educational level in the DVD group. The score of correctly answered questions was higher in the DVD group (19.88 vs. 17.58 points, p < 0.001). Subgroup analysis revealed that particular patient characteristics additionally influenced the results. Patients should be informed as extensively as possible prior to any surgical procedure, and multimedia productions offer a suitable instrument for this purpose. In the presented study, we demonstrated the positive impact of an information DVD on patients' knowledge. Nevertheless, multimedia tools cannot replace personal interaction and should only be used to support daily work.

  10. Using texts in science education: cognitive processes and knowledge representation.

    PubMed

    van den Broek, Paul

    2010-04-23

    Texts form a powerful tool in teaching concepts and principles in science. How do readers extract information from a text, and what are the limitations in this process? Central to comprehension of and learning from a text is the construction of a coherent mental representation that integrates the textual information and relevant background knowledge. This representation engenders learning if it expands the reader's existing knowledge base or if it corrects misconceptions in this knowledge base. The Landscape Model captures the reading process and the influences of reader characteristics (such as working-memory capacity, reading goal, prior knowledge, and inferential skills) and text characteristics (such as content/structure of presented information, processing demands, and textual cues). The model suggests factors that can optimize--or jeopardize--learning science from text.

  11. Beyond maximum entropy: Fractal pixon-based image reconstruction

    NASA Technical Reports Server (NTRS)

    Puetter, R. C.; Pina, R. K.

    1994-01-01

    We have developed a new Bayesian image reconstruction method that has been shown to be superior to the best implementations of other methods, including Goodness-of-Fit (e.g. Least-Squares and Lucy-Richardson) and Maximum Entropy (ME). Our new method is based on the concept of the pixon, the fundamental, indivisible unit of picture information. Use of the pixon concept provides an improved image model, resulting in an image prior which is superior to that of standard ME.

  12. Short-Term Memory Affects Color Perception in Context

    PubMed Central

    Olkkonen, Maria; Allred, Sarah R.

    2014-01-01

    Color-based object selection — for instance, looking for ripe tomatoes in the market — places demands on both perceptual and memory processes: it is necessary to form a stable perceptual estimate of surface color from a variable visual signal, as well as to retain multiple perceptual estimates in memory while comparing objects. Nevertheless, perceptual and memory processes in the color domain are generally studied in separate research programs with the assumption that they are independent. Here, we demonstrate a strong failure of independence between color perception and memory: the effect of context on color appearance is substantially weakened by a short retention interval between a reference and test stimulus. This somewhat counterintuitive result is consistent with Bayesian estimation: as the precision of the representation of the reference surface and its context decays in memory, prior information gains more weight, causing the retained percepts to be drawn toward prior information about surface and context color. This interaction implies that to fully understand information processing in real-world color tasks, perception and memory need to be considered jointly. PMID:24475131

  13. Exploring Motivations, Awareness of Side Effects, and Attitudes among Potential Egg Donors

    PubMed Central

    Gezinski, Lindsay B.; Karandikar, Sharvari; Carter, James; White, Melinda

    2016-01-01

    This research study surveyed prospective egg donors at orientation to (a) understand women’s motivations to donate eggs, (b) assess awareness and knowledge of egg donation prior to entry into the egg donation program, and (c) explore attitudes toward egg donation. Ninety-two women completed the questionnaire at one fertility clinic located in the Midwest between August 2011 and August 2012. Descriptive and inferential statistics as well as textual analysis were used to analyze the data. Three themes emerged regarding participant motivations: (1) altruistic, (2) financial, and (3) desire to pass on genetic material. The majority of participants were unconcerned with potential physical and psychological side effects; however, differences emerged based on motherhood status and educational level. Although potential donors felt recipients should receive some information about the donor, they tended to value privacy regarding information given to resultant offspring. This research study has implications for social work practice, policy, and future research. It is crucial that women receive adequate procedural and side effect information prior to engaging in egg donation. PMID:27263197

  14. Predicting clinical trial results based on announcements of interim analyses

    PubMed Central

    2014-01-01

    Background Announcements of interim analyses of a clinical trial convey information about the results beyond the trial’s Data Safety Monitoring Board (DSMB). The amount of information conveyed may be minimal, but the fact that none of the trial’s stopping boundaries has been crossed implies that the experimental therapy is neither extremely effective nor hopeless. Predicting success of the ongoing trial is of interest to the trial’s sponsor, the medical community, pharmaceutical companies, and investors. We determine the probability of trial success by quantifying only the publicly available information from interim analyses of an ongoing trial. We illustrate our method in the context of the National Surgical Adjuvant Breast and Bowel (NSABP) trial, C-08. Methods We simulated trials based on the specifics of the NSABP C-08 protocol that were publicly available. We quantified the uncertainty around the treatment effect using prior weights for the various possibilities in light of other colon cancer studies and other studies of the investigational agent, bevacizumab. We considered alternative prior distributions. Results Subsequent to the trial’s third interim analysis, our predictive probabilities were: that the trial would eventually be successful, 48.0%; would stop for futility, 7.4%; and would continue to completion without statistical significance, 44.5%. The actual trial continued to completion without statistical significance. Conclusions Announcements of interim analyses provide information outside the DSMB’s sphere of confidentiality. This information is potentially helpful to clinical trial prognosticators. ‘Information leakage’ from standard interim analyses such as in NSABP C-08 is conventionally viewed as acceptable even though it may be quite revealing. Whether leakage from more aggressive types of adaptations is acceptable should be assessed at the design stage. PMID:24607270
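The prediction described above can be sketched as a Monte Carlo computation: draw a treatment effect from the prior, simulate an interim z-statistic, keep only trials whose interim statistic crossed neither boundary (the publicly observable event), and tally how often the final analysis succeeds. The drift values, boundaries, and information fraction below are illustrative placeholders, not the NSABP C-08 design parameters.

```python
import random

def predictive_success(prior, t=0.5, z_eff=2.8, z_fut=0.0,
                       z_final=1.96, n_sim=100_000, seed=1):
    """Predictive probability of final success given that an interim
    analysis at information fraction t crossed neither boundary.

    prior: list of (drift, weight) pairs, where drift is the expected
    final z-statistic under each hypothesized treatment effect.
    z_eff / z_fut are placeholder efficacy and futility boundaries.
    """
    rng = random.Random(seed)
    drifts, weights = zip(*prior)
    cont = succ = 0
    for _ in range(n_sim):
        d = rng.choices(drifts, weights)[0]
        zi = rng.gauss(d * t**0.5, 1.0)          # interim z-statistic
        if z_fut < zi < z_eff:                   # neither boundary crossed
            # final z given the interim value (Brownian-motion increment)
            zf = t**0.5 * zi + (1 - t)**0.5 * rng.gauss(d * (1 - t)**0.5, 1.0)
            cont += 1
            succ += zf > z_final
    return succ / cont
```

Mixing several drift values with prior weights, as the paper does for the bevacizumab effect, amounts to passing a longer `prior` list.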

  15. A Monte Carlo–Based Bayesian Approach for Measuring Agreement in a Qualitative Scale

    PubMed Central

    Pérez Sánchez, Carlos Javier

    2014-01-01

    Agreement analysis has been an active research area whose techniques have been widely applied in psychology and other fields. However, statistical agreement among raters has been mainly considered from a classical statistics point of view. Bayesian methodology is a viable alternative that allows the inclusion of subjective initial information coming from expert opinions, personal judgments, or historical data. A Bayesian approach is proposed by providing a unified Monte Carlo–based framework to estimate all types of measures of agreement in a qualitative scale of response. The approach is conceptually simple and it has a low computational cost. Both informative and non-informative scenarios are considered. In case no initial information is available, the results are in line with the classical methodology, but providing more information on the measures of agreement. For the informative case, some guidelines are presented to elicit the prior distribution. The approach has been applied to two applications related to schizophrenia diagnosis and sensory analysis. PMID:29881002

  16. Informative priors on fetal fraction increase power of the noninvasive prenatal screen.

    PubMed

    Xu, Hanli; Wang, Shaowei; Ma, Lin-Lin; Huang, Shuai; Liang, Lin; Liu, Qian; Liu, Yang-Yang; Liu, Ke-Di; Tan, Ze-Min; Ban, Hao; Guan, Yongtao; Lu, Zuhong

    2017-11-09

    Purpose: Noninvasive prenatal screening (NIPS) sequences a mixture of the maternal and fetal cell-free DNA. Fetal trisomy can be detected by examining chromosomal dosages estimated from sequencing reads. The traditional method uses the Z-test, which compares a subject against a set of euploid controls, where the information of fetal fraction is not fully utilized. Here we present a Bayesian method that leverages informative priors on the fetal fraction. Method: Our Bayesian method combines the Z-test likelihood and informative priors of the fetal fraction, which are learned from the sex chromosomes, to compute Bayes factors. The Bayesian framework can account for nongenetic risk factors through the prior odds, and our method can report individual positive/negative predictive values. Results: Our Bayesian method has more power than the Z-test method. We analyzed 3,405 NIPS samples and spotted at least 9 (of 51) possible Z-test false positives. Conclusion: Bayesian NIPS is more powerful than the Z-test method, is able to account for nongenetic risk factors through prior odds, and can report individual positive/negative predictive values. Genetics in Medicine advance online publication, 9 November 2017; doi:10.1038/gim.2017.186.
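The Bayes factor computation can be sketched by discretizing the fetal-fraction prior. The sketch assumes the expected z-shift under trisomy is proportional to the fetal fraction through a depth-dependent constant `k`; that constant and the prior grid are hypothetical placeholders, since the paper derives both from the sequencing data itself.

```python
import math

def normal_pdf(x, mu, sd=1.0):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def trisomy_bayes_factor(z, fetal_fractions, prior_weights, k=30.0):
    """Bayes factor for trisomy vs. euploid from a chromosomal dosage z-score.

    fetal_fractions / prior_weights: discretized informative prior on the
    fetal fraction f (learned from the sex chromosomes, per the paper).
    k is a hypothetical constant mapping f to the expected z-shift under
    trisomy; in practice it depends on sequencing depth.
    """
    total = sum(prior_weights)
    # Marginal likelihood under trisomy: average N(z; k*f, 1) over the prior
    lik_trisomy = sum(w * normal_pdf(z, k * f)
                      for f, w in zip(fetal_fractions, prior_weights)) / total
    lik_euploid = normal_pdf(z, 0.0)
    return lik_trisomy / lik_euploid
```

A borderline z-score yields a smaller Bayes factor for a subject whose prior says the fetal fraction is high (a true trisomy should then have shifted z further), which is the intuition behind the reduction in Z-test false positives.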

  17. A randomized trial of pictorial versus prose-based medication information pamphlets.

    PubMed

    Thompson, Andrew E; Goldszmidt, Mark A; Schwartz, Alan J; Bashook, Philip G

    2010-03-01

    The goal of this study was to compare prose and pictorial-based information pamphlets about the medication methotrexate in the domains of free recall, cued recall, comprehension and utility. A single blind, randomized trial of picture versus prose-based information pamphlets included 100 participants aged 18-65 years who had not completed high school, could read English, and had no prior knowledge about methotrexate. Superiority of pamphlet type was assessed using immediate free recall, cued recall and comprehension. There were no differences between picture and prose pamphlets in free recall, cued recall, and comprehension either immediately or after a 1-week interval. Immediate free recall of important information was 17-26%; free recall fell to 7-16% after 1 week. The pictorial pamphlet was preferred over the prose-based pamphlet. This study found no benefit in free recall, cued recall, or comprehension from the addition of pictograms to a simple prose-based medication pamphlet. Even simple medication information pamphlets that have been assessed for patients' ability to comprehend them cannot serve in clinical practice as the sole means of conveying important medication-related information to patients. Copyright 2010 Elsevier Ireland Ltd. All rights reserved.

  18. Interviews of living kidney donors to assess donation-related concerns and information-gathering practices.

    PubMed

    Ruck, Jessica M; Van Pilsum Rasmussen, Sarah E; Henderson, Macey L; Massie, Allan B; Segev, Dorry L

    2018-06-08

    Efforts are underway to improve living kidney donor (LKD) education, but current LKD concerns and information-gathering preferences have not been ascertained to inform evidence-based resource development. As a result, prior studies have found that donors desire information that is not included in current informed consent and/or educational materials. We conducted semi-structured interviews with 50 LKDs who donated at our center to assess (1) concerns about donation that they either had personally before or after donation or heard from family members or friends, (2) information that they had desired before donation, and (3) where they sought information about donation. We used thematic analysis of verbatim interview transcriptions to identify donation-related concerns. We compared the demographic characteristics of participants reporting specific concerns using Fisher's exact test. We identified 19 unique concerns that participants had or heard about living kidney donation. 20% of participants reported having had no pre-donation concerns; 38% reported no post-donation concerns. The most common concern pre-donation was future kidney failure (22%), post-donation was the recovery process (24%), and from family was endangering their family unit (16%). 44% of participants reported being less concerned than family. 26% of participants wished they had had additional information prior to donating, including practical advice for recovery (10%) and information about specific complications (14%). Caucasian participants were more likely to hear at least one concern from family (76% vs. 33%, p = 0.02). The most commonly consulted educational resources were health care providers (100%) and websites (79% of donors since 2000). 26% of participants had had contact with other donors; an additional 20% desired contact with other LKDs. Potential donors not only have personal donation-related concerns but frequently hear donation-related concerns from family members and friends. 
Current gaps in donor education include an absence of practical, peer-to-peer advice about donation from prior donors and of materials directed at potential donors' family members and friends. These findings can inform the development of new educational practices and resources targeted not only at LKDs but also at their social networks.

  19. 22 CFR 129.8 - Prior notification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Prior notification. 129.8 Section 129.8 Foreign Relations DEPARTMENT OF STATE INTERNATIONAL TRAFFIC IN ARMS REGULATIONS REGISTRATION AND LICENSING OF...,000, except for sharing of basic marketing information (e.g., information that does not include...

  20. Damage of composite structures: Detection technique, dynamic response and residual strength

    NASA Astrophysics Data System (ADS)

    Lestari, Wahyu

    2001-10-01

    Reliable and accurate health monitoring techniques can prevent catastrophic failures of structures. Conventional damage detection methods are based on visual or localized experimental methods, very often require prior information concerning the vicinity of the damage or defect, and demand that the structure be readily accessible for inspection; they are also labor intensive. In comparison, health-monitoring techniques based on the structural dynamic response offer unique information on failure of structures. However, systematic relations between the experimental data and the defect are not available, and frequently the number of vibration modes needed for an accurate identification of defects is much higher than the number of modes that can be readily identified in experiments. This motivated us to develop an experimental-data-based detection method with systematic relationships between the experimentally identified information and the analytical or mathematical model representing the defective structure. The developed technique uses changes in vibrational curvature modes and natural frequencies. To avoid misinterpretation of the identified information, we also need to understand the effects of defects on the structural dynamic response prior to developing health-monitoring techniques. In this thesis work we focus on two types of defects in composite structures, namely delamination and edge-notch-like defects. Effects of nonlinearity due to the presence of a defect and due to axial stretching are studied for beams with delamination. Once defects are detected in a structure, the next concern is determining their effect on the strength of the structure and its residual stiffness under dynamic loading. In this thesis, the energy release rate due to dynamic loading in a delaminated structure is studied, which will be a foundation toward determining the residual strength of the structure.
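A curvature-mode damage indicator of the kind the abstract describes can be sketched with central differences: curvature is the second spatial derivative of the displacement mode shape, and the absolute change in curvature between the intact and damaged states peaks near the damage location. This is a generic sketch of the technique, not the thesis's specific formulation.

```python
def curvature_damage_index(mode_intact, mode_damaged, dx):
    """Absolute change in modal curvature along a beam.

    Curvature is approximated at interior points by central differences
    of the displacement mode shape sampled at spacing dx; peaks in the
    returned index flag candidate damage locations.
    """
    def curvature(mode):
        return [(mode[i - 1] - 2 * mode[i] + mode[i + 1]) / dx**2
                for i in range(1, len(mode) - 1)]
    return [abs(d - u) for d, u in
            zip(curvature(mode_damaged), curvature(mode_intact))]
```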

  1. Modelling heterogeneity variances in multiple treatment comparison meta-analysis--are informative priors the better solution?

    PubMed

    Thorlund, Kristian; Thabane, Lehana; Mills, Edward J

    2013-01-11

    Multiple treatment comparison (MTC) meta-analyses are commonly modeled in a Bayesian framework, and weakly informative priors are typically preferred to mirror familiar data driven frequentist approaches. Random-effects MTCs have commonly modeled heterogeneity under the assumption that the between-trial variance for all involved treatment comparisons are equal (i.e., the 'common variance' assumption). This approach 'borrows strength' for heterogeneity estimation across treatment comparisons, and thus adds valuable precision when data are sparse. The homogeneous variance assumption, however, is unrealistic and can severely bias variance estimates. Consequently, 95% credible intervals may not retain nominal coverage, and treatment rank probabilities may become distorted. Relaxing the homogeneous variance assumption may be equally problematic due to reduced precision. To regain good precision, moderately informative variance priors or additional mathematical assumptions may be necessary. In this paper we describe four novel approaches to modeling heterogeneity variance - two novel model structures, and two approaches for use of moderately informative variance priors. We examine the relative performance of all approaches in two illustrative MTC data sets. We particularly compare between-study heterogeneity estimates and model fits, treatment effect estimates and 95% credible intervals, and treatment rank probabilities. In both data sets, use of moderately informative variance priors constructed from the pairwise meta-analysis data yielded the best model fit and narrower credible intervals. Imposing consistency equations on variance estimates, assuming variances to be exchangeable, or using empirically informed variance priors also yielded good model fits and narrow credible intervals. The homogeneous variance model yielded high precision at all times, but overall inadequate estimates of between-trial variances. 
Lastly, treatment rankings were similar among the novel approaches, but considerably different when compared with the homogeneous variance approach. MTC models using a homogeneous variance structure appear to perform sub-optimally when between-trial variances vary between comparisons. Using informative variance priors, assuming exchangeability or imposing consistency between heterogeneity variances can all ensure sufficiently reliable and realistic heterogeneity estimation, and thus more reliable MTC inferences. All four approaches should be viable candidates for replacing or supplementing the conventional homogeneous variance MTC model, which is currently the most widely used in practice.

  2. RFID-based information visibility for hospital operations: exploring its positive effects using discrete event simulation.

    PubMed

    Asamoah, Daniel A; Sharda, Ramesh; Rude, Howard N; Doran, Derek

    2016-10-12

    Long queues and wait times often occur at hospitals and affect smooth delivery of health services. To improve hospital operations, prior studies have developed scheduling techniques to minimize patient wait times. However, these studies fall short of demonstrating how such techniques respond to the real-time information needs of hospitals and efficiently manage wait times. This article presents a multi-method study on the positive impact of providing real-time scheduling information to patients using the RFID technology. Using a simulation methodology, we present a generic scenario, which can be mapped to real-life situations, where patients can select the order of laboratory services. The study shows that information visibility offered by RFID technology results in decreased wait times and improves resource utilization. We also discuss the applicability of the results based on field interviews granted by hospital clinicians and administrators on the perceived barriers and benefits of an RFID system.

  3. The neural dynamics of updating person impressions

    PubMed Central

    Cai, Yang; Todorov, Alexander

    2013-01-01

    Person perception is a dynamic, evolving process. Because other people are an endless source of social information, people need to update their impressions of others based upon new information. We devised an fMRI study to identify brain regions involved in updating impressions. Participants saw faces paired with valenced behavioral information and were asked to form impressions of these individuals. Each face was seen five times in a row, each time with a different behavioral description. Critically, for half of the faces the behaviors were evaluatively consistent, while for the other half they were inconsistent. In line with prior work, dorsomedial prefrontal cortex (dmPFC) was associated with forming impressions of individuals based on behavioral information. More importantly, a whole-brain analysis revealed a network of other regions associated with updating impressions of individuals who exhibited evaluatively inconsistent behaviors, including rostrolateral PFC, superior temporal sulcus, right inferior parietal lobule and posterior cingulate cortex. PMID:22490923

  4. Use of Airport Noise Complaint Files to Improve Understanding of Community Response to Aircraft Noise

    NASA Technical Reports Server (NTRS)

    Fidell, Sanford; Howe, Richard

    1998-01-01

    This study assessed the feasibility of using complaint information archived by modern airport monitoring systems to conduct quantitative analyses of the causes of aircraft noise complaints and their relationship to noise-induced annoyance. It was found that all computer-based airport monitoring systems provide at least rudimentary tools for performing data base searches by complainant name, address, date, time of day, and types of aircraft and complaints. Analyses of such information can shed light on longstanding concerns, such as the extent to which complaint rates are driven by objectively measurable aspects of aircraft operations; the degree to which changes in complaint rates can be predicted prior to implementation of noise mitigation measures; and the degree to which aircraft complaint information can be used to simplify and otherwise improve prediction of the prevalence of noise-induced annoyance in communities.

  5. The Best of Both Worlds: The Benefits of Open-specialized and Closed-diverse Syndication Networks for New Ventures' Success.

    PubMed

    Ter Wal, Anne L J; Alexy, Oliver; Block, Jörn; Sandner, Philipp G

    2016-09-01

    Open networks give actors non-redundant information that is diverse, while closed networks offer redundant information that is easier to interpret. Integrating arguments about network structure and the similarity of actors' knowledge, we propose two types of network configurations that combine diversity and ease of interpretation. Closed-diverse networks offer diversity in actors' knowledge domains and shared third-party ties to help in interpreting that knowledge. In open-specialized networks, structural holes offer diversity, while shared interpretive schema and overlap between received information and actors' prior knowledge help in interpreting new information without the help of third parties. In contrast, actors in open-diverse networks suffer from information overload due to the lack of shared schema or overlapping prior knowledge for the interpretation of diverse information, and actors in closed-specialized networks suffer from overembeddedness because they cannot access diverse information. Using CrunchBase data on early-stage venture capital investments in the U.S. information technology sector, we test the effect of investors' social capital on the success of their portfolio ventures. We find that ventures have the highest chances of success if their syndicating investors have either open-specialized or closed-diverse networks. These effects are manifested beyond the direct effects of ventures' or investors' quality and are robust to controlling for the possibility that certain investors could have chosen more promising ventures at the time of first funding.

  6. Combining sky and earth: desert ants (Melophorus bagoti) show weighted integration of celestial and terrestrial cues.

    PubMed

    Legge, Eric L G; Wystrach, Antoine; Spetch, Marcia L; Cheng, Ken

    2014-12-01

    Insects typically use celestial sources of directional information for path integration, and terrestrial panoramic information for view-based navigation. Here we set celestial and terrestrial sources of directional information in conflict for homing desert ants (Melophorus bagoti). In the first experiment, ants learned to navigate out of a round experimental arena with a distinctive artificial panorama. On crucial tests, we rotated the arena to create a conflict between the artificial panorama and celestial information. In a second experiment, ants at a feeder in their natural visually-cluttered habitat were displaced prior to their homing journey so that the dictates of path integration (feeder to nest direction) based on a celestial compass conflicted with the dictates of view-based navigation (release point to nest direction) based on the natural terrestrial panorama. In both experiments, ants generally headed in a direction intermediate to the dictates of celestial and terrestrial information. In the second experiment, the ants put more weight on the terrestrial cues when they provided better directional information. We conclude that desert ants weight and integrate the dictates of celestial and terrestrial information in determining their initial heading, even when the two directional cues are highly discrepant. © 2014. Published by The Company of Biologists Ltd.
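    The weighted integration the authors describe is commonly formalized as a vector sum of unit heading vectors scaled by cue weights; the compromise heading is the direction of the resultant. A minimal sketch (the headings and weights are illustrative, not taken from the study):

```python
import math

def integrate_headings(theta_a, theta_b, w_a, w_b):
    """Weighted vector-sum integration of two directional cues.

    Headings are in degrees; the weights reflect the relative reliability
    given to each cue. Returns the intermediate compromise heading.
    """
    x = w_a * math.cos(math.radians(theta_a)) + w_b * math.cos(math.radians(theta_b))
    y = w_a * math.sin(math.radians(theta_a)) + w_b * math.sin(math.radians(theta_b))
    return math.degrees(math.atan2(y, x)) % 360.0

# Equal weights: a 0-degree celestial dictate and a 90-degree terrestrial
# dictate yield an intermediate 45-degree heading.
print(integrate_headings(0.0, 90.0, 0.5, 0.5))
# Up-weighting the celestial cue pulls the heading toward 0 degrees.
print(integrate_headings(0.0, 90.0, 0.8, 0.2))
```

    As the terrestrial weight grows, the resultant rotates continuously toward the terrestrial dictate, matching the "more weight on the terrestrial cues when they provided better directional information" pattern.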

  7. Methods and systems for detecting abnormal digital traffic

    DOEpatents

    Goranson, Craig A [Kennewick, WA]; Burnette, John R [Kennewick, WA]

    2011-03-22

    Aspects of the present invention encompass methods and systems for detecting abnormal digital traffic by assigning characterizations of network behaviors according to knowledge nodes and calculating a confidence value based on the characterizations from at least one knowledge node and on weighting factors associated with the knowledge nodes. The knowledge nodes include a characterization model based on prior network information. At least one of the knowledge nodes should not be based on fixed thresholds or signatures. The confidence value includes a quantification of the degree of confidence that the network behaviors constitute abnormal network traffic.
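    The patent summary does not give the aggregation formula; one plausible reading of "a confidence value based on the characterizations ... and on weighting factors" is a weighted average of per-node abnormality scores. A minimal sketch with hypothetical node outputs:

```python
def confidence(characterizations, weights):
    """Weighted aggregation of per-node abnormality scores in [0, 1].

    characterizations: each knowledge node's judgement that the observed
    network behavior is abnormal; weights: per-node weighting factors.
    """
    total = sum(weights)
    return sum(c * w for c, w in zip(characterizations, weights)) / total

# Three hypothetical knowledge nodes; the second is weighted most heavily.
print(confidence([0.2, 0.9, 0.6], [1.0, 3.0, 1.0]))
```

    The result stays in [0, 1] and so can be read directly as a degree of confidence that the traffic is abnormal.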

  8. Cortical Measures of Phoneme-Level Speech Encoding Correlate with the Perceived Clarity of Natural Speech

    PubMed Central

    2018-01-01

    In real-world environments, humans comprehend speech by actively integrating prior knowledge (P) and expectations with sensory input. Recent studies have revealed effects of prior information in temporal and frontal cortical areas and have suggested that these effects are underpinned by enhanced encoding of speech-specific features, rather than a broad enhancement or suppression of cortical activity. However, in terms of the specific hierarchical stages of processing involved in speech comprehension, the effects of integrating bottom-up sensory responses and top-down predictions are still unclear. In addition, it is unclear whether the predictability that comes with prior information may differentially affect speech encoding relative to the perceptual enhancement that comes with that prediction. One way to investigate these issues is through examining the impact of P on indices of cortical tracking of continuous speech features. Here, we did this by presenting participants with degraded speech sentences that either were or were not preceded by a clear recording of the same sentences while recording non-invasive electroencephalography (EEG). We assessed the impact of prior information on an isolated index of cortical tracking that reflected phoneme-level processing. Our findings suggest the possibility that prior information affects the early encoding of natural speech in a dual manner. Firstly, the availability of prior information, as hypothesized, enhanced the perceived clarity of degraded speech, which was positively correlated with changes in phoneme-level encoding across subjects. In addition, P induced an overall reduction of this cortical measure, which we interpret as resulting from the increase in predictability. PMID:29662947

  9. The Effects of Prior Knowledge Activation on Free Recall and Study Time Allocation.

    ERIC Educational Resources Information Center

    Machiels-Bongaerts, Maureen; And Others

    The effects of mobilizing prior knowledge on information processing were studied. Two hypotheses, the cognitive set-point hypothesis and the selective attention hypothesis, try to account for the facilitation effects of prior knowledge activation. These hypotheses predict different recall patterns as a result of mobilizing prior knowledge. In…

  10. 21 CFR 1.280 - How must you submit prior notice?

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... to FDA. You must submit all prior notice information in the English language, except that an... Commercial System (ABI/ACS); or (2) The FDA PNSI at http://www.access.fda.gov. You must submit prior notice through the FDA Prior Notice System Interface (FDA PNSI) for articles of food imported or offered for...

  11. The Counter-Intuitive Non-Informative Prior for the Bernoulli Family

    ERIC Educational Resources Information Center

    Zhu, Mu; Lu, Arthur Y.

    2004-01-01

    In Bayesian statistics, the choice of the prior distribution is often controversial. Different rules for selecting priors have been suggested in the literature, which, sometimes, produce priors that are difficult for the students to understand intuitively. In this article, we use a simple heuristic to illustrate to the students the rather…

  12. Analyzing small data sets using Bayesian estimation: the case of posttraumatic stress symptoms following mechanical ventilation in burn survivors

    PubMed Central

    van de Schoot, Rens; Broere, Joris J.; Perryck, Koen H.; Zondervan-Zwijnenburg, Mariëlle; van Loey, Nancy E.

    2015-01-01

    Background The analysis of small data sets in longitudinal studies can lead to power issues and often suffers from biased parameter values. These issues can be solved by using Bayesian estimation in conjunction with informative prior distributions. By means of a simulation study and an empirical example concerning posttraumatic stress symptoms (PTSS) following mechanical ventilation in burn survivors, we demonstrate the advantages and potential pitfalls of using Bayesian estimation. Methods First, we show how to specify prior distributions, and by means of a sensitivity analysis we demonstrate how to check the exact influence of the prior (mis-)specification. Thereafter, we show by means of a simulation the situations in which the Bayesian approach outperforms the default maximum likelihood approach. Finally, we re-analyze empirical data on burn survivors which provided preliminary evidence of an aversive influence of a period of mechanical ventilation on the course of PTSS following burns. Results Not surprisingly, maximum likelihood estimation showed insufficient coverage as well as power with very small samples. Only when Bayesian analysis was used in conjunction with informative priors did power increase to acceptable levels. As expected, we showed that the smaller the sample size, the more the results rely on the prior specification. Conclusion We show that two issues often encountered during analysis of small samples, power and biased parameters, can be solved by including prior information into Bayesian analysis. We argue that the use of informative priors should always be reported together with a sensitivity analysis. PMID:25765534

  13. Analyzing small data sets using Bayesian estimation: the case of posttraumatic stress symptoms following mechanical ventilation in burn survivors.

    PubMed

    van de Schoot, Rens; Broere, Joris J; Perryck, Koen H; Zondervan-Zwijnenburg, Mariëlle; van Loey, Nancy E

    2015-01-01

    Background: The analysis of small data sets in longitudinal studies can lead to power issues and often suffers from biased parameter values. These issues can be solved by using Bayesian estimation in conjunction with informative prior distributions. By means of a simulation study and an empirical example concerning posttraumatic stress symptoms (PTSS) following mechanical ventilation in burn survivors, we demonstrate the advantages and potential pitfalls of using Bayesian estimation. Methods: First, we show how to specify prior distributions, and by means of a sensitivity analysis we demonstrate how to check the exact influence of the prior (mis-)specification. Thereafter, we show by means of a simulation the situations in which the Bayesian approach outperforms the default maximum likelihood approach. Finally, we re-analyze empirical data on burn survivors which provided preliminary evidence of an aversive influence of a period of mechanical ventilation on the course of PTSS following burns. Results: Not surprisingly, maximum likelihood estimation showed insufficient coverage as well as power with very small samples. Only when Bayesian analysis was used in conjunction with informative priors did power increase to acceptable levels. As expected, we showed that the smaller the sample size, the more the results rely on the prior specification. Conclusion: We show that two issues often encountered during analysis of small samples, power and biased parameters, can be solved by including prior information into Bayesian analysis. We argue that the use of informative priors should always be reported together with a sensitivity analysis.
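    The shrinkage-and-sensitivity pattern described above can be illustrated with a conjugate normal model (a deliberate simplification of the authors' MCMC-based analysis; the data and prior settings below are invented for the sketch):

```python
import numpy as np

def posterior_mean(y, prior_mean, prior_sd, sigma=1.0):
    """Posterior mean of a normal mean under a conjugate normal prior.

    sigma is the (assumed known) residual standard deviation.
    """
    n = len(y)
    prior_prec = 1.0 / prior_sd ** 2       # prior precision
    data_prec = n / sigma ** 2             # data precision
    return (prior_prec * prior_mean + data_prec * np.mean(y)) / (prior_prec + data_prec)

small = np.array([0.4, 0.6, 0.5, 0.7, 0.3])   # sparse data, sample mean 0.5
large = np.full(500, 0.5)                      # ample data, same mean

# Sensitivity analysis: refit under increasingly diffuse priors centred at 0.
for prior_sd in (0.25, 1.0, 4.0):
    print(f"prior sd {prior_sd:4.2f}: "
          f"n=5 -> {posterior_mean(small, 0.0, prior_sd):.3f}, "
          f"n=500 -> {posterior_mean(large, 0.0, prior_sd):.3f}")
```

    With n = 5 the estimate swings substantially across the three priors, while with n = 500 it barely moves: exactly the dependence on prior specification that the abstract argues should always be reported.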

  14. An information theory criteria based blind method for enumerating active users in DS-CDMA system

    NASA Astrophysics Data System (ADS)

    Samsami Khodadad, Farid; Abed Hodtani, Ghosheh

    2014-11-01

    In this paper, a new and blind algorithm for active user enumeration in asynchronous direct sequence code division multiple access (DS-CDMA) in a multipath channel scenario is proposed. The proposed method is based on information theory criteria. There are two main categories of information criteria widely used in active user enumeration: the Akaike Information Criterion (AIC) and the Minimum Description Length (MDL) criterion. The main difference between these two criteria is their penalty functions. Due to this difference, MDL is a consistent enumerator with better performance at higher signal-to-noise ratios (SNR), whereas AIC is preferred at lower SNRs. In the sequel, we propose an SNR-compliant method based on subspace and training genetic algorithms to obtain the advantages of both. Moreover, our method uses only a single antenna, unlike previous methods, which decreases hardware complexity. Simulation results show that the proposed method is capable of estimating the number of active users without any prior knowledge, and demonstrate the efficiency of the method.
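    The record does not include the authors' subspace/genetic-algorithm method, but the eigenvalue-based AIC/MDL enumeration it builds on (due to Wax and Kailath) can be sketched as follows; the sensor count, source count, and powers are illustrative:

```python
import numpy as np

def enumerate_sources(eigvals, n_snapshots):
    """Estimate model order via the Wax-Kailath AIC and MDL criteria.

    eigvals: eigenvalues of the sample covariance matrix (any order).
    Returns (aic_estimate, mdl_estimate).
    """
    lam = np.sort(np.asarray(eigvals))[::-1]
    p = len(lam)
    aic, mdl = [], []
    for k in range(p):
        tail = lam[k:]                 # candidate noise eigenvalues
        m = p - k
        # log ratio of arithmetic to geometric mean of the noise eigenvalues
        ll = n_snapshots * m * np.log(tail.mean() / np.exp(np.mean(np.log(tail))))
        penalty = k * (2 * p - k)      # free parameters of a rank-k model
        aic.append(2 * ll + 2 * penalty)
        mdl.append(ll + 0.5 * penalty * np.log(n_snapshots))
    return int(np.argmin(aic)), int(np.argmin(mdl))

# Toy scenario: 8 sensors, 3 active sources, 200 snapshots, unit noise power.
rng = np.random.default_rng(7)
A = rng.normal(size=(8, 3))                          # mixing matrix
S = rng.normal(scale=np.sqrt(10.0), size=(3, 200))   # source signals
X = A @ S + rng.normal(size=(8, 200))                # noisy observations
eigvals = np.linalg.eigvalsh(X @ X.T / 200)
print(enumerate_sources(eigvals, 200))
```

    MDL's heavier, sample-size-dependent penalty is what makes it consistent at high SNR, while AIC's lighter fixed penalty tends to overestimate, matching the trade-off the abstract describes.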

  15. A hybrid optimization approach to the estimation of distributed parameters in two-dimensional confined aquifers

    USGS Publications Warehouse

    Heidari, M.; Ranjithan, S.R.

    1998-01-01

    In using non-linear optimization techniques for estimation of parameters in a distributed ground water model, the initial values of the parameters and prior information about them play important roles. In this paper, the genetic algorithm (GA) is combined with the truncated-Newton search technique to estimate groundwater parameters for a confined steady-state ground water model. Use of prior information about the parameters is shown to be important in estimating correct or near-correct values of parameters on a regional scale. The amount of prior information needed for an accurate solution is estimated by evaluation of the sensitivity of the performance function to the parameters. For the example presented here, it is experimentally demonstrated that only one piece of prior information of the least sensitive parameter is sufficient to arrive at the global or near-global optimum solution. For hydraulic head data with measurement errors, the error in the estimation of parameters increases as the standard deviation of the errors increases. Results from our experiments show that, in general, the accuracy of the estimated parameters depends on the level of noise in the hydraulic head data and the initial values used in the truncated-Newton search technique.
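    The hybrid strategy, a global evolutionary search whose best individual seeds a monotone local refinement, can be sketched on a synthetic multimodal objective. This is not the authors' groundwater model, and a simple coordinate search stands in for the truncated-Newton stage:

```python
import numpy as np

def rastrigin(x):
    """Multimodal test function standing in for the model-fit objective."""
    return 10 * len(x) + np.sum(x ** 2 - 10 * np.cos(2 * np.pi * x))

def ga_search(fn, bounds, pop=40, gens=60, seed=0):
    """Very small elitist GA: tournament selection, blend crossover, mutation."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    P = rng.uniform(lo, hi, size=(pop, 2))
    best, fbest = None, np.inf
    for _ in range(gens):
        fit = np.array([fn(p) for p in P])
        k = int(np.argmin(fit))
        if fit[k] < fbest:
            best, fbest = P[k].copy(), fit[k]
        i, j = rng.integers(pop, size=(2, pop))
        parents = np.where((fit[i] < fit[j])[:, None], P[i], P[j])
        alpha = rng.uniform(size=(pop, 1))
        P = alpha * parents + (1 - alpha) * parents[::-1]  # blend crossover
        P += rng.normal(scale=0.1, size=P.shape)           # mutation
        P = np.clip(P, lo, hi)
        P[0] = best                                        # elitism
    return best

def local_refine(fn, x0, iters=200):
    """Monotone refinement via shrinking coordinate steps (stand-in for
    the truncated-Newton stage); only improving steps are accepted."""
    x, fx, step = x0.copy(), fn(x0), 0.5
    for _ in range(iters):
        improved = False
        for d in range(len(x)):
            for sgn in (1.0, -1.0):
                trial = x.copy()
                trial[d] += sgn * step
                ft = fn(trial)
                if ft < fx:
                    x, fx, improved = trial, ft, True
        if not improved:
            step *= 0.5
    return x, fx

x_ga = ga_search(rastrigin, (-5.0, 5.0))   # GA supplies the initial values
x_opt, f_opt = local_refine(rastrigin, x_ga)
print(x_opt, f_opt)
```

    The GA's role mirrors the paper's use of prior information: it supplies a good starting point, on which the local search's final accuracy strongly depends.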

  16. Selecting team players: Considering the impact of contextual performance and workplace deviance on selection decisions in the National Football League.

    PubMed

    Whiting, Steven W; Maynes, Timothy D

    2016-04-01

    Contextual performance and workplace deviance likely influence team functioning and effectiveness and should therefore be considered when evaluating job candidates for team-based roles. However, obtaining this information is difficult given a lack of reliable sources and the desire of job applicants to present themselves in a favorable light. Thus, it is unknown whether those selecting employees for teams incorporate prior contextual performance and workplace deviance into their evaluations, or whether doing so improves the quality of selection decisions. To address these issues, we examined the impact of prior task performance, contextual performance, and workplace deviance on National Football League (NFL) decision maker (organizational insider) and external expert (organizational outsider) evaluations of college football players in the NFL draft, using a content analysis methodology to generate measures of contextual performance and workplace deviance. Our findings indicate that insiders value contextual performance more than outsiders, which is likely because of differing interests and goals that lead to different levels of motivation and/or ability to acquire information about prior contextual performance. We also propose that prior task performance, contextual performance, and workplace deviance will predict player performance in the NFL. Our results support this prediction for task and contextual performance. In addition, we investigated the quality of insider and outsider judgments using Brunswik's (1952) lens model. Implications of our findings for the team selection, contextual performance, and workplace deviance literatures are discussed. (c) 2016 APA, all rights reserved.

  17. Benford's law and the FSD distribution of economic behavioral micro data

    NASA Astrophysics Data System (ADS)

    Villas-Boas, Sofia B.; Fu, Qiuzi; Judge, George

    2017-11-01

    In this paper, we focus on the first significant digit (FSD) distribution of European micro income data and use information-theoretic, entropy-based methods to investigate the degree to which Benford's FSD law is consistent with the nature of these economic behavioral systems. We demonstrate that Benford's law is not an empirical phenomenon that occurs only in important distributions in physical statistics, but that it also arises in self-organizing dynamic economic behavioral systems. The empirical likelihood member of the minimum divergence-entropy family is used to recover country-based income FSD probability density functions and to demonstrate the implications of using a Benford prior reference distribution in economic behavioral system information recovery.
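    A minimal sketch of the FSD machinery: the Benford reference distribution plus a divergence check against an empirical digit distribution. This omits the paper's empirical-likelihood estimation, and powers of two stand in for the income data:

```python
import math
from collections import Counter

# Benford reference: P(d) = log10(1 + 1/d) for first digits d = 1..9.
BENFORD = {d: math.log10(1 + 1 / d) for d in range(1, 10)}

def fsd_distribution(values):
    """Empirical first-significant-digit distribution of positive integers."""
    counts = Counter(int(str(v)[0]) for v in values)
    n = len(values)
    return {d: counts.get(d, 0) / n for d in range(1, 10)}

def kl_divergence(p, q):
    """KL divergence D(p || q) over digits 1-9."""
    return sum(p[d] * math.log(p[d] / q[d]) for d in range(1, 10) if p[d] > 0)

# Powers of two are a classic Benford-conforming sequence.
emp = fsd_distribution([2 ** n for n in range(1, 1001)])
print(kl_divergence(emp, BENFORD))
```

    A divergence near zero indicates the empirical FSD distribution is consistent with the Benford prior reference distribution; the same comparison could be run on any column of micro data.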

  18. Maximally Informative Stimuli and Tuning Curves for Sigmoidal Rate-Coding Neurons and Populations

    NASA Astrophysics Data System (ADS)

    McDonnell, Mark D.; Stocks, Nigel G.

    2008-08-01

    A general method for deriving maximally informative sigmoidal tuning curves for neural systems with small normalized variability is presented. The optimal tuning curve is a nonlinear function of the cumulative distribution function of the stimulus and depends on the mean-variance relationship of the neural system. The derivation is based on a known relationship between Shannon's mutual information and Fisher information, and the optimality of the Jeffreys prior. It relies on the existence of closed-form solutions to the converse problem of optimizing the stimulus distribution for a given tuning curve. It is shown that maximum mutual information corresponds to constant Fisher information only if the stimulus is uniformly distributed. As an example, the case of sub-Poisson binomial firing statistics is analyzed in detail.
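    For the special case of additive noise with constant variance, the "tuning curve as a function of the stimulus CDF" result reduces to classic histogram equalization: the firing rate tracks the stimulus CDF, so every response level is used equally often. A sketch under that assumption (R_MAX and the Gaussian stimulus are illustrative; the paper derives generalizations for other mean-variance relationships):

```python
import math
import random

R_MAX = 100.0  # assumed peak firing rate (spikes/s); illustrative only

def stimulus_cdf(s, mu=0.0, sd=1.0):
    """CDF of the (here Gaussian) stimulus distribution."""
    return 0.5 * (1.0 + math.erf((s - mu) / (sd * math.sqrt(2.0))))

def tuning_curve(s):
    """Infomax sigmoidal tuning for constant additive noise:
    firing rate proportional to the stimulus CDF."""
    return R_MAX * stimulus_cdf(s)

random.seed(1)
stimuli = [random.gauss(0.0, 1.0) for _ in range(20000)]
rates = [tuning_curve(s) for s in stimuli]

# The normalized response is approximately uniform on [0, 1]: all firing
# levels are used equally often, which is what maximizes information here.
u = [r / R_MAX for r in rates]
print(sum(u) / len(u))
```

    The uniformity of the normalized response is the simple counterpart of the paper's statement that maximum mutual information corresponds to constant Fisher information only for a uniformly distributed (equivalently, CDF-transformed) stimulus.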

  19. A slide down a slippery slope: ethical guidelines in the dissemination of computer-based presentations

    Treesearch

    Patrick C. Tobin; James L. Frazier

    2009-01-01

    The continual development of technology opens many new and exciting doors in all walks of life, including science. Undoubtedly, we all have benefited from the ability to rapidly disseminate and acquire scientific information. Published articles can be downloaded from the Internet even prior to their "actual" publication date, requests for pdf reprints of...

  20. Supporting Middle School Students' Online Reading of Scientific Resources: Moving beyond Cursory, Fragmented, and Opportunistic Reading

    ERIC Educational Resources Information Center

    Zhang, M.

    2013-01-01

    The abundant scientific resources on the Web provide great opportunities for students to expand their science learning, yet easy access to information does not ensure learning. Prior research has found that middle school students tend to read Web-based scientific resources in a shallow, superficial manner. A software tool was designed to support…

  1. Prior Knowledge of Potential School-Based Violence: Information Students Learn May Prevent a Targeted Attack

    ERIC Educational Resources Information Center

    Pollack, William S.; Modzeleski, William; Rooney, Georgeann

    2008-01-01

    In the wake of several high-profile shootings at schools in the United States, most notably the shootings that occurred at Columbine High School on April 20, 1999, the United States Secret Service (Secret Service) and the United States Department of Education (ED) embarked on a collaborative endeavor to study incidents of planned (or…

  2. A Selective Bibliography on Measurement in Library and Information Services.

    ERIC Educational Resources Information Center

    Reynolds, Rose, Comp.

    The aim of this survey, based on material held in the Aslib Library, was to produce a list of items dealing with cost and costings in library services, for use within the Aslib Research Department. Attention has been concentrated on material published since 1960, although a few items prior to this date have been included. Items which are…

  3. International multi-site survey on the use of online support groups in bipolar disorder.

    PubMed

    Bauer, Rita; Conell, Jörn; Glenn, Tasha; Alda, Martin; Ardau, Raffaella; Baune, Bernhard T; Berk, Michael; Bersudsky, Yuly; Bilderbeck, Amy; Bocchetta, Alberto; Bossini, Letizia; Castro, Angela M Paredes; Cheung, Eric Y W; Chillotti, Caterina; Choppin, Sabine; Zompo, Maria Del; Dias, Rodrigo; Dodd, Seetal; Duffy, Anne; Etain, Bruno; Fagiolini, Andrea; Hernandez, Miryam Fernández; Garnham, Julie; Geddes, John; Gildebro, Jonas; Gonzalez-Pinto, Ana; Goodwin, Guy M; Grof, Paul; Harima, Hirohiko; Hassel, Stefanie; Henry, Chantal; Hidalgo-Mazzei, Diego; Kapur, Vaisnvy; Kunigiri, Girish; Lafer, Beny; Larsen, Erik R; Lewitzka, Ute; Licht, Rasmus W; Hvenegaard Lund, Anne; Misiak, Blazej; Piotrowski, Patryk; Monteith, Scott; Munoz, Rodrigo; Nakanotani, Takako; Nielsen, René E; O'donovan, Claire; Okamura, Yasushi; Osher, Yamima; Reif, Andreas; Ritter, Philipp; Rybakowski, Janusz K; Sagduyu, Kemal; Sawchuk, Brett; Schwartz, Elon; Scippa, Ângela M; Slaney, Claire; Sulaiman, Ahmad H; Suominen, Kirsi; Suwalska, Aleksandra; Tam, Peter; Tatebayashi, Yoshitaka; Tondo, Leonardo; Vieta, Eduard; Vinberg, Maj; Viswanath, Biju; Volkert, Julia; Zetin, Mark; Whybrow, Peter C; Bauer, Michael

    2017-08-01

    Peer support is an established component of recovery from bipolar disorder, and online support groups may offer opportunities to expand the use of peer support at the patient's convenience. Prior research in bipolar disorder has reported value from online support groups. The aim of this study was to understand the use of online support groups by patients with bipolar disorder, as part of a larger project about information seeking. The results are based on a one-time, paper-based anonymous survey about information seeking by patients with bipolar disorder, which was translated into 12 languages. The survey was completed between March 2014 and January 2016 and included questions on the use of online support groups. All patients were diagnosed by a psychiatrist. Analysis included descriptive statistics and general estimating equations to account for correlated data. The survey was completed by 1222 patients in 17 countries. The patients used the Internet at a percentage similar to the general public. Of the Internet users who looked online for information about bipolar disorder, only 21.0% read or participated in support groups, chats, or forums for bipolar disorder (12.8% of the total sample). Given the benefits reported in prior research, clarification of the role of online support groups in bipolar disorder is needed. With only a minority of patients using online support groups, there are analytical challenges for future studies.

  4. Biomedical image segmentation using geometric deformable models and metaheuristics.

    PubMed

    Mesejo, Pablo; Valsecchi, Andrea; Marrakchi-Kacem, Linda; Cagnoni, Stefano; Damas, Sergio

    2015-07-01

    This paper describes a hybrid level set approach for medical image segmentation. This new geometric deformable model combines region- and edge-based information with the prior shape knowledge introduced using deformable registration. Our proposal consists of two phases: training and test. The former implies the learning of the level set parameters by means of a Genetic Algorithm, while the latter is the proper segmentation, where another metaheuristic, in this case Scatter Search, derives the shape prior. In an experimental comparison, this approach has shown a better performance than a number of state-of-the-art methods when segmenting anatomical structures from different biomedical image modalities. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Use of collateral information to improve LANDSAT classification accuracies

    NASA Technical Reports Server (NTRS)

    Strahler, A. H. (Principal Investigator)

    1981-01-01

    Methods to improve LANDSAT classification accuracies were investigated, including: (1) the use of prior probabilities in maximum likelihood classification as a methodology to integrate discrete collateral data with continuously measured image density variables; (2) the use of the logit classifier as an alternative to multivariate normal classification that permits mixing both continuous and categorical variables in a single model and fits empirical distributions of observations more closely than the multivariate normal density function; and (3) the use of collateral data in a geographic information system to model a desired output information layer as a function of input layers of raster-format collateral and image data base layers.
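    Method (1), folding prior probabilities into maximum likelihood classification, amounts to adding a log-prior term to each class's Gaussian discriminant, which shifts the decision boundary toward the a priori less likely class. A one-dimensional sketch with invented class means and priors:

```python
import math

def map_classify(x, means, priors, sigma=1.0):
    """Maximum likelihood classification with class prior probabilities:
    pick the class maximizing log prior + Gaussian log likelihood."""
    scores = [math.log(p) - (x - m) ** 2 / (2 * sigma ** 2)
              for m, p in zip(means, priors)]
    return max(range(len(scores)), key=scores.__getitem__)

means = [0.0, 2.0]  # hypothetical per-class means of an image density variable

# With equal priors, x = 1.5 falls on the class-1 side of the boundary;
# a 0.9 prior for class 0 (say, from collateral map data) flips the decision.
print(map_classify(1.5, means, [0.5, 0.5]))  # -> 1
print(map_classify(1.5, means, [0.9, 0.1]))  # -> 0
```

    This is how discrete collateral data (e.g., terrain or land-use strata) can reweight a continuous spectral classifier without changing the likelihood model itself.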

  6. Project Prospector: Unmanned Exploration and Apollo Support Program

    NASA Technical Reports Server (NTRS)

    1969-01-01

    Prior to the establishment of a manned lunar observatory or base, it is essential that a compendium of information be available on the environment, composition, structure, and topography of the moon. In an effort to satisfy this need for improved and detailed information, NASA has undertaken a lunar program which ranges from the utilization of circumlunar flight vehicles, equipped with automatic photographic and radiation measuring equipment which responds to commands from the earth, to actual determination of surface composition and features obtained from unmanned instrumented spacecraft which impact the moon.

  7. 76 FR 19121 - Notice of Submission of Proposed Information Collection to OMB Multifamily Project Applications...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-06

    ... Proposed Information Collection to OMB Multifamily Project Applications and Construction Prior to Initial... facilities is also required as part of the application for firm commitment for mortgage insurance. Project owners/sponsors may apply for permission to commence construction prior to initial endorsement. DATES...

  8. Using heuristic evaluations to assess the safety of health information systems.

    PubMed

    Carvalho, Christopher J; Borycki, Elizabeth M; Kushniruk, Andre W

    2009-01-01

    Health information systems (HISs) are typically seen as a mechanism for reducing medical errors. There is, however, evidence that technology may actually cause errors. As a result, it is crucial to fully test any system prior to its implementation. At present, evidence-based evaluation heuristics do not exist for assessing aspects of interface design that lead to medical errors. A three-phase study was conducted to develop evidence-based heuristics for evaluating interfaces. Phase 1 consisted of a systematic review of the literature. In Phase 2, a comprehensive list of 33 evaluation heuristics was developed based on the review that could be used to test for potential technology-induced errors. Phase 3 involved applying these healthcare-specific heuristics to evaluate a HIS.

  9. The Modified, Multi-patient Observed Simulated Handoff Experience (M-OSHE): Assessment and Feedback for Entering Residents on Handoff Performance.

    PubMed

    Gaffney, Sean; Farnan, Jeanne M; Hirsch, Kristen; McGinty, Michael; Arora, Vineet M

    2016-04-01

    Despite the identification of transfer of patient responsibility as a Core Entrustable Professional Activity for Entering Residency, rigorous methods to evaluate incoming residents' ability to give a verbal handoff of multiple patients are lacking. Our purpose was to implement a multi-patient, simulation-based curriculum to assess verbal handoff performance. The setting was Graduate Medical Education (GME) orientation at an urban, academic medical center. Eighty-four incoming residents from four residency programs participated in the study. The curriculum featured an online training module and a multi-patient observed simulated handoff experience (M-OSHE). Participants verbally "handed off" three mock patients of varying acuity and were evaluated by a trained "receiver" using an expert-informed, five-item checklist. Prior handoff experience in medical school was associated with higher checklist scores (23% none vs. 33% either third OR fourth year vs. 58% third AND fourth year, p = 0.021). Prior training was associated with prioritization of patients based on acuity (12% no training vs. 38% prior training, p = 0.014). All participants agreed that the M-OSHE realistically portrayed a clinical setting. The M-OSHE is a promising strategy for teaching and evaluating entering residents' ability to give verbal handoffs of multiple patients. Prior training and more handoff experience were associated with higher performance, which suggests that additional handoff training in medical school may be of benefit.

  10. Calibrated birth-death phylogenetic time-tree priors for bayesian inference.

    PubMed

    Heled, Joseph; Drummond, Alexei J

    2015-05-01

    Here we introduce a general class of multiple calibration birth-death tree priors for use in Bayesian phylogenetic inference. All tree priors in this class separate ancestral node heights into a set of "calibrated nodes" and "uncalibrated nodes" such that the marginal distribution of the calibrated nodes is user-specified whereas the density ratio of the birth-death prior is retained for trees with equal values for the calibrated nodes. We describe two formulations, one in which the calibration information informs the prior on ranked tree topologies, through the (conditional) prior, and the other which factorizes the prior on divergence times and ranked topologies, thus allowing uniform, or any arbitrary prior distribution on ranked topologies. Although the first of these formulations has some attractive properties, the algorithm we present for computing its prior density is computationally intensive. However, the second formulation is always faster and computationally efficient for up to six calibrations. We demonstrate the utility of the new class of multiple-calibration tree priors using both small simulations and a real-world analysis and compare the results to existing schemes. The two new calibrated tree priors described in this article offer greater flexibility and control of prior specification in calibrated time-tree inference and divergence time dating, and will remove the need for indirect approaches to the assessment of the combined effect of calibration densities and tree priors in Bayesian phylogenetic inference. © The Author(s) 2014. Published by Oxford University Press, on behalf of the Society of Systematic Biologists.

  11. Evidence-informed policy formulation and implementation: a comparative case study of two national policies for improving health and social care in Sweden.

    PubMed

    Strehlenert, H; Richter-Sundberg, L; Nyström, M E; Hasson, H

    2015-12-08

    Evidence has come to play a central role in health policymaking. However, policymakers tend to use other types of information besides research evidence. Most prior studies on evidence-informed policy have focused on the policy formulation phase without a systematic analysis of its implementation. It has been suggested that in order to fully understand the policy process, the analysis should include both policy formulation and implementation. The purpose of the study was to explore and compare two policies aiming to improve health and social care in Sweden and to empirically test a new conceptual model for evidence-informed policy formulation and implementation. Two concurrent national policies were studied during the entire policy process using a longitudinal, comparative case study approach. Data was collected through interviews, observations, and documents. A Conceptual Model for Evidence-Informed Policy Formulation and Implementation was developed based on prior frameworks for evidence-informed policymaking and policy dissemination and implementation. The conceptual model was used to organize and analyze the data. The policies differed regarding the use of evidence in the policy formulation and the extent to which the policy formulation and implementation phases overlapped. Similarities between the cases were an emphasis on capacity assessment, modified activities based on the assessment, and a highly active implementation approach relying on networks of stakeholders. The Conceptual Model for Evidence-Informed Policy Formulation and Implementation was empirically useful to organize the data. The policy actors' roles and functions were found to have a great influence on the choices of strategies and collaborators in all policy phases. The Conceptual Model for Evidence-Informed Policy Formulation and Implementation was found to be useful. However, it provided insufficient guidance for analyzing actors involved in the policy process, capacity-building strategies, and overlapping policy phases. A revised version of the model that includes these aspects is suggested.

  12. A computational visual saliency model based on statistics and machine learning.

    PubMed

    Lin, Ru-Je; Lin, Wei-Song

    2014-08-01

    Identifying the type of stimuli that attracts human visual attention has been an appealing topic for scientists for many years. In particular, marking the salient regions in images is useful for both psychologists and many computer vision applications. In this paper, we propose a computational approach for producing saliency maps using statistics and machine learning methods. Based on four assumptions, three properties (Feature-Prior, Position-Prior, and Feature-Distribution) can be derived and combined by a simple intersection operation to obtain a saliency map. These properties are implemented by a similarity computation, support vector regression (SVR) technique, statistical analysis of training samples, and information theory using low-level features. This technique is able to learn the preferences of human visual behavior while simultaneously considering feature uniqueness. Experimental results show that our approach performs better in predicting human visual attention regions than 12 other models in two test databases. © 2014 ARVO.
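    The fusion step described in this abstract, three property maps combined by a simple intersection operation, can be sketched as an elementwise product of normalized maps. This is a minimal illustration only; the map names, the min-max normalization, and the flat-list image representation are assumptions, not the authors' implementation.

```python
def normalize(m):
    """Min-max normalize a map to [0, 1]; a constant map becomes all zeros."""
    lo, hi = min(m), max(m)
    if hi == lo:
        return [0.0] * len(m)
    return [(v - lo) / (hi - lo) for v in m]

def combine_saliency(feature_prior, position_prior, feature_distribution):
    """Combine three per-pixel property maps by elementwise intersection
    (product), then renormalize to [0, 1] to obtain the saliency map."""
    maps = [normalize(m) for m in (feature_prior, position_prior, feature_distribution)]
    product = [a * b * c for a, b, c in zip(*maps)]
    return normalize(product)
```

    The intersection has the intuitive effect that a pixel must score highly on all three properties to be salient; any single near-zero property suppresses it.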

  13. Pathway-based analyses.

    PubMed

    Kent, Jack W

    2016-02-03

    New technologies for acquisition of genomic data, while offering unprecedented opportunities for genetic discovery, also impose severe burdens of interpretation and penalties for multiple testing. The Pathway-based Analyses Group of the Genetic Analysis Workshop 19 (GAW19) sought reduction of multiple-testing burden through various approaches to aggregation of high-dimensional data in pathways informed by prior biological knowledge. Experimental methods tested included the use of "synthetic pathways" (random sets of genes) to estimate power and false-positive error rate of methods applied to simulated data; data reduction via independent components analysis, single-nucleotide polymorphism (SNP)-SNP interaction, and use of gene sets to estimate genetic similarity; and general assessment of the efficacy of prior biological knowledge to reduce the dimensionality of complex genomic data. The work of this group explored several promising approaches to managing high-dimensional data, with the caveat that these methods are necessarily constrained by the quality of external bioinformatic annotation.

  14. Inference of Gene Regulatory Networks Using Bayesian Nonparametric Regression and Topology Information.

    PubMed

    Fan, Yue; Wang, Xiao; Peng, Qinke

    2017-01-01

    Gene regulatory networks (GRNs) play an important role in cellular systems and are important for understanding biological processes. Many algorithms have been developed to infer the GRNs. However, most algorithms only pay attention to the gene expression data but do not consider the topology information in their inference process, while incorporating this information can partially compensate for the lack of reliable expression data. Here we develop a Bayesian group lasso with spike and slab priors to perform gene selection and estimation for nonparametric models. B-spline basis functions are used to capture the nonlinear relationships flexibly and penalties are used to avoid overfitting. Further, we incorporate the topology information into the Bayesian method as a prior. We present the application of our method on DREAM3 and DREAM4 datasets and two real biological datasets. The results show that our method performs better than existing methods and the topology information prior can improve the result.
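    As an illustration of the kind of prior used here, a spike-and-slab prior on groups of coefficients either sets a whole group exactly to zero (the spike) or draws it from a diffuse distribution (the slab), which is what makes it suitable for gene selection. The sketch below is a generative draw only, under assumed hyperparameters; it is not the paper's B-spline model or its posterior sampler.

```python
import random

def sample_spike_slab(n_groups, group_size, pi=0.2, slab_sd=1.0, seed=0):
    """Draw regression coefficient groups from a group spike-and-slab prior:
    with probability pi a whole group is 'active' (slab: Gaussian draws),
    otherwise the entire group is shrunk exactly to zero (spike)."""
    rng = random.Random(seed)
    groups = []
    for _ in range(n_groups):
        if rng.random() < pi:          # slab: group enters the model
            groups.append([rng.gauss(0.0, slab_sd) for _ in range(group_size)])
        else:                          # spike: group excluded entirely
            groups.append([0.0] * group_size)
    return groups
```

    In a GRN setting, each group would hold the B-spline coefficients of one candidate regulator, and topology information could enter by giving well-supported edges a larger inclusion probability pi.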

  15. An iterative shrinkage approach to total-variation image restoration.

    PubMed

    Michailovich, Oleg V

    2011-05-01

    The problem of restoration of digital images from their degraded measurements plays a central role in a multitude of practically important applications. A particularly challenging instance of this problem occurs in the case when the degradation phenomenon is modeled by an ill-conditioned operator. In such a situation, the presence of noise makes it impossible to recover a valuable approximation of the image of interest without using some a priori information about its properties. Such a priori information--commonly referred to as simply priors--is essential for image restoration, rendering it stable and robust to noise. Moreover, using the priors makes the recovered images exhibit some plausible features of their original counterpart. Particularly, if the original image is known to be a piecewise smooth function, one of the standard priors used in this case is defined by the Rudin-Osher-Fatemi model, which results in total variation (TV) based image restoration. The current arsenal of algorithms for TV-based image restoration is vast. In the present paper, a different approach to the solution of the problem is proposed based upon the method of iterative shrinkage (aka iterated thresholding). In the proposed method, the TV-based image restoration is performed through a recursive application of two simple procedures, viz. linear filtering and soft thresholding. Therefore, the method can be identified as belonging to the group of first-order algorithms which are efficient in dealing with images of relatively large sizes. Another valuable feature of the proposed method is that it works directly with the TV functional, rather than with its smoothed versions. Moreover, the method provides a single solution for both isotropic and anisotropic definitions of the TV functional, thereby establishing a useful connection between the two formulae. 
Finally, a number of standard examples of image deblurring are demonstrated, in which the proposed method can provide restoration results of superior quality as compared to the case of sparse-wavelet deconvolution.
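    The two building blocks named above, linear filtering and soft thresholding, can be illustrated on a toy problem. The sketch below applies iterative shrinkage to an l1-penalized denoising objective rather than the TV functional itself (handling TV requires the image gradient operator), so it shows only the structure of the recursion under that simplifying assumption.

```python
def soft_threshold(x, t):
    """Elementwise soft-thresholding (shrinkage): shrink magnitudes by t,
    clipping anything smaller than t to exactly zero."""
    return [max(abs(v) - t, 0.0) * (1 if v > 0 else -1 if v < 0 else 0)
            for v in x]

def ista_denoise(y, lam=0.5, step=1.0, n_iter=50):
    """Iterative shrinkage for the toy problem
        min_x 0.5*||x - y||^2 + lam*||x||_1 :
    alternate a gradient step on the data term (the 'linear filtering'
    role) with soft thresholding (the shrinkage role)."""
    x = [0.0] * len(y)
    for _ in range(n_iter):
        grad = [xi - yi for xi, yi in zip(x, y)]   # gradient of 0.5*||x - y||^2
        x = soft_threshold([xi - step * g for xi, g in zip(x, grad)], step * lam)
    return x
```

    For this separable toy objective the fixed point is reached immediately and equals the soft-thresholded data, which makes the shrinkage behavior easy to verify by hand.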

  16. Pediatric Emergency Research Canada (PERC): Patient/Family-Informed Research Priorities for Pediatric Emergency Medicine.

    PubMed

    Bialy, Liza; Plint, Amy C; Freedman, Stephen B; Johnson, David W; Curran, Janet A; Stang, Antonia S

    2018-06-06

    A growing body of literature supports patient and public involvement in the design, prioritization and dissemination of research and evidence-based medicine. The objectives of this project were to engage patients and families in developing a prioritized list of research topics for Pediatric Emergency Medicine (PEM) and to compare results with prior research prioritization initiatives in the ED (emergency department) setting. We utilized a systematic process to combine administrative data on frequency of patient presentations to the ED with multiple stakeholder input including an initial stakeholder survey followed by a modified Delphi consensus methodology consisting of two web-based surveys and a face-to-face meeting. The prioritization process resulted in a ranked list of 15 research priorities. The top five priorities were mental health presentations, pain and sedation, practice tools, quality of care delivery and resource utilization. Mental health, pain and sedation, clinical prediction rules, respiratory illnesses/wheeze, patient safety/medication error and sepsis were identified as shared priorities with prior initiatives. Topics identified in our process that were not identified in prior work included resource utilization, ED communication, antibiotic stewardship and patient/family adherence with recommendations. This work identifies key priorities for research in PEM. Comparing our results with prior initiatives in the ED setting identified shared research priorities and opportunities for collaboration among PEM research networks. This work in particular makes an important contribution to the existing literature by including the patient/family perspective missing from prior work. This article is protected by copyright. All rights reserved.

  17. N-mixture models for estimating population size from spatially replicated counts

    USGS Publications Warehouse

    Royle, J. Andrew

    2004-01-01

    Spatial replication is a common theme in count surveys of animals. Such surveys often generate sparse count data from which it is difficult to estimate population size while formally accounting for detection probability. In this article, I describe a class of models (N-mixture models) which allow for estimation of population size from such data. The key idea is to view site-specific population sizes, N, as independent random variables distributed according to some mixing distribution (e.g., Poisson). Prior parameters are estimated from the marginal likelihood of the data, having integrated over the prior distribution for N. Carroll and Lombard (1985, Journal of the American Statistical Association 80, 423-426) proposed a class of estimators based on mixing over a prior distribution for detection probability. Their estimator can be applied in limited settings, but is sensitive to prior parameter values that are fixed a priori. Spatial replication provides additional information regarding the parameters of the prior distribution on N that is exploited by the N-mixture models and which leads to reasonable estimates of abundance from sparse data. A simulation study demonstrates superior operating characteristics (bias, confidence interval coverage) of the N-mixture estimator compared to the Carroll and Lombard estimator. Both estimators are applied to point count data on six species of birds illustrating the sensitivity to choice of prior on p and substantially different estimates of abundance as a consequence.

  18. Multilevel modeling of single-case data: A comparison of maximum likelihood and Bayesian estimation.

    PubMed

    Moeyaert, Mariola; Rindskopf, David; Onghena, Patrick; Van den Noortgate, Wim

    2017-12-01

    The focus of this article is to describe Bayesian estimation, including construction of prior distributions, and to compare parameter recovery under the Bayesian framework (using weakly informative priors) and the maximum likelihood (ML) framework in the context of multilevel modeling of single-case experimental data. Bayesian estimation results were found similar to ML estimation results in terms of the treatment effect estimates, regardless of the functional form and degree of information included in the prior specification in the Bayesian framework. In terms of the variance component estimates, both the ML and Bayesian estimation procedures result in biased and less precise variance estimates when the number of participants is small (i.e., 3). By increasing the number of participants to 5 or 7, the relative bias is close to 5% and more precise estimates are obtained for all approaches, except for the inverse-Wishart prior using the identity matrix. When a more informative prior was added, more precise estimates for the fixed effects and random effects were obtained, even when only 3 participants were included. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  19. Novel bayes factors that capture expert uncertainty in prior density specification in genetic association studies.

    PubMed

    Spencer, Amy V; Cox, Angela; Lin, Wei-Yu; Easton, Douglas F; Michailidou, Kyriaki; Walters, Kevin

    2015-05-01

    Bayes factors (BFs) are becoming increasingly important tools in genetic association studies, partly because they provide a natural framework for including prior information. The Wakefield BF (WBF) approximation is easy to calculate and assumes a normal prior on the log odds ratio (logOR) with a mean of zero. However, the prior variance (W) must be specified. Because of the potentially high sensitivity of the WBF to the choice of W, we propose several new BF approximations with logOR ∼N(0,W), but allow W to take a probability distribution rather than a fixed value. We provide several prior distributions for W which lead to BFs that can be calculated easily in freely available software packages. These priors allow a wide range of densities for W and provide considerable flexibility. We examine some properties of the priors and BFs and show how to determine the most appropriate prior based on elicited quantiles of the prior odds ratio (OR). We show by simulation that our novel BFs have superior true-positive rates at low false-positive rates compared to those from both P-value and WBF analyses across a range of sample sizes and ORs. We give an example of utilizing our BFs to fine-map the CASP8 region using genotype data on approximately 46,000 breast cancer case and 43,000 healthy control samples from the Collaborative Oncological Gene-environment Study (COGS) Consortium, and compare the single-nucleotide polymorphism ranks to those obtained using WBFs and P-values from univariate logistic regression. © 2015 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
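    The Wakefield approximation mentioned here has a simple closed form given the estimated logOR, its standard error, and the prior variance W. The sketch below computes the BF in favor of association and then averages it over draws of W, a Monte Carlo stand-in for the paper's closed-form treatment of a prior distribution on W; the function names are mine, not the authors'.

```python
from math import sqrt, exp

def wakefield_abf(beta_hat, se, W):
    """Wakefield approximate Bayes factor for H1 (association) over H0,
    assuming the estimated logOR beta_hat ~ N(theta, V) with V = se**2
    and a N(0, W) prior on theta under H1. W = 0 recovers BF = 1."""
    V = se ** 2
    if W == 0.0:
        return 1.0
    z2 = (beta_hat / se) ** 2
    return sqrt(V / (V + W)) * exp(z2 * W / (2.0 * (V + W)))

def wakefield_abf_mixture(beta_hat, se, W_draws):
    """Average the ABF over prior draws of W (Monte Carlo sketch of
    letting W follow a distribution rather than a fixed value)."""
    return sum(wakefield_abf(beta_hat, se, w) for w in W_draws) / len(W_draws)
```

    Because the ABF is monotone in W for fixed data, the mixture BF always lies between the BFs evaluated at the smallest and largest plausible W, which is one way to see how a prior on W tempers the WBF's sensitivity.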

  20. Influence of social norms and palatability on amount consumed and food choice.

    PubMed

    Pliner, Patricia; Mann, Nikki

    2004-04-01

    In two parallel studies, we examined the effect of social influence and palatability on amount consumed and on food choice. In Experiment 1, which looked at amount consumed, participants were provided with either palatable or unpalatable food; they were also given information about how much previous participants had eaten (large or small amounts) or were given no information. In the case of palatable food, participants ate more when led to believe that prior participants had eaten a great deal than when led to believe that prior participants had eaten small amounts or when provided with no information. This social-influence effect was not present when participants received unpalatable food. In Experiment 2, which looked at food choice, some participants learned that prior participants had chosen the palatable food, others learned that prior participants had chosen the unpalatable food, while still others received no information about prior participants' choices. The social-influence manipulation had no effect on participants' food choices; nearly all of them chose the palatable food. The results were discussed in the context of Crutchfield's (1955) distinction between judgments about matters of fact and judgments about preferences. The results were also used to illustrate the importance of palatability as a determinant of eating behavior.

  1. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair.

    PubMed

    Lee, Young-Joo; Kim, Robin E; Suh, Wonho; Park, Kiwon

    2017-04-24

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed.

  2. Probabilistic Fatigue Life Updating for Railway Bridges Based on Local Inspection and Repair

    PubMed Central

    Lee, Young-Joo; Kim, Robin E.; Suh, Wonho; Park, Kiwon

    2017-01-01

    Railway bridges are exposed to repeated train loads, which may cause fatigue failure. As critical links in a transportation network, railway bridges are expected to survive for a target period of time, but sometimes they fail earlier than expected. To guarantee the target bridge life, bridge maintenance activities such as local inspection and repair should be undertaken properly. However, this is a challenging task because there are various sources of uncertainty associated with aging bridges, train loads, environmental conditions, and maintenance work. Therefore, to perform optimal risk-based maintenance of railway bridges, it is essential to estimate the probabilistic fatigue life of a railway bridge and update the life information based on the results of local inspections and repair. Recently, a system reliability approach was proposed to evaluate the fatigue failure risk of structural systems and update the prior risk information in various inspection scenarios. However, this approach can handle only a constant-amplitude load and has limitations in considering a cyclic load with varying amplitude levels, which is the major loading pattern generated by train traffic. In addition, it is not feasible to update the prior risk information after bridges are repaired. In this research, the system reliability approach is further developed so that it can handle a varying-amplitude load and update the system-level risk of fatigue failure for railway bridges after inspection and repair. The proposed method is applied to a numerical example of an in-service railway bridge, and the effects of inspection and repair on the probabilistic fatigue life are discussed. PMID:28441768

  3. In vivo bioluminescence tomography based on multi-view projection and 3D surface reconstruction

    NASA Astrophysics Data System (ADS)

    Zhang, Shuang; Wang, Kun; Leng, Chengcai; Deng, Kexin; Hu, Yifang; Tian, Jie

    2015-03-01

    Bioluminescence tomography (BLT) is a powerful optical molecular imaging modality, which enables non-invasive real-time in vivo imaging as well as 3D quantitative analysis in preclinical studies. In order to solve the inverse problem and reconstruct inner light sources accurately, the prior structural information is commonly necessary and obtained from computed tomography or magnetic resonance imaging. This strategy requires an expensive hybrid imaging system, a complicated operation protocol and possible involvement of ionizing radiation. The overall robustness highly depends on the fusion accuracy between the optical and structural information. In this study we present a pure optical bioluminescence tomographic system (POBTS) and a novel BLT method based on multi-view projection acquisition and 3D surface reconstruction. The POBTS acquired a sparse set of white light surface images and bioluminescent images of a mouse. Then the white light images were applied to an approximate surface model to generate a high quality textured 3D surface reconstruction of the mouse. After that, we integrated multi-view luminescent images based on the previous reconstruction, and applied an algorithm to calibrate and quantify the surface luminescent flux in 3D. Finally, the internal bioluminescence source reconstruction was achieved with this prior information. A BALB/c mouse model bearing a breast tumor of 4T1-fLuc cells was used to evaluate the performance of the new system and technique. Compared with the conventional hybrid optical-CT approach using the same inverse reconstruction method, the reconstruction accuracy of this technique was improved. The distance error between the actual and reconstructed internal source was decreased by 0.184 mm.

  4. Leveraging prior quantitative knowledge in guiding pediatric drug development: a case study.

    PubMed

    Jadhav, Pravin R; Zhang, Jialu; Gobburu, Jogarao V S

    2009-01-01

    The manuscript presents the FDA's focus on leveraging prior knowledge in designing an informative pediatric trial through this case study. In developing the written request for Drug X, an anti-hypertensive for immediate blood pressure (BP) control, the sponsor and FDA conducted clinical trial simulations (CTS) to design a trial with a proper sample size and support the choice of dose range. The objective was to effectively use prior knowledge from adult patients for drug X, pediatric data from the Corlopam (approved for a similar indication) trial and general experience in developing anti-hypertensive agents. Different scenarios governing the exposure response relationship in the pediatric population were simulated to perturb model assumptions. The choice of scenarios was based on the past observation that the pediatric population is less responsive and sensitive compared with adults. The conceptual framework presented here should serve as an example of how industry and FDA scientists can collaborate in designing the pediatric exclusivity trial. Using CTS, inter-disciplinary scientists with the sponsor and FDA can objectively discuss the choice of dose range, sample size, endpoints and other design elements. These efforts are believed to yield a plausible trial design, rational dosing recommendations and useful labeling information in pediatrics. Published in 2009 by John Wiley & Sons, Ltd.

  5. What Are They Up To? The Role of Sensory Evidence and Prior Knowledge in Action Understanding

    PubMed Central

    Chambon, Valerian; Domenech, Philippe; Pacherie, Elisabeth; Koechlin, Etienne; Baraduc, Pierre; Farrer, Chlöé

    2011-01-01

    Explaining or predicting the behaviour of our conspecifics requires the ability to infer the intentions that motivate it. Such inferences are assumed to rely on two types of information: (1) the sensory information conveyed by movement kinematics and (2) the observer's prior expectations – acquired from past experience or derived from prior knowledge. However, the respective contribution of these two sources of information is still controversial. This controversy stems in part from the fact that “intention” is an umbrella term that may embrace various sub-types, each being assigned different scopes and targets. We hypothesized that variations in the scope and target of intentions may account for variations in the contribution of visual kinematics and prior knowledge to the intention inference process. To test this hypothesis, we conducted four behavioural experiments in which participants were instructed to identify different types of intention: basic intentions (i.e. simple goal of a motor act), superordinate intentions (i.e. general goal of a sequence of motor acts), or social intentions (i.e. intentions accomplished in a context of reciprocal interaction). For each of the above-mentioned intentions, we varied (1) the amount of visual information available from the action scene and (2) participants' prior expectations concerning the intention that was more likely to be accomplished. First, we showed that intentional judgments depend on a consistent interaction between visual information and participants' prior expectations. Moreover, we demonstrated that this interaction varied according to the type of intention to be inferred, with participants' priors rather than perceptual evidence exerting a greater effect on the inference of social and superordinate intentions. The results are discussed by appealing to the specific properties of each type of intention considered and further interpreted in the light of a hierarchical model of action representation. 
PMID:21364992

  6. ATLes: the strategic application of Web-based technology to address learning objectives and enhance classroom discussion in a veterinary pathology course.

    PubMed

    Hines, Stephen A; Collins, Peggy L; Quitadamo, Ian J; Brahler, C Jayne; Knudson, Cameron D; Crouch, Gregory J

    2005-01-01

    A case-based program called ATLes (Adaptive Teaching and Learning Environments) was designed for use in a systemic pathology course and implemented over a four-year period. Second-year veterinary students working in small collaborative learning groups used the program prior to their weekly pathology laboratory. The goals of ATLes were to better address specific learning objectives in the course (notably the appreciation of pathophysiology), to solve previously identified problems associated with information overload and information sorting that commonly occur as part of discovery-based processes, and to enhance classroom discussion. The program was also designed to model and allow students to practice the problem-oriented approach to clinical cases, thereby enabling them to study pathology in a relevant clinical context. Features included opportunities for students to obtain additional information on the case by requesting specific laboratory tests and/or diagnostic procedures. However, students were also required to justify their diagnostic plans and to provide mechanistic analyses. The use of ATLes met most of these objectives. Student acceptance was high, and students favorably reviewed the online "Content Links" that made useful information more readily accessible and level-appropriate. Students came to the lab better prepared to engage in an in-depth and high-quality discussion and were better able to connect clinical problems to underlying changes in tissue (lesions). However, many students indicated that the required time on task prior to lab might have been excessive relative to what they thought they learned. The classroom discussion, although improved, was not elevated to the expected level, most likely reflecting other missing elements of the learning environment, including the existing student culture and the students' current discussion skills. 
This article briefly discusses the lessons learned from ATLes and how similar case-based exercises might be combined with other approaches to enhance and enliven classroom discussions in the veterinary curriculum.

  7. How to achieve synergy between medical education and cognitive neuroscience? An exercise on prior knowledge in understanding.

    PubMed

    Ruiter, Dirk J; van Kesteren, Marlieke T R; Fernandez, Guillen

    2012-05-01

    A major challenge in contemporary research is how to connect medical education and cognitive neuroscience and achieve synergy between these domains. Based on this starting point we discuss how this may result in a common language about learning, more educationally focused scientific inquiry, and multidisciplinary research projects. As the topic of prior knowledge in understanding plays a strategic role in both medical education and cognitive neuroscience it is used as a central element in our discussion. A critical condition for the acquisition of new knowledge is the existence of prior knowledge, which can be built in a mental model or schema. Formation of schemas is a central event in student-centered active learning, by which mental models are constructed and reconstructed. These theoretical considerations from cognitive psychology foster scientific discussions that may lead to salient issues and questions for research with cognitive neuroscience. Cognitive neuroscience attempts to understand how knowledge, insight and experience are established in the brain and to clarify their neural correlates. Recently, evidence has been obtained that new information processed by the hippocampus can be consolidated into a stable, neocortical network more rapidly if this new information fits readily into a schema. Opportunities for medical education and medical education research can be created in a fruitful dialogue within an educational multidisciplinary platform. In this synergetic setting many questions can be raised by educational scholars interested in evidence-based education that may be highly relevant for integrative research and the further development of medical education.

  8. A Bayesian hierarchical model for mortality data from cluster-sampling household surveys in humanitarian crises.

    PubMed

    Heudtlass, Peter; Guha-Sapir, Debarati; Speybroeck, Niko

    2018-05-31

    The crude death rate (CDR) is one of the defining indicators of humanitarian emergencies. When data from vital registration systems are not available, it is common practice to estimate the CDR from household surveys with cluster-sampling design. However, sample sizes are often too small to compare mortality estimates to emergency thresholds, at least in a frequentist framework. Several authors have proposed Bayesian methods for health surveys in humanitarian crises. Here, we develop an approach specifically for mortality data and cluster-sampling surveys. We describe a Bayesian hierarchical Poisson-Gamma mixture model with generic (weakly informative) priors that could be used as default in absence of any specific prior knowledge, and compare Bayesian and frequentist CDR estimates using five different mortality datasets. We provide an interpretation of the Bayesian estimates in the context of an emergency threshold and demonstrate how to interpret parameters at the cluster level and ways in which informative priors can be introduced. With the same set of weakly informative priors, Bayesian CDR estimates are equivalent to frequentist estimates, for all practical purposes. The probability that the CDR surpasses the emergency threshold can be derived directly from the posterior of the mean of the mixing distribution. All observations in the datasets contribute to the cluster-level estimates through the hierarchical structure of the model. In a context of sparse data, Bayesian mortality assessments have advantages over frequentist ones already when using only weakly informative priors. More informative priors offer a formal and transparent way of combining new data with existing data and expert knowledge and can help to improve decision-making in humanitarian crises by complementing frequentist estimates.
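    Under a conjugate Poisson-Gamma model of the kind described here, the probability that the CDR exceeds an emergency threshold can be read directly off the posterior. The sketch below does this by Monte Carlo for a single pooled rate; the Gamma(0.5, 0.0001) prior and the non-hierarchical pooling are illustrative stand-ins, not the paper's weakly informative hierarchical model.

```python
import random

def prob_cdr_exceeds(deaths, person_days, threshold,
                     a0=0.5, b0=0.0001, n_draws=20000, seed=0):
    """Posterior probability that the crude death rate exceeds an
    emergency threshold (expressed in deaths per 10,000 person-days),
    under a conjugate Gamma(a0, b0) prior on the daily rate and
    Poisson deaths: posterior rate ~ Gamma(a0 + deaths, b0 + person_days)."""
    rng = random.Random(seed)
    shape = a0 + deaths
    rate = b0 + person_days            # Gamma rate parameter (inverse scale)
    draws = (rng.gammavariate(shape, 1.0 / rate) for _ in range(n_draws))
    return sum(1 for r in draws if r * 10000 > threshold) / n_draws
```

    With 30 deaths over 100,000 person-days (posterior mean near 3 per 10,000 per day), the posterior mass lies almost entirely above the conventional 1/10,000/day emergency threshold and almost entirely below 10/10,000/day.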

  9. Pre-Whaling Genetic Diversity and Population Ecology in Eastern Pacific Gray Whales: Insights from Ancient DNA and Stable Isotopes

    PubMed Central

    Alter, S. Elizabeth; Newsome, Seth D.; Palumbi, Stephen R.

    2012-01-01

    Commercial whaling decimated many whale populations, including the eastern Pacific gray whale, but little is known about how population dynamics or ecology differed prior to these removals. Of particular interest is the possibility of a large population decline prior to whaling, as such a decline could explain the ∼5-fold difference between genetic estimates of prior abundance and estimates based on historical records. We analyzed genetic (mitochondrial control region) and isotopic information from modern and prehistoric gray whales using serial coalescent simulations and Bayesian skyline analyses to test for a pre-whaling decline and to examine prehistoric genetic diversity, population dynamics and ecology. Simulations demonstrate that significant genetic differences observed between ancient and modern samples could be caused by a large, recent population bottleneck, roughly concurrent with commercial whaling. Stable isotopes show minimal differences between modern and ancient gray whale foraging ecology. Using rejection-based Approximate Bayesian Computation, we estimate the size of the population bottleneck at its minimum abundance and the pre-bottleneck abundance. Our results agree with previous genetic studies suggesting the historical size of the eastern gray whale population was roughly three to five times its current size. PMID:22590499
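    Rejection-based Approximate Bayesian Computation, used here to estimate the bottleneck and pre-bottleneck abundance, keeps prior draws whose simulated summary statistic lands close to the observed one. The sketch below is the generic scheme with a deliberately toy simulator; the study's actual simulator was a serial coalescent model of mtDNA diversity, and all names here are mine.

```python
import random

def rejection_abc(observed_stat, simulate, prior_sample,
                  tol, n_sims=5000, seed=0):
    """Generic rejection-ABC sketch: draw a parameter from the prior,
    simulate a summary statistic under it, and accept the draw when
    the simulated statistic is within `tol` of the observed one.
    The accepted draws approximate the posterior sample."""
    rng = random.Random(seed)
    accepted = []
    for _ in range(n_sims):
        theta = prior_sample(rng)
        if abs(simulate(theta, rng) - observed_stat) <= tol:
            accepted.append(theta)
    return accepted
```

    The tolerance trades off accuracy against acceptance rate: a tighter tol gives a better posterior approximation but discards more simulations, which is why coalescent ABC studies typically run very large numbers of simulations.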

  10. A Ranking Approach on Large-Scale Graph With Multidimensional Heterogeneous Information.

    PubMed

    Wei, Wei; Gao, Bin; Liu, Tie-Yan; Wang, Taifeng; Li, Guohui; Li, Hang

    2016-04-01

Graph-based ranking has been extensively studied and is frequently applied in many applications, such as webpage ranking. It aims at mining potentially valuable information from raw graph-structured data. Recently, with the proliferation of rich heterogeneous information (e.g., node/edge features and prior knowledge) available in many real-world graphs, how to effectively and efficiently leverage all of this information to improve ranking performance has become a new and challenging problem. Previous methods utilize only part of this information and attempt to rank graph nodes using link-based methods, whose ranking performance is severely affected by several well-known issues, e.g., over-fitting or high computational complexity, especially when the graph is very large. In this paper, we address the large-scale graph-based ranking problem and focus on how to effectively exploit the rich heterogeneous information of the graph to improve ranking performance. Specifically, we propose an innovative and effective semi-supervised PageRank (SSP) approach that parameterizes the derived information within a unified semi-supervised learning framework (SSLF-GR), then simultaneously optimizes the parameters and the ranking scores of the graph nodes. Experiments on real-world large-scale graphs demonstrate that our method significantly outperforms algorithms that consider such graph information only partially.
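As a baseline for comparison, plain personalized PageRank by power iteration looks like the sketch below. The SSP approach described above goes further by parameterizing the teleport and transition behaviour with node/edge features; here the personalization vector is simply fixed, and the tiny graph is invented.

```python
def pagerank(adj, personalization, damping=0.85, iters=100):
    """Personalized PageRank by power iteration.

    adj: dict mapping node -> list of out-neighbours.
    personalization: dict mapping node -> teleport probability (sums to 1).
    """
    nodes = list(adj)
    rank = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iters):
        # Teleport mass goes to the personalization distribution.
        new = {n: (1 - damping) * personalization.get(n, 0.0) for n in nodes}
        for n in nodes:
            out = adj[n]
            if out:
                share = damping * rank[n] / len(out)
                for m in out:
                    new[m] += share
            else:
                # Dangling node: redistribute its mass via personalization.
                for m in nodes:
                    new[m] += damping * rank[n] * personalization.get(m, 0.0)
        rank = new
    return rank

graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
scores = pagerank(graph, personalization={"a": 1.0})
```

With all teleport mass on `a`, the stationary scores order as a > c > b, since `c` receives links from both `a` and `b` while `b` receives only half of `a`'s outgoing mass.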

  11. Earthquake Predictability: Results From Aggregating Seismicity Data And Assessment Of Theoretical Individual Cases Via Synthetic Data

    NASA Astrophysics Data System (ADS)

    Adamaki, A.; Roberts, R.

    2016-12-01

For many years an important aim of seismological studies has been forecasting the occurrence of large earthquakes. Despite some well-established statistical behavior of earthquake sequences, expressed by e.g. the Omori law for aftershock sequences and the Gutenberg-Richter distribution of event magnitudes, purely statistical approaches to short-term earthquake prediction have in general not been successful. It seems that better understanding of the processes leading to critical stress build-up prior to larger events is necessary to identify useful precursory activity, if this exists, and statistical analyses are an important tool in this context. There has been considerable debate on the usefulness or otherwise of foreshock studies for short-term earthquake prediction. We investigate generic patterns of foreshock activity using aggregated data, studying not only strong but also moderate-magnitude events. Aggregating empirical local seismicity time series prior to larger events observed in and around Greece reveals a statistically significant increase in the rate of seismicity over the 20 days prior to M>3.5 earthquakes. This increase cannot be explained by spatio-temporal clustering models such as ETAS, implying genuine changes in the mechanical situation just prior to larger events and thus the possible existence of useful precursory information. Because of spatio-temporal clustering, in which events can act both as aftershocks of earlier events and as foreshocks of later ones, even if such generic behavior exists it does not necessarily follow that foreshocks can provide useful precursory information for individual larger events. Using synthetic catalogs produced with different clustering models and different presumed system sensitivities, we are now investigating to what extent the apparently established generic foreshock rate acceleration does or does not imply that foreshocks have potential in the context of routine forecasting of larger events. Preliminary results suggest that this is the case, but that physically-based models of foreshock clustering will likely be a necessary, though not necessarily sufficient, basis for successful forecasting.
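The Gutenberg-Richter distribution mentioned above is usually characterized by its b-value, for which Aki's maximum-likelihood estimator is standard. The sketch below uses the continuous-magnitude form (no magnitude-binning correction) on an invented demonstration catalog; it is not part of the study's own pipeline.

```python
import math

def b_value(magnitudes, mc):
    """Aki's maximum-likelihood Gutenberg-Richter b-value for magnitudes
    at or above the completeness threshold Mc (continuous-magnitude form,
    no binning correction)."""
    above = [m for m in magnitudes if m >= mc]
    mean_m = sum(above) / len(above)
    return math.log10(math.e) / (mean_m - mc)

# Invented magnitudes for a small demonstration catalog:
mags = [3.5, 3.6, 3.9, 4.2, 3.7, 4.8, 3.5, 4.0, 3.6, 5.1]
b = b_value(mags, mc=3.5)
```

Values near 1 are typical of tectonic seismicity; systematic deviations before larger events are one of the precursory signals debated in the foreshock literature.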

  12. Classical and Bayesian Seismic Yield Estimation: The 1998 Indian and Pakistani Tests

    NASA Astrophysics Data System (ADS)

    Shumway, R. H.

    2001-10-01

The nuclear tests of May 1998 in India and Pakistan have stimulated a renewed interest in yield estimation, based on limited data from uncalibrated test sites. We study here the problem of estimating yields using classical and Bayesian methods developed by Shumway (1992), utilizing calibration data from the Semipalatinsk test site and measured magnitudes for the 1998 Indian and Pakistani tests given by Murphy (1998). Calibration is done using multivariate classical or Bayesian linear regression, depending on the availability of measured magnitude-yield data and prior information. Confidence intervals for the classical approach are derived applying an extension of Fieller's method suggested by Brown (1982). In the case where prior information is available, the posterior predictive magnitude densities are inverted to give posterior intervals for yield. Intervals obtained using the joint distribution of magnitudes are comparable to the single-magnitude estimates produced by Murphy (1998) and reinforce the conclusion that the announced yields of the Indian and Pakistani tests were too high.

  14. Modeling and validating Bayesian accrual models on clinical data and simulations using adaptive priors.

    PubMed

    Jiang, Yu; Simon, Steve; Mayo, Matthew S; Gajewski, Byron J

    2015-02-20

Slow recruitment in clinical trials leads to increased costs and resource utilization, which includes both clinic staff and patient volunteers. Careful planning and monitoring of the accrual process can prevent the unnecessary loss of these resources. We propose two hierarchical extensions to the existing Bayesian constant accrual model: the accelerated prior and the hedging prior. The new proposed priors are able to adaptively utilize the researcher's previous experience and current accrual data to produce an estimate of trial completion time. The performance of these models, including prediction precision, coverage probability, and correct decision-making ability, is evaluated using actual studies from our cancer center and simulations. The results showed that a constant accrual model with strongly informative priors is very accurate when accrual is on target or slightly off, producing smaller mean squared error, a high percentage of coverage, and a high number of correct decisions as to whether or not to continue the trial, but it is strongly biased when accrual is off target. Flat or weakly informative priors provide protection against an off-target prior but are less efficient when accrual is on target. The accelerated prior performs similarly to a strong prior. The hedging prior performs much like the weak priors when accrual is extremely off target, but closer to the strong priors when accrual is on target or only slightly off target. We suggest improvements in these models and propose new models for future research. Copyright © 2014 John Wiley & Sons, Ltd.
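The constant-accrual idea can be sketched with a conjugate Gamma prior on the accrual rate, where the prior's weight is expressed as an equivalent number of months of "prior data". The accelerated and hedging priors in the paper adapt this weight to the observed data; here it is fixed, and every number below is invented for illustration.

```python
# Sketch of a Bayesian constant-accrual prediction with a fixed-weight
# conjugate Gamma prior on the accrual rate (subjects per month).
# All numbers are invented, not taken from the paper's case studies.

target_n = 300          # planned enrolment
observed_n = 60         # subjects enrolled so far
observed_t = 8.0        # months elapsed

# Prior belief: ~10 subjects/month, worth 6 months of "prior data"
# (the fixed prior weight that the paper's adaptive priors would tune):
prior_rate_guess = 10.0
prior_months = 6.0

# Gamma-Poisson conjugate update of the accrual rate:
post_shape = prior_rate_guess * prior_months + observed_n   # Gamma shape
post_rate_param = prior_months + observed_t                 # Gamma rate

post_mean_rate = post_shape / post_rate_param   # subjects per month
remaining = target_n - observed_n
predicted_completion = observed_t + remaining / post_mean_rate  # months
```

When observed accrual lags the prior guess, the posterior rate sits between the two, so the completion estimate is pulled toward the prior; a weaker prior (smaller `prior_months`) would track the observed data more closely.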

  15. When Relationships Depicted Diagrammatically Conflict with Prior Knowledge: An Investigation of Students' Interpretations of Evolutionary Trees

    ERIC Educational Resources Information Center

    Novick, Laura R.; Catley, Kefyn M.

    2014-01-01

    Science is an important domain for investigating students' responses to information that contradicts their prior knowledge. In previous studies of this topic, this information was communicated verbally. The present research used diagrams, specifically trees (cladograms) depicting evolutionary relationships among taxa. Effects of college…

  16. 78 FR 65670 - Agency Information Collection Activities; Proposed Collection; Comment Request; Prior Notice of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-01

    ... Food Under the Public Health Security and Bioterrorism Preparedness and Response Act of 2002 AGENCY... appropriate, and other forms of information technology. Prior Notice of Imported Food Under the Public Health... 0910-0520)--Revision The Public Health Security and Bioterrorism Preparedness and Response Act of 2002...

  17. Ten-Month-Old Infants Use Prior Information to Identify an Actor's Goal

    ERIC Educational Resources Information Center

    Sommerville, Jessica A.; Crane, Catharyn C.

    2009-01-01

    For adults, prior information about an individual's likely goals, preferences or dispositions plays a powerful role in interpreting ambiguous behavior and predicting and interpreting behavior in novel contexts. Across two studies, we investigated whether 10-month-old infants' ability to identify the goal of an ambiguous action sequence was…

  18. 40 CFR 60.2953 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning...

  19. 40 CFR 60.2195 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY..., 2001 Recordkeeping and Reporting § 60.2195 What information must I submit prior to initial startup? You... startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning capacity. (c) The...

  20. 40 CFR 60.2953 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning...

  1. 40 CFR 60.2953 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... initial startup? 60.2953 Section 60.2953 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY... initial startup? You must submit the information specified in paragraphs (a) through (e) of this section prior to initial startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning...

  2. 40 CFR 60.2195 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY..., 2001 Recordkeeping and Reporting § 60.2195 What information must I submit prior to initial startup? You... startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning capacity. (c) The...

  3. 40 CFR 60.2195 - What information must I submit prior to initial startup?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... initial startup? 60.2195 Section 60.2195 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY..., 2001 Recordkeeping and Reporting § 60.2195 What information must I submit prior to initial startup? You... startup. (a) The type(s) of waste to be burned. (b) The maximum design waste burning capacity. (c) The...

  4. Application of Radar-Based Accumulated Rainfall Products for Early Detection of Heavy Rainfall Occurrence

    NASA Astrophysics Data System (ADS)

    Nishiyama, K.; Wakimizu, K.; Yokota, I.; Tsukahara, K.; Moriyama, T.

    2016-12-01

In Japan, river and debris-flow disasters have frequently been caused by heavy rainfall occurring under the influence of a stationary front and the associated inflow of a large amount of moisture into the front. However, it is very difficult to accurately predict heavy rainfall and the associated landslides with numerical models. Therefore, meteorological radar information is needed to enhance the ability of local government staff to decide when to urge the evacuation of residents before a heavy rainfall disaster occurs. It is also desirable that local residents acquire the ability to decide on evacuation themselves, immediately after checking the radar information. In practice, it is difficult for untrained local residents and local government staff to easily recognize where heavy rainfall has persisted locally for a couple of hours. This is because a radar echo image is an instantaneous measurement taken every few minutes, and the echo distribution moves together with the synoptic system. Therefore, in this study, considering that the movement of radar echoes may stall over a specific area if a stationary front becomes dominant, we define radar-based accumulated rainfall information: a product derived by integrating the radar intensity measured every ten minutes over the previous hour. Using this product, we investigated whether and how radar-based accumulated rainfall displayed at ten-minute intervals can be applied for early detection of heavy rainfall occurrence. The results are summarized as follows. 1) Radar-based accumulated rainfall products confirmed that some stationary heavy rainfall systems had already appeared prior to disaster occurrence, and clearly identified the movement of the heavy rainfall area. 2) Moreover, the area of accumulated rainfall could be identified visually and easily, compared with a time series (movie) of real-time radar rainfall intensity. The accumulated rainfall distribution therefore provides effective information for early detection of disaster-causing heavy rainfall through the training of local residents and local government staff who have no technical meteorological knowledge.
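The accumulation step described above, integrating ten-minute radar measurements over the previous hour, is a rolling sum over the last six scans. The sketch below shows that step for a single pixel; the ten-minute totals are invented.

```python
from collections import deque

def accumulate(scans_mm, window=6):
    """Yield the running sum of the last `window` scans
    (6 scans x 10 min = 1 hour of accumulated rainfall)."""
    buf = deque(maxlen=window)   # oldest scan drops out automatically
    for scan in scans_mm:
        buf.append(scan)
        yield sum(buf)

# Invented ten-minute rainfall totals (mm) for one radar pixel:
ten_minute_totals = [1.0, 2.5, 8.0, 12.0, 15.0, 11.0, 9.0, 3.0]
hourly = list(accumulate(ten_minute_totals))
```

A stationary rain band shows up as a pixel whose hourly value keeps climbing even as the instantaneous echoes look similar from scan to scan, which is exactly the signal the accumulated product makes visible to untrained viewers.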

  5. Changing ideas about others’ intentions: updating prior expectations tunes activity in the human motor system

    PubMed Central

    Jacquet, Pierre O.; Roy, Alice C.; Chambon, Valérian; Borghi, Anna M.; Salemme, Roméo; Farnè, Alessandro; Reilly, Karen T.

    2016-01-01

    Predicting intentions from observing another agent’s behaviours is often thought to depend on motor resonance – i.e., the motor system’s response to a perceived movement by the activation of its stored motor counterpart, but observers might also rely on prior expectations, especially when actions take place in perceptually uncertain situations. Here we assessed motor resonance during an action prediction task using transcranial magnetic stimulation to probe corticospinal excitability (CSE) and report that experimentally-induced updates in observers’ prior expectations modulate CSE when predictions are made under situations of perceptual uncertainty. We show that prior expectations are updated on the basis of both biomechanical and probabilistic prior information and that the magnitude of the CSE modulation observed across participants is explained by the magnitude of change in their prior expectations. These findings provide the first evidence that when observers predict others’ intentions, motor resonance mechanisms adapt to changes in their prior expectations. We propose that this adaptive adjustment might reflect a regulatory control mechanism that shares some similarities with that observed during action selection. Such a mechanism could help arbitrate the competition between biomechanical and probabilistic prior information when appropriate for prediction. PMID:27243157

  6. Changing ideas about others' intentions: updating prior expectations tunes activity in the human motor system.

    PubMed

    Jacquet, Pierre O; Roy, Alice C; Chambon, Valérian; Borghi, Anna M; Salemme, Roméo; Farnè, Alessandro; Reilly, Karen T

    2016-05-31

    Predicting intentions from observing another agent's behaviours is often thought to depend on motor resonance - i.e., the motor system's response to a perceived movement by the activation of its stored motor counterpart, but observers might also rely on prior expectations, especially when actions take place in perceptually uncertain situations. Here we assessed motor resonance during an action prediction task using transcranial magnetic stimulation to probe corticospinal excitability (CSE) and report that experimentally-induced updates in observers' prior expectations modulate CSE when predictions are made under situations of perceptual uncertainty. We show that prior expectations are updated on the basis of both biomechanical and probabilistic prior information and that the magnitude of the CSE modulation observed across participants is explained by the magnitude of change in their prior expectations. These findings provide the first evidence that when observers predict others' intentions, motor resonance mechanisms adapt to changes in their prior expectations. We propose that this adaptive adjustment might reflect a regulatory control mechanism that shares some similarities with that observed during action selection. Such a mechanism could help arbitrate the competition between biomechanical and probabilistic prior information when appropriate for prediction.

  7. The impact of group membership on collaborative learning with wikis.

    PubMed

    Matschke, Christina; Moskaliuk, Johannes; Kimmerle, Joachim

    2013-02-01

    The social web stimulates learning through collaboration. However, information in the social web is often associated with information about its author. Based on previous evidence that ingroup information is preferred to outgroup information, the current research investigates whether group memberships of wiki authors affect learning. In an experimental study, we manipulated the group memberships (ingroup vs. outgroup) of wiki authors by using nicknames. The designated group memberships (being fans of a soccer team or not) were completely irrelevant for the domain of the wiki (the medical disorder fibromyalgia). Nevertheless, wiki information from the ingroup led to more integration of information into prior knowledge as well as more increase of factual knowledge than information from the outgroup. The results demonstrate that individuals apply social selection strategies when considering information from wikis, which may foster, but also hinder, learning and collaboration. Practical implications for collaborative learning in the social web are discussed.

  8. The Impact of Group Membership on Collaborative Learning with Wikis

    PubMed Central

    Matschke, Christina; Moskaliuk, Johannes

    2013-01-01

    Abstract The social web stimulates learning through collaboration. However, information in the social web is often associated with information about its author. Based on previous evidence that ingroup information is preferred to outgroup information, the current research investigates whether group memberships of wiki authors affect learning. In an experimental study, we manipulated the group memberships (ingroup vs. outgroup) of wiki authors by using nicknames. The designated group memberships (being fans of a soccer team or not) were completely irrelevant for the domain of the wiki (the medical disorder fibromyalgia). Nevertheless, wiki information from the ingroup led to more integration of information into prior knowledge as well as more increase of factual knowledge than information from the outgroup. The results demonstrate that individuals apply social selection strategies when considering information from wikis, which may foster, but also hinder, learning and collaboration. Practical implications for collaborative learning in the social web are discussed. PMID:23113690

  9. Comparing hard and soft prior bounds in geophysical inverse problems

    NASA Technical Reports Server (NTRS)

    Backus, George E.

    1988-01-01

In linear inversion of a finite-dimensional data vector y to estimate a finite-dimensional prediction vector z, prior information about X_E is essential if y is to supply useful limits for z. The one exception occurs when all the prediction functionals are linear combinations of the data functionals. Two forms of prior information are compared: a soft bound on X_E is a probability distribution p_x on X which describes the observer's opinion about where X_E is likely to be in X; a hard bound on X_E is an inequality Q_x(X_E, X_E) ≤ 1, where Q_x is a positive definite quadratic form on X. A hard bound Q_x can be softened to many different probability distributions p_x, but all these p_x's carry much new information about X_E which is absent from Q_x, and some information which contradicts Q_x. Both stochastic inversion (SI) and Bayesian inference (BI) estimate z from y and a soft prior bound p_x. If that probability distribution was obtained by softening a hard prior bound Q_x, rather than by objective statistical inference independent of y, then p_x contains so much unsupported new information absent from Q_x that conclusions about z obtained with SI or BI would seem to be suspect.
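The contradiction between a softened prior and the hard bound can be made quantitative. One natural softening of the hard bound Q_x(x, x) ≤ 1 is the zero-mean Gaussian with covariance Q_x⁻¹ (an assumption chosen for illustration; other softenings exist). Under that Gaussian, the quadratic form follows a chi-squared distribution with n degrees of freedom, so the softened prior's mass *outside* the hard ellipsoid is P(χ²_n > 1):

```python
from scipy.stats import chi2

# Mass that the Gaussian softening N(0, Q^-1) places outside the hard
# bound x'Qx <= 1, for several model-space dimensions n. Under this
# Gaussian, x'Qx ~ chi-squared with n degrees of freedom.
outside = {n: chi2.sf(1.0, df=n) for n in (1, 3, 10, 30)}
```

Already in 10 dimensions the softened prior puts essentially all of its probability mass outside the hard bound, which illustrates the abstract's point that a softened p_x carries information contradicting Q_x.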

  10. Comparing hard and soft prior bounds in geophysical inverse problems

    NASA Technical Reports Server (NTRS)

    Backus, George E.

    1987-01-01

In linear inversion of a finite-dimensional data vector y to estimate a finite-dimensional prediction vector z, prior information about X_E is essential if y is to supply useful limits for z. The one exception occurs when all the prediction functionals are linear combinations of the data functionals. Two forms of prior information are compared: a soft bound on X_E is a probability distribution p_x on X which describes the observer's opinion about where X_E is likely to be in X; a hard bound on X_E is an inequality Q_x(X_E, X_E) ≤ 1, where Q_x is a positive definite quadratic form on X. A hard bound Q_x can be softened to many different probability distributions p_x, but all these p_x's carry much new information about X_E which is absent from Q_x, and some information which contradicts Q_x. Both stochastic inversion (SI) and Bayesian inference (BI) estimate z from y and a soft prior bound p_x. If that probability distribution was obtained by softening a hard prior bound Q_x, rather than by objective statistical inference independent of y, then p_x contains so much unsupported new information absent from Q_x that conclusions about z obtained with SI or BI would seem to be suspect.

  11. The long-term financial consequences of breast cancer: a Danish registry-based cohort study.

    PubMed

    Jensen, Laura Schärfe; Overgaard, Charlotte; Bøggild, Henrik; Garne, Jens Peter; Lund, Thomas; Overvad, Kim; Fonager, Kirsten

    2017-10-30

A breast cancer diagnosis affects an individual's affiliation to the labour market, but the long-term consequences of breast cancer on income in a Danish setting have not been examined. The present study investigated whether breast cancer affected future income among Danish women who participated in the work force. We also examined the roles of sociodemographic factors and prior psychiatric medical treatment. This registry-based cohort study was based on information retrieved from linked Danish nationwide registries. We compared the incomes of 13,101 women (aged 30-59 years) diagnosed with breast cancer (exposed) to those of 60,819 women without breast cancer (unexposed). Changes in income were examined during a 10-year follow-up; for each follow-up year, we calculated the mean annual income and the relative change compared to the income earned one year prior to diagnosis. Expected changes in Danish female income, according to calendar year and age, were estimated based on information from Statistics Denmark. For the exposed and unexposed groups, the observed income changes were dichotomized into those above and those below the expected change in income in the Danish female population. We examined the impact of breast cancer on income in each year of follow-up with logistic regression models. Analyses were stratified according to educational level, marital status, and prior psychiatric medical treatment. Breast cancer had a temporary negative effect on income. The effect was largest during the first three years after diagnosis; thereafter, the gap between the exposed and unexposed cohorts narrowed. The odds ratio for an increase in income in the cancer cohort compared to the cancer-free cohort was 0.81 (95% CI 0.77-0.84) after three years. After seven years, no significant difference was observed between the cohorts. Stratified analyses demonstrated that the negative effect of breast cancer on income lasted longest among women with high educational levels. Being single or having received psychiatric medical treatment increased the chance of experiencing an increase in income among women with breast cancer. A breast cancer diagnosis led to negative effects on income, which were ameliorated over the following seven years. Sociodemographic factors and prior psychiatric medical treatment might influence the long-term consequences of breast cancer on income.

  12. A Study of the Impact Educational Setting Has on Academic Proficiency of American Indian Students as Measured by the Minnesota Comprehensive Assessment

    ERIC Educational Resources Information Center

    Hillstrom, Crowley

    2013-01-01

    The Minnesota Department of Education has collected Minnesota Comprehensive Assessments (MCA) results on every American Indian student who has taken the tests. This information has been made available so communities and parents can assess how their districts, schools, and students are performing based upon MCA proficiency criteria. Prior to this…

  13. Factors Affecting ICT Adoption among Distance Education Students Based on the Technology Acceptance Model--A Case Study at a Distance Education University in Iran

    ERIC Educational Resources Information Center

    Dastjerdi, Negin Barat

    2016-01-01

    The incorporation of Information and Communication Technologies (ICT) into education systems is an active program and movement in education that illustrates modern education and enables an all-encompassing presence in the third millennium; however, prior to applying ICT, the factors affecting the adoption and use of these technologies should be…

  14. 26 CFR 1.669(c)-2A - Computation of the beneficiary's income and tax for a prior taxable year.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... either the exact method or the short-cut method shall be determined by reference to the information... shows a mathematical error on its face which resulted in the wrong amount of tax being paid for such... amounts in such gross income, shall be based upon the return after the correction of such mathematical...

  15. Video as an effective method to deliver pretest information for rapid human immunodeficiency testing.

    PubMed

Merchant, Roland C; Clark, Melissa A; Mayer, Kenneth H; Seage III, George R; DeGruttola, Victor G; Becker, Bruce M

    2009-02-01

    Video-based delivery of human immunodeficiency virus (HIV) pretest information might assist in streamlining HIV screening and testing efforts in the emergency department (ED). The objectives of this study were to determine if the video "Do you know about rapid HIV testing?" is an acceptable alternative to an in-person information session on rapid HIV pretest information, in regard to comprehension of rapid HIV pretest fundamentals, and to identify patients who might have difficulties in comprehending pretest information. This was a noninferiority trial of 574 participants in an ED opt-in rapid HIV screening program who were randomly assigned to receive identical pretest information from either an animated and live-action 9.5-minute video or an in-person information session. Pretest information comprehension was assessed using a questionnaire. The video would be accepted as not inferior to the in-person information session if the 95% confidence interval (CI) of the difference (Delta) in mean scores on the questionnaire between the two information groups was less than a 10% decrease in the in-person information session arm's mean score. Linear regression models were constructed to identify patients with lower mean scores based upon study arm assignment, demographic characteristics, and history of prior HIV testing. The questionnaire mean scores were 20.1 (95% CI = 19.7 to 20.5) for the video arm and 20.8 (95% CI = 20.4 to 21.2) for the in-person information session arm. The difference in mean scores compared to the mean score for the in-person information session met the noninferiority criterion for this investigation (Delta = 0.68; 95% CI = 0.18 to 1.26). In a multivariable linear regression model, Blacks/African Americans, Hispanics, and those with Medicare and Medicaid insurance exhibited slightly lower mean scores, regardless of the pretest information delivery format. 
There was a strong relationship between fewer years of formal education and lower mean scores on the questionnaire. Age, gender, type of insurance, partner/marital status, and history of prior HIV testing were not predictive of scores on the questionnaire. In terms of patient comprehension of rapid HIV pretest information fundamentals, the video was an acceptable substitute to pretest information delivered by an HIV test counselor. Both the video and the in-person information session were less effective in providing pretest information for patients with fewer years of formal education.
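The noninferiority criterion described above can be written out directly: the video is accepted if the 95% confidence interval for the drop in mean score stays below 10% of the in-person arm's mean. The summary numbers below are taken from the abstract itself.

```python
# Noninferiority check using the abstract's reported summary statistics.

in_person_mean = 20.8
margin = 0.10 * in_person_mean   # largest tolerable drop: 10% of 20.8

delta = 0.68        # observed drop in mean score (in-person minus video)
ci_upper = 1.26     # upper 95% confidence limit of the drop

# The video arm is noninferior if even the worst plausible drop
# (the CI's upper limit) stays below the margin.
noninferior = ci_upper < margin
```

With an upper confidence limit of 1.26 points against a margin of about 2.08 points, the criterion is met, matching the abstract's conclusion.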

  16. Spatio-Temporal Information Analysis of Event-Related BOLD Responses

    PubMed Central

    Alpert, Galit Fuhrmann; Handwerker, Dan; Sun, Felice T.; D’Esposito, Mark; Knight, Robert T.

    2009-01-01

A new approach for analysis of event-related fMRI (BOLD) signals is proposed. The technique is based on measures from information theory and is used both for spatial localization of task-related activity, as well as for extracting temporal information regarding the task-dependent propagation of activation across different brain regions. This approach enables whole brain visualization of voxels (areas) most involved in coding of a specific task condition, the time at which they are most informative about the condition, as well as their average amplitude at that preferred time. The approach does not require prior assumptions about the shape of the hemodynamic response function (HRF), nor about linear relations between BOLD response and presented stimuli (or task conditions). We show that relative delays between different brain regions can also be computed without prior knowledge of the experimental design, suggesting a general method that could be applied for analysis of differential time delays that occur during natural, uncontrolled conditions. Here we analyze BOLD signals recorded during performance of a motor learning task. We show that during motor learning, the BOLD response of unimodal motor cortical areas precedes the response in higher-order multimodal association areas, including posterior parietal cortex. Brain areas found to be associated with reduced activity during motor learning, predominantly in prefrontal brain regions, are informative about the task typically at significantly later times. PMID:17188515
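The information-theoretic quantity behind such an analysis is the mutual information between the task condition and the (discretized) voxel response. The sketch below is a plug-in histogram estimator on invented toy data, not the paper's estimator with its bias corrections.

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in histogram estimate of I(X; Y) in bits."""
    n = len(xs)
    pxy = Counter(zip(xs, ys))   # joint counts
    px = Counter(xs)             # marginal counts of X
    py = Counter(ys)             # marginal counts of Y
    mi = 0.0
    for (x, y), c in pxy.items():
        # (c/n) * log2( p(x,y) / (p(x) p(y)) ), with counts cancelled to c*n
        mi += (c / n) * math.log2(c * n / (px[x] * py[y]))
    return mi

# Toy data: a binary task condition and a discretized voxel response.
cond = [0, 0, 0, 0, 1, 1, 1, 1]
resp = ["lo", "lo", "lo", "hi", "hi", "hi", "hi", "lo"]
mi = mutual_information(cond, resp)
```

Ranking voxels by this quantity, evaluated at different lags relative to stimulus onset, yields both the spatial map and the "most informative time" described in the abstract.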

  17. Explanation and Prior Knowledge Interact to Guide Learning

    ERIC Educational Resources Information Center

    Williams, Joseph J.; Lombrozo, Tania

    2013-01-01

    How do explaining and prior knowledge contribute to learning? Four experiments explored the relationship between explanation and prior knowledge in category learning. The experiments independently manipulated whether participants were prompted to explain the category membership of study observations and whether category labels were informative in…

  18. Dealing with difficult deformations: construction of a knowledge-based deformation atlas

    NASA Astrophysics Data System (ADS)

    Thorup, S. S.; Darvann, T. A.; Hermann, N. V.; Larsen, P.; Ólafsdóttir, H.; Paulsen, R. R.; Kane, A. A.; Govier, D.; Lo, L.-J.; Kreiborg, S.; Larsen, R.

    2010-03-01

    Twenty-three Taiwanese infants with unilateral cleft lip and palate (UCLP) were CT-scanned before lip repair at the age of 3 months, and again after lip repair at the age of 12 months. In order to evaluate the surgical result, detailed point correspondence between pre- and post-surgical images was needed. We have previously demonstrated that non-rigid registration using B-splines is able to provide automated determination of point correspondences in populations of infants without cleft lip. However, this type of registration fails when applied to the task of determining the complex deformation from before to after lip closure in infants with UCLP. The purpose of the present work was to show that use of prior information about typical deformations due to lip closure, through the construction of a knowledge-based atlas of deformations, could overcome the problem. Initially, mean volumes (atlases) for the pre- and post-surgical populations, respectively, were automatically constructed by non-rigid registration. An expert placed corresponding landmarks in the cleft area in the two atlases; this provided the prior information used to build a knowledge-based deformation atlas. We modeled the change from pre- to post-surgery using thin-plate spline warping. The registration results are convincing and represent a first move towards an automatic registration method for dealing with difficult deformations due to this type of surgery.
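
    The pre- to post-surgical deformation model can be sketched as a thin-plate spline fit to expert landmarks, here using SciPy's RBFInterpolator with a thin-plate kernel. The landmark coordinates below are invented for illustration (2D for brevity; the study uses 3D CT volumes) and are not from the study.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical corresponding landmarks placed by an expert in the pre- and
# post-surgical atlases; only one cleft-area point actually moves.
pre_landmarks = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
                          [1.0, 1.0], [0.5, 0.2]])
post_landmarks = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0],
                           [1.0, 1.0], [0.5, 0.45]])   # cleft-area point moved

# Thin-plate spline warp: each post-surgical coordinate is interpolated as a
# smooth function of the pre-surgical coordinates (exact at the landmarks).
warp = RBFInterpolator(pre_landmarks, post_landmarks,
                       kernel='thin_plate_spline')

# Apply the learned deformation to arbitrary points of the pre-surgical atlas
warped = warp(np.array([[0.5, 0.5], [0.25, 0.1]]))
```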

  19. Integrative Bayesian variable selection with gene-based informative priors for genome-wide association studies.

    PubMed

    Zhang, Xiaoshuai; Xue, Fuzhong; Liu, Hong; Zhu, Dianwen; Peng, Bin; Wiemels, Joseph L; Yang, Xiaowei

    2014-12-10

    Genome-wide Association Studies (GWAS) are typically designed to identify phenotype-associated single nucleotide polymorphisms (SNPs) individually using univariate analysis methods. Though providing valuable insights into genetic risks of common diseases, the genetic variants identified by GWAS generally account for only a small proportion of the total heritability for complex diseases. To address this "missing heritability" problem, we implemented a strategy called integrative Bayesian Variable Selection (iBVS), which is based on a hierarchical model that incorporates an informative prior by considering the gene interrelationship as a network. It was applied here to both simulated and real data sets. Simulation studies indicated that the iBVS method was advantageous in its performance, with the highest AUC in both variable selection and outcome prediction, when compared to Stepwise and LASSO-based strategies. In an analysis of a leprosy case-control study, iBVS selected 94 SNPs as predictors, while LASSO selected 100 SNPs. The Stepwise regression yielded a more parsimonious model with only 3 SNPs. The prediction results demonstrated that the iBVS method had performance comparable with that of LASSO, but better than the Stepwise strategy. The proposed iBVS strategy is a novel and valid method for genome-wide association studies, with the additional advantage that it produces more interpretable posterior probabilities for each variable, unlike LASSO and other penalized regression methods.
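
    The LASSO comparator in the study can be sketched as L1-penalized logistic regression on a genotype matrix. The sketch below uses simulated 0/1/2 additive genotype codes and a hypothetical effect size, not the leprosy data, and the penalty strength C is an assumption.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n, p = 300, 50                                     # subjects x SNPs (toy scale)
X = rng.integers(0, 3, size=(n, p)).astype(float)  # additive genotype coding 0/1/2
beta = np.zeros(p)
beta[:5] = 0.8                                     # five truly associated SNPs
logits = X @ beta
logits -= logits.mean()                            # balance case/control rates
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-logits)))

# L1-penalized (LASSO-type) logistic regression zeroes out most noise SNPs,
# yielding a sparse predictor set analogous to the 100 SNPs LASSO selected
lasso = LogisticRegression(penalty='l1', solver='liblinear', C=0.5).fit(X, y)
selected = np.flatnonzero(lasso.coef_[0])          # indices of selected SNPs
```

    Unlike a fully Bayesian approach such as iBVS, the nonzero coefficients here come with no posterior inclusion probabilities, which is the interpretability gap the abstract refers to.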

  20. Task switching in rhesus macaques (Macaca mulatta) and tufted capuchin monkeys (Cebus apella) during computerized categorization tasks.

    PubMed

    Smith, Travis R; Beran, Michael J

    2018-05-31

    The present experiments extended to monkeys a previously used abstract categorization procedure (Castro & Wasserman, 2016) where pigeons had categorized arrays of clipart icons based upon two task rules: the number of clipart objects in the array or the variability of objects in the array. Experiment 1 replicated Castro and Wasserman with capuchin monkeys and rhesus monkeys and found that monkeys' performances were similar to pigeons' in terms of acquisition, pattern of errors, and the absence of switch costs. Furthermore, monkeys' insensitivity to the added irrelevant information suggested that an associative (rather than rule-based) categorization mechanism was dominant. Experiment 2 was conducted to include categorization cue reversals to determine (a) whether the monkeys would quickly adapt to the reversals and inhibit interference from a prereversal task rule (consistent with a rule-based mechanism) and (b) whether the latency to make a response prior to a correct or incorrect outcome was informative about the presence of a cognitive mechanism. The cue reassignment produced profound and long-lasting performance deficits, and a long reacquisition phase suggested the involvement of associative learning processes; however, monkeys also displayed longer latencies to choose prior to correct responses on challenging trials, suggesting the involvement of nonassociative processes. Together, these performances suggest a mix of associative and cognitive-control processes governing monkey categorization judgments. (PsycINFO Database Record (c) 2018 APA, all rights reserved).

  1. Standard Anatomic Terminologies: Comparison for Use in a Health Information Exchange–Based Prior Computed Tomography (CT) Alerting System

    PubMed Central

    Lowry, Tina; Vreeman, Daniel J; Loo, George T; Delman, Bradley N; Thum, Frederick L; Slovis, Benjamin H; Shapiro, Jason S

    2017-01-01

    Background A health information exchange (HIE)–based prior computed tomography (CT) alerting system may reduce avoidable CT imaging by notifying ordering clinicians of prior relevant studies when a study is ordered. For maximal effectiveness, a system would alert not only for prior same CTs (exams mapped to the same code from an exam name terminology) but also for similar CTs (exams mapped to different exam name terminology codes but in the same anatomic region) and anatomically proximate CTs (exams in adjacent anatomic regions). Notification of previous same studies across an HIE requires mapping of local site CT codes to a standard terminology for exam names (such as Logical Observation Identifiers Names and Codes [LOINC]) to show that two studies with different local codes and descriptions are equivalent. Notifying of prior similar or proximate CTs requires an additional mapping of exam codes to anatomic regions, ideally coded by an anatomic terminology. Several anatomic terminologies exist, but no prior studies have evaluated how well they would support an alerting use case. Objective The aim of this study was to evaluate the fitness of five existing standard anatomic terminologies to support similar or proximate alerts of an HIE-based prior CT alerting system. Methods We compared five standard anatomic terminologies (Foundational Model of Anatomy, Systematized Nomenclature of Medicine Clinical Terms, RadLex, LOINC, and LOINC/Radiological Society of North America [RSNA] Radiology Playbook) to an anatomic framework created specifically for our use case (Simple ANatomic Ontology for Proximity or Similarity [SANOPS]), to determine whether the existing terminologies could support our use case without modification. On the basis of an assessment of optimal terminology features for our purpose, we developed an ordinal anatomic terminology utility classification. 
We mapped samples of 100 random and the 100 most frequent LOINC CT codes to anatomic regions in each terminology, assigned utility classes for each mapping, and statistically compared each terminology’s utility class rankings. We also constructed seven hypothetical alerting scenarios to illustrate the terminologies’ differences. Results Both RadLex and the LOINC/RSNA Radiology Playbook anatomic terminologies ranked significantly better (P<.001) than the other standard terminologies for the 100 most frequent CTs, but no terminology ranked significantly better than any other for 100 random CTs. Hypothetical scenarios illustrated instances where no standard terminology would support appropriate proximate or similar alerts, without modification. Conclusions LOINC/RSNA Radiology Playbook and RadLex’s anatomic terminologies appear well suited to support proximate or similar alerts for commonly ordered CTs, but for less commonly ordered tests, modification of the existing terminologies with concepts and relations from SANOPS would likely be required. Our findings suggest SANOPS may serve as a framework for enhancing anatomic terminologies in support of other similar use cases. PMID:29242174

  2. Modelling heterogeneity variances in multiple treatment comparison meta-analysis – Are informative priors the better solution?

    PubMed Central

    2013-01-01

    Background Multiple treatment comparison (MTC) meta-analyses are commonly modeled in a Bayesian framework, and weakly informative priors are typically preferred to mirror familiar data driven frequentist approaches. Random-effects MTCs have commonly modeled heterogeneity under the assumption that the between-trial variance for all involved treatment comparisons are equal (i.e., the ‘common variance’ assumption). This approach ‘borrows strength’ for heterogeneity estimation across treatment comparisons, and thus adds valuable precision when data are sparse. The homogeneous variance assumption, however, is unrealistic and can severely bias variance estimates. Consequently, 95% credible intervals may not retain nominal coverage, and treatment rank probabilities may become distorted. Relaxing the homogeneous variance assumption may be equally problematic due to reduced precision. To regain good precision, moderately informative variance priors or additional mathematical assumptions may be necessary. Methods In this paper we describe four novel approaches to modeling heterogeneity variance: two novel model structures and two approaches to the use of moderately informative variance priors. We examine the relative performance of all approaches in two illustrative MTC data sets. We particularly compare between-study heterogeneity estimates and model fits, treatment effect estimates and 95% credible intervals, and treatment rank probabilities. Results In both data sets, use of moderately informative variance priors constructed from the pairwise meta-analysis data yielded the best model fit and narrower credible intervals. Imposing consistency equations on variance estimates, assuming variances to be exchangeable, or using empirically informed variance priors also yielded good model fits and narrow credible intervals. The homogeneous variance model yielded high precision at all times, but overall inadequate estimates of between-trial variances. 
    Lastly, treatment rankings were similar among the novel approaches, but considerably different when compared with the homogeneous variance approach. Conclusions MTC models using a homogeneous variance structure appear to perform sub-optimally when between-trial variances vary between comparisons. Using informative variance priors, assuming exchangeability or imposing consistency between heterogeneity variances can all ensure sufficiently reliable and realistic heterogeneity estimation, and thus more reliable MTC inferences. All four approaches should be viable candidates for replacing or supplementing the conventional homogeneous variance MTC model, which is currently the most widely used in practice. PMID:23311298

  3. Uncertainty in sample estimates and the implicit loss function for soil information.

    NASA Astrophysics Data System (ADS)

    Lark, Murray

    2015-04-01

    One significant challenge in the communication of uncertain information is how to enable the sponsors of sampling exercises to make a rational choice of sample size. One way to do this is to compute the value of additional information given the loss function for errors. The loss function expresses the costs that result from decisions made using erroneous information. In certain circumstances, such as remediation of contaminated land prior to development, loss functions can be computed and used to guide rational decision making on the amount of resource to spend on sampling to collect soil information. In many circumstances the loss function cannot be obtained prior to decision making. This may be the case when multiple decisions may be based on the soil information and the costs of errors are hard to predict. The implicit loss function is proposed as a tool to aid decision making in these circumstances. Conditional on a logistical model that expresses costs of soil sampling as a function of effort, and statistical information from which the error of estimates can be modelled as a function of effort, the implicit loss function is the loss function that makes a particular decision on effort rational. In this presentation the implicit loss function is defined and computed for a number of arbitrary decisions on sampling effort for a hypothetical soil monitoring problem. This is based on a logistical model of sampling cost parameterized from a recent geochemical survey of soil in Donegal, Ireland, and on statistical parameters estimated with the aid of a process model for change in soil organic carbon. It is shown how the implicit loss function might provide a basis for reflection on a particular choice of sample size by comparing it with the values attributed to soil properties and functions. Scope for further research to develop and apply the implicit loss function to help decision making by policy makers and regulators is then discussed.
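
    Under assumed functional forms (not necessarily those of the presentation), the implicit loss function admits a simple closed-form sketch: with linear sampling cost c(n) = c0 + c1·n and estimation variance v(n) = s2/n entering the loss as λ·v(n), a chosen sample size n* is rational exactly when the marginal cost of one more sample equals the marginal reduction in loss, giving λ = c1·n*²/s2. All parameter values below are hypothetical.

```python
# Implicit loss coefficient lambda that makes a chosen sample size n* rational,
# assuming (hypothetically) linear sampling cost c(n) = c0 + c1*n and
# estimation variance v(n) = s2/n entering the loss as lambda * v(n).
# Rationality at n*: marginal cost = marginal loss reduction, i.e.
#   c1 = lambda * s2 / n**2   =>   lambda = c1 * n**2 / s2
def implicit_loss_coefficient(n_star, c1, s2):
    return c1 * n_star ** 2 / s2

# Hypothetical survey: each extra sample costs 50 units, process variance 4.0
lam_by_n = {n: implicit_loss_coefficient(n, c1=50.0, s2=4.0)
            for n in (25, 100, 400)}
# A sponsor choosing a larger n* implicitly attaches a higher cost to error
```

    Comparing the implied λ with the value actually attributed to the soil property then reveals whether the chosen effort was too small or too large.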

  4. The use of multiple models in case-based diagnosis

    NASA Technical Reports Server (NTRS)

    Karamouzis, Stamos T.; Feyock, Stefan

    1993-01-01

    The work described in this paper has as its goal the integration of a number of reasoning techniques into a unified intelligent information system that will aid flight crews with malfunction diagnosis and prognostication. One of these approaches involves using the extensive archive of information contained in aircraft accident reports along with various models of the aircraft as the basis for case-based reasoning about malfunctions. Case-based reasoning draws conclusions on the basis of similarities between the present situation and prior experience. We maintain that the ability of a CBR program to reason about physical systems is significantly enhanced by the addition to the CBR program of various models. This paper describes the diagnostic concepts implemented in a prototypical case-based reasoner that operates in the domain of in-flight fault diagnosis, the various models used in conjunction with the reasoner's CBR component, and results from a preliminary evaluation.

  5. Methods for estimating population density in data-limited areas: evaluating regression and tree-based models in Peru.

    PubMed

    Anderson, Weston; Guikema, Seth; Zaitchik, Ben; Pan, William

    2014-01-01

    Obtaining accurate small area estimates of population is essential for policy and health planning but is often difficult in countries with limited data. In lieu of available population data, small area estimate models draw information from previous time periods or from similar areas. This study focuses on model-based methods for estimating population when no direct samples are available in the area of interest. To explore the efficacy of tree-based models for estimating population density, we compare six different model structures including Random Forest and Bayesian Additive Regression Trees. Results demonstrate that without information from prior time periods, non-parametric tree-based models produced more accurate predictions than did conventional regression methods. Improving estimates of population density in non-sampled areas is important for regions with incomplete census data and has implications for economic, health and development policies.
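
    The tree-versus-regression comparison can be sketched with scikit-learn on simulated data: a non-linear, thresholded density surface that a Random Forest captures but a linear regression cannot. The covariate names and data-generating process below are invented for illustration and are not from the Peru study.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(7)
n = 500
# Hypothetical area-level covariates (e.g., nightlights, roads, elevation)
X = rng.uniform(0.0, 1.0, size=(n, 3))
# Non-linear, thresholded "population density" surface plus noise
y = 1000.0 * X[:, 0] ** 2 + 300.0 * (X[:, 1] > 0.5) + rng.normal(0.0, 20.0, n)

train, test = slice(0, 400), slice(400, None)
rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X[train], y[train])
lin = LinearRegression().fit(X[train], y[train])

def rmse(model):
    return float(np.sqrt(np.mean((model.predict(X[test]) - y[test]) ** 2)))

rmse_rf, rmse_lin = rmse(rf), rmse(lin)   # trees capture the non-linearity
```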

  6. Residual translation compensations in radar target narrowband imaging based on trajectory information

    NASA Astrophysics Data System (ADS)

    Yue, Wenjue; Peng, Bo; Wei, Xizhang; Li, Xiang; Liao, Dongping

    2018-05-01

    High-velocity translation results in defocused scattering centers in radar imaging. In this paper, we propose a Residual Translation Compensations (RTC) method based on target trajectory information to eliminate translation effects in radar imaging. In reality, translation cannot simply be regarded as uniformly accelerated motion, so prior knowledge of the target trajectory is introduced to enhance compensation precision. First, we use the two-body orbit model to compute the radial distance. Then, stepwise compensations are applied to eliminate the residual propagation delay based on the conjugate multiplication method. Finally, tomography is used to confirm the validity of the method. Compared with a translation parameter estimation method based on the spectral peak of the conjugate-multiplied signal, the RTC method yields a better tomography result. Even when the Signal-to-Noise Ratio (SNR) of the radar echo signal is 4 dB, the scattering centers can still be extracted clearly.

  7. Methods for Estimating Population Density in Data-Limited Areas: Evaluating Regression and Tree-Based Models in Peru

    PubMed Central

    Anderson, Weston; Guikema, Seth; Zaitchik, Ben; Pan, William

    2014-01-01

    Obtaining accurate small area estimates of population is essential for policy and health planning but is often difficult in countries with limited data. In lieu of available population data, small area estimate models draw information from previous time periods or from similar areas. This study focuses on model-based methods for estimating population when no direct samples are available in the area of interest. To explore the efficacy of tree-based models for estimating population density, we compare six different model structures including Random Forest and Bayesian Additive Regression Trees. Results demonstrate that without information from prior time periods, non-parametric tree-based models produced more accurate predictions than did conventional regression methods. Improving estimates of population density in non-sampled areas is important for regions with incomplete census data and has implications for economic, health and development policies. PMID:24992657

  8. Improving treatment intensification to reduce cardiovascular disease risk: a cluster randomized trial

    PubMed Central

    2012-01-01

    Background Blood pressure, lipid, and glycemic control are essential for reducing cardiovascular disease (CVD) risk. Many health care systems have successfully shifted aspects of chronic disease management, including population-based outreach programs designed to address CVD risk factor control, to non-physicians. The purpose of this study is to evaluate provision of new information to non-physician outreach teams on need for treatment intensification in patients with increased CVD risk. Methods Cluster randomized trial (July 1-December 31, 2008) in Kaiser Permanente Northern California registry of members with diabetes mellitus, prior CVD diagnoses and/or chronic kidney disease who were high-priority for treatment intensification (blood pressure ≥ 140 mmHg systolic, LDL-cholesterol ≥ 130 mg/dl, or hemoglobin A1c ≥ 9%; adherent to current medications; no recent treatment intensification). Randomization units were medical center-based outreach teams (4 intervention; 4 control). For intervention teams, priority flags for intensification were added monthly to the registry database with recommended next pharmacotherapeutic steps for each eligible patient. Control teams used the same database without this information. Outcomes included 3-month rates of treatment intensification and risk factor levels during follow-up. Results Baseline risk factor control rates were high (82-90%). In eligible patients, the intervention was associated with significantly greater 3-month intensification rates for blood pressure (34.1 vs. 30.6%) and LDL-cholesterol (28.0 vs. 22.7%), but not A1c. No effects on risk factors were observed at 3 months or 12 months follow-up. Intervention teams initiated outreach for only 45-47% of high-priority patients, but also for 27-30% of lower-priority patients. Teams reported difficulties adapting prior outreach strategies to incorporate the new information. 
Conclusions Information enhancement did not improve risk factor control compared to existing outreach strategies at control centers. Familiarity with prior, relatively successful strategies likely reduced uptake of the innovation and its potential for success at intervention centers. Trial registration ClinicalTrials.gov Identifier NCT00517686 PMID:22747998

  9. Data quality and processing for decision making: divergence between corporate strategy and manufacturing processes

    NASA Astrophysics Data System (ADS)

    McNeil, Ronald D.; Miele, Renato; Shaul, Dennis

    2000-10-01

    Information technology is driving improvements in manufacturing systems. Results are higher productivity and quality. However, corporate strategy is driven by a number of factors and includes data and pressure from multiple stakeholders, which includes employees, managers, executives, stockholders, boards, suppliers and customers. It is also driven by information about competitors and emerging technology. Much information is based on processing of data and the resulting biases of the processors. Thus, stakeholders can base inputs on faulty perceptions, which are not reality based. Prior to processing, data used may be inaccurate. Sources of data and information may include demographic reports, statistical analyses, intelligence reports (e.g., marketing data), technology and primary data collection. The reliability and validity of data, as well as the management of sources and information, are critical elements of strategy formulation. The paper explores data collection, processing and analyses from secondary and primary sources, information generation and report presentation for strategy formulation, and contrasts this with data and information utilized to drive internal processes such as manufacturing. The hypothesis is that internal processes, such as manufacturing, are subordinate to corporate strategies. The impact of possible divergence in quality of decisions at the corporate level on IT-driven, quality-manufacturing processes based on measurable outcomes is significant. Recommendations for IT improvements at the corporate strategy level are given.

  10. Optimized tomography of continuous variable systems using excitation counting

    NASA Astrophysics Data System (ADS)

    Shen, Chao; Heeres, Reinier W.; Reinhold, Philip; Jiang, Luyao; Liu, Yi-Kai; Schoelkopf, Robert J.; Jiang, Liang

    2016-11-01

    We propose a systematic procedure to optimize quantum state tomography protocols for continuous variable systems based on excitation counting preceded by a displacement operation. Compared with conventional tomography based on Husimi or Wigner function measurement, the excitation counting approach can significantly reduce the number of measurement settings. We investigate both informational completeness and robustness, and provide a bound on the reconstruction error involving the condition number of the sensing map. We also identify the measurement settings that optimize this error bound, and demonstrate that the improved reconstruction robustness can lead to an order-of-magnitude reduction of estimation error with given resources. This optimization procedure is general and can incorporate prior information of the unknown state to further simplify the protocol.
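
    The role of the sensing map's condition number in the reconstruction-error bound can be illustrated numerically: well-spread measurement settings yield a well-conditioned map, while nearly redundant settings do not. The matrices below are generic random stand-ins, not actual displacement-and-counting operators.

```python
import numpy as np

def condition_number(A):
    """Ratio of largest to smallest singular value of the sensing matrix."""
    s = np.linalg.svd(A, compute_uv=False)
    return float(s[0] / s[-1])

rng = np.random.default_rng(1)
d = 16                          # truncated Hilbert-space dimension (toy)

# Rows play the role of vectorized measurement settings; the reconstruction
# error bound scales with the condition number of this sensing matrix.
spread_settings = rng.normal(size=(3 * d, d))                # well-spread rows
redundant_settings = (np.tile(rng.normal(size=(1, d)), (3 * d, 1))
                      + 1e-3 * rng.normal(size=(3 * d, d)))  # nearly identical rows

cond_good = condition_number(spread_settings)
cond_bad = condition_number(redundant_settings)   # orders of magnitude larger
```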

  11. A Semi-Discrete Landweber-Kaczmarz Method for Cone Beam Tomography and Laminography Exploiting Geometric Prior Information

    NASA Astrophysics Data System (ADS)

    Vogelgesang, Jonas; Schorr, Christian

    2016-12-01

    We present a semi-discrete Landweber-Kaczmarz method for solving linear ill-posed problems and its application to Cone Beam tomography and laminography. Using a basis function-type discretization in the image domain, we derive a semi-discrete model of the underlying scanning system. Based on this model, the proposed method provides an approximate solution of the reconstruction problem, i.e., reconstructing the density function of a given object from its projections, in suitable subspaces equipped with basis function-dependent weights. This approach intuitively allows the incorporation of additional information about the inspected object, leading to a more accurate model of the X-rays through the object. Also, physical conditions of the scanning geometry, such as flat detectors in computed tomography as used in non-destructive testing applications, as well as non-regular scanning curves, e.g., those appearing in computed laminography (CL), are directly taken into account during the modeling process. Finally, numerical experiments of a typical CL application in three dimensions are provided to verify the proposed method. The introduction of geometric prior information leads to a significantly increased image quality and superior reconstructions compared to standard iterative methods.
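
    The fully discrete core of a Landweber-Kaczmarz scheme is the cyclic row-wise projection iteration for Ax = b. The sketch below omits the paper's semi-discrete basis-function weights and geometric modeling and just shows the basic iteration on a toy consistent system.

```python
import numpy as np

def kaczmarz(A, b, n_sweeps=200, relax=1.0):
    """Cyclic Kaczmarz iteration: successively project the iterate onto the
    hyperplane defined by one row (measurement) of A x = b."""
    x = np.zeros(A.shape[1])
    for _ in range(n_sweeps):
        for i in range(A.shape[0]):
            a = A[i]
            x = x + relax * (b[i] - a @ x) / (a @ a) * a
    return x

# Toy consistent reconstruction problem (40 "projections", 10 unknowns)
rng = np.random.default_rng(3)
A = rng.normal(size=(40, 10))
x_true = rng.normal(size=10)
b = A @ x_true
x_rec = kaczmarz(A, b)
```

    For noisy (inconsistent) data the iteration is typically stopped early or under-relaxed (relax < 1), which is where the regularizing character of the method comes from.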

  12. Mapping the timecourse of goal-directed attention to location and colour in human vision.

    PubMed

    Adams, Rachel C; Chambers, Christopher D

    2012-03-01

    Goal-directed attention prioritises perception of task-relevant stimuli according to location, features, or onset time. In this study we compared the behavioural timecourse of goal-directed selection to locations and colours by varying the stimulus-onset asynchrony (SOA) between cue and target in a strategic cueing paradigm. Participants reported the presence or absence of a target following prior information regarding its location or colour. Results revealed that preparatory selection by colour is more effective at enhancing perceptual sensitivity than selection by location, even though both types of cue provided equivalent overall information. More detailed analysis revealed that this advantage arose due to a limitation of spatial attention in maintaining a sufficiently broad focus (>2°) for target detection across multiple stimuli. In contrast, when target stimuli fell within 2° of the spatial attention spotlight, the strategic advantages and speed of spatial and colour attention were equated. Our findings are consistent with the conclusion that, under spatially optimal conditions, prior spatial and colour information are equally proficient at guiding top-down selection. When spatial locations are ambiguous, however, colour-based selection is the more efficient mechanism. Copyright © 2012 Elsevier B.V. All rights reserved.

  13. Knowledge and practices regarding menstruation among adolescent girls in an urban slum, Bijapur.

    PubMed

    Udgiri, Rekha; Angadi, M M; Patil, Shailaja; Sorganvi, Vijaya

    2010-08-01

    Adolescence is a crucial period in a woman's life. The adolescent girls of today are the mothers of tomorrow, in whose hands lies the future of the family, the community and the nation. Because of the scarcity of information regarding the problems of adolescent girls, particularly in urban areas, the present study was undertaken to elicit information about the knowledge and practices regarding menstruation among adolescent girls. With this objective, a community-based cross-sectional study was done in an urban field practice area of BLDEA's Shri BM Patil Medical College, Bijapur. The study subjects included all adolescent girls who had attained menarche. Data were collected by the questionnaire method and analysed. Out of 342 adolescent girls, 324 (94.74%) were literate. Only 63 (18.42%) had knowledge about menstruation prior to attainment of menarche, and this association was found to be statistically significant. The main source of information about menstruation was the mother (195; 57.01%). Nearly 81.58% of adolescent girls lacked knowledge about menstruation prior to menarche; this reflects the standard of awareness in society regarding such an important event and also contributes to negative reactions to menarche.

  14. Automatic seizure detection based on the combination of newborn multi-channel EEG and HRV information

    NASA Astrophysics Data System (ADS)

    Mesbah, Mostefa; Balakrishnan, Malarvili; Colditz, Paul B.; Boashash, Boualem

    2012-12-01

    This article proposes a new method for newborn seizure detection that uses information extracted from both multi-channel electroencephalogram (EEG) and a single channel electrocardiogram (ECG). The aim of the study is to assess whether additional information extracted from ECG can improve the performance of seizure detectors based solely on EEG. Two different approaches were used to combine this extracted information. The first approach, known as feature fusion, involves combining features extracted from EEG and heart rate variability (HRV) into a single feature vector prior to feeding it to a classifier. The second approach, called classifier or decision fusion, is achieved by combining the independent decisions of the EEG and the HRV-based classifiers. Tested on recordings obtained from eight newborns with identified EEG seizures, the proposed neonatal seizure detection algorithms achieved 95.20% sensitivity and 88.60% specificity for the feature fusion case and 95.20% sensitivity and 94.30% specificity for the classifier fusion case. These results are considerably better than those involving classifiers using EEG only (80.90%, 86.50%) or HRV only (85.70%, 84.60%).
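
    The two fusion strategies can be sketched with scikit-learn: feature fusion concatenates EEG and HRV feature vectors before a single classifier, while decision fusion averages the posterior probabilities of two independently trained classifiers. The feature dimensions and effect sizes below are invented for illustration, not derived from the neonatal recordings.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(5)
n = 600
y = rng.integers(0, 2, size=n)                    # seizure / non-seizure epochs
eeg = rng.normal(size=(n, 8)) + 0.8 * y[:, None]  # hypothetical EEG features
hrv = rng.normal(size=(n, 3)) + 0.5 * y[:, None]  # hypothetical HRV features

eeg_tr, eeg_te, hrv_tr, hrv_te, y_tr, y_te = train_test_split(
    eeg, hrv, y, test_size=0.33, random_state=0)

# Feature fusion: concatenate EEG and HRV features, then train one classifier
feat_clf = LogisticRegression(max_iter=1000).fit(np.hstack([eeg_tr, hrv_tr]), y_tr)
acc_feature = feat_clf.score(np.hstack([eeg_te, hrv_te]), y_te)

# Decision fusion: train independent classifiers, then average their posteriors
eeg_clf = LogisticRegression(max_iter=1000).fit(eeg_tr, y_tr)
hrv_clf = LogisticRegression(max_iter=1000).fit(hrv_tr, y_tr)
p_avg = (eeg_clf.predict_proba(eeg_te)[:, 1]
         + hrv_clf.predict_proba(hrv_te)[:, 1]) / 2
acc_decision = float(np.mean((p_avg > 0.5) == y_te))
```

    Decision fusion has the practical advantage that each modality's classifier can be tuned, or dropped out, independently; feature fusion lets the classifier learn cross-modal interactions.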

  15. Comprehensive surface geophysical investigation of karst caves ahead of the tunnel face: A case study in the Xiaoheyan section of the water supply project from Songhua River, Jilin, China

    NASA Astrophysics Data System (ADS)

    Bin, Liu; Zhengyu, Liu; Shucai, Li; Lichao, Nie; Maoxin, Su; Huaifeng, Sun; Kerui, Fan; Xinxin, Zhang; Yonghao, Pang

    2017-09-01

    This paper describes a comprehensive surface geophysical investigation of underground karst systems ahead of the tunnel face in the Xiaoheyan section of the main line of the water supply project from the Songhua River, located in Jilin, China. For an accurate investigation, Surface Electrical Resistivity Tomography (S-ERT), the Transient Electromagnetic Method (TEM), Geological Drilling (Geo-D) and Three-dimensional Cross-hole Electrical Resistivity Tomography (3D cross-hole ERT) were combined to obtain a comprehensive interpretation. First, S-ERT and TEM were used to detect and delineate the underground karst zone. Based on those detection results, surface and in-tunnel Geo-D were placed in the major areas to obtain more specific and accurate information. After that, survey lines of 3D cross-hole ERT were used for detailed exploration of the underground karst system. The central question in such a comprehensive investigation is how to make the best use of prior information so as to improve detection quality. The paper puts forward strategies for making full use of the available information in data processing. The main ideas are: (1) take the resistivity distribution of the subsurface strata obtained by S-ERT inversion as the initial model for TEM inversion; (2) position boreholes using the S-ERT and TEM results, then use those boreholes to obtain more accurate information about the resistivity of the subsurface strata; (3) through comprehensive analysis of the S-ERT, TEM and Geo-D information, set the initial model for the 3D cross-hole resistivity inversion and, at the same time, obtain the variation range of the stratum resistivity. Finally, a 3D cross-hole resistivity inversion based on the incorporated initial model and inequality constraints is conducted. Constrained inversion and joint interpretation are realized through the effective use of prior information, which helps to suppress the non-uniqueness of the inversion and thus raises its reliability. In this way, a detailed 3D model of the underground karst system extending 30 m ahead of the tunnel face is finally formed. The paper closes with a geological sketch of the revealed karst caves, which illustrates the effectiveness of the presented strategy. In summary, in the comprehensive investigation of underground karst caves, the integrated use of prior information helps to yield more accurate and detailed results.
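The incorporated-initial-model-plus-inequality-constraint inversion described in strategy (3) can be sketched in miniature. The toy below is a hypothetical illustration, not the authors' code: a small linear forward operator stands in for the 3D resistivity kernel, the S-ERT/TEM model serves as both starting point and regularization target, and borehole-derived bounds are enforced by projection.

```python
import numpy as np

# Illustrative sketch (not the paper's implementation): solve
#   m = argmin ||G m - d||^2 + lam * ||m - m_prior||^2  subject to  lo <= m <= hi
# by projected gradient descent, so the prior model biases the solution and
# the inequality constraint keeps resistivities in the borehole-derived range.

def constrained_inversion(G, d, m_prior, lam, lo, hi, n_iter=500):
    m = m_prior.copy()                      # prior model as the initial model
    # step size bounded by the largest eigenvalue of the Hessian G^T G + lam I
    step = 1.0 / (np.linalg.norm(G, 2) ** 2 + lam)
    for _ in range(n_iter):
        grad = G.T @ (G @ m - d) + lam * (m - m_prior)
        m = np.clip(m - step * grad, lo, hi)  # enforce inequality constraints
    return m

# Toy example: three "layers", data produced by a smoothing kernel G.
G = np.array([[1.0, 0.5, 0.0],
              [0.0, 1.0, 0.5],
              [0.5, 0.0, 1.0]])
m_true = np.array([100.0, 20.0, 300.0])       # resistivities (ohm-m), invented
d = G @ m_true
m_prior = np.array([80.0, 30.0, 250.0])       # hypothetical S-ERT/TEM model
lo = np.array([10.0, 5.0, 50.0])              # hypothetical borehole bounds
hi = np.array([500.0, 100.0, 500.0])
m_hat = constrained_inversion(G, d, m_prior, lam=0.1, lo=lo, hi=hi)
```

With a larger `lam` the solution stays closer to the prior model; with a smaller one, the data dominate.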

  16. Negotiating new literacies in science: An examination of at-risk and average-achieving ninth-grade readers' online reading comprehension strategies

    NASA Astrophysics Data System (ADS)

    Sevensma, Kara

    In today's digital world, the Internet is becoming an increasingly predominant resource for science information, rapidly eclipsing the traditional science textbook in content area classrooms (Lawless & Schrader, 2008). The shift challenges researchers, educators, administrators, and policy makers to reconsider what it means to read and comprehend online science information. The research on digital literacy is still in its infancy, and little is known about the strategies and processes students use when reading science content on the Internet. Even less is known about how at-risk readers comprehend digital science content. Therefore, this study addresses three research questions: (1) What strategies and processes do at-risk and average-achieving readers use as they locate information and generate meaning from science websites? (2) What navigational profiles emerge as at-risk and average-achieving readers construct traversals (unique online paths of information) while locating information and generating meaning from science websites? (3) What individual characteristics influence students' strategies as they locate information and generate meaning from science websites? Participants were six ninth-grade students in general education biology classrooms. Three were average-achieving readers and three were at-risk readers, based on assessments of reading comprehension of traditional print-based texts. The students engaged in a three-day research project about the rainforest biome, locating information online, taking notes, and constructing an information brochure about the rainforest for peers. Data measures prior to and during the research included an Internet use survey, verbal protocols, screen captures of online activity, oral reading fluency assessments, and prior knowledge and topic engagement surveys. Quantitative descriptive and univariate analyses as well as qualitative abductive coding were employed over multiple phases to analyze the data. 
First, the results suggest that students employed a variety of online reading comprehension strategies in complex and dynamic ways. Among the many strategies revealed, the group of self-regulatory strategies (planning, predicting, monitoring, and evaluating) played a significant role, influencing students' use of all other strategies for locating and generating meaning from science websites. Second, the results also suggested that patterns of strategy use could be examined as unique navigational profiles. Rather than remaining fixed, each student's navigational profile shifted in response to tasks and research methods. Importantly, all at-risk readers revealed more effective navigational profiles on Day 3, when the design of the task forced them to attend to project goals and employ more self-regulatory strategies. Third, the results revealed that traditional reading comprehension strategies and prior knowledge of the rainforest also influenced online reading comprehension. Specifically, the at-risk readers with the lowest reading comprehension, oral reading fluency, and prior knowledge scores were more likely than the average-achieving readers to encounter issues in online texts that resulted in constructing ineffective traversals, or online reading paths, and in investing significant time in online reading that was irrelevant to the research project. Ultimately, this study advanced understanding of online reading comprehension for average-achieving and at-risk readers in science classrooms, addressing a gap in the research, suggesting implications for practice, and raising questions for future research.

  17. Information-seeking behavior changes in community-based teaching practices.

    PubMed

    Byrnes, Jennifer A; Kulick, Tracy A; Schwartz, Diane G

    2004-07-01

    A National Library of Medicine information access grant allowed for a collaborative project to provide computer resources in fourteen clinical practice sites that enabled health care professionals to access medical information via PubMed and the Internet. Health care professionals were taught how to access quality, cost-effective information that was user friendly and would result in improved patient care. Selected sites were located in medically underserved areas and received a computer, a printer, and, during year one, a fax machine. Participants were provided dial-up Internet service or were connected to the affiliated hospital's network. Clinicians were trained in how to search PubMed as a tool for practicing evidence-based medicine and to support clinical decision making. Health care providers were also taught how to find patient-education materials and continuing education programs and how to network with other professionals. Prior to the training, participants completed a questionnaire to assess their computer skills and familiarity with searching the Internet, MEDLINE, and other health-related databases. Responses indicated favorable changes in information-seeking behavior, including an increased frequency in conducting MEDLINE searches and Internet searches for work-related information.

  18. Understanding pictorial information in biology: students' cognitive activities and visual reading strategies

    NASA Astrophysics Data System (ADS)

    Brandstetter, Miriam; Sandmann, Angela; Florian, Christine

    2017-06-01

    In the classroom, scientific content is increasingly communicated through visual forms of representation. Students' learning outcomes rely on their ability to read and understand pictorial information. Understanding pictorial information in biology requires cognitive effort and can be challenging for students. Yet evidence-based knowledge about students' visual reading strategies during the process of understanding pictorial information is still lacking. Therefore, 42 students aged 14-15 were asked to think aloud while trying to understand visual representations of the blood circulatory system and the patellar reflex. A category system was developed differentiating 16 categories of cognitive activities. A Principal Component Analysis revealed two underlying patterns of activities that can be interpreted as visual reading strategies: 1. inferences predominated by use of a problem-solving schema; 2. inferences predominated by recall of prior content knowledge. Each pattern consists of a specific set of cognitive activities that reflect selection, organisation and integration of pictorial information, as well as different levels of expertise. The results give detailed insights into the cognitive activities of students who were required to understand the pictorial information of complex organ systems. They provide an evidence-based foundation for deriving instructional aids that can promote students' pictorial-information-based learning at different levels of expertise.

  19. 30 CFR 100.6 - Procedures for review of citations and orders; procedures for assessment of civil penalties and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ..., the parties may submit any additional relevant information relating to the violation, either prior to... to submit additional information or request a safety and health conference with the District Manager... parties to discuss the issues involved prior to the conference. (d) MSHA will consider all relevant...

  20. 6 CFR 5.46 - Procedure when response to demand is required prior to receiving instructions.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false Procedure when response to demand is required prior to receiving instructions. 5.46 Section 5.46 Domestic Security DEPARTMENT OF HOMELAND SECURITY, OFFICE OF THE SECRETARY DISCLOSURE OF RECORDS AND INFORMATION Disclosure of Information in Litigation § 5...

  1. Is Bayesian Estimation Proper for Estimating the Individual's Ability? Research Report 80-3.

    ERIC Educational Resources Information Center

    Samejima, Fumiko

    The effect of prior information in Bayesian estimation is considered, mainly from the standpoint of objective testing. In the estimation of a parameter belonging to an individual, the prior information is, in most cases, the density function of the population to which the individual belongs. Bayesian estimation was compared with maximum likelihood…
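The kind of Bayesian estimate discussed here, with the population density serving as the prior for an individual's parameter, can be sketched with a Normal-Normal model. This is an illustrative stand-in, not Samejima's item response theory setup:

```python
# Hedged sketch: with a Normal population prior theta ~ N(mu, tau2) and a
# Normal likelihood for the observed score x ~ N(theta, sigma2), the Bayes
# (posterior-mean) estimate shrinks the maximum-likelihood estimate x
# toward the population mean mu.

def posterior_mean(x, sigma2, mu, tau2):
    w = tau2 / (tau2 + sigma2)     # weight on the individual's own data
    return w * x + (1 - w) * mu

# an examinee scoring 2 SDs above a N(0, 1) population, noisy measurement
mle = 2.0
bayes = posterior_mean(mle, sigma2=1.0, mu=0.0, tau2=1.0)
print(bayes)   # 1.0 -- pulled halfway toward the population mean
```

As the measurement error `sigma2` shrinks to zero, the Bayes estimate converges to the maximum likelihood estimate, which is the sense in which prior information matters most when the data are weak.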

  2. Small-Sample Equating with Prior Information. Research Report. ETS RR-09-25

    ERIC Educational Resources Information Center

    Livingston, Samuel A.; Lewis, Charles

    2009-01-01

    This report proposes an empirical Bayes approach to the problem of equating scores on test forms taken by very small numbers of test takers. The equated score is estimated separately at each score point, making it unnecessary to model either the score distribution or the equating transformation. Prior information comes from equatings of other…

  3. Sun Exposure and Melanoma Survival: A GEM Study

    PubMed Central

    Berwick, Marianne; Reiner, Anne S.; Paine, Susan; Armstrong, Bruce K.; Kricker, Anne; Goumas, Chris; Cust, Anne E.; Thomas, Nancy E.; Groben, Pamela A.; From, Lynn; Busam, Klaus; Orlow, Irene; Marrett, Loraine D.; Gallagher, Richard P.; Gruber, Stephen B.; Anton-Culver, Hoda; Rosso, Stefano; Zanetti, Roberto; Kanetsky, Peter A.; Dwyer, Terry; Venn, Alison; Lee-Taylor, Julia; Begg, Colin B.

    2014-01-01

    Background We previously reported a significant association between higher ultraviolet radiation exposure before diagnosis and greater survival with melanoma in a population-based study in Connecticut. We sought to evaluate the hypothesis that sun exposure prior to diagnosis was associated with greater survival in a larger, international population-based study with more detailed exposure information. Methods We conducted a multi-center, international population-based study in four countries – Australia, Italy, Canada and the United States – with 3,578 cases of melanoma with an average of 7.4 years of follow-up. Measures of sun exposure included sunburn, intermittent exposure, hours of holiday sun exposure, hours of water-related outdoor activities, ambient UVB dose, histological solar elastosis and season of diagnosis. Results Results were not strongly supportive of the earlier hypothesis. Having had any sunburn in one year within 10 years of diagnosis was inversely associated with survival; solar elastosis – a measure of lifetime cumulative exposure – was not. Additionally, none of the intermittent exposure measures – water-related activities and sunny holidays – were associated with melanoma-specific survival. Estimated ambient UVB dose was not associated with survival. Conclusion Although there was an apparent protective effect of sunburns within 10 years of diagnosis, there was only weak evidence in this large, international, population-based study of melanoma that sun exposure prior to diagnosis is associated with greater melanoma-specific survival. Impact This study adds to the evidence that sun exposure prior to melanoma diagnosis has little effect on survival with melanoma. PMID:25069694

  4. Women who are well informed about prenatal genetic screening delay emotional attachment to their fetus.

    PubMed

    Rowe, Heather; Fisher, Jane; Quinlivan, Julie

    2009-03-01

    Prenatal maternal serum screening allows assessment of risk of chromosomal abnormalities in the fetus and is increasingly being offered to all women regardless of age or prior risk. However, ensuring informed choice to participate in screening is difficult, and the psychological implications of making an informed decision are uncertain. The aim of this study was to compare the growth of maternal-fetal emotional attachment in groups of women whose decisions about participation in screening were informed or not informed. A prospective longitudinal design was used. English-speaking women were recruited in antenatal clinics prior to the offer of second trimester maternal screening. Three self-report questionnaires completed over the course of pregnancy used validated measures of informed choice and maternal-fetal emotional attachment. Attachment scores throughout pregnancy in informed and not-informed groups were compared in repeated measures analysis. In total, 134 women completed the first assessment (recruitment 73%) and 68 (58%) provided complete data. The informed group had significantly lower attachment scores (p = 0.023) than the not-informed group prior to testing, but scores were similar (p = 0.482) after test results were known. The findings raise questions about the impact of delayed maternal-fetal attachment and appropriate interventions to facilitate informed choice to participate in screening.

  5. BaTMAn: Bayesian Technique for Multi-image Analysis

    NASA Astrophysics Data System (ADS)

    Casado, J.; Ascasibar, Y.; García-Benito, R.; Guidi, G.; Choudhury, O. S.; Bellocchi, E.; Sánchez, S. F.; Díaz, A. I.

    2016-12-01

    Bayesian Technique for Multi-image Analysis (BaTMAn) characterizes any astronomical dataset containing spatial information and performs a tessellation based on the measurements and errors provided as input. The algorithm iteratively merges spatial elements as long as they are statistically consistent with carrying the same information (i.e. identical signal within the errors). The output segmentations successfully adapt to the underlying spatial structure, regardless of its morphology and/or the statistical properties of the noise. BaTMAn identifies (and keeps) all the statistically-significant information contained in the input multi-image (e.g. an IFS datacube). The main aim of the algorithm is to characterize spatially-resolved data prior to their analysis.
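The merging rule described here, combining spatial elements while they remain statistically consistent within the errors, can be sketched in one dimension. This is an illustrative approximation, not BaTMAn's actual algorithm: the consistency test |x_i - x_j| < k * sqrt(s_i^2 + s_j^2) and the greedy left-to-right sweep are assumptions made for the sketch.

```python
import numpy as np

# Illustrative 1-D sketch of iterative merging: neighbouring segments are
# fused while their (inverse-variance weighted) signals agree within errors.

def merge_segments(x, s, k=3.0):
    # each segment: (weighted mean, std of that mean, number of members)
    segs = [(xi, si, 1) for xi, si in zip(x, s)]
    merged = True
    while merged:                      # repeat passes until nothing merges
        merged = False
        out = [segs[0]]
        for m, sm, n in segs[1:]:
            m0, s0, n0 = out[-1]
            if abs(m - m0) < k * np.hypot(sm, s0):   # consistent: merge
                w0, w = 1.0 / s0**2, 1.0 / sm**2
                mean = (w0 * m0 + w * m) / (w0 + w)
                out[-1] = (mean, 1.0 / np.sqrt(w0 + w), n0 + n)
                merged = True
            else:
                out.append((m, sm, n))
        segs = out
    return segs

rng = np.random.default_rng(0)
# piecewise-constant "image": 20 pixels at level 0, 20 at level 5, sigma = 0.5
truth = np.r_[np.zeros(20), 5.0 * np.ones(20)]
x = truth + 0.5 * rng.standard_normal(40)
segs = merge_segments(x, np.full(40, 0.5))
```

The segmentation collapses the 40 noisy pixels to a handful of segments that track the underlying two-level structure, which is the behaviour the abstract describes for arbitrary morphologies.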

  6. Comparison of different strategies for using fossil calibrations to generate the time prior in Bayesian molecular clock dating.

    PubMed

    Barba-Montoya, Jose; Dos Reis, Mario; Yang, Ziheng

    2017-09-01

    Fossil calibrations are the utmost source of information for resolving the distances between molecular sequences into estimates of absolute times and absolute rates in molecular clock dating analysis. The quality of calibrations is thus expected to have a major impact on divergence time estimates even if a huge amount of molecular data is available. In Bayesian molecular clock dating, fossil calibration information is incorporated in the analysis through the prior on divergence times (the time prior). Here, we evaluate three strategies for converting fossil calibrations (in the form of minimum- and maximum-age bounds) into the prior on times, which differ according to whether they borrow information from the maximum age of ancestral nodes and minimum age of descendent nodes to form constraints for any given node on the phylogeny. We study a simple example that is analytically tractable, and analyze two real datasets (one of 10 primate species and another of 48 seed plant species) using three Bayesian dating programs: MCMCTree, MrBayes and BEAST2. We examine how different calibration strategies, the birth-death process, and automatic truncation (to enforce the constraint that ancestral nodes are older than descendent nodes) interact to determine the time prior. In general, truncation has a great impact on calibrations so that the effective priors on the calibration node ages after the truncation can be very different from the user-specified calibration densities. The different strategies for generating the effective prior also had considerable impact, leading to very different marginal effective priors. Arbitrary parameters used to implement minimum-bound calibrations were found to have a strong impact upon the prior and posterior of the divergence times. Our results highlight the importance of inspecting the joint time prior used by the dating program before any Bayesian dating analysis. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
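The truncation effect described here, where the effective prior can differ markedly from the user-specified calibration densities, is easy to reproduce in a toy simulation. The bounds below are invented for illustration, and the rejection step is a stand-in for the dating programs' automatic truncation, not code from MCMCTree, MrBayes or BEAST2:

```python
import numpy as np

# Two uniform calibrations, then enforce "ancestor older than descendant"
# by rejection; the surviving (effective) marginals are shifted relative
# to the specified Uniform densities.

rng = np.random.default_rng(1)
n = 200_000
t_anc = rng.uniform(50, 100, n)    # ancestral node: bounds 50-100 Ma (invented)
t_des = rng.uniform(40, 90, n)     # descendant node: bounds 40-90 Ma (invented)
keep = t_anc > t_des               # automatic truncation
eff_anc, eff_des = t_anc[keep], t_des[keep]

# specified means are 75 and 65 Ma; the effective priors are pushed apart
print(t_anc.mean(), eff_anc.mean())   # ancestral ages pushed older
print(t_des.mean(), eff_des.mean())   # descendant ages pushed younger
```

Even in this two-node toy, the effective marginal means move by several million years, which is why the abstract recommends inspecting the joint time prior before any analysis.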

  7. Direct-to-Consumer Genetic Testing: User Motivations, Decision Making, and Perceived Utility of Results.

    PubMed

    Roberts, J Scott; Gornick, Michele C; Carere, Deanna Alexis; Uhlmann, Wendy R; Ruffin, Mack T; Green, Robert C

    2017-01-01

    To describe the interests, decision making, and responses of consumers of direct-to-consumer personal genomic testing (DTC-PGT) services. Prior to 2013 regulatory restrictions on DTC-PGT services, 1,648 consumers from 2 leading companies completed Web surveys before and after receiving test results. Prior to testing, DTC-PGT consumers were as interested in ancestry (74% very interested) and trait information (72%) as they were in disease risks (72%). Among disease risks, heart disease (68% very interested), breast cancer (67%), and Alzheimer disease (66%) were of greatest interest prior to testing. Interest in disease risks was associated with female gender and poorer self-reported health (p < 0.01). Many consumers (38%) did not consider the possibility of unwanted information before purchasing services; this group was more likely to be older, male, and less educated (p < 0.05). After receiving results, 59% of respondents said test information would influence management of their health; 2% reported regret about seeking testing and 1% reported harm from results. DTC-PGT has attracted controversy because of the health-related information it provides, but nonmedical information is of equal or greater interest to consumers. Although many consumers did not fully consider potential risks prior to testing, DTC-PGT was generally perceived as useful in informing future health decisions. © 2017 S. Karger AG, Basel.

  8. Boosting probabilistic graphical model inference by incorporating prior knowledge from multiple sources.

    PubMed

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
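The Noisy-OR combination named above has a one-line closed form, P(edge) = 1 - prod_k (1 - p_k). The sketch below is illustrative only, with made-up source confidences rather than the paper's data:

```python
# Sketch of a Noisy-OR consensus prior: each knowledge source k reports a
# confidence p_k that an interaction exists, and the combined prior picks
# up the strongest support among the sources.

def noisy_or(confidences):
    p = 1.0
    for pk in confidences:
        p *= (1.0 - pk)     # probability that every source "fails"
    return 1.0 - p

# three hypothetical sources: pathway DB, GO co-annotation, shared domains
print(noisy_or([0.6, 0.3, 0.1]))   # one fairly strong source dominates, ~0.748
print(noisy_or([0.0, 0.0, 0.0]))   # no support from any source: 0.0
```

A single confident source is enough to raise the consensus prior substantially, whereas the Latent Factor Model described alongside it instead assumes the sources are correlated reflections of one hidden truth.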

  9. Learners' strategies for reconstructing cognitive frameworks and navigating conceptual change from prior conception to consensual genetics knowledge

    NASA Astrophysics Data System (ADS)

    Parrott, Annette M.

    Problem. Science teachers are charged with preparing students to become scientifically literate individuals. Teachers are given curriculum that specifies the knowledge that students should come away with; however, they are not necessarily aware of the knowledge with which the student arrives or how best to help them navigate between the two knowledge states. Educators must be aware not only of where their students are conceptually, but also of how their students move from their prior knowledge and naive theories to scientifically acceptable theories. The understanding of how students navigate this course has the potential to revolutionize educational practices. Methods. This study explored how five 9th grade biology students reconstructed their cognitive frameworks and navigated conceptual change from prior conception to consensual genetics knowledge. The research questions investigated were: (1) how do students in the process of changing their naive science theories to accepted science theories describe their journey from prior knowledge to current conception, and (2) what are the methods that students utilize to bridge the gap between alternate and consensual science conceptions to effect conceptual change. Qualitative and quantitative methods were employed to gather and analyze the data. In-depth, semi-structured interviews formed the primary data for probing the context and details of students' conceptual change experience. Primary interview data were coded by thematic analysis. Results and discussion. This study revealed information about students' perceived roles in learning, the role of articulation in the conceptual change process, and ways in which a community of learners aids conceptual change. It was ascertained that students see their role in learning primarily as repeating information until they could add that information to their knowledge. 
Students are more likely to consider challenges to their conceptual frameworks and be more motivated to become active participants in constructing their knowledge when they are working collaboratively with peers instead of receiving instruction from their teacher. Articulation was found to be instrumental in aiding learners in identifying their alternate conceptions as well as in revisiting, investigating and reconstructing their conceptual frameworks. Based on the assumptions generated, suggestions were offered to inform pedagogical practice in support of the conceptual change process.

  10. In silico model-based inference: a contemporary approach for hypothesis testing in network biology

    PubMed Central

    Klinke, David J.

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. PMID:25139179

  11. In silico model-based inference: a contemporary approach for hypothesis testing in network biology.

    PubMed

    Klinke, David J

    2014-01-01

    Inductive inference plays a central role in the study of biological systems where one aims to increase their understanding of the system by reasoning backwards from uncertain observations to identify causal relationships among components of the system. These causal relationships are postulated from prior knowledge as a hypothesis or simply a model. Experiments are designed to test the model. Inferential statistics are used to establish a level of confidence in how well our postulated model explains the acquired data. This iterative process, commonly referred to as the scientific method, either improves our confidence in a model or suggests that we revisit our prior knowledge to develop a new model. Advances in technology impact how we use prior knowledge and data to formulate models of biological networks and how we observe cellular behavior. However, the approach for model-based inference has remained largely unchanged since Fisher, Neyman and Pearson developed the ideas in the early 1900s that gave rise to what is now known as classical statistical hypothesis (model) testing. Here, I will summarize conventional methods for model-based inference and suggest a contemporary approach to aid in our quest to discover how cells dynamically interpret and transmit information for therapeutic aims that integrates ideas drawn from high performance computing, Bayesian statistics, and chemical kinetics. © 2014 American Institute of Chemical Engineers.

  12. Unscaled Bayes factors for multiple hypothesis testing in microarray experiments.

    PubMed

    Bertolino, Francesco; Cabras, Stefano; Castellanos, Maria Eugenia; Racugno, Walter

    2015-12-01

    Multiple hypothesis testing encompasses a collection of techniques usually based on p-values as a summary of the available evidence from many statistical tests. In hypothesis testing, under a Bayesian perspective, the evidence for a specified hypothesis against an alternative, conditionally on data, is given by the Bayes factor. In this study, we approach multiple hypothesis testing based on both Bayes factors and p-values, regarding multiple hypothesis testing as a multiple model selection problem. To obtain the Bayes factors we assume default priors that are typically improper. In this case, the Bayes factor is usually undetermined due to the ratio of prior pseudo-constants. We show that ignoring prior pseudo-constants leads to unscaled Bayes factors, which do not invalidate the inferential procedure in multiple hypothesis testing, because they are used within a comparative scheme. In fact, using partial information from the p-values, we are able to approximate the sampling null distribution of the unscaled Bayes factor and use it within Efron's multiple testing procedure. The simulation study suggests that under a normal sampling model, and even with small sample sizes, our approach provides false positive and false negative proportions lower than other common multiple hypothesis testing approaches based only on p-values. The proposed procedure is illustrated in two simulation studies, and the advantages of its use are shown in the analysis of two microarray experiments. © The Author(s) 2011.
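The key point, that unscaled Bayes factors stay usable within a comparative scheme because the unknown ratio of prior pseudo-constants is common to every test, can be checked directly. The sketch below is illustrative, with invented Bayes-factor values rather than the authors' procedure:

```python
import numpy as np

# With improper default priors the Bayes factor is known only up to a common
# positive constant c, so only "unscaled" values c * BF are observable. Any
# comparative rule (ranking, flagging the top fraction of the empirical
# distribution) is invariant to c.

rng = np.random.default_rng(2)
bf = rng.lognormal(mean=0.0, sigma=1.0, size=1000)   # true (unobservable) BFs
c = 7.3                                              # unknown constant ratio
unscaled = c * bf

# flagging the 50 most extreme hypotheses selects exactly the same set
top_true = set(np.argsort(bf)[-50:])
top_unscaled = set(np.argsort(unscaled)[-50:])
print(top_true == top_unscaled)   # True
```

Anything that depends on the absolute magnitude of the Bayes factor (for example, a fixed cutoff such as BF > 10) would not survive the unknown constant, which is why the procedure is framed comparatively.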

  13. Prediction of penicillin resistance in Staphylococcus aureus isolates from dairy cows with mastitis, based on prior test results.

    PubMed

    Grinberg, A; Lopez-Villalobos, N; Lawrence, K; Nulsen, M

    2005-10-01

    To gauge how well prior laboratory test results predict in vitro penicillin resistance of Staphylococcus aureus isolates from dairy cows with mastitis. Population-based data on the farm of origin (n=79), genotype based on pulsed-field gel electrophoresis (PFGE) results, and the penicillin-resistance status of Staph. aureus isolates (n=115) from milk samples collected from dairy cows with mastitis submitted to two diagnostic laboratories over a 6-month period were used. Data were mined stochastically using the all-possible-pairs method, binomial modelling and bootstrap simulation, to test whether prior test results enhance the accuracy of prediction of penicillin resistance on farms. Of all Staph. aureus isolates tested, 38% were penicillin resistant. A significant aggregation of penicillin-resistance status was evident within farms. The probability of random pairs of isolates from the same farm having the same penicillin-resistance status was 76%, compared with 53% for random pairings of samples across all farms. Thus, the resistance status of randomly selected isolates was 1.43 times more likely to correctly predict the status of other isolates from the same farm than the random population pairwise concordance probability (p=0.011). This effect was likely due to the clonal relationship of isolates within farms, as the predictive fraction attributable to prior test results was close to nil when the effect of within-farm clonal infections was withdrawn from the model. Knowledge of the penicillin-resistance status of a prior Staph. aureus isolate significantly enhanced the predictive capability of other isolates from the same farm. In the time and space frame of this study, clinicians using previous information from a farm would have more accurately predicted the penicillin-resistance status of an isolate than they would by chance alone on farms infected with clonal Staph. aureus isolates, but not on farms infected with highly genetically heterogeneous bacterial strains.
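The all-possible-pairs concordance statistic reported here (76% within farms vs. 53% overall) can be sketched on made-up data; the farms and resistance labels below are hypothetical, not the study's:

```python
from itertools import combinations

# Each isolate is (farm, penicillin resistant?) -- invented toy data.
isolates = [("A", True), ("A", True), ("A", True),
            ("B", False), ("B", False),
            ("C", True), ("C", False), ("C", False)]

def concordance(pairs):
    """Fraction of pairs sharing the same resistance status."""
    pairs = list(pairs)
    same = sum(r1 == r2 for (_, r1), (_, r2) in pairs)
    return same / len(pairs)

within_farm = [(a, b) for a, b in combinations(isolates, 2) if a[0] == b[0]]
p_all = concordance(combinations(isolates, 2))
p_within = concordance(within_farm)
print(p_within, p_all)   # within-farm concordance exceeds the overall rate
```

The ratio `p_within / p_all` plays the role of the 1.43-fold predictive gain reported in the abstract: clonal infections make within-farm pairs more alike than random population pairings.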

  14. Effects of Prior Knowledge on Memory: Implications for Education

    ERIC Educational Resources Information Center

    Shing, Yee Lee; Brod, Garvin

    2016-01-01

    The encoding, consolidation, and retrieval of events and facts form the basis for acquiring new skills and knowledge. Prior knowledge can enhance those memory processes considerably and thus foster knowledge acquisition. But prior knowledge can also hinder knowledge acquisition, in particular when the to-be-learned information is inconsistent with…

  15. Menarche: Prior Knowledge and Experience.

    ERIC Educational Resources Information Center

    Skandhan, K. P.; And Others

    1988-01-01

    Recorded menstruation information among 305 young women in India, assessing the differences between those who did and did not have knowledge of menstruation prior to menarche. Those with prior knowledge considered menarche to be a normal physiological function and had a higher rate of regularity, lower rate of dysmenorrhea, and earlier onset of…

  16. A comparison between Bayes discriminant analysis and logistic regression for prediction of debris flow in southwest Sichuan, China

    NASA Astrophysics Data System (ADS)

    Xu, Wenbo; Jing, Shaocai; Yu, Wenjuan; Wang, Zhaoxian; Zhang, Guoping; Huang, Jianxi

    2013-11-01

    In this study, the high-risk debris-flow areas of Sichuan Province, Panzhihua and the Liangshan Yi Autonomous Prefecture, were taken as the study areas. Using rainfall and environmental factors as predictors, and under different combinations of prior probabilities of debris-flow occurrence, the predictions of two statistical methods were compared: logistic regression (LR) and Bayes discriminant analysis (BDA). The comprehensive analysis shows that (a) with mid-range prior probabilities, the overall predictive accuracy of BDA is higher than that of LR; (b) with equal or extreme prior probabilities, the overall predictive accuracy of LR is higher than that of BDA; and (c) regional debris-flow prediction models built on rainfall factors alone perform worse than those that also incorporate environmental factors, and adding this information shifts the prediction accuracies for occurrence and non-occurrence of debris flows in opposite directions.
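The role the prior probability plays in Bayes discriminant analysis, which drives the comparison above, can be sketched with a one-dimensional Gaussian discriminant. The class means, variance, and rainfall values below are invented for illustration and are not the study's model:

```python
import numpy as np

# Two-class Bayes discriminant with shared variance (1-D for clarity): the
# class prior pi_k enters the discriminant score as log(pi_k), so shifting
# the assumed prior probability of "debris flow" moves the decision boundary.

def bda_predict(x, mu0, mu1, var, pi1):
    def score(mu, pi):
        return -0.5 * (x - mu) ** 2 / var + np.log(pi)
    return (score(mu1, pi1) > score(mu0, 1.0 - pi1)).astype(int)

x = np.array([0.9, 1.1, 1.5, 2.0])     # hypothetical standardized rain index
mu_no, mu_yes, var = 0.0, 2.0, 1.0     # invented class-conditional parameters
print(bda_predict(x, mu_no, mu_yes, var, pi1=0.5))   # equal priors: [0 1 1 1]
print(bda_predict(x, mu_no, mu_yes, var, pi1=0.1))   # rare-event prior: [0 0 0 0]
```

With equal priors the boundary sits midway between the class means; treating debris flows as rare pushes the boundary outward, so the same observations are all classified as non-occurrence. This sensitivity to the assumed prior is exactly what points (a) and (b) above probe.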

  17. Unmet Mental Health Treatment Need and Attitudes Toward Online Mental Health Services Among Community College Students.

    PubMed

    Dunbar, Michael S; Sontag-Padilla, Lisa; Kase, Courtney A; Seelam, Rachana; Stein, Bradley D

    2018-05-01

    A survey assessed use of and attitudes toward online mental health services among community college students to inform how such services may contribute to reducing unmet treatment need. A total of 6,034 students completed a Web-based survey on mental health and use of and attitudes toward mental health services. Logistic regression assessed the relationship between prior mental health treatment and attitudes among students with current serious psychological distress. Among students with psychological distress (N=1,557), 28% reported prior in-person service use and 3% reported prior use of online mental health services; most (60%) reported willingness to use online services. Students with no prior in-person treatment were less likely than those with a history of in-person treatment to endorse preferences for in-person services (adjusted odds ratio=.54). Students reported being open to using online mental health services, but utilization was low. Targeted outreach efforts may be required if these services are to reduce unmet treatment need.

  18. Raptor ecology of Raft River Valley, Idaho

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Thurow, T.L.; White, C.M.; Howard, R.P.

    1980-09-01

    Raptor data were gathered in the 988-km² Raft River Valley in south-central Idaho while conducting a tolerance study on the nesting Ferruginous Hawk (Buteo regalis) near the Department of Energy's Raft River Geothermal Site. Prior research from 1972 to 1977 on the nesting activity of the Ferruginous Hawk population provided a historical information base. These data are combined with new Ferruginous Hawk data collected between 1978 and 1980 to give a continuous 9-year breeding survey. Information on the distribution, density, and production of the other raptor species found in the study area during 1978 and 1979 is also provided.

  19. Unified approach for extrapolation and bridging of adult information in early-phase dose-finding paediatric studies.

    PubMed

    Petit, Caroline; Samson, Adeline; Morita, Satoshi; Ursino, Moreno; Guedj, Jérémie; Jullien, Vincent; Comets, Emmanuelle; Zohar, Sarah

    2018-06-01

    The number of trials conducted and the number of patients per trial are typically small in paediatric clinical studies. This is due to ethical constraints and the complexity of the medical process for treating children. While incorporating prior knowledge from adults may be extremely valuable, this must be done carefully. In this paper, we propose a unified method for designing and analysing dose-finding trials in paediatrics, while bridging information from adults. The dose range is calculated under three extrapolation options, linear scaling, allometry, and maturation adjustment, using adult pharmacokinetic data. To do this, it is assumed that target exposures are the same in both populations. The working model and prior distribution parameters of the dose-toxicity and dose-efficacy relationships are obtained using early-phase adult toxicity and efficacy data at several dose levels. Priors are integrated into the dose-finding process through Bayesian model selection or adaptive priors. This calibrates the model to adjust for misspecification if the adult and paediatric data are very different. We performed a simulation study which indicates that incorporating prior adult information in this way may improve dose selection in children.
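
    The allometric and linear extrapolation options mentioned in this abstract can be sketched with the conventional weight-based scaling rule. This is a hedged illustration, assuming the standard 0.75 allometric exponent for clearance and exposure matching (AUC = dose / clearance); the doses, weights, and exponent are illustrative, not values from the paper.

    ```python
    # Illustrative sketch of weight-based dose extrapolation from adults to
    # children. The 0.75 exponent is the conventional allometric assumption
    # for clearance; none of these numbers come from the paper.
    def allometric_dose(adult_dose_mg: float,
                        adult_weight_kg: float,
                        child_weight_kg: float,
                        exponent: float = 0.75) -> float:
        """Scale a dose assuming clearance scales as weight ** exponent,
        so that target exposure (AUC = dose / clearance) is matched."""
        return adult_dose_mg * (child_weight_kg / adult_weight_kg) ** exponent

    # Linear extrapolation is the special case exponent = 1.0
    adult_dose, adult_w, child_w = 100.0, 70.0, 20.0
    print(allometric_dose(adult_dose, adult_w, child_w))                 # allometry
    print(allometric_dose(adult_dose, adult_w, child_w, exponent=1.0))   # linear
    ```

    For a 20-kg child and a 100-mg adult dose, allometric scaling gives a noticeably higher dose than linear scaling, which is why the choice of extrapolation option matters when constructing the paediatric dose range.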

  20. Correction of projective distortion in long-image-sequence mosaics without prior information

    NASA Astrophysics Data System (ADS)

    Yang, Chenhui; Mao, Hongwei; Abousleman, Glen; Si, Jennie

    2010-04-01

    Image mosaicking is the process of piecing together multiple video frames or still images from a moving camera to form a wide-area or panoramic view of the scene being imaged. Mosaics have widespread applications in many areas such as security surveillance, remote sensing, geographical exploration, agricultural field surveillance, virtual reality, digital video, and medical image analysis, among others. When mosaicking a large number of still images or video frames, the quality of the resulting mosaic is compromised by projective distortion. That is, during the mosaicking process, the image frames that are transformed and pasted to the mosaic become significantly scaled down and appear out of proportion with respect to the mosaic. As more frames continue to be transformed, important target information in the frames can be lost since the transformed frames become too small, which eventually leads to the inability to continue further. Some projective distortion correction techniques make use of prior information such as GPS information embedded within the image, or camera internal and external parameters. Alternatively, this paper proposes a new algorithm to reduce the projective distortion without using any prior information whatsoever. Based on the analysis of the projective distortion, we approximate the projective matrix that describes the transformation between image frames using an affine model. Using singular value decomposition, we can deduce the affine model scaling factor that is usually very close to 1. By resetting the image scale of the affine model to 1, the transformed image size remains unchanged. Even though the proposed correction introduces some error in the image matching, this error is typically acceptable and more importantly, the final mosaic preserves the original image size after transformation. We demonstrate the effectiveness of this new correction algorithm on two real-world unmanned air vehicle (UAV) sequences. The proposed method is shown to be effective and suitable for real-time implementation.
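
    The SVD-based scale-reset step described in this abstract can be sketched as follows. This is a hedged illustration, assuming the 2x2 linear part of the affine frame-to-mosaic transform is available; the example matrix and the geometric-mean scale estimate are assumptions for illustration, not the paper's exact formulation.

    ```python
    import numpy as np

    # Illustrative sketch: estimate the overall scale of an affine transform's
    # 2x2 linear part from its singular values, then divide it out so the
    # transformed frame keeps its original size. Matrix is illustrative only.
    def normalize_affine_scale(A2x2: np.ndarray) -> np.ndarray:
        """Reset the scale of an affine linear part to 1 using its SVD."""
        s = np.linalg.svd(A2x2, compute_uv=False)  # singular values s1 >= s2
        scale = np.sqrt(s[0] * s[1])               # geometric-mean scale factor
        return A2x2 / scale

    # A slightly shrinking rotation, as might accumulate over a long mosaic
    theta = np.deg2rad(5.0)
    A = 0.97 * np.array([[np.cos(theta), -np.sin(theta)],
                         [np.sin(theta),  np.cos(theta)]])
    A_fixed = normalize_affine_scale(A)
    # Geometric-mean scale of the corrected transform is ~1
    print(np.sqrt(np.prod(np.linalg.svd(A_fixed, compute_uv=False))))
    ```

    Because the per-frame scale factor is close to 1, dividing it out leaves rotation and shear intact while preventing the cumulative shrinkage that degrades long-sequence mosaics.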
