Sample records for prior studies methods

  1. Addressing potential prior-data conflict when using informative priors in proof-of-concept studies.

    PubMed

    Mutsvari, Timothy; Tytgat, Dominique; Walley, Rosalind

    2016-01-01

    Bayesian methods are increasingly used in proof-of-concept studies. An important benefit of these methods is the potential to use informative priors, thereby reducing sample size. This is particularly relevant for treatment arms where there is a substantial amount of historical information such as placebo and active comparators. One issue with using an informative prior is the possibility of a mismatch between the informative prior and the observed data, referred to as prior-data conflict. We focus on two methods for dealing with this: a testing approach and a mixture prior approach. The testing approach assesses prior-data conflict by comparing the observed data to the prior predictive distribution and resorting to a non-informative prior if prior-data conflict is declared. The mixture prior approach uses a prior with a precise and diffuse component. We assess these approaches for the normal case via simulation and show they have some attractive features as compared with the standard one-component informative prior. For example, when the discrepancy between the prior and the data is sufficiently marked, and intuitively, one feels less certain about the results, both the testing and mixture approaches typically yield wider posterior credible intervals than when there is no discrepancy. In contrast, when there is no discrepancy, the results of these approaches are typically similar to the standard approach. Whilst for any specific study the operating characteristics of any selected approach should be assessed and agreed at the design stage, we believe these two approaches are each worthy of consideration. Copyright © 2015 John Wiley & Sons, Ltd.
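As a sketch of the mixture-prior mechanics for a normal mean with known sampling variance (the numbers and prior settings below are illustrative, not taken from the paper), the posterior is itself a two-component mixture whose weights are the prior weights re-scaled by each component's prior predictive density of the observed mean:

```python
import numpy as np
from scipy import stats

def mixture_posterior(ybar, n, sigma, mu_inf, tau_inf, mu_dif, tau_dif, w_inf):
    """Posterior for theta given ybar ~ N(theta, sigma^2/n) under the
    two-component prior w_inf*N(mu_inf, tau_inf^2) + (1-w_inf)*N(mu_dif, tau_dif^2)."""
    se2 = sigma**2 / n
    weights, components = [], []
    for mu0, tau0, w0 in [(mu_inf, tau_inf, w_inf), (mu_dif, tau_dif, 1 - w_inf)]:
        # prior predictive density of ybar under this component
        ml = stats.norm.pdf(ybar, loc=mu0, scale=np.sqrt(tau0**2 + se2))
        # standard conjugate normal-normal update within the component
        post_var = 1.0 / (1.0 / tau0**2 + 1.0 / se2)
        post_mean = post_var * (mu0 / tau0**2 + ybar / se2)
        weights.append(w0 * ml)
        components.append((post_mean, np.sqrt(post_var)))
    weights = np.array(weights)
    return weights / weights.sum(), components
```

When the sample mean agrees with the informative component, essentially all posterior weight stays on it; a discrepant mean shifts the weight to the diffuse component, which is what widens the posterior credible interval under prior-data conflict.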

  2. Exposure Models for the Prior Distribution in Bayesian Decision Analysis for Occupational Hygiene Decision Making

    PubMed Central

    Lee, Eun Gyung; Kim, Seung Won; Feigley, Charles E.; Harper, Martin

    2015-01-01

    This study introduces two semi-quantitative methods, Structured Subjective Assessment (SSA) and Control of Substances Hazardous to Health (COSHH) Essentials, in conjunction with two-dimensional Monte Carlo simulations for determining prior probabilities. A prior distribution based on expert judgment was included for comparison. Practical applications of the proposed methods were demonstrated using personal exposure measurements of isoamyl acetate in an electronics manufacturing facility and of isopropanol in a printing shop. The applicability of these methods in real workplaces was discussed based on the advantages and disadvantages of each method. Although these methods could not be completely independent of expert judgment, this study demonstrated a methodological improvement in the estimation of the prior distribution for the Bayesian decision analysis tool. The proposed methods provide a logical basis for the decision process by considering determinants of worker exposure. PMID:23252451

  3. Tree Biomass Estimation of Chinese fir (Cunninghamia lanceolata) Based on Bayesian Method

    PubMed Central

    Zhang, Jianguo

    2013-01-01

    Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) is the most important conifer species for timber production, with a huge distribution area in southern China. Accurate estimation of biomass is required for accounting and monitoring Chinese forest carbon stocking. In this study, an allometric equation was used to analyze tree biomass of Chinese fir. The common methods for estimating the allometric model have taken the classical approach based on the frequency interpretation of probability. However, many different biotic and abiotic factors introduce variability into the Chinese fir biomass model, suggesting that the model parameters are better represented by probability distributions than by the fixed values of the classical method. To deal with this problem, the Bayesian method was used to estimate the Chinese fir biomass model. In the Bayesian framework, two kinds of priors were introduced: non-informative priors and informative priors. For the informative priors, 32 biomass equations of Chinese fir were collected from the published literature. The parameter distributions from the published literature were used as prior distributions in the Bayesian model for estimating Chinese fir biomass. The Bayesian method with informative priors performed better than both the non-informative priors and the classical method, providing a reasonable approach for estimating Chinese fir biomass. PMID:24278198

  4. Tree biomass estimation of Chinese fir (Cunninghamia lanceolata) based on Bayesian method.

    PubMed

    Zhang, Xiongqing; Duan, Aiguo; Zhang, Jianguo

    2013-01-01

    Chinese fir (Cunninghamia lanceolata (Lamb.) Hook.) is the most important conifer species for timber production, with a huge distribution area in southern China. Accurate estimation of biomass is required for accounting and monitoring Chinese forest carbon stocking. In this study, the allometric equation W = a(D²H)^b was used to analyze tree biomass of Chinese fir. The common methods for estimating the allometric model have taken the classical approach based on the frequency interpretation of probability. However, many different biotic and abiotic factors introduce variability into the Chinese fir biomass model, suggesting that the model parameters are better represented by probability distributions than by the fixed values of the classical method. To deal with this problem, the Bayesian method was used to estimate the Chinese fir biomass model. In the Bayesian framework, two kinds of priors were introduced: non-informative priors and informative priors. For the informative priors, 32 biomass equations of Chinese fir were collected from the published literature. The parameter distributions from the published literature were used as prior distributions in the Bayesian model for estimating Chinese fir biomass. The Bayesian method with informative priors performed better than both the non-informative priors and the classical method, providing a reasonable approach for estimating Chinese fir biomass.
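A minimal sketch of this kind of fit, on the log scale where the allometric model becomes linear, ln W = ln a + b ln(D²H). The simulated data and prior values are purely illustrative, and the residual variance is treated as known so the informative-prior update stays in closed form:

```python
import numpy as np

rng = np.random.default_rng(0)

# hypothetical data: diameter D (cm), height H (m), biomass W (kg)
D = rng.uniform(5, 40, 50)
H = 1.3 + 20 * (1 - np.exp(-0.06 * D))
W = 0.05 * (D**2 * H) ** 0.95 * np.exp(rng.normal(0, 0.1, 50))

# log-linear form: ln W = ln a + b ln(D^2 H) + eps, eps ~ N(0, s2)
X = np.column_stack([np.ones_like(D), np.log(D**2 * H)])
y = np.log(W)
s2 = 0.1**2  # residual variance, assumed known for this sketch

# informative prior on (ln a, b), e.g. pooled from published equations
prior_mean = np.array([np.log(0.05), 0.95])
prior_prec = np.diag([1 / 0.5**2, 1 / 0.05**2])

# conjugate Gaussian posterior: precision-weighted combination of prior and data
post_prec = X.T @ X / s2 + prior_prec
post_cov = np.linalg.inv(post_prec)
post_mean = post_cov @ (X.T @ y / s2 + prior_prec @ prior_mean)
```

The informative prior contributes precision to the posterior, so the credible band on the exponent b is narrower than under a vague (near-zero precision) prior, mirroring the comparison in the abstract.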

  5. Adaptive allocation for binary outcomes using decreasingly informative priors.

    PubMed

    Sabo, Roy T

    2014-01-01

    A method of outcome-adaptive allocation is presented using Bayes methods, where a natural lead-in is incorporated through the use of informative yet skeptical prior distributions for each treatment group. These prior distributions are modeled on unobserved data in such a way that their influence on the allocation scheme decreases as the trial progresses. Simulation studies show this method to behave comparably to the Bayesian adaptive allocation method described by Thall and Wathen (2007), who incorporate a natural lead-in through sample-size-based exponents.
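One way to sketch the idea of a decreasingly informative prior for binary outcomes (the details below are illustrative, not the paper's exact model): give each arm skeptical pseudo-observations whose weight decays linearly to zero as enrolment approaches the maximum, then allocate by the posterior probability that arm 1 beats arm 2:

```python
import numpy as np

rng = np.random.default_rng(1)

def prob_arm1_better(s1, n1, s2, n2, n_max, pseudo_n=10, p_skeptical=0.5):
    """P(p1 > p2) under Beta posteriors with a decreasingly informative prior."""
    w = max(0.0, 1.0 - (n1 + n2) / n_max)   # prior influence decays with accrual
    a0 = 1.0 + w * pseudo_n * p_skeptical        # skeptical pseudo-successes
    b0 = 1.0 + w * pseudo_n * (1.0 - p_skeptical)  # skeptical pseudo-failures
    p1 = rng.beta(a0 + s1, b0 + n1 - s1, 50_000)
    p2 = rng.beta(a0 + s2, b0 + n2 - s2, 50_000)
    return float((p1 > p2).mean())
```

Early on the skeptical prior dominates and the allocation probability stays near 1/2 (the natural lead-in); as data accrue, the pseudo-observations vanish and the observed responses drive allocation.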

  6. Investigating the impact of spatial priors on the performance of model-based IVUS elastography

    PubMed Central

    Richards, M S; Doyley, M M

    2012-01-01

    This paper describes methods that provide prerequisite information for computing circumferential stress in modulus elastograms recovered from vascular tissue—information that could help cardiologists detect life-threatening plaques and predict their propensity to rupture. The modulus recovery process is an ill-posed problem; therefore additional information is needed to provide useful elastograms. In this work, prior geometrical information was used to impose hard or soft constraints on the reconstruction process. We conducted simulation and phantom studies to evaluate and compare modulus elastograms computed with soft and hard constraints versus those computed without any prior information. The results revealed that (1) the contrast-to-noise ratio of modulus elastograms achieved using the soft prior and hard prior reconstruction methods exceeded those computed without any prior information; (2) the soft prior and hard prior reconstruction methods could tolerate up to 8% measurement noise; and (3) the performance of the soft and hard prior modulus elastograms degraded when incomplete spatial priors were employed. This work demonstrates that including spatial priors in the reconstruction process should improve the performance of model-based elastography, and the soft prior approach should enhance the robustness of the reconstruction process to errors in the geometrical information. PMID:22037648

  7. Estimating Tree Height-Diameter Models with the Bayesian Method

    PubMed Central

    Duan, Aiguo; Zhang, Jianguo; Xiang, Congwei

    2014-01-01

    Six candidate height-diameter models were used to analyze the height-diameter relationships. The common methods for estimating height-diameter models have taken the classical (frequentist) approach based on the frequency interpretation of probability, for example, the nonlinear least squares method (NLS) and the maximum likelihood method (ML). The Bayesian method has a distinct advantage over the classical method in that the parameters to be estimated are treated as random variables. In this study, the classical and Bayesian methods were each used to estimate the six height-diameter models. Both the classical and Bayesian methods showed that the Weibull model was the “best” model using data1. In addition, based on the Weibull model, data2 was used to compare the Bayesian method with informative priors against the Bayesian method with uninformative priors and the classical method. The results showed that the improved prediction accuracy of the Bayesian method led to narrower confidence bands for predicted values than those of the classical method, and the credible bands of parameters with informative priors were also narrower than those with uninformative priors and the classical method. The estimated posterior distributions of the parameters can then be set as new priors when estimating the parameters using data2. PMID:24711733

  8. Estimating tree height-diameter models with the Bayesian method.

    PubMed

    Zhang, Xiongqing; Duan, Aiguo; Zhang, Jianguo; Xiang, Congwei

    2014-01-01

    Six candidate height-diameter models were used to analyze the height-diameter relationships. The common methods for estimating height-diameter models have taken the classical (frequentist) approach based on the frequency interpretation of probability, for example, the nonlinear least squares method (NLS) and the maximum likelihood method (ML). The Bayesian method has a distinct advantage over the classical method in that the parameters to be estimated are treated as random variables. In this study, the classical and Bayesian methods were each used to estimate the six height-diameter models. Both the classical and Bayesian methods showed that the Weibull model was the "best" model using data1. In addition, based on the Weibull model, data2 was used to compare the Bayesian method with informative priors against the Bayesian method with uninformative priors and the classical method. The results showed that the improved prediction accuracy of the Bayesian method led to narrower confidence bands for predicted values than those of the classical method, and the credible bands of parameters with informative priors were also narrower than those with uninformative priors and the classical method. The estimated posterior distributions of the parameters can then be set as new priors when estimating the parameters using data2.

  9. Bayesian bivariate meta-analysis of diagnostic test studies with interpretable priors.

    PubMed

    Guo, Jingyi; Riebler, Andrea; Rue, Håvard

    2017-08-30

    In a bivariate meta-analysis, the number of diagnostic studies involved is often very low so that frequentist methods may result in problems. Using Bayesian inference is particularly attractive as informative priors that add a small amount of information can stabilise the analysis without overwhelming the data. However, Bayesian analysis is often computationally demanding and the selection of the prior for the covariance matrix of the bivariate structure is crucial with little data. The integrated nested Laplace approximations method provides an efficient solution to the computational issues by avoiding any sampling, but the important question of priors remains. We explore the penalised complexity (PC) prior framework for specifying informative priors for the variance parameters and the correlation parameter. PC priors facilitate model interpretation and hyperparameter specification as expert knowledge can be incorporated intuitively. We conduct a simulation study to compare the properties and behaviour of differently defined PC priors to currently used priors in the field. The simulation study shows that the PC prior seems beneficial for the variance parameters. The use of PC priors for the correlation parameter results in more precise estimates when specified in a sensible neighbourhood around the truth. To investigate the usage of PC priors in practice, we reanalyse a meta-analysis using the telomerase marker for the diagnosis of bladder cancer and compare the results with those obtained by other commonly used modelling approaches. Copyright © 2017 John Wiley & Sons, Ltd.
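For a standard deviation parameter, the PC prior reduces to an exponential distribution whose rate is set from an interpretable tail statement P(sigma > u) = alpha, which is what makes expert input easy to encode (the u and alpha values below are examples, not from the paper):

```python
import math

def pc_prior_rate(u, alpha):
    """Rate lambda of the exponential PC prior on a standard deviation,
    chosen so that P(sigma > u) = alpha."""
    return -math.log(alpha) / u

def pc_prior_density(sigma, lam):
    """Exponential density of the PC prior evaluated at sigma >= 0."""
    return lam * math.exp(-lam * sigma)

# e.g. encode "the standard deviation exceeds 1.0 with probability 1%"
lam = pc_prior_rate(1.0, 0.01)
```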

  10. Integrating informative priors from experimental research with Bayesian methods: an example from radiation epidemiology.

    PubMed

    Hamra, Ghassan; Richardson, David; Maclehose, Richard; Wing, Steve

    2013-01-01

    Informative priors can be a useful tool for epidemiologists to handle problems of sparse data in regression modeling. It is sometimes the case that an investigator is studying a population exposed to two agents, X and Y, where Y is the agent of primary interest. Previous research may suggest that the exposures have different effects on the health outcome of interest, one being more harmful than the other. Such information may be derived from epidemiologic analyses; however, in the case where such evidence is unavailable, knowledge can be drawn from toxicologic studies or other experimental research. Unfortunately, using toxicologic findings to develop informative priors in epidemiologic analyses requires strong assumptions, with no established method for its utilization. We present a method to help bridge the gap between animal and cellular studies and epidemiologic research by specification of an order-constrained prior. We illustrate this approach using an example from radiation epidemiology.

  11. Integrating Informative Priors from Experimental Research with Bayesian Methods

    PubMed Central

    Hamra, Ghassan; Richardson, David; MacLehose, Richard; Wing, Steve

    2013-01-01

    Informative priors can be a useful tool for epidemiologists to handle problems of sparse data in regression modeling. It is sometimes the case that an investigator is studying a population exposed to two agents, X and Y, where Y is the agent of primary interest. Previous research may suggest that the exposures have different effects on the health outcome of interest, one being more harmful than the other. Such information may be derived from epidemiologic analyses; however, in the case where such evidence is unavailable, knowledge can be drawn from toxicologic studies or other experimental research. Unfortunately, using toxicologic findings to develop informative priors in epidemiologic analyses requires strong assumptions, with no established method for its utilization. We present a method to help bridge the gap between animal and cellular studies and epidemiologic research by specification of an order-constrained prior. We illustrate this approach using an example from radiation epidemiology. PMID:23222512

  12. Incorporating Functional Genomic Information in Genetic Association Studies Using an Empirical Bayes Approach.

    PubMed

    Spencer, Amy V; Cox, Angela; Lin, Wei-Yu; Easton, Douglas F; Michailidou, Kyriaki; Walters, Kevin

    2016-04-01

    There is a large amount of functional genetic data available, which can be used to inform fine-mapping association studies (in diseases with well-characterised disease pathways). Single nucleotide polymorphism (SNP) prioritization via Bayes factors is attractive because prior information can inform the effect size or the prior probability of causal association. This approach requires the specification of a prior for the effect size. If the information needed to estimate a priori the probability density for the effect sizes of causal SNPs in a genomic region is not consistent or is not available, then specifying a prior variance for the effect sizes is challenging. We propose both an empirical method to estimate this prior variance and a coherent approach to using SNP-level functional data to inform the prior probability of causal association. Through simulation we show that, when ranking SNPs by our empirical Bayes factor in a fine-mapping study, the causal SNP rank is generally as high as or higher than the rank using Bayes factors with other plausible values of the prior variance. Importantly, we also show that assigning SNP-specific prior probabilities of association based on expert prior functional knowledge of the disease mechanism can lead to improved causal SNP ranks compared to ranking with identical prior probabilities of association. We demonstrate the use of our methods by applying them to the fine mapping of the CASP8 region of chromosome 2 using genotype data from the Collaborative Oncological Gene-Environment Study (COGS) Consortium. The data we analysed included approximately 46,000 breast cancer cases and 43,000 healthy controls. © 2016 The Authors. Genetic Epidemiology published by Wiley Periodicals, Inc.
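A hedged sketch of the prioritization step: given an effect estimate, its standard error, and a prior variance W for true effects under association, the standard asymptotic Bayes factor combines with a SNP-specific prior probability of association. The empirical estimation of W described in the abstract is not reproduced here; W and the prior probabilities are treated as given:

```python
import math

def approx_bayes_factor(beta, se, prior_var):
    """Asymptotic Bayes factor in favour of association: the estimate is
    beta ~ N(theta, se^2), with theta ~ N(0, prior_var) under H1 and theta = 0 under H0."""
    V = se**2
    z2 = (beta / se) ** 2
    return math.sqrt(V / (V + prior_var)) * math.exp(z2 * prior_var / (2 * (V + prior_var)))

def posterior_prob_assoc(beta, se, prior_var, prior_prob):
    """Combine the Bayes factor with a SNP-specific prior probability of association."""
    odds = approx_bayes_factor(beta, se, prior_var) * prior_prob / (1 - prior_prob)
    return odds / (1 + odds)
```

Ranking SNPs by this Bayes factor, or by the posterior probability with functionally informed prior probabilities, is the mechanism the abstract evaluates.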

  13. Bias in diet determination: incorporating traditional methods in Bayesian mixing models.

    PubMed

    Franco-Trecu, Valentina; Drago, Massimiliano; Riet-Sapriza, Federico G; Parnell, Andrew; Frau, Rosina; Inchausti, Pablo

    2013-01-01

    There are no "universal methods" to determine the diet composition of predators. Most traditional methods are biased because of their reliance on differential digestibility and the recovery of hard items. By relying on assimilated food, stable isotope and Bayesian mixing models (SIMMs) resolve many biases of traditional methods. SIMMs can incorporate prior information (i.e. proportional diet composition) that may improve the precision of the estimated dietary composition. However, few studies have assessed the performance of traditional methods and SIMMs with and without informative priors in studying predators' diets. Here we compare the diet compositions of the South American fur seal and sea lion obtained by scat analysis and by SIMMs-UP (uninformative priors), and assess whether informative priors (SIMMs-IP) from the scat analysis improved the estimated diet composition compared to SIMMs-UP. According to the SIMM-UP, pelagic species dominated the fur seal's diet, while the sea lion's diet showed no clear dominance of any prey. In contrast, the SIMM-IP diet compositions were dominated by the same prey as in the scat analyses. When prior information influenced SIMM estimates, incorporating informative priors improved the precision of the estimated diet composition at the risk of inducing biases in the estimates. If prey isotopic data allow discriminating prey contributions to diets, informative priors should lead to more precise but unbiased estimates of diet composition. Just as estimates of diet composition obtained from traditional methods are interpreted critically because of their biases, care must be exercised when interpreting diet compositions obtained by SIMMs-IP. The best approach to obtaining a near-complete view of a predator's diet composition should involve the simultaneous consideration of different sources of partial evidence (traditional methods, SIMM-UP and SIMM-IP) in the light of the natural history of the predator species, so as to reliably ascertain and weight the information yielded by each method.
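A toy version of the comparison, for a single isotope and two prey sources on a grid (all values hypothetical): the informative prior narrows the posterior on the pelagic proportion but pulls its mean toward the prior, which is exactly the precision-versus-bias trade-off described above:

```python
import numpy as np
from scipy import stats

# hypothetical delta13C values: consumer tissue and two prey sources
mix_obs = np.array([-16.2, -15.8, -16.5])
s_pelagic, s_benthic = -18.0, -14.0
sd = 0.8                                 # combined source/enrichment uncertainty

p_grid = np.linspace(0.001, 0.999, 999)  # proportion of pelagic prey

loglik = np.zeros_like(p_grid)
for m in mix_obs:
    mu = p_grid * s_pelagic + (1 - p_grid) * s_benthic
    loglik += stats.norm.logpdf(m, loc=mu, scale=sd)

def posterior_summary(a, b):
    """Grid posterior mean and sd of the pelagic proportion under a Beta(a, b) prior."""
    logpost = loglik + stats.beta.logpdf(p_grid, a, b)
    w = np.exp(logpost - logpost.max())
    w /= w.sum()
    mean = (w * p_grid).sum()
    sd_post = np.sqrt((w * (p_grid - mean) ** 2).sum())
    return mean, sd_post

mean_up, sd_up = posterior_summary(1.0, 1.0)   # uninformative prior
mean_ip, sd_ip = posterior_summary(8.0, 2.0)   # informative prior, e.g. from scats
```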

  14. Prospective regularization design in prior-image-based reconstruction

    NASA Astrophysics Data System (ADS)

    Dang, Hao; Siewerdsen, Jeffrey H.; Webster Stayman, J.

    2015-12-01

    Prior-image-based reconstruction (PIBR) methods leveraging patient-specific anatomical information from previous imaging studies and/or sequences have demonstrated dramatic improvements in dose utilization and image quality for low-fidelity data. However, a proper balance of information from the prior images and information from the measurements is required (e.g. through careful tuning of regularization parameters). Inappropriate selection of reconstruction parameters can lead to detrimental effects including false structures and failure to improve image quality. Traditional methods based on heuristics are subject to error and sub-optimal solutions, while exhaustive searches require a large number of computationally intensive image reconstructions. In this work, we propose a novel method that prospectively estimates the optimal amount of prior image information for accurate admission of specific anatomical changes in PIBR without performing full image reconstructions. This method leverages an analytical approximation to the implicitly defined PIBR estimator, and introduces a predictive performance metric leveraging this analytical form and knowledge of a particular presumed anatomical change whose accurate reconstruction is sought. Additionally, since model-based PIBR approaches tend to be space-variant, a spatially varying prior image strength map is proposed to optimally admit changes everywhere in the image (eliminating the need to know change locations a priori). Studies were conducted in both an ellipse phantom and a realistic thorax phantom emulating a lung nodule surveillance scenario. The proposed method demonstrated accurate estimation of the optimal prior image strength while achieving a substantial computational speedup (about a factor of 20) compared to traditional exhaustive search. 
Moreover, the use of the proposed prior strength map in PIBR demonstrated accurate reconstruction of anatomical changes without foreknowledge of change locations in phantoms where the optimal parameters vary spatially by an order of magnitude or more. In a series of studies designed to explore potential unknowns associated with accurate PIBR, optimal prior image strength was found to vary with attenuation differences associated with anatomical change but exhibited only small variations as a function of the shape and size of the change. The results suggest that, given a target change attenuation, prospective patient-, change-, and data-specific customization of the prior image strength can be performed to ensure reliable reconstruction of specific anatomical changes.

  15. Discovering mutated driver genes through a robust and sparse co-regularized matrix factorization framework with prior information from mRNA expression patterns and interaction network.

    PubMed

    Xi, Jianing; Wang, Minghui; Li, Ao

    2018-06-05

    Discovery of mutated driver genes is one of the primary objectives of studying tumorigenesis. To discover relatively infrequently mutated driver genes from somatic mutation data, many existing methods incorporate an interaction network as prior information. However, the prior information of mRNA expression patterns is not exploited by these existing network-based methods, even though it has also proven to be highly informative of cancer progression. To incorporate prior information from both the interaction network and mRNA expression, we propose a robust and sparse co-regularized nonnegative matrix factorization to discover driver genes from mutation data. Our framework also applies Frobenius norm regularization to overcome the overfitting issue. A sparsity-inducing penalty is employed to obtain sparse scores in the gene representations, of which the top-scored genes are selected as driver candidates. Evaluation experiments using known benchmark genes indicate that the performance of our method benefits from the two types of prior information. Our method also outperforms the existing network-based methods, and detects some driver genes that are not predicted by the competing methods. In summary, our proposed method can improve the performance of driver gene discovery by effectively incorporating prior information from the interaction network and mRNA expression patterns into a robust and sparse co-regularized matrix factorization framework.

  16. WE-FG-207B-05: Iterative Reconstruction Via Prior Image Constrained Total Generalized Variation for Spectral CT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Niu, S; Zhang, Y; Ma, J

    Purpose: To investigate iterative reconstruction via prior image constrained total generalized variation (PICTGV) for spectral computed tomography (CT), using fewer projections while achieving greater image quality. Methods: The proposed PICTGV method is formulated as an optimization problem that balances data fidelity and the prior image constrained total generalized variation of the reconstructed images in one framework. The PICTGV method exploits structural correlations among images in the energy domain and uses high-quality images to guide the reconstruction of energy-specific images. In the PICTGV method, the high-quality image is reconstructed from all detector-collected X-ray signals and is referred to as the broad-spectrum image. Distinct from existing reconstruction methods applied to images with a first-order derivative, the higher-order derivative of the images is incorporated into the PICTGV method. An alternating optimization algorithm is used to minimize the PICTGV objective function. We evaluate the performance of PICTGV in suppressing noise and artifacts using phantom studies, and compare the method with the conventional filtered back-projection method as well as a TGV-based method without a prior image. Results: On the digital phantom, the proposed method outperforms the existing TGV method in terms of noise reduction, artifact suppression, and edge detail preservation. Compared to the TGV-based method without a prior image, the relative root mean square error in the images reconstructed by the proposed method is reduced by over 20%. Conclusion: The authors propose iterative reconstruction via prior image constrained total generalized variation for spectral CT, develop an alternating optimization algorithm, and numerically demonstrate the merits of the approach. Results show that the proposed PICTGV method outperforms the TGV method for spectral CT.

  17. Dissecting effects of complex mixtures: who's afraid of informative priors?

    PubMed

    Thomas, Duncan C; Witte, John S; Greenland, Sander

    2007-03-01

    Epidemiologic studies commonly investigate multiple correlated exposures, which are difficult to analyze appropriately. Hierarchical modeling provides a promising approach for analyzing such data by adding a higher-level structure, or prior model, for the exposure effects. This prior model can incorporate additional information on similarities among the correlated exposures and can be parametric, semiparametric, or nonparametric. We discuss the implications of applying these models and argue for their expanded use in epidemiology. While a prior model adds assumptions to the conventional (first-stage) model, all statistical methods (including conventional methods) make strong intrinsic assumptions about the processes that generated the data. One should thus balance prior modeling assumptions against assumptions of validity, and use sensitivity analyses to understand their implications. In doing so, and by directly incorporating into our analyses information from other studies or allied fields, we can improve our ability to distinguish true causes of disease from noise and bias.
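A minimal semi-Bayes version of the two-stage idea (all numbers hypothetical, and the closed-form shrinkage here is only one simple parametric instance of the hierarchical models the abstract discusses): first-stage effect estimates for correlated exposures are shrunk toward second-stage class means defined by a prior design matrix Z, with the residual effect standard deviation tau assumed known:

```python
import numpy as np

# hypothetical first-stage estimates for four correlated exposure effects
beta_hat = np.array([0.40, 0.35, 0.05, 0.10])
se = np.array([0.15, 0.20, 0.15, 0.18])

# second-stage prior model: exposures in the same class share a mean effect
Z = np.array([[1, 0],
              [1, 0],
              [0, 1],
              [0, 1]], dtype=float)  # class membership (prior design matrix)
tau = 0.1  # assumed residual sd of true effects around their class mean

# estimate class means by precision weighting, then shrink each estimate
w = 1.0 / (se**2 + tau**2)
class_mean = (Z.T @ (w * beta_hat)) / (Z.T @ w)
prior_mean = Z @ class_mean
shrink = tau**2 / (tau**2 + se**2)   # weight kept on the first-stage estimate
beta_post = shrink * beta_hat + (1 - shrink) * prior_mean
```

Each posterior estimate lies between its first-stage value and its class mean, with noisier estimates (larger se) shrunk more strongly.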

  18. Self-Explanation in the Domain of Statistics: An Expertise Reversal Effect

    ERIC Educational Resources Information Center

    Leppink, Jimmie; Broers, Nick J.; Imbos, Tjaart; van der Vleuten, Cees P. M.; Berger, Martijn P. F.

    2012-01-01

    This study investigated the effects of four instructional methods on cognitive load, propositional knowledge, and conceptual understanding of statistics, for low prior knowledge students and for high prior knowledge students. The instructional methods were (1) a reading-only control condition, (2) answering open-ended questions, (3) answering…

  19. Bias in Diet Determination: Incorporating Traditional Methods in Bayesian Mixing Models

    PubMed Central

    Franco-Trecu, Valentina; Drago, Massimiliano; Riet-Sapriza, Federico G.; Parnell, Andrew; Frau, Rosina; Inchausti, Pablo

    2013-01-01

    There are no “universal methods” to determine the diet composition of predators. Most traditional methods are biased because of their reliance on differential digestibility and the recovery of hard items. By relying on assimilated food, stable isotope and Bayesian mixing models (SIMMs) resolve many biases of traditional methods. SIMMs can incorporate prior information (i.e. proportional diet composition) that may improve the precision of the estimated dietary composition. However, few studies have assessed the performance of traditional methods and SIMMs with and without informative priors in studying predators’ diets. Here we compare the diet compositions of the South American fur seal and sea lion obtained by scat analysis and by SIMMs-UP (uninformative priors), and assess whether informative priors (SIMMs-IP) from the scat analysis improved the estimated diet composition compared to SIMMs-UP. According to the SIMM-UP, pelagic species dominated the fur seal’s diet, while the sea lion’s diet showed no clear dominance of any prey. In contrast, the SIMM-IP diet compositions were dominated by the same prey as in the scat analyses. When prior information influenced SIMM estimates, incorporating informative priors improved the precision of the estimated diet composition at the risk of inducing biases in the estimates. If prey isotopic data allow discriminating prey contributions to diets, informative priors should lead to more precise but unbiased estimates of diet composition. Just as estimates of diet composition obtained from traditional methods are interpreted critically because of their biases, care must be exercised when interpreting diet compositions obtained by SIMMs-IP. The best approach to obtaining a near-complete view of a predator’s diet composition should involve the simultaneous consideration of different sources of partial evidence (traditional methods, SIMM-UP and SIMM-IP) in the light of the natural history of the predator species, so as to reliably ascertain and weight the information yielded by each method. PMID:24224031

  20. Obstacles to Using Prior Research and Evaluations.

    ERIC Educational Resources Information Center

    Orwin, Robert G.

    1985-01-01

    The manner in which results and methods are reported influences the ability to synthesize prior studies when planning new evaluations. Confidence ratings, coding conventions, and supplemental evidence can partially overcome the difficulties. Planners must acknowledge the influence of their own judgement in using prior research. (Author)

  21. The Role of Prior Experience in Feedback of Beginning Teachers

    ERIC Educational Resources Information Center

    Blount, Tametra Danielle

    2010-01-01

    This causal-comparative, mixed-methods study examined the role of prior experience in the mentoring needs of first-year teachers from alternative certification programs in three Tennessee counties. Teachers examined were: teachers from traditional teacher education programs, teachers with no prior teacher education experience, teachers with prior…

  22. Proportion estimation using prior cluster purities

    NASA Technical Reports Server (NTRS)

    Terrell, G. R. (Principal Investigator)

    1980-01-01

    The prior distribution of CLASSY component purities is studied, and this information incorporated into maximum likelihood crop proportion estimators. The method is tested on Transition Year spring small grain segments.

  3. Meta-analysis of few small studies in orphan diseases.

    PubMed

    Friede, Tim; Röver, Christian; Wandel, Simon; Neuenschwander, Beat

    2017-03-01

    Meta-analyses in orphan diseases and small populations generally face particular problems, including small numbers of studies, small study sizes and heterogeneity of results. However, the heterogeneity is difficult to estimate if only very few studies are included. Motivated by a systematic review in immunosuppression following liver transplantation in children, we investigate the properties of a range of commonly used frequentist and Bayesian procedures in simulation studies. Furthermore, the consequences for interval estimation of the common treatment effect in random-effects meta-analysis are assessed. The Bayesian credibility intervals using weakly informative priors for the between-trial heterogeneity exhibited coverage probabilities in excess of the nominal level for a range of scenarios considered. However, they tended to be shorter than those obtained by the Knapp-Hartung method, which were also conservative. In contrast, methods based on normal quantiles exhibited coverages well below the nominal levels in many scenarios. With very few studies, the performance of the Bayesian credibility intervals is of course sensitive to the specification of the prior for the between-trial heterogeneity. In conclusion, the use of weakly informative priors as exemplified by half-normal priors (with a scale of 0.5 or 1.0) for log odds ratios is recommended for applications in rare diseases. © 2016 The Authors. Research Synthesis Methods published by John Wiley & Sons Ltd.
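The recommended half-normal prior can be put to work in a minimal normal-normal random-effects model. The grid-based sketch below uses four invented log odds ratios and within-study variances; it illustrates the model class discussed in the abstract, not the authors' simulation code.

```python
import numpy as np

# Illustrative random-effects meta-analysis with a half-normal(0.5) prior on
# the between-trial standard deviation tau, computed on a grid. The four
# studies are invented.
y  = np.array([-0.6, -0.2, -0.9, 0.1])    # study log odds ratios (invented)
s2 = np.array([0.20, 0.25, 0.30, 0.15])   # within-study variances (invented)

tau = np.linspace(1e-4, 3.0, 600)         # grid over heterogeneity tau
log_prior = -0.5*(tau/0.5)**2             # half-normal prior with scale 0.5

# For each tau, the overall effect mu is integrated out analytically under a
# flat prior: y_i | tau ~ N(mu, s2_i + tau^2).
w = 1.0/(s2[:, None] + tau**2)            # inverse-variance weights, (4, 600)
mu_hat = (w*y[:, None]).sum(0)/w.sum(0)   # conditional posterior mean of mu
loglik = -0.5*np.log(w.sum(0)) + 0.5*np.log(w).sum(0) \
         - 0.5*(w*(y[:, None] - mu_hat)**2).sum(0)

post = np.exp(log_prior + loglik - (log_prior + loglik).max())
post /= post.sum()                        # posterior over the tau grid

mu_mean = (post*mu_hat).sum()             # posterior mean of mu, mixing over tau
tau_mean = (post*tau).sum()
print(f"posterior mean log-OR {mu_mean:.3f}, mean tau {tau_mean:.3f}")
```

With only four studies, the half-normal prior keeps the heterogeneity estimate away from implausibly large values that a likelihood-only analysis could wander into.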

  4. A Study about Placement Support Using Semantic Similarity

    ERIC Educational Resources Information Center

    Katz, Marco; van Bruggen, Jan; Giesbers, Bas; Waterink, Wim; Eshuis, Jannes; Koper, Rob

    2014-01-01

    This paper discusses Latent Semantic Analysis (LSA) as a method for the assessment of prior learning. The Accreditation of Prior Learning (APL) is a procedure to offer learners an individualized curriculum based on their prior experiences and knowledge. The placement decisions in this process are based on the analysis of student material by domain…

  5. A partial differential equation-based general framework adapted to Rayleigh's, Rician's and Gaussian's distributed noise for restoration and enhancement of magnetic resonance image.

    PubMed

    Yadav, Ram Bharos; Srivastava, Subodh; Srivastava, Rajeev

    2016-01-01

    The proposed framework is obtained by casting the noise removal problem in a variational setting. The framework automatically identifies the type of noise present in the magnetic resonance image and filters it by choosing an appropriate filter. This filter includes two terms: the first is a data likelihood term and the second is a prior function. The first term is obtained by minimizing the negative log likelihood of the corresponding probability density function: Gaussian, Rayleigh, or Rician. Further, due to the ill-posedness of the likelihood term, a prior function is needed. This paper examines three partial differential equation based priors: a total variation based prior, an anisotropic diffusion based prior, and a complex diffusion (CD) based prior. A regularization parameter is used to balance the trade-off between the data fidelity term and the prior. A finite difference scheme is used for discretization of the proposed method. The performance analysis and a comparative study of the proposed method against other standard methods are presented for the BrainWeb dataset at varying noise levels, in terms of peak signal-to-noise ratio, mean square error, structural similarity index map, and correlation parameter. From the simulation results, it is observed that the proposed framework with the CD based prior performs better than the other priors considered.
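The data-fidelity-plus-prior structure can be seen in a generic smoothed total-variation denoiser run by gradient descent on a toy image. This is only a sketch of the TV prior idea under a Gaussian likelihood; the paper's Rician/Rayleigh likelihoods and complex diffusion prior are not implemented here.

```python
import numpy as np

# Generic smoothed total-variation (TV) denoising by gradient descent on a
# toy piecewise-constant image with Gaussian noise (illustrative sketch only).
rng = np.random.default_rng(1)
clean = np.zeros((32, 32)); clean[8:24, 8:24] = 1.0
noisy = clean + rng.normal(0.0, 0.2, clean.shape)

lam, eps, step = 0.15, 1e-2, 0.1    # prior weight, TV smoothing, step size
u = noisy.copy()
for _ in range(300):
    gx = np.gradient(u, axis=0)
    gy = np.gradient(u, axis=1)
    mag = np.sqrt(gx**2 + gy**2 + eps)
    # divergence of the normalized gradient field = gradient of the TV term
    div = np.gradient(gx/mag, axis=0) + np.gradient(gy/mag, axis=1)
    u -= step*((u - noisy) - lam*div)   # fidelity term + TV prior descent

mse_noisy = ((noisy - clean)**2).mean()
mse_tv = ((u - clean)**2).mean()
print(f"MSE: noisy {mse_noisy:.4f} -> denoised {mse_tv:.4f}")
```

The regularization parameter `lam` plays the trade-off role described in the abstract: larger values suppress more noise at the cost of flattening genuine structure.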

  6. GWASinlps: Nonlocal prior based iterative SNP selection tool for genome-wide association studies.

    PubMed

    Sanyal, Nilotpal; Lo, Min-Tzu; Kauppi, Karolina; Djurovic, Srdjan; Andreassen, Ole A; Johnson, Valen E; Chen, Chi-Hua

    2018-06-19

    Multiple marker analysis of genome-wide association study (GWAS) data has gained ample attention in recent years. However, because of the ultra-high dimensionality of GWAS data, such analysis is challenging. Frequently used penalized regression methods often lead to a large number of false positives, whereas Bayesian methods are computationally very expensive. Motivated to ameliorate these issues simultaneously, we consider the novel approach of using nonlocal priors in an iterative variable selection framework. We develop a variable selection method, GWASinlps (iterative nonlocal prior based selection for GWAS), which combines the computational efficiency of a screen-and-select approach based on association learning with the parsimonious uncertainty quantification provided by nonlocal priors. The hallmark of our method is the introduction of a 'structured screen-and-select' strategy that performs hierarchical screening, based not only on response-predictor associations but also on response-response associations, and concatenates variable selection within that hierarchy. Extensive simulation studies with SNPs having realistic linkage disequilibrium structures demonstrate the advantages of our computationally efficient method over several frequentist and Bayesian variable selection methods, in terms of true positive rate, false discovery rate, mean squared error, and effect size estimation error. Further, we provide an empirical power analysis useful for study design. Finally, a real GWAS data application was considered with human height as the phenotype. An R package implementing the GWASinlps method is available at https://cran.r-project.org/web/packages/GWASinlps/index.html. Supplementary data are available at Bioinformatics online.
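The distinguishing feature of a nonlocal prior is that its density vanishes at the null value, unlike a local (e.g. normal) prior. A minimal numerical sketch of a first-order product-moment (pMOM) density, with an arbitrary illustrative scale:

```python
import numpy as np

# Density of a first-order product-moment (pMOM) nonlocal prior next to a
# local normal prior of the same scale. The nonlocal density is exactly zero
# at beta = 0, which is what lets nonlocal priors discard negligible SNP
# effects aggressively. tau = 1 is an arbitrary illustrative scale.
tau = 1.0
beta = 0.005*np.arange(-1000, 1001)        # grid containing beta = 0 exactly
normal = np.exp(-beta**2/(2.0*tau))/np.sqrt(2.0*np.pi*tau)
pmom = (beta**2/tau)*normal                # second-moment tilt of the normal

dx = 0.005
print(pmom[1000], (pmom*dx).sum())         # density at beta = 0, total mass
```

The tilt `beta**2/tau` both zeroes the density at the origin and renormalizes it (E[beta^2]/tau = 1 under the normal), so the pMOM form is a proper prior.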

  7. Feasibility of reusing time-matched controls in an overlapping cohort.

    PubMed

    Delcoigne, Bénédicte; Hagenbuch, Niels; Schelin, Maria Ec; Salim, Agus; Lindström, Linda S; Bergh, Jonas; Czene, Kamila; Reilly, Marie

    2018-06-01

    The methods developed for secondary analysis of nested case-control data have been illustrated only in simplified settings within a common cohort and have not found their way into biostatistical practice. This paper demonstrates the feasibility of reusing prior nested case-control data in a realistic setting in which a new outcome is available in an overlapping cohort, no new controls were gathered, and all data have been anonymised. Using basic information about the background cohort and the sampling criteria, the new cases and the prior data are "aligned" to identify the common underlying study base. From this study base, a Kaplan-Meier table of the prior outcome extracts the risk sets required to calculate the weights assigned to the controls to remove the sampling bias. A weighted Cox regression, implemented in standard statistical software, provides unbiased hazard ratios. Using the method to compare cases of contralateral breast cancer to available controls from a prior study of metastases, we identified a multifocal tumor as a risk factor that has not been reported previously. We examine the sensitivity of the method to an imperfect weighting scheme and discuss its merits and pitfalls to provide guidance for its use in medical research studies.
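The weighting idea can be sketched numerically. The code below computes Samuelsen-style inverse inclusion probabilities from the risk sets of a prior outcome in a simulated cohort; this is one standard scheme for reusing nested case-control controls, not necessarily the authors' exact calculation, and the cohort, cutoff, and number of controls per case are invented.

```python
import numpy as np

# Samuelsen-style weights for reusing nested case-control controls: recover
# each control's probability of ever having been sampled from the risk sets
# of the prior outcome, then weight by its inverse (illustrative sketch).
rng = np.random.default_rng(2)
n = 500
event_time = rng.exponential(10.0, n)      # time of the prior outcome
case = event_time < 2.0                    # prior-outcome cases in follow-up
m = 2                                      # controls drawn per case

case_times = np.sort(event_time[case])
# size of the risk set (still event-free) at each case's event time
at_risk = np.array([(event_time >= t).sum() for t in case_times])

def inclusion_prob(t):
    """P(a subject event-free at t was ever sampled as a control)."""
    sel = case_times < t                   # risk sets the subject belonged to
    return 1.0 - np.prod(1.0 - m/(at_risk[sel] - 1))

p = np.array([inclusion_prob(t) for t in event_time[~case]])
weights = 1.0/p                            # cases keep weight 1 in the Cox fit
print(f"{case.sum()} cases, mean control weight {weights.mean():.2f}")
```

These weights would then be passed to a weighted Cox regression in standard software, as the abstract describes.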

  8. A Bayesian approach to meta-analysis of plant pathology studies.

    PubMed

    Mila, A L; Ngugi, H K

    2011-01-01

    Bayesian statistical methods are used for meta-analysis in many disciplines, including medicine, molecular biology, and engineering, but have not yet been applied for quantitative synthesis of plant pathology studies. In this paper, we illustrate the key concepts of Bayesian statistics and outline the differences between Bayesian and classical (frequentist) methods in the way parameters describing population attributes are considered. We then describe a Bayesian approach to meta-analysis and present a plant pathological example based on studies evaluating the efficacy of plant protection products that induce systemic acquired resistance for the management of fire blight of apple. In a simple random-effects model assuming a normal distribution of effect sizes and no prior information (i.e., a noninformative prior), the results of the Bayesian meta-analysis are similar to those obtained with classical methods. Implementing the same model with a Student's t distribution and a noninformative prior for the effect sizes, instead of a normal distribution, yields similar results for all but acibenzolar-S-methyl (Actigard) which was evaluated only in seven studies in this example. Whereas both the classical (P = 0.28) and the Bayesian analysis with a noninformative prior (95% credibility interval [CRI] for the log response ratio: -0.63 to 0.08) indicate a nonsignificant effect for Actigard, specifying a t distribution resulted in a significant, albeit variable, effect for this product (CRI: -0.73 to -0.10). These results confirm the sensitivity of the analytical outcome (i.e., the posterior distribution) to the choice of prior in Bayesian meta-analyses involving a limited number of studies. 
We review some pertinent literature on more advanced topics, including modeling of among-study heterogeneity, publication bias, analyses involving a limited number of studies, and methods for dealing with missing data, and show how these issues can be approached in a Bayesian framework. Bayesian meta-analysis can readily include information not easily incorporated in classical methods, and allow for a full evaluation of competing models. Given the power and flexibility of Bayesian methods, we expect them to become widely adopted for meta-analysis of plant pathology studies.

  9. Assessment of Prior Learning in Adult Vocational Education and Training

    ERIC Educational Resources Information Center

    Aarkrog, Vibe; Wahlgren, Bjarne

    2015-01-01

    The article reports the results of a study of school-based Assessment of Prior Learning for adults who have enrolled as students at a VET college in order to qualify for occupations as skilled workers. Based on examples of VET teachers' methods for assessing the students' prior learning in the programs for gastronomes, respectively child care…

  10. Methods, History, Selected Findings, and Recommendations from the Louisiana School Effectiveness Study, 1980-85

    ERIC Educational Resources Information Center

    Teddlie, Charles; Stringfield, Samuel; Desselle, Stephanie

    2017-01-01

    An overview of the first five years of the Louisiana School Effectiveness Study (LSES) is presented. The longitudinal nature of the study has allowed the research team to develop an evolving methodology, one benefiting from prior external studies as well as from prior phases of LSES. Practical implications and recommendations for future research are…

  11. Careful with Those Priors: A Note on Bayesian Estimation in Two-Parameter Logistic Item Response Theory Models

    ERIC Educational Resources Information Center

    Marcoulides, Katerina M.

    2018-01-01

    This study examined the use of Bayesian analysis methods for the estimation of item parameters in a two-parameter logistic item response theory model. Using simulated data under various design conditions with both informative and non-informative priors, the parameter recovery of the Bayesian analysis methods was examined. Overall results showed that…

  12. Predictive distributions for between-study heterogeneity and simple methods for their application in Bayesian meta-analysis

    PubMed Central

    Turner, Rebecca M; Jackson, Dan; Wei, Yinghui; Thompson, Simon G; Higgins, Julian P T

    2015-01-01

    Numerous meta-analyses in healthcare research combine results from only a small number of studies, for which the variance representing between-study heterogeneity is estimated imprecisely. A Bayesian approach to estimation allows external evidence on the expected magnitude of heterogeneity to be incorporated. The aim of this paper is to provide tools that improve the accessibility of Bayesian meta-analysis. We present two methods for implementing Bayesian meta-analysis, using numerical integration and importance sampling techniques. Based on 14 886 binary outcome meta-analyses in the Cochrane Database of Systematic Reviews, we derive a novel set of predictive distributions for the degree of heterogeneity expected in 80 settings depending on the outcomes assessed and comparisons made. These can be used as prior distributions for heterogeneity in future meta-analyses. The two methods are implemented in R, for which code is provided. Both methods produce equivalent results to standard but more complex Markov chain Monte Carlo approaches. The priors are derived as log-normal distributions for the between-study variance, applicable to meta-analyses of binary outcomes on the log odds-ratio scale. The methods are applied to two example meta-analyses, incorporating the relevant predictive distributions as prior distributions for between-study heterogeneity. We have provided resources to facilitate Bayesian meta-analysis, in a form accessible to applied researchers, which allow relevant prior information on the degree of heterogeneity to be incorporated. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons Ltd. PMID:25475839
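One of the paper's ingredients, importance sampling with a log-normal prior for the between-study variance, can be sketched in a few lines. The log-normal parameters and the three study results below are illustrative inventions, not one of the paper's 80 derived settings.

```python
import numpy as np

# Sketch: draws from a log-normal predictive distribution for the
# between-study variance tau^2 serve as the prior; importance weights from
# the marginal likelihood of a normal-normal random-effects model give the
# posterior for the overall effect mu. All numbers are invented.
rng = np.random.default_rng(3)
tau2 = rng.lognormal(mean=-2.56, sigma=1.74, size=50_000)   # prior draws

y  = np.array([0.3, -0.1, 0.5])       # observed log odds ratios (invented)
s2 = np.array([0.10, 0.15, 0.20])     # their within-study variances

# importance weights: marginal likelihood of each tau^2 draw, with mu
# integrated out under a flat prior
V = s2[:, None] + tau2[None, :]
prec = 1.0/V
mu_hat = (prec*y[:, None]).sum(0)/prec.sum(0)
loglik = -0.5*np.log(prec.sum(0)) - 0.5*np.log(V).sum(0) \
         - 0.5*(prec*(y[:, None] - mu_hat)**2).sum(0)
w = np.exp(loglik - loglik.max())
w /= w.sum()

mu_post = (w*mu_hat).sum()            # posterior mean of the overall effect
tau2_post = (w*tau2).sum()            # posterior mean heterogeneity
print(f"mu: {mu_post:.3f}, tau^2: {tau2_post:.3f}")
```

Because the prior draws double as the proposal, no Markov chain Monte Carlo machinery is needed, which is the accessibility argument the abstract makes.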

  13. Ultrasound and Cadaveric Prosections as Methods for Teaching Cardiac Anatomy: A Comparative Study

    ERIC Educational Resources Information Center

    Griksaitis, Michael J.; Sawdon, Marina A.; Finn, Gabrielle M.

    2012-01-01

    This study compared the efficacy of two cardiac anatomy teaching modalities, ultrasound imaging and cadaveric prosections, for learning cardiac gross anatomy. One hundred and eight first-year medical students participated. Two weeks prior to the teaching intervention, students completed a pretest to assess their prior knowledge and to ensure that…

  14. Using expert knowledge for test linking.

    PubMed

    Bolsinova, Maria; Hoijtink, Herbert; Vermeulen, Jorine Adinda; Béguin, Anton

    2017-12-01

    Linking and equating procedures are used to make the results of different test forms comparable. In cases where no assumption of randomly equivalent groups can be made, some form of linking design is used. In practice, the amount of data available to link the two tests is often very limited for logistical and security reasons, which affects the precision of linking procedures. This study proposes to enhance the quality of linking procedures based on sparse data by using Bayesian methods that combine the information in the linking data with background information captured in informative prior distributions. We propose two methods for eliciting prior knowledge about the difference in difficulty of two tests from subject-matter experts and explain how the results can be used in the specification of priors. To illustrate the proposed methods and evaluate the quality of linking with and without informative priors, an empirical example of linking primary school mathematics tests is presented. The results suggest that informative priors can increase the precision of linking without decreasing the accuracy. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  15. TU-F-BRF-02: MR-US Prostate Registration Using Patient-Specific Tissue Elasticity Property Prior for MR-Targeted, TRUS-Guided HDR Brachytherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, X; Rossi, P; Ogunleye, T

    2014-06-15

    Purpose: High-dose-rate (HDR) brachytherapy has become a popular treatment modality for prostate cancer. Conventional transrectal ultrasound (TRUS)-guided prostate HDR brachytherapy could benefit significantly from an MR-targeted, TRUS-guided procedure in which the tumor locations, acquired from multiparametric MRI, are incorporated into the treatment planning. To enable this integration, we have developed an MR-TRUS registration with a patient-specific biomechanical elasticity prior. Methods: The proposed method used a biomechanical elasticity prior to guide the prostate volumetric B-spline deformation in the MRI-TRUS registration. The patient-specific biomechanical elasticity prior was generated using ultrasound elastography, in which two 3D TRUS prostate images were acquired under different probe-induced pressures during the HDR procedure, which takes 2-4 minutes. These two 3D TRUS images were used to calculate the local displacement (elasticity map) of the two prostate volumes. The B-spline transformation was calculated by minimizing the Euclidean distance between the normalized attribute vectors of the prostate surface landmarks on the MR and TRUS images. The technique was evaluated through two studies: a prostate-phantom study and a pilot study with 5 patients undergoing prostate HDR treatment. The accuracy of the approach was assessed through the locations of several landmarks in the post-registration and TRUS images, and the registration results were compared with a surface-based method. Results: For the phantom study, the mean landmark displacement of the proposed method was 1.29±0.11 mm. For the 5 patients, the mean landmark displacement of the surface-based method was 3.25±0.51 mm; for our method, 1.71±0.25 mm. The proposed prostate registration therefore significantly outperformed the surface-based registration.
Conclusion: We have developed a novel MR-TRUS prostate registration approach based on a patient-specific biomechanical elasticity prior. Successful integration of multiparametric MR and TRUS prostate images provides a prostate-cancer map for treatment planning, enables accurate dose planning and delivery, and potentially enhances prostate HDR treatment outcomes.

  16. Hippocampus segmentation using locally weighted prior based level set

    NASA Astrophysics Data System (ADS)

    Achuthan, Anusha; Rajeswari, Mandava

    2015-12-01

    Segmentation of the hippocampus is one of the major challenges in medical image segmentation because of its imaging characteristics: its intensity is almost identical to that of adjacent gray matter structures, such as the amygdala. This intensity similarity gives the hippocampus weak or fuzzy boundaries. Given this challenge, a segmentation method that relies on image information alone may not produce accurate segmentation results. Prior information, such as shape and spatial information, therefore needs to be assimilated into existing segmentation methods to produce the expected segmentation. Previous studies have widely integrated prior information into segmentation methods, but the prior information has been utilized in a globally integrated manner, which does not reflect the real scenario during clinical delineation. In this paper, prior information is instead integrated locally into a level set model. This work utilizes a mean shape model to provide automatic initialization for the level set evolution, and integrates it as prior information into the level set model. The local integration of edge-based information and prior information is implemented through an edge weighting map that decides, at the voxel level, which information should be observed during the level set evolution; the map indicates which voxels have sufficient edge information. Experiments show that the proposed local integration of prior information into a conventional edge-based level set model, the geodesic active contour, yields an improvement of 9% in average Dice coefficient.

  17. The impact of using informative priors in a Bayesian cost-effectiveness analysis: an application of endovascular versus open surgical repair for abdominal aortic aneurysms in high-risk patients.

    PubMed

    McCarron, C Elizabeth; Pullenayegum, Eleanor M; Thabane, Lehana; Goeree, Ron; Tarride, Jean-Eric

    2013-04-01

    Bayesian methods have been proposed as a way of synthesizing all available evidence to inform decision making. However, few practical applications of the use of Bayesian methods for combining patient-level data (i.e., trial) with additional evidence (e.g., literature) exist in the cost-effectiveness literature. The objective of this study was to compare a Bayesian cost-effectiveness analysis using informative priors to a standard non-Bayesian nonparametric method to assess the impact of incorporating additional information into a cost-effectiveness analysis. Patient-level data from a previously published nonrandomized study were analyzed using traditional nonparametric bootstrap techniques and bivariate normal Bayesian models with vague and informative priors. Two different types of informative priors were considered to reflect different valuations of the additional evidence relative to the patient-level data (i.e., "face value" and "skeptical"). The impact of using different distributions and valuations was assessed in a sensitivity analysis. Models were compared in terms of incremental net monetary benefit (INMB) and cost-effectiveness acceptability frontiers (CEAFs). The bootstrapping and Bayesian analyses using vague priors provided similar results. The most pronounced impact of incorporating the informative priors was the increase in estimated life years in the control arm relative to what was observed in the patient-level data alone. Consequently, the incremental difference in life years originally observed in the patient-level data was reduced, and the INMB and CEAF changed accordingly. The results of this study demonstrate the potential impact and importance of incorporating additional information into an analysis of patient-level data, suggesting this could alter decisions as to whether a treatment should be adopted and whether more information should be acquired.
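The mechanism by which an informative prior can flip a cost-effectiveness conclusion is easy to demonstrate. The sketch below uses invented numbers and simple conjugate normal updates on the incremental effect in place of the paper's bivariate Bayesian models.

```python
import numpy as np

# Sketch of how an informative prior can move an incremental net monetary
# benefit (INMB). All numbers are invented; a univariate normal-normal
# update stands in for the paper's bivariate models.
lam = 50_000.0                      # willingness to pay per life-year

dE_hat, dE_se = 0.40, 0.15          # incremental life-years (patient data)
dC_hat = 8_000.0                    # incremental cost (patient data)

def normal_update(prior_mean, prior_se, lik_mean, lik_se):
    """Precision-weighted posterior mean and sd for a normal-normal model."""
    w0, w1 = 1.0/prior_se**2, 1.0/lik_se**2
    return (w0*prior_mean + w1*lik_mean)/(w0 + w1), np.sqrt(1.0/(w0 + w1))

# hypothetical "face value" informative prior from the literature: a smaller
# incremental effect than seen in the patient-level data
dE_post, _ = normal_update(0.10, 0.10, dE_hat, dE_se)

inmb_vague = lam*dE_hat - dC_hat    # vague prior: the data drive the estimate
inmb_inf   = lam*dE_post - dC_hat   # informative prior shrinks the effect
print(f"INMB vague {inmb_vague:.0f}, informative {inmb_inf:.0f}")
```

As in the study, pulling the incremental effect toward the external evidence shrinks the INMB, which is precisely the kind of shift that can alter an adoption decision.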

  18. An Analysis of K-12 Teachers' Conceptions of Agriculture Prior to and during Engagement in an Agricultural Literacy Program

    ERIC Educational Resources Information Center

    Anderson, Shawn M.; Velez, Jonathan J.; Thompson, Gregory W.

    2014-01-01

    This study examined the K-12 teachers' conceptions of the agriculture industry prior to enrolling in an agricultural literacy program and how their conceptions changed throughout the program. The study used qualitative methods to analyze the data collected from entrance questionnaires, interviews, and reflective journals. Trustworthiness was…

  19. Developing Learning Objectives for a Model Course to Prepare Adults for the Assessment of Prior, Non-Sponsored Learning by Portfolio Evaluation.

    ERIC Educational Resources Information Center

    Stevens, Mary A.

    A study was conducted in order to develop a systematic method for the evaluation of students' prior, non-sponsored learning for the award of college credit at Blackhawk College (Illinois). It was determined that a course designed to prepare the student for assessment of prior learning was the best way for the institution to provide assistance to…

  20. External Prior Guided Internal Prior Learning for Real-World Noisy Image Denoising

    NASA Astrophysics Data System (ADS)

    Xu, Jun; Zhang, Lei; Zhang, David

    2018-06-01

    Most existing image denoising methods learn image priors either from external data or from the noisy image itself. However, priors learned from external data may not be adaptive to the image to be denoised, while priors learned from the given noisy image may not be accurate due to the interference of the corrupting noise. Meanwhile, the noise in real-world noisy images is very complex and hard to describe with simple distributions such as the Gaussian, making real noisy image denoising a very challenging problem. We propose to exploit the information in both external data and the given noisy image, and develop an external prior guided internal prior learning method for real noisy image denoising. We first learn external priors from an independent set of clean natural images. With the aid of the learned external priors, we then learn internal priors from the given noisy image to refine the prior model. The external and internal priors are formulated as a set of orthogonal dictionaries used to efficiently reconstruct the desired image. Extensive experiments are performed on several real noisy image datasets. The proposed method demonstrates highly competitive denoising performance, outperforming state-of-the-art denoising methods including those designed for real noisy images.

  1. An empirical Bayes approach to network recovery using external knowledge.

    PubMed

    Kpogbezan, Gino B; van der Vaart, Aad W; van Wieringen, Wessel N; Leday, Gwenaël G R; van de Wiel, Mark A

    2017-09-01

    Reconstruction of a high-dimensional network may benefit substantially from the inclusion of prior knowledge on the network topology. In the case of gene interaction networks, such knowledge may come, for instance, from pathway repositories like KEGG, or be inferred from data of a pilot study. The Bayesian framework provides a natural means of including such prior knowledge. Based on a Bayesian Simultaneous Equation Model, we develop an appealing Empirical Bayes (EB) procedure that automatically assesses the agreement of the prior knowledge with the data at hand. We use a variational Bayes method to approximate posterior densities and compare its accuracy with that of a Gibbs sampling strategy. Our method is computationally fast and can outperform known competitors. In a simulation study, we show that accurate prior data can greatly improve the reconstruction of the network, but need not harm the reconstruction if wrong. We demonstrate the benefits of the method in an analysis of gene expression data from GEO. In particular, the edges of the recovered network have superior reproducibility (compared to that of competitors) over resampled versions of the data. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Abdominal multi-organ segmentation from CT images using conditional shape–location and unsupervised intensity priors

    PubMed Central

    Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki

    2015-01-01

    This paper addresses the automated segmentation of multiple organs in upper abdominal computed tomography (CT) data. The aim of our study is to develop methods to effectively construct the conditional priors and use their prediction power for more accurate segmentation as well as easy adaptation to various imaging conditions in CT images, as observed in clinical practice. We propose a general framework of multi-organ segmentation which effectively incorporates interrelations among multiple organs and easily adapts to various imaging conditions without the need for supervised intensity information. The features of the framework are as follows: (1) A method for modeling conditional shape and location (shape–location) priors, which we call prediction-based priors, is developed to derive accurate priors specific to each subject, which enables the estimation of intensity priors without the need for supervised intensity information. (2) Organ correlation graph is introduced, which defines how the conditional priors are constructed and segmentation processes of multiple organs are executed. In our framework, predictor organs, whose segmentation is sufficiently accurate by using conventional single-organ segmentation methods, are pre-segmented, and the remaining organs are hierarchically segmented using conditional shape–location priors. The proposed framework was evaluated through the segmentation of eight abdominal organs (liver, spleen, left and right kidneys, pancreas, gallbladder, aorta, and inferior vena cava) from 134 CT data from 86 patients obtained under six imaging conditions at two hospitals. The experimental results show the effectiveness of the proposed prediction-based priors and the applicability to various imaging conditions without the need for supervised intensity information. Average Dice coefficients for the liver, spleen, and kidneys were more than 92%, and were around 73% and 67% for the pancreas and gallbladder, respectively. PMID:26277022

  3. Abdominal multi-organ segmentation from CT images using conditional shape-location and unsupervised intensity priors.

    PubMed

    Okada, Toshiyuki; Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki; Sato, Yoshinobu

    2015-12-01

    This paper addresses the automated segmentation of multiple organs in upper abdominal computed tomography (CT) data. The aim of our study is to develop methods to effectively construct the conditional priors and use their prediction power for more accurate segmentation as well as easy adaptation to various imaging conditions in CT images, as observed in clinical practice. We propose a general framework of multi-organ segmentation which effectively incorporates interrelations among multiple organs and easily adapts to various imaging conditions without the need for supervised intensity information. The features of the framework are as follows: (1) A method for modeling conditional shape and location (shape-location) priors, which we call prediction-based priors, is developed to derive accurate priors specific to each subject, which enables the estimation of intensity priors without the need for supervised intensity information. (2) Organ correlation graph is introduced, which defines how the conditional priors are constructed and segmentation processes of multiple organs are executed. In our framework, predictor organs, whose segmentation is sufficiently accurate by using conventional single-organ segmentation methods, are pre-segmented, and the remaining organs are hierarchically segmented using conditional shape-location priors. The proposed framework was evaluated through the segmentation of eight abdominal organs (liver, spleen, left and right kidneys, pancreas, gallbladder, aorta, and inferior vena cava) from 134 CT data from 86 patients obtained under six imaging conditions at two hospitals. The experimental results show the effectiveness of the proposed prediction-based priors and the applicability to various imaging conditions without the need for supervised intensity information. Average Dice coefficients for the liver, spleen, and kidneys were more than 92%, and were around 73% and 67% for the pancreas and gallbladder, respectively. 
Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Low dose CBCT reconstruction via prior contour based total variation (PCTV) regularization: a feasibility study

    NASA Astrophysics Data System (ADS)

    Chen, Yingxuan; Yin, Fang-Fang; Zhang, Yawei; Zhang, You; Ren, Lei

    2018-04-01

    Purpose: Compressed sensing reconstruction using total variation (TV) tends to over-smooth edge information by uniformly penalizing the image gradient. The goal of this study is to develop a novel prior contour based TV (PCTV) method to enhance the edge information in compressed sensing reconstruction for CBCT. Methods: The edge information is extracted from the prior planning CT via edge detection. The prior CT is first registered with the on-board CBCT reconstructed with the TV method through rigid or deformable registration. The edge contours in the prior CT are then mapped to the CBCT and used as the weight map for the TV regularization to enhance edge information in the CBCT reconstruction. The PCTV method was evaluated using the extended-cardiac-torso (XCAT) phantom, a physical CatPhan phantom, and brain patient data. Results were compared with both the TV and edge-preserving TV (EPTV) methods, which are commonly used for limited-projection CBCT reconstruction. In the quantitative evaluation, relative error was used to measure pixel value differences, and edge cross-correlation was defined as the similarity of edge information between the reconstructed images and the ground truth. Results: Compared to TV and EPTV, PCTV enhanced the edge information of bone, lung vessels, and tumor in the XCAT reconstruction and of complex bony structures in the brain patient CBCT. In the XCAT study using 45 half-fan CBCT projections, relative errors compared with the ground truth were 1.5%, 0.7%, and 0.3%, and edge cross-correlations were 0.66, 0.72, and 0.78 for TV, EPTV, and PCTV, respectively. PCTV is more robust to reduction of the projection number. Edge enhancement was reduced slightly with noisy projections, but PCTV was still superior to the other methods. PCTV can maintain resolution while reducing noise in the low-mAs CatPhan reconstruction. Low-contrast edges were preserved better with PCTV than with TV and EPTV.
Conclusion: PCTV preserved edge information as well as reduced streak artifacts and noise in low dose CBCT reconstruction. PCTV is superior to TV and EPTV methods in edge enhancement, which can potentially improve the localization accuracy in radiation therapy.
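    The weight-map idea in this abstract can be sketched numerically. The following is a minimal illustration, not the authors' PCTV implementation: we assume a boolean mask of prior contours and a reduced penalty weight on masked pixels (both the mask convention and the `edge_weight` parameter are our own choices):

```python
import numpy as np

def weighted_tv(image, edge_mask, edge_weight=0.2):
    """Edge-weighted total variation (TV) penalty.

    Pixels flagged in edge_mask (e.g. contours mapped from a prior CT)
    receive a reduced weight, so genuine edges are penalized less and
    survive the reconstruction; everywhere else the usual TV applies.
    """
    # Forward differences along each axis (replicated border)
    gx = np.diff(image, axis=0, append=image[-1:, :])
    gy = np.diff(image, axis=1, append=image[:, -1:])
    grad_mag = np.sqrt(gx**2 + gy**2)

    # Weight map: 1 away from prior contours, edge_weight on them
    w = np.where(edge_mask, edge_weight, 1.0)
    return float(np.sum(w * grad_mag))
```

    For a step-edge test image, masking the edge pixels lowers the penalty markedly, which is what lets an iterative solver keep that edge sharp while still smoothing elsewhere.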

  5. Bayesian hierarchical functional data analysis via contaminated informative priors.

    PubMed

    Scarpa, Bruno; Dunson, David B

    2009-09-01

    A variety of flexible approaches have been proposed for functional data analysis, allowing both the mean curve and the distribution about the mean to be unknown. Such methods are most useful when there is limited prior information. Motivated by applications to modeling of temperature curves in the menstrual cycle, this article proposes a flexible approach for incorporating prior information in semiparametric Bayesian analyses of hierarchical functional data. The proposed approach is based on specifying the distribution of functions as a mixture of a parametric hierarchical model and a nonparametric contamination. The parametric component is chosen based on prior knowledge, while the contamination is characterized as a functional Dirichlet process. In the motivating application, the contamination component allows unanticipated curve shapes in unhealthy menstrual cycles. Methods are developed for posterior computation, and the approach is applied to data from a European fecundability study.

  6. Integrating biological knowledge into variable selection: an empirical Bayes approach with an application in cancer biology

    PubMed Central

    2012-01-01

    Background An important question in the analysis of biochemical data is that of identifying subsets of molecular variables that may jointly influence a biological response. Statistical variable selection methods have been widely used for this purpose. In many settings, it may be important to incorporate ancillary biological information concerning the variables of interest. Pathway and network maps are one example of a source of such information. However, although ancillary information is increasingly available, it is not always clear how it should be used nor how it should be weighted in relation to primary data. Results We put forward an approach in which biological knowledge is incorporated using informative prior distributions over variable subsets, with prior information selected and weighted in an automated, objective manner using an empirical Bayes formulation. We employ continuous, linear models with interaction terms and exploit biochemically-motivated sparsity constraints to permit exact inference. We show an example of priors for pathway- and network-based information and illustrate our proposed method on both synthetic response data and by an application to cancer drug response data. Comparisons are also made to alternative Bayesian and frequentist penalised-likelihood methods for incorporating network-based information. Conclusions The empirical Bayes method proposed here can aid prior elicitation for Bayesian variable selection studies and help to guard against mis-specification of priors. Empirical Bayes, together with the proposed pathway-based priors, results in an approach with a competitive variable selection performance. In addition, the overall procedure is fast, deterministic, and has very few user-set parameters, yet is capable of capturing interplay between molecular players. The approach presented is general and readily applicable in any setting with multiple sources of biological prior knowledge. PMID:22578440

  7. The Impact of Nursing Students' Prior Chemistry Experience on Academic Performance and Perception of Relevance in a Health Science Course

    ERIC Educational Resources Information Center

    Boddey, Kerrie; de Berg, Kevin

    2015-01-01

    Nursing students have typically found the study of chemistry to be one of their major challenges in a nursing course. This mixed method study was designed to explore how prior experiences in chemistry might impact chemistry achievement during a health science unit. Nursing students (N = 101) studying chemistry as part of a health science unit were…

  8. Internet Use for Prediagnosis Symptom Appraisal by Colorectal Cancer Patients

    ERIC Educational Resources Information Center

    Thomson, Maria D.; Siminoff, Laura A.; Longo, Daniel R.

    2012-01-01

    Background: This study explored the characteristics of colorectal cancer (CRC) patients who accessed Internet-based health information as part of their symptom appraisal process prior to consulting a health care provider. Method: Newly diagnosed CRC patients who experienced symptoms prior to diagnosis were interviewed. Brief COPE was used to…

  9. Disjunctive Normal Shape and Appearance Priors with Applications to Image Segmentation.

    PubMed

    Mesadi, Fitsum; Cetin, Mujdat; Tasdizen, Tolga

    2015-10-01

    The use of appearance and shape priors in image segmentation is known to improve accuracy; however, existing techniques have several drawbacks. Active shape and appearance models require landmark points and assume unimodal shape and appearance distributions. Level-set-based shape priors are limited to global shape similarity. In this paper, we present novel shape and appearance priors for image segmentation based on an implicit parametric shape representation called the disjunctive normal shape model (DNSM). The DNSM is formed by a disjunction of conjunctions of half-spaces defined by discriminants. We learn shape and appearance statistics at varying spatial scales using nonparametric density estimation. Our method can generate a rich set of shape variations by locally combining training shapes. Additionally, by studying the intensity and texture statistics around each discriminant of our shape model, we construct a local appearance probability map. Experiments carried out on both medical and natural image datasets show the potential of the proposed method.
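    The phrase "disjunction of conjunctions of half-spaces" can be made concrete with a small sketch. This is a crisp (non-learned) version under our own conventions; the actual DNSM replaces the hard comparisons below with smooth sigmoidal discriminants fitted to training shapes:

```python
import numpy as np

def dnsm_indicator(points, polytopes):
    """Disjunctive-normal shape membership test.

    polytopes is a list of (W, b) pairs. Each pair encodes one convex
    polytope as the conjunction (AND) of half-spaces w.x + b >= 0; a
    point belongs to the shape if it lies in any polytope (OR).
    """
    inside_any = np.zeros(len(points), dtype=bool)
    for W, b in polytopes:
        # conjunction: every half-space of this polytope must hold
        inside = np.all(points @ W.T + b >= 0, axis=1)
        inside_any |= inside  # disjunction across polytopes
    return inside_any
```

    For example, the unit square is a single polytope with four half-spaces; unions of several such polytopes can approximate arbitrarily complex shapes.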

  10. Integrating prior information into microwave tomography part 2: Impact of errors in prior information on microwave tomography image quality.

    PubMed

    Kurrant, Douglas; Fear, Elise; Baran, Anastasia; LoVetri, Joe

    2017-12-01

    The authors have developed a method to combine a patient-specific map of tissue structure and average dielectric properties with microwave tomography (MWT). The patient-specific map is acquired with radar-based techniques and serves as prior information for microwave tomography. The impact that the degree of structural detail included in this prior information has on image quality was reported in a previous investigation. The aim of the present study is to extend this previous work by identifying and quantifying the impact that errors in the prior information have on image quality, including the reconstruction of internal structures and lesions embedded in fibroglandular tissue. This study also extends the work of others reported in literature by emulating a clinical setting with a set of experiments that incorporate heterogeneity into both the breast interior and glandular region, as well as prior information related to both fat and glandular structures. Patient-specific structural information is acquired using radar-based methods that form a regional map of the breast. Errors are introduced to create a discrepancy in the geometry and electrical properties between the regional map and the model used to generate the data. This permits the impact that errors in the prior information have on image quality to be evaluated. Image quality is quantitatively assessed by measuring the ability of the algorithm to reconstruct both internal structures and lesions embedded in fibroglandular tissue. The study is conducted using both 2D and 3D numerical breast models constructed from MRI scans. The reconstruction results demonstrate robustness of the method relative to errors in the dielectric properties of the background regional map, and to misalignment errors. These errors do not significantly influence the reconstruction accuracy of the underlying structures, or the ability of the algorithm to reconstruct malignant tissue. Although misalignment errors do not significantly impact the quality of the reconstructed fat and glandular structures for the 3D scenarios, the dielectric properties are reconstructed less accurately within the glandular structure for these cases relative to the 2D cases. However, general agreement between the 2D and 3D results was found. A key contribution of this paper is the detailed analysis of the impact of prior information errors on the reconstruction accuracy and ability to detect tumors. The results support the utility of acquiring patient-specific information with radar-based techniques and incorporating this information into MWT. The method is robust to errors in the dielectric properties of the background regional map, and to misalignment errors. Completion of this analysis is an important step toward developing the method into a practical diagnostic tool. © 2017 American Association of Physicists in Medicine.

  11. A Comparison of the β-Substitution Method and a Bayesian Method for Analyzing Left-Censored Data

    PubMed Central

    Huynh, Tran; Quick, Harrison; Ramachandran, Gurumurthy; Banerjee, Sudipto; Stenzel, Mark; Sandler, Dale P.; Engel, Lawrence S.; Kwok, Richard K.; Blair, Aaron; Stewart, Patricia A.

    2016-01-01

    Classical statistical methods for analyzing exposure data with values below the detection limits are well described in the occupational hygiene literature, but an evaluation of a Bayesian approach for handling such data is currently lacking. Here, we first describe a Bayesian framework for analyzing censored data. We then present the results of a simulation study conducted to compare the β-substitution method with a Bayesian method for exposure datasets drawn from lognormal distributions and mixed lognormal distributions with varying sample sizes, geometric standard deviations (GSDs), and censoring for single and multiple limits of detection. For each set of factors, estimates for the arithmetic mean (AM), geometric mean, GSD, and the 95th percentile (X0.95) of the exposure distribution were obtained. We evaluated the performance of each method using relative bias, the root mean squared error (rMSE), and coverage (the proportion of the computed 95% uncertainty intervals containing the true value). The Bayesian method using non-informative priors and the β-substitution method were generally comparable in bias and rMSE when estimating the AM and GM. For the GSD and the 95th percentile, the Bayesian method with non-informative priors was more biased and had a higher rMSE than the β-substitution method, but use of more informative priors generally improved the Bayesian method’s performance, making both the bias and the rMSE more comparable to the β-substitution method. An advantage of the Bayesian method is that it provided estimates of uncertainty for these parameters of interest and good coverage, whereas the β-substitution method only provided estimates of uncertainty for the AM, and coverage was not as consistent. Selection of one or the other method depends on the needs of the practitioner, the availability of prior information, and the distribution characteristics of the measurement data. We suggest the use of Bayesian methods if the practitioner has the computational resources and prior information, as they generally provide accurate estimates and also yield the full distributions of all of the parameters, which could be useful for making decisions in some applications. PMID:26209598
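    The simulation setup this study describes (draw lognormal exposures, censor below a limit of detection, then estimate the summary parameters) can be sketched as below. For brevity we substitute LOD/√2 for censored values, a common simple rule that stands in here for the more involved β-substitution formulas; the GM, GSD, and LOD values are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate a lognormal exposure dataset: GM = 1.0, GSD = 2.5
gm, gsd = 1.0, 2.5
x = rng.lognormal(mean=np.log(gm), sigma=np.log(gsd), size=200)

# Left-censor everything below a single limit of detection (LOD)
lod = 0.5
censored = x < lod

# LOD/sqrt(2) substitution stands in for the beta-substitution method
x_sub = np.where(censored, lod / np.sqrt(2), x)

am = x_sub.mean()                             # arithmetic mean (AM)
gm_hat = np.exp(np.log(x_sub).mean())         # geometric mean (GM)
gsd_hat = np.exp(np.log(x_sub).std(ddof=1))   # geometric SD (GSD)
x95 = gm_hat * gsd_hat ** 1.645               # 95th percentile, lognormal
```

    A Bayesian treatment of the same data would instead model the censored observations explicitly and return full posterior distributions for AM, GM, GSD, and X0.95, which is the advantage the abstract highlights.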

  12. Method for Analyzing Students' Utilization of Prior Physics Learning in New Contexts

    ERIC Educational Resources Information Center

    McBride, Dyan L.; Zollman, Dean; Rebello, N. Sanjay

    2010-01-01

    In prior research, the classification of concepts into three types--descriptive, hypothetical and theoretical--has allowed for the association of students' use of different concept types with their level of understanding. Previous studies have also examined the ways in which students link concepts to determine whether students have a meaningful…

  13. The Implications of Learners' Goal Orientation in a Prior Learning Assessment Program

    ERIC Educational Resources Information Center

    McClintock, Patricia

    2013-01-01

    This mixed methods sequential explanatory study was designed to investigate students' persistence in an online Prior Learning Assessment (PLA) Program by researching the implications of goal orientation and other academic, institutional, and student-related factors of non-traditional students enrolled in such a program at the University of St.…

  14. Prevalence and Predictors of Sexual Assault among a College Sample

    ERIC Educational Resources Information Center

    Conley, A. H.; Overstreet, C. M.; Hawn, S. E.; Kendler, K. S.; Dick, D. M.; Amstadter, A. B.

    2017-01-01

    Objective: This study examined the prevalence and correlates of precollege, college-onset, and repeat sexual assault (SA) within a representative student sample. Participants: A representative sample of 7,603 students. Methods: Incoming first-year students completed a survey about their exposure to broad SA prior to college, prior trauma,…

  15. Adaptive Prior Variance Calibration in the Bayesian Continual Reassessment Method

    PubMed Central

    Zhang, Jin; Braun, Thomas M.; Taylor, Jeremy M.G.

    2012-01-01

    Use of the Continual Reassessment Method (CRM) and other model-based approaches to design in Phase I clinical trials has increased due to the ability of the CRM to identify the maximum tolerated dose (MTD) better than the 3+3 method. However, the CRM can be sensitive to the variance selected for the prior distribution of the model parameter, especially when a small number of patients are enrolled. While methods have emerged to adaptively select skeletons and to calibrate the prior variance only at the beginning of a trial, there has not been any approach developed to adaptively calibrate the prior variance throughout a trial. We propose three systematic approaches to adaptively calibrate the prior variance during a trial and compare them via simulation to methods proposed to calibrate the variance at the beginning of a trial. PMID:22987660
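    A one-parameter power-model CRM of the kind discussed above can be sketched on a grid. The skeleton, default values, and function name are our own choices; the `prior_sd` argument is exactly the quantity the article proposes to calibrate adaptively during the trial:

```python
import numpy as np

def crm_posterior_mtd(skeleton, doses_given, tox, target=0.25, prior_sd=1.34):
    """One-parameter power-model CRM evaluated on a grid.

    Dose-toxicity model: p_i(beta) = skeleton[i] ** exp(beta), with
    prior beta ~ N(0, prior_sd**2). Returns the index of the dose whose
    posterior-mean toxicity probability is closest to the target.
    """
    beta = np.linspace(-4, 4, 2001)
    post = np.exp(-0.5 * (beta / prior_sd) ** 2)  # prior, unnormalized
    for d, y in zip(doses_given, tox):            # binary toxicity outcomes
        p = skeleton[d] ** np.exp(beta)
        post *= p ** y * (1 - p) ** (1 - y)
    post /= post.sum()                            # normalize on the grid
    p_tox = np.array([(skeleton[i] ** np.exp(beta) * post).sum()
                      for i in range(len(skeleton))])
    return int(np.argmin(np.abs(p_tox - target))), p_tox
```

    Adaptive variance calibration would amount to recomputing `prior_sd` between cohorts rather than fixing it at the start, which is the gap the paper addresses.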

  16. Model-based Bayesian inference for ROC data analysis

    NASA Astrophysics Data System (ADS)

    Lei, Tianhu; Bae, K. Ty

    2013-03-01

    This paper presents a study of model-based Bayesian inference applied to Receiver Operating Characteristic (ROC) data. The model is a simple version of a general non-linear regression model. Unlike the Dorfman model, it uses a probit link function with a zero-one covariate to express the binormal distributions in a single formula. The model also includes a scale parameter. Bayesian inference is implemented by the Markov chain Monte Carlo (MCMC) method carried out with Bayesian analysis Using Gibbs Sampling (BUGS). In contrast to classical statistical theory, the Bayesian approach treats model parameters as random variables characterized by prior distributions. With a substantial number of simulated samples generated by the sampling algorithm, the posterior distributions of the parameters, as well as the parameters themselves, can be accurately estimated. MCMC-based BUGS adopts the Adaptive Rejection Sampling (ARS) protocol, which requires the probability density function (pdf) from which samples are drawn to be log-concave with respect to the targeted parameters. Our study corrects a common misconception and proves that the pdf of this regression model is log-concave with respect to its scale parameter. Therefore, ARS's requirement is satisfied, and a Gaussian prior, which is conjugate and possesses many analytic and computational advantages, is assigned to the scale parameter. A cohort of 20 simulated data sets and 20 simulations from each data set are used in our study. Output analysis and convergence diagnostics for the MCMC method are assessed with the CODA package. Models and methods using a continuous Gaussian prior and a discrete categorical prior are compared. Intensive simulations and performance measures are given to illustrate our practice within the framework of model-based Bayesian inference using the MCMC method.

  17. Prior Military Service, Identity Stigma, and Mental Health Among Transgender Older Adults

    PubMed Central

    Hoy-Ellis, Charles P.; Shiu, Chengshi; Sullivan, Kathleen M.; Kim, Hyun-Jun; Sturges, Allison M.; Fredriksen-Goldsen, Karen I.

    2017-01-01

    Purpose of the Study: Converging evidence from large community-based samples, Internet studies, and Veterans Health Administration data suggest that transgender adults have high rates of U.S. military service. However, little is known about the role of prior military service in their mental health later in life, particularly in relation to identity stigma. In this article, we examine relationships between prior military service, identity stigma, and mental health among transgender older adults. Design and Methods: We used a subsample of transgender older adults (n = 183) from the 2014 survey of Aging with Pride: National Health, Aging, and Sexuality/Gender Study (NHAS). We employed weighted multivariate linear models to evaluate the relationships between psychological health-related quality of life (HRQOL), depressive symptomatology (Center for Epidemiological Studies Depression Scale [CES-D] scores), identity stigma, and prior military service, controlling for background characteristics. Results: Identity stigma was significantly related with higher depressive symptomatology and lower psychological HRQOL. Having a history of prior military service significantly predicted lower depressive symptomatology and higher psychological HRQOL. The relationships between psychological HRQOL, identity stigma, and prior military service were largely explained by depressive symptomatology. Prior military service significantly attenuated the relationship between identity stigma and depressive symptomatology. Implications: By identifying the role of military service in the mental health of transgender older adults, this study provides insights into how prior military service may contribute to resilience and positive mental health outcomes. Directions for future research are discussed. PMID:28087796

  18. A comparison of confidence interval methods for the concordance correlation coefficient and intraclass correlation coefficient with small number of raters.

    PubMed

    Feng, Dai; Svetnik, Vladimir; Coimbra, Alexandre; Baumgartner, Richard

    2014-01-01

    The intraclass correlation coefficient (ICC) with fixed raters or, equivalently, the concordance correlation coefficient (CCC) for continuous outcomes is a widely accepted aggregate index of agreement in settings with a small number of raters. Quantifying the precision of the CCC by constructing its confidence interval (CI) is important in early drug development applications, in particular in the qualification of biomarker platforms. In recent years, several new methods have been proposed for constructing CIs for the CCC, but a comprehensive comparison has not been attempted. The methods comprise the delta method and jackknifing, each with and without Fisher's Z-transformation, and Bayesian methods with vague priors. In this study, we carried out a simulation study, with data simulated from a multivariate normal as well as a heavier-tailed distribution (t-distribution with 5 degrees of freedom), to compare the state-of-the-art methods for assigning a CI to the CCC. When the data are normally distributed, jackknifing with Fisher's Z-transformation (JZ) tended to provide superior coverage, and the difference between it and the closest competitor, the Bayesian method with the Jeffreys prior, was in general minimal. For the nonnormal data, the jackknife methods, especially the JZ method, provided the coverage probabilities closest to nominal, in contrast to the others, which yielded overly liberal coverage. Approaches based on the delta method and the Bayesian method with a conjugate prior generally provided slightly narrower intervals and larger lower bounds than the others, though this was offset by their poor coverage. Finally, we illustrate the utility of CIs for the CCC in an example of a wake after sleep onset (WASO) biomarker, which is frequently used in clinical sleep studies of drugs for the treatment of insomnia.
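    Lin's CCC and a jackknife interval on Fisher's Z scale (the "JZ" method favored in this comparison) can be sketched as follows, assuming two raters and a 95% normal-quantile interval; this is an illustration of the idea, not the authors' exact implementation:

```python
import numpy as np

def ccc(x, y):
    """Lin's concordance correlation coefficient for two raters."""
    mx, my = x.mean(), y.mean()
    sxy = ((x - mx) * (y - my)).mean()
    return 2 * sxy / (x.var() + y.var() + (mx - my) ** 2)

def ccc_ci_jz(x, y):
    """95% jackknife CI for the CCC on Fisher's Z scale ("JZ")."""
    n = len(x)
    z_full = np.arctanh(ccc(x, y))
    # leave-one-out pseudovalues on the Z scale
    pseudo = np.array([n * z_full - (n - 1) *
                       np.arctanh(ccc(np.delete(x, i), np.delete(y, i)))
                       for i in range(n)])
    m, se = pseudo.mean(), pseudo.std(ddof=1) / np.sqrt(n)
    z95 = 1.96  # normal quantile; a t quantile is also common
    return np.tanh(m - z95 * se), np.tanh(m + z95 * se)
```

    Working on the Z scale and back-transforming with tanh keeps the interval inside (-1, 1), which is part of why the JZ method's coverage held up for heavier-tailed data.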

  19. Integration of prior CT into CBCT reconstruction for improved image quality via reconstruction of difference: first patient studies

    NASA Astrophysics Data System (ADS)

    Zhang, Hao; Gang, Grace J.; Lee, Junghoon; Wong, John; Stayman, J. Webster

    2017-03-01

    Purpose: There are many clinical situations where diagnostic CT is used for an initial diagnosis or treatment planning, followed by one or more CBCT scans that are part of an image-guided intervention. Because the high-quality diagnostic CT scan is a rich source of patient-specific anatomical knowledge, this provides an opportunity to incorporate the prior CT image into subsequent CBCT reconstruction for improved image quality. We propose a penalized-likelihood method called reconstruction of difference (RoD), to directly reconstruct differences between the CBCT scan and the CT prior. In this work, we demonstrate the efficacy of RoD with clinical patient datasets. Methods: We introduce a data processing workflow using the RoD framework to reconstruct anatomical changes between the prior CT and current CBCT. This workflow includes processing steps to account for non-anatomical differences between the two scans including 1) scatter correction for CBCT datasets due to increased scatter fractions in CBCT data; 2) histogram matching for attenuation variations between CT and CBCT; and 3) registration for different patient positioning. CBCT projection data and CT planning volumes for two radiotherapy patients - one abdominal study and one head-and-neck study - were investigated. Results: In comparisons between the proposed RoD framework and more traditional FDK and penalized-likelihood reconstructions, we find a significant improvement in image quality when prior CT information is incorporated into the reconstruction. RoD is able to provide additional low-contrast details while correctly incorporating actual physical changes in patient anatomy. Conclusions: The proposed framework provides an opportunity to either improve image quality or relax data fidelity constraints for CBCT imaging when prior CT studies of the same patient are available. Possible clinical targets include CBCT image-guided radiotherapy and CBCT image-guided surgeries.

  20. Implementing informative priors for heterogeneity in meta-analysis using meta-regression and pseudo data.

    PubMed

    Rhodes, Kirsty M; Turner, Rebecca M; White, Ian R; Jackson, Dan; Spiegelhalter, David J; Higgins, Julian P T

    2016-12-20

    Many meta-analyses combine results from only a small number of studies, a situation in which the between-study variance is imprecisely estimated when standard methods are applied. Bayesian meta-analysis allows incorporation of external evidence on heterogeneity, providing the potential for more robust inference on the effect size of interest. We present a method for performing Bayesian meta-analysis using data augmentation, in which we represent an informative conjugate prior for between-study variance by pseudo data and use meta-regression for estimation. To assist in this, we derive predictive inverse-gamma distributions for the between-study variance expected in future meta-analyses. These may serve as priors for heterogeneity in new meta-analyses. In a simulation study, we compare approximate Bayesian methods using meta-regression and pseudo data against fully Bayesian approaches based on importance sampling techniques and Markov chain Monte Carlo (MCMC). We compare the frequentist properties of these Bayesian methods with those of the commonly used frequentist DerSimonian and Laird procedure. The method is implemented in standard statistical software and provides a less complex alternative to standard MCMC approaches. An importance sampling approach produces almost identical results to standard MCMC approaches, and results obtained through meta-regression and pseudo data are very similar. On average, data augmentation provides closer results to MCMC, if implemented using restricted maximum likelihood estimation rather than DerSimonian and Laird or maximum likelihood estimation. The methods are applied to real datasets, and an extension to network meta-analysis is described. The proposed method facilitates Bayesian meta-analysis in a way that is accessible to applied researchers. © 2016 The Authors. Statistics in Medicine Published by John Wiley & Sons Ltd.
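    For reference, the frequentist DerSimonian and Laird procedure that the Bayesian methods above are compared against can be sketched in a few lines (variable names are ours):

```python
import numpy as np

def dersimonian_laird(y, v):
    """DerSimonian-Laird random-effects meta-analysis.

    y: per-study effect estimates; v: their within-study variances.
    Returns the between-study variance tau^2, the pooled estimate,
    and its standard error.
    """
    w = 1.0 / v
    y_fixed = (w * y).sum() / w.sum()
    q = (w * (y - y_fixed) ** 2).sum()     # Cochran's Q statistic
    k = len(y)
    c = w.sum() - (w ** 2).sum() / w.sum()
    tau2 = max(0.0, (q - (k - 1)) / c)     # method-of-moments, floored at 0
    w_re = 1.0 / (v + tau2)                # random-effects weights
    mu = (w_re * y).sum() / w_re.sum()
    se = np.sqrt(1.0 / w_re.sum())
    return tau2, mu, se
```

    With few studies, tau^2 is often truncated at zero as in the homogeneous example below, which is precisely the imprecision that an informative prior on heterogeneity is meant to address.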

  1. Accommodating Uncertainty in Prior Distributions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Vander Wiel, Scott Alan

    2017-01-19

    A fundamental premise of Bayesian methodology is that a priori information is accurately summarized by a single, precisely defined prior distribution. In many cases, especially those involving informative priors, this premise is false, and the (mis)application of Bayes methods produces posterior quantities whose apparent precisions are highly misleading. We examine the implications of uncertainty in prior distributions and present graphical methods for dealing with them.

  2. Parameter estimation of multivariate multiple regression model using bayesian with non-informative Jeffreys’ prior distribution

    NASA Astrophysics Data System (ADS)

    Saputro, D. R. S.; Amalia, F.; Widyaningsih, P.; Affan, R. C.

    2018-05-01

    The Bayesian method can be used to estimate the parameters of a multivariate multiple regression model. It involves two distributions: the prior and the posterior. The posterior distribution is influenced by the choice of prior distribution. Jeffreys' prior is a non-informative prior distribution, used when information about the parameters is not available. The non-informative Jeffreys' prior is combined with the sample information to produce the posterior distribution, which is then used to estimate the parameters. The purpose of this research is to estimate the parameters of a multivariate regression model using the Bayesian method with the non-informative Jeffreys' prior. Based on the results and discussion, the estimates of β and Σ are obtained as the expected values of their marginal posterior distributions, which are multivariate normal and inverse Wishart, respectively. However, calculating these expected values involves integrals that are difficult to evaluate analytically. Therefore, random samples are generated according to the posterior distribution of each parameter using the Markov chain Monte Carlo (MCMC) Gibbs sampling algorithm.
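    A sketch of the Gibbs sampler this abstract describes, using the standard conjugate conditionals under the Jeffreys prior p(B, Σ) ∝ |Σ|^-(p+1)/2: B given Σ is matrix normal around the least-squares estimate, and Σ given B is inverse-Wishart on the residual sum-of-products matrix. All names and the Bartlett-decomposition helper are our own, and this is an illustration rather than the authors' code:

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_inv_wishart(df, scale):
    """Inverse-Wishart draw via the Bartlett decomposition."""
    p = scale.shape[0]
    L = np.linalg.cholesky(np.linalg.inv(scale))
    A = np.zeros((p, p))
    for i in range(p):
        A[i, i] = np.sqrt(rng.chisquare(df - i))
        A[i, :i] = rng.standard_normal(i)
    W = L @ A @ A.T @ L.T          # W ~ Wishart(df, scale^{-1})
    return np.linalg.inv(W)        # so W^{-1} ~ inv-Wishart(df, scale)

def gibbs_mvreg(Y, X, n_iter=300):
    """Gibbs sampler for Y = X B + E under the Jeffreys prior."""
    n, p = Y.shape
    XtX_inv = np.linalg.inv(X.T @ X)
    B_hat = XtX_inv @ X.T @ Y      # least-squares estimate
    chol_r = np.linalg.cholesky(XtX_inv)
    Sigma = np.eye(p)
    draws = []
    for _ in range(n_iter):
        # B | Sigma, Y: matrix normal around B_hat
        Z = rng.standard_normal(B_hat.shape)
        B = B_hat + chol_r @ Z @ np.linalg.cholesky(Sigma).T
        # Sigma | B, Y: inverse-Wishart on the residual SSP matrix
        R = Y - X @ B
        Sigma = sample_inv_wishart(n, R.T @ R)
        draws.append(B)
    return np.mean(draws, axis=0)  # Monte Carlo posterior mean of B
```

    Averaging the draws approximates the expected value of the marginal posterior, which is the estimator the abstract describes.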

  3. Performance of statistical process control methods for regional surgical site infection surveillance: a 10-year multicentre pilot study.

    PubMed

    Baker, Arthur W; Haridy, Salah; Salem, Joseph; Ilieş, Iulian; Ergai, Awatef O; Samareh, Aven; Andrianas, Nicholas; Benneyan, James C; Sexton, Daniel J; Anderson, Deverick J

    2017-11-24

    Traditional strategies for surveillance of surgical site infections (SSI) have multiple limitations, including delayed and incomplete outbreak detection. Statistical process control (SPC) methods address these deficiencies by combining longitudinal analysis with graphical presentation of data. We performed a pilot study within a large network of community hospitals to evaluate performance of SPC methods for detecting SSI outbreaks. We applied conventional Shewhart and exponentially weighted moving average (EWMA) SPC charts to 10 previously investigated SSI outbreaks that occurred from 2003 to 2013. We compared the results of SPC surveillance to the results of traditional SSI surveillance methods. Then, we analysed the performance of modified SPC charts constructed with different outbreak detection rules, EWMA smoothing factors and baseline SSI rate calculations. Conventional Shewhart and EWMA SPC charts both detected 8 of the 10 SSI outbreaks analysed, in each case prior to the date of traditional detection. Among detected outbreaks, conventional Shewhart chart detection occurred a median of 12 months prior to outbreak onset and 22 months prior to traditional detection. Conventional EWMA chart detection occurred a median of 7 months prior to outbreak onset and 14 months prior to traditional detection. Modified Shewhart and EWMA charts additionally detected several outbreaks earlier than conventional SPC charts. Shewhart and EWMA charts had low false-positive rates when used to analyse separate control hospital SSI data. Our findings illustrate the potential usefulness and feasibility of real-time SPC surveillance of SSI to rapidly identify outbreaks and improve patient safety. Further study is needed to optimise SPC chart selection and calculation, statistical outbreak detection rules and the process for reacting to signals of potential outbreaks. © Article author(s) (or their employer(s) unless otherwise stated in the text of the article) 2017. All rights reserved. No commercial use is permitted unless otherwise expressly granted.
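    A conventional EWMA chart of the kind used in this study can be sketched as follows; the smoothing factor λ = 0.2 and the control-limit multiplier are typical textbook defaults, not necessarily the values the authors used:

```python
import numpy as np

def ewma_chart(rates, baseline_mean, baseline_sd, lam=0.2, limit_mult=2.7):
    """Upper one-sided EWMA control chart for infection rates.

    Returns the smoothed statistic z_t and a boolean signal for each
    period in which z_t exceeds the upper control limit.
    """
    z = baseline_mean
    zs, signals = [], []
    for t, x in enumerate(rates, start=1):
        z = lam * x + (1 - lam) * z
        # exact (time-varying) variance of the EWMA statistic
        var = baseline_sd**2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * t))
        ucl = baseline_mean + limit_mult * np.sqrt(var)
        zs.append(z)
        signals.append(z > ucl)
    return np.array(zs), np.array(signals)
```

    A series running at the baseline rate never signals, while a sustained upward shift crosses the limit within a few periods; this early-crossing behavior is what produced detection months before traditional surveillance in the study above.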

  4. A Method for Constructing Informative Priors for Bayesian Modeling of Occupational Hygiene Data.

    PubMed

    Quick, Harrison; Huynh, Tran; Ramachandran, Gurumurthy

    2017-01-01

    In many occupational hygiene settings, the demand for more accurate, more precise results is at odds with limited resources. To combat this, practitioners have begun using Bayesian methods to incorporate prior information into their statistical models in order to obtain more refined inference from their data. This is not without risk, however, as incorporating prior information that disagrees with the information contained in data can lead to spurious conclusions, particularly if the prior is too informative. In this article, we propose a method for constructing informative prior distributions for normal and lognormal data that are intuitive to specify and robust to bias. To demonstrate the use of these priors, we walk practitioners through a step-by-step implementation of our priors using an illustrative example. We then conclude with recommendations for general use. © The Author 2017. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  5. An approach to the analysis of performance of quasi-optimum digital phase-locked loops.

    NASA Technical Reports Server (NTRS)

    Polk, D. R.; Gupta, S. C.

    1973-01-01

    An approach to the analysis of performance of quasi-optimum digital phase-locked loops (DPLL's) is presented. An expression for the characteristic function of the prior error in the state estimate is derived, and from this expression an infinite dimensional equation for the prior error variance is obtained. The prior error-variance equation is a function of the communication system model and the DPLL gain and is independent of the method used to derive the DPLL gain. Two approximations are discussed for reducing the prior error-variance equation to finite dimension. The effectiveness of one approximation in analyzing DPLL performance is studied.

  6. Covariate Measurement Error Correction for Student Growth Percentiles Using the SIMEX Method

    ERIC Educational Resources Information Center

    Shang, Yi; VanIwaarden, Adam; Betebenner, Damian W.

    2015-01-01

    In this study, we examined the impact of covariate measurement error (ME) on the estimation of quantile regression and student growth percentiles (SGPs), and find that SGPs tend to be overestimated among students with higher prior achievement and underestimated among those with lower prior achievement, a problem we describe as ME endogeneity in…

  7. Adult Learner Graduation Rates at Four U.S. Community Colleges by Prior Learning Assessment Status and Method

    ERIC Educational Resources Information Center

    Hayward, Milan S.

    2012-01-01

    The college completion agenda demands improved graduation rates among adult learners and prior learning assessment (PLA) is a promising solution. PLA permits students to earn college credit for knowledge acquired outside of higher education and is associated with improved student outcomes. The current study expanded the literature regarding adult…

  8. Metal Artifact Reduction in X-ray Computed Tomography Using Computer-Aided Design Data of Implants as Prior Information.

    PubMed

    Ruth, Veikko; Kolditz, Daniel; Steiding, Christian; Kalender, Willi A

    2017-06-01

    The performance of metal artifact reduction (MAR) methods in x-ray computed tomography (CT) suffers from incorrect identification of metallic implants in the artifact-affected volumetric images. The aim of this study was to investigate potential improvements of state-of-the-art MAR methods by using prior information on geometry and material of the implant. The influence of a novel prior knowledge-based segmentation (PS) compared with threshold-based segmentation (TS) on 2 MAR methods (linear interpolation [LI] and normalized-MAR [NORMAR]) was investigated. The segmentation is the initial step of both MAR methods. Prior knowledge-based segmentation uses 3-dimensional registered computer-aided design (CAD) data as prior knowledge to estimate the correct position and orientation of the metallic objects. Threshold-based segmentation uses an adaptive threshold to identify metal. Subsequently, for LI and NORMAR, the selected voxels are projected into the raw data domain to mark metal areas. Attenuation values in these areas are replaced by different interpolation schemes followed by a second reconstruction. Finally, the previously selected metal voxels are replaced by the metal voxels determined by PS or TS in the initial reconstruction. First, we investigated in an elaborate phantom study if the knowledge of the exact implant shape extracted from the CAD data provided by the manufacturer of the implant can improve the MAR result. Second, the leg of a human cadaver was scanned using a clinical CT system before and after the implantation of an artificial knee joint. The results were compared regarding segmentation accuracy, CT number accuracy, and the restoration of distorted structures. The use of PS improved the efficacy of LI and NORMAR compared with TS. Artifacts caused by insufficient segmentation were reduced, and additional information was made available within the projection data. 
The estimation of the implant shape was more exact and not dependent on a threshold value. Consequently, the visibility of structures was improved when comparing the new approach to the standard method. This was further confirmed by improved CT value accuracy and reduced image noise. The PS approach based on prior implant information provides image quality which is superior to TS-based MAR, especially when the shape of the metallic implant is complex. The new approach can be useful for improving MAR methods and dose calculations within radiation therapy based on the MAR corrected CT images.

  9. Penalized maximum likelihood simultaneous longitudinal PET image reconstruction with difference-image priors.

    PubMed

    Ellis, Sam; Reader, Andrew J

    2018-04-26

    Many clinical contexts require the acquisition of multiple positron emission tomography (PET) scans of a single subject, for example, to observe and quantitate changes in functional behaviour in tumors after treatment in oncology. Typically, the datasets from each of these scans are reconstructed individually, without exploiting the similarities between them. We have recently shown that sharing information between longitudinal PET datasets by penalizing voxel-wise differences during image reconstruction can improve reconstructed images by reducing background noise and increasing the contrast-to-noise ratio of high-activity lesions. Here, we present two additional novel longitudinal difference-image priors and evaluate their performance using two-dimensional (2D) simulation studies and a three-dimensional (3D) real dataset case study. We have previously proposed a simultaneous difference-image-based penalized maximum likelihood (PML) longitudinal image reconstruction method that encourages sparse difference images (DS-PML), and in this work we propose two further novel prior terms. The priors are designed to encourage longitudinal images with corresponding differences which have (a) low entropy (DE-PML), and (b) high sparsity in their spatial gradients (DTV-PML). These two new priors and the originally proposed longitudinal prior were applied to 2D-simulated treatment response [18F]fluorodeoxyglucose (FDG) brain tumor datasets and compared to standard maximum likelihood expectation-maximization (MLEM) reconstructions. These 2D simulation studies explored the effects of penalty strengths, tumor behaviour, and interscan coupling on reconstructed images. Finally, a real two-scan longitudinal data series acquired from a head and neck cancer patient was reconstructed with the proposed methods and the results compared to standard reconstruction methods.
Using any of the three priors with an appropriate penalty strength produced images with noise levels equivalent to those seen when using standard reconstructions with increased counts levels. In tumor regions, each method produces subtly different results in terms of preservation of tumor quantitation and reconstruction root mean-squared error (RMSE). In particular, in the two-scan simulations, the DE-PML method produced tumor means in close agreement with MLEM reconstructions, while the DTV-PML method produced the lowest errors due to noise reduction within the tumor. Across a range of tumor responses and different numbers of scans, similar results were observed, with DTV-PML producing the lowest errors of the three priors and DE-PML producing the lowest bias. Similar improvements were observed in the reconstructions of the real longitudinal datasets, although imperfect alignment of the two PET images resulted in additional changes in the difference image that affected the performance of the proposed methods. Reconstruction of longitudinal datasets by penalizing difference images between pairs of scans from a data series allows for noise reduction in all reconstructed images. An appropriate choice of penalty term and penalty strength allows for this noise reduction to be achieved while maintaining reconstruction performance in regions of change, either in terms of quantitation of mean intensity via DE-PML, or in terms of tumor RMSE via DTV-PML. Overall, improving the image quality of longitudinal datasets via simultaneous reconstruction has the potential to improve upon currently used methods, allow dose reduction, or reduce scan time while maintaining image quality at current levels. © 2018 The Authors. Medical Physics published by Wiley Periodicals, Inc. on behalf of American Association of Physicists in Medicine.
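A schematic of the kind of penalized objective such methods maximize, for two small 1D "scans": the Poisson log-likelihood of each dataset minus a penalty on the difference image. The "sparse" and "tv" penalties below are illustrative stand-ins for the DS-PML and DTV-PML terms; the paper's exact formulations (and the DE-PML entropy term) differ, and the counts are hypothetical.

```python
import math

def poisson_loglike(y, x):
    """Poisson log-likelihood of counts y given expected counts x (constants dropped)."""
    return sum(yi * math.log(xi) - xi for yi, xi in zip(y, x))

def joint_pml_objective(y1, y2, x1, x2, beta, penalty="sparse"):
    """Joint penalized-ML objective for two longitudinal images.

    'sparse' penalizes |x1 - x2| voxel-wise; 'tv' penalizes the spatial
    gradient of the difference image."""
    diff = [a - b for a, b in zip(x1, x2)]
    if penalty == "sparse":
        pen = sum(abs(d) for d in diff)
    else:  # total variation of the difference image
        pen = sum(abs(diff[i + 1] - diff[i]) for i in range(len(diff) - 1))
    return poisson_loglike(y1, x1) + poisson_loglike(y2, x2) - beta * pen

# Identical image pairs incur no penalty; a focal change both fits worse
# (here the data did not change) and is penalized.
y = [5, 9, 30, 8, 6]
x_base = [5.0, 9.0, 30.0, 8.0, 6.0]
x_resp = [5.0, 9.0, 15.0, 8.0, 6.0]  # one "tumor" voxel halves between scans
same = joint_pml_objective(y, y, x_base, x_base, beta=0.1)
changed = joint_pml_objective(y, y, x_base, x_resp, beta=0.1)
```

In the actual method this objective is maximized over both images simultaneously, which is where the noise-sharing benefit comes from.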

  10. A Hierarchical Bayesian Model for Calibrating Estimates of Species Divergence Times

    PubMed Central

    Heath, Tracy A.

    2012-01-01

    In Bayesian divergence time estimation methods, incorporating calibrating information from the fossil record is commonly done by assigning prior densities to ancestral nodes in the tree. Calibration prior densities are typically parametric distributions offset by minimum age estimates provided by the fossil record. Specification of the parameters of calibration densities requires the user to quantify his or her prior knowledge of the age of the ancestral node relative to the age of its calibrating fossil. The values of these parameters can, potentially, result in biased estimates of node ages if they lead to overly informative prior distributions. Accordingly, determining parameter values that lead to adequate prior densities is not straightforward. In this study, I present a hierarchical Bayesian model for calibrating divergence time analyses with multiple fossil age constraints. This approach applies a Dirichlet process prior as a hyperprior on the parameters of calibration prior densities. Specifically, this model assumes that the rate parameters of exponential prior distributions on calibrated nodes are distributed according to a Dirichlet process, whereby the rate parameters are clustered into distinct parameter categories. Both simulated and biological data are analyzed to evaluate the performance of the Dirichlet process hyperprior. Compared with fixed exponential prior densities, the hierarchical Bayesian approach results in more accurate and precise estimates of internal node ages. When this hyperprior is applied using Markov chain Monte Carlo methods, the ages of calibrated nodes are sampled from mixtures of exponential distributions and uncertainty in the values of calibration density parameters is taken into account. PMID:22334343
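The clustering behaviour of a Dirichlet process prior can be sketched with a truncated stick-breaking construction: calibrated nodes that land on the same atom share one exponential rate parameter. This only illustrates the DP's clustering property; the paper samples the DP within an MCMC analysis rather than by direct truncation, and the base measure below is a hypothetical choice.

```python
import random

def stick_breaking_rates(alpha, base_draw, n_items, max_atoms=50, rng=random):
    """Draw rate parameters for n_items calibration nodes from a (truncated)
    Dirichlet process: items assigned to the same atom share one rate."""
    # Truncated stick-breaking construction of the DP weights.
    betas = [rng.betavariate(1.0, alpha) for _ in range(max_atoms)]
    weights, remaining = [], 1.0
    for b in betas:
        weights.append(remaining * b)
        remaining *= 1.0 - b
    atoms = [base_draw(rng) for _ in range(max_atoms)]  # base-measure draws
    assignments = rng.choices(range(max_atoms), weights=weights, k=n_items)
    return [atoms[i] for i in assignments]

random.seed(1)
# Hypothetical base measure for the exponential calibration-density rates.
rates = stick_breaking_rates(alpha=1.0, base_draw=lambda r: r.expovariate(0.5),
                             n_items=8)
# With small alpha, the 8 calibrated nodes typically share only a few
# distinct rate values -- the clustering the hyperprior exploits.
```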

  11. Drift-Free Position Estimation of Periodic or Quasi-Periodic Motion Using Inertial Sensors

    PubMed Central

    Latt, Win Tun; Veluvolu, Kalyana Chakravarthy; Ang, Wei Tech

    2011-01-01

    Position sensing with inertial sensors such as accelerometers and gyroscopes usually requires other aided sensors or prior knowledge of motion characteristics to remove the position drift resulting from integration of acceleration or velocity so as to obtain accurate position estimation. A method based on analytical integration has previously been developed to obtain accurate position estimates of periodic or quasi-periodic motion from inertial sensors using prior knowledge of the motion but without using aided sensors. In this paper, a new method is proposed which employs a linear filtering stage coupled with an adaptive filtering stage to remove drift and attenuation. The only prior knowledge of the motion that the proposed method requires is an approximate band of frequencies of the motion. Existing adaptive filtering methods based on Fourier series, such as the weighted-frequency Fourier linear combiner (WFLC) and the band-limited multiple Fourier linear combiner (BMFLC), are modified to combine with the proposed method. To validate and compare the proposed method with the method based on analytical integration, a simulation study is performed using periodic signals as well as real physiological tremor data, and real-time experiments are conducted using an ADXL-203 accelerometer. Results demonstrate that the proposed method outperforms the existing analytical integration method. PMID:22163935
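Why integration drift is removable with only mild prior knowledge can be shown offline: a constant accelerometer bias integrates to a quadratic position trend, which a polynomial fit can strip while leaving the oscillation intact. This is a simplified stand-in, not the paper's WFLC/BMFLC-based real-time filtering; the 1 Hz motion and bias value are hypothetical.

```python
import math

def cumtrapz(vals, dt):
    """Cumulative trapezoidal integral, starting from zero."""
    out = [0.0]
    for i in range(1, len(vals)):
        out.append(out[-1] + 0.5 * (vals[i - 1] + vals[i]) * dt)
    return out

def quadratic_detrend(x, dt):
    """Subtract the least-squares quadratic trend (bias integrates to a quadratic)."""
    n = len(x)
    t = [i * dt for i in range(n)]
    s = [sum(ti ** k for ti in t) for k in range(5)]            # moments of t
    b = [sum(xi * ti ** k for xi, ti in zip(x, t)) for k in range(3)]
    m = [[s[0], s[1], s[2]], [s[1], s[2], s[3]], [s[2], s[3], s[4]]]
    def det3(a):
        return (a[0][0] * (a[1][1] * a[2][2] - a[1][2] * a[2][1])
                - a[0][1] * (a[1][0] * a[2][2] - a[1][2] * a[2][0])
                + a[0][2] * (a[1][0] * a[2][1] - a[1][1] * a[2][0]))
    d = det3(m)
    c = []
    for j in range(3):  # Cramer's rule for the three polynomial coefficients
        mj = [row[:] for row in m]
        for i in range(3):
            mj[i][j] = b[i]
        c.append(det3(mj) / d)
    return [xi - (c[0] + c[1] * ti + c[2] * ti ** 2) for xi, ti in zip(x, t)]

# 1 Hz motion x(t) = cos(2*pi*t); the accelerometer adds a constant 0.2 m/s^2 bias.
fs, f, bias = 200.0, 1.0, 0.2
dt = 1.0 / fs
accel = [-(2 * math.pi * f) ** 2 * math.cos(2 * math.pi * f * i * dt) + bias
         for i in range(int(4 * fs) + 1)]
pos = cumtrapz(cumtrapz(accel, dt), dt)   # drifts quadratically due to the bias
pos_clean = quadratic_detrend(pos, dt)    # drift removed; oscillation recovered
ptp = max(pos_clean) - min(pos_clean)     # peak-to-peak close to 2 (amplitude 1)
```

Real-time operation, and motions whose drift is not polynomial, are exactly where the paper's adaptive filtering stage is needed instead of a batch detrend like this.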

  12. Self-prior strategy for organ reconstruction in fluorescence molecular tomography

    PubMed Central

    Zhou, Yuan; Chen, Maomao; Su, Han; Luo, Jianwen

    2017-01-01

    The purpose of this study is to propose a strategy for organ reconstruction in fluorescence molecular tomography (FMT) without prior information from other imaging modalities, and to overcome the high cost and ionizing radiation caused by the traditional structural prior strategy. The proposed strategy is designed as an iterative architecture to solve the inverse problem of FMT. In each iteration, a short-time Fourier transform (STFT) based algorithm extracts the self-prior information from the space-frequency energy spectrum, under the assumption that regions with higher fluorescence concentration have larger energy intensity. The cost function of the inverse problem is then modified by the self-prior information, and finally an iterative Laplacian regularization algorithm solves the updated inverse problem to obtain the reconstruction results. Simulations and in vivo experiments on liver reconstruction are carried out to test the performance of the self-prior strategy on organ reconstruction. The organ reconstruction results obtained by the proposed self-prior strategy are closer to the ground truth than those obtained by the iterative Tikhonov regularization (ITKR) method (traditional non-prior strategy). Significant improvements are shown in the evaluation indexes of relative locational error (RLE), relative error (RE) and contrast-to-noise ratio (CNR). The self-prior strategy improves the organ reconstruction results compared with the non-prior strategy and also overcomes the shortcomings of the traditional structural prior strategy. Various applications such as metabolic imaging and pharmacokinetic study can be aided by this strategy. PMID:29082094

  14. Benchmarking wide swath altimetry-based river discharge estimation algorithms for the Ganges river system

    NASA Astrophysics Data System (ADS)

    Bonnema, Matthew G.; Sikder, Safat; Hossain, Faisal; Durand, Michael; Gleason, Colin J.; Bjerklie, David M.

    2016-04-01

    The objective of this study is to compare the effectiveness of three algorithms that estimate discharge from remotely sensed observables (river width, water surface height, and water surface slope) in anticipation of the forthcoming NASA/CNES Surface Water and Ocean Topography (SWOT) mission. SWOT promises to provide these measurements simultaneously, and the river discharge algorithms included here are designed to work with these data. Two algorithms, the Metropolis Manning (MetroMan) method and the Mean Flow and Geomorphology (MFG) method, were built around Manning's equation, and one approach, the at-many-stations hydraulic geometry (AMHG) method, uses hydraulic geometry to estimate discharge. A well-calibrated and ground-truthed hydrodynamic model of the Ganges river system (HEC-RAS) was used as reference for three rivers from the Ganges River Delta: the main stem of the Ganges, the Arial-Khan, and the Mohananda Rivers. The high seasonal variability of these rivers due to the monsoon presented a unique opportunity to thoroughly assess the discharge algorithms for typical monsoon-regime rivers. It was found that the MFG method provides the most accurate discharge estimations in most cases, with an average relative root-mean-squared error (RRMSE) across all three reaches of 35.5%. It is followed closely by the Metropolis Manning algorithm, with an average RRMSE of 51.5%. However, the MFG method's reliance on knowledge of prior river discharge limits its application to ungauged rivers. In terms of input data requirements at ungauged regions with no prior records, the Metropolis Manning algorithm provides a more practical alternative over a region that is lacking in historical observations, as the algorithm requires less ancillary data. The AMHG algorithm, while requiring the least prior river data, provided the least accurate discharge measurements, with average wet and dry season RRMSEs of 79.8% and 119.1%, respectively, across all rivers studied.
This poor performance is directly traced to poor estimation of AMHG via a remotely sensed proxy, and results improve commensurate with MFG and MetroMan when prior AMHG information is given to the method. Therefore, we cannot recommend use of AMHG without inclusion of this prior information, at least for the studied rivers. The dry season discharge (within-bank flow) was captured well by all methods, while the wet season (floodplain flow) appeared more challenging. The picture that emerges from this study is that a multialgorithm approach may be appropriate during flood inundation periods in Ganges Delta.
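Both Manning-based algorithms build on Manning's equation relating discharge to cross-sectional area, hydraulic radius, slope, and roughness. A minimal sketch of that underlying formula, with hypothetical channel numbers; the actual algorithms infer the unobserved depth and roughness from the remotely sensed observables rather than assuming them:

```python
def manning_discharge(area_m2, wetted_perimeter_m, slope, n=0.035):
    """Manning's equation in SI units: Q = (1/n) * A * R^(2/3) * S^(1/2),
    with hydraulic radius R = A / P."""
    radius = area_m2 / wetted_perimeter_m
    return (1.0 / n) * area_m2 * radius ** (2.0 / 3.0) * slope ** 0.5

# Hypothetical reach: 200 m^2 flow area, 52 m wetted perimeter, 1e-4 slope,
# moderate roughness -> discharge on the order of 140 m^3/s.
q = manning_discharge(200.0, 52.0, 1e-4)
```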

  15. Variable Selection with Prior Information for Generalized Linear Models via the Prior LASSO Method.

    PubMed

    Jiang, Yuan; He, Yunxiao; Zhang, Heping

    LASSO is a popular statistical tool often used in conjunction with generalized linear models that can simultaneously select variables and estimate parameters. When there are many variables of interest, as in current biological and biomedical studies, the power of LASSO can be limited. Fortunately, so much biological and biomedical data have been collected and they may contain useful information about the importance of certain variables. This paper proposes an extension of LASSO, namely, prior LASSO (pLASSO), to incorporate that prior information into penalized generalized linear models. The goal is achieved by adding in the LASSO criterion function an additional measure of the discrepancy between the prior information and the model. For linear regression, the whole solution path of the pLASSO estimator can be found with a procedure similar to the Least Angle Regression (LARS). Asymptotic theories and simulation results show that pLASSO provides significant improvement over LASSO when the prior information is relatively accurate. When the prior information is less reliable, pLASSO shows great robustness to the misspecification. We illustrate the application of pLASSO using a real data set from a genome-wide association study.
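The idea of penalizing disagreement with prior information can be sketched as follows for linear regression. The quadratic prior-discrepancy term (fitted values implied by a prior coefficient estimate) and the simple proximal-gradient (ISTA) solver are one plausible rendering, not necessarily the paper's exact formulation, and the data are synthetic.

```python
import math

def soft(z, t):
    """Soft-thresholding operator, the proximal map of the l1 penalty."""
    return math.copysign(max(abs(z) - t, 0.0), z)

def plasso(X, y, y_prior, eta, lam, step=0.05, iters=3000):
    """Proximal-gradient solver for a pLASSO-style objective:
    (1/2n)||y - Xb||^2 + (eta/2n)||y_prior - Xb||^2 + lam * ||b||_1."""
    n, p = len(X), len(X[0])
    b = [0.0] * p
    for _ in range(iters):
        grad = [0.0] * p
        for row, yi, ypi in zip(X, y, y_prior):
            fit = sum(xij * bj for xij, bj in zip(row, b))
            r = (yi - fit) + eta * (ypi - fit)
            for j in range(p):
                grad[j] -= row[j] * r / n
        b = [soft(bj - step * gj, step * lam) for bj, gj in zip(b, grad)]
    return b

# Synthetic data: truth b = (2, 0); the prior pulls the first coefficient toward 1.
X = [[math.sin(0.5 * i), math.cos(0.9 * i)] for i in range(40)]
y = [2.0 * xi[0] + 0.1 * math.sin(3 * i) for i, xi in enumerate(X)]
y_prior = [1.0 * xi[0] for xi in X]  # fitted values implied by the prior estimate

b_hat = plasso(X, y, y_prior, eta=1.0, lam=0.01)
# With eta = 1 the estimate lands roughly halfway between prior (1) and data (2);
# eta -> 0 recovers plain LASSO, illustrating the robustness trade-off.
```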

  16. Bayesian methods including nonrandomized study data increased the efficiency of postlaunch RCTs.

    PubMed

    Schmidt, Amand F; Klugkist, Irene; Klungel, Olaf H; Nielen, Mirjam; de Boer, Anthonius; Hoes, Arno W; Groenwold, Rolf H H

    2015-04-01

    Findings from nonrandomized studies on safety or efficacy of treatment in patient subgroups may trigger postlaunch randomized clinical trials (RCTs). In the analysis of such RCTs, results from nonrandomized studies are typically ignored. This study explores the trade-off between bias and power of Bayesian RCT analysis incorporating information from nonrandomized studies. A simulation study was conducted to compare frequentist with Bayesian analyses using noninformative and informative priors in their ability to detect interaction effects. In simulated subgroups, the effect of a hypothetical treatment differed between subgroups (odds ratio 1.00 vs. 2.33). Simulations varied in sample size, proportions of the subgroups, and specification of the priors. As expected, the results for the informative Bayesian analyses were more biased than those from the noninformative Bayesian analysis or frequentist analysis. However, because of a reduction in posterior variance, informative Bayesian analyses were generally more powerful to detect an effect. In scenarios where the informative priors were in the opposite direction of the RCT data, type 1 error rates could be 100% and power 0%. Bayesian methods incorporating data from nonrandomized studies can meaningfully increase power of interaction tests in postlaunch RCTs. Copyright © 2015 Elsevier Inc. All rights reserved.
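The bias-power trade-off can be illustrated with a toy beta-binomial subgroup analysis in which informative priors encode down-weighted nonrandomized evidence. The counts and prior parameters are hypothetical, and the paper's simulations concern interaction effects in logistic models rather than this simplified two-arm risk contrast.

```python
import random

def posterior_risk_samples(events, n, prior_a, prior_b, draws=20000, rng=random):
    """Beta-binomial model: Beta(prior_a, prior_b) prior updated with events/n."""
    a, b = prior_a + events, prior_b + (n - events)
    return [rng.betavariate(a, b) for _ in range(draws)]

def prob_harm(t, c):
    """Posterior probability that treatment risk exceeds control risk."""
    return sum(ti > ci for ti, ci in zip(t, c)) / len(t)

random.seed(7)
# Hypothetical RCT subgroup: 12/40 events on treatment, 6/40 on control.
# Informative priors encode down-weighted nonrandomized evidence of harm
# (prior "sample size" 20 per arm); the flat analysis uses Beta(1, 1).
treat_inf = posterior_risk_samples(12, 40, prior_a=6.0, prior_b=14.0)
ctrl_inf = posterior_risk_samples(6, 40, prior_a=3.0, prior_b=17.0)
treat_flat = posterior_risk_samples(12, 40, 1.0, 1.0)
ctrl_flat = posterior_risk_samples(6, 40, 1.0, 1.0)

p_inf = prob_harm(treat_inf, ctrl_inf)    # sharper: prior agrees with the data
p_flat = prob_harm(treat_flat, ctrl_flat)
```

When the prior points the same way as the trial data, the posterior probability of harm is higher (more power); had the prior pointed the opposite way, the same mechanism would bias the answer, which is the trade-off the study quantifies.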

  17. Imaging performance of a hybrid x-ray computed tomography-fluorescence molecular tomography system using priors.

    PubMed

    Ale, Angelique; Schulz, Ralf B; Sarantopoulos, Athanasios; Ntziachristos, Vasilis

    2010-05-01

    The performance is studied of two newly introduced and previously suggested methods that incorporate priors into inversion schemes associated with data from a recently developed hybrid x-ray computed tomography and fluorescence molecular tomography system, the latter based on CCD camera photon detection. The unique data set studied attains accurately registered data of high spatially sampled photon fields propagating through tissue along 360 degrees projections. Approaches that incorporate structural prior information were included in the inverse problem by adding a penalty term to the minimization function utilized for image reconstructions. Results were compared as to their performance with simulated and experimental data from a lung inflammation animal model and against the inversions achieved when not using priors. The importance of using priors over stand-alone inversions is also showcased with high spatial sampling simulated and experimental data. The approach of optimal performance in resolving fluorescent biodistribution in small animals is also discussed. Inclusion of prior information from x-ray CT data in the reconstruction of the fluorescence biodistribution leads to improved agreement between the reconstruction and validation images for both simulated and experimental data.
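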

  18. A new prior for bayesian anomaly detection: application to biosurveillance.

    PubMed

    Shen, Y; Cooper, G F

    2010-01-01

    Bayesian anomaly detection computes posterior probabilities of anomalous events by combining prior beliefs and evidence from data. However, the specification of prior probabilities can be challenging. This paper describes a Bayesian prior in the context of disease outbreak detection. The goal is to provide a meaningful, easy-to-use prior that yields a posterior probability of an outbreak that performs at least as well as a standard frequentist approach. If this goal is achieved, the resulting posterior could be usefully incorporated into a decision analysis about how to act in light of a possible disease outbreak. This paper describes a Bayesian method for anomaly detection that combines learning from data with a semi-informative prior probability over patterns of anomalous events. A univariate version of the algorithm is presented here for ease of illustration of the essential ideas. The paper describes the algorithm in the context of disease-outbreak detection, but it is general and can be used in other anomaly detection applications. For this application, the semi-informative prior specifies that an increased count over baseline is expected for the variable being monitored, such as the number of respiratory chief complaints per day at a given emergency department. The semi-informative prior is derived from the baseline prior, which is estimated using historical data. The evaluation reported here used semi-synthetic data to assess the detection performance of the proposed Bayesian method and a control chart method, which is a standard frequentist algorithm that is closest to the Bayesian method in terms of the type of data it uses. The disease-outbreak detection performance of the Bayesian method was statistically significantly better than that of the control chart method when proper baseline periods were used to estimate the baseline behavior to avoid seasonal effects.
When using longer baseline periods, the Bayesian method performed as well as the control chart method. The time complexity of the Bayesian algorithm is linear in the number of the observed events being monitored, due to a novel, closed-form derivation that is introduced in the paper. This paper introduces a novel prior probability for Bayesian outbreak detection that is expressive, easy-to-apply, computationally efficient, and performs as well or better than a standard frequentist method.
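One simplified way to render such a semi-informative prior is as a two-hypothesis Poisson-Gamma model: a baseline Gamma prior on the daily count rate versus an "elevated" prior expecting an increase over baseline, with the marginal (negative-binomial) likelihoods combined through prior odds. The shapes, rates, and prior outbreak probability below are hypothetical, and the paper's actual prior and its linear-time closed form differ.

```python
import math

def log_nb_marginal(k, shape, rate):
    """Log marginal likelihood of a Poisson count k whose rate has a
    Gamma(shape, rate) prior, i.e. a negative-binomial marginal."""
    return (math.lgamma(k + shape) - math.lgamma(shape) - math.lgamma(k + 1)
            + shape * math.log(rate / (rate + 1.0)) - k * math.log(rate + 1.0))

def posterior_outbreak_prob(k, baseline=(50.0, 5.0), elevated=(90.0, 5.0),
                            prior_p=0.01):
    """Posterior probability of 'outbreak' for today's count k.

    baseline prior: mean rate 10/day; elevated prior: mean rate 18/day
    (a stand-in for the paper's 'increased count over baseline' belief)."""
    l0 = log_nb_marginal(k, *baseline)
    l1 = log_nb_marginal(k, *elevated)
    log_odds = math.log(prior_p / (1.0 - prior_p)) + l1 - l0
    return 1.0 / (1.0 + math.exp(-log_odds))
```

A day near the baseline rate (k = 10) yields a negligible posterior probability, while a markedly elevated count (k = 30) pushes it well past one half despite the small prior probability of an outbreak.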

  19. A Comparison of the β-Substitution Method and a Bayesian Method for Analyzing Left-Censored Data.

    PubMed

    Huynh, Tran; Quick, Harrison; Ramachandran, Gurumurthy; Banerjee, Sudipto; Stenzel, Mark; Sandler, Dale P; Engel, Lawrence S; Kwok, Richard K; Blair, Aaron; Stewart, Patricia A

    2016-01-01

    Classical statistical methods for analyzing exposure data with values below the detection limits are well described in the occupational hygiene literature, but an evaluation of a Bayesian approach for handling such data is currently lacking. Here, we first describe a Bayesian framework for analyzing censored data. We then present the results of a simulation study conducted to compare the β-substitution method with a Bayesian method for exposure datasets drawn from lognormal distributions and mixed lognormal distributions with varying sample sizes, geometric standard deviations (GSDs), and censoring for single and multiple limits of detection. For each set of factors, estimates for the arithmetic mean (AM), geometric mean, GSD, and the 95th percentile (X0.95) of the exposure distribution were obtained. We evaluated the performance of each method using relative bias, the root mean squared error (rMSE), and coverage (the proportion of the computed 95% uncertainty intervals containing the true value). The Bayesian method using non-informative priors and the β-substitution method were generally comparable in bias and rMSE when estimating the AM and GM. For the GSD and the 95th percentile, the Bayesian method with non-informative priors was more biased and had a higher rMSE than the β-substitution method, but use of more informative priors generally improved the Bayesian method's performance, making both the bias and the rMSE more comparable to the β-substitution method. An advantage of the Bayesian method is that it provided estimates of uncertainty for these parameters of interest and good coverage, whereas the β-substitution method only provided estimates of uncertainty for the AM, and coverage was not as consistent. Selection of one or the other method depends on the needs of the practitioner, the availability of prior information, and the distribution characteristics of the measurement data. 
We suggest the use of Bayesian methods if the practitioner has the computational resources and prior information, as the method would generally provide accurate estimates and also provides the distributions of all of the parameters, which could be useful for making decisions in some applications. © The Author 2015. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.

  20. Active Prior Tactile Knowledge Transfer for Learning Tactual Properties of New Objects

    PubMed Central

    Feng, Di

    2018-01-01

    Reusing the tactile knowledge of some previously-explored objects (prior objects) helps us to easily recognize the tactual properties of new objects. In this paper, we enable a robotic arm equipped with multi-modal artificial skin to actively transfer, as humans do, the prior tactile exploratory action experiences when it learns the detailed physical properties of new objects. These experiences, or prior tactile knowledge, are built by the feature observations that the robot perceives from multiple sensory modalities, when it applies the pressing, sliding, and static contact movements on objects with different action parameters. We call our method Active Prior Tactile Knowledge Transfer (APTKT), and systematically evaluated its performance in several experiments. Results show that the robot improved the discrimination accuracy by around 10% when it used only one training sample with the feature observations of prior objects. By further incorporating the predictions from the observation models of prior objects as auxiliary features, our method improved the discrimination accuracy by over 20%. The results also show that the proposed method is robust against transferring irrelevant prior tactile knowledge (negative knowledge transfer). PMID:29466300

  1. Informative priors on fetal fraction increase power of the noninvasive prenatal screen.

    PubMed

    Xu, Hanli; Wang, Shaowei; Ma, Lin-Lin; Huang, Shuai; Liang, Lin; Liu, Qian; Liu, Yang-Yang; Liu, Ke-Di; Tan, Ze-Min; Ban, Hao; Guan, Yongtao; Lu, Zuhong

    2017-11-09

    Purpose: Noninvasive prenatal screening (NIPS) sequences a mixture of the maternal and fetal cell-free DNA. Fetal trisomy can be detected by examining chromosomal dosages estimated from sequencing reads. The traditional method uses the Z-test, which compares a subject against a set of euploid controls, where the information on fetal fraction is not fully utilized. Here we present a Bayesian method that leverages informative priors on the fetal fraction. Method: Our Bayesian method combines the Z-test likelihood and informative priors on the fetal fraction, which are learned from the sex chromosomes, to compute Bayes factors. The Bayesian framework can account for nongenetic risk factors through the prior odds, and our method can report individual positive/negative predictive values. Results: Our Bayesian method has more power than the Z-test method. We analyzed 3,405 NIPS samples and spotted at least 9 (of 51) possible Z-test false positives. Conclusion: Bayesian NIPS is more powerful than the Z-test method, is able to account for nongenetic risk factors through prior odds, and can report individual positive/negative predictive values. Genetics in Medicine advance online publication, 9 November 2017; doi:10.1038/gim.2017.186.
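The posterior-odds machinery can be sketched as follows: a Bayes factor compares the z-statistic's likelihood under trisomy, whose expected shift grows with the fetal fraction, against the euploid null, and prior odds then yield an individual PPV. The shift-per-fetal-fraction scale and the prior odds below are hypothetical calibration values, not the paper's.

```python
import math

def normal_pdf(x, mu, sd=1.0):
    """Density of N(mu, sd^2)."""
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def trisomy_ppv(z, fetal_fraction, shift_per_ff=40.0, prior_odds=1 / 200.0):
    """PPV from a Bayes factor: H1 (trisomy) puts z ~ N(mu1, 1) with mu1
    proportional to fetal fraction; H0 (euploid) puts z ~ N(0, 1).
    shift_per_ff and prior_odds are hypothetical calibration constants."""
    mu1 = shift_per_ff * fetal_fraction
    bf = normal_pdf(z, mu1) / normal_pdf(z, 0.0)
    post_odds = bf * prior_odds
    return post_odds / (1.0 + post_odds)

# The same z = 4 is convincing when the fetal fraction predicts a shift of
# that size, but much weaker evidence when the expected shift is small.
high_ff = trisomy_ppv(z=4.0, fetal_fraction=0.10)
low_ff = trisomy_ppv(z=4.0, fetal_fraction=0.03)
```

This is how a fixed Z-test threshold can flag the low-fetal-fraction case as positive while the Bayesian posterior does not, matching the abstract's point about Z-test false positives.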

  2. Cutaneous antimicrobial preparation prior to intravenous catheterization in healthy dogs: clinical, microbiological, and histopathological evaluation.

    PubMed

    Coolman, B R; Marretta, S M; Kakoma, I; Wallig, M A; Coolman, S L; Paul, A J

    1998-12-01

    The purpose of this study was to determine the effects of a one-minute chlorhexidine gluconate skin preparation protocol prior to cephalic vein catheterization. Twenty-three healthy beagle dogs had one leg aseptically prepared and the opposite leg served as a control. Twenty-six- and 77-hour time groups were studied. Chlorhexidine-treated legs had significantly lower cutaneous bacterial counts than the control legs prior to catheter insertion and prior to catheter withdrawal for both time groups. Control legs developed significantly more dermatitis than the treated legs after 77 h. A one-minute preparation with 4% chlorhexidine gluconate was an effective method for sustained reduction of cutaneous bacterial counts at peripheral intravenous catheter insertion points in dogs. Increased cutaneous bacterial counts were associated with significantly more microscopic dermatitis in untreated legs after 77 h of catheterization.

  3. Exploiting Genome Structure in Association Analysis

    PubMed Central

    Kim, Seyoung

    2014-01-01

    A genome-wide association study involves examining a large number of single-nucleotide polymorphisms (SNPs) to identify SNPs that are significantly associated with the given phenotype, while trying to reduce the false positive rate. Although haplotype-based association methods have been proposed to accommodate correlation information across nearby SNPs that are in linkage disequilibrium, none of these methods directly incorporates structural information such as recombination events along the chromosome. In this paper, we propose a new approach called stochastic block lasso for association mapping that exploits prior knowledge on linkage disequilibrium structure in the genome, such as recombination rates and distances between adjacent SNPs, in order to increase the power of detecting true associations while reducing false positives. Following a typical linear regression framework with the genotypes as inputs and the phenotype as output, our proposed method employs a sparsity-enforcing Laplacian prior for the regression coefficients, augmented by a first-order Markov process along the sequence of SNPs that incorporates the prior information on the linkage disequilibrium structure. The Markov-chain prior models the structural dependencies between a pair of adjacent SNPs, and allows us to look for association SNPs in a coupled manner, combining strength from multiple nearby SNPs. Our results on HapMap-simulated datasets and mouse datasets show that there is a significant advantage in incorporating the prior knowledge on linkage disequilibrium structure for marker identification under whole-genome association. PMID:21548809

  4. Incorporating prior information into differential network analysis using non-paranormal graphical models.

    PubMed

    Zhang, Xiao-Fei; Ou-Yang, Le; Yan, Hong

    2017-08-15

    Understanding how gene regulatory networks change under different cellular states is important for revealing insights into network dynamics. Gaussian graphical models, which assume that the data follow a joint normal distribution, have been used recently to infer differential networks. However, the distributions of omics data are non-normal in general. Furthermore, although much biological knowledge (or prior information) has been accumulated, most existing methods ignore this valuable prior information. Therefore, new statistical methods are needed to relax the normality assumption and make full use of prior information. We propose a new differential network analysis method to address the above challenges. Instead of using Gaussian graphical models, we employ a non-paranormal graphical model that can relax the normality assumption. We develop a principled model to take into account the following prior information: (i) a differential edge is less likely to exist between two genes that do not participate together in the same pathway; (ii) changes in the networks are driven by certain regulator genes that are perturbed across different cellular states and (iii) the differential networks estimated from multi-view gene expression data likely share common structures. Simulation studies demonstrate that our method outperforms other graphical model-based algorithms. We apply our method to identify the differential networks between platinum-sensitive and platinum-resistant ovarian tumors, and the differential networks between the proneural and mesenchymal subtypes of glioblastoma. Hub nodes in the estimated differential networks rediscover known cancer-related regulator genes and contain interesting predictions. The source code is at https://github.com/Zhangxf-ccnu/pDNA. Contact: szuouyl@gmail.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved.

  5. The desirable qualities of future doctors--a study of medical student perceptions.

    PubMed

    Hurwitz, Steven; Kelly, Brian; Powis, David; Smyth, Robyn; Lewin, Terry

    2013-07-01

    There is a lack of consensus regarding the qualities possessed by the ideal doctor, and very limited research regarding the views of medical students on these qualities. To investigate the views of commencing medical students regarding the desirable qualities of doctors, a survey containing a set of proposed desirable qualities of doctors identified from the existing literature was completed by 158 first-year medical students. The survey had a 75% response rate. Students rated the individual qualities of empathy, motivation to be a doctor, good verbal communication, ethical soundness, integrity and honesty as the most important. A factor analysis identified six categories of qualities: methodical processing, cognitive capacity, people skills, generic work ethic, role certainty and warmth. Significant differences in factor scores were found across subgroups of students (international and domestic students, with and without prior tertiary studies) on the following factors: methodical processing, which was scored highest by domestic students with prior tertiary studies; cognitive capacity, which was scored highest by domestic students without prior tertiary studies; and generic work ethic, which was scored highest by international students. Medical students identified a range of desirable personal qualities of a doctor which varied according to student characteristics, including their prior educational experience. Future research aiming to define such desirable qualities should include a broader range of stakeholders, including students at different training levels and institutions.

  6. A fast alignment method for breast MRI follow-up studies using automated breast segmentation and current-prior registration

    NASA Astrophysics Data System (ADS)

    Wang, Lei; Strehlow, Jan; Rühaak, Jan; Weiler, Florian; Diez, Yago; Gubern-Merida, Albert; Diekmann, Susanne; Laue, Hendrik; Hahn, Horst K.

    2015-03-01

    In breast cancer screening for high-risk women, follow-up magnetic resonance images (MRI) are acquired with a time interval ranging from several months up to a few years. Prior MRI studies may provide additional clinical value when examining the current one and thus have the potential to increase sensitivity and specificity of screening. To build a spatial correlation between suspicious findings in both current and prior studies, a reliable alignment method between follow-up studies is desirable. However, long time interval, different scanners and imaging protocols, and varying breast compression can result in a large deformation, which challenges the registration process. In this work, we present a fast and robust spatial alignment framework, which combines automated breast segmentation and current-prior registration techniques in a multi-level fashion. First, fully automatic breast segmentation is applied to extract the breast masks that are used to obtain an initial affine transform. Then, a non-rigid registration algorithm using normalized gradient fields as similarity measure together with curvature regularization is applied. A total of 29 subjects and 58 breast MR images were collected for performance assessment. To evaluate the global registration accuracy, the volume overlap and boundary surface distance metrics are calculated, resulting in an average Dice Similarity Coefficient (DSC) of 0.96 and root mean square distance (RMSD) of 1.64 mm. In addition, to measure local registration accuracy, for each subject a radiologist annotated 10 pairs of markers in the current and prior studies representing corresponding anatomical locations. The average distance error of marker pairs dropped from 67.37 mm to 10.86 mm after applying registration.
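
    The evaluation metrics reported above are straightforward to compute. A minimal sketch of the Dice Similarity Coefficient and the mean marker-pair distance on toy data (the masks and marker coordinates below are illustrative, not the study's):

```python
import numpy as np

def dice(mask_a, mask_b):
    """Dice Similarity Coefficient, 2|A∩B| / (|A| + |B|), for binary masks."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

def mean_marker_distance(pts_a, pts_b):
    """Mean Euclidean distance between corresponding marker pairs (N x 3 arrays)."""
    return float(np.linalg.norm(np.asarray(pts_a, float) - np.asarray(pts_b, float), axis=1).mean())

# Toy 2-D example: an 8x8 square mask vs. the same mask shifted one row.
m1 = np.zeros((16, 16), bool); m1[4:12, 4:12] = True
m2 = np.zeros((16, 16), bool); m2[5:13, 4:12] = True
overlap = dice(m1, m2)   # 2*56 / (64 + 64) = 0.875
err = mean_marker_distance([[0, 0, 0], [3, 4, 0]], [[0, 0, 0], [0, 0, 0]])  # (0 + 5) / 2 = 2.5
```

    In the paper these are applied to registered breast masks and radiologist-annotated marker pairs; here the inputs are synthetic.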

  7. Investigating different approaches to develop informative priors in hierarchical Bayesian safety performance functions.

    PubMed

    Yu, Rongjie; Abdel-Aty, Mohamed

    2013-07-01

    The Bayesian inference method has been frequently adopted to develop safety performance functions. One advantage of Bayesian inference is that prior information for the independent variables can be included in the inference procedure. However, few studies have discussed how to formulate informative priors for the independent variables or evaluated the effects of incorporating informative priors when developing safety performance functions. This paper addresses this deficiency by introducing four approaches to developing informative priors for the independent variables based on historical data and expert experience. The merits of these informative priors were tested along with two types of Bayesian hierarchical models (Poisson-gamma and Poisson-lognormal models). The deviance information criterion (DIC), R-square values, and coefficients of variation for the estimations were utilized as evaluation measures to select the best model(s). Comparison across the models indicated that the Poisson-gamma model is superior, with a better model fit, and that it is much more robust with the informative priors. Moreover, the two-stage Bayesian updating informative priors provided the best goodness-of-fit and coefficient estimation accuracy. Furthermore, informative priors for the inverse dispersion parameter have also been introduced and tested. The effects of the different types of informative priors on model estimation and goodness-of-fit were compared and summarized. Finally, based on the results, recommendations for future research topics and study applications have been made. Copyright © 2013 Elsevier Ltd. All rights reserved.
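
    As a deliberately simplified illustration of turning historical data into an informative prior, and not the paper's hierarchical safety performance functions or its two-stage updating procedure, the conjugate Poisson-gamma case admits a moment-matched prior and a closed-form posterior update:

```python
def gamma_prior_from_history(hist_counts):
    """Moment-match a Gamma(a, b) prior (shape/rate) to historical counts.

    For Gamma(a, b): mean = a/b and variance = a/b^2, so b = mean/var and
    a = mean*b. This shortcut is hypothetical and far cruder than the
    paper's approaches to prior elicitation.
    """
    n = len(hist_counts)
    mean = sum(hist_counts) / n
    var = sum((c - mean) ** 2 for c in hist_counts) / (n - 1)
    b = mean / var
    a = mean * b
    return a, b

def posterior_update(a, b, new_counts):
    """Conjugate Poisson-Gamma update: Gamma(a + sum(y), b + n)."""
    return a + sum(new_counts), b + len(new_counts)

# Hypothetical historical crash counts, then an update with new counts.
hist = [4, 6, 5, 7, 3, 5]
a0, b0 = gamma_prior_from_history(hist)          # informative prior
a1, b1 = posterior_update(a0, b0, [6, 8, 7])     # posterior after new data
posterior_mean = a1 / b1
```

    The informative prior pulls the posterior mean toward the historical rate, which is the stabilising effect the paper observes for the Poisson-gamma model.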

  8. Impact of prior cancer history on the overall survival of patients newly diagnosed with cancer: A pan-cancer analysis of the SEER database.

    PubMed

    Zhou, Huaqiang; Huang, Yan; Qiu, Zeting; Zhao, Hongyun; Fang, Wenfeng; Yang, Yunpeng; Zhao, Yuanyuan; Hou, Xue; Ma, Yuxiang; Hong, Shaodong; Zhou, Ting; Zhang, Yaxiong; Zhang, Li

    2018-04-18

    The population of cancer survivors with prior cancer is rapidly growing. Whether a prior cancer diagnosis interferes with outcome is unknown. We conducted a pan-cancer analysis to determine the impact of prior cancer history for patients newly diagnosed with cancer. We identified 20 types of primary solid tumors between 2004 and 2008 in the Surveillance, Epidemiology, and End Results database. Demographic and clinicopathologic variables were compared by χ² test and t-test as appropriate. The propensity score-adjusted Kaplan-Meier method and Cox proportional hazards models were used to evaluate the impact of prior cancer on overall survival (OS). Among 1,557,663 eligible patients, 261,474 (16.79%) had a history of prior cancer. More than 65% of prior cancers were diagnosed within 5 years. We classified 20 cancer sites into two groups (PCI and PCS) according to the different impacts of prior cancer on OS. PCI patients with a prior cancer history, which involved the colon and rectum, bone and soft tissues, melanoma, breast, cervix uteri, corpus and uterus, prostate, urinary bladder, kidney and renal pelvis, eye and orbits, thyroid, had inferior OS. The PCS patients (nasopharynx, esophagus, stomach, liver, gallbladder, pancreas, lung, ovary and brain) with a prior cancer history showed similar OS to that of patients without prior cancer. Our pan-cancer study presents the landscape for the survival impact of prior cancer across 20 cancer types. Compared to the patients without prior cancer, the PCI group had inferior OS, while the PCS group had similar OS. Further studies are still needed. © 2018 UICC.

  9. Effect of contrast enhancement prior to iteration procedure on image correction for soft x-ray projection microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jamsranjav, Erdenetogtokh, E-mail: ja.erdenetogtokh@gmail.com; Shiina, Tatsuo, E-mail: shiina@faculity.chiba-u.jp; Kuge, Kenichi

    2016-01-28

    Soft X-ray microscopy is well recognized as a powerful tool for high-resolution imaging of hydrated biological specimens. The projection type offers easy zooming and a simple optical layout. However, the image is blurred by X-ray diffraction, which degrades the spatial resolution. In this study, the blurred images have been corrected by an iteration procedure, i.e., repeated Fresnel and inverse Fresnel transformations. This method was confirmed by earlier studies to be effective. Nevertheless, it was insufficient for some images with very low contrast, especially at high magnification. In the present study, we tried a contrast enhancement method to make the diffraction fringes clearer prior to the iteration procedure. The method was effective at improving images that could not be corrected by the iteration procedure alone.
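
    The abstract does not specify the enhancement method used. One common, simple choice is a percentile-based linear contrast stretch, sketched here as an assumption rather than the authors' procedure:

```python
import numpy as np

def stretch_contrast(img, low_pct=1.0, high_pct=99.0):
    """Percentile-based linear contrast stretch to [0, 1].

    One simple way to make faint diffraction fringes more visible before
    an iterative correction; the percentiles are illustrative defaults.
    """
    lo, hi = np.percentile(img, [low_pct, high_pct])
    return np.clip((img - lo) / (hi - lo + 1e-12), 0.0, 1.0)
```

    Values below the low percentile clip to 0 and above the high percentile to 1, so the interesting mid-range intensities use the full dynamic range before iteration.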

  10. Student Characteristics, Prior Experiences, and the Perception of Mixed Methods as an Innovation

    ERIC Educational Resources Information Center

    Brown, Sydney E.

    2014-01-01

    There are persistent challenges to teaching mixed methods and innovative solutions are sought in order to address the needs of an increasingly diverse global audience seeking mixed methods instruction. This mixed methods study was conducted to gain insights to course design by more fully understanding the relationships among graduate student…

  11. A Simple Method for Estimating Informative Node Age Priors for the Fossil Calibration of Molecular Divergence Time Analyses

    PubMed Central

    Nowak, Michael D.; Smith, Andrew B.; Simpson, Carl; Zwickl, Derrick J.

    2013-01-01

    Molecular divergence time analyses often rely on the age of fossil lineages to calibrate node age estimates. Most divergence time analyses are now performed in a Bayesian framework, where fossil calibrations are incorporated as parametric prior probabilities on node ages. It is widely accepted that an ideal parameterization of such node age prior probabilities should be based on a comprehensive analysis of the fossil record of the clade of interest, but there is currently no generally applicable approach for calculating such informative priors. We provide here a simple and easily implemented method that employs fossil data to estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade, which can be used to fit an informative parametric prior probability distribution on a node age. Specifically, our method uses the extant diversity and the stratigraphic distribution of fossil lineages confidently assigned to a clade to fit a branching model of lineage diversification. Conditioning this on a simple model of fossil preservation, we estimate the likely amount of missing history prior to the oldest fossil occurrence of a clade. The likelihood surface of missing history can then be translated into a parametric prior probability distribution on the age of the clade of interest. We show that the method performs well with simulated fossil distribution data, but that the likelihood surface of missing history can at times be too complex for the distribution-fitting algorithm employed by our software tool. An empirical example of the application of our method is performed to estimate echinoid node ages. A simulation-based sensitivity analysis using the echinoid data set shows that node age prior distributions estimated under poor preservation rates are significantly less informative than those estimated under high preservation rates. PMID:23755303
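
    As a toy illustration of the idea, far simpler than the paper's method (which conditions a branching model of diversification on a preservation model), a single lineage fossilised at a constant rate has an exponential waiting time to its oldest fossil, giving an offset-exponential prior on the node age. The fossil age and preservation rate below are hypothetical:

```python
import math

def missing_history_quantile(q, preservation_rate):
    """Quantile of an Exponential(preservation_rate) waiting time: the
    missing history before the oldest fossil of a single lineage with a
    constant fossil-recovery rate (a toy stand-in for the paper's
    diversification-plus-preservation likelihood)."""
    return -math.log(1.0 - q) / preservation_rate

oldest_fossil_ma = 66.0   # hypothetical oldest fossil age (Ma)
rate = 0.1                # hypothetical fossils per lineage-Myr

# Node-age prior: offset exponential, offset = oldest fossil age.
median_age = oldest_fossil_ma + missing_history_quantile(0.5, rate)
p95_age = oldest_fossil_ma + missing_history_quantile(0.95, rate)
```

    Lowering the preservation rate widens the implied prior, consistent with the sensitivity result reported in the abstract: poorly preserved clades yield less informative node age priors.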

  12. Comparison of Biophysical Characteristics and Predicted Thermophysiological Responses of Three Prototype Body Armor Systems Versus Baseline U.S. Army Body Armor Systems

    DTIC Science & Technology

    2015-06-19

    … predictive modeling … an effective and scientifically valid method of making comparisons of clothing and equipment changes prior to conducting human research. INTRODUCTION: Modern day… METHODS (Ensembles): Three different body armor (BA) plus clothing ensembles were…

  13. Figure-ground segmentation based on class-independent shape priors

    NASA Astrophysics Data System (ADS)

    Li, Yang; Liu, Yang; Liu, Guojun; Guo, Maozu

    2018-01-01

    We propose a method to generate figure-ground segmentation by incorporating shape priors into the graph-cuts algorithm. Given an image, we first obtain a linear representation of an image and then apply directional chamfer matching to generate class-independent, nonparametric shape priors, which provide shape clues for the graph-cuts algorithm. We then enforce shape priors in a graph-cuts energy function to produce object segmentation. In contrast to previous segmentation methods, the proposed method shares shape knowledge for different semantic classes and does not require class-specific model training. Therefore, the approach obtains high-quality segmentation for objects. We experimentally validate that the proposed method outperforms previous approaches using the challenging PASCAL VOC 2010/2012 and Berkeley (BSD300) segmentation datasets.

  14. A cautionary note on Bayesian estimation of population size by removal sampling with diffuse priors.

    PubMed

    Bord, Séverine; Bioche, Christèle; Druilhet, Pierre

    2018-05-01

    We consider the problem of estimating a population size by removal sampling when the sampling rate is unknown. Bayesian methods are now widespread and allow prior knowledge to be included in the analysis. However, we show that Bayes estimates based on default improper priors lead to improper posteriors or infinite estimates. Similarly, weakly informative priors give unstable estimators that are sensitive to the choice of hyperparameters. By examining the likelihood, we show that population size estimates can be stabilized by penalizing small values of the sampling rate or large values of the population size. Based on theoretical results and simulation studies, we propose some recommendations on the choice of the prior. Finally, we apply our results to real datasets. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
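
    The removal-sampling likelihood and the stabilising effect of a penalty on small sampling rates can be sketched as follows. This is a grid-search illustration with an assumed Beta-type penalty and invented catch data, not the authors' recommended prior:

```python
from math import lgamma, log

def log_binom(n, k):
    """Log of the binomial coefficient C(n, k)."""
    return lgamma(n + 1) - lgamma(k + 1) - lgamma(n - k + 1)

def removal_loglik(N, p, catches):
    """Log-likelihood of removal counts: on pass i, c_i ~ Binomial(N - removed, p)."""
    ll, removed = 0.0, 0
    for c in catches:
        avail = N - removed
        if c > avail:
            return float("-inf")
        ll += log_binom(avail, c) + c * log(p) + (avail - c) * log(1 - p)
        removed += c
    return ll

def penalized_map(catches, penalty=1.0, N_max=1000):
    """Grid-search MAP estimate with a penalty*log(p) term, i.e. a
    Beta(1 + penalty, 1) prior on p, discouraging very small sampling
    rates -- one way, as the abstract suggests, to stabilise N."""
    best_N, best_p, best_score = None, None, float("-inf")
    for N in range(sum(catches), N_max + 1):
        for p in (i / 100 for i in range(1, 100)):
            score = removal_loglik(N, p, catches) + penalty * log(p)
            if score > best_score:
                best_N, best_p, best_score = N, p, score
    return best_N, best_p

# Hypothetical three-pass removal data with declining catches.
N_hat, p_hat = penalized_map([150, 95, 55])
```

    Without the log(p) term the likelihood can favour ever-larger N with tiny p; the penalty bounds the estimate away from that degenerate ridge.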

  15. Adaptive power priors with empirical Bayes for clinical trials.

    PubMed

    Gravestock, Isaac; Held, Leonhard

    2017-09-01

    Incorporating historical information into the design and analysis of a new clinical trial has been the subject of much discussion as a way to increase the feasibility of trials in situations where patients are difficult to recruit. The best method to include this data is not yet clear, especially in the case when few historical studies are available. This paper looks at the power prior technique afresh in a binomial setting and examines some previously unexamined properties, such as Box P values, bias, and coverage. Additionally, it proposes an empirical Bayes-type approach to estimating the prior weight parameter by marginal likelihood. This estimate has advantages over previously criticised methods in that it varies commensurably with differences in the historical and current data and can choose weights near 1 when the data are similar enough. Fully Bayesian approaches are also considered. An analysis of the operating characteristics shows that the adaptive methods work well and that the various approaches have different strengths and weaknesses. Copyright © 2017 John Wiley & Sons, Ltd.
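
    In the conjugate binomial setting, the power prior Beta(a + δ·y0, b + δ·(n0 − y0)) yields a closed-form marginal likelihood for the current data, so the empirical Bayes weight δ can be found by a simple grid search. A sketch of that idea, with illustrative data and grid (not the paper's examples):

```python
from math import lgamma

def betaln(a, b):
    """Log of the Beta function."""
    return lgamma(a) + lgamma(b) - lgamma(a + b)

def log_marginal(delta, y, n, y0, n0, a=1.0, b=1.0):
    """Log marginal likelihood of the current data (y successes of n) under
    the power prior Beta(a + delta*y0, b + delta*(n0 - y0)), up to the
    constant binomial coefficient."""
    a_d = a + delta * y0
    b_d = b + delta * (n0 - y0)
    return betaln(a_d + y, b_d + n - y) - betaln(a_d, b_d)

def eb_weight(y, n, y0, n0, grid=None):
    """Empirical-Bayes weight: the delta in [0, 1] maximising the marginal
    likelihood (a grid-search sketch of the paper's idea)."""
    grid = grid or [i / 100 for i in range(101)]
    return max(grid, key=lambda d: log_marginal(d, y, n, y0, n0))

# Historical data 40/100: similar current data should push delta toward 1,
# conflicting current data should push it toward 0.
d_similar = eb_weight(y=41, n=100, y0=40, n0=100)
d_conflict = eb_weight(y=70, n=100, y0=40, n0=100)
```

    This reproduces the qualitative behaviour described in the abstract: the weight varies commensurably with the agreement between historical and current data.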

  16. Incorporating Formative Assessment and Science Content into Elementary Science Methods--A Case Study

    ERIC Educational Resources Information Center

    Brower, Derek John

    2012-01-01

    Just as elementary students enter the science classroom with prior knowledge and experiences, so do preservice elementary teachers who enter the science methods classroom. Elementary science methods instructors recognize the challenges associated with preparing teachers for the science classroom. Two of these challenges include overcoming limited…

  17. Methodological Note: Analyzing Signs for Recognition & Feature Salience.

    ERIC Educational Resources Information Center

    Shyan, Melissa R.

    1985-01-01

    Presents a method to determine how signs in American Sign Language are recognized by signers. The method uses natural settings and avoids common artificialities found in prior work. A pilot study is described involving language research with Atlantic Bottlenose Dolphins in which the method was successfully used. (SED)

  18. A study of short test and charge retention test methods for nickel-cadmium spacecraft cells

    NASA Technical Reports Server (NTRS)

    Scott, W. R.

    1975-01-01

    Methods for testing nickel-cadmium cells for internal shorts and charge retention were studied. Included were (a) open circuit voltage decay after a brief charge, (b) open circuit voltage recovery after shorting, and (c) open circuit voltage decay and capacity loss after a full charge. The investigation included consideration of the effects of prior history, of conditioning cells prior to testing, and of various test method variables on the results of the tests. Sensitivity of the tests was calibrated in terms of equivalent external resistance. The results were correlated. It was shown that a large number of variables may affect the results of these tests. It is concluded that the voltage decay after a brief charge and the voltage recovery methods are more sensitive than the charged stand method, and can detect an internal short equivalent to a resistance of about (10,000/C) ohms, where "C" is the numerical value of the capacity of the cell in ampere hours.

  19. Identifying moderators of the adherence-outcome relation in cognitive therapy for depression.

    PubMed

    Sasso, Katherine E; Strunk, Daniel R; Braun, Justin D; DeRubeis, Robert J; Brotman, Melissa A

    2015-10-01

    Little is known about the influence of patients' pretreatment characteristics on the adherence-outcome relation in cognitive therapy (CT) for depression. In a sample of 57 depressed adults participating in CT, the authors examined interactions between pretreatment patient characteristics and therapist adherence in predicting session-to-session symptom change. Using items from the Collaborative Study Psychotherapy Rating Scale, the authors assessed 3 facets of therapist adherence: cognitive methods, negotiating/structuring, and behavioral methods/homework. Two graduate students rated Sessions 1-4 for adherence. Symptoms were assessed prior to each session with the Beck Depression Inventory-II. Moderators were assessed as part of patients' intake evaluations. After correcting for multiple comparisons, patient gender remained a significant moderator of the relationship between cognitive methods and next-session symptom change; cognitive methods more strongly predicted greater symptom improvement for women as compared to men. Pretreatment anxiety and number of prior depressive episodes were significant moderators of the relationship between behavioral methods/homework and next-session symptom change, with greater behavioral methods/homework predicting symptom improvement more strongly among patients high in pretreatment anxiety and among patients with relatively few prior depressive episodes. This is the first study to provide evidence of how therapist adherence is differentially related to outcome among depressed patients with different characteristics. If replicated, these findings may inform clinical decisions regarding the use of specific facets of adherence in CT for depression with specific patients. (c) 2015 APA, all rights reserved.

  20. Effect of whole-body vibration exercise in a sitting position prior to therapy on muscle tone and upper extremity function in stroke patients.

    PubMed

    Boo, Jung-A; Moon, Sang-Hyun; Lee, Sun-Min; Choi, Jung-Hyun; Park, Si-Eun

    2016-01-01

    [Purpose] The purpose of this study was to determine the effect of whole-body vibration exercise in a sitting position prior to therapy in stroke patients. [Subjects and Methods] Fourteen chronic stroke patients were included in this study. Prior to occupational therapy, whole-body vibration exercise was performed for 10 minutes, 5 times per week, for a total of 8 weeks. Muscle tone and upper extremity function were measured. The Modified Ashworth Scale (MAS) was used to measure muscle tone, and the Manual Function Test (MFT) and Fugl-Meyer Assessment scale (FugM) were used to measure upper extremity function. [Results] The MAS score was significantly decreased, and the MFT and FugM scores were significantly increased. [Conclusion] These results indicate that whole-body vibration exercise in a sitting position prior to therapy had a positive effect on muscle tone and upper extremity function in stroke patients.

  1. Simulation Study of Effects of the Blind Deconvolution on Ultrasound Image

    NASA Astrophysics Data System (ADS)

    He, Xingwu; You, Junchen

    2018-03-01

    Ultrasonic image restoration is an essential subject in medical ultrasound imaging. However, without sufficient and precise system knowledge, traditional image restoration methods based on prior knowledge of the system often fail to improve image quality. In this paper, we use simulated ultrasound images to assess the effectiveness of the blind deconvolution method for ultrasound image restoration. Experimental results demonstrate that blind deconvolution can be applied to ultrasound image restoration and achieves satisfactory results without precise prior knowledge, compared with the traditional image restoration method. Even with an inaccurate small initial PSF, blind deconvolution improved the overall image quality of the ultrasound images, yielding much better SNR and image resolution. The time consumption of these methods showed no significant increase on a GPU platform.
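
    The abstract does not name the specific algorithm; alternating Richardson-Lucy updates of the image and the point-spread function are one standard formulation of blind deconvolution. The following is a minimal CPU sketch under that assumption, without the regularisation and edge handling a production implementation would need:

```python
import numpy as np

def conv_same(img, ker):
    """Direct 'same' 2-D convolution with zero padding (small odd kernels)."""
    kh, kw = ker.shape
    ph, pw = kh // 2, kw // 2
    pad = np.pad(img, ((ph, ph), (pw, pw)))
    H, W = img.shape
    out = np.zeros((H, W))
    for i in range(kh):
        for j in range(kw):
            out += ker[i, j] * pad[kh - 1 - i:kh - 1 - i + H,
                                   kw - 1 - j:kw - 1 - j + W]
    return out

def blind_richardson_lucy(observed, psf_size=5, n_iter=25, eps=1e-12):
    """Alternating Richardson-Lucy updates of image and PSF (a sketch)."""
    f = observed.astype(float).copy()                    # image estimate
    h = np.ones((psf_size, psf_size)) / psf_size ** 2    # flat initial PSF
    H, W = observed.shape
    ph = psf_size // 2
    for _ in range(n_iter):
        ratio = observed / (conv_same(f, h) + eps)
        f = f * conv_same(ratio, h[::-1, ::-1])          # image update
        ratio = observed / (conv_same(f, h) + eps)
        # PSF update: correlate the ratio with the image over the PSF support.
        pad_f = np.pad(f, ((ph, ph), (ph, ph)))
        m = np.array([[np.sum(ratio * pad_f[psf_size - 1 - i:psf_size - 1 - i + H,
                                            psf_size - 1 - j:psf_size - 1 - j + W])
                       for j in range(psf_size)] for i in range(psf_size)])
        h = h * m
        h /= h.sum()                                     # keep PSF normalised
    return f, h

# Demo: blur sparse point sources with a Gaussian PSF, then restore blindly.
truth = np.zeros((32, 32))
truth[8, 8] = truth[20, 14] = 1.0
g1 = np.exp(-0.5 * ((np.arange(5) - 2) / 1.0) ** 2)
psf_true = np.outer(g1, g1)
psf_true /= psf_true.sum()
blurred = conv_same(truth, psf_true)
restored, psf_est = blind_richardson_lucy(blurred)
```

    Real ultrasound pipelines would add noise modelling and stopping criteria; the point here is only the alternating multiplicative update structure.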

  2. Syndrome diagnosis: human intuition or machine intelligence?

    PubMed

    Braaten, Oivind; Friestad, Johannes

    2008-01-01

    The aim of this study was to investigate whether artificial intelligence methods can represent objective methods that are essential in syndrome diagnosis. Most syndromes have no external criterion standard of diagnosis. The predictive value of a clinical sign used in diagnosis is dependent on the prior probability of the syndrome diagnosis. Clinicians often misjudge the probabilities involved. Syndromology needs objective methods to ensure diagnostic consistency and take prior probabilities into account. We applied two basic artificial intelligence methods to a database of machine-generated patients - a 'vector method' and a set method. As reference methods, we ran an ID3 algorithm, a cluster analysis and a naive Bayes' calculation on the same patient series. The overall diagnostic error rate for the vector algorithm was 0.93%, and for the ID3 0.97%. For the clinical signs found by the set method, the predictive values varied between 0.71 and 1.0. The artificial intelligence methods that we used proved simple, robust and powerful, and represent objective diagnostic methods.
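
    The dependence of a sign's predictive value on the prior probability is just Bayes' rule. A minimal sketch (the sensitivity, specificity and prior probabilities are invented for illustration):

```python
def positive_predictive_value(sensitivity, specificity, prior):
    """Bayes' rule: probability of the syndrome given a positive sign.

    The same sign can be near-diagnostic or nearly useless depending on
    the prior probability of the syndrome.
    """
    true_pos = sensitivity * prior
    false_pos = (1.0 - specificity) * (1.0 - prior)
    return true_pos / (true_pos + false_pos)

# Identical sign (90% sensitive, 95% specific), very different priors.
ppv_common = positive_predictive_value(0.9, 0.95, prior=0.10)   # ~0.67
ppv_rare = positive_predictive_value(0.9, 0.95, prior=0.001)    # ~0.018
```

    This is why, as the abstract notes, clinicians who neglect the prior probability can badly misjudge what a positive sign actually implies.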

  3. Effects of Regularisation Priors and Anatomical Partial Volume Correction on Dynamic PET Data

    NASA Astrophysics Data System (ADS)

    Caldeira, Liliana L.; Silva, Nuno da; Scheins, Jürgen J.; Gaens, Michaela E.; Shah, N. Jon

    2015-08-01

    Dynamic PET provides temporal information about tracer uptake. However, each PET frame usually has low statistics, resulting in noisy images. Furthermore, PET images suffer from partial volume effects. The goal of this study is to understand the effects of prior regularisation on dynamic PET data and subsequent anatomical partial volume correction. The Median Root Prior (MRP) regularisation method was used in this work during reconstruction. The quantification and noise in the image domain and time domain (time-activity curves), as well as the impact on parametric images, are assessed and compared with Ordinary Poisson Ordered Subset Expectation Maximisation (OP-OSEM) reconstruction with and without a Gaussian filter. This study shows the improvement in PET images and time-activity curves (TACs) in terms of noise, as well as in the parametric images, when using prior regularisation on dynamic PET data. Anatomical partial volume correction improves the TACs and, consequently, the parametric images. Therefore, the use of MRP with anatomical partial volume correction is of interest for dynamic PET studies.

  4. Jellyfish Bioactive Compounds: Methods for Wet-Lab Work

    PubMed Central

    Frazão, Bárbara; Antunes, Agostinho

    2016-01-01

    The study of bioactive compounds from marine animals has provided, over time, an endless source of interesting molecules. Jellyfish are commonly targets of study due to their toxic proteins. However, there is a gap in reviewing successful wet-lab methods employed in these animals, which compromises the fast progress in the detection of related biomolecules. Here, we provide a compilation of the most effective wet-lab methodologies for jellyfish venom extraction prior to proteomic analysis—separation, identification and toxicity assays. This includes SDS-PAGE, 2DE, gel chromatography, HPLC, DEAE, LC-MS, MALDI, Western blot, hemolytic assay, antimicrobial assay and protease activity assay. For a more comprehensive approach, jellyfish toxicity studies should further consider transcriptome sequencing. We reviewed such methodologies and other genomic techniques used prior to the deep sequencing of transcripts, including RNA extraction, construction of cDNA libraries and RACE. Overall, we provide an overview of the most promising methods and their successful implementation for optimizing time and effort when studying jellyfish. PMID:27077869

  5. Jellyfish Bioactive Compounds: Methods for Wet-Lab Work.

    PubMed

    Frazão, Bárbara; Antunes, Agostinho

    2016-04-12

    The study of bioactive compounds from marine animals has provided, over time, an endless source of interesting molecules. Jellyfish are commonly targets of study due to their toxic proteins. However, there is a gap in reviewing successful wet-lab methods employed in these animals, which compromises the fast progress in the detection of related biomolecules. Here, we provide a compilation of the most effective wet-lab methodologies for jellyfish venom extraction prior to proteomic analysis-separation, identification and toxicity assays. This includes SDS-PAGE, 2DE, gel chromatography, HPLC, DEAE, LC-MS, MALDI, Western blot, hemolytic assay, antimicrobial assay and protease activity assay. For a more comprehensive approach, jellyfish toxicity studies should further consider transcriptome sequencing. We reviewed such methodologies and other genomic techniques used prior to the deep sequencing of transcripts, including RNA extraction, construction of cDNA libraries and RACE. Overall, we provide an overview of the most promising methods and their successful implementation for optimizing time and effort when studying jellyfish.

  6. A pseudo-discrete algebraic reconstruction technique (PDART) prior image-based suppression of high density artifacts in computed tomography

    NASA Astrophysics Data System (ADS)

    Pua, Rizza; Park, Miran; Wi, Sunhee; Cho, Seungryong

    2016-12-01

    We propose a hybrid metal artifact reduction (MAR) approach for computed tomography (CT) that is computationally more efficient than a fully iterative reconstruction method, but at the same time achieves superior image quality to the interpolation-based in-painting techniques. Our proposed MAR method, an image-based artifact subtraction approach, utilizes an intermediate prior image reconstructed via PDART to recover the background information underlying the high density objects. For comparison, prior images generated by total-variation minimization (TVM) algorithm, as a realization of fully iterative approach, were also utilized as intermediate images. From the simulation and real experimental results, it has been shown that PDART drastically accelerates the reconstruction to an acceptable quality of prior images. Incorporating PDART-reconstructed prior images in the proposed MAR scheme achieved higher quality images than those by a conventional in-painting method. Furthermore, the results were comparable to the fully iterative MAR that uses high-quality TVM prior images.

  7. Syndrome Diagnosis: Human Intuition or Machine Intelligence?

    PubMed Central

    Braaten, Øivind; Friestad, Johannes

    2008-01-01

    The aim of this study was to investigate whether artificial intelligence methods can represent objective methods that are essential in syndrome diagnosis. Most syndromes have no external criterion standard of diagnosis. The predictive value of a clinical sign used in diagnosis is dependent on the prior probability of the syndrome diagnosis. Clinicians often misjudge the probabilities involved. Syndromology needs objective methods to ensure diagnostic consistency and take prior probabilities into account. We applied two basic artificial intelligence methods to a database of machine-generated patients - a ‘vector method’ and a set method. As reference methods, we ran an ID3 algorithm, a cluster analysis and a naive Bayes’ calculation on the same patient series. The overall diagnostic error rate for the vector algorithm was 0.93%, and for the ID3 0.97%. For the clinical signs found by the set method, the predictive values varied between 0.71 and 1.0. The artificial intelligence methods that we used proved simple, robust and powerful, and represent objective diagnostic methods. PMID:19415142

  8. Life-course blood pressure in relation to brain volumes

    PubMed Central

    Power, Melinda C.; Schneider, Andrea L. C.; Wruck, Lisa; Griswold, Michael; Coker, Laura H.; Alonso, Alvaro; Jack, Clifford R.; Knopman, David; Mosley, Thomas H.; Gottesman, Rebecca F

    2016-01-01

    INTRODUCTION The impact of blood pressure on brain volumes may be time- or pattern-dependent. METHODS In 1678 participants from the Atherosclerosis Risk in Communities Neurocognitive Study, we quantified the association between measures and patterns of blood pressure over three time points (~24 or ~15 years prior and concurrent with neuroimaging) with late life brain volumes. RESULTS Higher diastolic blood pressure ~24 years prior, higher systolic and pulse pressure ~15 years prior, and consistently elevated or rising systolic blood pressure from ~15 years prior to concurrent with neuroimaging, but not blood pressures measured concurrent with neuroimaging, were associated with smaller volumes. The pattern of hypertension ~15 years prior and hypotension concurrent with neuroimaging was associated with smaller volumes in regions preferentially affected by Alzheimer’s disease (e.g., hippocampus: −0.27 standard units, 95%CI:−0.51,−0.03). DISCUSSION Hypertension 15 to 24 years prior is relevant to current brain volumes. Hypertension followed by hypotension appears particularly detrimental. PMID:27139841

  9. Selection of the effect size for sample size determination for a continuous response in a superiority clinical trial using a hybrid classical and Bayesian procedure.

    PubMed

    Ciarleglio, Maria M; Arendt, Christopher D; Peduzzi, Peter N

    2016-06-01

When designing studies that have a continuous outcome as the primary endpoint, the hypothesized effect size, that is, the hypothesized difference in means relative to the assumed variability of the endpoint, plays an important role in sample size and power calculations. Point estimates for the mean difference and the variability are often calculated using historical data. However, the uncertainty in these estimates is rarely addressed. This article presents a hybrid classical and Bayesian procedure that formally integrates prior information on the distributions of the mean difference and the variability into the study's power calculation. Conditional expected power, which averages the traditional power curve using the prior distributions of these two parameters as the averaging weight, is used, and the effect size is found that equates the prespecified frequentist power and the conditional expected power of the trial. This hypothesized effect size is then used in traditional sample size calculations when determining sample size for the study. The effect size found using this method may be expressed as a function of the prior means of the mean difference and the variability and of their prior standard deviations. We show that the "naïve" estimate of the effect size, that is, the ratio of prior means, should be down-weighted to account for the variability in the parameters. An example is presented for designing a placebo-controlled clinical trial testing the antidepressant effect of alprazolam as monotherapy for major depression. Through this method, we are able to formally integrate prior information on the uncertainty and variability of both the treatment effect and the common standard deviation into the design of the study while maintaining a frequentist framework for the final analysis. Solving for the effect size that the study has a high probability of correctly detecting, based on the available prior information on the mean difference and the standard deviation, provides a valuable, substantiated estimate that can form the basis for discussion about the study's feasibility during the design phase. © The Author(s) 2016.
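The conditional-expected-power idea described above can be sketched by Monte Carlo averaging of a classical power curve over priors. The normal priors, the two-sided z-approximation at alpha = 0.05, and all numeric settings below are illustrative assumptions, not the authors' exact procedure:

```python
import math
import random

def classical_power(delta, sigma, n_per_arm, z_alpha=1.96):
    """Approximate power of a two-sided two-sample z-test (alpha = 0.05)
    for mean difference delta, common SD sigma, n per arm."""
    shift = (delta / sigma) * math.sqrt(n_per_arm / 2.0) - z_alpha
    return 0.5 * (1.0 + math.erf(shift / math.sqrt(2.0)))  # Phi(shift)

def conditional_expected_power(n_per_arm, m_delta, s_delta, m_sigma, s_sigma,
                               n_draws=50_000, seed=1):
    """Average the classical power curve over a normal prior on the mean
    difference and a (folded-normal, an illustrative choice) prior on the SD."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_draws):
        delta = rng.gauss(m_delta, s_delta)
        sigma = abs(rng.gauss(m_sigma, s_sigma)) or m_sigma  # keep sigma > 0
        total += classical_power(delta, sigma, n_per_arm)
    return total / n_draws
```

Because the power curve is averaged over parameter uncertainty, the conditional expected power at the prior means is typically below the classical power computed at those same point estimates, which is the sense in which the naïve effect-size estimate should be down-weighted.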

  10. Method of Real-Time Principal-Component Analysis

    NASA Technical Reports Server (NTRS)

    Duong, Tuan; Duong, Vu

    2005-01-01

    Dominant-element-based gradient descent and dynamic initial learning rate (DOGEDYN) is a method of sequential principal-component analysis (PCA) that is well suited for such applications as data compression and extraction of features from sets of data. In comparison with a prior method of gradient-descent-based sequential PCA, this method offers a greater rate of learning convergence. Like the prior method, DOGEDYN can be implemented in software. However, the main advantage of DOGEDYN over the prior method lies in the facts that it requires less computation and can be implemented in simpler hardware. It should be possible to implement DOGEDYN in compact, low-power, very-large-scale integrated (VLSI) circuitry that could process data in real time.
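DOGEDYN itself is not specified in this summary; as a rough illustration of the family of methods it improves upon, here is a generic gradient-based sequential PCA update (Oja's rule) for the first principal component:

```python
import numpy as np

def oja_first_component(X, lr=0.005, epochs=30, seed=0):
    """Estimate the first principal component of centered data X
    (samples x features) by stochastic gradient ascent (Oja's rule)."""
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for x in X:
            y = w @ x                      # projection onto current estimate
            w += lr * y * (x - y * w)      # Oja update keeps ||w|| near 1
        w /= np.linalg.norm(w)             # explicit renormalization per epoch
    return w
```

Subsequent components can be found by deflating the data against the components already learned; the learning rate and epoch count here are illustrative.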

  11. Brain Magnetic Resonance Immediately Prior To Surgery In Single Ventricles and Surgical Postponement

    PubMed Central

    Fogel, Mark A.; Pawlowski, Tom; Schwab, Peter J.; Nicolson, Susan C.; Montenegro, Lisa M.; Berenstein, Laura Diaz; Spray, Thomas L.; Gaynor, J William; Fuller, Stephanie; Keller, Marc S.; Harris, Matthew A.; Whitehead, Kevin K.; Vossough, Arastoo; Licht, Daniel J.

    2014-01-01

Background Single ventricle patients undergoing surgical reconstruction experience a high rate of brain injury; incidental findings on pre-operative brain scans may raise safety concerns about hemorrhage extension during cardiopulmonary bypass and result in surgical postponement. Methods Single ventricle patients underwent brain scans immediately preoperatively as part of a National Institutes of Health study; scans were reviewed by neuroradiology immediately prior to cardiopulmonary bypass. Results One hundred and thirty four consecutive subjects recruited into the project were studied: 33 prior to stage I (3.7±1.8 days), 34 prior to bidirectional Glenn (5.8±3.5 months) and 67 prior to Fontan (3.3±1.1 years). Six (4.5%) surgeries were postponed because of concerning imaging findings on brain MRI: 2 prior to stage I, 3 prior to bidirectional Glenn and 1 prior to Fontan. Five were due to unexpected incidental findings of acute intracranial hemorrhage and one to diffuse cerebellar cytotoxic edema; none who proceeded to surgery had these lesions. Neither prematurity nor genetic syndromes were present in any patient with postponed surgery. All 4 patients with surgical delays prior to bidirectional Glenn/Fontan had hypoplastic left heart syndrome, compared with 44/97 who did not (P=0.048). After observation and follow up, all eventually had successful surgeries with bypass. Conclusion Preoperative brain MRI performed in children with single ventricles disclosed injuries leading to surgical delay in 4.5%; hemorrhagic lesions were most common and raised concerns for extension during surgery. The true risk of progression and the need to delay surgery due to heparinization in the presence of these lesions remain uncertain. PMID:25149046

  12. Nonparametric Bayesian models for a spatial covariance.

    PubMed

    Reich, Brian J; Fuentes, Montserrat

    2012-01-01

A crucial step in the analysis of spatial data is to estimate the spatial correlation function that determines the relationship between a spatial process at two locations. The standard approach to selecting the appropriate correlation function is to use prior knowledge or exploratory analysis, such as a variogram analysis, to select the correct parametric correlation function. Rather than selecting a particular parametric correlation function, we treat the covariance function as an unknown function to be estimated from the data. We propose a flexible prior for the correlation function to provide robustness to the choice of correlation function. We specify the prior for the correlation function using spectral methods and the Dirichlet process prior, which is a common prior for an unknown distribution function. Our model does not require Gaussian data or spatial locations on a regular grid. The approach is demonstrated using a simulation study as well as an analysis of California air pollution data.

  13. Impact of Prior Cancer on Eligibility for Lung Cancer Clinical Trials

    PubMed Central

    Laccetti, Andrew L.; Xuan, Lei; Halm, Ethan A.; Pruitt, Sandi L.

    2014-01-01

Background In oncology clinical trials, the assumption that a prior cancer diagnosis could interfere with study conduct or outcomes results in frequent exclusion of such patients. We determined the prevalence and characteristics of this practice in lung cancer clinical trials and estimated its impact on trial accrual. Methods We reviewed lung cancer clinical trials sponsored or endorsed by the Eastern Cooperative Oncology Group for exclusion criteria related to a prior cancer diagnosis. We estimated the prevalence of prior primary cancer diagnoses among lung cancer patients using Surveillance Epidemiology and End Results (SEER)-Medicare linked data. We assessed the association between trial characteristics and prior cancer exclusion using chi-square analysis. All statistical tests were two-sided. Results Fifty-one clinical trials (target enrollment 13072 patients) were included. Forty-one (80%) excluded patients with a prior cancer diagnosis as follows: any prior cancer (14%), within five years (43%), within two or three years (7%), or active cancer (16%). In SEER-Medicare data (n = 210509), 56% of prior cancers were diagnosed within five years before the lung cancer diagnosis. Across trials, the estimated number and proportion of patients excluded because of prior cancer ranged from 0-207 and 0%-18%. Prior cancer was excluded in 94% of trials with survival primary endpoints and 73% of trials with nonsurvival primary endpoints (P = .06). Conclusions A substantial proportion of patients are reflexively excluded from lung cancer clinical trials because of prior cancer. This exclusion criterion is applied widely across studies, including more than two-thirds of trials with nonsurvival endpoints. More research is needed to understand the basis and ramifications of this exclusion policy. PMID:25253615

  14. Joint Prior Learning for Visual Sensor Network Noisy Image Super-Resolution

    PubMed Central

    Yue, Bo; Wang, Shuang; Liang, Xuefeng; Jiao, Licheng; Xu, Caijin

    2016-01-01

The visual sensor network (VSN), a new type of wireless sensor network composed of low-cost wireless camera nodes, is being applied to numerous complex visual analyses in wild environments, such as visual surveillance, object recognition, etc. However, the captured images/videos are often low resolution and noisy, and such visual data cannot be directly used for advanced visual analysis. In this paper, we propose a joint-prior image super-resolution (JPISR) method using the expectation maximization (EM) algorithm to improve VSN image quality. Unlike conventional methods that only focus on upscaling images, JPISR alternately solves upscaling mapping and denoising in the E-step and M-step. To meet the requirement of the M-step, we introduce a novel non-local group-sparsity image filtering method to learn the explicit prior, and induce the geometric duality between images to learn the implicit prior. The EM algorithm inherently combines the explicit and implicit priors by joint learning. Moreover, JPISR does not rely on large external datasets for training, which is much more practical in a VSN. Extensive experiments show that JPISR outperforms five state-of-the-art methods in terms of PSNR, SSIM and visual perception. PMID:26927114

  15. Unified approach for extrapolation and bridging of adult information in early-phase dose-finding paediatric studies.

    PubMed

    Petit, Caroline; Samson, Adeline; Morita, Satoshi; Ursino, Moreno; Guedj, Jérémie; Jullien, Vincent; Comets, Emmanuelle; Zohar, Sarah

    2018-06-01

The number of trials conducted and the number of patients per trial are typically small in paediatric clinical studies, owing to ethical constraints and the complexity of the medical process for treating children. While incorporating prior knowledge from adults may be extremely valuable, this must be done carefully. In this paper, we propose a unified method for designing and analysing dose-finding trials in paediatrics while bridging information from adults. The dose range is calculated under three extrapolation options, linear, allometry and maturation adjustment, using adult pharmacokinetic data; to do this, it is assumed that target exposures are the same in both populations. The working model and prior distribution parameters of the dose-toxicity and dose-efficacy relationships are obtained using early-phase adult toxicity and efficacy data at several dose levels. Priors are integrated into the dose-finding process through Bayesian model selection or adaptive priors, which calibrates the model to adjust for misspecification if the adult and paediatric data are very different. We performed a simulation study which indicates that incorporating prior adult information in this way may improve dose selection in children.
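The three extrapolation options can be illustrated under the stated equal-target-exposure assumption. This is a generic sketch, not the paper's implementation; the maturation parameters below are common illustrative values from the maturation-model literature, not from this study:

```python
def scaled_dose(adult_dose_mg, weight_kg, method="allometry",
                adult_weight_kg=70.0, pma_weeks=None,
                tm50_weeks=47.7, hill=3.4):
    """Scale an adult dose to a child under an equal-target-exposure
    assumption. tm50_weeks and hill are illustrative defaults for a
    sigmoidal clearance-maturation model (not values from this paper)."""
    ratio = weight_kg / adult_weight_kg
    if method == "linear":
        return adult_dose_mg * ratio            # dose proportional to weight
    if method == "allometry":
        return adult_dose_mg * ratio ** 0.75    # clearance ~ weight^0.75
    if method == "maturation":
        # allometric scaling times a sigmoidal maturation of clearance,
        # driven by postmenstrual age (PMA) in weeks
        fmat = pma_weeks ** hill / (pma_weeks ** hill + tm50_weeks ** hill)
        return adult_dose_mg * ratio ** 0.75 * fmat
    raise ValueError(f"unknown method: {method}")
```

For a small child, allometric scaling gives a higher dose than linear scaling (the 0.75 exponent shrinks the weight ratio less), while the maturation adjustment reduces the allometric dose further for very young patients whose clearance is not yet mature.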

  16. Prior knowledge driven Granger causality analysis on gene regulatory network discovery

    DOE PAGES

    Yao, Shun; Yoo, Shinjae; Yu, Dantong

    2015-08-28

Our study focuses on discovering gene regulatory networks from time series gene expression data using the Granger causality (GC) model. However, the number of available time points (T) is usually much smaller than the number of target genes (n) in biological datasets. The widely applied pairwise GC model (PGC) and other regularization strategies can lead to a significant number of false identifications when n>>T. In this study, we propose a new method, viz., CGC-2SPR (CGC using two-step prior Ridge regularization), to resolve the problem by incorporating prior biological knowledge about a target gene data set. In our simulation experiments, the proposed methodology CGC-2SPR showed significant performance improvement in terms of accuracy over other widely used GC modeling (PGC, Ridge and Lasso) and MI-based (MRNET and ARACNE) methods. In addition, we applied CGC-2SPR to a real biological dataset, i.e., the yeast metabolic cycle, and discovered more true positive edges with CGC-2SPR than with the other existing methods. In our research, we noticed a "1+1>2" effect when we combined prior knowledge and gene expression data to discover regulatory networks. Based on causality networks, we made a functional prediction that the Abm1 gene (whose functions were previously unknown) might be related to the yeast's responses to different levels of glucose. In conclusion, our research improves causality modeling by combining heterogeneous knowledge, which is well aligned with the future direction in systems biology. Furthermore, we propose a method of Monte Carlo significance estimation (MCSE) to calculate edge significances, which give statistical meaning to the discovered causality networks. All of our data and source codes will be available under the link https://bitbucket.org/dtyu/granger-causality/wiki/Home.
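A ridge-regularized GC fit of the kind this method builds on can be sketched as a penalized lagged regression: each gene's present value is regressed on all genes' past values, and large coefficient magnitudes flag candidate causal edges. This is a generic illustration, not the CGC-2SPR algorithm itself:

```python
import numpy as np

def ridge_granger(X, lag=1, lam=1.0):
    """Fit a ridge-regularized lag-`lag` vector autoregression to time
    series X (T x n genes). Returns an n x n score matrix where
    scores[i, j] is the influence of gene j's past on gene i's present."""
    T, n = X.shape
    Y = X[lag:]                                                   # targets
    Z = np.hstack([X[lag - k - 1:T - k - 1] for k in range(lag)])  # lagged design
    # ridge solution: (Z'Z + lam*I)^-1 Z'Y
    A = np.linalg.solve(Z.T @ Z + lam * np.eye(Z.shape[1]), Z.T @ Y)
    B = A.reshape(lag, n, n)            # per-lag coefficient blocks (pred x target)
    return np.abs(B).sum(axis=0).T      # target x predictor score matrix
```

In CGC-2SPR the ridge penalty is applied in two steps and weighted by prior biological knowledge, so that edges supported by the prior are penalized less; the uniform penalty above is the unweighted baseline.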

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dogan, N; Padgett, K; Evans, J

Purpose: Adaptive Radiotherapy (ART) with frequent CT imaging has been used to improve dosimetric accuracy by accounting for anatomical variations, such as primary tumor shrinkage and/or body weight loss, in Head and Neck (H&N) patients. In most ART strategies, the difference between the planned and the delivered dose is estimated by generating new plans on repeated CT scans using the dose-volume constraints of the initial planning CT, without considering the dose already delivered. The aim of this study was to assess the dosimetric gains achieved by re-planning based on prior dose, compared with re-planning not based on prior dose, for H&N patients. Methods: Ten locally-advanced H&N cancer patients were selected for this study. For each patient, six weekly CT scans were acquired during the course of radiotherapy. PTVs, parotids, cord, brainstem, and esophagus were contoured on the planning CT and all six weekly CT images. ART with weekly re-plans was done using two strategies: 1) generating a new optimized IMRT plan without including prior dose from previous fractions (NoPriorDose), and 2) generating a new optimized IMRT plan based on the prior dose given in previous fractions (PriorDose). Deformable image registration was used to accumulate the dose distributions between the planning and six weekly CT scans. The differences in accumulated doses for both strategies were evaluated using the DVH constraints for all structures. Results: On average, the differences in accumulated doses for PTV1, PTV2 and PTV3 between the NoPriorDose and PriorDose strategies were <2%. The differences in Dmean to the cord and brainstem were within 3%. The esophagus Dmean was reduced by 2% using PriorDose. The PriorDose strategy, however, reduced the left parotid D50 and Dmean by 15% and 14%, respectively.
Conclusion: This study demonstrated significant parotid sparing, potentially reducing xerostomia, by using ART with IMRT optimization based on prior dose for weekly re-planning of H&N cancer patients.

  18. Prior robust empirical Bayes inference for large-scale data by conditioning on rank with application to microarray data

    PubMed Central

    Liao, J. G.; Mcmurry, Timothy; Berg, Arthur

    2014-01-01

    Empirical Bayes methods have been extensively used for microarray data analysis by modeling the large number of unknown parameters as random effects. Empirical Bayes allows borrowing information across genes and can automatically adjust for multiple testing and selection bias. However, the standard empirical Bayes model can perform poorly if the assumed working prior deviates from the true prior. This paper proposes a new rank-conditioned inference in which the shrinkage and confidence intervals are based on the distribution of the error conditioned on rank of the data. Our approach is in contrast to a Bayesian posterior, which conditions on the data themselves. The new method is almost as efficient as standard Bayesian methods when the working prior is close to the true prior, and it is much more robust when the working prior is not close. In addition, it allows a more accurate (but also more complex) non-parametric estimate of the prior to be easily incorporated, resulting in improved inference. The new method’s prior robustness is demonstrated via simulation experiments. Application to a breast cancer gene expression microarray dataset is presented. Our R package rank.Shrinkage provides a ready-to-use implementation of the proposed methodology. PMID:23934072

  19. Separation and reconstruction of BCG and EEG signals during continuous EEG and fMRI recordings

    PubMed Central

    Xia, Hongjing; Ruan, Dan; Cohen, Mark S.

    2014-01-01

    Despite considerable effort to remove it, the ballistocardiogram (BCG) remains a major artifact in electroencephalographic data (EEG) acquired inside magnetic resonance imaging (MRI) scanners, particularly in continuous (as opposed to event-related) recordings. In this study, we have developed a new Direct Recording Prior Encoding (DRPE) method to extract and separate the BCG and EEG components from contaminated signals, and have demonstrated its performance by comparing it quantitatively to the popular Optimal Basis Set (OBS) method. Our modified recording configuration allows us to obtain representative bases of the BCG- and EEG-only signals. Further, we have developed an optimization-based reconstruction approach to maximally incorporate prior knowledge of the BCG/EEG subspaces, and of the signal characteristics within them. Both OBS and DRPE methods were tested with experimental data, and compared quantitatively using cross-validation. In the challenging continuous EEG studies, DRPE outperforms the OBS method by nearly sevenfold in separating the continuous BCG and EEG signals. PMID:25002836

  20. Age estimation by assessment of pulp chamber volume: a Bayesian network for the evaluation of dental evidence.

    PubMed

    Sironi, Emanuele; Taroni, Franco; Baldinotti, Claudio; Nardi, Cosimo; Norelli, Gian-Aristide; Gallidabino, Matteo; Pinchi, Vilma

    2017-11-14

The present study aimed to investigate the performance of a Bayesian method in the evaluation of dental age-related evidence collected by means of a geometrical approximation procedure of the pulp chamber volume. Measurement of this volume was based on three-dimensional cone beam computed tomography images. The Bayesian method was applied by means of a probabilistic graphical model, namely a Bayesian network. Performance of the method was investigated in terms of accuracy and bias of the decisional outcomes. The influence of an informed elicitation of the prior belief of chronological age was also studied by means of a sensitivity analysis. Outcomes in terms of accuracy were consistent with standard requirements for forensic adult age estimation. Findings also indicated that the Bayesian method does not show a particular tendency towards under- or overestimation of the age variable. Outcomes of the sensitivity analysis showed that estimation results are improved with a rational elicitation of the prior probabilities of age.
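The role of the prior belief of age in such a Bayesian evaluation can be illustrated with a toy discrete model. The linear volume-age relation and every number below are invented for illustration and are not the study's Bayesian network:

```python
import math

def age_posterior(volume, ages, prior, slope=-2.0, intercept=600.0, noise_sd=40.0):
    """Posterior over candidate ages given an observed pulp-chamber volume,
    assuming a toy linear-Gaussian likelihood:
    volume ~ N(intercept + slope * age, noise_sd). All parameter values
    are made up for illustration."""
    def likelihood(age):
        mu = intercept + slope * age
        return math.exp(-0.5 * ((volume - mu) / noise_sd) ** 2)
    unnormalized = [p * likelihood(a) for a, p in zip(ages, prior)]
    z = sum(unnormalized)
    return [u / z for u in unnormalized]
```

Replacing the uniform prior with an informed one (e.g., mass concentrated on a plausible age band) shifts the posterior accordingly, which is exactly what the sensitivity analysis in the abstract probes.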

  1. l0 regularization based on a prior image incorporated non-local means for limited-angle X-ray CT reconstruction.

    PubMed

    Zhang, Lingli; Zeng, Li; Guo, Yumeng

    2018-01-01

Restricted by the scanning environment in some CT imaging modalities, the acquired projection data are usually incomplete, which may lead to a limited-angle reconstruction problem; image quality then suffers from slope artifacts. The objective of this study is to first investigate the distorted regions of reconstructed images affected by slope artifacts and then present a new iterative reconstruction method to address the limited-angle X-ray CT reconstruction problem. The framework of the new method exploits the structural similarity between the prior image and the reconstructed image to compensate for the distorted edges. Specifically, the new method utilizes l0 regularization and wavelet tight framelets to suppress the slope artifacts and pursue sparsity. The new method includes the following four steps: (1) address the data fidelity using SART; (2) compensate for the slope artifacts due to the missing projection data using the prior image and modified non-local means (PNLM); (3) utilize l0 regularization to suppress the slope artifacts and pursue the sparsity of the wavelet coefficients of the transformed image using iterative hard thresholding (l0W); and (4) apply an inverse wavelet transform to reconstruct the image. In summary, this method is referred to as "l0W-PNLM". Numerical implementations showed that the presented l0W-PNLM was superior in suppressing the slope artifacts while preserving the edges of some features, as compared with commercial and other popular investigative algorithms. When the image to be reconstructed is inconsistent with the prior image, the new method can avoid or minimize distorted edges in the reconstructed images. Quantitative assessments also showed that the new method obtained the highest image quality compared with the existing algorithms. 
This study demonstrated that the presented l0W-PNLM yields higher image quality due to a number of unique characteristics: (1) it utilizes the structural similarity between the reconstructed image and the prior image to correct the edges distorted by slope artifacts; (2) it adopts wavelet tight frames to obtain first and higher derivatives in several directions and at several levels; and (3) it takes advantage of l0 regularization to promote the sparsity of wavelet coefficients, which is effective for inhibiting the slope artifacts. Therefore, the new method can address the limited-angle CT reconstruction problem effectively and has practical significance.

  2. A new approach for reducing beam hardening artifacts in polychromatic X-ray computed tomography using more accurate prior image.

    PubMed

    Wang, Hui; Xu, Yanan; Shi, Hongli

    2018-03-15

Metal artifacts severely degrade CT image quality in clinical diagnosis and are difficult to remove, especially beam hardening artifacts. Metal artifact reduction (MAR) methods based on prior images are the most frequently used. However, most prior images contain considerable misclassification caused by the absence of prior information, such as the spectrum distribution of the X-ray beam source, especially when multiple or large metal objects are included. This work aims to identify a more accurate prior image to improve image quality. The proposed method includes four steps. First, the metal image is segmented by thresholding an initial image, and the metal traces are identified in the initial projection data using the forward projection of the metal image. Second, the accurate absorbent model of the metal image is calculated according to the spectrum distribution of the X-ray beam source and the energy-dependent attenuation coefficients of the metal. Third, a new metal image is reconstructed by a general analytical reconstruction algorithm such as filtered back projection (FBP), and the prior image is obtained by segmenting the difference image between the initial image and the new metal image into air, tissue and bone. Fourth, the initial projection data are normalized by dividing them, pixel by pixel, by the projection data of the prior image. The final corrected image is obtained by interpolation, denormalization and reconstruction. Several clinical images with dental fillings and knee prostheses were used to compare the proposed algorithm with the normalized metal artifact reduction (NMAR) and linear interpolation (LI) methods. The results demonstrate that artifacts were reduced efficiently by the proposed method, which obtains an exact prior image using prior information about the X-ray beam source and the energy-dependent attenuation coefficients of metal. As a result, better performance in reducing beam hardening artifacts can be achieved. 
Moreover, the process of the proposed method is rather simple and requires little extra computational burden. It is superior to other algorithms when multiple and/or large implants are included.

  3. Shape-driven 3D segmentation using spherical wavelets.

    PubMed

    Nain, Delphine; Haker, Steven; Bobick, Aaron; Tannenbaum, Allen

    2006-01-01

    This paper presents a novel active surface segmentation algorithm using a multiscale shape representation and prior. We define a parametric model of a surface using spherical wavelet functions and learn a prior probability distribution over the wavelet coefficients to model shape variations at different scales and spatial locations in a training set. Based on this representation, we derive a parametric active surface evolution using the multiscale prior coefficients as parameters for our optimization procedure to naturally include the prior in the segmentation framework. Additionally, the optimization method can be applied in a coarse-to-fine manner. We apply our algorithm to the segmentation of brain caudate nucleus, of interest in the study of schizophrenia. Our validation shows our algorithm is computationally efficient and outperforms the Active Shape Model algorithm by capturing finer shape details.

  4. A saltwater flotation technique to identify unincubated eggs

    USGS Publications Warehouse

    Devney, C.A.; Kondrad, S.L.; Stebbins, K.R.; Brittingham, K.D.; Hoffman, D.J.; Heinz, G.H.

    2009-01-01

    Field studies on nesting birds sometimes involve questions related to nest initiation dates, length of the incubation period, or changes in parental incubation behavior during various stages of incubation. Some of this information can be best assessed when a nest is discovered before the eggs have undergone any incubation, and this has traditionally been assessed by floating eggs in freshwater. However, because the freshwater method is not particularly accurate in identifying unincubated eggs, we developed a more reliable saltwater flotation method. The saltwater method involves diluting a saturated saltwater solution with freshwater until a salt concentration is reached where unincubated eggs sink to the bottom and incubated eggs float to the surface. For Laughing Gulls (Leucophaeus atricilla), floating eggs in freshwater failed to identify 39.0% (N = 251) of eggs that were subsequently found by candling to have undergone incubation prior to collection. By contrast, in a separate collection of gull eggs, no eggs that passed the saltwater test (N = 225) were found by a later candling to have been incubated prior to collection. For Double-crested Cormorants (Phalacrocorax auritus), floating eggs in freshwater failed to identify 15.6% (N = 250) of eggs that had undergone incubation prior to collection, whereas in a separate collection, none of the eggs that passed the saltwater test (N = 85) were found by a later candling to have been incubated prior to collection. Immersion of eggs in saltwater did not affect embryo survival. Although use of the saltwater method is likely limited to colonial species and requires calibrating a saltwater solution, it is a faster and more accurate method of identifying unincubated eggs than the traditional method of floating eggs in freshwater.

  5. Intrarectal ice application prior to transrectal prostate biopsy: a prospective randomised trial accessing pain and collateral effects

    PubMed Central

    Çaliskan, Baris; Mutlu, Nazim

    2015-01-01

Objectives To analyze the efficacy of intrarectal ice application as an anesthetic method prior to transrectal ultrasound (TRUS) guided prostate biopsy. Materials and Methods A total of 120 consecutive men were prospectively included in the study. Patients were equally randomized to groups 1 and 2, with 60 patients each. Ice was applied as an anesthetic method 5 minutes before the procedure for patients in group 1. Patients in group 2 received 10 ml of 2% lidocaine gel 10 minutes before the procedure. A twelve-core biopsy procedure was performed for all patients. The pain level was evaluated using a visual analogue scale (VAS). Results Median pain score was 3.5 (1-8) in group 1 and 5 (1-8) in group 2. There was a significant difference between the groups in mean pain level during the procedure (p=0.007). There was no difference between the two groups in the presence and duration of macroscopic hematuria and rectal bleeding. Conclusions Intrarectal ice application prior to TRUS prostate biopsy reduces pain. Development of new techniques based on cold or ice could make this method more useful and decrease complication rates. PMID:25928515

  6. Prior Learning Assessment: How Institutions Use Portfolio Assessments

    ERIC Educational Resources Information Center

    Klein-Collins, Becky; Hain, Patrick

    2009-01-01

    The term Prior Learning Assessment (PLA) refers not to a single kind of assessment but rather an entire family of assessment methods that can be used by institutions. Some of these methods are exam-based. In addition, there are other methods of PLA. One of the more innovative methods of offering PLA, however, is through the development of student…

  7. Limited-angle multi-energy CT using joint clustering prior and sparsity regularization

    NASA Astrophysics Data System (ADS)

    Zhang, Huayu; Xing, Yuxiang

    2016-03-01

In this article, we present an easy-to-implement multi-energy CT scanning strategy and a corresponding reconstruction method, which facilitate spectral CT imaging by improving data efficiency by the number-of-energy-channels fold without introducing the visible limited-angle artifacts caused by reducing projection views. Leveraging the structural coherence at different energies, we first pre-reconstruct a prior structure-information image using projection data from all energy channels. Then, we perform k-means clustering on the prior image to generate a sparse dictionary representation for the image, which serves as a structure-information constraint. We combine this constraint with a conventional compressed sensing method and propose a new model which we refer to as Joint Clustering Prior and Sparsity Regularization (CPSR). CPSR is a convex problem and we solve it by the Alternating Direction Method of Multipliers (ADMM). We verify our CPSR reconstruction method with a numerical simulation experiment. A dental phantom with complicated structures of teeth and soft tissue is used. X-ray beams from three spectra of different peak energies (120 kVp, 90 kVp, 60 kVp) irradiate the phantom to form tri-energy projections. Projection data covering only 75° from each energy spectrum are collected for reconstruction. Independent reconstruction for each energy causes severe limited-angle artifacts even with the help of compressed sensing approaches. Our CPSR provides images free of limited-angle artifacts. All edge details are well preserved in our experimental study.

  8. How patient educators help students to learn: An exploratory study.

    PubMed

    Cheng, Phoebe T M; Towle, Angela

    2017-03-01

    Benefits of the active involvement of patients in educating health professionals are well recognized, but little is known about how patient educators facilitate student learning. This exploratory qualitative study investigated the teaching practices and experiences that prepared patient educators for their roles in a longitudinal interprofessional Health Mentors program. Semi-structured interviews were conducted with eleven experienced health mentors. Responses were coded and analyzed for themes related to teaching goals, methods, and prior experiences. Mentors used a rich variety of teaching methods to teach patient-centeredness and interprofessionalism, categorized as: telling my story, stimulating reflection, sharing perspectives, and problem-solving. As educators they drew on a variety of prior experiences with teaching, facilitation, or public speaking, and on long-term interactions with the health-care system. Patient educators use diverse teaching methods, drawing on both individualistic and social perspectives on learning. A peer-support model of training and support would help maintain the authenticity of patients as educators. The study highlights the inadequacy of current learning theories in explaining how patients help students learn.

  9. Quantitative assessments of arousal by analyzing microsaccade rates and pupil fluctuations prior to slow eye movements.

    PubMed

    Honda, Shogo; Kohama, Takeshi; Tanaka, Tatsuro; Yoshida, Hisashi

    2014-01-01

    It is well known that a decline in arousal level causes poor performance in movements or judgments. Our previous study indicated that microsaccade (MS) rates and pupil fluctuations change before slow eye movements (SEMs) (Honda et al. 2013). However, the SEM detection in that study was imprecise and insufficient. In this study, we propose a new SEM detection method and analyze MS rates and pupil fluctuations while subjects maintain their gaze on a target. We modified Shin et al.'s method, which is optimized for EOG (electrooculography) signals, to extract periods of sustained SEMs using a general eye tracker. After SEM detection, we analyzed MS rates and pupil fluctuations prior to the initiation of SEMs. As a result, we were able to detect SEMs more precisely than in our previous study. Moreover, the analyses of eye movements and pupil fluctuations show that a gradual rise in MS rate and longitudinal miosis are observed prior to the initiation of SEMs, which is consistent with our previous study. These findings suggest that monitoring eye movements and pupil fluctuations may allow arousal level to be evaluated more precisely. Further, we found that these tendencies become more significant when the analysis is restricted to the initial SEMs.

  10. The Impact of the Tree Prior on Molecular Dating of Data Sets Containing a Mixture of Inter- and Intraspecies Sampling.

    PubMed

    Ritchie, Andrew M; Lo, Nathan; Ho, Simon Y W

    2017-05-01

    In Bayesian phylogenetic analyses of genetic data, prior probability distributions need to be specified for the model parameters, including the tree. When Bayesian methods are used for molecular dating, available tree priors include those designed for species-level data, such as the pure-birth and birth-death priors, and coalescent-based priors designed for population-level data. However, molecular dating methods are frequently applied to data sets that include multiple individuals across multiple species. Such data sets violate the assumptions of both the speciation and coalescent-based tree priors, making it unclear which should be chosen and whether this choice can affect the estimation of node times. To investigate this problem, we used a simulation approach to produce data sets with different proportions of within- and between-species sampling under the multispecies coalescent model. These data sets were then analyzed under pure-birth, birth-death, constant-size coalescent, and skyline coalescent tree priors. We also explored the ability of Bayesian model testing to select the best-performing priors. We confirmed the applicability of our results to empirical data sets from cetaceans, phocids, and coregonid whitefish. Estimates of node times were generally robust to the choice of tree prior, but some combinations of tree priors and sampling schemes led to large differences in the age estimates. In particular, the pure-birth tree prior frequently led to inaccurate estimates for data sets containing a mixture of inter- and intraspecific sampling, whereas the birth-death and skyline coalescent priors produced stable results across all scenarios. Model testing provided an adequate means of rejecting inappropriate tree priors. Our results suggest that tree priors do not strongly affect Bayesian molecular dating results in most cases, even when severely misspecified. However, the choice of tree prior can be significant for the accuracy of dating results in the case of data sets with mixed inter- and intraspecies sampling. [Bayesian phylogenetic methods; model testing; molecular dating; node time; tree prior.]. © The authors 2016. Published by Oxford University Press, on behalf of the Society of Systematic Biologists. All rights reserved. For permissions, please e-mail: journals.permission@oup.com.

  11. Low-Resolution Raman-Spectroscopy Combustion Thermometry

    NASA Technical Reports Server (NTRS)

    Nguyen, Quang-Viet; Kojima, Jun

    2008-01-01

    A method of optical thermometry, now undergoing development, involves low-resolution measurement of the spectrum of spontaneous Raman scattering (SRS) from N2 and O2 molecules. The method is especially suitable for measuring temperatures in high pressure combustion environments that contain N2, O2, or N2/O2 mixtures (including air). Methods based on SRS (in which scattered light is shifted in wavelength by amounts that depend on vibrational and rotational energy levels of laser-illuminated molecules) have been popular means of probing flames because they are almost the only methods that provide spatially and temporally resolved concentrations and temperatures of multiple molecular species in turbulent combustion. The present SRS-based method differs from prior SRS-based methods that have various drawbacks, a description of which would exceed the scope of this article. Two main differences between this and prior SRS-based methods are that it involves analysis in the frequency (equivalently, wavelength) domain, in contradistinction to analysis in the intensity domain in prior methods; and it involves low-resolution measurement of what amounts to predominantly the rotational Raman spectra of N2 and O2, in contradistinction to higher-resolution measurement of the vibrational Raman spectrum of N2 only in prior methods.

  12. Trunk muscle activation during golf swing: Baseline and threshold.

    PubMed

    Silva, Luís; Marta, Sérgio; Vaz, João; Fernandes, Orlando; Castro, Maria António; Pezarat-Correia, Pedro

    2013-10-01

    There is a lack of studies regarding EMG temporal analysis during dynamic and complex motor tasks, such as the golf swing. The aim of this study is to analyze EMG onset during the golf swing by comparing two different threshold methods. The Method A threshold was determined using the baseline activity recorded between two maximum voluntary contractions (MVCs). The Method B threshold was calculated using the mean EMG activity for 1000 ms before the 500 ms prior to the start of the backswing. Two different clubs were also studied. Three-way repeated-measures ANOVA was used to compare methods, muscles, and clubs. Two-way mixed intraclass correlation coefficient (ICC) with absolute agreement was used to determine the reliability of the methods. Club type showed no influence on onset detection. Rectus abdominis (RA) showed the highest agreement between methods. Erector spinae (ES), on the other hand, showed very low agreement, which might be related to postural activity before the swing. External oblique (EO) is the first to be activated, at 1295 ms prior to impact. Activation times are similar between the right and left muscle sides, although the right EO showed better agreement between methods than the left side. Therefore, algorithm usage is task- and muscle-dependent. Copyright © 2013 Elsevier Ltd. All rights reserved.
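    As a rough illustration of threshold-based onset detection of the Method A kind (baseline mean plus a multiple of the baseline SD), the sketch below applies such a threshold to the smoothed envelope of a synthetic rectified EMG trace. The signal, window length, and multiplier k = 5 are illustrative assumptions, not values from the study.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    fs = 1000                               # sampling rate in Hz
    quiet = rng.normal(0, 0.05, fs)         # 1 s of baseline activity
    burst = rng.normal(0, 0.5, fs // 2)     # 0.5 s of muscle activity
    emg = np.concatenate([quiet, burst])

    # Envelope: moving average of the rectified signal (50 ms window).
    win = 50
    env = np.convolve(np.abs(emg), np.ones(win) / win, mode="same")

    # Method-A-style threshold: baseline mean + k * baseline SD.
    baseline = env[:500]
    thr = baseline.mean() + 5.0 * baseline.std()

    above = env > thr
    onset = int(np.argmax(above)) if above.any() else None
    print(onset)   # close to sample 1000, where the burst begins
    ```

    Real EMG pipelines typically add band-pass filtering and a minimum-duration criterion before declaring onset; the point here is only the baseline-derived threshold.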

  13. Development and validation of an algorithm to complete colonoscopy using standard endoscopes in patients with prior incomplete colonoscopy

    PubMed Central

    Rogers, Melinda C.; Gawron, Andrew; Grande, David; Keswani, Rajesh N.

    2017-01-01

    Background and study aims  Incomplete colonoscopy may occur as a result of colon angulation (adhesions or diverticulosis), endoscope looping, or both. Specialty endoscopes/devices have been shown to successfully complete prior incomplete colonoscopies, but may not be widely available. Radiographic or other image-based evaluations have been shown to be effective but may miss small or flat lesions, and colonoscopy is often still indicated if a large lesion is identified. The purpose of this study was to develop and validate an algorithm to determine the optimum endoscope to ensure completion of the examination in patients with prior incomplete colonoscopy. Patients and methods  This was a prospective cohort study of 175 patients with prior incomplete colonoscopy who were referred to a single endoscopist at a single academic medical center over a 3-year period from 2012 through 2015. Colonoscopy outcomes from the initial 50 patients were used to develop an algorithm to determine the optimal standard endoscope and technique to achieve cecal intubation. The algorithm was validated on the subsequent 125 patients. Results  The overall repeat colonoscopy success rate using a standard endoscope was 94 %. The initial standard endoscope specified by the algorithm was used and completed the colonoscopy in 90 % of patients. Conclusions  This study identifies an effective strategy for completing colonoscopy in patients with prior incomplete examination, using widely available standard endoscopes and an algorithm based on patient characteristics and reasons for prior incomplete colonoscopy. PMID:28924595

  14. Evaluation of beef trim sampling methods for detection of Shiga toxin-producing Escherichia coli (STEC)

    USDA-ARS?s Scientific Manuscript database

    Presence of Shiga toxin-producing Escherichia coli (STEC) is a major concern in ground beef. Several methods for sampling beef trim prior to grinding are currently used in the beef industry. The purpose of this study was to determine the efficacy of the sampling methods for detecting STEC in beef ...

  15. Comparison of the Efficiency of Two Flashcard Drill Methods on Children's Reading Performance

    ERIC Educational Resources Information Center

    Joseph, Laurice; Eveleigh, Elisha; Konrad, Moira; Neef, Nancy; Volpe, Robert

    2012-01-01

    The purpose of this study was to extend prior flashcard drill and practice research by holding instructional time constant and allowing learning trials to vary. Specifically, the authors aimed to determine whether an incremental rehearsal method or a traditional drill and practice method was most efficient in helping 5 first-grade children read,…

  16. Automated Probabilistic Reconstruction of White-Matter Pathways in Health and Disease Using an Atlas of the Underlying Anatomy

    PubMed Central

    Yendiki, Anastasia; Panneck, Patricia; Srinivasan, Priti; Stevens, Allison; Zöllei, Lilla; Augustinack, Jean; Wang, Ruopeng; Salat, David; Ehrlich, Stefan; Behrens, Tim; Jbabdi, Saad; Gollub, Randy; Fischl, Bruce

    2011-01-01

    We have developed a method for automated probabilistic reconstruction of a set of major white-matter pathways from diffusion-weighted MR images. Our method is called TRACULA (TRActs Constrained by UnderLying Anatomy) and utilizes prior information on the anatomy of the pathways from a set of training subjects. By incorporating this prior knowledge in the reconstruction procedure, our method obviates the need for manual interaction with the tract solutions at a later stage and thus facilitates the application of tractography to large studies. In this paper we illustrate the application of the method on data from a schizophrenia study and investigate whether the inclusion of both patients and healthy subjects in the training set affects our ability to reconstruct the pathways reliably. We show that, since our method does not constrain the exact spatial location or shape of the pathways but only their trajectory relative to the surrounding anatomical structures, a set of healthy training subjects can be used to reconstruct the pathways accurately in patients as well as in controls. PMID:22016733

  17. Stochastic reconstructions of spectral functions: Application to lattice QCD

    NASA Astrophysics Data System (ADS)

    Ding, H.-T.; Kaczmarek, O.; Mukherjee, Swagato; Ohno, H.; Shu, H.-T.

    2018-05-01

    We present a detailed study of the applications of two stochastic approaches, the stochastic optimization method (SOM) and stochastic analytical inference (SAI), to extract spectral functions from Euclidean correlation functions. SOM has the advantage that it does not require prior information. On the other hand, SAI is a more generalized method based on Bayesian inference. Under the mean field approximation SAI reduces to the often-used maximum entropy method (MEM), and for a specific choice of the prior SAI becomes equivalent to SOM. To test the applicability of these two stochastic methods to lattice QCD, we first apply them to various reasonably chosen model correlation functions and present detailed comparisons of the reconstructed spectral functions obtained from SOM, SAI and MEM. Next, we present similar studies for charmonia correlation functions obtained from lattice QCD computations using clover-improved Wilson fermions on large, fine, isotropic lattices at 0.75 and 1.5 Tc, Tc being the deconfinement transition temperature of a pure gluon plasma. We find that SAI and SOM give results consistent with MEM at these two temperatures.

  18. Conditional maximum-entropy method for selecting prior distributions in Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Abe, Sumiyoshi

    2014-11-01

    The conditional maximum-entropy method (abbreviated here as C-MaxEnt) is formulated for selecting prior probability distributions in Bayesian statistics for parameter estimation. This method is inspired by a statistical-mechanical approach to systems governed by dynamics with largely separated time scales and is based on three key concepts: conjugate pairs of variables, dimensionless integration measures with coarse-graining factors and partial maximization of the joint entropy. The method enables one to calculate a prior purely from a likelihood in a simple way. It is shown, in particular, how it not only yields Jeffreys's rules but also reveals new structures hidden behind them.
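    The Jeffreys rule that C-MaxEnt reproduces can be checked numerically in the simplest case. The sketch below illustrates the standard Jeffreys construction itself (not the C-MaxEnt derivation): it builds the prior for a Bernoulli parameter from the Fisher information and confirms it matches the Beta(1/2, 1/2) shape.

    ```python
    import numpy as np

    # Jeffreys prior: pi(p) proportional to sqrt(I(p)), where the Fisher
    # information for a single Bernoulli trial is I(p) = 1 / (p (1 - p)).
    p = np.linspace(0.01, 0.99, 99)
    jeffreys = np.sqrt(1.0 / (p * (1.0 - p)))
    jeffreys /= jeffreys.sum()                 # normalize on the grid

    # Beta(1/2, 1/2) density shape on the same grid, for comparison.
    beta_half = p ** -0.5 * (1.0 - p) ** -0.5
    beta_half /= beta_half.sum()

    print(np.max(np.abs(jeffreys - beta_half)))  # agrees to machine precision
    ```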

  19. Mendelian randomization with Egger pleiotropy correction and weakly informative Bayesian priors.

    PubMed

    Schmidt, A F; Dudbridge, F

    2017-12-15

    The MR-Egger (MRE) estimator has been proposed to correct for directional pleiotropic effects of genetic instruments in an instrumental variable (IV) analysis. The power of this method is considerably lower than that of conventional estimators, limiting its applicability. Here we propose a novel Bayesian implementation of the MR-Egger estimator (BMRE) and explore the utility of applying weakly informative priors on the intercept term (the pleiotropy estimate) to increase power of the IV (slope) estimate. This was a simulation study to compare the performance of different IV estimators. Scenarios differed in the presence of a causal effect, the presence of pleiotropy, the proportion of pleiotropic instruments and degree of 'Instrument Strength Independent of Direct Effect' (InSIDE) assumption violation. Based on empirical plasma urate data, we present an approach to elucidate a prior distribution for the amount of pleiotropy. A weakly informative prior on the intercept term increased power of the slope estimate while maintaining type 1 error rates close to the nominal value of 0.05. Under the InSIDE assumption, performance was unaffected by the presence or absence of pleiotropy. Violation of the InSIDE assumption biased all estimators, affecting the BMRE more than the MRE method. Depending on the prior distribution, the BMRE estimator has more power at the cost of an increased susceptibility to InSIDE assumption violations. As such the BMRE method is a compromise between the MRE and conventional IV estimators, and may be an especially useful approach to account for observed pleiotropy. © The Author 2017. Published by Oxford University Press on behalf of the International Epidemiological Association.
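    The intercept idea behind MR-Egger can be seen in a deliberately noise-free sketch (illustrative numbers, not from the paper). Every instrument carries the same directional pleiotropy alpha; a regression through the origin (IVW-style) absorbs alpha into the slope, while allowing an intercept recovers both the pleiotropy and the causal effect.

    ```python
    import numpy as np

    rng = np.random.default_rng(42)
    n_snps = 50
    beta = 0.3        # true causal effect of exposure on outcome
    alpha = 0.1       # constant directional pleiotropy (InSIDE holds)

    bx = rng.uniform(0.2, 0.6, n_snps)     # SNP-exposure effects
    by = alpha + beta * bx                 # SNP-outcome effects (noise-free)

    # IVW-style: regression through the origin -> biased when alpha != 0.
    ivw = (bx @ by) / (bx @ bx)

    # MR-Egger-style: allow an intercept, which absorbs the pleiotropy.
    X = np.column_stack([np.ones(n_snps), bx])
    intercept, slope = np.linalg.lstsq(X, by, rcond=None)[0]

    print(round(ivw, 3), round(intercept, 3), round(slope, 3))
    ```

    With sampling noise added, the intercept must be estimated rather than read off exactly, which is the source of MR-Egger's lower power and the motivation for the weakly informative prior on the intercept studied here.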

  20. Soybean Resistance to White Mold: Evaluation of Soybean Germplasm Under Different Conditions and Validation of QTL

    PubMed Central

    Kandel, Ramkrishna; Chen, Charles Y.; Grau, Craig R.; Dorrance, Ann E.; Liu, Jean Q.; Wang, Yang; Wang, Dechun

    2018-01-01

    Soybean (Glycine max L. Merr.) white mold (SWM), caused by Sclerotinia sclerotiorum (Lib.) de Bary, is a devastating fungal disease in the Upper Midwest of the United States and southern Canada. Various methods exist to evaluate for SWM resistance and many quantitative trait loci (QTL) with minor effect governing SWM resistance have been identified in prior studies. This study aimed to predict field resistance to SWM using low-cost and efficient greenhouse inoculation methods and to confirm the QTL reported in previous studies. Three related but independent studies were conducted in the field, greenhouse, and laboratory to evaluate for SWM resistance. The first study evaluated 66 soybean plant introductions (PIs) with known field resistance to SWM using the greenhouse drop-mycelium inoculation method. These 66 PIs were significantly (P < 0.043) different for resistance to SWM. However, year was highly significant (P < 0.00001), while the PI × year interaction was not significant (P < 0.623). The second study compared plant mortality (PM) of 35 soybean breeding lines or varieties under greenhouse inoculation methods with the disease severity index (DSI) in field evaluations. A moderate correlation between PM under the drop-mycelium method and DSI in field trials (r = 0.65, p < 0.0001) was obtained. PM under the spray-mycelium method was also correlated significantly with DSI from field trials (r = 0.51, p < 0.0018). Likewise, a significant correlation (r = 0.62, p < 0.0001) was obtained between PM across greenhouse inoculation methods and DSI across field trials. These findings suggest that greenhouse inoculation methods could predict field resistance to SWM. The third study attempted to validate 33 QTL reported in prior studies using seven populations that comprised a total of 392 F4:6 lines derived from crosses involving a partially resistant cultivar “Skylla,” five partially resistant PIs, and a known susceptible cultivar “E00290.” The estimates of broad-sense heritability (h2) ranged from 0.39 to 0.66 in the populations. Of the seven populations, four had h2 estimates that were significantly different from zero (p < 0.05). Single marker analysis across populations and inoculation methods identified 11 significant SSRs (p < 0.05) corresponding to 10 QTL identified by prior studies. Thus, these five new PIs could be used as new sources of resistant alleles to develop SWM-resistant commercial cultivars. PMID:29731761

  1. Power in Bayesian Mediation Analysis for Small Sample Research

    PubMed Central

    Miočević, Milica; MacKinnon, David P.; Levy, Roy

    2018-01-01

    It was suggested that Bayesian methods have potential for increasing power in mediation analysis (Koopman, Howe, Hollenbeck, & Sin, 2015; Yuan & MacKinnon, 2009). This paper compares the power of Bayesian credibility intervals for the mediated effect to the power of normal theory, distribution of the product, percentile, and bias-corrected bootstrap confidence intervals at N ≤ 200. Bayesian methods with diffuse priors had power comparable to the distribution of the product and bootstrap methods, and Bayesian methods with informative priors had the most power. Varying degrees of precision of prior distributions were also examined. Increased precision led to greater power only when N ≥ 100 and the effects were small, when N < 60 and the effects were large, and when N < 200 and the effects were medium. An empirical example from psychology illustrated a Bayesian analysis of the single mediator model from prior selection to interpreting results. PMID:29662296

  2. Power in Bayesian Mediation Analysis for Small Sample Research.

    PubMed

    Miočević, Milica; MacKinnon, David P; Levy, Roy

    2017-01-01

    It was suggested that Bayesian methods have potential for increasing power in mediation analysis (Koopman, Howe, Hollenbeck, & Sin, 2015; Yuan & MacKinnon, 2009). This paper compares the power of Bayesian credibility intervals for the mediated effect to the power of normal theory, distribution of the product, percentile, and bias-corrected bootstrap confidence intervals at N ≤ 200. Bayesian methods with diffuse priors had power comparable to the distribution of the product and bootstrap methods, and Bayesian methods with informative priors had the most power. Varying degrees of precision of prior distributions were also examined. Increased precision led to greater power only when N ≥ 100 and the effects were small, when N < 60 and the effects were large, and when N < 200 and the effects were medium. An empirical example from psychology illustrated a Bayesian analysis of the single mediator model from prior selection to interpreting results.
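    As a rough sketch of how power for a mediated effect can be estimated by simulation (this uses the classical Sobel z test rather than the paper's Bayesian credibility intervals, and the effect sizes and sample sizes are illustrative):

    ```python
    import numpy as np

    def ols(X, y):
        """OLS coefficients and standard errors."""
        XtX_inv = np.linalg.inv(X.T @ X)
        beta = XtX_inv @ X.T @ y
        resid = y - X @ beta
        sigma2 = resid @ resid / (len(y) - X.shape[1])
        return beta, np.sqrt(sigma2 * np.diag(XtX_inv))

    def sobel_power(n, a=0.39, b=0.39, reps=1000, seed=0):
        """Monte Carlo power of the Sobel test for the mediated effect a*b."""
        rng = np.random.default_rng(seed)
        hits = 0
        for _ in range(reps):
            x = rng.normal(size=n)
            m = a * x + rng.normal(size=n)      # mediator model
            y = b * m + rng.normal(size=n)      # outcome model
            ones = np.ones(n)
            ca, sa = ols(np.column_stack([ones, x]), m)
            cb, sb = ols(np.column_stack([ones, x, m]), y)
            a_hat, se_a = ca[1], sa[1]
            b_hat, se_b = cb[2], sb[2]
            z = a_hat * b_hat / np.sqrt(a_hat**2 * se_b**2 + b_hat**2 * se_a**2)
            hits += abs(z) > 1.96
        return hits / reps

    print(sobel_power(50), sobel_power(100))  # power grows with N
    ```

    A Bayesian version would replace the z test with the posterior credibility interval for a*b excluding zero, but the simulate-and-count structure is the same.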

  3. The Efficacy of Three Learning Methods Collaborative, Context-Based Learning and Traditional, on Learning, Attitude and Behaviour of Undergraduate Nursing Students: Integrating Theory and Practice

    PubMed Central

    Hasanpour-Dehkordi, Ali

    2016-01-01

    Introduction Communication skills training, responsibility, respect, and self-awareness are important indexes of changing learning behaviours in modern approaches. Aim The aim of this study was to investigate the efficacy of three learning approaches, collaborative, context-based learning (CBL), and traditional, on the learning, attitude, and behaviour of undergraduate nursing students. Materials and Methods This study was a clinical trial with a pretest and post-test control-group design. The participants were senior nursing students, randomly assigned to three groups: CBL, collaborative, and traditional. To gather data, a standard questionnaire on students’ behaviour and attitude was administered prior to and after the intervention. The rate of learning was also investigated with a researcher-developed questionnaire prior to and after the intervention in the three groups. Results In the CBL and collaborative training groups, the mean scores of behaviour and attitude increased after the intervention, but no significant difference was found between the mean scores of behaviour and attitude prior to and after the intervention in the traditional group. However, the mean learning score increased significantly in the CBL, collaborative, and traditional groups after the study in comparison to before the study. Conclusion Both the CBL and collaborative approaches were superior to the traditional method in terms of increased respect, self-awareness, self-evaluation, communication skills, and responsibility, as well as increased motivation and learning score. PMID:27190926

  4. A Small-Scale Study on Student Teachers' Perceptions of Classroom Management and Methods for Dealing with Misbehaviour

    ERIC Educational Resources Information Center

    Atici, Meral

    2007-01-01

    The purpose of this study is to identify student teachers' perceptions of classroom management and methods for dealing with misbehaviour. In-depth interviews with nine student teachers at Cukurova University (CU) in Turkey have been conducted twice, prior to and at the end of their teaching practice. Instructional management, behaviour management,…

  5. Teenagers' Explanations of Bullying

    ERIC Educational Resources Information Center

    Thornberg, Robert; Knutsen, Sven

    2011-01-01

    The aim of the present study was to explore how teenagers explain why bullying takes place at school, and whether there were any differences in explaining bullying due to gender and prior bullying experiences. One hundred and seventy-six Swedish students in Grade 9 responded to a questionnaire. Mixed methods (qualitative and quantitative methods)…

  6. Confidence of compliance: a Bayesian approach for percentile standards.

    PubMed

    McBride, G B; Ellis, J C

    2001-04-01

    Rules for assessing compliance with percentile standards commonly limit the number of exceedances permitted in a batch of samples taken over a defined assessment period. Such rules are commonly developed using classical statistical methods. Results from alternative Bayesian methods are presented (using beta-distributed prior information and a binomial likelihood), resulting in "confidence of compliance" graphs. These allow simple reading of the consumer's risk and the supplier's risk for any proposed rule. The influence of the prior assumptions required by the Bayesian technique on the confidence results is demonstrated, using two reference priors (uniform and Jeffreys') and also using optimistic and pessimistic user-defined priors. All four give less pessimistic results than does the classical technique, because interpreting classical results as "confidence of compliance" actually invokes a Bayesian approach with an extreme prior distribution. Jeffreys' prior is shown to be the most generally appropriate choice of prior distribution. Cost savings can be expected using rules based on this approach.
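    The beta-binomial calculation behind such graphs can be sketched in a few lines. The numbers below are illustrative assumptions (a standard permitting an exceedance probability of at most p0 = 0.05 and a Jeffreys Beta(1/2, 1/2) prior), and a closed-form Beta CDF could replace the Monte Carlo step.

    ```python
    import numpy as np

    def confidence_of_compliance(x, n, p0, a=0.5, b=0.5, draws=200_000, seed=0):
        """P(true exceedance probability p <= p0 | x exceedances in n samples).

        Beta(a, b) prior + binomial likelihood -> Beta(a + x, b + n - x)
        posterior; the probability is estimated from posterior samples.
        """
        rng = np.random.default_rng(seed)
        post = rng.beta(a + x, b + n - x, size=draws)
        return (post <= p0).mean()

    # Jeffreys prior Beta(0.5, 0.5); percentile standard with p0 = 0.05.
    c0 = confidence_of_compliance(x=0, n=20, p0=0.05)   # no exceedances
    c3 = confidence_of_compliance(x=3, n=20, p0=0.05)   # three exceedances
    print(round(c0, 3), round(c3, 3))
    ```

    Evaluating this confidence over the possible exceedance counts for a proposed rule yields exactly the consumer's-risk and supplier's-risk readings the abstract describes.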

  7. Optimal Multiple Surface Segmentation With Shape and Context Priors

    PubMed Central

    Bai, Junjie; Garvin, Mona K.; Sonka, Milan; Buatti, John M.; Wu, Xiaodong

    2014-01-01

    Segmentation of multiple surfaces in medical images is a challenging problem, further complicated by the frequent presence of weak boundary evidence, large object deformations, and mutual influence between adjacent objects. This paper reports a novel approach to multi-object segmentation that incorporates both shape and context prior knowledge in a 3-D graph-theoretic framework to help overcome the stated challenges. We employ an arc-based graph representation to incorporate a wide spectrum of prior information through pair-wise energy terms. In particular, a shape-prior term is used to penalize local shape changes and a context-prior term is used to penalize local surface-distance changes from a model of the expected shape and surface distances, respectively. The globally optimal solution for multiple surfaces is obtained by computing a maximum flow in low-order polynomial time. The proposed method was validated on intraretinal layer segmentation of optical coherence tomography images and demonstrated statistically significant improvement of segmentation accuracy compared to our earlier graph-search method that was not utilizing shape and context priors. The mean unsigned surface positioning errors obtained by the conventional graph-search approach (6.30 ± 1.58 μm) was improved to 5.14 ± 0.99 μm when employing our new method with shape and context priors. PMID:23193309

  8. A hierarchical Bayesian approach to adaptive vision testing: A case study with the contrast sensitivity function.

    PubMed

    Gu, Hairong; Kim, Woojae; Hou, Fang; Lesmes, Luis Andres; Pitt, Mark A; Lu, Zhong-Lin; Myung, Jay I

    2016-01-01

    Measurement efficiency is of concern when a large number of observations are required to obtain reliable estimates for parametric models of vision. The standard entropy-based Bayesian adaptive testing procedures addressed the issue by selecting the most informative stimulus in sequential experimental trials. Noninformative, diffuse priors were commonly used in those tests. Hierarchical adaptive design optimization (HADO; Kim, Pitt, Lu, Steyvers, & Myung, 2014) further improves the efficiency of the standard Bayesian adaptive testing procedures by constructing an informative prior using data from observers who have already participated in the experiment. The present study represents an empirical validation of HADO in estimating the human contrast sensitivity function. The results show that HADO significantly improves the accuracy and precision of parameter estimates, and therefore requires many fewer observations to obtain reliable inference about contrast sensitivity, compared to the method of quick contrast sensitivity function (Lesmes, Lu, Baek, & Albright, 2010), which uses the standard Bayesian procedure. The improvement with HADO was maintained even when the prior was constructed from heterogeneous populations or a relatively small number of observers. The results of this case study support the conclusion that HADO can be used in Bayesian adaptive testing by replacing noninformative, diffuse priors with statistically justified informative priors without introducing unwanted bias.

  9. A hierarchical Bayesian approach to adaptive vision testing: A case study with the contrast sensitivity function

    PubMed Central

    Gu, Hairong; Kim, Woojae; Hou, Fang; Lesmes, Luis Andres; Pitt, Mark A.; Lu, Zhong-Lin; Myung, Jay I.

    2016-01-01

    Measurement efficiency is of concern when a large number of observations are required to obtain reliable estimates for parametric models of vision. The standard entropy-based Bayesian adaptive testing procedures addressed the issue by selecting the most informative stimulus in sequential experimental trials. Noninformative, diffuse priors were commonly used in those tests. Hierarchical adaptive design optimization (HADO; Kim, Pitt, Lu, Steyvers, & Myung, 2014) further improves the efficiency of the standard Bayesian adaptive testing procedures by constructing an informative prior using data from observers who have already participated in the experiment. The present study represents an empirical validation of HADO in estimating the human contrast sensitivity function. The results show that HADO significantly improves the accuracy and precision of parameter estimates, and therefore requires many fewer observations to obtain reliable inference about contrast sensitivity, compared to the method of quick contrast sensitivity function (Lesmes, Lu, Baek, & Albright, 2010), which uses the standard Bayesian procedure. The improvement with HADO was maintained even when the prior was constructed from heterogeneous populations or a relatively small number of observers. The results of this case study support the conclusion that HADO can be used in Bayesian adaptive testing by replacing noninformative, diffuse priors with statistically justified informative priors without introducing unwanted bias. PMID:27105061

  10. Extraction of microseismic waveforms characteristics prior to rock burst using Hilbert-Huang transform

    NASA Astrophysics Data System (ADS)

    Li, Xuelong; Li, Zhonghui; Wang, Enyuan; Feng, Junjun; Chen, Liang; Li, Nan; Kong, Xiangguo

    2016-09-01

    This study provides a new research idea concerning rock burst prediction. The characteristics of microseismic (MS) waveforms prior to and during rock burst were studied through the Hilbert-Huang transform (HHT). To demonstrate the advantage of MS feature extraction based on HHT, the conventional analysis method (the Fourier transform) was also used for comparison. The results show that HHT is simple and reliable and can extract in-depth information about the characteristics of MS waveforms. About 10 days prior to the rock burst, the main frequency of the MS waveforms shifts from high frequency to low frequency. Moreover, the waveform energy also exhibits an accumulation characteristic. Based on our results, it can be concluded that MS signal analysis through HHT could provide valuable information about coal or rock deformation and fracture.
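    The high-to-low frequency shift described above is the kind of feature an instantaneous-frequency analysis exposes. The sketch below is illustrative, not the authors' pipeline: it applies an FFT-based analytic signal (the Hilbert-transform step inside HHT) to a synthetic down-chirp rather than real microseismic data, and recovers a falling instantaneous frequency.

    ```python
    import numpy as np

    def analytic_signal(x):
        """Analytic signal via FFT: zero out negative frequencies (even length)."""
        n = len(x)
        X = np.fft.fft(x)
        h = np.zeros(n)
        h[0] = h[n // 2] = 1.0
        h[1:n // 2] = 2.0
        return np.fft.ifft(X * h)

    fs = 1000                      # sampling rate, Hz
    t = np.arange(0, 2, 1 / fs)    # 2 s, even number of samples
    f0, f1 = 100.0, 20.0           # down-chirp: 100 Hz falling to 20 Hz
    phase = 2 * np.pi * (f0 * t + (f1 - f0) / (2 * 2.0) * t**2)
    x = np.sin(phase)

    z = analytic_signal(x)
    inst_freq = np.diff(np.unwrap(np.angle(z))) * fs / (2 * np.pi)

    early = inst_freq[50:150].mean()     # near the start of the record
    late = inst_freq[-150:-50].mean()    # near the end of the record
    print(round(early, 1), round(late, 1))  # frequency falls over time
    ```

    Full HHT would first decompose the waveform into intrinsic mode functions via empirical mode decomposition and then apply this Hilbert step to each mode; the single-component case above shows only the frequency-tracking idea.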

  11. Shape-Driven 3D Segmentation Using Spherical Wavelets

    PubMed Central

    Nain, Delphine; Haker, Steven; Bobick, Aaron; Tannenbaum, Allen

    2013-01-01

    This paper presents a novel active surface segmentation algorithm using a multiscale shape representation and prior. We define a parametric model of a surface using spherical wavelet functions and learn a prior probability distribution over the wavelet coefficients to model shape variations at different scales and spatial locations in a training set. Based on this representation, we derive a parametric active surface evolution using the multiscale prior coefficients as parameters for our optimization procedure to naturally include the prior in the segmentation framework. Additionally, the optimization method can be applied in a coarse-to-fine manner. We apply our algorithm to the segmentation of brain caudate nucleus, of interest in the study of schizophrenia. Our validation shows our algorithm is computationally efficient and outperforms the Active Shape Model algorithm by capturing finer shape details. PMID:17354875

  12. Scalable Learning for Geostatistics and Speaker Recognition

    DTIC Science & Technology

    2011-01-01

    of prior knowledge of the model or due to improved robustness requirements). Both these methods have their own advantages and disadvantages. The use...application. If the data is well-correlated and low-dimensional, any prior knowledge available on the data can be used to build a parametric model. In the...absence of prior knowledge, non-parametric methods can be used. If the data is high-dimensional, PCA-based dimensionality reduction is often the first

  13. Prior-based artifact correction (PBAC) in computed tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heußer, Thorsten, E-mail: thorsten.heusser@dkfz-heidelberg.de; Brehm, Marcus; Ritschl, Ludwig

    2014-02-15

    Purpose: Image quality in computed tomography (CT) often suffers from artifacts which may reduce the diagnostic value of the image. In many cases, these artifacts result from missing or corrupt regions in the projection data, e.g., in the case of metal, truncation, and limited angle artifacts. The authors propose a generalized correction method for different kinds of artifacts resulting from missing or corrupt data by making use of available prior knowledge to perform data completion. Methods: The proposed prior-based artifact correction (PBAC) method requires prior knowledge in the form of a planning CT of the same patient or in the form of a CT scan of a different patient showing the same body region. In both cases, the prior image is registered to the patient image using a deformable transformation. The registered prior is forward projected and data completion of the patient projections is performed using smooth sinogram inpainting. The obtained projection data are used to reconstruct the corrected image. Results: The authors investigate metal and truncation artifacts in patient data sets acquired with a clinical CT and limited angle artifacts in an anthropomorphic head phantom data set acquired with a gantry-based flat detector CT device. In all cases, the corrected images obtained by PBAC are nearly artifact-free. Compared to conventional correction methods, PBAC achieves better artifact suppression while preserving the patient-specific anatomy at the same time. Further, the authors show that prominent anatomical details in the prior image seem to have only minor impact on the correction result. Conclusions: The results show that PBAC has the potential to effectively correct for metal, truncation, and limited angle artifacts if adequate prior data are available. Since the proposed method makes use of a generalized algorithm, PBAC may also be applicable to other artifacts resulting from missing or corrupt data.

  14. Relationship between suicidality and impulsivity in bipolar I disorder: a diffusion tensor imaging study

    PubMed Central

    Mahon, Katie; Burdick, Katherine E; Wu, Jinghui; Ardekani, Babak A; Szeszko, Philip R

    2012-01-01

    Background Impulsivity is characteristic of individuals with bipolar disorder and may be a contributing factor to the high rate of suicide in patients with this disorder. Although white matter abnormalities have been implicated in the pathophysiology of bipolar disorder, their relationship to impulsivity and suicidality in this disorder has not been well-investigated. Methods Diffusion tensor imaging scans were acquired in 14 bipolar disorder patients with a prior suicide attempt, 15 bipolar disorder patients with no prior suicide attempt, and 15 healthy volunteers. Bipolar disorder patients received clinical assessments including measures of impulsivity, depression, mania, and anxiety. Images were processed using the Tract-Based Spatial Statistics method in the FSL software package. Results Bipolar disorder patients with a prior suicide attempt had lower fractional anisotropy (FA) within the left orbital frontal white matter (p < 0.05, corrected) and higher overall impulsivity compared to patients without a previous suicide attempt. Among patients with a prior suicide attempt, FA in the orbital frontal white matter region correlated inversely with motor impulsivity. Conclusions Abnormal orbital frontal white matter may play a role in impulsive and suicidal behavior among patients with bipolar disorder. PMID:22329475

  15. Order priors for Bayesian network discovery with an application to malware phylogeny

    DOE PAGES

    Oyen, Diane; Anderson, Blake; Sentz, Kari; ...

    2017-09-15

    Here, Bayesian networks have been used extensively to model and discover dependency relationships among sets of random variables. We learn Bayesian network structure with a combination of human knowledge about the partial ordering of variables and statistical inference of conditional dependencies from observed data. Our approach leverages complementary information from human knowledge and inference from observed data to produce networks that reflect human beliefs about the system as well as to fit the observed data. Applying prior beliefs about partial orderings of variables is an approach distinctly different from existing methods that incorporate prior beliefs about direct dependencies (or edges) in a Bayesian network. We provide an efficient implementation of the partial-order prior in a Bayesian structure discovery learning algorithm, as well as an edge prior, showing that both priors meet the local modularity requirement necessary for an efficient Bayesian discovery algorithm. In benchmark studies, the partial-order prior improves the accuracy of Bayesian network structure learning as well as the edge prior, even though order priors are more general. Our primary motivation is in characterizing the evolution of families of malware to aid cyber security analysts. For the problem of malware phylogeny discovery, we find that our algorithm, compared to existing malware phylogeny algorithms, more accurately discovers true dependencies that are missed by other algorithms.

  17. Improving Thermal Dose Accuracy in Magnetic Resonance-Guided Focused Ultrasound Surgery: Long-Term Thermometry Using a Prior Baseline as a Reference

    PubMed Central

    Bitton, Rachel R.; Webb, Taylor D.; Pauly, Kim Butts; Ghanouni, Pejman

    2015-01-01

    Purpose To investigate thermal dose volume (TDV) and non-perfused volume (NPV) of magnetic resonance-guided focused ultrasound (MRgFUS) treatments in patients with soft tissue tumors, and describe a method for MR thermal dosimetry using a baseline reference. Materials and Methods Agreement between TDV and immediate post-treatment NPV was evaluated from MRgFUS treatments of five patients with biopsy-proven desmoid tumors. Thermometry data (gradient echo, 3T) were analyzed over the entire course of the treatments to discern temperature errors in the standard approach. The technique searches previously acquired baseline images for a match using 2D normalized cross-correlation and a weighted mean of phase difference images. Thermal dose maps and TDVs were recalculated using the matched baseline and compared to NPV. Results TDV and NPV disagreed by 47%–91% using the standard immediate-baseline method for calculating TDV. Long-term thermometry showed a nonlinear local temperature accrual, where the peak additional temperature varied between 4–13°C (mean = 7.8°C) across patients. The prior baseline method could be implemented by finding a previously acquired matching baseline 61% ± 8% (mean ± SD) of the time. We found 7%–42% of the disagreement between TDV and NPV was due to errors in thermometry caused by heat accrual. For all patients, the prior baseline method increased the estimated treatment volume and reduced the discrepancies between TDV and NPV (P = 0.023). Conclusion This study reveals a mismatch between in-treatment and post-treatment efficacy measures. The prior baseline approach accounts for local heating and improves the accuracy of thermal dose-predicted volume. PMID:26119129
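
    The baseline-matching step via 2D normalized cross-correlation can be sketched as follows. This is a simplified illustration: real MR thermometry data are complex-valued, and the paper's technique also uses a weighted mean of phase-difference images, omitted here; the image library, sizes, and noise level are hypothetical.

```python
import numpy as np

def ncc(a, b):
    # Zero-lag 2D normalized cross-correlation of two same-size images.
    a = a - a.mean()
    b = b - b.mean()
    return float((a * b).sum() / np.sqrt((a * a).sum() * (b * b).sum()))

def best_baseline(current, baselines):
    # Return the index and score of the stored baseline most similar
    # to the current magnitude image.
    scores = [ncc(current, b) for b in baselines]
    i = int(np.argmax(scores))
    return i, scores[i]

rng = np.random.default_rng(1)
baselines = [rng.random((32, 32)) for _ in range(5)]   # hypothetical library
current = baselines[3] + 0.05 * rng.random((32, 32))   # noisy re-acquisition
idx, score = best_baseline(current, baselines)
```

    A match threshold on the score would decide whether a prior baseline is usable for a given dynamic, consistent with the reported ~61% match rate.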

  18. Clustering and Bayesian hierarchical modeling for the definition of informative prior distributions in hydrogeology

    NASA Astrophysics Data System (ADS)

    Cucchi, K.; Kawa, N.; Hesse, F.; Rubin, Y.

    2017-12-01

    In order to reduce uncertainty in the prediction of subsurface flow and transport processes, practitioners should use all data available. However, classic inverse modeling frameworks typically only make use of information contained in in-situ field measurements to provide estimates of hydrogeological parameters. Such hydrogeological information about an aquifer is difficult and costly to acquire. In this data-scarce context, the transfer of ex-situ information coming from previously investigated sites can be critical for improving predictions by better constraining the estimation procedure. Bayesian inverse modeling provides a coherent framework to represent such ex-situ information by virtue of the prior distribution and combine it with in-situ information from the target site. In this study, we present an innovative data-driven approach for defining such informative priors for hydrogeological parameters at the target site. Our approach consists in two steps, both relying on statistical and machine learning methods. The first step is data selection; it consists in selecting sites similar to the target site. We use clustering methods for selecting similar sites based on observable hydrogeological features. The second step is data assimilation; it consists in assimilating data from the selected similar sites into the informative prior. We use a Bayesian hierarchical model to account for inter-site variability and to allow for the assimilation of multiple types of site-specific data. We present the application and validation of these methods on an established database of hydrogeological parameters. Data and methods are implemented in the form of an open-source R-package and therefore facilitate easy use by other practitioners.

  19. Fast approximation for joint optimization of segmentation, shape, and location priors, and its application in gallbladder segmentation.

    PubMed

    Saito, Atsushi; Nawano, Shigeru; Shimizu, Akinobu

    2017-05-01

    This paper addresses joint optimization for segmentation and shape priors, including translation, to overcome inter-subject variability in the location of an organ. Because a simple extension of the previous exact optimization method is too computationally complex, we propose a fast approximation for optimization. The effectiveness of the proposed approximation is validated in the context of gallbladder segmentation from a non-contrast computed tomography (CT) volume. After spatial standardization and estimation of the posterior probability of the target organ, simultaneous optimization of the segmentation, shape, and location priors is performed using a branch-and-bound method. Fast approximation is achieved by combining sampling in the eigenshape space to reduce the number of shape priors and an efficient computational technique for evaluating the lower bound. Performance was evaluated using threefold cross-validation of 27 CT volumes. Optimization in terms of translation of the shape prior significantly improved segmentation performance. The proposed method achieved a result of 0.623 on the Jaccard index in gallbladder segmentation, which is comparable to that of state-of-the-art methods. The computational efficiency of the algorithm is confirmed to be good enough to allow execution on a personal computer. Joint optimization of the segmentation, shape, and location priors was proposed, and it proved to be effective in gallbladder segmentation with high computational efficiency.

  20. Uncertainty in training image-based inversion of hydraulic head data constrained to ERT data: Workflow and case study

    NASA Astrophysics Data System (ADS)

    Hermans, Thomas; Nguyen, Frédéric; Caers, Jef

    2015-07-01

    In inverse problems, investigating uncertainty in the posterior distribution of model parameters is as important as matching data. In recent years, most efforts have focused on techniques to sample the posterior distribution with reasonable computational costs. Within a Bayesian context, this posterior depends on the prior distribution. However, most studies ignore modeling the prior with realistic geological uncertainty. In this paper, we propose a workflow inspired by a Popper-Bayes philosophy that data should first be used to falsify models, then only be considered for matching. We propose a workflow consisting of three steps: (1) in defining the prior, we interpret multiple alternative geological scenarios from literature (architecture of facies) and site-specific data (proportions of facies). Prior spatial uncertainty is modeled using multiple-point geostatistics, where each scenario is defined using a training image. (2) We validate these prior geological scenarios by simulating electrical resistivity tomography (ERT) data on realizations of each scenario and comparing them to field ERT in a lower dimensional space. In this second step, the idea is to probabilistically falsify scenarios with ERT, meaning that scenarios which are incompatible receive an updated probability of zero while compatible scenarios receive a nonzero updated belief. (3) We constrain the hydrogeological model with hydraulic head and ERT using a stochastic search method. The workflow is applied to a synthetic and a field case study in an alluvial aquifer. This study highlights the importance of considering and estimating prior uncertainty (without data) through a process of probabilistic falsification.

  1. Assessing the Relationship between Family Mealtime Communication and Adolescent Emotional Well-Being Using the Experience Sampling Method

    ERIC Educational Resources Information Center

    Offer, Shira

    2013-01-01

    While most prior research has focused on the frequency of family meals, the issue of which elements of family mealtime are most salient for adolescents' well-being has remained overlooked. The current study used the experience sampling method, a unique form of time diary, and survey data drawn from the 500 Family Study (N = 237 adolescents with…

  2. Fast Low-Rank Bayesian Matrix Completion With Hierarchical Gaussian Prior Models

    NASA Astrophysics Data System (ADS)

    Yang, Linxiao; Fang, Jun; Duan, Huiping; Li, Hongbin; Zeng, Bing

    2018-06-01

    The problem of low-rank matrix completion is considered in this paper. To exploit the underlying low-rank structure of the data matrix, we propose a hierarchical Gaussian prior model, where columns of the low-rank matrix are assumed to follow a Gaussian distribution with zero mean and a common precision matrix, and a Wishart distribution is specified as a hyperprior over the precision matrix. We show that such a hierarchical Gaussian prior has the potential to encourage a low-rank solution. Based on the proposed hierarchical prior model, a variational Bayesian method is developed for matrix completion, where the generalized approximate message passing (GAMP) technique is embedded into the variational Bayesian inference in order to circumvent cumbersome matrix inverse operations. Simulation results show that our proposed method demonstrates superiority over existing state-of-the-art matrix completion methods.
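
    As a point of reference for the matrix-completion task described above, a much simpler classical baseline, iterative SVD soft-thresholding (SoftImpute), can be sketched. This is explicitly not the paper's variational Bayesian GAMP method; the matrix sizes, rank, observation rate, and regularization weight are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

# Ground-truth low-rank matrix and a random observation mask.
m, n, r = 40, 30, 3
M = rng.standard_normal((m, r)) @ rng.standard_normal((r, n))
mask = rng.random((m, n)) < 0.6               # ~60% of entries observed

def soft_impute(M, mask, lam=1.0, iters=200):
    # Iterative SVD soft-thresholding (SoftImpute): alternate between
    # filling missing entries with the current estimate and shrinking
    # singular values, which drives the estimate toward low rank.
    X = np.zeros_like(M)
    for _ in range(iters):
        filled = np.where(mask, M, X)         # keep observed, impute the rest
        U, s, Vt = np.linalg.svd(filled, full_matrices=False)
        X = (U * np.maximum(s - lam, 0.0)) @ Vt
    return X

X = soft_impute(M, mask)
# Relative error on the unobserved entries only.
rel_err = np.linalg.norm((X - M)[~mask]) / np.linalg.norm(M[~mask])
```

    The Bayesian hierarchical prior in the paper plays a role analogous to the singular-value shrinkage here, but with the regularization strength inferred from the data rather than fixed.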

  3. Preliminary Results of Cleaning Process for Lubricant Contamination

    NASA Astrophysics Data System (ADS)

    Eisenmann, D.; Brasche, L.; Lopez, R.

    2006-03-01

    Fluorescent penetrant inspection (FPI) is widely used for aviation and other components for surface-breaking crack detection. As with all inspection methods, adherence to the process parameters is critical to the successful detection of defects. Prior to FPI, components are cleaned using a variety of cleaning methods which are selected based on the alloy and the soil types which must be removed. It is also important that the cleaning process not adversely affect the FPI process. There are a variety of lubricants and surface coatings used in the aviation industry which must be removed prior to FPI. To assess the effectiveness of typical cleaning processes on removal of these contaminants, a study was initiated at an airline overhaul facility. Initial results of the cleaning study for lubricant contamination in nickel, titanium and aluminum alloys will be presented.

  4. Balancing the Role of Priors in Multi-Observer Segmentation Evaluation

    PubMed Central

    Huang, Xiaolei; Wang, Wei; Lopresti, Daniel; Long, Rodney; Antani, Sameer; Xue, Zhiyun; Thoma, George

    2009-01-01

    Comparison of a group of multiple observer segmentations is known to be a challenging problem. A good segmentation evaluation method would allow different segmentations not only to be compared, but to be combined to generate a “true” segmentation with higher consensus. Numerous multi-observer segmentation evaluation approaches have been proposed in the literature, and STAPLE in particular probabilistically estimates the true segmentation by optimal combination of observed segmentations and a prior model of the truth. As an Expectation–Maximization (EM) algorithm, STAPLE’s convergence to the desired local minimum depends on good initializations for the truth prior and the observer-performance prior. However, accurate modeling of the initial truth prior is nontrivial. Moreover, among the two priors, the truth prior always dominates, so that in certain scenarios when meaningful observer-performance priors are available, STAPLE cannot take advantage of that information. In this paper, we propose a Bayesian decision formulation of the problem that permits the two types of prior knowledge to be integrated in a complementary manner in four cases with differing application purposes: (1) with known truth prior; (2) with observer prior; (3) with neither truth prior nor observer prior; and (4) with both truth prior and observer prior. The third and fourth cases are not discussed (or effectively ignored) by STAPLE, and in our research we propose a new method to combine multiple-observer segmentations based on the maximum a posteriori (MAP) principle, which respects the observer prior regardless of the availability of the truth prior. Based on the four scenarios, we have developed a web-based software application that implements the flexible segmentation evaluation framework for digitized uterine cervix images. Experimental results show that our framework has flexibility in effectively integrating different priors for multi-observer segmentation evaluation, and it also generates results that compare favorably to those of the STAPLE algorithm and the Majority Vote Rule. PMID:20523759
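
    The MAP combination with a fixed observer-performance prior can be sketched per pixel for binary labels, given known per-observer sensitivity and specificity. The toy labels and performance values below are illustrative, not from the paper.

```python
import numpy as np

def map_fusion(segs, sens, spec, p_fg=0.5):
    # Per-pixel MAP combination of binary segmentations given a fixed
    # observer-performance prior (per-observer sensitivity and specificity).
    segs = np.asarray(segs, dtype=float)      # shape: (n_observers, H, W)
    log_fg = np.log(p_fg)                     # truth prior P(label = 1)
    log_bg = np.log(1.0 - p_fg)
    for s, p, q in zip(segs, sens, spec):
        # Accumulate per-observer log-likelihoods for each hypothesis.
        log_fg = log_fg + s * np.log(p) + (1 - s) * np.log(1 - p)
        log_bg = log_bg + s * np.log(1 - q) + (1 - s) * np.log(q)
    return (log_fg > log_bg).astype(int)

# Toy example: three observers label a 1x5 strip; observer 2 is unreliable.
segs = [[[1, 1, 0, 0, 1]],
        [[1, 0, 1, 0, 0]],
        [[1, 1, 0, 0, 1]]]
sens = [0.9, 0.6, 0.9]                        # illustrative performance values
spec = [0.9, 0.6, 0.9]
fused = map_fusion(segs, sens, spec)          # unreliable observer is outvoted
```

    Unlike plain majority vote, the observer prior downweights the unreliable observer even when the truth prior is uninformative.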

  5. Pre-Service Teachers' Beliefs about Knowledge, Mathematics, and Science

    ERIC Educational Resources Information Center

    Cady, Jo Ann; Rearden, Kristin

    2007-01-01

    This study examines the beliefs of K-8 preservice teachers during a content methods course. The goals of this course included exposing the preservice teachers to student-centered instructional methods for math and science and encouraging the development of lessons that would integrate mathematics and science. Prior research suggested that one must…

  6. Integrative Bayesian variable selection with gene-based informative priors for genome-wide association studies.

    PubMed

    Zhang, Xiaoshuai; Xue, Fuzhong; Liu, Hong; Zhu, Dianwen; Peng, Bin; Wiemels, Joseph L; Yang, Xiaowei

    2014-12-10

    Genome-wide Association Studies (GWAS) are typically designed to identify phenotype-associated single nucleotide polymorphisms (SNPs) individually using univariate analysis methods. Though providing valuable insights into genetic risks of common diseases, the genetic variants identified by GWAS generally account for only a small proportion of the total heritability for complex diseases. To solve this "missing heritability" problem, we implemented a strategy called integrative Bayesian Variable Selection (iBVS), which is based on a hierarchical model that incorporates an informative prior by considering the gene interrelationship as a network. It was applied here to both simulated and real data sets. Simulation studies indicated that the iBVS method was advantageous in its performance, with the highest AUC in both variable selection and outcome prediction, when compared to Stepwise- and LASSO-based strategies. In an analysis of a leprosy case-control study, iBVS selected 94 SNPs as predictors, while LASSO selected 100 SNPs. The Stepwise regression yielded a more parsimonious model with only 3 SNPs. The prediction results demonstrated that the iBVS method had comparable performance with that of LASSO, but better than the Stepwise strategy. The proposed iBVS strategy is a novel and valid method for Genome-wide Association Studies, with the additional advantage that it produces more interpretable posterior probabilities for each variable, unlike LASSO and other penalized regression methods.

  7. DERMAL AND MOUTHING TRANSFERS OF SURFACE RESIDUES MEASURED USING FLUORESCENCE IMAGING

    EPA Science Inventory

    To reduce the uncertainty associated with current estimates of children's exposure to pesticides by dermal contact and non-dietary ingestion, residue transfer data are required. Prior to conducting exhaustive studies, a screening study to develop and test methods for measuring...

  8. Detecting and Estimating Contamination of Human DNA Samples in Sequencing and Array-Based Genotype Data

    PubMed Central

    Jun, Goo; Flickinger, Matthew; Hetrick, Kurt N.; Romm, Jane M.; Doheny, Kimberly F.; Abecasis, Gonçalo R.; Boehnke, Michael; Kang, Hyun Min

    2012-01-01

    DNA sample contamination is a serious problem in DNA sequencing studies and may result in systematic genotype misclassification and false positive associations. Although methods exist to detect and filter out cross-species contamination, few methods to detect within-species sample contamination are available. In this paper, we describe methods to identify within-species DNA sample contamination based on (1) a combination of sequencing reads and array-based genotype data, (2) sequence reads alone, and (3) array-based genotype data alone. Analysis of sequencing reads allows contamination detection after sequence data is generated but prior to variant calling; analysis of array-based genotype data allows contamination detection prior to generation of costly sequence data. Through a combination of analysis of in silico and experimentally contaminated samples, we show that our methods can reliably detect and estimate levels of contamination as low as 1%. We evaluate the impact of DNA contamination on genotype accuracy and propose effective strategies to screen for and prevent DNA contamination in sequencing studies. PMID:23103226

  9. Inverse modeling of Asian (222)Rn flux using surface air (222)Rn concentration.

    PubMed

    Hirao, Shigekazu; Yamazawa, Hiromi; Moriizumi, Jun

    2010-11-01

    When used with an atmospheric transport model, the (222)Rn flux distribution estimated in our previous study using soil transport theory caused underestimation of atmospheric (222)Rn concentrations as compared with measurements in East Asia. In this study, we applied a Bayesian synthesis inverse method to produce revised estimates of the annual (222)Rn flux density in Asia by using atmospheric (222)Rn concentrations measured at seven sites in East Asia. The Bayesian synthesis inverse method requires a prior estimate of the flux distribution and its uncertainties. The atmospheric transport model MM5/HIRAT and our previous estimate of the (222)Rn flux distribution as the prior value were used to generate new flux estimates for the eastern half of the Eurasian continent, divided into 10 regions. The (222)Rn flux densities estimated using the Bayesian inversion technique were generally higher than the prior flux densities. The area-weighted average (222)Rn flux density for Asia was estimated to be 33.0 mBq m(-2) s(-1), which is substantially higher than the prior value (16.7 mBq m(-2) s(-1)). The estimated (222)Rn flux densities decrease with increasing latitude as follows: Southeast Asia (36.7 mBq m(-2) s(-1)); East Asia (28.6 mBq m(-2) s(-1)) including China, the Korean Peninsula, and Japan; and Siberia (14.1 mBq m(-2) s(-1)). The increase of the newly estimated fluxes in Southeast Asia, China, Japan, and the southern part of Eastern Siberia over the prior ones contributed most significantly to improved agreement of the model-calculated concentrations with the atmospheric measurements. The sensitivity analysis of prior flux errors and effects of locally exhaled (222)Rn showed that the estimated fluxes in Northern and Central China, Korea, Japan, and the southern part of Eastern Siberia were robust, but that in Central Asia had a large uncertainty.
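
    The Gaussian form of a Bayesian synthesis inversion can be sketched with a toy linear transport operator. Everything numerical here is a stand-in except the prior flux value 16.7 mBq m(-2) s(-1), which is taken from the abstract: the sensitivity matrix, covariances, and noise level are assumed for illustration.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical setup: 10 source regions observed at 7 measurement sites.
n_reg, n_obs = 10, 7
H = rng.random((n_obs, n_reg))                 # transport sensitivities (stand-in)
x_true = rng.uniform(10.0, 40.0, n_reg)        # "true" regional flux densities
y = H @ x_true + rng.normal(0.0, 0.5, n_obs)   # observed concentrations

x_prior = np.full(n_reg, 16.7)                 # prior flux density (from the study)
C = np.eye(n_reg) * 15.0**2                    # prior covariance (assumed)
R = np.eye(n_obs) * 0.5**2                     # observation-error covariance

# Bayesian synthesis inversion: Gaussian posterior-mean update.
K = C @ H.T @ np.linalg.inv(H @ C @ H.T + R)   # gain matrix
x_post = x_prior + K @ (y - H @ x_prior)       # revised flux estimates
```

    With 7 observations and 10 regions the problem is underdetermined, so the prior and its covariance control how the data adjustment is spread across regions, which is why prior-error sensitivity analysis matters in the study.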

  10. A novel method for improving cerussite sulfidization

    NASA Astrophysics Data System (ADS)

    Feng, Qi-cheng; Wen, Shu-ming; Zhao, Wen-juan; Cao, Qin-bo; Lü, Chao

    2016-06-01

    Evaluation of flotation behavior, solution measurements, and surface analyses were performed to investigate the effects of chloride ion addition on the sulfidization of cerussite in this study. Micro-flotation tests indicate that the addition of chloride ions prior to sulfidization can significantly increase the flotation recovery of cerussite, which is attributed to the formation of more lead sulfide species on the mineral surface. Solution measurement results suggest that the addition of chloride ions prior to sulfidization induces the transformation of more sulfide ions from pulp solution onto the mineral surface by the formation of more lead sulfide species. X-ray diffraction and energy-dispersive spectroscopy indicate that more lead sulfide species form on the mineral surface when chloride ions are added prior to sulfidization. These results demonstrate that the addition of chloride ions prior to sulfidization can significantly improve the sulfidization of cerussite, thereby enhancing the flotation performance.

  11. Calibrated tree priors for relaxed phylogenetics and divergence time estimation.

    PubMed

    Heled, Joseph; Drummond, Alexei J

    2012-01-01

    The use of fossil evidence to calibrate divergence time estimation has a long history. More recently, Bayesian Markov chain Monte Carlo has become the dominant method of divergence time estimation, and fossil evidence has been reinterpreted as the specification of prior distributions on the divergence times of calibration nodes. These so-called "soft calibrations" have become widely used, but the statistical properties of calibrated tree priors in a Bayesian setting have not been carefully investigated. Here, we clarify that calibration densities, such as those defined in BEAST 1.5, do not represent the marginal prior distribution of the calibration node. We illustrate this with a number of analytical results on small trees. We also describe an alternative construction for a calibrated Yule prior on trees that allows direct specification of the marginal prior distribution of the calibrated divergence time, with or without the restriction of monophyly. This method requires the computation of the Yule prior conditional on the height of the divergence being calibrated. Unfortunately, a practical solution for multiple calibrations remains elusive. Our results suggest that direct estimation of the prior induced by specifying multiple calibration densities should be a prerequisite of any divergence time dating analysis.

  12. Estimating parameter of Rayleigh distribution by using Maximum Likelihood method and Bayes method

    NASA Astrophysics Data System (ADS)

    Ardianti, Fitri; Sutarman

    2018-01-01

    In this paper, we use Maximum Likelihood estimation and the Bayes method under several risk functions to estimate the parameter of the Rayleigh distribution and determine which method performs best. The prior used in the Bayes method is Jeffreys’ non-informative prior. Maximum Likelihood estimation and the Bayes method under the precautionary loss function, the entropy loss function, and the L1 loss function are compared. We compare these methods by bias and MSE values using the R program. The results are then displayed in tables to facilitate comparisons.
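
    The two basic estimators compared here have closed forms and can be sketched directly (in Python rather than the paper's R; squared-error loss is shown for the Bayes estimate, the precautionary and entropy loss variants are omitted, and the sample size is illustrative). For Rayleigh data, the MLE of the scale satisfies sigma_hat^2 = sum(x^2)/(2n), and under Jeffreys' prior p(sigma) ∝ 1/sigma the posterior of sigma^2 is Inverse-Gamma(n, sum(x^2)/2).

```python
import numpy as np

rng = np.random.default_rng(4)

def rayleigh_mle(x):
    # Closed-form MLE of the Rayleigh scale: sigma_hat^2 = sum(x^2) / (2n).
    return float(np.sqrt(np.sum(x**2) / (2 * x.size)))

def rayleigh_bayes_jeffreys(x):
    # With Jeffreys' prior p(sigma) ∝ 1/sigma, the posterior of sigma^2 is
    # Inverse-Gamma(n, sum(x^2)/2); its mean (the squared-error-loss Bayes
    # estimate of sigma^2) is (sum(x^2)/2) / (n - 1).
    n = x.size
    T = np.sum(x**2) / 2.0
    return float(np.sqrt(T / (n - 1)))

sigma_true = 2.0
u = 1.0 - rng.random(5000)                     # uniform samples in (0, 1]
x = sigma_true * np.sqrt(-2.0 * np.log(u))     # inverse-CDF Rayleigh samples
mle = rayleigh_mle(x)
bayes = rayleigh_bayes_jeffreys(x)
```

    Comparing bias and MSE of the two estimators over repeated simulated samples, as the paper does in R, is a straightforward loop over this setup.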

  13. Numerical optimization using flow equations.

    PubMed

    Punk, Matthias

    2014-12-01

    We develop a method for multidimensional optimization using flow equations. This method is based on homotopy continuation in combination with a maximum entropy approach. Extrema of the optimizing functional correspond to fixed points of the flow equation. While ideas based on Bayesian inference such as the maximum entropy method always depend on a prior probability, the additional step in our approach is to perform a continuous update of the prior during the homotopy flow. The prior probability thus enters the flow equation only as an initial condition. We demonstrate the applicability of this optimization method for two paradigmatic problems in theoretical condensed matter physics: numerical analytic continuation from imaginary to real frequencies and finding (variational) ground states of frustrated (quantum) Ising models with random or long-range antiferromagnetic interactions.
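
    The core fixed-point idea, extrema of the objective appearing as fixed points of a flow equation, can be sketched with plain gradient flow integrated by forward Euler. This deliberately omits the paper's homotopy continuation and maximum-entropy prior update, and uses a toy quadratic objective for illustration.

```python
import numpy as np

def grad_flow(grad, x0, step=0.01, iters=5000):
    # Forward-Euler integration of the flow dx/ds = -grad f(x);
    # fixed points of the flow (grad f = 0) are extrema of f,
    # and following -grad selects minima.
    x = np.asarray(x0, dtype=float)
    for _ in range(iters):
        x = x - step * grad(x)
    return x

# Toy objective f(x, y) = (x - 3)^2 + (y + 1)^2 with minimum at (3, -1).
def grad_f(p):
    return np.array([2.0 * (p[0] - 3.0), 2.0 * (p[1] + 1.0)])

x_star = grad_flow(grad_f, [0.0, 0.0])         # flows to the fixed point (3, -1)
```

    In the paper's method, the flow additionally carries a continuously updated prior, so the prior enters only through the initial condition of the flow.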

  15. Accuracy of Recalled Body Weight – A Study with 20-years of Follow-up

    PubMed Central

    Dahl, Anna K; Reynolds, Chandra A

    2013-01-01

    Objective Weight changes may be an important indicator of an ongoing pathological process. Retrospective self-report might be the only way to capture prior weight. The objective of the study was to evaluate the accuracy of retrospective recall of body weight in old age and the factors that might predict accuracy. Design and Methods In 2007, 646 participants (mean age, 71.6 years) of the Swedish Adoption/Twin Study of Aging (SATSA) answered questions about their present weight and how much they weighed 20 years ago. Of these, 436 had self-reported their weight 20 years earlier, and among these, 134 had also had their weight assessed at that time point. Results Weight recalled 20 years later underestimated the prior assessed weight by 1.89 ± 5.9 kg and underestimated prior self-reported weight by 0.55 ± 5.2 kg. Moreover, 82.4% of the sample were accurate within 10%, and 45.8% within 5%, of their prior assessed weights; similarly, 84.2% and 58.0% were accurate within 10% and 5%, respectively, of their prior self-reported weight. A higher current body mass index and a preference for reporting weights ending in zero or five were associated with underestimation of prior weight, while greater weight change over the 20 years and low Mini-Mental State Examination (MMSE) scores (<25) led to overestimation of prior weight. Conclusions Recalled weight comes close to the assessed population mean, but at the individual level there is large variation. Accuracy is affected by current BMI, changes in weight, end-digit preferences, and current cognitive ability. Recalled weight should be used with caution. PMID:23913738

  16. Methods for calculating confidence and credible intervals for the residual between-study variance in random effects meta-regression models

    PubMed Central

    2014-01-01

    Background Meta-regression is increasingly used to model study-level covariate effects. However, this type of statistical analysis presents many difficulties and challenges. Here two methods for calculating confidence intervals for the magnitude of the residual between-study variance in random effects meta-regression models are developed. A further suggestion for calculating credible intervals using informative prior distributions for the residual between-study variance is presented. Methods Two recently proposed and, under the assumptions of the random effects model, exact methods for constructing confidence intervals for the between-study variance in random effects meta-analyses are extended to the meta-regression setting. The use of Generalised Cochran heterogeneity statistics is extended to the meta-regression setting and a Newton-Raphson procedure is developed to implement the Q profile method for meta-analysis and meta-regression. WinBUGS is used to implement informative priors for the residual between-study variance in the context of Bayesian meta-regressions. Results Results are obtained for two contrasting examples, where the first example involves a binary covariate and the second involves a continuous covariate. Intervals for the residual between-study variance are wide for both examples. Conclusions Statistical methods, and R computer software, are available to compute exact confidence intervals for the residual between-study variance under the random effects model for meta-regression. These frequentist methods are almost as easily implemented as their established counterparts for meta-analysis. Bayesian meta-regressions are also easily performed by analysts who are comfortable using WinBUGS. Estimates of the residual between-study variance in random effects meta-regressions should be routinely reported and accompanied by some measure of their uncertainty. Confidence and/or credible intervals are well-suited to this purpose. PMID:25196829
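    The Q profile method mentioned above can be sketched for the plain meta-analysis case (no covariates). This is a simplified illustration: the paper uses a Newton-Raphson procedure and the meta-regression extension, while the sketch below inverts the generalised Q statistic against chi-square quantiles with scipy's `brentq` root finder; the effect estimates and within-study variances are made-up toy data.

    ```python
    import numpy as np
    from scipy import stats, optimize

    # toy meta-analysis: effect estimates y and within-study variances v (hypothetical)
    y = np.array([0.10, 0.35, 0.20, 0.55, 0.41, 0.12, 0.28])
    v = np.array([0.030, 0.020, 0.015, 0.050, 0.010, 0.040, 0.025])
    k = len(y)

    def Q(tau2):
        """Generalised Cochran Q at a candidate between-study variance tau2."""
        w = 1.0 / (v + tau2)
        mu = np.sum(w * y) / np.sum(w)   # weighted mean under this tau2
        return np.sum(w * (y - mu) ** 2)

    # Q(tau2) decreases in tau2; a 95% CI inverts Q against chi-square quantiles (k-1 df)
    lo_q = stats.chi2.ppf(0.975, k - 1)
    hi_q = stats.chi2.ppf(0.025, k - 1)
    tau2_lo = 0.0 if Q(0.0) < lo_q else optimize.brentq(lambda t: Q(t) - lo_q, 0.0, 10.0)
    tau2_hi = 0.0 if Q(0.0) < hi_q else optimize.brentq(lambda t: Q(t) - hi_q, 0.0, 10.0)
    print(f"95% CI for tau^2: [{tau2_lo:.4f}, {tau2_hi:.4f}]")
    ```

    Truncating limits at zero when Q(0) already falls inside the chi-square band is standard; the resulting intervals are typically wide for small k, as the abstract's examples also show.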

  17. ℓ1-Regularized full-waveform inversion with prior model information based on orthant-wise limited memory quasi-Newton method

    NASA Astrophysics Data System (ADS)

    Dai, Meng-Xue; Chen, Jing-Bo; Cao, Jian

    2017-07-01

    Full-waveform inversion (FWI) is an ill-posed optimization problem which is sensitive to noise and to the initial model. To alleviate the ill-posedness of the problem, regularization techniques are usually adopted. The ℓ1-norm penalty is a robust regularization method that preserves contrasts and edges. The Orthant-Wise Limited-memory Quasi-Newton (OWL-QN) method extends the widely used limited-memory Broyden-Fletcher-Goldfarb-Shanno (L-BFGS) method to ℓ1-regularized optimization problems and inherits the efficiency of L-BFGS. To take advantage of both the ℓ1-regularized method and the prior model information obtained from sonic logs and geological information, we implement the OWL-QN algorithm in ℓ1-regularized FWI with prior model information in this paper. Numerical experiments show that this method not only improves the inversion results but also has strong anti-noise ability.
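    The objective used here, a data misfit plus an ℓ1 penalty on the deviation from a prior model, can be illustrated on a tiny linear inverse problem. The sketch below uses ISTA (proximal gradient) rather than OWL-QN, as a simpler stand-in for the same kind of ℓ1-regularized objective; the matrix, prior model, and penalty weight are all hypothetical.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    m, n = 40, 60                         # underdetermined toy "inversion"
    A = rng.standard_normal((m, n))
    x_prior = np.zeros(n); x_prior[:10] = 1.0   # hypothetical prior model
    x_true = x_prior.copy(); x_true[30] = 2.0   # truth deviates sparsely from the prior
    b = A @ x_true + 0.01 * rng.standard_normal(m)

    # minimize 0.5 ||A x - b||^2 + lam ||x - x_prior||_1 by proximal gradient (ISTA)
    lam = 0.1
    L = np.linalg.norm(A, 2) ** 2         # Lipschitz constant of the misfit gradient
    x = x_prior.copy()
    for _ in range(3000):
        z = x - (A.T @ (A @ x - b)) / L   # gradient step on the data misfit
        d = z - x_prior                   # soft-threshold the deviation from the prior
        x = x_prior + np.sign(d) * np.maximum(np.abs(d) - lam / L, 0.0)

    print(np.sum(np.abs(x - x_prior) > 1e-3))  # few nonzero deviations from the prior
    ```

    The ℓ1 term drives the recovered model to agree with the prior except where the data insist otherwise, which is the edge/contrast-preserving behaviour the abstract relies on.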

  18. Use of prior knowledge for the analysis of high-throughput transcriptomics and metabolomics data

    PubMed Central

    2014-01-01

    Background High-throughput omics technologies have enabled the measurement of many genes or metabolites simultaneously. The resulting high-dimensional experimental data pose significant challenges to transcriptomics and metabolomics data analysis methods, which may yield spurious rather than biologically relevant results. One strategy to improve the results is the incorporation of prior biological knowledge in the analysis. This strategy is used to reduce the solution space and/or to focus the analysis on biologically meaningful regions. In this article, we review a selection of these methods used in transcriptomics and metabolomics. We group the reviewed methods into three categories based on the underlying mathematical model: exploratory methods, supervised methods and estimation of the covariance matrix. We discuss which prior knowledge has been used, how it is incorporated and how it modifies the mathematical properties of the underlying methods. PMID:25033193

  19. dPIRPLE: a joint estimation framework for deformable registration and penalized-likelihood CT image reconstruction using prior images

    NASA Astrophysics Data System (ADS)

    Dang, H.; Wang, A. S.; Sussman, Marc S.; Siewerdsen, J. H.; Stayman, J. W.

    2014-09-01

    Sequential imaging studies are conducted in many clinical scenarios. Prior images from previous studies contain a great deal of patient-specific anatomical information and can be used in conjunction with subsequent imaging acquisitions to maintain image quality while enabling radiation dose reduction (e.g., through sparse angular sampling, reduction in fluence, etc.). However, patient motion between images in such sequences results in misregistration between the prior image and current anatomy. Existing prior-image-based approaches often include only a simple rigid registration step that can be insufficient for capturing complex anatomical motion, introducing detrimental effects in subsequent image reconstruction. In this work, we propose a joint framework that estimates the 3D deformation between an unregistered prior image and the current anatomy (based on a subsequent data acquisition) and reconstructs the current anatomical image using a model-based reconstruction approach that includes regularization based on the deformed prior image. This framework is referred to as deformable prior image registration, penalized-likelihood estimation (dPIRPLE). Central to this framework is the inclusion of a 3D B-spline-based free-form-deformation model into the joint registration-reconstruction objective function. The proposed framework is solved using a maximization strategy whereby alternating updates to the registration parameters and image estimates are applied, allowing for improvements in both the registration and reconstruction throughout the optimization process. Cadaver experiments were conducted on a cone-beam CT testbench emulating a lung nodule surveillance scenario. Superior reconstruction accuracy and image quality were demonstrated using the dPIRPLE algorithm as compared to more traditional reconstruction methods including filtered backprojection, penalized-likelihood estimation (PLE), prior image penalized-likelihood estimation (PIPLE) without registration, and prior image penalized-likelihood estimation with rigid registration of a prior image (PIRPLE) over a wide range of sampling sparsity and exposure levels.
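    The alternating-update idea at the core of this framework, register the prior to the current estimate, then reconstruct with a penalty toward the registered prior, can be caricatured in 1D. The sketch below is only a structural analogue of dPIRPLE: it uses an integer shift in place of a B-spline deformation and penalized least squares in place of penalized-likelihood CT reconstruction, with all signals and weights invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    n = 200
    t = np.arange(n)
    truth = np.exp(-0.5 * ((t - 120) / 8.0) ** 2)   # current anatomy: feature at 120
    prior = np.exp(-0.5 * ((t - 100) / 8.0) ** 2)   # prior image: same feature, misregistered
    y = truth + 0.1 * rng.standard_normal(n)        # noisy current acquisition

    beta = 4.0                                      # weight of the prior-image penalty
    x = y.copy()
    shift = 0
    for _ in range(10):
        # "registration" step: shift of the prior that best matches the current estimate
        shift = min(range(-40, 41), key=lambda s: np.sum((x - np.roll(prior, s)) ** 2))
        p_s = np.roll(prior, shift)
        # "reconstruction" step: penalized least squares with the registered prior
        # (closed form for this quadratic objective)
        x = (y + beta * p_s) / (1.0 + beta)

    print(shift)  # recovered misregistration, near the true 20-sample shift
    ```

    Each pass improves both the registration (the prior lines up with the evolving image) and the reconstruction (the registered prior denoises the image), mirroring the alternating maximization described above.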

  20. Mind wandering during film comprehension: The role of prior knowledge and situational interest.

    PubMed

    Kopp, Kristopher; Mills, Caitlin; D'Mello, Sidney

    2016-06-01

    This study assessed the occurrence of mind wandering (MW), and the factors that influence it, in the domain of film comprehension. The cascading model of inattention assumes that a stronger mental representation (i.e., a situation model) during comprehension results in less MW. Accordingly, a suppression hypothesis suggests that MW would decrease as a function of knowing the plot of a film prior to viewing, because prior knowledge would help to strengthen the situation model during comprehension. Furthermore, an interest-moderation hypothesis would predict that the suppression effect of prior knowledge would only emerge when there was interest in viewing the film. In the current experiment, 108 participants either read a short story that depicted the plot (i.e., prior-knowledge condition) or read an unrelated story of equal length (control condition) prior to viewing the short film (32.5 minutes) entitled The Red Balloon. Participants self-reported their interest in viewing the film immediately before the film was presented. MW was tracked using a self-report method targeting instances of MW with metacognitive awareness. Participants in the prior-knowledge condition reported less MW compared with the control condition, thereby supporting the suppression hypothesis. MW also decreased over the duration of the film, but only for those with prior knowledge of the film. Finally, prior-knowledge effects on MW were only observed when interest was average or high, but not when interest was low.

  1. The Efficacy of Three Learning Methods Collaborative, Context-Based Learning and Traditional, on Learning, Attitude and Behaviour of Undergraduate Nursing Students: Integrating Theory and Practice.

    PubMed

    Hasanpour-Dehkordi, Ali; Solati, Kamal

    2016-04-01

    Communication skills training, responsibility, respect, and self-awareness are important indexes of changing learning behaviours in modern approaches. The aim of this study was to investigate the efficacy of three learning approaches, collaborative, context-based learning (CBL), and traditional, on the learning, attitude, and behaviour of undergraduate nursing students. This study was a clinical trial with a pretest, a post-test, and a control group. The participants were senior nursing students, randomly assigned to three groups: CBL, collaborative, and traditional. To gather data, a standard questionnaire of students' behaviour and attitude was administered before and after the intervention. The rate of learning was also assessed with a researcher-developed questionnaire administered before and after the intervention in all three groups. In the CBL and collaborative training groups, the mean scores of behaviour and attitude increased after the intervention, but no significant difference was found between the mean scores of behaviour and attitude before and after the intervention in the traditional group. The mean learning score, however, increased significantly in all three groups after the study. Both the CBL and collaborative approaches were superior to the traditional method in terms of increased respect, self-awareness, self-evaluation, communication skills and responsibility, as well as increased motivation and learning scores.

  2. A METHOD TO REMOVE ENVIRONMENTAL INHIBITORS PRIOR TO THE DETECTION OF WATERBORNE ENTERIC VIRUSES BY REVERSE TRANSCRIPTION-POLYMERASE CHAIN REACTION

    EPA Science Inventory

    A method was developed to remove environmental inhibitors from sample concentrates prior to detection of human enteric viruses using the reverse transcription-polymerase chain reaction (RT-PCR). Environmental inhibitors, concentrated along with viruses during water sample processi...

  3. Prior adversities predict posttraumatic stress reactions in adolescents following the Oslo Terror events 2011

    PubMed Central

    Nordanger, Dag Ø.; Breivik, Kyrre; Haugland, Bente Storm; Lehmann, Stine; Mæhle, Magne; Braarud, Hanne Cecilie; Hysing, Mari

    2014-01-01

    Background Previous studies suggest that prior exposure to adverse experiences such as violence or sexual abuse increases vulnerability to posttraumatic stress reactions in victims of subsequent trauma. However, little is known about how such a history affects responses to terror in the general adolescent population. Objective To explore the role of prior exposure to adverse experiences as a risk factor for posttraumatic stress reactions to the Oslo Terror events. Method We used data from 10,220 high school students in a large cross-sectional survey of adolescents in Norway that took place seven months after the Oslo Terror events. The prior exposures assessed were direct exposure to violence, witnessing of violence, and unwanted sexual acts. We explored how these prior adversities interact with well-established risk factors such as proximity to the events, perceived life threat during the terror events, and gender. Results All types of prior exposure, as well as the other risk factors, were associated with terror-related posttraumatic stress reactions. The effects of prior adversities were, although small, independent of adolescents' proximity to the terror events. Among prior adversities, only the effect of direct exposure to violence was moderated by perceived life threat. Exposure to prior adversities increased the risk of posttraumatic stress reactions equally for both genders, but proximity to the terror events and perceived life threat increased the risk more in females. Conclusions Terror events can have a more destabilizing impact on victims of prior adversities, independent of their level of exposure. The findings may be relevant to mental health workers and others providing post-trauma health care. PMID:24872862

  4. Approaching Multidimensional Forms of Knowledge through Personal Meaning Mapping in Science Integrating Teaching outside the Classroom

    ERIC Educational Resources Information Center

    Hartmeyer, Rikke; Bølling, Mads; Bentsen, Peter

    2017-01-01

    Current research points to Personal Meaning Mapping (PMM) as a method useful in investigating students' prior and current science knowledge. However, studies investigating PMM as a method for exploring specific knowledge dimensions are lacking. Ensuring that students are able to access specific knowledge dimensions is important, especially in…

  5. A Mixed-Method Exploration of School Organizational and Social Relationship Factors That Influence Dropout-Decision Making in a Rural High School

    ERIC Educational Resources Information Center

    Farina, Andrea J.

    2013-01-01

    This explanatory mixed-method study explored the dropout phenomenon from an ecological perspective identifying the school organizational (academics, activities, structure) and social relationship (teachers, peers) factors that most significantly influence students' decisions to leave school prior to graduation at a rural high school in south…

  6. Illness Progression, Recent Stress and Morphometry of Hippocampal Subfields and Medial Prefrontal Cortex in Major Depression

    PubMed Central

    Treadway, Michael T.; Waskom, Michael L.; Dillon, Daniel G.; Holmes, Avram J.; Park, Min Tae M.; Chakravarty, M. Mallar; Dutra, Sunny J.; Polli, Frida E.; Iosifescu, Dan V.; Fava, Maurizio; Gabrieli, John D.E.; Pizzagalli, Diego A.

    2014-01-01

    Background Longitudinal studies of illness progression in Major Depressive Disorder (MDD) indicate that the onset of subsequent depressive episodes becomes increasingly decoupled from external stressors. A possible mechanism underlying this phenomenon is that multiple episodes induce long-lasting neurobiological changes that confer increased risk for recurrence. Prior morphometric studies have frequently reported volumetric reductions in MDD, especially in medial prefrontal cortex (mPFC) and the hippocampus, but few studies have investigated whether these changes are exacerbated by prior episodes. Methods We used structural magnetic resonance imaging (sMRI) to examine relationships between number of prior episodes, current stress, and brain volume and cortical thickness in a sample of 103 medication-free depressed patients and never-depressed controls. Volumetric analyses of the hippocampus were performed using a recently-validated subfield segmentation approach, while cortical thickness estimates were obtained using Vertex-Based Cortical Thickness (VBCT). Participants were grouped on the basis of the number of prior depressive episodes as well as current depressive state. Results Number of prior episodes was associated with both lower reported stress levels as well as reduced volume in the dentate gyrus. Cortical thinning of the left medial prefrontal cortex (mPFC) was associated with a greater number of prior depressive episodes, but not current depressive state. Conclusions Collectively, these findings are consistent with preclinical models suggesting that the dentate gyrus and mPFC are especially vulnerable to stress exposure, and provide evidence for morphometric changes that are consistent with stress-sensitization models of recurrence in MDD. PMID:25109665

  7. Illness progression, recent stress, and morphometry of hippocampal subfields and medial prefrontal cortex in major depression.

    PubMed

    Treadway, Michael T; Waskom, Michael L; Dillon, Daniel G; Holmes, Avram J; Park, Min Tae M; Chakravarty, M Mallar; Dutra, Sunny J; Polli, Frida E; Iosifescu, Dan V; Fava, Maurizio; Gabrieli, John D E; Pizzagalli, Diego A

    2015-02-01

    Longitudinal studies of illness progression in patients with major depressive disorder (MDD) indicate that the onset of subsequent depressive episodes becomes increasingly decoupled from external stressors. A possible mechanism underlying this phenomenon is that multiple episodes induce long-lasting neurobiological changes that confer increased risk for recurrence. Prior morphometric studies have frequently reported volumetric reductions in patients with MDD, especially in medial prefrontal cortex (mPFC) and the hippocampus, but few studies have investigated whether these changes are exacerbated by prior episodes. In a sample of 103 medication-free patients with depression and control subjects with no history of depression, structural magnetic resonance imaging was performed to examine relationships between number of prior episodes, current stress, hippocampal subfield volume and cortical thickness. Volumetric analyses of the hippocampus were performed using a recently validated subfield segmentation approach, and cortical thickness estimates were obtained using vertex-based methods. Participants were grouped on the basis of the number of prior depressive episodes and current depressive diagnosis. Number of prior episodes was associated with both lower reported stress levels and reduced volume in the dentate gyrus. Cortical thinning of the left mPFC was associated with a greater number of prior depressive episodes but not current depressive diagnosis. Collectively, these findings are consistent with preclinical models suggesting that the dentate gyrus and mPFC are especially vulnerable to stress exposure and provide evidence for morphometric changes that are consistent with stress-sensitization models of recurrence in MDD. Copyright © 2015 Society of Biological Psychiatry. Published by Elsevier Inc. All rights reserved.

  8. Image segmentation with a novel regularized composite shape prior based on surrogate study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Tingting, E-mail: tingtingzhao@mednet.ucla.edu; Ruan, Dan, E-mail: druan@mednet.ucla.edu

    Purpose: Incorporating training into image segmentation is a good approach to achieve additional robustness. This work aims to develop an effective strategy to utilize shape prior knowledge, so that the segmentation label evolution can be driven toward the desired global optimum. Methods: In the variational image segmentation framework, a regularization for the composite shape prior is designed to incorporate the geometric relevance of individual training data to the target, which is inferred by an image-based surrogate relevance metric. Specifically, this regularization is imposed on the linear weights of composite shapes and serves as a hyperprior. The overall problem is formulatedmore » in a unified optimization setting and a variational block-descent algorithm is derived. Results: The performance of the proposed scheme is assessed in both corpus callosum segmentation from an MR image set and clavicle segmentation based on CT images. The resulted shape composition provides a proper preference for the geometrically relevant training data. A paired Wilcoxon signed rank test demonstrates statistically significant improvement of image segmentation accuracy, when compared to multiatlas label fusion method and three other benchmark active contour schemes. Conclusions: This work has developed a novel composite shape prior regularization, which achieves superior segmentation performance than typical benchmark schemes.« less

  9. Analyzing small data sets using Bayesian estimation: the case of posttraumatic stress symptoms following mechanical ventilation in burn survivors

    PubMed Central

    van de Schoot, Rens; Broere, Joris J.; Perryck, Koen H.; Zondervan-Zwijnenburg, Mariëlle; van Loey, Nancy E.

    2015-01-01

    Background The analysis of small data sets in longitudinal studies can lead to power issues and often suffers from biased parameter values. These issues can be solved by using Bayesian estimation in conjunction with informative prior distributions. By means of a simulation study and an empirical example concerning posttraumatic stress symptoms (PTSS) following mechanical ventilation in burn survivors, we demonstrate the advantages and potential pitfalls of using Bayesian estimation. Methods First, we show how to specify prior distributions, and by means of a sensitivity analysis we demonstrate how to check the exact influence of prior (mis-)specification. Thereafter, we show by means of a simulation the situations in which the Bayesian approach outperforms the default maximum likelihood approach. Finally, we re-analyze empirical data on burn survivors which provided preliminary evidence of an aversive influence of a period of mechanical ventilation on the course of PTSS following burns. Results Not surprisingly, maximum likelihood estimation showed insufficient coverage as well as power with very small samples. Only when Bayesian analysis, in conjunction with informative priors, was used did power increase to acceptable levels. As expected, we showed that the smaller the sample size, the more the results rely on the prior specification. Conclusion We show that two issues often encountered during analysis of small samples, power and biased parameters, can be solved by including prior information in Bayesian analysis. We argue that the use of informative priors should always be reported together with a sensitivity analysis. PMID:25765534
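    The sensitivity point above, that smaller samples lean harder on the prior, is easy to demonstrate with a conjugate normal model. The sketch below is a toy with a deliberately misspecified informative prior (all numbers hypothetical); the "pull" measures how far the posterior mean moves away from the sample mean toward the prior mean.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    sigma = 1.0                       # known data SD (toy assumption)
    mu_prior, sd_prior = 2.0, 0.5     # informative prior, deliberately misspecified
    population_mean = 0.0             # the data are actually centred at zero

    pull = {}                         # distance the prior drags the posterior mean
    for n in (5, 20, 100):
        y = rng.normal(population_mean, sigma, size=n)
        prec_prior, prec_data = 1.0 / sd_prior**2, n / sigma**2
        post_mean = (prec_prior * mu_prior + prec_data * y.mean()) / (prec_prior + prec_data)
        pull[n] = abs(post_mean - y.mean())
        print(f"n={n:3d}  sample mean={y.mean():+.3f}  posterior mean={post_mean:+.3f}")
    ```

    The pull shrinks as n grows, which is exactly why the authors insist that informative priors for small samples be accompanied by a sensitivity analysis.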

  11. Improving Bayesian credibility intervals for classifier error rates using maximum entropy empirical priors.

    PubMed

    Gustafsson, Mats G; Wallman, Mikael; Wickenberg Bolin, Ulrika; Göransson, Hanna; Fryknäs, M; Andersson, Claes R; Isaksson, Anders

    2010-06-01

    Successful use of classifiers that learn to make decisions from a set of patient examples requires robust methods for performance estimation. Recently, many promising approaches for determining an upper bound for the error rate of a single classifier have been reported, but the Bayesian credibility interval (CI) obtained from a conventional holdout test still delivers one of the tightest bounds. However, the conventional Bayesian CI becomes unacceptably large in real-world applications where test set sizes are less than a few hundred. The source of this problem is the fact that the CI is determined exclusively by the result on the test examples; the uniform prior density distribution employed reflects a complete lack of prior knowledge about the unknown error rate and therefore contributes no information. The aim of the study reported here was therefore to investigate a maximum entropy (ME) based approach to improved prior knowledge and Bayesian CIs, demonstrating its relevance for biomedical research and clinical practice. It is demonstrated how a refined non-uniform prior density distribution can be obtained by means of the ME principle using empirical results from a few designs and tests on non-overlapping sets of examples. Experimental results show that ME-based priors improve the CIs when applied to four quite different simulated and two real-world data sets. An empirically derived ME prior seems promising for improving the Bayesian CI for the unknown error rate of a designed classifier. Copyright 2010 Elsevier B.V. All rights reserved.
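    The basic mechanism, a non-uniform prior narrowing the credibility interval for a small test set, can be shown with a beta-binomial model. The informative Beta counts below are hypothetical stand-ins (the paper derives its refined prior via the maximum entropy principle from earlier designs, which is not reproduced here); the uniform prior is Beta(1,1) as in the conventional analysis.

    ```python
    from scipy import stats

    k, n = 3, 40   # 3 errors observed on a small holdout set of 40 examples
    priors = {
        "uniform": (1.0, 1.0),        # no prior knowledge of the error rate
        "informative": (2.0, 18.0),   # hypothetical counts, e.g. distilled from prior designs
    }

    width = {}
    for name, (a, b) in priors.items():
        post = stats.beta(a + k, b + n - k)          # conjugate posterior on the error rate
        lo, hi = post.ppf(0.025), post.ppf(0.975)    # equal-tail 95% credibility interval
        width[name] = hi - lo
        print(f"{name:11s} prior: 95% CI [{lo:.3f}, {hi:.3f}], width {hi - lo:.3f}")
    ```

    Any prior mass concentrated near plausible error rates tightens the interval relative to the uniform prior, which is the effect the ME-based construction exploits in a principled way.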

  12. Improving thermal dose accuracy in magnetic resonance-guided focused ultrasound surgery: Long-term thermometry using a prior baseline as a reference.

    PubMed

    Bitton, Rachel R; Webb, Taylor D; Pauly, Kim Butts; Ghanouni, Pejman

    2016-01-01

    To investigate thermal dose volume (TDV) and non-perfused volume (NPV) of magnetic resonance-guided focused ultrasound (MRgFUS) treatments in patients with soft tissue tumors, and to describe a method for MR thermal dosimetry using a prior baseline as a reference. Agreement between TDV and immediate post-treatment NPV was evaluated from MRgFUS treatments of five patients with biopsy-proven desmoid tumors. Thermometry data (gradient echo, 3T) were analyzed over the entire course of the treatments to discern temperature errors in the standard approach. The technique searches previously acquired baseline images for a match using 2D normalized cross-correlation and a weighted mean of phase difference images. Thermal dose maps and TDVs were recalculated using the matched baseline and compared to NPV. TDV and NPV showed 47%-91% disagreement when using the standard immediate-baseline method for calculating TDV. Long-term thermometry showed a nonlinear local temperature accrual, where the peak additional temperature varied between 4-13°C (mean = 7.8°C) across patients. The prior baseline method could be implemented by finding a previously acquired matching baseline 61% ± 8% (mean ± SD) of the time. We found 7%-42% of the disagreement between TDV and NPV was due to errors in thermometry caused by heat accrual. For all patients, the prior baseline method increased the estimated treatment volume and reduced the discrepancies between TDV and NPV (P = 0.023). This study presents a mismatch between in-treatment and post-treatment efficacy measures. The prior baseline approach accounts for local heating and improves the accuracy of thermal dose-predicted volume. © 2015 Wiley Periodicals, Inc.
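    The baseline-matching step described above, searching a library of previously acquired baselines for the best match via normalized cross-correlation, can be sketched with toy images. This shows only the matching criterion; the actual method also combines a weighted mean of phase-difference images, and the image sizes and noise level below are arbitrary.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    def ncc(a, b):
        """Normalized cross-correlation of two images of equal shape."""
        a = (a - a.mean()) / a.std()
        b = (b - b.mean()) / b.std()
        return float(np.mean(a * b))

    # library of previously acquired baseline images (toy 32x32 magnitude images)
    baselines = [rng.normal(size=(32, 32)) for _ in range(10)]

    # the current image happens to resemble baseline 7 plus noise
    # (e.g., same anatomical position as an earlier acquisition)
    current = baselines[7] + 0.3 * rng.normal(size=(32, 32))

    scores = [ncc(current, b) for b in baselines]
    best = int(np.argmax(scores))
    print(best, round(scores[best], 3))
    ```

    Picking the baseline with the highest correlation lets the phase-difference (and hence temperature) map be referenced to an acquisition made before the long-term heat accrual, which is what restores the dose estimate.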

  13. Genetic Classification of Populations Using Supervised Learning

    PubMed Central

    Bridges, Michael; Heron, Elizabeth A.; O'Dushlaine, Colm; Segurado, Ricardo; Morris, Derek; Corvin, Aiden; Gill, Michael; Pinto, Carlos

    2011-01-01

    There are many instances in genetics in which we wish to determine whether two candidate populations are distinguishable on the basis of their genetic structure. Examples include populations which are geographically separated, case–control studies and quality control (when participants in a study have been genotyped at different laboratories). This latter application is of particular importance in the era of large-scale genome-wide association studies, when collections of individuals genotyped at different locations are being merged to provide increased power. The traditional method for detecting structure within a population is some form of exploratory technique such as principal components analysis. Such methods, which do not utilise our prior knowledge of the membership of the candidate populations, are termed unsupervised. Supervised methods, on the other hand, are able to utilise this prior knowledge when it is available. In this paper we demonstrate that in such cases modern supervised approaches are a more appropriate tool for detecting genetic differences between populations. We apply two such methods (neural networks and support vector machines) to the classification of three populations (two from Scotland and one from Bulgaria). The sensitivity exhibited by both these methods is considerably higher than that attained by principal components analysis and in fact comfortably exceeds a recently conjectured theoretical limit on the sensitivity of unsupervised methods. In particular, our methods can distinguish between the two Scottish populations, where principal components analysis cannot. We suggest, on the basis of our results, that a supervised learning approach should be the method of choice when classifying individuals into pre-defined populations, particularly in quality control for large-scale genome-wide association studies. PMID:21589856
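    The supervised-versus-unsupervised gap described above can be reproduced on synthetic data with plain numpy. As a stand-in for neural networks and SVMs, the sketch uses the simplest supervised projection (difference of class means); as the unsupervised baseline it uses the first principal component. All population parameters are invented: the key feature is a high-variance axis unrelated to the labels, which PCA latches onto while the label-aware direction does not.

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    n, d = 400, 50
    shift = np.zeros(d); shift[3] = 1.5          # small mean difference between populations
    X0 = rng.standard_normal((n, d))
    X1 = rng.standard_normal((n, d)) + shift
    X0[:, 0] *= 3.0; X1[:, 0] *= 3.0             # high-variance axis unrelated to the labels

    X = np.vstack([X0, X1])
    labels = np.r_[np.zeros(n), np.ones(n)]
    Xc = X - X.mean(0)

    # unsupervised: first principal component (dominated by the noisy high-variance axis)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    proj_pca = Xc @ Vt[0]

    # supervised: difference-of-class-means direction (uses the population labels)
    w = X1.mean(0) - X0.mean(0)
    proj_sup = Xc @ (w / np.linalg.norm(w))

    def accuracy(proj):
        pred = (proj > proj.mean()).astype(float)
        return max(np.mean(pred == labels), np.mean(pred != labels))

    acc_pca, acc_sup = accuracy(proj_pca), accuracy(proj_sup)
    print(acc_pca, acc_sup)
    ```

    PCA finds directions of maximal variance, not of maximal class separation, so whenever the between-population signal is weak relative to label-irrelevant variance, even a crude supervised projection wins.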

  14. Mixed linear-non-linear inversion of crustal deformation data: Bayesian inference of model, weighting and regularization parameters

    NASA Astrophysics Data System (ADS)

    Fukuda, Jun'ichi; Johnson, Kaj M.

    2010-06-01

    We present a unified theoretical framework and solution method for probabilistic, Bayesian inversions of crustal deformation data. The inversions involve multiple data sets with unknown relative weights, model parameters that are related linearly or non-linearly through theoretical models to observations, prior information on model parameters and regularization priors to stabilize underdetermined problems. To efficiently handle non-linear inversions in which some of the model parameters are linearly related to the observations, this method combines both analytical least-squares solutions and a Monte Carlo sampling technique. In this method, model parameters that are linearly and non-linearly related to observations, relative weights of multiple data sets and relative weights of prior information and regularization priors are determined in a unified Bayesian framework. In this paper, we define the mixed linear-non-linear inverse problem, outline the theoretical basis for the method, provide a step-by-step algorithm for the inversion, validate the inversion method using synthetic data and apply the method to two real data sets. We apply the method to inversions of multiple geodetic data sets with unknown relative data weights for interseismic fault slip and locking depth. We also apply the method to the problem of estimating the spatial distribution of coseismic slip on faults with unknown fault geometry, relative data weights and smoothing regularization weight.
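    The key computational idea, solving the linear parameters analytically inside a search over the non-linear ones, can be sketched on a toy problem. The exponential-decay forward model, the parameter values, and the flat priors below are hypothetical stand-ins for the paper's geodetic models; only the structure (closed-form least squares nested in a scan over the non-linear parameter) reflects the method.

```python
# Sketch of a mixed linear-non-linear inversion: the data are linear in the
# amplitude a but non-linear in the decay time tau, so for each candidate tau
# the best-fitting a has a closed-form least-squares value.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 10, 60)
a_true, tau_true, sigma = 2.0, 3.0, 0.1        # hypothetical truth and noise
d = a_true * np.exp(-t / tau_true) + rng.normal(0, sigma, t.size)

taus = np.linspace(0.5, 8.0, 400)      # grid over the non-linear parameter
log_post = np.empty_like(taus)
a_hat = np.empty_like(taus)
for i, tau in enumerate(taus):
    g = np.exp(-t / tau)               # design vector for this tau
    a_hat[i] = (g @ d) / (g @ g)       # analytic least-squares amplitude
    r = d - a_hat[i] * g
    log_post[i] = -0.5 * (r @ r) / sigma**2    # flat priors assumed

best = np.argmax(log_post)
print(f"tau ~ {taus[best]:.2f}, a ~ {a_hat[best]:.2f}")
```

    In the paper's setting, the grid scan is replaced by Monte Carlo sampling and the scalar closed form by a full weighted least-squares solve, but the nesting is the same.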

  15. Application of the Markov Chain Monte Carlo method for snow water equivalent retrieval based on passive microwave measurements

    NASA Astrophysics Data System (ADS)

    Pan, J.; Durand, M. T.; Vanderjagt, B. J.

    2015-12-01

    The Markov Chain Monte Carlo (MCMC) method is a retrieval algorithm based on Bayes' rule: it starts from an initial state of snow/soil parameters and updates it to a series of new states by comparing the posterior probabilities of the simulated snow microwave signals before and after each random-walk step. It thereby approximates the probability distribution of the snow/soil parameters conditioned on the measured microwave TB signals at different bands. Although this method can solve for all snow parameters, including depth, density, snow grain size and temperature, at the same time, it still needs prior information on these parameters for the posterior probability calculation, and how the priors influence the SWE retrieval is a major concern. Therefore, in this paper a sensitivity test is first carried out to study how accurate the snow emission models, and how explicit the snow priors, need to be to keep the SWE error within a given bound. Synthetic TB simulated from the measured snow properties, plus a 2-K observation error, is used for this purpose, with the aim of providing guidance on applying MCMC under different circumstances. The method is then applied to snowpits at different sites, including Sodankyla (Finland), Churchill (Canada) and Colorado (USA), using TB measured by ground-based radiometers at different bands. Building on the previous work, the errors in these practical cases are studied, and the error sources are separated and quantified.
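    The random-walk update described above is standard Metropolis sampling, which can be sketched generically. The linear "emission model", the prior numbers, and the step size below are invented for illustration; only the 2-K observation error echoes the abstract, and the real retrieval would use a radiative-transfer model in place of the toy forward function.

```python
# Generic random-walk Metropolis sketch of the retrieval idea: infer a scalar
# "SWE" x from a noisy simulated brightness temperature, with a Gaussian prior.
import numpy as np

rng = np.random.default_rng(2)
forward = lambda x: 250.0 - 0.5 * x      # toy emission model (hypothetical)
x_true, obs_err = 100.0, 2.0             # 2-K observation error, as in text
tb_obs = forward(x_true) + rng.normal(0, obs_err)
prior_mean, prior_sd = 80.0, 30.0        # invented prior on SWE

def log_post(x):
    """Log-posterior: Gaussian likelihood on TB plus Gaussian prior on x."""
    return (-0.5 * ((tb_obs - forward(x)) / obs_err) ** 2
            - 0.5 * ((x - prior_mean) / prior_sd) ** 2)

x, chain = prior_mean, []
for _ in range(20000):
    prop = x + rng.normal(0, 5.0)        # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(x):
        x = prop                         # accept the new state
    chain.append(x)
post = np.array(chain[5000:])            # discard burn-in
print(f"posterior mean SWE ~ {post.mean():.1f}")
```

    The sensitivity question the abstract raises corresponds to tightening or loosening `prior_sd` and perturbing `forward`, then watching how the posterior mean moves.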

  16. Adaptation of red blood cell lysis represents a fundamental breakthrough that improves the sensitivity of Salmonella detection in blood

    PubMed Central

    Boyd, MA; Tennant, SM; Melendez, JH; Toema, D; Galen, JE; Geddes, CD; Levine, MM

    2015-01-01

    Aims Isolation of Salmonella Typhi from blood culture is the standard diagnostic for confirming typhoid fever but it is unavailable in many developing countries. We previously described a Microwave Accelerated Metal Enhanced Fluorescence (MAMEF)-based assay to detect Salmonella in medium. Attempts to detect Salmonella in blood were unsuccessful, presumably due to the interference of erythrocytes. The objective of this study was to evaluate various blood treatment methods that could be used prior to PCR, real-time PCR or MAMEF to increase sensitivity of detection of Salmonella. Methods and Results We tested ammonium chloride and erythrocyte lysis buffer, water, Lymphocyte Separation Medium, BD Vacutainer® CPT™ Tubes and dextran. Erythrocyte lysis buffer was the best isolation method as it is fast, inexpensive and works with either fresh or stored blood. The sensitivity of PCR- and real-time PCR detection of Salmonella in spiked blood was improved when whole blood was first lysed using erythrocyte lysis buffer prior to DNA extraction. Removal of erythrocytes and clotting factors also enabled reproducible lysis of Salmonella and fragmentation of DNA, which are necessary for MAMEF sensing. Conclusions Use of the erythrocyte lysis procedure prior to DNA extraction has enabled improved sensitivity of Salmonella detection by PCR and real-time PCR and has allowed lysis and fragmentation of Salmonella using microwave radiation (for future detection by MAMEF). Significance and Impact of the Study Adaptation of the blood lysis method represents a fundamental breakthrough that improves the sensitivity of DNA-based detection of Salmonella in blood. PMID:25630831

  17. Heat transfer to two-phase air/water mixtures flowing in small tubes with inlet disequilibrium

    NASA Technical Reports Server (NTRS)

    Janssen, J. M.; Florschuetz, L. W.; Fiszdon, J. P.

    1986-01-01

    The cooling of gas turbine components was the subject of considerable research. The problem is difficult because the available coolant, compressor bleed air, is itself quite hot and has relatively poor thermophysical properties for a coolant. Injecting liquid water to evaporatively cool the air prior to its contact with the hot components was proposed and studied, particularly as a method of cooling for contingency power applications. Injection of a small quantity of cold liquid water into a relatively hot coolant air stream such that evaporation of the liquid is still in process when the coolant contacts the hot component was studied. No approach was found whereby heat transfer characteristics could be confidently predicted for such a case based solely on prior studies. It was not clear whether disequilibrium between phases at the inlet to the hot component section would improve cooling relative to that obtained where equilibrium was established prior to contact with the hot surface.

  18. Sequential Probability Ratio Test for Collision Avoidance Maneuver Decisions

    NASA Technical Reports Server (NTRS)

    Carpenter, J. Russell; Markley, F. Landis

    2010-01-01

    When facing a conjunction between space objects, decision makers must choose whether to maneuver for collision avoidance or not. We apply a well-known decision procedure, the sequential probability ratio test, to this problem. We propose two approaches to the problem solution, one based on a frequentist method, and the other on a Bayesian method. The frequentist method does not require any prior knowledge concerning the conjunction, while the Bayesian method assumes knowledge of prior probability densities. Our results show that both methods achieve desired missed detection rates, but the frequentist method's false alarm performance is inferior to the Bayesian method's.

  19. Concentration of organic compounds in natural waters with solid-phase dispersion based on advesicle modified silica prior to liquid chromatography.

    PubMed

    Parisis, Nikolaos A; Giokas, Dimosthenis L; Vlessidis, Athanasios G; Evmiridis, Nicholaos P

    2005-12-02

    The ability of vesicle-coated silica to aid the extraction of organic compounds from water prior to liquid chromatographic analysis is presented for the first time. The method is based on the formation of silica-supported cationic multi-lamellar vesicles of gemini surfactants, inherently ensuring the presence of hydrophilic and hydrophobic sites for the partitioning of analytes bearing different properties. Method development is illustrated by studying the adsolubilization of UV-absorbing chemicals from swimming pool water. Due to the requirement for external energy input (intense shearing), a method based on solid-phase dispersion (SPD) was applied, producing better results than off-line solid-phase extraction (SPE). Meticulous investigation of the experimental parameters was conducted in order to elucidate the mechanisms behind the proposed extraction pattern. Analyte recoveries were quantitative under the optimum experimental conditions, offering recoveries higher than 96% with RSD values below 5%.

  20. Efficient harvesting methods for early-stage snake and turtle embryos.

    PubMed

    Matsubara, Yoshiyuki; Kuroiwa, Atsushi; Suzuki, Takayuki

    2016-04-01

    Reptile development is an intriguing research target for understanding the unique morphogenesis of reptiles as well as the evolution of vertebrates. However, there are numerous difficulties associated with studying development in reptiles. The number of available reptile eggs is usually quite limited. In addition, the reptile embryo is tightly adhered to the eggshell, making it a challenge to isolate reptile embryos intact. Furthermore, there have been few reports describing efficient procedures for isolating intact embryos, especially prior to the pharyngula stage. Thus, the aim of this review is to present efficient procedures for obtaining early-stage reptilian embryos intact. We first describe the method for isolating early-stage embryos of the Japanese striped snake. This is the first detailed method for obtaining embryos prior to oviposition in oviparous snake species. Second, we describe an efficient strategy for isolating early-stage embryos of the soft-shelled turtle. © 2016 Japanese Society of Developmental Biologists.

  1. Preliminary assessment of lint cotton water content in gin-drying temperature studies

    USDA-ARS?s Scientific Manuscript database

    Prior studies to measure total water (free and bound) in lint cotton by Karl Fischer Titration showed the method is more accurate and precise than moisture content by standard oven drying. The objective of the current study was to compare the moisture and total water contents from five cultivars de...

  2. Implicit Priors in Galaxy Cluster Mass and Scaling Relation Determinations

    NASA Technical Reports Server (NTRS)

    Mantz, A.; Allen, S. W.

    2011-01-01

    Deriving the total masses of galaxy clusters from observations of the intracluster medium (ICM) generally requires some prior information, in addition to the assumptions of hydrostatic equilibrium and spherical symmetry. Often, this information takes the form of particular parametrized functions used to describe the cluster gas density and temperature profiles. In this paper, we investigate the implicit priors on hydrostatic masses that result from this fully parametric approach, and the implications of such priors for scaling relations formed from those masses. We show that the application of such fully parametric models of the ICM naturally imposes a prior on the slopes of the derived scaling relations, favoring the self-similar model, and argue that this prior may be influential in practice. In contrast, this bias does not exist for techniques which adopt an explicit prior on the form of the mass profile but describe the ICM non-parametrically. Constraints on the slope of the cluster mass-temperature relation in the literature show a separation based on the approach employed, with the results from fully parametric ICM modeling clustering nearer the self-similar value. Given that a primary goal of scaling relation analyses is to test the self-similar model, the application of methods subject to strong, implicit priors should be avoided. Alternative methods and best practices are discussed.

  3. Maximum entropy, fluctuations and priors

    NASA Astrophysics Data System (ADS)

    Caticha, A.

    2001-05-01

    The method of maximum entropy (ME) is extended to address the following problem: Once one accepts that the ME distribution is to be preferred over all others, the question is to what extent are distributions with lower entropy supposed to be ruled out. Two applications are given. The first is to the theory of thermodynamic fluctuations. The formulation is exact, covariant under changes of coordinates, and allows fluctuations of both the extensive and the conjugate intensive variables. The second application is to the construction of an objective prior for Bayesian inference. The prior obtained by following the ME method to its inevitable conclusion turns out to be a special case (α=1) of what are currently known under the name of entropic priors.
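    For orientation, the entropic priors mentioned above are usually written in the following form (a sketch of the standard construction, not taken from this abstract; the paper identifies the α = 1 case):

```latex
\pi(\theta)\,d\theta \;\propto\; e^{\alpha S(\theta)}\,
\sqrt{\det g(\theta)}\;d\theta ,
```

    where $S(\theta)$ is the entropy of the sampling distribution $p(x\mid\theta)$ relative to a chosen background measure, $g(\theta)$ is the Fisher information metric on the parameter space, and $\alpha$ controls how strongly lower-entropy models are penalized.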

  4. Evaluation of two methods for using MR information in PET reconstruction

    NASA Astrophysics Data System (ADS)

    Caldeira, L.; Scheins, J.; Almeida, P.; Herzog, H.

    2013-02-01

    Using magnetic resonance (MR) information in maximum a posteriori (MAP) algorithms for positron emission tomography (PET) image reconstruction has been investigated in recent years. Recently, three methods to introduce this information were evaluated and the Bowsher prior was considered the best; its main advantage is that it does not require image segmentation. Another method that has been widely used for incorporating MR information employs boundaries obtained by segmentation, and it has also shown improvements in image quality. In this paper, these two methods for incorporating MR information in PET reconstruction are compared. After a Bayes parameter optimization, the reconstructed images were compared using the mean squared error (MSE) and the coefficient of variation (CV). MSE values are 3% lower with the Bowsher prior than with boundaries, and CV values are 10% lower. Both methods performed better in terms of MSE and CV than using no prior, that is, maximum likelihood expectation maximization (MLEM) or MAP without anatomical information. In conclusion, incorporating MR information using the Bowsher prior gives better results in terms of MSE and CV than using boundaries. MAP algorithms again proved effective in noise reduction and convergence, especially when MR information is incorporated. The robustness of the priors with respect to noise and inhomogeneities in the MR image, however, still has to be evaluated.
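    As background for the comparison above, the MLEM baseline that the MAP methods extend with a prior term can be sketched on a toy one-dimensional system. The random system matrix, image, and count level below are hypothetical and stand in for a real PET geometry; a MAP variant would modify the multiplicative update with a penalty gradient.

```python
# Sketch of the MLEM update x <- x * A^T(d / Ax) / A^T 1 on toy data; the
# MAP methods compared in the paper add an (anatomically informed) prior term.
import numpy as np

rng = np.random.default_rng(4)
n_pix, n_det = 16, 24
A = rng.uniform(0, 1, (n_det, n_pix))
A /= A.sum(axis=0)                           # normalize detector sensitivities
x_true = np.ones(n_pix)
x_true[5:9] = 4.0                            # a "hot" region to recover
d = rng.poisson(A @ x_true * 50) / 50.0      # noisy Poisson projections

x = np.ones(n_pix)                           # uniform initial image
sens = A.sum(axis=0)
for _ in range(50):
    x *= (A.T @ (d / (A @ x))) / sens        # multiplicative MLEM update
print(f"recovered hot/background ratio ~ {x[5:9].mean() / x[:4].mean():.1f}")
```

    The multiplicative form keeps the image non-negative at every iteration, which is one reason MLEM is the usual starting point for MAP extensions.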

  5. HEPA filter dissolution process

    DOEpatents

    Brewer, Ken N.; Murphy, James A.

    1994-01-01

    A process for dissolution of spent high efficiency particulate air (HEPA) filters, in which the complexed filter solution is then combined with other radioactive wastes prior to calcining the mixed and blended waste feed. The process is an alternative to a prior method of acid leaching the spent filters, which is an inefficient method of treating spent HEPA filters for disposal.

  6. Multi-object segmentation using coupled nonparametric shape and relative pose priors

    NASA Astrophysics Data System (ADS)

    Uzunbas, Mustafa Gökhan; Soldea, Octavian; Çetin, Müjdat; Ünal, Gözde; Erçil, Aytül; Unay, Devrim; Ekin, Ahmet; Firat, Zeynep

    2009-02-01

    We present a new method for multi-object segmentation in a maximum a posteriori estimation framework. Our method is motivated by the observation that neighboring or coupling objects in images generate configurations and co-dependencies which could potentially aid in segmentation if properly exploited. Our approach employs coupled shape and inter-shape pose priors that are computed using training images in a nonparametric multi-variate kernel density estimation framework. The coupled shape prior is obtained by estimating the joint shape distribution of multiple objects and the inter-shape pose priors are modeled via standard moments. Based on such statistical models, we formulate an optimization problem for segmentation, which we solve by an algorithm based on active contours. Our technique provides significant improvements in the segmentation of weakly contrasted objects in a number of applications. In particular for medical image analysis, we use our method to extract brain Basal Ganglia structures, which are members of a complex multi-object system posing a challenging segmentation problem. We also apply our technique to the problem of handwritten character segmentation. Finally, we use our method to segment cars in urban scenes.

  7. BEaST: brain extraction based on nonlocal segmentation technique.

    PubMed

    Eskildsen, Simon F; Coupé, Pierrick; Fonov, Vladimir; Manjón, José V; Leung, Kelvin K; Guizard, Nicolas; Wassef, Shafik N; Østergaard, Lasse Riis; Collins, D Louis

    2012-02-01

    Brain extraction is an important step in the analysis of brain images. The variability in brain morphology and the difference in intensity characteristics due to imaging sequences make the development of a general purpose brain extraction algorithm challenging. To address this issue, we propose a new robust method (BEaST) dedicated to produce consistent and accurate brain extraction. This method is based on nonlocal segmentation embedded in a multi-resolution framework. A library of 80 priors is semi-automatically constructed from the NIH-sponsored MRI study of normal brain development, the International Consortium for Brain Mapping, and the Alzheimer's Disease Neuroimaging Initiative databases. In testing, a mean Dice similarity coefficient of 0.9834±0.0053 was obtained when performing leave-one-out cross validation selecting only 20 priors from the library. Validation using the online Segmentation Validation Engine resulted in a top ranking position with a mean Dice coefficient of 0.9781±0.0047. Robustness of BEaST is demonstrated on all baseline ADNI data, resulting in a very low failure rate. The segmentation accuracy of the method is better than two widely used publicly available methods and recent state-of-the-art hybrid approaches. BEaST provides results comparable to a recent label fusion approach, while being 40 times faster and requiring a much smaller library of priors. Copyright © 2011 Elsevier Inc. All rights reserved.
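    The Dice similarity coefficient used to validate BEaST above is easy to state concretely. The two square masks below are toy data, not study images; only the metric itself is as reported.

```python
# The Dice similarity coefficient, 2|A ∩ B| / (|A| + |B|), on toy binary masks.
import numpy as np

def dice(a, b):
    """Dice coefficient between two boolean masks of the same shape."""
    a, b = np.asarray(a, bool), np.asarray(b, bool)
    return 2.0 * np.logical_and(a, b).sum() / (a.sum() + b.sum())

mask_a = np.zeros((10, 10), bool); mask_a[2:8, 2:8] = True   # 36 "voxels"
mask_b = np.zeros((10, 10), bool); mask_b[3:9, 3:9] = True   # shifted copy
print(f"Dice = {dice(mask_a, mask_b):.3f}")                  # → Dice = 0.694
```

    A perfect segmentation scores 1.0, which puts the study's 0.9834 ± 0.0053 in context: on average less than 2% of the combined mask volume disagrees.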

  8. Alcohol, drug and other prior crimes and risk of arrest in handgun purchasers: protocol for a controlled observational study

    PubMed Central

    Wintemute, Garen J; Kass, Philip H; Stewart, Susan L; Cerdá, Magdalena; Gruenewald, Paul J

    2016-01-01

    Background and objective Alcohol abuse is common in the USA and is a well-established risk factor for violence. Other drug use and criminal activity are risk factors as well and frequently occur together with alcohol abuse. Firearm ownership is also common; there are >50 million firearm owners in the USA. This study assesses the relationships between alcohol and drug abuse and future violence among firearm owners, which no prior research has done. Design and study population This records-based retrospective cohort study will involve all persons who legally purchased handguns in California in 2001—approximately 116 000 individuals—with follow-up through the end of 2013. Methods The principal exposures include prior convictions for alcohol-related and drug-related offenses. The primary outcome measure is an arrest following handgun purchase for a violent Crime Index offense: homicide, rape, robbery or aggravated assault. Subjects will be considered at risk for outcome events for only as long as their residence in California can be established independently of outcome events. Covariates include individual characteristics (eg, age, sex, criminal history, firearm purchase history) and community characteristics (eg, demographics, socioeconomic measures, firearm ownership and alcohol outlet density). We will employ survival analytic methods, expressing effects as HRs. Discussion The results of this large-scale study are likely to be generalisable and to have important implications for violence prevention policies and programmes. PMID:26498316

  9. The Development of Word Frequency Lists Prior to the 1944 Thorndike-Lorge List.

    ERIC Educational Resources Information Center

    Bontrager, Terry

    1991-01-01

    Examines the word frequency studies that preceded the 1944 Thorndike-Lorge count and places those investigations in their broad, cultural perspective. Draws attention to the impact of the studies on knowledge about language and its development, educational curriculum and assessment, and methods of research. (MG)

  10. Dimensions of Spirituality Fostered through the PULSE Program for Service Learning

    ERIC Educational Resources Information Center

    Barrett, Michelle C. Sterk

    2016-01-01

    Cultivating spiritual development is central to the mission of Catholic higher education institutions. Studies demonstrate that service learning is a pedagogical method through which spiritual development can be fostered among undergraduates. This study builds upon prior research to analyze whether spiritual growth occurred and which dimensions of…

  11. The Retention of Meaningful Understanding of Meiosis and Genetics.

    ERIC Educational Resources Information Center

    Cavallo, Ann Liberatore

    This study investigated the retention of meaningful understanding of the biological topics of meiosis, the Punnett square method and the relations between these two topics. This study also explored the predictive influence of students' general tendency to learn meaningfully or by rote (meaningful learning orientation), prior knowledge of meiosis,…

  12. Unemployment Benefit Exhaustion: Incentive Effects on Job-Finding Rates

    ERIC Educational Resources Information Center

    Filges, Trine; Geerdsen, Lars Pico; Knudsen, Anne-Sofie Due; Jørgensen, Anne-Marie Klint

    2015-01-01

    Purpose: This systematic review studied the impact of exhaustion of unemployment benefits on the exit rate out of unemployment and into employment prior to benefit exhaustion or shortly thereafter. Method: We followed Campbell Collaboration guidelines to prepare this review, and ultimately located 12 studies for final analysis and interpretation.…

  13. The measurement of patient attitudes regarding prenatal and preconception genetic carrier screening and translational behavioral medicine: an integrative review.

    PubMed

    Shiroff, Jennifer J; Gregoski, Mathew J

    2017-06-01

    Measurement of recessive carrier screening attitudes related to conception and pregnancy is necessary to determine current acceptance and whether behavioral intervention strategies are needed in clinical practice. To evaluate quantitative survey instruments that measure patient attitudes regarding genetic carrier testing prior to conception and pregnancy, databases were searched for studies examining such attitudes from 2003 to 2013, yielding 344 articles; eight studies with eight instruments met the criteria for inclusion. Data abstraction on theoretical framework, subjects, instrument description, scoring, method of measurement, reliability, validity, feasibility, level of evidence, and outcomes was completed. Reliability information was provided in five studies, with an internal consistency of Cronbach's α > 0.70. Information pertaining to validity was presented in three studies and included construct validity via factor analysis. Despite limited psychometric information, these questionnaires are self-administered and can be completed briefly, making them a feasible method of evaluation.

  14. Sleep Deprivation and Recovery Sleep Prior to a Noxious Inflammatory Insult Influence Characteristics and Duration of Pain

    PubMed Central

    Vanini, Giancarlo

    2016-01-01

    Study Objectives: Insufficient sleep and chronic pain are public health epidemics. Sleep loss worsens pain and predicts the development of chronic pain. Whether previous, acute sleep loss and recovery sleep determine pain levels and duration remains poorly understood. This study tested whether acute sleep deprivation and recovery sleep prior to formalin injection alter post-injection pain levels and duration. Methods: Male Sprague-Dawley rats (n = 48) underwent sleep deprivation or ad libitum sleep for 9 hours. Thereafter, rats received a subcutaneous injection of formalin or saline into a hind paw. In the recovery sleep group, rats were allowed 24 h between sleep deprivation and the injection of formalin. Mechanical and thermal nociception were assessed using the von Frey test and Hargreaves' method. Nociceptive measures were performed at 1, 3, 7, 10, 14, 17 and 21 days post-injection. Results: Formalin caused bilateral mechanical hypersensitivity (allodynia) that persisted for up to 21 days post-injection. Sleep deprivation significantly enhanced bilateral allodynia. There was a synergistic interaction when sleep deprivation preceded a formalin injection. Rats allowed a recovery sleep period prior to formalin injection developed allodynia only in the injected limb, with higher mechanical thresholds (less allodynia) and a shorter recovery period. There were no persistent changes in thermal nociception. Conclusion: The data suggest that acute sleep loss preceding an inflammatory insult enhances pain and can contribute to chronic pain. The results encourage studies in a model of surgical pain to test whether enhancing sleep reduces pain levels and duration. Citation: Vanini G. Sleep deprivation and recovery sleep prior to a noxious inflammatory insult influence characteristics and duration of pain. SLEEP 2016;39(1):133–142. PMID:26237772

  15. Insufficient evidence for the use of a physical examination to detect maltreatment in children without prior suspicion: a systematic review

    PubMed Central

    2013-01-01

    Background Although it is often performed in clinical practice, the diagnostic value of a screening physical examination to detect maltreatment in children without prior suspicion has not been reviewed. This article aims to evaluate the diagnostic value of a complete physical examination as a screening instrument to detect maltreatment in children without prior suspicion. Methods We systematically searched the databases of MEDLINE, EMBASE, PsychINFO, CINAHL, and ERIC, using a sensitive search strategy. Studies that i) presented medical findings of a complete physical examination for screening purposes in children 0–18 years, ii) specifically recorded the presence or absence of signs of child maltreatment, and iii) recorded child maltreatment confirmed by a reference standard, were included. Two reviewers independently performed study selection, data extraction, and quality appraisal using the QUADAS-2 tool. Results The search yielded 4,499 titles, of which three studies met the eligibility criteria. The prevalence of confirmed signs of maltreatment during screening physical examination varied between 0.8% and 13.5%. The designs of the studies were inadequate to assess the diagnostic accuracy of a screening physical examination for child maltreatment. Conclusions Because of the lack of informative studies, we could not draw conclusions about the diagnostic value of a screening physical examination in children without prior suspicion of child maltreatment. PMID:24313949

  16. Preservice Teachers' Images of Scientists: Do Prior Science Experiences Make a Difference?

    ERIC Educational Resources Information Center

    Milford, Todd M.; Tippett, Christine D.

    2013-01-01

    This article presents the results of a mixed methods study that used the Draw-a-Scientist Test as a visual tool for exploring preservice teachers' beliefs about scientists. A questionnaire was also administered to 165 students who were enrolled in elementary (K-8) and secondary (8-12) science methods courses. Taken as a whole, the images drawn by…

  17. Benthic meiofauna responses to five forest harvest methods

    Treesearch

    Freese Smith; Arthur V. Brown; Misty Pope; Jerry L. Michael

    2001-01-01

    Benthic meiofauna were collected from the pools of minute (0 order) streams in the Ouachita National Forest, Arkansas during March 21-23, 1996 to see if benthic communities responded to forest harvest methods in a similar manner as plankton communities collected two years prior. The study streams and their watersheds (2-6 ha) were located in 14-16 ha forest stands that...

  18. The Effects of Prior Authorization Policies on Medicaid-Enrolled Children's Use of Antipsychotic Medications: Evidence from Two Mid-Atlantic States

    PubMed Central

    Leckman-Westin, Emily; Okeke, Edward; Scharf, Deborah M.; Sorbero, Mark; Chen, Qingxian; Chor, Ka Ho Brian; Finnerty, Molly; Wisdom, Jennifer P.

    2014-01-01

    Abstract Objective: The purpose of this study was to examine the impact of prior authorization policies on the receipt of antipsychotic medication for Medicaid-enrolled children. Methods: Using de-identified administrative Medicaid data from two large, neighboring, mid-Atlantic states from November 2007 through June 2011, we identified subjects <18 years of age using antipsychotics, from the broader group of children and adolescents receiving behavioral health services or any psychotropic medication. Prior authorization for antipsychotics was required for children in State A <6 years of age from September 2008, and for children <13 years of age from August 2009. No such prior authorizations existed in State B during that period. Filled prescriptions were identified in the data using national drug codes. Using a triple-difference strategy (using differences among the states, time periods, and differences in antidepressant prescribing rates among states over the same time periods), we examined the effect of the prior authorization policy on the rate at which antipsychotic prescriptions were filled for Medicaid-enrolled children and adolescents. Results: The impact of prior authorization policies on antipsychotic medication use varied by age: Among 6–12 year old children, the impact of the prior authorization policy on antipsychotic medication prescribing was a modest but statistically significant decrease of 0.47% after adjusting for other factors; there was no effect of the prior authorization among children 0–5 years. Conclusions: Prior authorization policies had a modest but statistically significant effect on antipsychotic use in 6–12 year old children, but had no impact in younger children. Future research is needed to understand the utilization and clinical effects of prior authorization and other policies and interventions designed to influence antipsychotic use in children. PMID:25144909

  19. Bayesian statistical ionospheric tomography improved by incorporating ionosonde measurements

    NASA Astrophysics Data System (ADS)

    Norberg, Johannes; Virtanen, Ilkka I.; Roininen, Lassi; Vierinen, Juha; Orispää, Mikko; Kauristie, Kirsti; Lehtinen, Markku S.

    2016-04-01

    We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters and use the Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient tomographic inversion algorithm with clear probabilistic interpretation. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero-mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT ultra-high-frequency incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that in comparison to the alternative prior information sources, ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the altitude distribution of electron density. With an ionosonde continuously at its disposal, the presented method significantly enhances stand-alone near-real-time ionospheric tomography under the given conditions.
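    The core Bayesian update behind the tomography method is the linear-Gaussian posterior mean, which can be sketched on a small synthetic problem. The ray-geometry matrix, profile shape, and covariance parameters below are invented; only the structure (a prior mean and covariance, here playing the role the ionosonde-informed prior plays in the paper, updated by noisy linear measurements) mirrors the method, and the paper's GMRF machinery replaces the dense covariance used here.

```python
# Linear-Gaussian Bayesian inversion: posterior mean of an electron-density
# profile x given noisy ray integrals d = A x + e, a prior mean, and a
# smooth prior covariance. Uses x_post = x_prior + P A^T (A P A^T + R)^-1 r.
import numpy as np

rng = np.random.default_rng(3)
n_alt, n_ray = 20, 12
A = rng.uniform(0, 1, (n_ray, n_alt))            # toy ray-geometry matrix
idx = np.arange(n_alt)
x_true = np.exp(-0.5 * ((idx - 10) / 4.0) ** 2)  # Gaussian density profile
noise_sd = 0.05
d = A @ x_true + rng.normal(0, noise_sd, n_ray)

x_prior = np.full(n_alt, x_true.mean())          # prior mean (stand-in for
                                                 # ionosonde information)
P = 0.5 * np.exp(-np.abs(np.subtract.outer(idx, idx)) / 3.0)  # smooth prior
R = noise_sd**2 * np.eye(n_ray)                  # measurement-error covariance

K = P @ A.T @ np.linalg.inv(A @ P @ A.T + R)     # "gain" matrix
x_post = x_prior + K @ (d - A @ x_prior)
rms_prior = np.sqrt(np.mean((x_prior - x_true) ** 2))
rms_post = np.sqrt(np.mean((x_post - x_true) ** 2))
print(f"rms error: prior {rms_prior:.3f}, posterior {rms_post:.3f}")
```

    The paper's comparison of prior mean sources (zero, IRI 2007, ionosonde) amounts to swapping `x_prior` while keeping this update fixed.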

  20. Method for loading lipid-like vesicles with drugs or other chemicals

    DOEpatents

    Mehlhorn, R.J.

    1998-06-09

    A method for accumulating drugs or other chemicals within synthetic, lipid-like vesicles by means of a pH gradient imposed on the vesicles just prior to use is described. The method is suited for accumulating molecules with basic or acidic moieties which are permeable to the vesicle membranes in their uncharged form, and for molecules that contain charged moieties that are hydrophobic ions and can therefore cross the vesicle membranes in their charged form. The method is advantageous over prior art methods for encapsulating biologically active materials within vesicles in that it achieves very high degrees of loading with simple procedures that are economical and require little technical expertise. Furthermore, kits which can be stored for prolonged periods prior to use without impairment of the capacity to achieve drug accumulation are described. A related application of the method consists of using this technology to detoxify animals that have been exposed to poisons with basic, weak acid or hydrophobic charge groups within their molecular structures. 2 figs.

  1. Method for loading lipid-like vesicles with drugs or other chemicals

    DOEpatents

    Mehlhorn, Rolf Joachim

    1998-01-01

    A method for accumulating drugs or other chemicals within synthetic, lipid-like vesicles by means of a pH gradient imposed on the vesicles just prior to use is described. The method is suited for accumulating molecules with basic or acidic moieties which are permeable to the vesicle membranes in their uncharged form, and for molecules that contain charged moieties that are hydrophobic ions and can therefore cross the vesicle membranes in their charged form. The method is advantageous over prior art methods for encapsulating biologically active materials within vesicles in that it achieves very high degrees of loading with simple procedures that are economical and require little technical expertise. Furthermore, kits which can be stored for prolonged periods prior to use without impairment of the capacity to achieve drug accumulation are described. A related application of the method consists of using this technology to detoxify animals that have been exposed to poisons with basic, weak acid or hydrophobic charge groups within their molecular structures.

  2. Method of detoxifying animal suffering from overdose

    DOEpatents

    Mehlhorn, Rolf J.

    1997-01-01

    A method for accumulating drugs or other chemicals within synthetic, lipid-like vesicles by means of a pH gradient imposed on the vesicles just prior to use is described. The method is suited for accumulating molecules with basic or acidic moieties which are permeable to the vesicle membranes in their uncharged form, and for molecules that contain charged moieties that are hydrophobic ions and can therefore cross the vesicle membranes in their charged form. The method is advantageous over prior art methods for encapsulating biologically active materials within vesicles in that it achieves very high degrees of loading with simple procedures that are economical and require little technical expertise. Furthermore, kits which can be stored for prolonged periods prior to use without impairment of the capacity to achieve drug accumulation are described. A related application of the method consists of using this technology to detoxify animals that have been exposed to poisons with basic, weak acid or hydrophobic charge groups within their molecular structure.

  3. Dynamic PET image reconstruction for parametric imaging using the HYPR kernel method

    NASA Astrophysics Data System (ADS)

    Spencer, Benjamin; Qi, Jinyi; Badawi, Ramsey D.; Wang, Guobao

    2017-03-01

    Dynamic PET image reconstruction is a challenging problem because of the ill-conditioned nature of PET and the low counting statistics resulting from the short time frames used in dynamic imaging. The kernel method for image reconstruction has been developed to improve image reconstruction of low-count PET data by incorporating prior information derived from high-count composite data. In contrast to most existing regularization-based methods, the kernel method embeds image prior information in the forward projection model and does not require an explicit regularization term in the reconstruction formula. Inspired by the existing highly constrained back-projection (HYPR) algorithm for dynamic PET image denoising, we propose in this work a new type of kernel that is simpler to implement and further improves kernel-based dynamic PET image reconstruction. Our evaluation study, using a physical phantom scan with synthetic FDG tracer kinetics, has demonstrated that the new HYPR kernel-based reconstruction can achieve a better region-of-interest (ROI) bias versus standard deviation trade-off for dynamic PET parametric imaging than the post-reconstruction HYPR denoising method and the previously used nonlocal-means kernel.
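As a rough illustration of the kernel idea (not the paper's implementation, which optimizes kernel coefficients inside an EM reconstruction through the PET system matrix), the sketch below builds a row-normalized Gaussian kernel matrix from composite-frame features and applies it to a noisy low-count frame. All values are invented.

```python
import math

# Kernel representation x = K.a used by kernel-based reconstruction:
# each pixel's value is expressed through kernel weights built from
# high-count composite-frame features. Toy 1-D example.

features = [0.0, 0.1, 0.9, 1.0, 1.1]   # composite-image feature per pixel
sigma = 0.2                            # kernel width (assumed)

def kernel(fi, fj):
    return math.exp(-((fi - fj) ** 2) / (2 * sigma ** 2))

# Row-normalized kernel matrix: each pixel becomes a weighted average of
# pixels that look similar in the high-count composite image.
K = []
for fi in features:
    row = [kernel(fi, fj) for fj in features]
    s = sum(row)
    K.append([w / s for w in row])

noisy = [1.2, 0.8, 5.3, 4.6, 5.1]      # one low-count frame
smoothed = [sum(K[i][j] * noisy[j] for j in range(len(noisy)))
            for i in range(len(noisy))]
print([round(v, 2) for v in smoothed])
```

Pixels whose composite features agree borrow strength from each other, while dissimilar pixels (the 0.0/0.1 group vs. the 0.9-1.1 group) stay separated, which is how the prior is embedded without an explicit regularization term.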

  4. Induced Abortions and the Risk of Preeclampsia Among Nulliparous Women

    PubMed Central

    Parker, Samantha E.; Gissler, Mika; Ananth, Cande V.; Werler, Martha M.

    2015-01-01

    Induced abortion (IA) has been associated with a lower risk of preeclampsia among nulliparous women, but it remains unclear whether this association differs by method (either surgical or medical) or timing of IA. We performed a nested case-control study of 12,650 preeclampsia cases and 50,600 matched control deliveries identified in the Medical Birth Register of Finland from 1996 to 2010. Data on number, method, and timing of IAs were obtained through a linkage with the Registry of Induced Abortions. Odds ratios and 95% confidence intervals were calculated. Overall, prior IA was associated with a lower risk of preeclampsia, with odds ratios of 0.9 (95% confidence interval (CI): 0.9, 1.0) for 1 prior IA and 0.7 (95% CI: 0.5, 1.0) for 3 or more IAs. Differences in the associations between IA and preeclampsia by timing and method of IA were small, with odds ratios of 0.8 (95% CI: 0.6, 1.1) for late (≥12 gestation weeks) surgical abortion and 0.9 (95% CI: 0.7, 1.2) for late medical abortion. There was no association between IA in combination with a history of spontaneous abortion and risk of preeclampsia. In conclusion, prior IA only was associated with a slight reduction in the risk of preeclampsia. PMID:26377957

  5. Wavelet based detection of manatee vocalizations

    NASA Astrophysics Data System (ADS)

    Gur, Berke M.; Niezrecki, Christopher

    2005-04-01

    The West Indian manatee (Trichechus manatus latirostris) has become endangered partly because of watercraft collisions in Florida's coastal waterways. Several boater warning systems, based upon manatee vocalizations, have been proposed to reduce the number of collisions. Three detection methods based on the Fourier transform (threshold, harmonic content and autocorrelation methods) were previously suggested and tested. In the last decade, the wavelet transform has emerged as an alternative to the Fourier transform and has been successfully applied in various fields of science and engineering, including the acoustic detection of dolphin vocalizations. To date, however, no research has analyzed manatee vocalizations using the wavelet transform. In this study, the wavelet transform is used as an alternative to the Fourier transform in detecting manatee vocalizations. The wavelet coefficients are analyzed and tested against a specified criterion to determine the existence of a manatee call. The performance of the method presented is tested on the same data used in the prior studies, and the results are compared. Preliminary results indicate that using the wavelet transform as a signal processing technique to detect manatee vocalizations shows great promise.
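A minimal sketch of the coefficient-thresholding idea, using a hand-rolled one-level Haar transform. The study's actual wavelet family, decomposition depth, and detection criterion are not given in the abstract, so everything below (signal, threshold, energy criterion) is an illustrative assumption.

```python
# Wavelet-coefficient detection sketch: decompose, then test the
# detail-band energy against a threshold.

def haar_level1(signal):
    """One-level orthonormal Haar transform: (approximation, detail)."""
    r2 = 2 ** 0.5
    approx = [(signal[i] + signal[i + 1]) / r2
              for i in range(0, len(signal), 2)]
    detail = [(signal[i] - signal[i + 1]) / r2
              for i in range(0, len(signal), 2)]
    return approx, detail

def detect(signal, threshold=1.0):
    """Declare a detection if detail-band energy exceeds the threshold."""
    _, detail = haar_level1(signal)
    energy = sum(d * d for d in detail)
    return energy > threshold

quiet = [0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1, 0.1]
chirp = [0.1, 0.9, -0.8, 0.7, -0.9, 0.8, -0.7, 0.9]  # rapid oscillation
print(detect(quiet), detect(chirp))
```

A slowly varying background leaves almost no energy in the detail band, while a rapidly oscillating call concentrates energy there, which is what the thresholding criterion exploits.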

  6. Covariance specification and estimation to improve top-down greenhouse gas emission estimates

    NASA Astrophysics Data System (ADS)

    Ghosh, S.; Lopez-Coto, I.; Prasad, K.; Whetstone, J. R.

    2015-12-01

    The National Institute of Standards and Technology (NIST) operates the North-East Corridor (NEC) project and the Indianapolis Flux Experiment (INFLUX) in order to develop measurement methods to quantify sources of greenhouse gas (GHG) emissions, as well as their uncertainties, in urban domains using a top-down inversion method. Top-down inversion updates prior knowledge using observations in a Bayesian way. One primary consideration in a Bayesian inversion framework is the covariance structure of (1) the emission prior residuals and (2) the observation residuals (i.e., the difference between observations and model-predicted observations). These covariance matrices are respectively referred to as the prior covariance matrix and the model-data mismatch covariance matrix. It is known that the choice of these covariances can have a large effect on estimates. The main objective of this work is to determine the impact of different covariance models on inversion estimates and their associated uncertainties in urban domains. We use a pseudo-data Bayesian inversion framework using footprints (i.e., sensitivities of tower measurements of GHGs to surface emissions) and emission priors (based on the Hestia project to quantify fossil-fuel emissions) to estimate posterior emissions under different covariance schemes. The posterior emission estimates and uncertainties are compared to the hypothetical truth. We find that if we correctly specify spatial variability in the prior covariance and spatio-temporal variability in the model-data mismatch covariance, then we can compute more accurate posterior estimates. We discuss a few covariance models that introduce space-time interacting mismatches, along with estimation of the involved parameters. We then compare several candidate prior spatial covariance models from the Matern covariance class and estimate their parameters with specified mismatches. We find that best-fitted prior covariances are not always best at recovering the truth. To achieve accuracy, we perform a sensitivity study to further tune covariance parameters. Finally, we introduce a shrinkage-based sample covariance estimation technique for both prior and mismatch covariances. This technique allows us to achieve similar accuracy nonparametrically in a more efficient and automated way.
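The shrinkage idea mentioned at the end can be sketched as a convex combination of the sample covariance and a diagonal target. A fixed shrinkage weight is assumed here for illustration; practical estimators such as Ledoit-Wolf choose the weight from the data, and the authors' specific scheme is not detailed in the abstract.

```python
# Shrinkage covariance sketch: damp noisy off-diagonal entries of the
# sample covariance toward zero to get a better-conditioned estimate
# from few samples.

def sample_cov(rows):
    """Unbiased sample covariance of a list of equal-length rows."""
    n, p = len(rows), len(rows[0])
    means = [sum(r[j] for r in rows) / n for j in range(p)]
    cov = [[0.0] * p for _ in range(p)]
    for r in rows:
        for i in range(p):
            for j in range(p):
                cov[i][j] += (r[i] - means[i]) * (r[j] - means[j]) / (n - 1)
    return cov

def shrink(cov, lam):
    """(1 - lam) * S + lam * diag(S): variances kept, covariances damped."""
    p = len(cov)
    return [[cov[i][j] if i == j else (1 - lam) * cov[i][j]
             for j in range(p)] for i in range(p)]

data = [[1.0, 2.1], [2.0, 3.9], [3.0, 6.2], [4.0, 7.8]]
S = sample_cov(data)
S_shrunk = shrink(S, lam=0.5)
print(round(S[0][1], 2), round(S_shrunk[0][1], 2))
```

Because the diagonal target preserves the variances while pulling covariances toward zero, the shrunk matrix trades a little bias for much lower estimation variance, which is what makes it usable when the sample is small relative to the dimension.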

  7. Qualitative Investigation of the Earthquake Precursors Prior to the March 14, 2012 Earthquake in Japan

    NASA Astrophysics Data System (ADS)

    Raghuwanshi, Shailesh Kumar; Gwal, Ashok Kumar

    Abstract: In this study we used the Empirical Mode Decomposition (EMD) method in conjunction with cross-correlation analysis to analyze the ionospheric foF2 parameter prior to the Japan earthquake of magnitude M = 6.9. The data were collected from the Kokubunji (35.70N, 139.50E) and Yamakawa (31.20N, 130.60E) ionospheric stations. The EMD method was used to remove the geophysical noise from the foF2 data and then to calculate the correlation coefficient between the two stations' records. It was found that the ionospheric foF2 parameter shows anomalous changes a few days before the earthquake. The results are in agreement with theoretical models evidencing ionospheric modification prior to the Japan earthquake in a certain area around the epicenter.

  8. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data

    PubMed Central

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2012-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, often there is prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic variance of the estimates can be reduced significantly while keeping the asymptotic variance same as the unguided estimator. We observe the performance of our method via a simulation study and demonstrate our method by applying to a real data set on mergers and acquisitions. PMID:23645976

  9. Parametrically Guided Generalized Additive Models with Application to Mergers and Acquisitions Data.

    PubMed

    Fan, Jianqing; Maity, Arnab; Wang, Yihui; Wu, Yichao

    2013-01-01

    Generalized nonparametric additive models present a flexible way to evaluate the effects of several covariates on a general outcome of interest via a link function. In this modeling framework, one assumes that the effect of each of the covariates is nonparametric and additive. However, in practice, often there is prior information available about the shape of the regression functions, possibly from pilot studies or exploratory analysis. In this paper, we consider such situations and propose an estimation procedure where the prior information is used as a parametric guide to fit the additive model. Specifically, we first posit a parametric family for each of the regression functions using the prior information (parametric guides). After removing these parametric trends, we then estimate the remainder of the nonparametric functions using a nonparametric generalized additive model, and form the final estimates by adding back the parametric trend. We investigate the asymptotic properties of the estimates and show that when a good guide is chosen, the asymptotic variance of the estimates can be reduced significantly while keeping the asymptotic variance same as the unguided estimator. We observe the performance of our method via a simulation study and demonstrate our method by applying to a real data set on mergers and acquisitions.

  10. HEPA filter dissolution process

    DOEpatents

    Brewer, K.N.; Murphy, J.A.

    1994-02-22

    A process is described for dissolution of spent high-efficiency particulate air (HEPA) filters and subsequent combination of the complexed filter solution with other radioactive wastes prior to calcining the mixed and blended waste feed. The process is an alternative to a prior method of acid leaching the spent filters, which is an inefficient way of treating spent HEPA filters for disposal. 4 figures.

  11. Development of an impact- and solvent-resistant thermoplastic composite matrix, phase 3

    NASA Technical Reports Server (NTRS)

    Delano, C. B.; Kiskiras, C. J.

    1985-01-01

    The polyimide from BTDA, 1,6-hexanediamine, and m-phenylenediamine was selected from a prior study for use in the present study. Methods were studied to prepare prepreg from the thermoplastic polyimide that would provide low-void composites at low molding pressures. Cresol solutions of the polyimide were applied to a balanced-weave carbon fabric and the cresol was removed prior to composite molding. Low-void composites were prepared from smoothed prepregs at high pressure (34.5 MPa) and temperatures as low as 260 C. Lower molding pressures led to higher-void composites. A lower melt viscosity in the neat resin is suggested as a requirement for achieving low-void composites at low pressures. Some mechanical properties are included.

  12. A novel method to scale up fungal endophyte isolations

    USDA-ARS?s Scientific Manuscript database

    Estimations of species diversity are influenced by sampling intensity which in turn is influenced by methodology. For fungal endophyte diversity studies, the methodology includes surface-sterilization prior to isolation of endophytes. Surface-sterilization is an essential component of fungal endophy...

  13. Biomarkers for early detection of pancreatic cancer — EDRN Public Portal

    Cancer.gov

    Background: The clinical management of pancreatic cancer is severely hampered by the absence of effective screening tools. Methods: Sixty-seven biomarkers were evaluated in prediagnostic sera obtained from cases of pancreatic cancer enrolled in the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial (PLCO). Results: The panel of CA 19-9, OPN, and OPG, identified in a prior retrospective study, was not effective. CA 19-9, CEA, NSE, bHCG, CEACAM1 and PRL were significantly altered in sera obtained from cases greater than 1 year prior to diagnosis. Levels of CA 19-9, CA 125, CEA, PRL, and IL-8 were negatively correlated with time to diagnosis. A training/validation study using alternate halves of the PLCO set failed to identify a biomarker panel with significantly improved performance over CA 19-9 alone. When the entire PLCO set was used for training at a specificity (SP) of 95%, a panel of CA 19-9, CEA, and Cyfra 21-1 provided significantly elevated sensitivity (SN) levels of 32.4% and 29.7% in samples collected 1 year prior to diagnosis, respectively, compared to SN levels of 25.7% and 17.2% for CA 19-9 alone. Conclusions: Most biomarkers identified in previously conducted case/control studies are ineffective in prediagnostic samples; however, several biomarkers were identified as significantly altered up to 35 months prior to diagnosis. Two newly derived biomarker combinations offered some advantage over CA 19-9 alone in terms of SN, particularly in samples collected >1 year prior to diagnosis; however, further study will be needed to fully define the implications of these findings.

  14. Safety of fentanyl initiation according to past opioid exposure among patients newly prescribed fentanyl patches

    PubMed Central

    Friesen, Kevin J.; Woelk, Cornelius; Bugden, Shawn

    2016-01-01

    Background: Although a convenient opioid delivery system, transdermal fentanyl patches have caused several deaths and resulted in safety warnings reminding prescribers that fentanyl patches should be prescribed only for patients who have adequate prior exposure to opioids. We conducted a longitudinal analysis of the safety of fentanyl initiation by examining past opioid exposure among patients newly prescribed fentanyl patches. Methods: We identified all patients in the province of Manitoba who were newly prescribed fentanyl patches between Apr. 1, 2001, and Mar. 31, 2013. We converted all prior opioid use to oral morphine equivalents and determined the average daily dose in the 7–30 days before initial fentanyl patch use. Fentanyl initiation was considered unsafe if the patient’s pre-fentanyl opioid exposure was below the recommended level. Results: We identified 11 063 patients who began using fentanyl patches during the study period. Overall, fentanyl initiation was deemed unsafe in 74.1% of cases because the patient’s prior opioid exposure was inadequate. Women and patients 65 years of age and older were more likely than men and younger patients, respectively, to have inadequate prior opioid exposure (p < 0.001 for each comparison). The proportion of patients who had unsafe prescriptions for fentanyl patches decreased significantly over the study period, from 87.0% in 2001 to 50.0% in 2012 (p < 0.001). Interpretation: The safety of fentanyl initiation improved over the study period, but still half of fentanyl patch prescriptions were written for patients with inadequate prior opioid exposure. Review of prior opioid exposure may be a simple but important way to improve the safe use of fentanyl patches. PMID:27044480
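The screening logic described here (convert all prior opioid use to oral morphine equivalents, average over the pre-initiation window, and compare against a required level) can be sketched as follows. The conversion factors, the 24-day averaging window, and the 60 mg/day cut-off are illustrative assumptions, not the study's exact values.

```python
# Safety-screening sketch: convert prior opioid dispensings to oral
# morphine equivalents (OME) and flag fentanyl-patch starts that fall
# below a required average daily dose.

OME_FACTOR = {          # mg oral morphine per mg of drug (assumed values)
    "morphine_oral": 1.0,
    "oxycodone_oral": 1.5,
    "hydromorphone_oral": 5.0,
}

def avg_daily_ome(dispensings, days=24):
    """dispensings: list of (drug, total mg dispensed in the window);
    days: length of the pre-initiation window used for averaging."""
    total = sum(OME_FACTOR[drug] * mg for drug, mg in dispensings)
    return total / days

def fentanyl_start_is_safe(dispensings, required_ome=60.0):
    return avg_daily_ome(dispensings) >= required_ome

low = [("oxycodone_oral", 240.0)]     # averages 15 mg OME/day: unsafe
high = [("morphine_oral", 960.0), ("hydromorphone_oral", 96.0)]
print(fentanyl_start_is_safe(low), fentanyl_start_is_safe(high))
```

This is the kind of review of prior opioid exposure the authors suggest could be done at prescribing time to reduce unsafe fentanyl initiations.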

  15. Blind image deconvolution using the Fields of Experts prior

    NASA Astrophysics Data System (ADS)

    Dong, Wende; Feng, Huajun; Xu, Zhihai; Li, Qi

    2012-11-01

    In this paper, we present a method for single image blind deconvolution. To mitigate its ill-posedness, we formulate the problem in a Bayesian probabilistic framework and use a prior named Fields of Experts (FoE), learnt from natural images, to regularize the latent image. Furthermore, because of the sparse distribution of the point spread function (PSF), we adopt a Student-t prior to regularize it. An improved alternating minimization (AM) approach is proposed to solve the resulting optimization problem. Experiments on both synthetic and real-world blurred images show that the proposed method can achieve results of high quality.

  16. Selected aspects of prior and likelihood information for a Bayesian classifier in a road safety analysis.

    PubMed

    Nowakowska, Marzena

    2017-04-01

    The development of the Bayesian logistic regression model classifying the road accident severity is discussed. The already exploited informative priors (method of moments, maximum likelihood estimation, and two-stage Bayesian updating), along with the original idea of a Boot prior proposal, are investigated when no expert opinion has been available. In addition, two possible approaches to updating the priors, in the form of unbalanced and balanced training data sets, are presented. The obtained logistic Bayesian models are assessed on the basis of a deviance information criterion (DIC), highest probability density (HPD) intervals, and coefficients of variation estimated for the model parameters. The verification of the model accuracy has been based on sensitivity, specificity and the harmonic mean of sensitivity and specificity, all calculated from a test data set. The models obtained from the balanced training data set have a better classification quality than the ones obtained from the unbalanced training data set. The two-stage Bayesian updating prior model and the Boot prior model, both identified with the use of the balanced training data set, outperform the non-informative, method of moments, and maximum likelihood estimation prior models. It is important to note that one should be careful when interpreting the parameters since different priors can lead to different models. Copyright © 2017 Elsevier Ltd. All rights reserved.

  17. Bayesian Analysis of Silica Exposure and Lung Cancer Using Human and Animal Studies.

    PubMed

    Bartell, Scott M; Hamra, Ghassan Badri; Steenland, Kyle

    2017-03-01

    Bayesian methods can be used to incorporate external information into epidemiologic exposure-response analyses of silica and lung cancer. We used data from a pooled mortality analysis of silica and lung cancer (n = 65,980), using untransformed and log-transformed cumulative exposure. Animal data came from chronic silica inhalation studies using rats. We conducted Bayesian analyses with informative priors based on the animal data and different cross-species extrapolation factors. We also conducted analyses with exposure measurement error corrections in the absence of a gold standard, assuming Berkson-type error that increased with increasing exposure. The pooled animal data exposure-response coefficient was markedly higher (log exposure) or lower (untransformed exposure) than the coefficient for the pooled human data. With 10-fold uncertainty, the animal prior had little effect on results for pooled analyses and only modest effects in some individual studies. One-fold uncertainty produced markedly different results for both pooled and individual studies. Measurement error correction had little effect in pooled analyses using log exposure. Using untransformed exposure, measurement error correction caused a 5% decrease in the exposure-response coefficient for the pooled analysis and marked changes in some individual studies. The animal prior had more impact for smaller human studies and for one-fold versus three- or 10-fold uncertainty. Adjustment for Berkson error using Bayesian methods had little effect on the exposure-response coefficient when exposure was log transformed or when the sample size was large. See video abstract at http://links.lww.com/EDE/B160.

  18. Incorporation of prior information on parameters into nonlinear regression groundwater flow models: 1. Theory

    USGS Publications Warehouse

    Cooley, Richard L.

    1982-01-01

    Prior information on the parameters of a groundwater flow model can be used to improve parameter estimates obtained from nonlinear regression solution of a modeling problem. Two scales of prior information can be available: (1) prior information having known reliability (that is, bias and random error structure) and (2) prior information consisting of best available estimates of unknown reliability. A regression method that incorporates the second scale of prior information assumes the prior information to be fixed for any particular analysis to produce improved, although biased, parameter estimates. Approximate optimization of two auxiliary parameters of the formulation is used to help minimize the bias, which is almost always much smaller than that resulting from standard ridge regression. It is shown that if both scales of prior information are available, then a combined regression analysis may be made.
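The idea of folding prior parameter estimates into a regression as additional, weighted observations can be sketched in closed form for a single-parameter linear model. This is a simplification of the paper's nonlinear-regression setting, and the numbers are invented.

```python
# Penalized least squares with a prior estimate: minimize
#   sum_i (y_i - b*x_i)^2 + w * (b - b_prior)^2
# for a single slope b. The prior acts like one extra data point of
# weight w; w = 0 recovers ordinary least squares.

def estimate(x, y, b_prior, w):
    sxx = sum(xi * xi for xi in x)
    sxy = sum(xi * yi for xi, yi in zip(x, y))
    return (sxy + w * b_prior) / (sxx + w)

x = [1.0, 2.0, 3.0]
y = [2.2, 3.8, 6.3]
b_ols = estimate(x, y, b_prior=0.0, w=0.0)   # no prior information
b_reg = estimate(x, y, b_prior=1.0, w=10.0)  # strong prior at b = 1
print(round(b_ols, 3), round(b_reg, 3))
```

As in the abstract, treating the prior estimate as fixed pulls the solution toward it and introduces some bias; the weight plays the role of the auxiliary parameters whose tuning keeps that bias small relative to standard ridge regression.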

  19. The impact of intimate partner violence on women's contraceptive use: Evidence from the Rakai Community Cohort Study in Rakai, Uganda.

    PubMed

    Maxwell, Lauren; Brahmbhatt, Heena; Ndyanabo, Anthony; Wagman, Jennifer; Nakigozi, Gertrude; Kaufman, Jay S; Nalugoda, Fred; Serwadda, David; Nandi, Arijit

    2018-05-05

    A systematic review of longitudinal studies suggests that intimate partner violence (IPV) is associated with reduced contraceptive use, but most included studies were limited to two time points. We used seven waves of data from the Rakai Community Cohort Study in Rakai, Uganda to estimate the effect of prior year IPV at one visit on women's current contraceptive use at the following visit. We used inverse probability of treatment-weighted marginal structural models (MSMs) to estimate the relative risk of current contraceptive use comparing women who were exposed to emotional, physical, and/or sexual IPV during the year prior to interview to those who were not. We accounted for time-fixed and time-varying confounders and prior IPV and adjusted standard errors for repeated measures within individuals. The analysis included 7923 women interviewed between 2001 and 2013. In the weighted MSMs, women who experienced any form of prior year IPV were 20% less likely to use condoms at last sex than women who had not (95% CI: 0.12, 0.26). We did not find evidence that IPV affects current use of modern contraception (RR: 0.99; 95% CI: 0.95, 1.03); however, current use of a partner-dependent method was 27% lower among women who reported any form of prior-year IPV compared to women who had not (95% CI: 0.20, 0.33). Women who experienced prior-year IPV were less likely to use condoms and other forms of contraception that required negotiation with their male partners and more likely to use contraception that they could hide from their male partners. Longitudinal studies in Rakai and elsewhere have found that women who experience IPV have a higher rate of HIV than women who do not. Our finding that women who experience IPV are less likely to use condoms may help explain the relation between IPV and HIV. Copyright © 2018 Elsevier Ltd. All rights reserved.

  20. Dissociation During Intense Military Stress is Related to Subsequent Somatic Symptoms in Women

    PubMed Central

    Steffian, Lisa; Steffian, George; Doran, Anthony P.; Rasmusson, Ann M.; Morgan, CA

    2007-01-01

    Background: Research studies of the female response to intense stress are under-represented in the scientific literature; indeed, publications in female humans and animals number half those in male subjects. In addition, women have only recently entered more dangerous professions that were historically limited to men. The US Navy's survival course, therefore, offers a unique opportunity to examine, in a controlled manner, individual differences in the human female response to acute and realistic military stress. Method: The current study assessed the nature and prevalence of dissociative symptoms and other aspects of adaptive function in healthy female subjects experiencing acute, intense stress during US Navy survival training. Cognitive dissociation and previous exposure to traumatic events were assessed at baseline in 32 female service members prior to Navy survival training. At the conclusion of training, retrospectively rated levels of dissociation during peak training stress and current health symptoms were assessed. Results: Female subjects reported previous trauma (35%) and at least one symptom of dissociation at baseline prior to training (47%). Eighty-eight percent of subjects reported experiencing multiple symptoms of dissociation during peak training stress. Post-stress dissociation scores and stress-induced increases in dissociation, as well as prior cumulative exposure to potentially traumatic events, were significant predictors of post-stress health symptoms. Discussion: In this study, increases in dissociative symptoms during intense training stress, post-stress dissociation symptom levels, and prior cumulative exposure to stressful, potentially traumatic events predicted post-stress health symptoms in women. Prior studies in men have demonstrated correlations between neurobiological responses to stress and stress-associated levels of dissociation. 
Thus future studies in larger samples of women are needed to investigate the relationship between prior stress exposure, alterations in neurobiological responses to stress and potentially related alterations in neuropsychological and physical reactions to stress. PMID:20805901

  1. Clarifying the Role of Neuroticism in Suicidal Ideation and Suicide Attempt among Women with Major Depressive Disorder

    PubMed Central

    Rappaport, Lance M; Flint, Jonathan; Kendler, Kenneth S

    2017-01-01

    Background Prior research consistently demonstrates that neuroticism increases risk for suicidal ideation, but the association between neuroticism and suicidal behavior has been inconsistent. Whereas neuroticism is recommended as an endophenotype for suicidality, the association of neuroticism with attempted suicide warrants clarification. In particular, prior research has not distinguished between correlates of attempted suicide, correlates of suicidal ideation, and correlates of comorbid psychopathology. Methods The present study used the CONVERGE study, a sample of 5,864 women with major depressive disorder and 5,783 women without major depressive disorder throughout China. Diagnoses, suicidal ideation, and attempted suicide were assessed with the Composite International Diagnostic Interview (CIDI). Neuroticism was assessed with the neuroticism portion of the Eysenck Personality Questionnaire. Results Results replicate prior findings on the correlates of suicidal ideation, particularly elevated neuroticism among individuals who report prior suicidal ideation. Moreover, as compared to individuals who reported having experienced only suicidal ideation, neuroticism was associated with decreased likelihood of having attempted suicide. Conclusions The association of neuroticism with suicidality is more complicated than has been previously described. Whereas neuroticism increases risk for suicidal ideation, neuroticism may decrease risk for a suicide attempt among individuals with suicidal ideation. These results have implications for the assessment of risk for a suicide attempt among individuals who report suicidal ideation and addresses prior discordant findings by clarifying the association between neuroticism and attempted suicide. PMID:28397619

  2. An improved approximate-Bayesian model-choice method for estimating shared evolutionary history

    PubMed Central

    2014-01-01

    Background To understand biological diversification, it is important to account for large-scale processes that affect the evolutionary history of groups of co-distributed populations of organisms. Such events predict temporally clustered divergence times, a pattern that can be estimated using genetic data from co-distributed species. I introduce a new approximate-Bayesian method for comparative phylogeographical model-choice that estimates the temporal distribution of divergences across taxa from multi-locus DNA sequence data. The model is an extension of that implemented in msBayes. Results By reparameterizing the model, introducing more flexible priors on demographic and divergence-time parameters, and implementing a non-parametric Dirichlet-process prior over divergence models, I improved the robustness, accuracy, and power of the method for estimating shared evolutionary history across taxa. Conclusions The results demonstrate that the improved performance of the new method is due to (1) more appropriate priors on divergence-time and demographic parameters that avoid prohibitively small marginal likelihoods for models with more divergence events, and (2) the Dirichlet-process prior providing a flexible prior on divergence histories that does not strongly disfavor models with intermediate numbers of divergence events. The new method yields more robust estimates of posterior uncertainty, and thus greatly reduces the tendency to incorrectly estimate models of shared evolutionary history with strong support. PMID:24992937

  3. Non-causal spike filtering improves decoding of movement intention for intracortical BCIs

    PubMed Central

    Masse, Nicolas Y.; Jarosiewicz, Beata; Simeral, John D.; Bacher, Daniel; Stavisky, Sergey D.; Cash, Sydney S.; Oakley, Erin M.; Berhanu, Etsub; Eskandar, Emad; Friehs, Gerhard; Hochberg, Leigh R.; Donoghue, John P.

    2014-01-01

    Background Multiple types of neural signals are available for controlling assistive devices through brain-computer interfaces (BCIs). Intracortically-recorded spiking neural signals are attractive for BCIs because they can in principle provide greater fidelity of encoded information compared to electrocorticographic (ECoG) signals and electroencephalograms (EEGs). Recent reports show that the information content of these spiking neural signals can be reliably extracted simply by causally band-pass filtering the recorded extracellular voltage signals and then applying a spike detection threshold, without relying on “sorting” action potentials. New method We show that replacing the causal filter with an equivalent non-causal filter increases the information content extracted from the extracellular spiking signal and improves decoding of intended movement direction. This method can be used for real-time BCI applications by using a 4 ms lag between recording and filtering neural signals. Results Across 18 sessions from two people with tetraplegia enrolled in the BrainGate2 pilot clinical trial, we found that threshold crossing events extracted using this non-causal filtering method were significantly more informative of each participant’s intended cursor kinematics compared to threshold crossing events derived from causally filtered signals. This new method decreased the mean angular error between the intended and decoded cursor direction by 9.7° for participant S3, who was implanted 5.4 years prior to this study, and by 3.5° for participant T2, who was implanted 3 months prior to this study. Conclusions Non-causally filtering neural signals prior to extracting threshold crossing events may be a simple yet effective way to condition intracortically recorded neural activity for direct control of external devices through BCIs. PMID:25128256
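
    The causal-versus-non-causal comparison described above can be sketched with standard filtering tools. This is an illustrative reconstruction, not the trial's pipeline: the 250-5000 Hz 4th-order Butterworth band and the -4.5 x RMS threshold are common spike-band choices assumed here, and `filtfilt` stands in for the paper's 4 ms-lag real-time variant.

```python
import numpy as np
from scipy.signal import butter, lfilter, filtfilt

def bandpass(fs, lo=250.0, hi=5000.0, order=4):
    """Band-pass design for the extracellular spike band (cutoffs are assumptions)."""
    return butter(order, [lo / (fs / 2), hi / (fs / 2)], btype="band")

def threshold_crossings(x, k=-4.5):
    """Indices where the filtered signal first dips below k times its RMS."""
    thr = k * np.sqrt(np.mean(x ** 2))
    below = x < thr
    return np.flatnonzero(below[1:] & ~below[:-1]) + 1

fs = 30000.0
rng = np.random.default_rng(0)
x = 5e-3 * rng.standard_normal(3000)
x[1500:1505] -= 0.5                 # synthetic extracellular spike

b, a = bandpass(fs)
causal = lfilter(b, a, x)           # causal filter: introduces phase lag
noncausal = filtfilt(b, a, x)       # zero-phase (non-causal) filtering
```

    The zero-phase filter leaves the detected threshold-crossing time aligned with the true spike, whereas the causal filter shifts it later, which is the timing distortion the authors remove.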

  4. Modern fertility awareness methods: Wrist wearables capture the changes of temperature associated with the menstrual cycle.

    PubMed

    Shilaih, Mohaned; Goodale, Brianna M; Falco, Lisa; Kübler, Florian; De Clerck, Valerie; Leeners, Brigitte

    2017-11-24

    Core and peripheral body temperatures are affected by changes in reproductive hormones during the menstrual cycle. Women worldwide use the basal body temperature (BBT) method to aid and prevent conception. However, prior research suggests taking one's daily temperature can prove inconvenient and subject to environmental factors. We investigate whether a more automatic, non-invasive temperature measurement system can detect changes in temperature across the menstrual cycle. We examined how wrist-skin temperature (WST), measured with wearable sensors, correlates with urinary tests of ovulation and may serve as a new method of fertility tracking. One hundred and thirty-six eumenorrheic, non-pregnant women participated in an observational study. Participants wore WST biosensors during sleep and reported their daily activities. An at-home luteinizing hormone test was used to confirm ovulation. WST was recorded across 437 cycles (mean cycles/participant=3.21, S.D.=2.25). We tested the relationship between the fertile window and WST temperature shifts, using the BBT three-over-six rule. A sustained three-day temperature shift was observed in 357/437 cycles (82%), with the lowest cycle temperature occurring in the fertile window 41% of the time. Most temperature shifts (307/357, 86%) occurred on ovulation day or later. The average early-luteal phase temperature was 0.33°C higher than in the fertile window. Menstrual cycle changes in WST were impervious to lifestyle factors such as sex, alcohol consumption, or eating prior to bed, which in prior work have been shown to obfuscate BBT readings. Although currently costlier than BBT, this study suggests that WST could be a promising, convenient parameter for future multi-parameter fertility-awareness methods. ©2017 The Author(s).
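
    The "three-over-six" rule used above to detect a sustained temperature shift is simple to state in code. A minimal sketch, assuming clean daily readings; a real implementation would also handle missing days and the fixed coverline margin (e.g. 0.2°C) that some variants of the rule require.

```python
def three_over_six(temps):
    """Return the index of the first day starting a sustained shift: three
    consecutive temperatures all above the maximum of the six preceding days.
    Returns None if no such shift occurs in the cycle."""
    for i in range(6, len(temps) - 2):
        cover = max(temps[i - 6:i])          # "coverline" from the prior six days
        if all(t > cover for t in temps[i:i + 3]):
            return i
    return None

cycle = [36.4, 36.5, 36.45, 36.5, 36.4, 36.45,   # follicular baseline
         36.5, 36.8, 36.85, 36.9, 36.9]          # post-ovulatory rise
shift_day = three_over_six(cycle)                # day 7 starts the shift
```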

  5. The Design of Time-Series Comparisons under Resource Constraints.

    ERIC Educational Resources Information Center

    Willemain, Thomas R.; Hartunian, Nelson S.

    1982-01-01

    Two methods for dividing an interrupted time-series study between baseline and experimental phases when study resources are limited are compared. In fixed designs, the baseline duration is predetermined. In flexible designs the baseline duration is contingent on remaining resources and the match of results to prior expectations of the evaluator.…

  6. On the Measurement and Properties of Ambiguity in Probabilistic Expectations

    ERIC Educational Resources Information Center

    Pickett, Justin T.; Loughran, Thomas A.; Bushway, Shawn

    2015-01-01

    Survey respondents' probabilistic expectations are now widely used in many fields to study risk perceptions, decision-making processes, and behavior. Researchers have developed several methods to account for the fact that the probability of an event may be more ambiguous for some respondents than others, but few prior studies have empirically…

  7. Territorial Behavior in Public Settings

    ERIC Educational Resources Information Center

    Costa, Marco

    2012-01-01

    This study provides a novel observational method to observe repetitive seating patterns chosen by students in a classroom. Although prior work that relied on self-reports suggests that students claim the same seats repeatedly, the main hypothesis of the study was that in a repeated use of a public space, people tend to occupy the same position,…

  8. Childhood Abuse and Neglect and Adult Intimate Relationships: A Prospective Study

    ERIC Educational Resources Information Center

    Colman, R.A.; Widom, C.S.

    2004-01-01

    Objective:: The present study extends prior research on childhood maltreatment and social functioning by examining the impact of early childhood physical abuse, sexual abuse, and neglect on rates of involvement in adult intimate relationships and relationship functioning. Method:: Substantiated cases of child abuse and neglect from 1967 to 1971…

  9. "Social Work Abstracts" Fails Again: A Replication and Extension

    ERIC Educational Resources Information Center

    Holden, Gary; Barker, Kathleen; Covert-Vail, Lucinda; Rosenberg, Gary; Cohen, Stephanie A.

    2009-01-01

    Objective: According to a prior study, there are substantial lapses in journal coverage in the "Social Work Abstracts" (SWA) database. The current study provides a replication and extension. Method: The longitudinal pattern of coverage of thirty-three journals categorized in SWA as core journals (published in the 1989-1996 period) is examined.…

  10. Self-Report Measure of Psychological Abuse of Older Adults

    ERIC Educational Resources Information Center

    Conrad, Kendon J.; Iris, Madelyn; Ridings, John W.; Langley, Kate; Anetzberger, Georgia J.

    2011-01-01

    Purpose: This study tested key psychometric properties of the Older Adult Psychological Abuse Measure (OAPAM), one self-report scale of the Older Adult Mistreatment Assessment (OAMA). Design and Methods: Items and theory were developed in a prior concept mapping study. Subsequently, the measures were administered to 226 substantiated clients by 22…

  11. Single-Trial Normalization for Event-Related Spectral Decomposition Reduces Sensitivity to Noisy Trials

    PubMed Central

    Grandchamp, Romain; Delorme, Arnaud

    2011-01-01

    In electroencephalography, the classical event-related potential model often proves to be a limited method to study complex brain dynamics. For this reason, spectral techniques adapted from signal processing such as event-related spectral perturbation (ERSP) – and its variants, event-related synchronization and event-related desynchronization – have been used over the past 20 years. They represent average spectral changes in response to a stimulus. There is, however, no strong consensus on how pre- and post-stimulus activity should be compared with these spectral methods. When computing ERSP, pre-stimulus baseline removal is usually performed after averaging the spectral estimate of multiple trials. Correcting the baseline of each single trial prior to averaging spectral estimates is an alternative baseline correction method. However, we show that this method leads to positively skewed post-stimulus ERSP values. We then present new single-trial-based ERSP baseline correction methods that perform trial normalization or centering prior to applying classical baseline correction methods. We show that single-trial correction methods minimize the contribution of artifactual data trials with high-amplitude spectral estimates and are robust to outliers when performing statistical inference testing. Finally, we characterize these methods in terms of their time–frequency responses and behavior compared to classical ERSP methods. PMID:21994498
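
    The contrast between averaging first and normalizing each trial first can be sketched with NumPy. This is a schematic of the idea (full-trial normalization per frequency before averaging), not the paper's exact pipeline: the array shapes and the single artifactual trial below are invented for illustration.

```python
import numpy as np

def ersp_classic(power, base_idx):
    """Average single-trial spectral power across trials, then dB-correct by
    the mean pre-stimulus baseline. power: (n_trials, n_freqs, n_times)."""
    avg = power.mean(axis=0)
    base = avg[:, base_idx].mean(axis=1, keepdims=True)
    return 10 * np.log10(avg / base)

def ersp_single_trial_norm(power, base_idx):
    """Normalize each trial by its own full-trial mean power (per frequency)
    before averaging, then apply the same baseline correction."""
    norm = power / power.mean(axis=2, keepdims=True)
    avg = norm.mean(axis=0)
    base = avg[:, base_idx].mean(axis=1, keepdims=True)
    return 10 * np.log10(avg / base)

power = np.ones((5, 1, 10))     # 5 trials, 1 frequency, 10 time points
power[:, 0, 5:] = 2.0           # event-related power doubling post-stimulus
power[4] = 1000.0               # one artifactual high-amplitude trial, no event effect
base = np.arange(5)             # pre-stimulus samples used as baseline
classic = ersp_classic(power, base)
robust = ersp_single_trial_norm(power, base)
```

    In this toy example the single high-amplitude trial swamps the classical average (the post-stimulus effect nearly vanishes), while per-trial normalization preserves it, which is the robustness property the abstract reports.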

  12. Boosting probabilistic graphical model inference by incorporating prior knowledge from multiple sources.

    PubMed

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available.
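
    The Noisy-OR combination described above can be written in a few lines. A minimal generic sketch, not the authors' exact parameterization: each input is an assumed per-source confidence that the interaction is real, and the combined prior is the probability that at least one source "fires".

```python
def noisy_or(confidences):
    """Noisy-OR combination of independent evidence sources: the combined
    probability of an interaction is 1 minus the probability that every
    source fails to report it."""
    p_all_fail = 1.0
    for c in confidences:
        p_all_fail *= (1.0 - c)
    return 1.0 - p_all_fail

edge_prior = noisy_or([0.5, 0.5])   # two sources, each 50% confident
```

    The combination is monotone: adding any source with nonzero confidence can only raise the prior, which matches the "strongest support" behavior described in the abstract.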

  13. Inference of Gene Regulatory Networks Using Bayesian Nonparametric Regression and Topology Information.

    PubMed

    Fan, Yue; Wang, Xiao; Peng, Qinke

    2017-01-01

    Gene regulatory networks (GRNs) play an important role in cellular systems and are important for understanding biological processes. Many algorithms have been developed to infer the GRNs. However, most algorithms only pay attention to the gene expression data but do not consider the topology information in their inference process, while incorporating this information can partially compensate for the lack of reliable expression data. Here we develop a Bayesian group lasso with spike and slab priors to perform gene selection and estimation for nonparametric models. B-spline basis functions are used to capture the nonlinear relationships flexibly and penalties are used to avoid overfitting. Further, we incorporate the topology information into the Bayesian method as a prior. We present the application of our method on DREAM3 and DREAM4 datasets and two real biological datasets. The results show that our method performs better than existing methods and the topology information prior can improve the result.
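
    A spike-and-slab prior of the kind mentioned draws each coefficient either from a point mass at zero (the spike, dropping that gene from the model) or from a Gaussian slab. A hypothetical sampling sketch of the prior itself, not the paper's group-lasso formulation or its B-spline machinery:

```python
import numpy as np

def spike_slab_draw(rng, p, n, slab_sd):
    """Draw n coefficients from a spike-and-slab prior: with probability p a
    coefficient comes from the 'slab' N(0, slab_sd^2), otherwise from the
    point-mass 'spike' at zero, so most candidate regulators are excluded."""
    in_slab = rng.random(n) < p
    return in_slab * rng.normal(0.0, slab_sd, n)

rng = np.random.default_rng(0)
beta = spike_slab_draw(rng, p=0.1, n=100, slab_sd=1.0)  # sparse coefficient vector
```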

  14. Investigation of error sources in regional inverse estimates of greenhouse gas emissions in Canada

    NASA Astrophysics Data System (ADS)

    Chan, E.; Chan, D.; Ishizawa, M.; Vogel, F.; Brioude, J.; Delcloo, A.; Wu, Y.; Jin, B.

    2015-08-01

    Inversion models can use atmospheric concentration measurements to estimate surface fluxes. This study is an evaluation of the errors in a regional flux inversion model for different provinces of Canada, Alberta (AB), Saskatchewan (SK) and Ontario (ON). Using CarbonTracker model results as the target, the synthetic data experiment analyses examined the impacts of the errors from the Bayesian optimisation method, prior flux distribution and the atmospheric transport model, as well as their interactions. The scaling factors for different sub-regions were estimated by the Markov chain Monte Carlo (MCMC) simulation and cost function minimization (CFM) methods. The CFM method results are sensitive to the relative size of the assumed model-observation mismatch and prior flux error variances. Experiment results show that the estimation error increases with the number of sub-regions using the CFM method. For the region definitions that lead to realistic flux estimates, the numbers of sub-regions for the western region of AB/SK combined and the eastern region of ON are 11 and 4 respectively. The corresponding annual flux estimation errors for the western and eastern regions using the MCMC (CFM) method are -7 and -3 % (0 and 8 %) respectively, when there is only prior flux error. The estimation errors increase to 36 and 94 % (40 and 232 %) resulting from transport model error alone. When prior and transport model errors co-exist in the inversions, the estimation errors become 5 and 85 % (29 and 201 %). This result indicates that estimation errors are dominated by the transport model error and can in fact cancel each other and propagate to the flux estimates non-linearly. In addition, the posterior flux estimates can differ more from the target fluxes than the prior estimates do, and the posterior uncertainty estimates can be unrealistically small, failing to cover the target.
    The systematic evaluation of the different components of the inversion model can help in understanding the posterior estimates and percentage errors. Stable and realistic sub-regional and monthly flux estimates can be obtained for the western region of AB/SK, but not for the eastern region of ON. This indicates that a real observation-based inversion for the annual provincial emissions is likely to work for the western region, whereas improvements to the current inversion setup are needed before a real inversion is performed for the eastern region.

  15. Hierarchical Commensurate and Power Prior Models for Adaptive Incorporation of Historical Information in Clinical Trials

    PubMed Central

    Hobbs, Brian P.; Carlin, Bradley P.; Mandrekar, Sumithra J.; Sargent, Daniel J.

    2011-01-01

    Summary Bayesian clinical trial designs offer the possibility of a substantially reduced sample size, increased statistical power, and reductions in cost and ethical hazard. However when prior and current information conflict, Bayesian methods can lead to higher than expected Type I error, as well as the possibility of a costlier and lengthier trial. This motivates an investigation of the feasibility of hierarchical Bayesian methods for incorporating historical data that are adaptively robust to prior information that reveals itself to be inconsistent with the accumulating experimental data. In this paper, we present several models that allow for the commensurability of the information in the historical and current data to determine how much historical information is used. A primary tool is elaborating the traditional power prior approach based upon a measure of commensurability for Gaussian data. We compare the frequentist performance of several methods using simulations, and close with an example of a colon cancer trial that illustrates a linear models extension of our adaptive borrowing approach. Our proposed methods produce more precise estimates of the model parameters, in particular conferring statistical significance to the observed reduction in tumor size for the experimental regimen as compared to the control regimen. PMID:21361892
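
    For the Gaussian known-variance case, the traditional power prior that this work elaborates has a closed form: raising the historical likelihood to a power a0 in [0, 1] simply discounts the historical sample size. A sketch under an assumed flat initial prior, with invented numbers; the paper's contribution is choosing the discount adaptively via commensurability, which is not modeled here.

```python
def power_prior_posterior(xbar, n, xbar0, n0, a0, sigma2):
    """Posterior mean and variance for a Gaussian mean (known variance sigma2)
    under a power prior: historical data (xbar0, n0) enter with effective
    sample size a0 * n0. a0 = 1 pools fully; a0 = 0 discards the history."""
    w0 = a0 * n0
    mean = (w0 * xbar0 + n * xbar) / (w0 + n)
    var = sigma2 / (w0 + n)
    return mean, var

# Hypothetical trial: current data pull away from a large historical control
m, v = power_prior_posterior(xbar=1.0, n=20, xbar0=0.0, n0=100, a0=0.1, sigma2=1.0)
```

    With a0 = 0.1 the 100 historical observations count as only 10, so the posterior mean sits near the current data; commensurate-prior methods aim to set this discount automatically when prior and current information conflict.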

  16. Releasing-addition method for the flame-photometric determination of calcium in thermal waters

    USGS Publications Warehouse

    Rowe, J.J.

    1963-01-01

    Study of the interferences of silica and sulfate in the flame-photometric determination of calcium in thermal waters has led to the development of a method requiring no prior chemical separations. The interference effects of silica, sulfate, potassium, sodium, aluminum, and phosphate are overcome by an addition technique coupled with the use of magnesium as a releasing agent. ?? 1963.

  17. Expansion of the scope of AOAC first action method 2012.25 - single-laboratory validation of triphenylmethane dye and leuco metabolite analysis in shrimp, tilapia, catfish, and salmon by LC-MS/MS

    USDA-ARS?s Scientific Manuscript database

    Prior to conducting a collaborative study of AOAC First Action 2012.25 LC-MS/MS analytical method for the determination of residues of three triphenylmethane dyes (malachite green, crystal violet, and brilliant green) and their metabolites (leucomalachite green and leucocrystal violet) in seafood, a...

  18. 76 FR 26085 - Endangered and Threatened Wildlife and Plants; Proposed Rule To Revise the List of Endangered and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-05-05

    ...: You may submit comments by one of the following methods: Electronically: Go to the Federal eRulemaking... viability of the species. You may submit your comments and materials by one of the methods listed in... until the molecular genetics studies of the last few years, the range of the gray wolf prior to European...

  19. Bias correction for estimated QTL effects using the penalized maximum likelihood method.

    PubMed

    Zhang, J; Yue, C; Zhang, Y-M

    2012-04-01

    A penalized maximum likelihood method has been proposed as an important approach to the detection of epistatic quantitative trait loci (QTL). However, this approach is not optimal in two special situations: (1) closely linked QTL with effects in opposite directions and (2) small-effect QTL, because the method produces downwardly biased estimates of QTL effects. The present study aims to correct the bias by using correction coefficients and shifting from the use of a uniform prior on the variance parameter of a QTL effect to that of a scaled inverse chi-square prior. The results of Monte Carlo simulation experiments show that the improved method increases the power from 25 to 88% in the detection of two closely linked QTL of equal size in opposite directions and from 60 to 80% in the identification of QTL with small effects (0.5% of the total phenotypic variance). We used the improved method to detect QTL responsible for the barley kernel weight trait using 145 doubled haploid lines developed in the North American Barley Genome Mapping Project. Application of the proposed method to other shrinkage estimation of QTL effects is discussed.

  20. Intravenous magnesium for pediatric sickle cell vaso-occlusive crisis: methodological issues of a randomized controlled trial.

    PubMed

    Badaki-Makun, Oluwakemi; Scott, J Paul; Panepinto, Julie A; Casper, T Charles; Hillery, Cheryl A; Dean, J Michael; Brousseau, David C

    2014-06-01

    Multiple recent Sickle Cell Disease studies have been terminated due to poor enrollment. We developed methods to overcome past barriers and utilized these to study the efficacy and safety of intravenous magnesium for vaso-occlusive crisis (VOC). We describe the methods of the Intravenous Magnesium in Sickle Vaso-occlusive Crisis (MAGiC) trial and discuss methods used to overcome past barriers. MAGiC was a multi-center randomized double-blind placebo-controlled trial of intravenous magnesium versus normal saline for treatment of VOC. The study was a collaboration between Pediatric Hematologists and Emergency Physicians in the Pediatric Emergency Care Applied Research Network (PECARN). Eligible patients were randomized within 12 hours of receiving intravenous opioids in the Emergency Department (ED) and administered study medication every 8 hours. The primary outcome was hospital length of stay. Associated plasma studies elucidated magnesium's mechanism of action and the pathophysiology of VOC. Health-related quality of life was measured. Site-, protocol-, and patient-related barriers from prior studies were identified and addressed. Limited study staff availability, lack of collaboration with the ED, and difficulty obtaining consent were previously identified barriers. Leveraging PECARN resources, forging close collaborations between Sickle Cell Centers and EDs of participating sites, and approaching eligible patients for prior consent helped overcome these barriers. Participation in the PECARN network and establishment of collaborative arrangements between Sickle Cell Centers and their affiliated EDs are major innovative features of the MAGiC study that allowed improved subject capture. These methods could serve as a model for future studies of VOCs. © 2014 Wiley Periodicals, Inc.

  1. Shoulder instability in professional football players.

    PubMed

    Leclere, Lance E; Asnis, Peter D; Griffith, Matthew H; Granito, David; Berkson, Eric M; Gill, Thomas J

    2013-09-01

    Shoulder instability is a common problem in American football players entering the National Football League (NFL). Treatment options include nonoperative and surgical stabilization. This study evaluated how the method of treatment of pre-NFL shoulder instability affects the rate of recurrence and the time elapsed until recurrence in players on 1 NFL team. Retrospective cohort. Medical records from 1980 to 2008 for 1 NFL team were reviewed. There were 328 players included in the study who started their career on the team and remained on the team for at least 2 years (mean, 3.9 years; range, 2-14 years). The history of instability prior to entering the NFL and the method of treatment were collected. Data on the occurrence of instability while in the NFL were recorded to determine the rate and timing of recurrence. Thirty-one players (9.5%) had a history of instability prior to entering the NFL. Of the 297 players with no history of instability, 39 (13.1%) had a primary event at a mean of 18.4 ± 22.2 months (range, 0-102 months) after joining the team. In the group of players with prior instability treated with surgical stabilization, there was no statistical difference in the rate of recurrence (10.5%) or the timing to the instability episode (mean, 26 months) compared with players with no history of instability. Twelve players had shoulder instability treated nonoperatively prior to the NFL. Five of these players (41.7%) had recurrent instability at a mean of 4.4 ± 7.0 months (range, 0-16 months). The patients treated nonoperatively had a significantly higher rate of recurrence (P = 0.02) and an earlier time of recurrence (P = 0.04). The rate of contralateral instability was 25.8%, occurring at a mean of 8.6 months. Recurrent shoulder instability is more common in NFL players with a history of nonoperative treatment. Surgical stabilization appears to restore the rate and timing of instability to that of players with no prior history of instability.

  2. Incidence and Predictive Factors of Pain Flare After Spine Stereotactic Body Radiation Therapy: Secondary Analysis of Phase 1/2 Trials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pan, Hubert Y.; Allen, Pamela K.; Wang, Xin S.

    Purpose/Objective(s): To perform a secondary analysis of institutional prospective spine stereotactic body radiation therapy (SBRT) trials to investigate posttreatment acute pain flare. Methods and Materials: Medical records for enrolled patients were reviewed. Study protocol included baseline and follow-up surveys with pain assessment by Brief Pain Inventory and documentation of pain medications. Patients were considered evaluable for pain flare if clinical note or follow-up survey was completed within 2 weeks of SBRT. Pain flare was defined as a clinical note indicating increased pain at the treated site or survey showing a 2-point increase in worst pain score, a 25% increase in analgesic intake, or the initiation of steroids. Binary logistic regression was used to determine predictive factors for pain flare occurrence. Results: Of the 210 enrolled patients, 195 (93%) were evaluable for pain flare, including 172 (88%) clinically, 135 (69%) by survey, and 112 (57%) by both methods. Of evaluable patients, 61 (31%) had undergone prior surgery, 57 (29%) had received prior radiation, and 34 (17%) took steroids during treatment, mostly for prior conditions. Pain flare was observed in 44 patients (23%). Median time to pain flare was 5 days (range, 0-20 days) after the start of treatment. On multivariate analysis, the only independent factor associated with pain flare was the number of treatment fractions (odds ratio = 0.66, P=.004). Age, sex, performance status, spine location, number of treated vertebrae, prior radiation, prior surgery, primary tumor histology, baseline pain score, and steroid use were not significant. Conclusions: Acute pain flare after spine SBRT is a relatively common event, for which patients should be counseled. Additional study is needed to determine whether prophylactic or symptomatic intervention is preferred.

  3. Applying Standard Interfaces to a Process-Control Language

    NASA Technical Reports Server (NTRS)

    Berthold, Richard T.

    2005-01-01

    A method of applying open-operating-system standard interfaces to the NASA User Interface Language (UIL) has been devised. UIL is a computing language that can be used in monitoring and controlling automated processes: for example, the Timeliner computer program, written in UIL, is a general-purpose software system for monitoring and controlling sequences of automated tasks in a target system. In providing the major elements of connectivity between UIL and the target system, the present method offers advantages over the prior method. Most notably, unlike in the prior method, the software description of the target system can be made independent of the applicable compiler software and need not be linked to the applicable executable compiler image. Also unlike in the prior method, it is not necessary to recompile the source code and relink the source code to a new executable compiler image. Abstraction of the description of the target system to a data file can be defined easily, with intuitive syntax, and knowledge of the source-code language is not needed for the definition.

  4. Bayes factors for testing inequality constrained hypotheses: Issues with prior specification.

    PubMed

    Mulder, Joris

    2014-02-01

    Several issues are discussed when testing inequality constrained hypotheses using a Bayesian approach. First, the complexity (or size) of the inequality constrained parameter spaces can be ignored. This is the case when using the posterior probability that the inequality constraints of a hypothesis hold, Bayes factors based on non-informative improper priors, and partial Bayes factors based on posterior priors. Second, the Bayes factor may not be invariant for linear one-to-one transformations of the data. This can be observed when using balanced priors which are centred on the boundary of the constrained parameter space with a diagonal covariance structure. Third, the information paradox can be observed. When testing inequality constrained hypotheses, the information paradox occurs when the Bayes factor of an inequality constrained hypothesis against its complement converges to a constant as the evidence for the first hypothesis accumulates while keeping the sample size fixed. This paradox occurs when using Zellner's g prior as a result of too much prior shrinkage. Therefore, two new methods are proposed that avoid these issues. First, partial Bayes factors are proposed based on transformed minimal training samples. These training samples result in posterior priors that are centred on the boundary of the constrained parameter space with the same covariance structure as in the sample. Second, a g prior approach is proposed by letting g go to infinity. This is possible because the Jeffreys-Lindley paradox is not an issue when testing inequality constrained hypotheses. A simulation study indicated that the Bayes factor based on this g prior approach converges fastest to the true inequality constrained hypothesis. © 2013 The British Psychological Society.
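
    One common construction for such Bayes factors, which the issues above concern, is the encompassing-prior ratio: the posterior probability that the inequality constraint holds divided by its prior probability, so the size (complexity) of the constrained parameter space enters through the denominator. A Monte Carlo sketch; the two-mean example, the prior scale, and the closed-form posterior draws are all hypothetical stand-ins for a real model fit.

```python
import numpy as np

def bf_inequality(post_samples, prior_samples, constraint):
    """Encompassing-prior Bayes factor for an inequality hypothesis: the
    fraction of posterior draws satisfying the constraint divided by the
    fraction of prior draws satisfying it."""
    f_post = np.mean([constraint(s) for s in post_samples])
    f_prior = np.mean([constraint(s) for s in prior_samples])
    return f_post / f_prior

rng = np.random.default_rng(1)
# Hypothetical example: H1: mu1 < mu2, symmetric prior on (mu1, mu2)
prior = rng.normal(0.0, 10.0, size=(20000, 2))
post = rng.normal([0.0, 1.0], 0.5, size=(20000, 2))   # stand-in posterior draws
bf = bf_inequality(post, prior, lambda s: s[0] < s[1])
```

    Here the symmetric prior gives the constraint prior mass 1/2, so evidence that mu1 < mu2 can push the Bayes factor toward at most 2, illustrating the bounded-evidence behavior that motivates the paper's alternative prior constructions.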

  5. A comparison between Bayes discriminant analysis and logistic regression for prediction of debris flow in southwest Sichuan, China

    NASA Astrophysics Data System (ADS)

    Xu, Wenbo; Jing, Shaocai; Yu, Wenjuan; Wang, Zhaoxian; Zhang, Guoping; Huang, Jianxi

    2013-11-01

    In this study, the high-risk debris-flow areas of Sichuan Province, Panzhihua and Liangshan Yi Autonomous Prefecture, were taken as the study areas. Using rainfall and environmental factors as predictors, and based on different prior probability combinations of debris flows, the prediction of debris flows in these areas was compared between two statistical methods: logistic regression (LR) and Bayes discriminant analysis (BDA). The comprehensive analysis shows that (a) with a mid-range prior probability, the overall predicting accuracy of BDA is higher than that of LR; (b) with equal and extreme prior probabilities, the overall predicting accuracy of LR is higher than that of BDA; and (c) regional debris-flow prediction models that use rainfall factors alone perform worse than those that also incorporate environmental factors, and adding this information changes the predicting accuracies for occurrence and nonoccurrence of debris flows in opposite directions.

  6. Shape priors for segmentation of the cervix region within uterine cervix images

    NASA Astrophysics Data System (ADS)

    Lotenberg, Shelly; Gordon, Shiri; Greenspan, Hayit

    2008-03-01

    The work focuses on a unique medical repository of digital uterine cervix images ("cervigrams") collected by the National Cancer Institute (NCI), National Institutes of Health, in longitudinal multi-year studies. NCI, together with the National Library of Medicine, is developing a unique web-based database of the digitized cervix images to study the evolution of lesions related to cervical cancer. Tools are needed for the automated analysis of cervigram content to support the cancer research. In recent work, a multi-stage automated system for segmenting and labeling regions of medical and anatomical interest within the cervigrams was developed. The current paper concentrates on incorporating prior shape information into the cervix region segmentation task. In accordance with the fact that human experts mark the cervix region as circular or elliptical, two shape models (and corresponding methods) are suggested. The shape models are embedded within an active contour framework that relies on image features. Experiments indicate that incorporating the prior shape information improves on previous results.

  7. Comfort and experience with online learning: trends over nine years and associations with knowledge

    PubMed Central

    2014-01-01

    Background Some evidence suggests that attitude toward computer-based instruction is an important determinant of success in online learning. We sought to determine how comfort using computers and perceptions of prior online learning experiences have changed over the past decade, and how these associate with learning outcomes. Methods Each year from 2003–2011 we conducted a prospective trial of online learning. As part of each year’s study, we asked medicine residents about their comfort using computers and if their previous experiences with online learning were favorable. We assessed knowledge using a multiple-choice test. We used regression to analyze associations and changes over time. Results 371 internal medicine and family medicine residents participated. Neither comfort with computers nor perceptions of prior online learning experiences showed a significant change across years (p > 0.61), with mean comfort rating 3.96 (maximum 5 = very comfortable) and mean experience rating 4.42 (maximum 6 = strongly agree [favorable]). Comfort showed no significant association with knowledge scores (p = 0.39) but perceptions of prior experiences did, with a 1.56% rise in knowledge score for a 1-point rise in experience score (p = 0.02). Correlations among comfort, perceptions of prior experiences, and number of prior experiences were all small and not statistically significant. Conclusions Comfort with computers and perceptions of prior experience with online learning remained stable over nine years. Prior good experiences (but not comfort with computers) demonstrated a modest association with knowledge outcomes, suggesting that prior course satisfaction may influence subsequent learning. PMID:24985690

  8. Dose specification and quality assurance of RTOG protocol 95-17; a cooperative group study of 192Ir breast implants as sole therapy

    PubMed Central

    Ibbott, Geoffrey S.; Hanson, W.F.; Martin, Elizabeth; Kuske, Robert R.; Arthur, Douglas; Rabinovitch, Rachel; White, Julia; Wilenzick, Raymond M.; Harris, Irene; Tailor, Ramesh C.

    2007-01-01

    Purpose RTOG protocol 95-17 was a phase I/II trial to evaluate multi-catheter brachytherapy as the sole method of adjuvant breast radiotherapy for stage I/II breast carcinoma following breast conserving surgery. Low or high dose rate sources were allowed. Dose prescription and treatment evaluation were based on recommendations in ICRU Report 58, and included the parameters mean central dose (MCD), average peripheral dose, dose homogeneity index (DHI), and the dimensions of the low and high dose regions. Methods and Materials Three levels of quality assurance were implemented: (1) credentialing of institutions was required prior to entering patients onto the study; (2) rapid review of each treatment plan was conducted prior to treatment; and (3) retrospective review was performed by the Radiological Physics Center in conjunction with the study chairman and RTOG dosimetry staff. Results Credentialing focused on the accuracy of the dose calculation algorithm and compliance with protocol guidelines. Rapid review was designed to identify and correct deviations from the protocol prior to treatment. The retrospective review involved recalculation of dosimetry parameters and review of dose distributions to evaluate the treatment. Specifying both central and peripheral doses resulted in uniform dose distributions, with a mean dose homogeneity index of 0.83 ± 0.06. Conclusions Vigorous quality assurance resulted in a high-quality study with few deviations; only 4 of 100 patients were judged minor variations from protocol and no patient was judged a major deviation. This study should be considered a model for quality assurance of future trials. PMID:18035213

  9. The effect of two fixation methods (TAF and DESS) on morphometric parameters of Aphelenchoides ritzemabosi.

    PubMed

    Chałańska, Aneta; Bogumił, Aleksandra; Malewski, Tadeusz; Kowalewska, Katarzyna

    2016-02-19

    Identification of nematode species by using conventional methods requires fixation of the isolated material and a suitable preparation for further analyses. Tentative identification using microscopic methods should also be performed prior to initiating molecular studies. In the literature, various methods are described for the preparation of nematodes from the genus Aphelenchoides for identification and microscopic studies. The most commonly used fixatives are formalin (Timm, 1969; Szczygieł & Cid del Prado Vera, 1981; Crozzoli et al., 2008; Khan et al., 2008), FAA (Wasilewska, 1969; Vovlas et al., 2005; Khan et al., 2007) and TAF (Hooper, 1958; Chizhov et al., 2006; Jagdale & Grewal, 2006).

  10. Method for integrating microelectromechanical devices with electronic circuitry

    DOEpatents

    Montague, Stephen; Smith, James H.; Sniegowski, Jeffry J.; McWhorter, Paul J.

    1998-01-01

    A method for integrating one or more microelectromechanical (MEM) devices with electronic circuitry. The method comprises the steps of forming each MEM device within a cavity below a device surface of the substrate; encapsulating the MEM device prior to forming electronic circuitry on the substrate; and releasing the MEM device for operation after fabrication of the electronic circuitry. Planarization of the encapsulated MEM device prior to formation of the electronic circuitry allows the use of standard processing steps for fabrication of the electronic circuitry.

  11. Level of Skill Argued Students on Physics Material

    NASA Astrophysics Data System (ADS)

    Viyanti, V.; Cari, C.; Sunarno, W.; Prasetyo, Z. K.

    2017-09-01

    This study aims to analyze students' prior knowledge in order to map their level of argumentation skill on floating-and-sinking material. Prior knowledge is the process of concept formation in cognitive processes, occurring spontaneously or based on student experience. The study population was high school students of class XI; cluster random sampling yielded a sample of 50 students. The research used a descriptive survey method, with data obtained through a reasoned multiple-choice test and interviews. The data analysis focused on concept alignment and on the activity of developing argumentation skill. On average, the level of argumentation skill in terms of prior knowledge was at "Level 2". The data show that students have difficulty expressing even simple arguments consisting of only one statement, indicating a lack of student experience in cultivating argumentation skills in their learning. The skill-level mapping in this study is intended as a reference for researchers to provide feedback measures that produce positive change in argumentation under cognitive conflict.

  12. A predictive approach to selecting the size of a clinical trial, based on subjective clinical opinion.

    PubMed

    Spiegelhalter, D J; Freedman, L S

    1986-01-01

    The 'textbook' approach to determining sample size in a clinical trial has some fundamental weaknesses which we discuss. We describe a new predictive method which takes account of prior clinical opinion about the treatment difference. The method adopts the point of clinical equivalence (determined by interviewing the clinical participants) as the null hypothesis. Decision rules at the end of the study are based on whether the interval estimate of the treatment difference (classical or Bayesian) includes the null hypothesis. The prior distribution is used to predict the probabilities of making the decisions to use one or other treatment or to reserve final judgement. It is recommended that sample size be chosen to control the predicted probability of the last of these decisions. An example is given from a multi-centre trial of superficial bladder cancer.
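    The predictive calculation described above can be sketched by simulation, assuming normal outcomes and a normal prior on the treatment difference. The function below and every parameter value in it are illustrative, not the authors' implementation: the true difference is drawn from the prior, the trial estimate is drawn given that truth, and the end-of-trial decision is classified by where the 95% CI falls relative to the point of clinical equivalence.

```python
import numpy as np

def predicted_decision_probs(n_per_arm, prior_mean, prior_sd, sigma,
                             equiv=0.0, sims=50_000, seed=1):
    """Predict the probabilities of the three end-of-trial decisions:
    adopt the new treatment, keep the old one, or reserve judgement."""
    rng = np.random.default_rng(seed)
    z = 1.959963984540054                  # two-sided 95% normal quantile
    se = sigma * np.sqrt(2.0 / n_per_arm)  # SE of a two-arm mean difference
    delta = rng.normal(prior_mean, prior_sd, sims)  # truth drawn from prior
    est = rng.normal(delta, se)                     # trial estimate given truth
    lo, hi = est - z * se, est + z * se
    use_new = np.mean(lo > equiv)          # CI wholly above equivalence
    use_old = np.mean(hi < equiv)          # CI wholly below equivalence
    return use_new, use_old, 1.0 - use_new - use_old

# Hypothetical design: 25 per arm, prior N(0.5, 0.5) on the difference, sigma = 1.
probs = predicted_decision_probs(n_per_arm=25, prior_mean=0.5, prior_sd=0.5, sigma=1.0)
```

    Increasing `n_per_arm` shrinks the predicted probability of the "reserve judgement" decision, which is the quantity the paper recommends controlling when choosing sample size.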

  13. Evaluation of an alternative method for hiring air traffic control specialists with prior military experience.

    DOT National Transportation Integrated Search

    1992-01-01

    This study was conducted to assess an FAA program to hire former military air traffic control specialists to enter ATC field training directly without first attending the Academy screening program. Selection of military controllers was based on meeti...

  14. Panitumumab Use in Metastatic Colorectal Cancer and Patterns of KRAS Testing: Results from a Europe-Wide Physician Survey and Medical Records Review

    PubMed Central

    Trojan, Jörg; Mineur, Laurent; Tomášek, Jiří; Rouleau, Etienne; Fabian, Pavel; de Maglio, Giovanna; García-Alfonso, Pilar; Aprile, Giuseppe; Taylor, Aliki; Kafatos, George; Downey, Gerald; Terwey, Jan-Henrik; van Krieken, J. Han

    2015-01-01

    Background From 2008–2013, the European indication for panitumumab required that patients’ tumor KRAS exon 2 mutation status was known prior to starting treatment. To evaluate physician awareness of panitumumab prescribing information and how physicians prescribe panitumumab in patients with metastatic colorectal cancer (mCRC), two European multi-country, cross-sectional, observational studies were initiated in 2012: a physician survey and a medical records review. The first two out of three planned rounds for each study are reported. Methods The primary objective in the physician survey was to estimate the prevalence of KRAS testing, and in the medical records review, it was to evaluate the effect of test results on patterns of panitumumab use. The medical records review study also included a pathologists’ survey. Results In the physician survey, nearly all oncologists (299/301) were aware of the correct panitumumab indication and the need to test patients’ tumor KRAS status before treatment with panitumumab. Nearly all oncologists (283/301) had in the past 6 months of clinical practice administered panitumumab correctly to mCRC patients with wild-type KRAS status. In the medical records review, 97.5% of participating oncologists (77/79) conducted a KRAS test for all of their patients prior to prescribing panitumumab. Four patients (1.3%) did not have tumor KRAS mutation status tested prior to starting panitumumab treatment. Approximately one-quarter of patients (85/306) were treated with panitumumab and concurrent oxaliplatin-containing chemotherapy; of these, 83/85 had confirmed wild-type KRAS status prior to starting panitumumab treatment. All 56 referred laboratories that participated used a Conformité Européenne-marked or otherwise validated KRAS detection method, and nearly all (55/56) participated in a quality assurance scheme. 
Conclusions There was a high level of knowledge amongst oncologists around panitumumab prescribing information and the need to test and confirm patients’ tumors as being wild-type KRAS prior to treatment with panitumumab, with or without concurrent oxaliplatin-containing therapy. PMID:26491871

  15. Bayesian analysis of multimethod ego-depletion studies favours the null hypothesis.

    PubMed

    Etherton, Joseph L; Osborne, Randall; Stephenson, Katelyn; Grace, Morgan; Jones, Chas; De Nadai, Alessandro S

    2018-04-01

    Ego-depletion refers to the purported decrease in performance on a task requiring self-control after engaging in a previous task involving self-control, with self-control proposed to be a limited resource. Despite many published studies consistent with this hypothesis, recurrent null findings within our laboratory and indications of publication bias have called into question the validity of the depletion effect. This project used three depletion protocols, involving three different depleting initial tasks followed by three different self-control tasks as dependent measures (total n = 840). For each method, effect sizes were not significantly different from zero. When data were aggregated across the three methods and examined meta-analytically, the pooled effect size was not significantly different from zero (for all priors evaluated, Hedges' g = 0.10 with 95% credibility interval of [-0.05, 0.24]) and Bayes factors reflected strong support for the null hypothesis (Bayes factor > 25 for all priors evaluated). © 2018 The British Psychological Society.
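    As a rough stand-in for the Bayesian analysis described (not the authors' method), a BIC-based approximation to the Bayes factor in favour of the null for a one-sample t statistic can be computed in closed form:

```python
import math

def bf01_bic(t, n):
    """BIC / unit-information approximation (Wagenmakers, 2007) to the
    Bayes factor in favour of the null for a one-sample t statistic:
    BF01 ~ sqrt(n) * (1 + t^2 / (n - 1)) ** (-n / 2)."""
    return math.sqrt(n) * (1.0 + t * t / (n - 1)) ** (-n / 2.0)

# With t exactly 0, the approximation favours the null by a factor sqrt(n);
# large t values drive BF01 toward zero (evidence against the null).
```

    This approximation implies a particular unit-information prior; the paper's "for all priors evaluated" phrasing indicates the authors checked robustness across several prior choices instead.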

  16. Chicago Classification Criteria of Esophageal Motility Disorders Defined in High Resolution Esophageal Pressure Topography (EPT)†

    PubMed Central

    Bredenoord, Albert J; Fox, Mark; Kahrilas, Peter J; Pandolfino, John E; Schwizer, Werner; Smout, AJPM; Conklin, Jeffrey L; Cook, Ian J; Gyawali, Prakash; Hebbard, Geoffrey; Holloway, Richard H; Ke, Meiyun; Keller, Jutta; Mittal, Ravinder K; Peters, Jeff; Richter, Joel; Roman, Sabine; Rommel, Nathalie; Sifrim, Daniel; Tutuian, Radu; Valdovinos, Miguel; Vela, Marcelo F; Zerbib, Frank

    2011-01-01

    Background The Chicago Classification of esophageal motility was developed to facilitate the interpretation of clinical high resolution esophageal pressure topography (EPT) studies, concurrent with the widespread adoption of this technology into clinical practice. The Chicago Classification has been, and will continue to be, an evolutionary process, molded first by published evidence pertinent to the clinical interpretation of high resolution manometry (HRM) studies and secondarily by group experience when suitable evidence is lacking. Methods This publication summarizes the state of our knowledge as of the most recent meeting of the International High Resolution Manometry Working Group in Ascona, Switzerland in April 2011. The prior iteration of the Chicago Classification was updated through a process of literature analysis and discussion. Key Results The major changes in this document from the prior iteration are largely attributable to research studies published since the prior iteration, in many cases research conducted in response to prior deliberations of the International High Resolution Manometry Working Group. The classification now includes criteria for subtyping achalasia, EGJ outflow obstruction, motility disorders not observed in normal subjects (Distal esophageal spasm, Hypercontractile esophagus, and Absent peristalsis), and statistically defined peristaltic abnormalities (Weak peristalsis, Frequent failed peristalsis, Rapid contractions with normal latency, and Hypertensive peristalsis). Conclusions & Inferences The Chicago Classification is an algorithmic scheme for diagnosis of esophageal motility disorders from clinical EPT studies. Moving forward, we anticipate continuing this process with increased emphasis placed on natural history studies and outcome data based on the classification. PMID:22248109

  17. Mixture model based joint-MAP reconstruction of attenuation and activity maps in TOF-PET

    NASA Astrophysics Data System (ADS)

    Hemmati, H.; Kamali-Asl, A.; Ghafarian, P.; Ay, M. R.

    2018-06-01

    A challenge in producing quantitative positron emission tomography (PET) images is to provide an accurate, patient-specific photon attenuation correction. In PET/MR scanners, the nature of MR signals and hardware limitations make extraction of the attenuation map a real challenge. Except for a constant scale factor, the activity and attenuation maps can be determined from emission data on a TOF-PET system by the maximum likelihood reconstruction of attenuation and activity (MLAA) approach. The aim of the present study is to constrain this joint estimation of activity and attenuation using a mixture-model prior based on the attenuation-map histogram. This novel prior enforces non-negativity, and its hyperparameters can be estimated by a mixture-decomposition step applied to the current estimate of the attenuation map. The proposed method can also help resolve the scaling problem and is capable of assigning predefined regional attenuation coefficients, with some degree of confidence, to the attenuation map, similar to segmentation-based attenuation-correction approaches. The performance of the algorithm was studied with numerical and Monte Carlo simulations and a phantom experiment, and was compared with the MLAA algorithm with and without a smoothing prior. The results demonstrate that the proposed algorithm is capable of producing cross-talk-free activity and attenuation images from emission data. The proposed approach has potential as a practical and competitive method for joint reconstruction of activity and attenuation maps from emission data on PET/MR and can be integrated with other methods.
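    The mixture-decomposition step the abstract mentions can be sketched as a plain EM fit of a one-dimensional Gaussian mixture to attenuation-map samples. This is a generic sketch, not the paper's algorithm; the two components below (air near 0 and water-like tissue near 0.096 cm⁻¹ at 511 keV) and the synthetic data are hypothetical:

```python
import numpy as np

def gmm_em_1d(x, k=2, iters=200):
    """Fit a k-component 1-D Gaussian mixture by EM, with quantile-based
    initialisation of the component means."""
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        # E-step: responsibility of each component for each sample.
        dens = w * np.exp(-(x[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)
        # M-step: responsibility-weighted moment updates.
        nk = resp.sum(axis=0)
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = np.maximum((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk, 1e-12)
    return w, mu, var

# Hypothetical attenuation samples: air (~0) and water-like tissue (~0.096 cm^-1).
rng = np.random.default_rng(0)
x = np.concatenate([rng.normal(0.0, 0.004, 1000), rng.normal(0.096, 0.004, 1000)])
w, mu, var = gmm_em_1d(x)
```

    In the paper's setting, the fitted component means and weights would serve as the hyperparameters of the histogram prior at each update of the attenuation map.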

  18. Acidified pressurized hot water for the continuous extraction of cadmium and lead from plant materials prior to ETAAS

    NASA Astrophysics Data System (ADS)

    Morales-Muñoz, S.; Luque-García, J. L.; Luque de Castro, M. D.

    2003-01-01

    Acidified, pressurized hot water is proposed for the continuous leaching of Cd and Pb from plants prior to determination by electrothermal atomic absorption spectrometry. Beech leaves (a certified reference material, CRM 100, in which the analytes were not certified) were used to optimize the method by a multivariate approach. The samples (0.5 g) were subjected to dynamic extraction with water modified with 1% v/v HNO3 at 250 °C as the leachant. A kinetics study was performed to characterize the extraction process. The method was validated with a CRM in which the analytes had been certified (olive leaves, 062 from the BCR). The agreement between the certified values and those found using the proposed method demonstrates its usefulness. The repeatability and within-laboratory reproducibility were 3.7% and 2.3% for Cd and 1.04% and 6.3% for Pb, respectively. The precision of the method, together with its efficiency, rapidity, and environmental acceptability, makes it a good alternative for the determination of trace metals in plant material.

  19. Multiaxis Rainflow Fatigue Methods for Nonstationary Vibration

    NASA Technical Reports Server (NTRS)

    Irvine, T.

    2016-01-01

    Mechanical structures and components may be subjected to cyclical loading conditions, including sine and random vibration. Such systems must be designed and tested accordingly. Rainflow cycle counting is the standard method for reducing a stress time history to a table of amplitude-cycle pairings prior to the Palmgren-Miner cumulative damage calculation. The damage calculation is straightforward for sinusoidal stress but very complicated for random stress, particularly for nonstationary vibration. This paper evaluates candidate methods and makes a recommendation for further study of a hybrid technique.
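    The rainflow reduction mentioned above can be sketched with a simplified three-point stack algorithm. This is a teaching sketch, not the paper's method: it counts the unresolved residual as half cycles and omits the extra start-point rule of the full ASTM E1049 procedure, so its counts can differ from the standard's on the same history.

```python
def turning_points(series):
    """Reduce a stress history to its sequence of local extrema."""
    pts = [series[0]]
    for x in series[1:]:
        if x == pts[-1]:
            continue
        if len(pts) >= 2 and (pts[-1] - pts[-2]) * (x - pts[-1]) > 0:
            pts[-1] = x          # still rising/falling: extend the excursion
        else:
            pts.append(x)
    return pts

def rainflow(series):
    """Simplified three-point rainflow count, returning (range, count)
    pairs with the residual turning points counted as half cycles."""
    stack, cycles = [], []
    for p in turning_points(series):
        stack.append(p)
        while len(stack) >= 3:
            x = abs(stack[-1] - stack[-2])
            y = abs(stack[-2] - stack[-3])
            if x < y:
                break
            cycles.append((y, 1.0))      # full cycle closed by range y
            stack[-3:] = [stack[-1]]     # drop the two points forming y
    cycles += [(abs(b - a), 0.5) for a, b in zip(stack, stack[1:])]
    return cycles
```

    The resulting amplitude-cycle pairs feed directly into a Palmgren-Miner damage sum, which is the step the paper identifies as hard for nonstationary random stress.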

  20. A novel tracing method for the segmentation of cell wall networks.

    PubMed

    De Vylder, Jonas; Rooms, Filip; Dhondt, Stijn; Inze, Dirk; Philips, Wilfried

    2013-01-01

    Cell wall networks are a common subject of research in biology and are important for plant growth analysis, organ studies, etc. In order to automate the detection of individual cells in such networks, we propose a new segmentation algorithm. The proposed method is a network-tracing algorithm that exploits prior knowledge of the network structure. The method is applicable to multiple microscopy modalities, such as fluorescence, as well as to images captured with non-invasive microscopes such as differential interference contrast (DIC) microscopes.

  1. Is the Recall of Verbal-Spatial Information from Working Memory Affected by Symptoms of ADHD?

    ERIC Educational Resources Information Center

    Caterino, Linda C.; Verdi, Michael P.

    2012-01-01

    Objective: The Kulhavy model for text learning using organized spatial displays proposes that learning will be increased when participants view visual images prior to related text. In contrast to previous studies, this study also included students who exhibited symptoms of ADHD. Method: Participants were presented with either a map-text or…

  2. Can Pictures Promote the Acquisition of Sight-Word Reading? An Evaluation of Two Potential Instructional Strategies

    ERIC Educational Resources Information Center

    Richardson, Amy R.; Lerman, Dorothea C.; Nissen, Melissa A.; Luck, Kally M.; Neal, Ashley E.; Bao, Shimin; Tsami, Loukia

    2017-01-01

    Sight-word instruction can be a useful supplement to phonics-based methods under some circumstances. Nonetheless, few studies have evaluated the conditions under which pictures may be used successfully to teach sight-word reading. In this study, we extended prior research by examining two potential strategies for reducing the effects of…

  3. Correlates of Intellectual Ability with Morphology of the Hippocampus and Amygdala in Healthy Adults

    ERIC Educational Resources Information Center

    Amat, Jose A.; Bansal, Ravi; Whiteman, Ronald; Haggerty, Rita; Royal, Jason; Peterson, Bradley S.

    2008-01-01

    Several prior imaging studies of healthy adults have correlated volumes of the hippocampus and amygdala with measures of general intelligence (IQ), with variable results. In this study, we assessed correlations between volumes of the hippocampus and amygdala and full-scale IQ scores (FSIQ) using a method of image analysis that permits detailed…

  4. Preconditions for Post-Employment Learning: Preliminary Results from Ongoing Research

    ERIC Educational Resources Information Center

    Salter, Linda

    2011-01-01

    This article describes the first phase of a two-phase, mixed-method study. The study, now in progress, explores how and to what extent willingness to engage in learning in mature adulthood is influenced by prior experiences and specific individual personality variables, such as perceived locus of control and degree of self-efficacy. Study…

  5. Prediagnostic Serum Biomarkers as Early Detection Tools for Pancreatic Cancer in a Large Prospective Cohort Study

    PubMed Central

    Nolen, Brian M.; Brand, Randall E.; Prosser, Denise; Velikokhatnaya, Liudmila; Allen, Peter J.; Zeh, Herbert J.; Grizzle, William E.; Lomakin, Aleksey; Lokshin, Anna E.

    2014-01-01

    Background The clinical management of pancreatic cancer is severely hampered by the absence of effective screening tools. Methods Sixty-seven biomarkers were evaluated in prediagnostic sera obtained from cases of pancreatic cancer enrolled in the Prostate, Lung, Colorectal, and Ovarian Cancer Screening Trial (PLCO). Results The panel of CA 19-9, OPN, and OPG, identified in a prior retrospective study, was not effective. CA 19-9, CEA, NSE, bHCG, CEACAM1 and PRL were significantly altered in sera obtained from cases greater than 1 year prior to diagnosis. Levels of CA 19-9, CA 125, CEA, PRL, and IL-8 were negatively associated with time to diagnosis. A training/validation study using alternate halves of the PLCO set failed to identify a biomarker panel with significantly improved performance over CA 19-9 alone. When the entire PLCO set was used for training at a specificity (SP) of 95%, a panel of CA 19-9, CEA, and Cyfra 21-1 provided significantly elevated sensitivity (SN) levels of 32.4% and 29.7% in samples collected <1 and >1 year prior to diagnosis, respectively, compared to SN levels of 25.7% and 17.2% for CA 19-9 alone. Conclusions Most biomarkers identified in previously conducted case/control studies are ineffective in prediagnostic samples, however several biomarkers were identified as significantly altered up to 35 months prior to diagnosis. Two newly derived biomarker combinations offered advantage over CA 19-9 alone in terms of SN, particularly in samples collected >1 year prior to diagnosis. However, the efficacy of biomarker-based tools remains limited at present. Several biomarkers demonstrated significant velocity related to time to diagnosis, an observation which may offer considerable potential for enhancements in early detection. PMID:24747429
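    The "sensitivity at 95% specificity" metric used in the abstract can be sketched as follows (an illustrative implementation, not the study's analysis code): set the decision threshold at the 95th percentile of the control scores, then report the fraction of cases above it.

```python
import numpy as np

def sensitivity_at_specificity(control_scores, case_scores, spec=0.95):
    """Threshold at the `spec` quantile of control scores (so roughly
    `spec` of controls fall below it), then return the fraction of
    cases scoring above that threshold (the sensitivity)."""
    thr = np.quantile(np.asarray(control_scores, dtype=float), spec)
    return float(np.mean(np.asarray(case_scores, dtype=float) > thr))
```

    Fixing specificity this way lets panels (e.g. CA 19-9 + CEA + Cyfra 21-1) be compared to CA 19-9 alone on sensitivity at the same false-positive rate.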

  6. Effect of post-encoding emotion on recollection and familiarity for pictures.

    PubMed

    Wang, Bo; Ren, Yanju

    2017-07-01

    Although prior studies have examined the effect of post-encoding emotional arousal on recognition memory for words, it is unknown whether the enhancement effect observed on words generalizes to pictures. Furthermore, prior studies using words have shown that the effect of emotional arousal can be modulated by stimulus valence and delay in emotion induction, but it is unclear whether such modulation extends to pictures and whether other factors such as encoding method (incidental vs. intentional encoding) can be modulatory. Five experiments were conducted to answer these questions. In Experiment 1, participants encoded a list of neutral and negative pictures and then watched a 3-min neutral or negative video. The delayed test showed that negative arousal impaired recollection regardless of picture valence but had no effect on familiarity. Experiment 2 replicated the above findings. Experiment 3 was similar to Experiment 1 except that participants watched a 3-min neutral, negative, or positive video and performed free recall before the recognition test. Unlike the prior two experiments, the impairment effect of negative arousal disappeared. Experiment 4, in which the free recall task was eliminated, replicated the results of Experiment 3. Experiment 5 replicated Experiments 1 and 2 and further showed that the impairment effect of negative arousal could be modulated by delay in emotion induction but not by encoding method or stimulus valence. Taken together, the current study suggests that the enhancement effect observed on words may not generalize to pictures.

  7. Effect of endoscopic transpapillary biliary drainage with/without endoscopic sphincterotomy on post-endoscopic retrograde cholangiopancreatography pancreatitis in patients with biliary stricture (E-BEST): a protocol for a multicentre randomised controlled trial

    PubMed Central

    Kato, Shin; Kuwatani, Masaki; Sugiura, Ryo; Sano, Itsuki; Kawakubo, Kazumichi; Ono, Kota; Sakamoto, Naoya

    2017-01-01

    Introduction The effect of endoscopic sphincterotomy prior to endoscopic biliary stenting on prevention of post-endoscopic retrograde cholangiopancreatography pancreatitis remains to be fully elucidated. The aim of this study is to prospectively evaluate the non-inferiority of stenting without endoscopic sphincterotomy, compared with endoscopic sphincterotomy prior to stenting, for the naïve major duodenal papilla in patients with biliary stricture. Methods and analysis We designed a multicentre randomised controlled trial, for which we will recruit 370 patients with biliary stricture requiring endoscopic biliary stenting from 26 high-volume institutions in Japan. Patients will be randomly allocated to the endoscopic sphincterotomy group or the non-endoscopic sphincterotomy group. The main outcome measure is the incidence of pancreatitis within 2 days of initial transpapillary biliary drainage. Data will be analysed on completion of the study. We will calculate the 95% confidence intervals (CIs) of the incidence of pancreatitis in each group and analyse whether the difference between the groups, with its 95% CI, lies within the non-inferiority margin (6%) using the Wald method. Ethics and dissemination This study has been approved by the institutional review board of Hokkaido University Hospital (IRB: 016-0181). Results will be submitted for presentation at an international medical conference and published in a peer-reviewed journal. Trial registration number The University Hospital Medical Information Network ID: UMIN000025727; Pre-results. PMID:28801436
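    The planned Wald analysis can be sketched as below. The 6% margin is the protocol's stated non-inferiority margin, but the event counts in the usage line are hypothetical:

```python
import math

def wald_noninferiority(x1, n1, x2, n2, margin=0.06, z=1.959963984540054):
    """Wald 95% CI for the difference in event proportions (group 1 minus
    group 2); non-inferiority is declared when the upper CI limit lies
    below the margin."""
    p1, p2 = x1 / n1, x2 / n2
    diff = p1 - p2
    se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    lo, hi = diff - z * se, diff + z * se
    return diff, (lo, hi), hi < margin

# Hypothetical counts: 10/185 pancreatitis events vs. 8/185.
diff, ci, noninferior = wald_noninferiority(10, 185, 8, 185)
```

    Note the Wald interval is a large-sample approximation; with very few events, score-based intervals behave better, which is a common reason protocols pre-specify the analysis method.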

  8. Linear Regression with a Randomly Censored Covariate: Application to an Alzheimer's Study.

    PubMed

    Atem, Folefac D; Qian, Jing; Maye, Jacqueline E; Johnson, Keith A; Betensky, Rebecca A

    2017-01-01

    The association between maternal age of onset of dementia and amyloid deposition (measured by in vivo positron emission tomography (PET) imaging) in cognitively normal older offspring is of interest. In a regression model for amyloid, special methods are required due to the random right censoring of the covariate, maternal age of onset of dementia. Prior literature has proposed methods to address censoring due to an assay limit of detection, but not random censoring. We propose imputation methods and a survival regression method that do not require parametric assumptions about the distribution of the censored covariate. Existing imputation methods address missing covariates, but not right-censored covariates. In simulation studies, we compare these methods to the simple but inefficient complete-case analysis, and to thresholding approaches. We apply the methods to the Alzheimer's study.

  9. Total knee arthroplasty fibrosis following arthroscopic intervention

    PubMed Central

    Churchill, Jessica L.; Sodhi, Nipun; Khlopas, Anton; Piuzzi, Nicolas S.; Dalton, Sarah E.; Chughtai, Morad; Sultan, Assem A.; Jones, Steven; Williams, Nick; Mont, Michael A.

    2017-01-01

    Background Although arthroscopy is generally considered to be a relatively benign procedure with limited trauma to periarticular soft tissues, post-arthroscopic bleeding as well as osmolality differences between the normal saline used to irrigate and the native synovial fluid (282 vs. 420 mOs) can lead to capsular reactions. Therefore, the purpose of this study was to evaluate whether capsular reaction occurred after knee arthroscopy, by comparing a matched cohort of patients who either did or did not undergo prior arthroscopic surgery. Specifically, we compared histological features such as: (I) synovial thickness; (II) cellularity; and (III) the amount of fibrous tissue for each cohort. Methods Prior to their total knee arthroplasty (TKA), 40 consecutive patients who had previously undergone arthroscopy were matched to 40 consecutive patients who had not. During each patient’s TKA, a biopsy of the capsule and fat pad was taken and formalin sections were sent to pathology to assess for synovial thickness, cellularity, and the amount of fibrous tissue. The pathologist was blinded to the groupings. Findings for all histologic features were classified as equivocal, slight to moderate, and moderate to severe. Results There were a significantly higher proportion of patients who had increased synovial thickness in the prior arthroscopy group as compared to the no-prior arthroscopy group (97.5% vs. 0%, P<0.001). Additionally, there were a significantly higher proportion of patients who had increased cellularity in the prior arthroscopy group as compared to the no-prior arthroscopy group (60.0% vs. 0%, P<0.001). There were also a significantly higher proportion of patients who had increased fibrous tissue in the prior arthroscopy group as compared to the no-prior arthroscopy group (95% vs. 62.5%, P<0.001). 
Conclusions Arthroscopic surgery may have long-term effects on capsular tissue: intraoperative observations of patients with prior arthroscopic surgery in this study found that the capsule is thicker and denser, and histologic assessment confirms that there may be increased synovial thickness, increased cellularity, and thickening of fibrous tissue. This is a preliminary study, and further evaluation is required; the findings suggest that arthroscopic surgery may have long-lasting effects on periarticular tissue, especially the capsular tissue, which may have implications for pain and functional recovery. PMID:29299475

  10. An investigation of multitasking information behavior and the influence of working memory and flow

    NASA Astrophysics Data System (ADS)

    Alexopoulou, Peggy; Hepworth, Mark; Morris, Anne

    2015-02-01

    This study explored the multitasking information behaviour of Web users and how this is influenced by working memory, flow, and the Personal, Artefact and Task characteristics described in the PAT model. The research was exploratory, using a pragmatic, mixed-method approach. Thirty university students participated: 10 psychologists, 10 accountants, and 10 mechanical engineers. The data collection tools used were: pre- and post-questionnaires, a working memory test, a flow state scale test, audio-visual data, web search logs, think-aloud data, observation, and the critical decision method. All participants searched the Web for information on four topics: two for which they had prior knowledge and two for which they did not. Perception of task complexity was found to be related to working memory. People with low working memory reported a significant increase in task complexity after they had completed information-searching tasks for which they had no prior knowledge; this was not the case for tasks with prior knowledge. Regarding flow and task complexity, the results confirmed the suggestion of the PAT model (Finneran and Zhang, 2003) that a complex task can lead to anxiety and low flow levels as well as to perceived challenge and high flow levels. However, the results did not confirm the suggestion of the PAT model regarding the characteristics of web search systems, especially perceived vividness. All participants experienced high vividness, whereas according to the PAT model only people with high flow should experience high levels of vividness. Flow affected the degree of change in participants' knowledge: people with high flow gained more knowledge for tasks without prior knowledge than people with low flow did. Furthermore, accountants felt that tasks without prior knowledge were less complex at the end of the web-seeking procedure than did psychologists and mechanical engineers. Finally, the three disciplines appeared to differ in multitasking information behaviour characteristics such as queries, web search sessions, and opened tabs/windows.

  11. [Reconstituting evaluation methods based on both qualitative and quantitative paradigms].

    PubMed

    Miyata, Hiroaki; Okubo, Suguru; Yoshie, Satoru; Kai, Ichiro

    2011-01-01

    Debate about the relationship between quantitative and qualitative paradigms is often muddled and confusing, and the clutter of terms and arguments has resulted in the concepts becoming obscure and unrecognizable. In this study we conducted a content analysis of the evaluation methods used in qualitative healthcare research. We extracted descriptions of four types of evaluation paradigm (validity/credibility, reliability/credibility, objectivity/confirmability, and generalizability/transferability) and classified them into subcategories. In quantitative research, there have been many evaluation methods based on qualitative paradigms, and vice versa. Thus, it may not be useful to treat evaluation methods of the qualitative paradigm as isolated from those of quantitative methods. Choosing practical evaluation methods based on the situation and prior conditions of each study is an important approach for researchers.

  12. Epigenetic priors for identifying active transcription factor binding sites.

    PubMed

    Cuellar-Partida, Gabriel; Buske, Fabian A; McLeay, Robert C; Whitington, Tom; Noble, William Stafford; Bailey, Timothy L

    2012-01-01

    Accurate knowledge of the genome-wide binding of transcription factors in a particular cell type or under a particular condition is necessary for understanding transcriptional regulation. Epigenetic data such as histone modification and DNase I accessibility data have been shown to improve motif-based in silico methods for predicting such binding, but this approach has not yet been fully explored. We describe a probabilistic method for combining one or more tracks of epigenetic data with a standard DNA sequence motif model to improve our ability to identify active transcription factor binding sites (TFBSs). We convert each data type into a position-specific probabilistic prior and combine these priors with a traditional probabilistic motif model to compute a log-posterior odds score. Our experiments, using histone modifications H3K4me1, H3K4me3, H3K9ac and H3K27ac, as well as DNase I sensitivity, show conclusively that the log-posterior odds score consistently outperforms a simple binary filter based on the same data. We also show that our approach performs competitively with a more complex method, CENTIPEDE, and suggest that the relative simplicity of the log-posterior odds scoring method makes it an appealing and very general method for identifying functional TFBSs on the basis of DNA and epigenetic evidence. FIMO, part of the MEME Suite software toolkit, now supports log-posterior odds scoring using position-specific priors for motif search. A web server and source code are available at http://meme.nbcr.net. Utilities for creating priors are at http://research.imb.uq.edu.au/t.bailey/SD/Cuellar2011. Contact: t.bailey@uq.edu.au. Supplementary data are available at Bioinformatics online.
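    The scoring idea described above can be sketched in a few lines: a motif log-likelihood ratio plus the log-odds of a position-specific prior. This is a toy illustration, not the paper's implementation; the PWM values and prior probabilities below are hypothetical.

```python
import math

# Toy position weight matrix (PWM) for a length-3 motif: probabilities of
# each base at each motif position (hypothetical values).
PWM = [
    {"A": 0.7, "C": 0.1, "G": 0.1, "T": 0.1},
    {"A": 0.1, "C": 0.1, "G": 0.7, "T": 0.1},
    {"A": 0.1, "C": 0.7, "G": 0.1, "T": 0.1},
]
BACKGROUND = {"A": 0.25, "C": 0.25, "G": 0.25, "T": 0.25}

def log_posterior_odds(window, prior):
    """Motif log-likelihood ratio plus the prior log-odds for this position.

    `prior` is a position-specific probability (derived, e.g., from a
    histone-modification or DNase I signal) that the site is active.
    """
    llr = sum(math.log2(col[b] / BACKGROUND[b]) for col, b in zip(PWM, window))
    return llr + math.log2(prior / (1.0 - prior))

def scan(sequence, priors):
    """Score every motif-length window; priors[i] applies to position i."""
    w = len(PWM)
    return [log_posterior_odds(sequence[i:i + w], priors[i])
            for i in range(len(sequence) - w + 1)]

scores = scan("AGCTAGC", [0.5, 0.9, 0.1, 0.5, 0.9])
# A window matching the motif under a high prior outranks the same match
# under a low prior, unlike a binary accessible/inaccessible filter.
```

    Unlike thresholding the epigenetic track into a binary filter, the prior term shifts scores continuously, so ranking by score retains weak matches in highly accessible regions.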

  13. Partial articular-sided rotator cuff tears: in situ repair versus tear completion prior to repair.

    PubMed

    Sethi, Paul M; Rajaram, Arun; Obopilwe, Elifho; Mazzocca, Augustus D

    2013-06-01

    Uncertainty exists over the ideal surgical treatment method for partial articular-sided rotator cuff tears, with options ranging from debridement to in situ repair to tear completion prior to repair. The purpose of this study was to determine whether in situ repair was a viable biomechanical treatment option compared with tear completion prior to repair of partial articular-sided rotator cuff tears. Fourteen fresh-frozen cadaveric shoulders were dissected. Partial articular-sided tears were created and repaired using in situ repair or tear completion prior to the repair. Strain and displacement were measured at 45°, 60°, and 90° of glenohumeral abduction. Testing was performed with a load of 100 N applied for 30 cycles. Data from the biomechanical testing displayed 4 conditions that showed improved characteristics of in situ repair over completion and repair: bursal-sided strain anteriorly at 45°, bursal-sided strain anteriorly at 90°, bursal-sided displacement anteriorly at 45°, and bursal-sided displacement anteriorly at 90°. The data indicate that in situ repair is a viable biomechanical treatment option compared with tear completion prior to repair of partial articular-sided rotator cuff tears. When clinically appropriate, the in situ repair may offer some biomechanical advantages, with lower strain and displacement observed on the bursal side compared with tear completion prior to repair. Copyright 2013, SLACK Incorporated.

  14. Bayesian survival analysis in clinical trials: What methods are used in practice?

    PubMed

    Brard, Caroline; Le Teuff, Gwénaël; Le Deley, Marie-Cécile; Hampson, Lisa V

    2017-02-01

    Background Bayesian statistics are an appealing alternative to the traditional frequentist approach to the design, analysis, and reporting of clinical trials, especially in rare diseases. Time-to-event endpoints are widely used in many medical fields. There are additional complexities to designing Bayesian survival trials which arise from the need to specify a model for the survival distribution. The objective of this article was to critically review the use and reporting of Bayesian methods in survival trials. Methods A systematic review of clinical trials using Bayesian survival analyses was performed through the PubMed and Web of Science databases. This was complemented by a full-text search of the online repositories of pre-selected journals. Cost-effectiveness, dose-finding studies, meta-analyses, and methodological papers using clinical trials were excluded. Results In total, 28 articles met the inclusion criteria: 25 were original reports of clinical trials and 3 were re-analyses of a clinical trial. Most trials were in oncology (n = 25), were randomised controlled (n = 21) phase III trials (n = 13), and half considered a rare disease (n = 13). Bayesian approaches were used for monitoring in 14 trials and for the final analysis only in 14 trials. In the latter case, Bayesian survival analyses were used for the primary analysis in four cases, for the secondary analysis in seven cases, and for the trial re-analysis in three cases. Overall, 12 articles reported fitting Bayesian regression models (semi-parametric, n = 3; parametric, n = 9). Prior distributions were often incompletely reported: 20 articles did not define the prior distribution used for the parameter of interest. Where specified, over half of the trials used only non-informative priors for monitoring and the final analysis (n = 12). Indeed, no articles fitting Bayesian regression models placed informative priors on the parameter of interest. 
The prior for the treatment effect was based on historical data in only four trials. Decision rules were pre-defined in eight cases when trials used Bayesian monitoring, and in only one case when trials adopted a Bayesian approach to the final analysis. Conclusion Few trials implemented a Bayesian survival analysis and few incorporated external data into priors. There is scope to improve the quality of reporting of Bayesian methods in survival trials. Extension of the Consolidated Standards of Reporting Trials statement for reporting Bayesian clinical trials is recommended.
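    As background for the informative-versus-non-informative distinction discussed above, a minimal sketch of a conjugate Bayesian survival analysis: an exponential model with a Gamma prior on the hazard rate. This is not taken from any reviewed trial; all counts and follow-up times are hypothetical.

```python
# With a Gamma(a, b) prior on the hazard rate of an exponential survival
# model, observing d events over total follow-up time T gives the
# posterior Gamma(a + d, b + T) -- the conjugate update.

def posterior(a, b, events, total_time):
    return a + events, b + total_time

# A vague prior versus an informative prior built from historical data
# (historical prior mean hazard = 20/100 = 0.2 events per unit time).
vague_a, vague_b = posterior(0.001, 0.001, events=10, total_time=25.0)
info_a, info_b = posterior(20.0, 100.0, events=10, total_time=25.0)

vague_mean = vague_a / vague_b   # close to the observed rate 10/25 = 0.4
info_mean = info_a / info_b      # shrunk toward the historical rate 0.2
```

    The informative posterior mean lies between the historical rate and the observed rate, which is exactly the borrowing behaviour that makes prior specification worth reporting.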

  15. Incorporation of stochastic engineering models as prior information in Bayesian medical device trials.

    PubMed

    Haddad, Tarek; Himes, Adam; Thompson, Laura; Irony, Telba; Nair, Rajesh

    2017-01-01

    Evaluation of medical devices via clinical trial is often a necessary step in the process of bringing a new product to market. In recent years, device manufacturers have increasingly used stochastic engineering models during the product development process. These models have the capability to simulate virtual patient outcomes. This article presents a novel method based on the power prior for augmenting a clinical trial using virtual patient data. To properly inform clinical evaluation, the virtual patient model must simulate the clinical outcome of interest, incorporating patient variability, as well as the uncertainty in the engineering model and in its input parameters. The number of virtual patients is controlled by a discount function that uses the similarity between modeled and observed data. This method is illustrated by a case study of cardiac lead fracture. Different discount functions are used to cover a wide range of scenarios in which the type I error rates and power vary for the same number of enrolled patients. Incorporation of engineering models as prior knowledge in a Bayesian clinical trial design can provide benefits of decreased sample size and trial length while still controlling type I error rate and power.
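    The power prior named above can be sketched for a binomial endpoint (e.g., a fracture rate). The discount function and all counts below are hypothetical stand-ins; the paper's discount is likewise driven by the similarity between modeled and observed data.

```python
def power_prior_posterior(a0, b0, x_virtual, n_virtual, alpha, x_obs, n_obs):
    """Beta-binomial posterior with virtual-patient data discounted by alpha.

    posterior ∝ L(p | observed) * L(p | virtual)^alpha * Beta(a0, b0),
    which stays Beta because the binomial likelihood is conjugate.
    """
    a = a0 + alpha * x_virtual + x_obs
    b = b0 + alpha * (n_virtual - x_virtual) + (n_obs - x_obs)
    return a, b

def discount(p_virtual, p_obs, scale=5.0):
    """Hypothetical discount function: agreement between modeled and
    observed event rates controls how many virtual patients are borrowed."""
    return max(0.0, 1.0 - scale * abs(p_virtual - p_obs))

alpha = discount(p_virtual=0.04, p_obs=0.05)   # rates agree -> alpha near 1
a, b = power_prior_posterior(1, 1, x_virtual=40, n_virtual=1000,
                             alpha=alpha, x_obs=5, n_obs=100)
effective_n = alpha * 1000   # virtual patients effectively "enrolled"
```

    When the simulated and observed rates diverge, `alpha` shrinks toward zero and the analysis reverts to the observed data alone, which is how type I error can be kept in check while still shortening the trial.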

  16. Iterative reconstruction for x-ray computed tomography using prior-image induced nonlocal regularization.

    PubMed

    Zhang, Hua; Huang, Jing; Ma, Jianhua; Bian, Zhaoying; Feng, Qianjin; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2014-09-01

    Repeated X-ray computed tomography (CT) scans are often required in several specific applications, such as perfusion imaging, image-guided needle biopsy, image-guided intervention, and radiotherapy, with noticeable benefits. However, the associated cumulative radiation dose increases significantly in comparison with that of a conventional CT scan, which has raised major concerns for patients. In this study, to realize radiation dose reduction by reducing the X-ray tube current and exposure time (mAs) in repeated CT scans, we propose a prior-image induced nonlocal (PINL) regularization for statistical iterative reconstruction via the penalized weighted least-squares (PWLS) criterion, which we refer to as "PWLS-PINL". Specifically, the PINL regularization utilizes the redundant information in the prior image, and the weighted least-squares term considers a data-dependent variance estimation, aiming to improve the quality of current low-dose images. Subsequently, a modified iterative successive overrelaxation algorithm is adopted to optimize the associated objective function. Experimental results on both phantom and patient data show that the present PWLS-PINL method can achieve promising gains over other existing methods in terms of noise reduction, low-contrast object detection, and edge detail preservation.

  17. Iterative Reconstruction for X-Ray Computed Tomography using Prior-Image Induced Nonlocal Regularization

    PubMed Central

    Ma, Jianhua; Bian, Zhaoying; Feng, Qianjin; Lu, Hongbing; Liang, Zhengrong; Chen, Wufan

    2014-01-01

    Repeated x-ray computed tomography (CT) scans are often required in several specific applications, such as perfusion imaging, image-guided needle biopsy, image-guided intervention, and radiotherapy, with noticeable benefits. However, the associated cumulative radiation dose increases significantly in comparison with that of a conventional CT scan, which has raised major concerns for patients. In this study, to realize radiation dose reduction by reducing the x-ray tube current and exposure time (mAs) in repeated CT scans, we propose a prior-image induced nonlocal (PINL) regularization for statistical iterative reconstruction via the penalized weighted least-squares (PWLS) criterion, which we refer to as “PWLS-PINL”. Specifically, the PINL regularization utilizes the redundant information in the prior image, and the weighted least-squares term considers a data-dependent variance estimation, aiming to improve the quality of current low-dose images. Subsequently, a modified iterative successive over-relaxation algorithm is adopted to optimize the associated objective function. Experimental results on both phantom and patient data show that the present PWLS-PINL method can achieve promising gains over other existing methods in terms of noise reduction, low-contrast object detection and edge detail preservation. PMID:24235272
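    A rough 1-D illustration of the PWLS-with-nonlocal-prior idea in records 16 and 17 (not the authors' implementation): the forward model is replaced by the identity (denoising instead of CT reconstruction), the nonlocal weights come from patch similarity in a prior image, and the objective is minimized by plain gradient descent. All sizes and parameter values are hypothetical.

```python
import numpy as np

# Toy objective:  x* = argmin (y - x)^T W (y - x) + beta * sum_jk w_jk (x_j - x_k)^2
# where the nonlocal weights w_jk are computed from the (clean) prior image.

rng = np.random.default_rng(0)
prior = np.repeat([0.0, 1.0], 32)            # clean prior image (edge at 32)
y = prior + 0.3 * rng.standard_normal(64)    # noisy "low-dose" measurement
W = np.full(64, 1.0)                         # data-dependent weights (toy: uniform)

def nonlocal_weights(img, search=5, h=0.2):
    """Similarity weights within a small search window of the prior image."""
    n = len(img)
    w = np.zeros((n, n))
    for j in range(n):
        for k in range(max(0, j - search), min(n, j + search + 1)):
            if k != j:
                w[j, k] = np.exp(-((img[j] - img[k]) / h) ** 2)
    return w

w = nonlocal_weights(prior)
x = y.copy()
beta, step = 0.5, 0.05
for _ in range(200):
    # gradient of the data term plus the (graph-Laplacian) nonlocal penalty
    grad = 2 * W * (x - y) + 2 * beta * (w.sum(1) * x - w @ x)
    x -= step * grad

# The prior-induced weights couple only similar pixels, so noise is smoothed
# within each flat region while the edge at index 32 stays sharp.
```

    The step size is chosen below the stability bound of the quadratic objective; in the real method the identity is replaced by the CT system matrix and the update by a successive over-relaxation scheme.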

  18. Recognition of Prior Learning at the Centre of a National Strategy: Tensions between Professional Gains and Personal Development

    ERIC Educational Resources Information Center

    Lima, Licínio C.; Guimarães, Paula

    2016-01-01

    This paper focuses on recognition of prior learning as part of a national policy based on European Union guidelines for lifelong learning, and it explains how recognition of prior learning has been perceived since it was implemented in Portugal in 2000. Data discussed are the result of a mixed method research project that surveyed adult learners,…

  19. Low dose tomographic fluoroscopy: 4D intervention guidance with running prior

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, Barbara; Kuntz, Jan; Brehm, Marcus

    Purpose: Today's standard imaging technique in interventional radiology is single- or biplane x-ray fluoroscopy, which delivers 2D projection images as a function of time (2D+T). This state-of-the-art technology, however, suffers from its projective nature and is limited by the superposition of the patient's anatomy. Temporally resolved tomographic volumes (3D+T) would significantly improve the visualization of complex structures. A continuous tomographic data acquisition, if carried out with today's technology, would yield an excessive patient dose. Recently the authors proposed a method that enables tomographic fluoroscopy at the same dose level as projective fluoroscopy, meaning that if the scanning time of an intervention guided by projective fluoroscopy is the same as that of an intervention guided by tomographic fluoroscopy, almost the same dose is administered to the patient. The purpose of this work is to extend the authors' previous work and allow for patient motion during the intervention. Methods: The authors propose the running prior technique for adaptation of a prior image. This adaptation is realized by a combination of registration and projection replacement. In a first step the prior is deformed to the current position via affine and deformable registration. Then the information from outdated projections is replaced by newly acquired projections using forward- and backprojection steps. The thus-adapted volume is the running prior. The proposed method is validated on simulated as well as measured data. To investigate motion during intervention, a moving head phantom was simulated. Real in vivo data of a pig were acquired by a prototype CT system consisting of a flat detector and a continuously rotating clinical gantry. Results: With the running prior technique it is possible to correct for motion without additional dose. For an application in intervention guidance, both steps of the running prior technique, registration and replacement, are necessary. Reconstructed volumes based on the running prior show high image quality without introducing new artifacts, and the interventional materials are displayed at the correct position. Conclusions: The running prior improves the robustness of low-dose 3D+T intervention guidance toward intended or unintended patient motion.

  20. Bayesian Nonparametric Ordination for the Analysis of Microbial Communities.

    PubMed

    Ren, Boyu; Bacallado, Sergio; Favaro, Stefano; Holmes, Susan; Trippa, Lorenzo

    2017-01-01

    Human microbiome studies use sequencing technologies to measure the abundance of bacterial species or Operational Taxonomic Units (OTUs) in samples of biological material. Typically the data are organized in contingency tables with OTU counts across heterogeneous biological samples. In the microbial ecology community, ordination methods are frequently used to investigate latent factors or clusters that capture and describe variations of OTU counts across biological samples. It remains important to evaluate how uncertainty in estimates of each biological sample's microbial distribution propagates to ordination analyses, including visualization of clusters and projections of biological samples on low dimensional spaces. We propose a Bayesian analysis for dependent distributions to endow frequently used ordinations with estimates of uncertainty. A Bayesian nonparametric prior for dependent normalized random measures is constructed, which is marginally equivalent to the normalized generalized Gamma process, a well-known prior for nonparametric analyses. In our prior, the dependence and similarity between microbial distributions is represented by latent factors that concentrate in a low dimensional space. We use a shrinkage prior to tune the dimensionality of the latent factors. The resulting posterior samples of model parameters can be used to evaluate uncertainty in analyses routinely applied in microbiome studies. Specifically, by combining them with multivariate data analysis techniques we can visualize credible regions in ecological ordination plots. The characteristics of the proposed model are illustrated through a simulation study and applications in two microbiome datasets.

  1. Connect the Dots and Pinhole Constellations.

    ERIC Educational Resources Information Center

    Kominski, John

    1991-01-01

    Identifies a variety of methods to introduce constellations and asterisms to students in the classroom and planetarium prior to their study of the night sky. Materials used include transparencies, oatmeal boxes, photographic slides, and tracing paper. Exercises incorporate storytelling and prediction of location, movement, and seasonal patterns of…

  2. New prior sampling methods for nested sampling - Development and testing

    NASA Astrophysics Data System (ADS)

    Stokes, Barrie; Tuyl, Frank; Hudson, Irene

    2017-06-01

    Nested Sampling is a powerful algorithm for fitting models to data in the Bayesian setting, introduced by Skilling [1]. The nested sampling algorithm proceeds by carrying out a series of compressive steps, involving successively nested iso-likelihood boundaries, starting with the full prior distribution of the problem parameters. The "central problem" of nested sampling is to draw at each step a sample from the prior distribution whose likelihood is greater than the current likelihood threshold, i.e., a sample falling inside the current likelihood-restricted region. For both flat and informative priors this ultimately requires uniform sampling restricted to the likelihood-restricted region. We present two new methods of carrying out this sampling step, and illustrate their use with the lighthouse problem [2], a bivariate likelihood used by Gregory [3] and a trivariate Gaussian mixture likelihood. All the algorithm development and testing reported here has been done with Mathematica® [4].
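    The compressive loop described in this record, with the "central problem" step implemented by the simplest possibility, brute-force rejection from the prior, can be sketched as follows. The setup (uniform prior on [-5, 5], 1-D standard-normal likelihood) is assumed for illustration; it is not one of the paper's test problems.

```python
import math, random

random.seed(1)

def loglike(x):
    # 1-D standard-normal log-likelihood
    return -0.5 * x * x - 0.5 * math.log(2 * math.pi)

def sample_prior():
    return random.uniform(-5.0, 5.0)

def sample_restricted(logl_star):
    """Draw from the prior restricted to L > L* by rejection (slow but exact)."""
    while True:
        x = sample_prior()
        if loglike(x) > logl_star:
            return x

n_live, n_iter = 100, 600
live = [sample_prior() for _ in range(n_live)]
logz_terms = []
for i in range(n_iter):
    worst = min(live, key=loglike)
    logl_star = loglike(worst)
    # expected prior-volume shrinkage: X_i ≈ exp(-i / n_live)
    log_width = -i / n_live + math.log(1.0 - math.exp(-1.0 / n_live))
    logz_terms.append(logl_star + log_width)
    live[live.index(worst)] = sample_restricted(logl_star)

# log-sum-exp of the accumulated evidence terms; the small mass left in
# the live points is neglected in this sketch.
m = max(logz_terms)
log_z = m + math.log(sum(math.exp(t - m) for t in logz_terms))
# Analytic check: Z = (1/10) * ∫ N(x; 0, 1) dx over [-5, 5] ≈ 0.1
```

    Rejection from the full prior becomes exponentially expensive as the likelihood-restricted region shrinks, which is precisely why more efficient restricted-sampling steps, such as those the paper develops, are needed.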

  3. Optical aberration correction for simple lenses via sparse representation

    NASA Astrophysics Data System (ADS)

    Cui, Jinlin; Huang, Wei

    2018-04-01

    Simple lenses with spherical surfaces are lightweight, inexpensive, highly flexible, and easy to process. However, they suffer from optical aberrations that limit high-quality photography. In this study, we propose a set of computational photography techniques based on sparse signal representation to remove optical aberrations, thereby allowing the recovery of images captured through a single-lens camera. The primary advantage of the proposed method is that many point spread functions, calibrated in advance at different depths, can be used to restore images in a short time; this strategy can be applied to non-blind deconvolution methods in general to address the excessive processing time caused by the large number of point spread functions. The optical design software CODE V is used to examine the reliability of the proposed method by simulation. The simulation results reveal that the suggested method outperforms traditional methods, and the performance of a single-lens camera is significantly enhanced both qualitatively and perceptually. In particular, the prior information obtained with CODE V can be used to process real images from a single-lens camera, which provides an alternative approach to conveniently and accurately obtain the point spread functions of single-lens cameras.
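    The paper's sparse-representation restoration is beyond a short sketch, but the core of non-blind deconvolution with a pre-calibrated point spread function can be illustrated with a standard Wiener-filter baseline (not the authors' method; the 1-D kernel, signal, and noise level below are hypothetical).

```python
import numpy as np

rng = np.random.default_rng(0)
psf = np.array([0.25, 0.5, 0.25])           # hypothetical calibrated blur kernel
signal = np.zeros(64)
signal[20:40] = 1.0                         # clean edge profile
n = len(signal)

# Simulate capture: circular convolution with the PSF plus sensor noise.
otf = np.fft.fft(np.pad(psf, (0, n - len(psf))))
blurred = np.real(np.fft.ifft(np.fft.fft(signal) * otf))
blurred += 0.01 * rng.standard_normal(n)

# Wiener deconvolution: regularized inverse of the OTF.
nsr = 1e-3                                  # noise-to-signal power (tuning knob)
wiener = np.conj(otf) / (np.abs(otf) ** 2 + nsr)
restored = np.real(np.fft.ifft(np.fft.fft(blurred) * wiener))
# `restored` recovers the edge profile much more closely than `blurred`.
```

    With a depth-dependent stack of PSFs, the same frequency-domain filter is cheap to apply per depth, which is the efficiency concern the sparse-representation approach targets.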

  4. Gunshot residue testing in suicides: Part II: Analysis by inductive coupled plasma-atomic emission spectrometry.

    PubMed

    Molina, D Kimberley; Castorena, Joe L; Martinez, Michael; Garcia, James; DiMaio, Vincent J M

    2007-09-01

    Several different methods can be employed to test for gunshot residue (GSR) on a decedent's hands, including scanning electron microscopy with energy dispersive x-ray (SEM/EDX) and inductively coupled plasma-atomic emission spectrometry (ICP-AES). In part I of this two-part series, GSR results obtained by SEM/EDX in undisputed cases of suicidal handgun wounds were studied. In part II, the same population, deceased persons with undisputed suicidal handgun wounds, was studied, but GSR testing was performed using ICP-AES. A total of 102 cases were studied and analyzed for caliber of weapon, proximity of wound, and the results of the GSR testing. This study found that 50% of cases where the deceased was known to have fired a handgun immediately prior to death had positive GSR results by ICP-AES, which did not differ from the results of GSR testing by SEM/EDX. Since only 50% of cases where the person is known to have fired a weapon were positive for GSR by either method, this test should not be relied upon to determine whether someone has discharged a firearm, and it is not useful as a determining factor of whether a wound is self-inflicted or non-self-inflicted. While a positive GSR result may be of use, a negative result is not helpful in the medical examiner setting, as it is equally consistent with the person having fired a weapon prior to death and with the person not having fired one.

  5. The value of X-ray digital tomosynthesis in the diagnosis of urinary calculi

    PubMed Central

    Liu, Shifeng; Wang, Hong; Feng, Weihua; Hu, Xiaokun; Guo, Jian; Shang, Qingjun; Li, Zixiang; Yu, Hongsheng

    2018-01-01

    Urinary calculus is a common and recurrent condition that affects kidney function. The present study evaluated the use of digital tomosynthesis (DTS) and Kidneys-Ureters-Bladder (KUB) radiography as methods of diagnosing urinary calculi. Unenhanced multidetector computed tomography (UMDCT) was used in the diagnosis of calculi. KUB radiography and DTS procedures were conducted on patients prior to and following bowel preparation to detect kidney, ureteral and bladder calculi. Differences in diagnostic performance of KUB radiography and DTS imaging on prepared and unprepared bowel were evaluated using the χ2 test. The consistency of diagnostic results between two examining physicians was analyzed using the κ test. A total of 138 calculi from 80 patients were detected via UMDCT. The calculi detection rates of KUB prior to and following bowel preparation were 47.8 and 66.7% respectively, and the calculi detection rate of DTS prior to and following bowel preparation were 94.2 and 96.4%, respectively. The detection rates of calculi >5 mm via KUB prior to and following bowel preparation were 56.6 and 73.5% respectively, and in DTS they were 100% prior to and following bowel preparation. Economically, DTS performed on the unprepared bowel was the most cost effective, followed by DTS on the prepared bowel, KUB on the unprepared bowel and KUB on the prepared bowel. Therefore, the current study concluded that DTS may be an appropriate first-line imaging technique in patients with urinary calculi. PMID:29434761

  6. The value of X-ray digital tomosynthesis in the diagnosis of urinary calculi.

    PubMed

    Liu, Shifeng; Wang, Hong; Feng, Weihua; Hu, Xiaokun; Guo, Jian; Shang, Qingjun; Li, Zixiang; Yu, Hongsheng

    2018-02-01

    Urinary calculus is a common and recurrent condition that affects kidney function. The present study evaluated the use of digital tomosynthesis (DTS) and Kidneys-Ureters-Bladder (KUB) radiography as methods of diagnosing urinary calculi. Unenhanced multidetector computed tomography (UMDCT) was used in the diagnosis of calculi. KUB radiography and DTS procedures were conducted on patients prior to and following bowel preparation to detect kidney, ureteral and bladder calculi. Differences in diagnostic performance of KUB radiography and DTS imaging on prepared and unprepared bowel were evaluated using the χ2 test. The consistency of diagnostic results between two examining physicians was analyzed using the κ test. A total of 138 calculi from 80 patients were detected via UMDCT. The calculi detection rates of KUB prior to and following bowel preparation were 47.8 and 66.7% respectively, and the calculi detection rate of DTS prior to and following bowel preparation were 94.2 and 96.4%, respectively. The detection rates of calculi >5 mm via KUB prior to and following bowel preparation were 56.6 and 73.5% respectively, and in DTS they were 100% prior to and following bowel preparation. Economically, DTS performed on the unprepared bowel was the most cost-effective, followed by DTS on the prepared bowel, KUB on the unprepared bowel and KUB on the prepared bowel. Therefore, the current study concluded that DTS may be an appropriate first-line imaging technique in patients with urinary calculi.

  7. The weighted priors approach for combining expert opinions in logistic regression experiments

    DOE PAGES

    Quinlan, Kevin R.; Anderson-Cook, Christine M.; Myers, Kary L.

    2017-04-24

    When modeling the reliability of a system or component, it is not uncommon for more than one expert to provide very different prior estimates of the expected reliability as a function of an explanatory variable such as age or temperature. Our goal in this paper is to incorporate all information from the experts when choosing a design about which units to test. Bayesian design of experiments has been shown to be very successful for generalized linear models, including logistic regression models. We use this approach to develop methodology for the case where there are several potentially non-overlapping priors under consideration. While multiple priors have been used for analysis in the past, they have never been used in a design context. The Weighted Priors method performs well for a broad range of true underlying model parameter choices and is more robust when compared to other reasonable design choices. Finally, we illustrate the method through multiple scenarios and a motivating example. Additional figures for this article are available in the online supplementary information.

  8. The weighted priors approach for combining expert opinions in logistic regression experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Quinlan, Kevin R.; Anderson-Cook, Christine M.; Myers, Kary L.

    When modeling the reliability of a system or component, it is not uncommon for more than one expert to provide very different prior estimates of the expected reliability as a function of an explanatory variable such as age or temperature. Our goal in this paper is to incorporate all information from the experts when choosing a design about which units to test. Bayesian design of experiments has been shown to be very successful for generalized linear models, including logistic regression models. We use this approach to develop methodology for the case where there are several potentially non-overlapping priors under consideration. While multiple priors have been used for analysis in the past, they have never been used in a design context. The Weighted Priors method performs well for a broad range of true underlying model parameter choices and is more robust when compared to other reasonable design choices. Finally, we illustrate the method through multiple scenarios and a motivating example. Additional figures for this article are available in the online supplementary information.
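    The weighted-priors design idea in records 7 and 8 can be sketched roughly: score each candidate design by a Bayesian design criterion averaged over draws from each expert's prior, weighted by expert. Everything below (two experts, normal priors, the log-determinant criterion, the candidate designs) is a hypothetical illustration, not the paper's methodology in detail.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two experts with very different (non-overlapping) priors on the logistic
# model logit(p) = b0 + b1 * x, with equal weights (hypothetical values).
experts = [
    {"w": 0.5, "mean": np.array([-2.0, 0.5]), "sd": np.array([0.5, 0.1])},
    {"w": 0.5, "mean": np.array([1.0, -0.4]), "sd": np.array([0.5, 0.1])},
]

def fisher_logdet(design_x, beta):
    """log-det of the logistic-regression Fisher information at beta."""
    X = np.column_stack([np.ones_like(design_x), design_x])
    p = 1.0 / (1.0 + np.exp(-(X @ beta)))
    v = p * (1.0 - p)
    return np.linalg.slogdet(X.T @ (v[:, None] * X))[1]

def weighted_criterion(design_x, n_draws=200):
    """Expert-weighted average of the Bayesian D-optimality proxy."""
    total = 0.0
    for e in experts:
        draws = rng.normal(e["mean"], e["sd"], size=(n_draws, 2))
        total += e["w"] * np.mean([fisher_logdet(design_x, b) for b in draws])
    return total

candidates = [np.linspace(0, 10, 6), np.linspace(4, 6, 6), np.linspace(0, 20, 6)]
best = max(candidates, key=weighted_criterion)
```

    Averaging the criterion over all experts, rather than picking one prior, is what keeps the chosen design from being badly inefficient if the other expert turns out to be right.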

  9. Patient-Specific Early Seizure Detection from Scalp EEG

    PubMed Central

    Minasyan, Georgiy R.; Chatten, John B.; Chatten, Martha Jane; Harner, Richard N.

    2010-01-01

    Objective Develop a method for automatic detection of seizures prior to or immediately after clinical onset using features derived from scalp EEG. Methods This detection method is patient-specific. It uses recurrent neural networks and a variety of input features. For each patient we trained and optimized the detection algorithm for two cases: 1) during the period immediately preceding seizure onset, and 2) during the period immediately following seizure onset. Continuous scalp EEG recordings (duration 15–62 h, median 25 h) from 25 patients, including a total of 86 seizures, were used in this study. Results Pre-onset detection was successful in 14 of the 25 patients. For these 14 patients, all of the testing seizures were detected prior to seizure onset, with a median pre-onset time of 51 sec and a false-positive rate of 0.06/h. Post-onset detection had 100% sensitivity, a false-positive rate of 0.023/h, and a median delay of 4 sec after onset. Conclusions The unique results of this study relate to pre-onset detection. Significance Our results suggest that reliable pre-onset seizure detection may be achievable for a significant subset of epilepsy patients without use of invasive electrodes. PMID:20461014

  10. Minimization for conditional simulation: Relationship to optimal transport

    NASA Astrophysics Data System (ADS)

    Oliver, Dean S.

    2014-05-01

    In this paper, we consider the problem of generating independent samples from a conditional distribution when independent samples from the prior distribution are available. Although there are exact methods for sampling from the posterior (e.g. Markov chain Monte Carlo or acceptance/rejection), these methods tend to be computationally demanding when evaluation of the likelihood function is expensive, as it is for most geoscience applications. As an alternative, in this paper we discuss deterministic mappings of variables distributed according to the prior to variables distributed according to the posterior. Although any deterministic mappings might be equally useful, we will focus our discussion on a class of algorithms that obtain implicit mappings by minimization of a cost function that includes measures of data mismatch and model variable mismatch. Algorithms of this type include quasi-linear estimation, randomized maximum likelihood, perturbed observation ensemble Kalman filter, and ensemble of perturbed analyses (4D-Var). When the prior pdf is Gaussian and the observation operators are linear, we show that these minimization-based simulation methods solve an optimal transport problem with a nonstandard cost function. When the observation operators are nonlinear, however, the mapping of variables from the prior to the posterior obtained from those methods is only approximate. Errors arise from neglect of the Jacobian determinant of the transformation and from the possibility of discontinuous mappings.
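For the linear-Gaussian case discussed above, the minimization-based mapping (randomized maximum likelihood) has a closed form: perturb both the prior draw and the observation, then apply a Kalman-style gain to minimize the quadratic cost. A scalar sketch (the toy setup and names are my assumptions, not the paper's code):

```python
import numpy as np

def rml_sample(rng, g, d, mu0, var0, var_d, n_samples=20000):
    """Randomized maximum likelihood for a scalar linear-Gaussian problem.

    Each sample draws from the prior N(mu0, var0) and perturbs the
    observation d with noise of variance var_d, then minimizes
    (g*m - d')^2/var_d + (m - m_pr)^2/var0 in closed form. For linear g
    the resulting samples are exact posterior draws.
    """
    m_pr = rng.normal(mu0, np.sqrt(var0), n_samples)   # prior draws
    d_pert = rng.normal(d, np.sqrt(var_d), n_samples)  # perturbed data
    k = var0 * g / (g * g * var0 + var_d)              # Kalman-style gain
    return m_pr + k * (d_pert - g * m_pr)
```

For nonlinear observation operators the same minimization can still be run numerically, but, as the abstract notes, the resulting mapping is only approximately a posterior sampler.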

  11. What is the perception of biological risk by undergraduate nursing students?

    PubMed Central

    Moreno-Arroyo, Mª Carmen; Puig-Llobet, Montserrat; Falco-Pegueroles, Anna; Lluch-Canut, Maria Teresa; García, Irma Casas; Roldán-Merino, Juan

    2016-01-01

    Abstract Objective: to analyze undergraduate nursing students' perception of biological risk and its relationship with their prior practical training. Method: a descriptive cross-sectional study was conducted among undergraduate nursing students enrolled in clinical practice courses in the academic year 2013-2014 at the School of Nursing at the University of Barcelona. Variables: sociodemographic variables, employment, training, clinical experience and other variables related to the assessment of perceived biological risk were collected. Both a newly developed tool and the Dimensional Assessment of Risk Perception at the worker level scale (Escala de Evaluación Dimensional del Riesgo Percibido por el Trabajador, EDRP-T) were used. Statistical analysis: descriptive and univariate analysis were used to identify differences between the perception of biological risk of the EDRP-T scale items and sociodemographic variables. Results: students without prior practical training had weaker perceptions of biological risk compared to students with prior practical training (p=0.05 and p=0.04, respectively). Weaker perceptions of biological risk were found among students with prior work experience. Conclusion: practical training and work experience influence the perception of biological risk among nursing students. PMID:27384468

  12. Students' inductive reasoning skills and the relevance of prior knowledge: an exploratory study with a computer-based training course on the topic of acne vulgaris.

    PubMed

    Horn-Ritzinger, Sabine; Bernhardt, Johannes; Horn, Michael; Smolle, Josef

    2011-04-01

    The importance of inductive instruction in medical education is growing. Little is known about the relevance of prior knowledge to students' inductive reasoning abilities. The purpose is to evaluate this inductive teaching method as a means of fostering higher levels of learning and to explore how individual differences in prior knowledge (high [HPK] vs. low [LPK]) contribute to students' inductive reasoning skills. Twenty-six LPK and 18 HPK students could train twice with an interactive computer-based training object to discover the underlying concept before completing the final comprehension check. Students had a median of 76.9% of correct answers in the first training, 90.9% in the second, and answered 92% of the final assessment questions correctly. More important, 86% of all students succeeded with inductive learning, among them 83% of the HPK students and 89% of the LPK students. Prior knowledge did not predict performance on overall comprehension. This inductive instructional strategy fostered students' deep approaches to learning in a time-effective way.

  13. Use of Prophylactic Antibiotics to Prevent Abscess Formation Following Hepatic Ablation in Patients with Prior Enterobiliary Manipulation

    PubMed Central

    Richter, Michael; Aloia, Thomas A.; Conrad, Claudius; Ahrar, Kamran; Gupta, Sanjay; Vauthey, Jean-Nicolas; Huang, Steven Y.

    2016-01-01

    Introduction Prior enterobiliary manipulation confers a high risk for liver abscess formation after hepatic ablation. We aimed to determine if prophylactic antibiotics could prevent post-ablation abscess in patients with a history of hepaticojejunostomy. Materials and Methods This single-institution retrospective study identified 262 patients who underwent 307 percutaneous liver ablation sessions between January 2010 and August 2014. Twelve (4.6%) patients with prior hepaticojejunostomy were included in this analysis. Ten (83.3%) had received an aggressive prophylactic antibiotic regimen consisting of levofloxacin, metronidazole, neomycin, and erythromycin base. Two (16.6%) had received other antibiotic regimens. Clinical, laboratory, and imaging findings were used to identify abscess formation and antibiotic-related side effects. Results Twelve ablation sessions were performed during the period studied. During a mean follow-up period of 440 days (range, 77–1784 days), post-ablation abscesses had developed in 2 (16.6%) patients, who both received the alternative antibiotic regimens. None of the 10 patients who received the aggressive prophylactic antibiotic regimen developed liver abscess. One of the 10 patients who received the aggressive prophylactic antibiotic regimen developed grade 2 antibiotic-related diarrhea and arthralgia. Conclusion An aggressive regimen of prophylactic antibiotics may be effective in preventing liver abscess formation after liver ablation in patients with prior hepaticojejunostomy. PMID:26984694

  14. Evaluating a Collaborative Approach to Improve Prior Authorization Efficiency in the Treatment of Hepatitis C Virus

    PubMed Central

    Dunn, Emily E.; Vranek, Kathryn; Hynicka, Lauren M.; Gripshover, Janet; Potosky, Darryn

    2017-01-01

    Objective: A team-based approach to obtaining prior authorization approval was implemented utilizing a specialty pharmacy, a clinic-based pharmacy technician specialist, and a registered nurse to work with providers to obtain approval for medications for hepatitis C virus (HCV) infection. The objective of this study was to evaluate the time to approval for prescribed treatment of HCV infection. Methods: A retrospective observational study was conducted including patients treated for HCV infection by clinic providers who received at least 1 oral direct-acting antiviral HCV medication. Patients were divided into 2 groups, based on whether they were treated before or after the implementation of the team-based approach. Student t tests were used to compare average wait times before and after the intervention. Results: The sample included 180 patients, 68 treated before the intervention and 112 patients who initiated therapy after. All patients sampled required prior authorization approval by a third-party payer to begin therapy. There was a statistically significant reduction (P = .02) in average wait time in the postintervention group (15.6 ± 12.1 days) once adjusted using dates of approval. Conclusions: Pharmacy collaboration may provide increases in efficiency in provider prior authorization practices and reduced wait time for patients to begin treatment. PMID:28665904

  15. A model to systematically employ professional judgment in the Bayesian Decision Analysis for a semiconductor industry exposure assessment.

    PubMed

    Torres, Craig; Jones, Rachael; Boelter, Fred; Poole, James; Dell, Linda; Harper, Paul

    2014-01-01

    Bayesian Decision Analysis (BDA) uses Bayesian statistics to integrate multiple types of exposure information and classify exposures within the exposure rating categorization scheme promoted in American Industrial Hygiene Association (AIHA) publications. Prior distributions for BDA may be developed from existing monitoring data, mathematical models, or professional judgment. Professional judgments may misclassify exposures. We suggest that a structured qualitative risk assessment (QLRA) method can provide consistency and transparency in professional judgments. In this analysis, we use a structured QLRA method to define prior distributions (priors) for BDA. We applied this approach at three semiconductor facilities in South Korea, and present an evaluation of the performance of structured QLRA for determination of priors, and an evaluation of occupational exposures using BDA. Specifically, the structured QLRA was applied to chemical agents in similar exposure groups to identify provisional risk ratings. Standard priors were developed for each risk rating before review of historical monitoring data. Newly collected monitoring data were used to update priors informed by QLRA or historical monitoring data, and determine the posterior distribution. Exposure ratings were defined by the rating category with the highest probability, i.e., the most likely rating. We found the most likely exposure rating in the QLRA-informed priors to be consistent with historical and newly collected monitoring data, and the posterior exposure ratings developed with QLRA-informed priors to be equal to or greater than those developed with data-informed priors in 94% of comparisons. Overall, exposures at these facilities are consistent with well-controlled work environments. That is, the 95th percentiles of the exposure distributions are ≤50% of the occupational exposure limit (OEL) for all chemical-SEG combinations evaluated, and ≤10% of the limit for 94% of chemical-SEG combinations evaluated.
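Stripped of the exposure-assessment details, the core update is a discrete Bayes rule over rating categories: multiply the QLRA-informed prior by the likelihood of the new monitoring data under each category and renormalize. A minimal sketch with hypothetical numbers (the probabilities below are illustrative, not from the study):

```python
def update_rating(prior, likelihood):
    """Posterior over AIHA-style exposure rating categories.

    prior: dict category -> prior probability (e.g. from structured QLRA);
    likelihood: dict category -> probability of the new monitoring data
    under that category. Both inputs here are illustrative placeholders.
    Returns the normalized posterior and the most likely category.
    """
    post = {c: prior[c] * likelihood[c] for c in prior}
    z = sum(post.values())
    post = {c: p / z for c, p in post.items()}
    best = max(post, key=post.get)  # the "most likely" exposure rating
    return post, best
```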

  16. Curing Composite Materials Using Lower-Energy Electron Beams

    NASA Technical Reports Server (NTRS)

    Byrne, Catherine A.; Bykanov, Alexander

    2004-01-01

    In an improved method of fabricating composite-material structures by laying up prepreg tapes (tapes of fiber reinforcement impregnated by uncured matrix materials) and then curing them, one cures the layups by use of beams of electrons having kinetic energies in the range of 200 to 300 keV. In contrast, in a prior method, one used electron beams characterized by kinetic energies up to 20 MeV. The improved method was first suggested by an Italian group in 1993, but had not been demonstrated until recently. With respect to both the prior method and the present improved method, the impetus for the use of electron-beam curing is a desire to avoid the high costs of autoclaves large enough to effect thermal curing of large composite-material structures. Unfortunately, in the prior method, the advantages of electron-beam curing are offset by the need for special walls and ceilings on curing chambers to shield personnel from x rays generated by impacts of energetic electrons. These shields must be thick [typically 2 to 3 ft (about 0.6 to 0.9 m) if made of concrete] and are therefore expensive. They also make it difficult to bring large structures into and out of the curing chambers. Currently, all major companies that fabricate composite-material spacecraft and aircraft structures form their layups by use of automated tape placement (ATP) machines. In the present improved method, an electron-beam gun is attached to an ATP head and used to irradiate the tape as it is pressed onto the workpiece. The electron kinetic energy between 200 and 300 keV is sufficient for penetration of the ply being laid plus one or two of the plies underneath it. Provided that the electron-beam gun is properly positioned, it is possible to administer the required electron dose and, at the same time, to protect personnel with less shielding than is needed in the prior method. Adequate shielding can be provided by concrete walls 6 ft (approximately equal to 1.8 m) high and 16 in. (approximately equal to 41 cm) thick, without a ceiling. The success of the present method depends on the use of a cationic epoxy as the matrix material in the prepreg tape, heating the prepreg tape to a temperature of 50 C immediately prior to layup, and exposing the workpiece to an electron-beam dose of approximately 2 Mrad. Experiments have shown that structures fabricated by the present method have the same mechanical properties as those of nominally identical structures fabricated by the prior method with electron beams of 3 to 4 MeV.

  17. Boosting Probabilistic Graphical Model Inference by Incorporating Prior Knowledge from Multiple Sources

    PubMed Central

    Praveen, Paurush; Fröhlich, Holger

    2013-01-01

    Inferring regulatory networks from experimental data via probabilistic graphical models is a popular framework to gain insights into biological systems. However, the inherent noise in experimental data coupled with a limited sample size reduces the performance of network reverse engineering. Prior knowledge from existing sources of biological information can address this low signal to noise problem by biasing the network inference towards biologically plausible network structures. Although integrating various sources of information is desirable, their heterogeneous nature makes this task challenging. We propose two computational methods to incorporate various information sources into a probabilistic consensus structure prior to be used in graphical model inference. Our first model, called Latent Factor Model (LFM), assumes a high degree of correlation among external information sources and reconstructs a hidden variable as a common source in a Bayesian manner. The second model, a Noisy-OR, picks up the strongest support for an interaction among information sources in a probabilistic fashion. Our extensive computational studies on KEGG signaling pathways as well as on gene expression data from breast cancer and yeast heat shock response reveal that both approaches can significantly enhance the reconstruction accuracy of Bayesian Networks compared to other competing methods as well as to the situation without any prior. Our framework allows for using diverse information sources, like pathway databases, GO terms and protein domain data, etc. and is flexible enough to integrate new sources, if available. PMID:23826291

  18. MRAC Control with Prior Model Knowledge for Asymmetric Damaged Aircraft

    PubMed Central

    Zhang, Jing

    2015-01-01

    This paper develops a novel state-tracking multivariable model reference adaptive control (MRAC) technique utilizing prior knowledge of plant models to recover control performance of an asymmetric structural damaged aircraft. A modification of linear model representation is given. With prior knowledge on structural damage, a polytope linear parameter varying (LPV) model is derived to cover all concerned damage conditions. An MRAC method is developed for the polytope model, of which the stability and asymptotic error convergence are theoretically proved. The proposed technique reduces the number of parameters to be adapted and thus decreases computational cost and requires less input information. The method is validated by simulations on NASA generic transport model (GTM) with damage. PMID:26180839

  19. Should we pretreat solid waste prior to anaerobic digestion? An assessment of its environmental cost.

    PubMed

    Carballa, Marta; Duran, Cecilia; Hospido, Almudena

    2011-12-15

    Many studies have shown the effectiveness of pretreatments prior to anaerobic digestion of solid wastes, but to our knowledge, none analyzes their environmental consequences/costs. In this work, seven different pretreatments applied to two types of waste (kitchen waste and sewage sludge) have been environmentally evaluated by using life cycle assessment (LCA) methodology. The results show that the environmental burdens associated with the application of pretreatments prior to anaerobic digestion cannot be excluded. Among the options tested, the pressurize-depressurize and chemical (acid or alkaline) pretreatments could be recommended on the basis of their beneficial net environmental performance, while thermal and ozonation alternatives require energy efficiency optimization to reduce their environmental burdens. Reconciling operational, economic and environmental aspects in a holistic approach for the selection of the most sustainable option, mechanical (e.g., pressurize-depressurize) and chemical methods appear to be the most appropriate alternatives at this stage.

  20. Participation in Counseling Programs: High-Risk Participants Are Reluctant to Accept HIV-Prevention Counseling

    PubMed Central

    Earl, Allison; Albarracín, Dolores; Durantini, Marta R.; Gunnoe, Joann B.; Leeper, Josh; Levitt, Justin H.

    2013-01-01

    HIV-prevention intervention effectiveness depends on understanding whether clients with highest need for HIV-prevention counseling accept it. With this objective, a field study with a high-risk community sample from the southeastern United States (N = 350) investigated whether initial knowledge about HIV, motivation to use condoms, condom-use-relevant behavioral skills, and prior condom use correlate with subsequent acceptance of an HIV-prevention counseling session. Ironically, participants with high (vs. low) motivation to use condoms, high (vs. low) condom-use-relevant behavioral skills, and high (vs. low) prior condom use were more likely to accept the HIV-prevention counseling. Moreover, the influence of motivation to use condoms, condom-use-relevant behavioral skills, and prior condom use on acceptance of the counseling was mediated by expectations that the counseling session would be useful. Methods to reduce barriers to recruitment of clients for counseling programs are discussed. PMID:19634960

  1. Protective effect of sensory denervation in inflammatory arthritis (evidence of regulatory neuroimmune pathways in the arthritic joint)

    PubMed Central

    Kane, D; Lockhart, J; Balint, P; Mann, C; Ferrell, W; McInnes, I

    2005-01-01

    Case report: The patient developed arthritis mutilans in all digits of both hands with the exception of the left 4th finger, which had prior sensory denervation following traumatic nerve dissection. Plain radiography, ultrasonography and nerve conduction studies of the hands confirmed the absence of articular disease and sensory innervation in the left 4th digit. Methods: This relationship between joint innervation and joint inflammation was investigated experimentally by prior surgical sensory denervation of the medial aspect of the knee in six Wistar rats in which carrageenan induced arthritis was subsequently induced. Prior sensory denervation—with preservation of muscle function—prevented the development of inflammatory arthritis in the denervated knee. Discussion: Observations in human and animal inflammatory arthritis suggest that regulatory neuroimmune pathways in the joint are an important mechanism that modulates the clinical expression of inflammatory arthritis. PMID:15155371

  2. A priori motion models for four-dimensional reconstruction in gated cardiac SPECT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lalush, D.S.; Tsui, B.M.W.; Cui, Lin

    1996-12-31

    We investigate the benefit of incorporating a priori assumptions about cardiac motion in a fully four-dimensional (4D) reconstruction algorithm for gated cardiac SPECT. Previous work has shown that non-motion-specific 4D Gibbs priors enforcing smoothing in time and space can control noise while preserving resolution. In this paper, we evaluate methods for incorporating known heart motion in the Gibbs prior model. The new model is derived by assigning motion vectors to each 4D voxel, defining the movement of that volume of activity into the neighboring time frames. Weights for the Gibbs cliques are computed based on these "most likely" motion vectors. To evaluate, we employ the mathematical cardiac-torso (MCAT) phantom with a new dynamic heart model that simulates the beating and twisting motion of the heart. Sixteen realistically simulated gated datasets were generated, with noise simulated to emulate a real Tl-201 gated SPECT study. Reconstructions were performed using several different reconstruction algorithms, all modeling nonuniform attenuation and three-dimensional detector response. These include ML-EM with 4D filtering, 4D MAP-EM without prior motion assumption, and 4D MAP-EM with prior motion assumptions. The prior motion assumptions included both the correct motion model and incorrect models. Results show that reconstructions using the 4D prior model can smooth noise and preserve time-domain resolution more effectively than 4D linear filters. We conclude that modeling of motion in 4D reconstruction algorithms can be a powerful tool for smoothing noise and preserving temporal resolution in gated cardiac studies.
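A motion-informed temporal Gibbs prior of the kind described above penalizes differences between a voxel's activity and the activity at its motion-displaced location in the next time frame. A minimal sketch of such an energy term (the flat array layout, index-map representation of motion, and names are my assumptions, not the authors' implementation):

```python
import numpy as np

def motion_gibbs_energy(frames, motion, beta=1.0):
    """Temporal Gibbs-prior energy with per-voxel motion vectors (sketch).

    frames: (T, N) activity per time frame, flattened to N voxels;
    motion: (T-1, N) integer index map giving, for each voxel, the voxel
    it moves to in the next frame. The prior penalizes squared differences
    between a voxel and its motion-displaced successor, weighted by beta.
    """
    energy = 0.0
    for t in range(len(frames) - 1):
        moved = frames[t + 1][motion[t]]   # activity at the displaced sites
        energy += beta * np.sum((frames[t] - moved) ** 2)
    return float(energy)
```

With the correct motion map, the energy of the true activity sequence is low, so the MAP objective smooths along motion trajectories rather than blurring across them.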

  3. Discrete Sparse Coding.

    PubMed

    Exarchakis, Georgios; Lücke, Jörg

    2017-11-01

    Sparse coding algorithms with continuous latent variables have been the subject of a large number of studies. However, discrete latent spaces for sparse coding have been largely ignored. In this work, we study sparse coding with latents described by discrete instead of continuous prior distributions. We consider the general case in which the latents (while being sparse) can take on any value of a finite set of possible values and in which we learn the prior probability of any value from data. This approach can be applied to any data generated by discrete causes, and it can be applied as an approximation of continuous causes. As the prior probabilities are learned, the approach then allows for estimating the prior shape without assuming specific functional forms. To efficiently train the parameters of our probabilistic generative model, we apply a truncated expectation-maximization approach (expectation truncation) that we modify to work with a general discrete prior. We evaluate the performance of the algorithm by applying it to a variety of tasks: (1) we use artificial data to verify that the algorithm can recover the generating parameters from a random initialization, (2) use image patches of natural images and discuss the role of the prior for the extraction of image components, (3) use extracellular recordings of neurons to present a novel method of analysis for spiking neurons that includes an intuitive discretization strategy, and (4) apply the algorithm on the task of encoding audio waveforms of human speech. The diverse set of numerical experiments presented in this letter suggests that discrete sparse coding algorithms can scale efficiently to work with realistic data sets and provide novel statistical quantities to describe the structure of the data.

  4. Research and Teaching: Computational Methods in General Chemistry--Perceptions of Programming, Prior Experience, and Student Outcomes

    ERIC Educational Resources Information Center

    Wheeler, Lindsay B.; Chiu, Jennie L.; Grisham, Charles M.

    2016-01-01

    This article explores how integrating computational tools into a general chemistry laboratory course can influence student perceptions of programming and investigates relationships among student perceptions, prior experience, and student outcomes.

  5. Explosion yield estimation from pressure wave template matching

    PubMed Central

    Arrowsmith, Stephen; Bowman, Daniel

    2017-01-01

    A method for estimating the yield of explosions from shock-wave and acoustic-wave measurements is presented. The method exploits full waveforms by comparing pressure measurements against an empirical stack of prior observations using scaling laws. The approach can be applied to measurements across a wide range of source-to-receiver distances. The method is applied to data from two explosion experiments in different regions, leading to mean relative errors in yield estimates of 0.13 using prior data from the same region, and 0.2 when applied to a new region. PMID:28618805
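The scaling-law idea rests on cube-root (Hopkinson-Cranz) scaling: distances scale as W^(1/3), so each candidate yield maps an observation onto the reference stack at a different scaled distance. A toy grid search using a single peak-overpressure measurement and a made-up reference curve (the paper matches full waveforms against an empirical stack; everything here is illustrative):

```python
import math

def estimate_yield(r_obs, p_obs, reference, yields):
    """Grid-search yield estimate from peak overpressure via cube-root scaling.

    reference: function mapping scaled distance Z = R / W^(1/3) to the peak
    overpressure of a unit-yield reference explosion (stand-in for an
    empirical stack of prior observations). Returns the candidate yield
    whose scaled prediction best matches the measurement (log misfit).
    """
    best_w, best_err = None, float("inf")
    for w in yields:
        z = r_obs / w ** (1.0 / 3.0)
        err = abs(math.log(p_obs) - math.log(reference(z)))
        if err < best_err:
            best_w, best_err = w, err
    return best_w
```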

  6. Method for integrating microelectromechanical devices with electronic circuitry

    DOEpatents

    Montague, S.; Smith, J.H.; Sniegowski, J.J.; McWhorter, P.J.

    1998-08-25

    A method is disclosed for integrating one or more microelectromechanical (MEM) devices with electronic circuitry. The method comprises the steps of forming each MEM device within a cavity below a device surface of the substrate; encapsulating the MEM device prior to forming electronic circuitry on the substrate; and releasing the MEM device for operation after fabrication of the electronic circuitry. Planarization of the encapsulated MEM device prior to formation of the electronic circuitry allows the use of standard processing steps for fabrication of the electronic circuitry. 13 figs.

  7. A Particle Batch Smoother Approach to Snow Water Equivalent Estimation

    NASA Technical Reports Server (NTRS)

    Margulis, Steven A.; Girotto, Manuela; Cortes, Gonzalo; Durand, Michael

    2015-01-01

    This paper presents a newly proposed data assimilation method for historical snow water equivalent (SWE) estimation using remotely sensed fractional snow-covered area (fSCA). The newly proposed approach consists of a particle batch smoother (PBS), which is compared to a previously applied Kalman-based ensemble batch smoother (EnBS) approach. The methods were applied over the 27-yr Landsat 5 record at snow pillow and snow course in situ verification sites in the American River basin in the Sierra Nevada (United States). This basin is more densely vegetated and thus more challenging for SWE estimation than the previous applications of the EnBS. Both data assimilation methods provided significant improvement over the prior (modeling only) estimates, with both able to significantly reduce prior SWE biases. The prior RMSE values at the snow pillow and snow course sites were reduced by 68%-82% and 60%-68%, respectively, when applying the data assimilation methods. This result is encouraging for a basin like the American where the moderate to high forest cover will necessarily obscure more of the snow-covered ground surface than in previously examined, less-vegetated basins. The PBS generally outperformed the EnBS: for snow pillows the PBS RMSE was approximately 54% of that seen in the EnBS, while for snow courses the PBS RMSE was approximately 79% of the EnBS RMSE. Sensitivity tests show relative insensitivity for both the PBS and EnBS results to ensemble size and fSCA measurement error, but a higher sensitivity for the EnBS to the mean prior precipitation input, especially in the case where significant prior biases exist.
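A particle batch smoother reduces to a single importance-weighting step: each prior-ensemble member is weighted by the likelihood of all fSCA observations in the batch window, and posterior SWE statistics are weighted averages over the ensemble. A minimal sketch (the Gaussian observation-error model and all names are assumptions, not the authors' code):

```python
import numpy as np

def particle_batch_smoother(prior_swe, predicted_fsca, observed_fsca, obs_var):
    """Particle batch smoother weight update (illustrative).

    prior_swe: (N,) SWE values from the prior ensemble; predicted_fsca:
    (N, T) each particle's predicted fSCA over the batch window;
    observed_fsca: (T,) remotely sensed fSCA; obs_var: observation-error
    variance. Returns the posterior mean SWE and the particle weights.
    """
    resid = predicted_fsca - observed_fsca                # (N, T) residuals
    log_w = -0.5 * np.sum(resid ** 2, axis=1) / obs_var   # batch log-likelihood
    log_w -= log_w.max()                                  # numerical stability
    w = np.exp(log_w)
    w /= w.sum()
    return float(np.sum(w * prior_swe)), w
```

Unlike the Kalman-based EnBS, this update makes no Gaussian assumption about the prior SWE distribution; it only reweights the existing particles.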

  8. Application of pretreatment methods on agricultural products prior to frying: a review.

    PubMed

    Oladejo, Ayobami Olayemi; Ma, Haile; Qu, Wenjuan; Zhou, Cunshan; Wu, Bengang; Uzoejinwa, Benjamin Bernard; Onwude, Daniel I; Yang, Xue

    2018-01-01

    Frying is one of the methods of processing foods, which imparts flavour, taste, colour and crispness in the fried foods. In spite of an increase in the demand for fried foods by consumers all over the world, the danger posed by consuming too much fat is still a challenge. Many researchers have put forward many ideas on how to reduce the oil uptake and improve the nutritional and organoleptic qualities of foods during frying. Several pretreatment techniques applied to food materials prior to frying have been investigated by researchers in a bid to reduce the oil uptake and improve the quality parameters of fried foods. Therefore, this review focuses on the various pretreatment methods and the recent novel methods like ultrasound, infrared, superheated steam drying, microwave technique and pulsed electric field applied to foods prior to frying and its effects on the qualities of fried foods. © 2017 Society of Chemical Industry.

  9. Rational Clinical Experiment: Assessing Prior Probability and Its Impact on the Success of Phase II Clinical Trials

    PubMed Central

    Halperin, Daniel M.; Lee, J. Jack; Dagohoy, Cecile Gonzales; Yao, James C.

    2015-01-01

    Purpose Despite a robust clinical trial enterprise and encouraging phase II results, the vast minority of oncologic drugs in development receive regulatory approval. In addition, clinicians occasionally make therapeutic decisions based on phase II data. Therefore, clinicians, investigators, and regulatory agencies require improved understanding of the implications of positive phase II studies. We hypothesized that prior probability of eventual drug approval was significantly different across GI cancers, with substantial ramifications for the predictive value of phase II studies. Methods We conducted a systematic search of phase II studies conducted between 1999 and 2004 and compared studies against US Food and Drug Administration and National Cancer Institute databases of approved indications for drugs tested in those studies. Results In all, 317 phase II trials were identified and followed for a median of 12.5 years. Following completion of phase III studies, eventual new drug application approval rates varied from 0% (zero of 45) in pancreatic adenocarcinoma to 34.8% (24 of 69) for colon adenocarcinoma. The proportion of drugs eventually approved was correlated with the disease under study (P < .001). The median type I error for all published trials was 0.05, and the median type II error was 0.1, with minimal variation. By using the observed median type I error for each disease, phase II studies have positive predictive values ranging from less than 1% to 90%, depending on primary site of the cancer. Conclusion Phase II trials in different GI malignancies have distinct prior probabilities of drug approval, yielding quantitatively and qualitatively different predictive values with similar statistical designs. Incorporation of prior probability into trial design may allow for more effective design and interpretation of phase II studies. PMID:26261263
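The predictive values described above follow from Bayes' rule applied to the trial's operating characteristics: a positive trial is a true positive with probability p(1 − β) and a false positive with probability (1 − p)α, where p is the prior probability of eventual approval. A short sketch:

```python
def positive_predictive_value(prior, alpha=0.05, beta=0.10):
    """PPV of a positive phase II trial given the prior probability of
    eventual drug approval and the trial's type I/II error rates."""
    true_pos = prior * (1 - beta)        # approvable drug, trial positive
    false_pos = (1 - prior) * alpha      # non-approvable drug, trial positive
    return true_pos / (true_pos + false_pos)
```

With the abstract's median errors (α = 0.05, β = 0.1), the colon adenocarcinoma prior of 0.348 gives a PPV near 90%, while a near-zero prior, as in pancreatic adenocarcinoma, drives the PPV toward zero despite an identical statistical design.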

  10. Assessing Organic Contaminant Fluxes from Contaminated Sediments Following Dam Removal in an Urbanized River

    EPA Science Inventory

    In this study, methods and approaches were developed and tested to assess changes in contaminant fluxes resulting from dam removal in a riverine system. Sediment traps and passive samplers were deployed to measure particulate and dissolved PAHs and PCBs in the water column prior...

  11. It Takes a Village: Protecting Rural African American Youth in the Context of Racism

    ERIC Educational Resources Information Center

    Berkel, Cady; Murry, Velma McBride; Hurt, Tera R.; Chen, Yi-fu; Brody, Gene H.; Simons, Ronald L.; Cutrona, Carolyn; Gibbons, Frederick X.

    2009-01-01

    Prior research demonstrates negative consequences of racism, however, little is known about community, parenting, and intrapersonal mechanisms that protect youth. Using a mixed-methods approach, this study illuminated linkages between positive and negative contextual influences on rural African American adolescent outcomes. Quantitative results…

  12. The Sophistical Attitude and the Invention of Rhetoric

    ERIC Educational Resources Information Center

    Crick, Nathan

    2010-01-01

    Traditionally, the Older Sophists were conceived as philosophical skeptics who rejected speculative inquiry to focus on rhetorical methods of being successful in practical life. More recently, this view has been complicated by studies revealing the Sophists to be a diverse group of intellectuals who practiced their art prior to the categorization…

  13. Effects of Early Literacy Environments on the Reading Attitudes, Behaviours and Values of Veteran Teachers

    ERIC Educational Resources Information Center

    Levitt, Roberta; Red Owl, R. H.

    2013-01-01

    Research has linked early literacy environments to the attitudes, behaviours and instructional values of reading teachers, but most prior research has addressed preservice or early inservice teachers. This mixed-methods, hypothesis-generating, "Q" methodology-based study explored the relationship between early literacy environments and…

  14. Breeding bird communities

    Treesearch

    Vanessa L. Artman; Randy Dettmers

    2003-01-01

    Prescribed burning is being applied on an experimental basis to restore and maintain mixed-oak communities in southern Ohio. This chapter describes baseline conditions for the breeding bird community prior to prescribed burning. We surveyed breeding bird populations at four study areas using the territory-mapping method. We observed 35 bird species during the surveys....

  15. Trends in Children's Video Game Play: Practical but Not Creative Thinking

    ERIC Educational Resources Information Center

    Hamlen, Karla R.

    2013-01-01

    Prior research has found common trends among children's video game play as related to gender, age, interests, creativity, and other descriptors. This study re-examined the previously reported trends by utilizing principal components analysis with variables such as creativity, general characteristics, and problem-solving methods to determine…

  16. Stabilizing Oils from Smoked Pink Salmon (Oncorhynchus gorbuscha)

    USDA-ARS?s Scientific Manuscript database

    Smoking of meats and fish is one of the earliest preservation technologies developed by humans. In this study, the smoking process was evaluated as a method for reducing oxidation of Pink Salmon (Oncorhynchus gorbuscha) oils and also maintaining the quality of oil in aged fish prior to oil extractio...

  17. Sex and Vaccination

    ERIC Educational Resources Information Center

    Zavrel, Erik; Herreid, Clyde Freeman

    2008-01-01

    This case study is centered upon the recent debate concerning the decision by Texas Governor Rick Perry to mandate the compulsory vaccination of girls in the Texas public school system against the human papillomavirus (HPV) prior to entering the sixth grade. The interrupted case method is particularly appropriate for this subject with the case…

  18. Experimental Evaluation of the Training Structure of the Picture Exchange Communication System (PECS)

    ERIC Educational Resources Information Center

    Cummings, Anne R.; Carr, James E.; LeBlanc, Linda A.

    2012-01-01

    The Picture Exchange Communication System (PECS) is a picture-based alternative communication method that is widely accepted and utilized with individuals with disabilities. Although prior studies have examined the clinical efficacy of PECS, none have experimentally evaluated its manualized training structure. We experimentally evaluated the…

  19. ASCORBIC ACID REDUCTION OF RESIDUAL ACTIVE CHLORINE IN POTABLE WATER PRIOR TO HALOCARBOXYLATE DETERMINATION

    EPA Science Inventory

    In studies on the formation of disinfection byproducts (DBPs), it is necessary to scavenge residual active (oxidizing) chlorine in order to fix the chlorination byproducts (such as haloethanoates) at a point in time. Thus, methods designed for compliance monitoring are not alway...

  20. Methods That Matter in Addressing Cultural Diversity with Teacher Candidates

    ERIC Educational Resources Information Center

    Acquah, Emmanuel O.; Commins, Nancy L.

    2017-01-01

    Drawing on a combination of prior experience, theoretical stance, and intuition, along with pedagogical practices identified to be effective in addressing diversity with teacher candidates, a model for teaching multicultural education to teacher candidates was designed. This study examined how particular elements of this model were effective in…

  1. Perceived Personality Traits of Individuals with Anorexia Nervosa

    ERIC Educational Resources Information Center

    Watters, Jessica E.; Malouff, John M.

    2012-01-01

    Background: Prior research has found evidence of a general negative personality stereotype for individuals who have anorexia nervosa (AN). Methods: This study examined the expected personality characteristics of individuals with AN using the Five-Factor Model of personality to allow identification of specific personality traits that are part of…

  2. Teaching Evolution: From SMART Objectives to Threshold Experience

    ERIC Educational Resources Information Center

    Wolf, Alexander; Akkaraju, Shylaja

    2014-01-01

    Despite the centrality of evolution to the study of biology, the pedagogical methods employed to teach the subject are often instructor-centered and rarely embedded in every topic throughout the curriculum. In addition, students' prior beliefs about evolution are often dismissed rather than incorporated into the classroom. In this article we…

  3. Enhancement in Informational Masking

    ERIC Educational Resources Information Center

    Cao, Xiang; Richards, Virginia M.

    2012-01-01

    Purpose: The ability to detect a tone added to a random masker improves when a preview of the masker is provided. In 2 experiments, the authors explored the role that perceptual organization plays in this release from masking. Method: Detection thresholds were measured in informational masking studies. The maskers were drawn at random prior to…

  4. Breast Cancer after Augmentation: Oncologic and Reconstructive Considerations among Women Undergoing Mastectomy.

    PubMed

    Cho, Eugenia H; Shammas, Ronnie L; Phillips, Brett T; Greenup, Rachel A; Hwang, E Shelley; Hollenbeck, Scott T

    2017-06-01

    Breast augmentation with subglandular versus subpectoral implants may differentially impact the early detection of breast cancer and treatment recommendations. The authors assessed the impact of prior augmentation on the diagnosis and management of breast cancer in women undergoing mastectomy. Breast cancer diagnosis and management were retrospectively analyzed in all women with prior augmentation undergoing therapeutic mastectomy at the authors' institution from 1993 to 2014. Comparison was made to all women with no prior augmentation undergoing mastectomy in 2010. Subanalyses were performed according to prior implant placement. A total of 260 women with (n = 89) and without (n = 171) prior augmentation underwent mastectomy for 95 and 179 breast cancers, respectively. Prior implant placement was subglandular (n = 27) or subpectoral (n = 63); for five breasts, the placement was unknown. Breast cancer stage at diagnosis (p = 0.19) and detection method (p = 0.48) did not differ for women with and without prior augmentation. Compared to subpectoral augmentation, subglandular augmentation was associated with the diagnosis of invasive breast cancer rather than ductal carcinoma in situ (p = 0.01) and detection by self-palpation rather than screening mammography (p = 0.03). Immediate two-stage implant reconstruction was the preferred reconstructive method in women with augmentation (p < 0.01). Breast cancer stage at diagnosis was similar for women with and without prior augmentation. Among women with augmentation, however, subglandular implants were associated with more advanced breast tumors commonly detected on palpation rather than mammography. Increased vigilance in breast cancer screening is recommended among women with subglandular augmentation. Therapeutic, III.

  5. Evolutionary History of the Asian Horned Frogs (Megophryinae): Integrative Approaches to Timetree Dating in the Absence of a Fossil Record.

    PubMed

    Mahony, Stephen; Foley, Nicole M; Biju, S D; Teeling, Emma C

    2017-03-01

    Molecular dating studies typically need fossils to calibrate the analyses. Unfortunately, the fossil record is extremely poor or presently nonexistent for many species groups, rendering such dating analysis difficult. One such group is the Asian horned frogs (Megophryinae). Sampling all generic nomina, we combined a novel ∼5 kb dataset composed of four nuclear and three mitochondrial gene fragments to produce a robust phylogeny, with an extensive external morphological study to produce a working taxonomy for the group. Expanding the molecular dataset to include out-groups of fossil-represented ancestral anuran families, we compared the priorless RelTime dating method with the widely used prior-based Bayesian timetree method, MCMCtree, utilizing a novel combination of fossil priors for anuran phylogenetic dating. The phylogeny was then subjected to ancestral phylogeographic analyses, and dating estimates were compared with likely biogeographic vicariant events. Phylogenetic analyses demonstrated that previously proposed systematic hypotheses were incorrect due to the paraphyly of genera. Molecular phylogenetic, morphological, and timetree results support the recognition of Megophryinae as a single genus, Megophrys, with a subgenus level classification. Timetree results using RelTime better corresponded with the known fossil record for the out-group anuran tree. For the priorless in-group, it also outperformed MCMCtree when node date estimates were compared with likely influential historical biogeographic events, providing novel insights into the evolutionary history of this pan-Asian anuran group. Given a relatively small molecular dataset, and limited prior knowledge, this study demonstrates that the computationally rapid RelTime dating tool may outperform more popular and complex prior reliant timetree methodologies. © The Author 2017. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. 

  6. The interplay between experiential and traditional learning for competency development.

    PubMed

    Bonesso, Sara; Gerli, Fabrizio; Pizzi, Claudio

    2015-01-01

    Extensive research demonstrated that firms may pursue several advantages in hiring individuals with the set of emotional, social, and cognitive (ESC) competencies that are most critical for business success. Therefore, the role of education for competency development is becoming paramount. Prior studies have questioned the traditional methods, grounded in the lecture format, as a way to effectively develop ESC competencies. Alternatively, they propose experiential learning techniques that involve participants in dedicated courses or activities. Despite the insights provided by these studies, they do not take into account a comprehensive set of learning methods and their combined effect on the individual's competency portfolio within educational programs that aim to transfer primarily professional skills. Our study aims to fill these gaps by investigating the impact of the interplay between different learning methods on ESC competencies through a sample of students enrolled in the first year of a master's degree program. After providing a classification of three learning methods [traditional learning (TL), individual experiential learning (IEL), and social experiential learning (SEL)], the study delves into their combined influence on ESC competencies, adopting the Artificial Neural Network. Contrary to prior studies, our results provide counterintuitive evidence, suggesting that TL needs to be implemented together, on the one hand, with IEL to achieve a significant effect on emotional competencies and, on the other hand, with SEL to have an impact on social competencies. Moreover, IEL plays a prominent role in stimulating cognitive competencies. Our research contributes to educational literature by providing new insights on the effective combination of learning methods that can be adopted into programs that transfer technical knowledge and skills to promote behavioral competencies.

  7. The interplay between experiential and traditional learning for competency development

    PubMed Central

    Bonesso, Sara; Gerli, Fabrizio; Pizzi, Claudio

    2015-01-01

    Extensive research demonstrated that firms may pursue several advantages in hiring individuals with the set of emotional, social, and cognitive (ESC) competencies that are most critical for business success. Therefore, the role of education for competency development is becoming paramount. Prior studies have questioned the traditional methods, grounded in the lecture format, as a way to effectively develop ESC competencies. Alternatively, they propose experiential learning techniques that involve participants in dedicated courses or activities. Despite the insights provided by these studies, they do not take into account a comprehensive set of learning methods and their combined effect on the individual's competency portfolio within educational programs that aim to transfer primarily professional skills. Our study aims to fill these gaps by investigating the impact of the interplay between different learning methods on ESC competencies through a sample of students enrolled in the first year of a master's degree program. After providing a classification of three learning methods [traditional learning (TL), individual experiential learning (IEL), and social experiential learning (SEL)], the study delves into their combined influence on ESC competencies, adopting the Artificial Neural Network. Contrary to prior studies, our results provide counterintuitive evidence, suggesting that TL needs to be implemented together, on the one hand, with IEL to achieve a significant effect on emotional competencies and, on the other hand, with SEL to have an impact on social competencies. Moreover, IEL plays a prominent role in stimulating cognitive competencies. Our research contributes to educational literature by providing new insights on the effective combination of learning methods that can be adopted into programs that transfer technical knowledge and skills to promote behavioral competencies. PMID:26388810

  8. Advanced Oxidation Processes: Process Mechanisms, Affecting Parameters and Landfill Leachate Treatment.

    PubMed

    Su-Huan, Kow; Fahmi, Muhammad Ridwan; Abidin, Che Zulzikrami Azner; Soon-An, Ong

    2016-11-01

    Advanced oxidation processes (AOPs) are of special interest in treating landfill leachate as they are the most promising procedures to degrade recalcitrant compounds and improve the biodegradability of wastewater. This paper aims to refresh the information base of AOPs and to discover the research gaps of AOPs in landfill leachate treatment. A brief overview of the mechanisms involved in AOPs, including ozone-based, hydrogen peroxide-based, and persulfate-based AOPs, is presented, and the parameters affecting AOPs are elaborated. In particular, the advancement of AOPs in landfill leachate treatment is compared and discussed. Landfill leachate characterization prior to method selection and method optimization prior to treatment are necessary, as the performance and practicability of AOPs are influenced by leachate matrixes and treatment cost. More studies concerning the scavenging effects of leachate matrixes towards AOPs, as well as persulfate-based AOPs in landfill leachate treatment, are necessary in the future.

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nemani, Deepika; Vapiwala, Neha; Hwang, W.-T.

    Purpose: Little information has been reported regarding outcomes after treatment for patients with early-stage invasive breast cancer and a prior nonbreast malignancy. This report analyzes the outcomes in patients with Stage I and II breast cancer after breast conservation treatment (BCT) with a prior nonbreast malignancy. Methods and Materials: The study cohort comprised 66 women with invasive breast cancer and a prior nonbreast malignancy. All patients were treated with breast conservation surgery followed by definitive breast irradiation between 1978 and 2003. Median ages at diagnosis of invasive breast cancer and prior malignancy were 57 and 50 years, respectively. The median interval between the prior malignancy and breast cancer was 7.0 years. Median and mean follow-up times after BCT were 5.3 and 7.0 years. Results: The 5-year and 10-year overall survival rates were 94% (95% confidence interval [CI], 82-98%) and 78% (95% CI, 59-89%), respectively. There were 4 patients (6%) with local failure and 10 patients (15%) with distant metastases. The 10-year local failure rate was 5% (95% CI, 2-16%) and freedom from distant metastases was 78% (95% CI, 61-88%). No obvious differences in survival or local control were noted compared with the reported results in the literature for patients with invasive breast cancer alone. Conclusions: Both overall survival and local control at 5 and 10 years were comparable to rates observed in early-stage breast cancer patients without a prior malignancy. Prior nonbreast malignancy is not a contraindication to BCT, if the primary cancer is effectively controlled.

  10. An Investigation of the Neurological and Neuropsychiatric Disturbances in Adults with Undiagnosed and/or Untreated Phenylketonuria in Poland

    ERIC Educational Resources Information Center

    Mazur, Artur; Jarochowicz, Sabina; Oltarzewski, Mariusz; Sykut-Cegielska, Jolanta; Gradowska, Wanda; Januszek-Trzciakowska, Aleksandra; O'Malley, Grace; Kwolek, Andrzej

    2011-01-01

    Background: The aim of the study was to determine neurological and neuropsychiatric manifestations in a group of patients with previously undiagnosed or untreated phenylketonuria (PKU) in the south-eastern part of Poland. Methods: The study was conducted among 400 adults with severe intellectual disability who were born prior to neonatal screening…

  11. Effect of competition on height growth and survival of planted Japanese larch

    Treesearch

    E. F. McNamara; Irvin C. Reigner

    1960-01-01

    On the Dilldown Unit of the Delaware-Lehigh Experimental Forest in Pennsylvania, several planting studies have been made with the aim of finding the most economical and practical methods of converting scrub oak areas to productive high-forest types. These studies have already shown the need for site preparation prior to planting. Seedlings planted on prepared areas...

  12. Detained Adolescent Females' Multiple Mental Health and Adjustment Problem Outcomes in Young Adulthood

    ERIC Educational Resources Information Center

    van der Molen, E.; Vermeiren, R. R. J. M.; Krabbendam, A. A.; Beekman, A. T. F.; Doreleijers, T. A. H.; Jansen, L. M. C.

    2013-01-01

    Background: Although prior studies have shown that detained females are marked by significant adverse circumstances, little is known about their adult outcomes. Method: Prospective follow-up study of 184 (80.4% of original sample of 229) detained adolescent females who were reassessed 4.5 (SD = 0.6) years later in young adulthood (mean age = 20.0,…

  13. Last Year Your Answer Was… : The Impact of Dependent Interviewing Wording and Survey Factors on Reporting of Change

    ERIC Educational Resources Information Center

    Al Baghal, Tarek

    2017-01-01

    Prior studies suggest memories are potentially error prone. Proactive dependent interviewing (PDI) is a possible method to reduce errors in reports of change in longitudinal studies, reminding respondents of previous answers while asking if there has been any change since the last survey. However, little research has been conducted on the impact…

  14. Literature review on use of nonwood plant fibers for building materials and panels

    Treesearch

    John A. Youngquist; Brent E. English; Roger C. Scharmer; Poo Chow; Steven R. Shook

    1994-01-01

    The research studies included in this review focus on the use of nonwood plant fibers for building materials and panels. Studies address (1) methods for efficiently producing building materials and panels from nonwood plant fibers; (2) treatment of fibers prior to board production; (3) process variables, such as press time and temperature, press pressure, and type of...

  15. Confidence Intervals for the Between-Study Variance in Random Effects Meta-Analysis Using Generalised Cochran Heterogeneity Statistics

    ERIC Educational Resources Information Center

    Jackson, Dan

    2013-01-01

    Statistical inference is problematic in the common situation in meta-analysis where the random effects model is fitted to just a handful of studies. In particular, the asymptotic theory of maximum likelihood provides a poor approximation, and Bayesian methods are sensitive to the prior specification. Hence, less efficient, but easily computed and…

  16. Training Needs Assessment in the Botswana Public Service: A Case Study of Five State Sector Ministries

    ERIC Educational Resources Information Center

    Balisi, Shadreck

    2014-01-01

    Using qualitative methods, this study analysed the process of training needs assessment in the Botswana public service, with special focus on five state sector ministries. It is evident from the research findings that the approach to needs assessment prior to training is limited and unsystematic. The research further revealed that the…

  17. Executive Functioning Skills in Preschool-Age Children with Cochlear Implants

    ERIC Educational Resources Information Center

    Beer, Jessica; Kronenberger, William G.; Castellanos, Irina; Colson, Bethany G.; Henning, Shirley C.; Pisoni, David B.

    2014-01-01

    Purpose: The purpose of this study was to determine whether deficits in executive functioning (EF) in children with cochlear implants (CIs) emerge as early as the preschool years. Method: Two groups of children ages 3 to 6 years participated in this cross-sectional study: 24 preschoolers who had CIs prior to 36 months of age and 21 preschoolers…

  18. The neglected tool in the Bayesian ecologist's shed: a case study testing informative priors' effect on model accuracy

    PubMed Central

    Morris, William K; Vesk, Peter A; McCarthy, Michael A; Bunyavejchewin, Sarayudh; Baker, Patrick J

    2015-01-01

    Despite benefits for precision, ecologists rarely use informative priors. One reason that ecologists may prefer vague priors is the perception that informative priors reduce accuracy. To date, no ecological study has empirically evaluated data-derived informative priors' effects on precision and accuracy. To determine the impacts of priors, we evaluated mortality models for tree species using data from a forest dynamics plot in Thailand. Half the models used vague priors, and the remaining half had informative priors. We found precision was greater when using informative priors, but effects on accuracy were more variable. In some cases, prior information improved accuracy, while in others, it was reduced. On average, models with informative priors were no more or less accurate than models without. Our analyses provide a detailed case study on the simultaneous effect of prior information on precision and accuracy and demonstrate that when priors are specified appropriately, they lead to greater precision without systematically reducing model accuracy. PMID:25628867

  19. The neglected tool in the Bayesian ecologist's shed: a case study testing informative priors' effect on model accuracy.

    PubMed

    Morris, William K; Vesk, Peter A; McCarthy, Michael A; Bunyavejchewin, Sarayudh; Baker, Patrick J

    2015-01-01

    Despite benefits for precision, ecologists rarely use informative priors. One reason that ecologists may prefer vague priors is the perception that informative priors reduce accuracy. To date, no ecological study has empirically evaluated data-derived informative priors' effects on precision and accuracy. To determine the impacts of priors, we evaluated mortality models for tree species using data from a forest dynamics plot in Thailand. Half the models used vague priors, and the remaining half had informative priors. We found precision was greater when using informative priors, but effects on accuracy were more variable. In some cases, prior information improved accuracy, while in others, it was reduced. On average, models with informative priors were no more or less accurate than models without. Our analyses provide a detailed case study on the simultaneous effect of prior information on precision and accuracy and demonstrate that when priors are specified appropriately, they lead to greater precision without systematically reducing model accuracy.
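
    The precision gain the authors report has a simple conjugate-normal illustration: posterior precision is the sum of the prior precision and the data precision, so a tighter (informative) prior always narrows the posterior, while accuracy depends on whether the prior mean is near the truth. A minimal sketch with toy numbers of my own (not the authors' tree-mortality models):

```python
def normal_posterior(prior_mean, prior_sd, data_mean, data_sd, n):
    """Conjugate update for a normal mean with known sampling sd."""
    prior_prec = 1.0 / prior_sd ** 2   # precision = 1 / variance
    data_prec = n / data_sd ** 2       # precision of the sample mean
    post_prec = prior_prec + data_prec
    post_mean = (prior_prec * prior_mean + data_prec * data_mean) / post_prec
    return post_mean, post_prec ** -0.5  # posterior mean and sd

# Same data, vague (sd = 100) vs. informative (sd = 0.5) prior:
vague = normal_posterior(0.0, 100.0, 1.2, 2.0, 25)
informative = normal_posterior(1.0, 0.5, 1.2, 2.0, 25)
# The informative prior always yields the smaller posterior sd (more
# precision); whether its posterior mean lands closer to the truth
# depends on how good the prior information was.
```

    This mirrors the paper's finding: informative priors systematically improve precision, while their effect on accuracy varies with the quality of the prior.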

  20. Advanced Steel Microstructural Classification by Deep Learning Methods.

    PubMed

    Azimi, Seyed Majid; Britz, Dominik; Engstler, Michael; Fritz, Mario; Mücklich, Frank

    2018-02-01

    The inner structure of a material is called microstructure. It stores the genesis of a material and determines all its physical and chemical properties. While microstructural characterization is widely spread and well known, microstructural classification is mostly done manually by human experts, which gives rise to uncertainties due to subjectivity. Since the microstructure could be a combination of different phases or constituents with complex substructures, its automatic classification is very challenging and only a few prior studies exist. Prior works focused on features designed and engineered by experts and classified microstructures separately from the feature extraction step. Recently, Deep Learning methods have shown strong performance in vision applications by learning the features from data together with the classification step. In this work, we propose a Deep Learning method for microstructural classification in the examples of certain microstructural constituents of low carbon steel. This novel method employs pixel-wise segmentation via a Fully Convolutional Neural Network (FCNN) accompanied by a max-voting scheme. Our system achieves 93.94% classification accuracy, drastically outperforming the state-of-the-art method's 48.89% accuracy. Beyond the strong performance of our method, this line of research offers a more robust and, above all, objective way to approach the difficult task of steel quality appreciation.

  1. A Bayes linear Bayes method for estimation of correlated event rates.

    PubMed

    Quigley, John; Wilson, Kevin J; Walls, Lesley; Bedford, Tim

    2013-12-01

    Typically, full Bayesian estimation of correlated event rates can be computationally challenging since estimators are intractable. When estimation of event rates represents one activity within a larger modeling process, there is an incentive to develop more efficient inference than provided by a full Bayesian model. We develop a new subjective inference method for correlated event rates based on a Bayes linear Bayes model under the assumption that events are generated from a homogeneous Poisson process. To reduce the elicitation burden we introduce homogenization factors to the model and, as an alternative to a subjective prior, an empirical method using the method of moments is developed. Inference under the new method is compared against estimates obtained under a full Bayesian model, which takes a multivariate gamma prior, where the predictive and posterior distributions are derived in terms of well-known functions. The mathematical properties of both models are presented. A simulation study shows that the Bayes linear Bayes inference method and the full Bayesian model provide equally reliable estimates. An illustrative example, motivated by a problem of estimating correlated event rates across different users in a simple supply chain, shows how ignoring the correlation leads to biased estimation of event rates. © 2013 Society for Risk Analysis.
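
    The homogeneous-Poisson assumption makes the single-rate case fully conjugate when the prior is gamma, which is what keeps the full Bayesian comparison tractable. A minimal sketch of that gamma-Poisson update (illustrative only; the paper's Bayes linear Bayes machinery and the multivariate gamma prior for correlated rates are not shown, and the names and numbers here are my own):

```python
def gamma_poisson_update(shape, rate, events, exposure):
    """Conjugate update of a Gamma(shape, rate) prior on a Poisson event rate.

    events: total events observed; exposure: total observation time.
    Returns the posterior (shape, rate) and the posterior mean rate.
    """
    shape_post = shape + events
    rate_post = rate + exposure
    return shape_post, rate_post, shape_post / rate_post

# Prior mean rate 2.0 events/unit (shape=2, rate=1); observe 30 events in 10 units.
a, b, mean = gamma_poisson_update(2.0, 1.0, 30, 10.0)
print(a, b, round(mean, 3))  # -> 32.0 11.0 2.909
```

    The posterior mean is a precision-weighted compromise between the prior mean (2.0) and the observed rate (3.0); correlation between rates, the paper's actual subject, requires the joint prior this sketch omits.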

  2. Determination of gold in geologic materials by solvent extraction and atomic-absorption spectrometry

    USGS Publications Warehouse

    Huffman, Claude; Mensik, J.D.; Riley, L.B.

    1967-01-01

    The two methods presented for the determination of traces of gold in geologic materials are the cyanide atomic-absorption method and the fire-assay atomic-absorption method. In the cyanide method gold is leached with a sodium-cyanide solution. The monovalent gold is then oxidized to the trivalent state and concentrated by extracting into methyl isobutyl ketone prior to estimation by atomic absorption. In the fire-assay atomic-absorption method, the gold-silver bead obtained from fire assay is dissolved in nitric and hydrochloric acids. Gold is then concentrated by extracting into methyl isobutyl ketone prior to determination by atomic absorption. By either method concentrations as low as 50 parts per billion of gold can be determined in a 15-gram sample.

  3. Emotional experiences in surrogate mothers: A qualitative study

    PubMed Central

    Ahmari Tehran, Hoda; Tashi, Shohreh; Mehran, Nahid; Eskandari, Narges; Dadkhah Tehrani, Tahmineh

    2014-01-01

    Background: Surrogacy is one of the newer techniques of assisted reproduction technology, in which a woman carries and bears a child for another woman. In Iran, many Shia clerics and jurists consider it permissible, so there is no religious prohibition against it. In addition to the risk of physical complications for complete surrogate mothers, the possibility of psychological complications resulting from emotional attachment to the child requires counseling and assessment prior to acceptance by infertile couples and complete surrogate mothers. Objective: The purpose of this study was to assess the emotional experiences of surrogate mothers. Materials and Methods: This was a qualitative, phenomenological study. We selected eight complete surrogate mothers in Isfahan, using a convenience sampling method and in-depth interviews to collect the information. The data analysis was performed using Colaizzi's seven-stage method. The reliability and validity of the study were assessed along four axes. Results: The findings of these interviews were classified into two main themes and four subthemes: acquired experiences in pregnancy (feelings toward pregnancy; relationships with family, relatives, and the commissioning couple) and consequences of surrogacy (complications of pregnancy; religious and financial problems of surrogacy). Conclusion: Surrogate pregnancy should be considered a high-risk emotional experience because many surrogate mothers may face negative experiences. Therefore, it is recommended that surrogates receive professional counseling prior to, during, and following pregnancy. PMID:25114669

  4. The effect of prior upper abdominal surgery on outcomes after liver transplantation for hepatocellular carcinoma: An analysis of the database of the organ procurement transplant network.

    PubMed

    Silva, Jack P; Berger, Nicholas G; Yin, Ziyan; Liu, Ying; Tsai, Susan; Christians, Kathleen K; Clarke, Callisia N; Mogal, Harveshp; Gamblin, T Clark

    2018-05-01

    Orthotopic liver transplantation (OLT) is the preferred treatment for hepatocellular carcinoma (HCC) in select patients. Many patients listed for OLT have a history of prior upper abdominal surgery (UAS). Repeat abdominal surgery increases operative complexity and may cause a greater incidence of complications. This study sought to compare outcomes after liver transplantation for patients with and without prior UAS. Adult HCC patients undergoing OLT were identified using the database from the Organ Procurement and Transplantation Network (1987-2015). Patients were separated by presence of prior UAS into 2 propensity-matched cohorts. Overall survival (OS) and graft survival (GS) were analyzed by log-rank test and graphed using the Kaplan-Meier method. Recipient and donor demographic and clinical characteristics were also studied using Cox regression models. A total of 15,043 patients were identified, of whom 6,205 had prior UAS (41.2%). After 1:1 propensity score matching, cohorts (UAS versus no UAS) contained 4,669 patients. UAS patients experienced shorter GS (122 months vs 129 months; P < .001) and shorter OS (130 months vs 141 months; P < .001). Median duration of stay for both cohorts was 8 days. Multivariate Cox regression models revealed that prior UAS was associated with an increased hazard ratio (HR) for GS (HR 1.14; 95% confidence interval (CI) 1.06-1.22; P < .001) and OS (HR 1.14; 95% CI 1.06-1.23; P < .001). Prior UAS is an independent negative predictor of GS and OS after OLT for HCC. OLT performed in patients with UAS remains a well-tolerated and effective treatment for select HCC patients but may alter expected outcomes and influence follow-up protocols. Copyright © 2017 Elsevier Inc. All rights reserved.

  5. Parametric Bayesian priors and better choice of negative examples improve protein function prediction.

    PubMed

    Youngs, Noah; Penfold-Brown, Duncan; Drew, Kevin; Shasha, Dennis; Bonneau, Richard

    2013-05-01

    Computational biologists have demonstrated the utility of using machine learning methods to predict protein function from an integration of multiple genome-wide data types. Yet, even the best performing function prediction algorithms rely on heuristics for important components of the algorithm, such as choosing negative examples (proteins without a given function) or determining key parameters. The improper choice of negative examples, in particular, can hamper the accuracy of protein function prediction. We present a novel approach for choosing negative examples, using a parameterizable Bayesian prior computed from all observed annotation data, which also generates priors used during function prediction. We incorporate this new method into the GeneMANIA function prediction algorithm and demonstrate improved accuracy of our algorithm over current top-performing function prediction methods on the yeast and mouse proteomes across all metrics tested. Code and Data are available at: http://bonneaulab.bio.nyu.edu/funcprop.html
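The idea of deriving priors and negative examples from observed annotation data can be illustrated with a toy sketch. This is not the paper's parameterizable Bayesian model; the proteins, terms, and selection rule below are hypothetical simplifications.

```python
# Toy illustration: empirical per-term priors from annotation counts, with
# negative examples drawn from proteins never annotated with the term.
from collections import Counter

annotations = {                 # protein -> annotated function terms (hypothetical)
    "P1": {"kinase", "transport"},
    "P2": {"kinase"},
    "P3": {"transport"},
    "P4": set(),
}
n_proteins = len(annotations)
term_counts = Counter(t for terms in annotations.values() for t in terms)
prior = {t: c / n_proteins for t, c in term_counts.items()}   # empirical P(term)

def candidate_negatives(term):
    """Proteins with no annotation for `term`; a real pipeline would further
    filter using the priors of co-annotated terms."""
    return sorted(p for p, terms in annotations.items() if term not in terms)

negs = candidate_negatives("kinase")
```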

  6. Ultrasound-Guided Percutaneous Thyroid Nodule Core Biopsy: Clinical Utility in Patients with Prior Nondiagnostic Fine-Needle Aspirate

    PubMed Central

    Vij, Abhinav; Seale, Melanie K.; Desai, Gaurav; Halpern, Elkan; Faquin, William C.; Parangi, Sareh; Hahn, Peter F.; Daniels, Gilbert H.

    2012-01-01

    Background Five percent to 20% of thyroid nodule fine-needle aspiration (FNA) samples are nondiagnostic. The objective of this study was to determine whether a combination of FNA and core biopsy (CFNACB) would yield a higher proportion of diagnostic readings compared with FNA alone in patients with a history of one or more prior nondiagnostic FNA readings. Methods We conducted a retrospective study of 90 core biopsies (CBs) performed in 82 subjects (55 women and 27 men) between 2006 and 2008 in an outpatient clinic. Results CFNACB yielded a diagnostic reading in 87%. The diagnostic reading yield of the CB component of CFNACB was significantly superior to the concurrent FNA component, with CB yielding a diagnosis in 77% of cases and FNA yielding a diagnosis in 47% (p<0.0001). The combination of CB and FNA had a higher diagnostic reading yield than either alone. In 69 nodules that had only one prior nondiagnostic FNA, CB was diagnostic in 74%, FNA was diagnostic in 52%, CFNACB was diagnostic in 87%, and CB performed significantly better than FNA (p=0.0135). In 21 nodules with two or more prior nondiagnostic FNAs, CFNACB and CB were diagnostic in 86%, FNA was diagnostic in 29%, and CB was significantly better than FNA (p=0.0005). Clinical, ultrasound, or histopathologic follow-up was available for 81% (73/90) of the CFNACB procedures. No subject with a benign CFNACB reading was diagnosed with thyroid malignancy in the follow-up period (range 4–37 months, mean 18 months), although one subject had minimal increase in nodule size and was awaiting repeat sonography at study conclusion. Conclusion Thyroid nodule CFNACB is safe and clinically useful in selected patients when a prior FNA reading is nondiagnostic. CFNACB is superior to either CB or FNA alone. CFNACB should be strongly considered as an alternative to surgery in individuals with two prior nondiagnostic FNAs. PMID:22304390

  7. 3D microwave tomography of the breast using prior anatomical information

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golnabi, Amir H., E-mail: golnabia@montclair.edu; Meaney, Paul M.; Paulsen, Keith D.

    2016-04-15

    Purpose: The authors have developed a new 3D breast image reconstruction technique that utilizes the soft tissue spatial resolution of magnetic resonance imaging (MRI) and integrates the dielectric property differentiation from microwave imaging to produce a dual modality approach with the goal of augmenting the specificity of MR imaging, possibly without the need for nonspecific contrast agents. The integration is performed through the application of a soft prior regularization which imports segmented geometric meshes generated from MR exams and uses them to constrain the microwave tomography algorithm to recover nearly uniform property distributions within segmented regions with sharp delineation between these internal subzones. Methods: Previous investigations have demonstrated that this approach is effective in 2D simulation and phantom experiments and also in clinical exams. The current study extends the algorithm to 3D and provides a thorough analysis of the sensitivity and robustness to misalignment errors in size and location between the spatial prior information and the actual data. Results: Image results in 3D were not strongly dependent on reconstruction mesh density, and changes of less than 30% in recovered property values arose from variations of more than 125% in target region size, an outcome which was more robust than in 2D. Similarly, changes of less than 13% occurred in the 3D image results from variations in target location of nearly 90% of the inclusion size. Permittivity and conductivity errors were about 5 times and 2 times smaller, respectively, with the 3D spatial prior algorithm in actual phantom experiments than those which occurred without priors. Conclusions: The presented study confirms that the incorporation of structural information in the form of a soft constraint can considerably improve the accuracy of the property estimates in predefined regions of interest. These findings are encouraging and establish a strong foundation for using the soft prior technique in clinical studies, where the authors' microwave imaging system and MRI can simultaneously collect breast exam data in patients.
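A soft prior constraint of the kind this record describes can be sketched as a regularization matrix that penalizes property differences only between mesh nodes sharing the same MR-segmented region label. This is a simplified stand-in for the paper's formulation; the labels and property values are hypothetical.

```python
# Soft-prior regularization matrix: L @ x vanishes when the property vector x
# is uniform within each segmented region (region labels are hypothetical).
import numpy as np

labels = np.array([0, 0, 0, 1, 1])   # MR-derived region label per mesh node
n = len(labels)
L = np.zeros((n, n))
for i in range(n):
    same = np.flatnonzero(labels == labels[i])   # nodes in the same region as i
    for j in same:
        L[i, j] = 1.0 if i == j else -1.0 / (len(same) - 1)

x = np.array([3.0, 3.0, 3.0, 7.0, 7.0])  # piecewise-uniform permittivity values
residual = L @ x                          # zero for region-wise uniform x
```

Adding a term like ||L x||^2 to the reconstruction objective pushes the recovered properties toward uniformity within each segmented subzone while leaving the boundaries between zones sharp.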

  8. [Interlaboratory Study on Evaporation Residue Test for Food Contact Products (Report 1)].

    PubMed

    Ohno, Hiroyuki; Mutsuga, Motoh; Abe, Tomoyuki; Abe, Yutaka; Amano, Homare; Ishihara, Kinuyo; Ohsaka, Ikue; Ohno, Haruka; Ohno, Yuichiro; Ozaki, Asako; Kakihara, Yoshiteru; Kobayashi, Hisashi; Sakuragi, Hiroshi; Shibata, Hiroshi; Shirono, Katsuhiro; Sekido, Haruko; Takasaka, Noriko; Takenaka, Yu; Tajima, Yoshiyasu; Tanaka, Aoi; Tanaka, Hideyuki; Tonooka, Hiroyuki; Nakanishi, Toru; Nomura, Chie; Haneishi, Nahoko; Hayakawa, Masato; Miura, Toshihiko; Yamaguchi, Miku; Watanabe, Kazunari; Sato, Kyoko

    2018-01-01

    An interlaboratory study was performed to evaluate the equivalence between an official method and a modified method of evaporation residue test using three food-simulating solvents (water, 4% acetic acid and 20% ethanol), based on the Japanese Food Sanitation Law for food contact products. Twenty-three laboratories participated, and tested the evaporation residues of nine test solutions as blind duplicates. For evaporation, a water bath was used in the official method, and a hot plate in the modified method. In most laboratories, the test solutions were heated until just prior to evaporation to dryness, and then allowed to dry under residual heat. Statistical analysis revealed that there was no significant difference between the two methods, regardless of the heating equipment used. Accordingly, the modified method provides performance equal to the official method, and is available as an alternative method.

  9. Manipulation method for the treatment of ankle equinus.

    PubMed

    Dananberg, H J; Shearstone, J; Guillano, M

    2000-09-01

    Ankle equinus is a well-known clinical entity that has previously been shown to compound a variety of foot and ankle conditions. Treatments for this disorder have included surgery to lengthen the Achilles tendon and daily stretching. This article describes a method of manual manipulation that can immediately and substantially increase ankle joint dorsiflexion. Patients treated with manipulation in the current study demonstrated nearly twice as much dorsiflexion motion as that demonstrated by patients in a prior study who were treated with a 5-minute daily stretching program for 6 months.

  10. Examination of the mechanism of action of two pre-quit pharmacotherapies for smoking cessation.

    PubMed

    Ferguson, Stuart G; Walters, Julia A E; Lu, Wenying; Wells, Gudrun P; Schüz, Natalie

    2015-12-21

    There is substantial scope for improvement in the current arsenal of smoking cessation methods and techniques: even when front-line cessation treatments are utilized, smokers are still more likely to fail than to succeed. Studies testing the incremental benefit of using nicotine patch for 1-4 weeks prior to quitting have shown that pre-quit nicotine patch use produces a robust incremental improvement over standard post-quit patch treatment. The primary objective of the current study is to test the mechanism of action of two pre-quit smoking cessation medications (varenicline and nicotine patch) in order to learn how best to optimize these pre-quit treatments. The study is a three-group, randomized, open-label controlled clinical trial. Participants (n = 216 interested quitters) will be randomized to receive standard patch treatment (10 weeks of patch starting from a designated quit day), pre-quit patch treatment (two weeks of patch treatment prior to a quit day, followed by 10 weeks of post-quit treatment) or varenicline (starting two weeks prior to quit day followed by 10 weeks post-quit). Participants will use study-specific modified smart-phones to monitor their smoking, withdrawal symptoms, craving, mood and social situations in near real-time over four weeks: two weeks prior to an assigned quit date and two weeks after this date. Smoking and abstinence will be assessed at regular study visits and biochemically verified. Understanding how nicotine patches and varenicline influence abstinence may allow for better tailoring of these treatments to individual smokers. Australian New Zealand Clinical Trials Registry, ACTRN12614000329662 (Registered: 27 March 2014).

  11. The Effects of Weaning Methods on Gut Microbiota Composition and Horse Physiology

    PubMed Central

    Mach, Núria; Foury, Aline; Kittelmann, Sandra; Reigner, Fabrice; Moroldo, Marco; Ballester, Maria; Esquerré, Diane; Rivière, Julie; Sallé, Guillaume; Gérard, Philippe; Moisan, Marie-Pierre; Lansade, Léa

    2017-01-01

    Weaning has been described as one of the most stressful events in the life of horses. Given the importance of the interaction between the gut-brain axis and gut microbiota under stress, we evaluated (i) the effect of two different weaning methods on the composition of gut microbiota across time and (ii) how the shifts of gut microbiota composition after weaning affect the host. A total of 34 foals were randomly subjected to a progressive (P) or an abrupt (A) weaning method. In the P method, mares were separated from foals at progressively increasing intervals every day, starting from five min during the fourth week prior to weaning and ending with 6 h during the last week before weaning. In the A method, mares and foals were never separated prior to weaning (0 d). Different host phenotypes and gut microbiota composition were studied across 6 age strata (days −30, 0, 3, 5, 7, and 30 after weaning) by 16S rRNA gene sequencing. Results revealed that the beneficial species belonging to Prevotella, Paraprevotella, and Ruminococcus were more abundant in the A group prior to weaning compared to the P group, suggesting that the gut microbiota in the A cohort was better adapted to weaning. Streptococcus, on the other hand, showed the opposite pattern after weaning. Fungal loads, which are thought to increase the capacity for fermenting the complex polysaccharides from diet, were higher in P relative to A. Beyond the effects of weaning methods, maternal separation at weaning markedly shifted the composition of the gut microbiota in all foals, which fell into three distinct community types at 3 days post-weaning. Most genera in community type 2 (i.e., Eubacterium, Coprococcus, Clostridium XI, and Blautia spp.) were negatively correlated with salivary cortisol levels, but positively correlated with telomere length and N-butyrate production. Average daily gain was also greater in the foals harboring a community type 2 microbiota. 
Therefore, community type 2 is likely to confer better stress response adaptation following weaning. This study identified potential microbial biomarkers that could predict the likelihood for physiological adaptations to weaning in horses, although causality remains to be addressed. PMID:28790932

  12. Advances in Time Estimation Methods for Molecular Data.

    PubMed

    Kumar, Sudhir; Hedges, S Blair

    2016-04-01

    Molecular dating has become central to placing a temporal dimension on the tree of life. Methods for estimating divergence times have been developed for over 50 years, beginning with the proposal of the molecular clock in 1962. We categorize the chronological development of these methods into four generations based on the timing of their origin. In the first generation approaches (1960s-1980s), a strict molecular clock was assumed to date divergences. In the second generation approaches (1990s), the equality of evolutionary rates between species was first tested and then a strict molecular clock applied to estimate divergence times. The third generation approaches (since ∼2000) account for differences in evolutionary rates across the tree by using a statistical model, obviating the need to assume a clock or to test the equality of evolutionary rates among species. Bayesian methods in the third generation require a specific or uniform prior on the speciation process and enable the inclusion of uncertainty in clock calibrations. The fourth generation approaches (since 2012) allow rates to vary from branch to branch, but do not need prior selection of a statistical model to describe the rate variation or the specification of a speciation model. With high accuracy, comparable to Bayesian approaches, and speeds that are orders of magnitude faster, fourth generation methods are able to produce reliable timetrees of thousands of species using genome scale data. We found that early time estimates from second generation studies are similar to those of third and fourth generation studies, indicating that methodological advances have not fundamentally altered the timetree of life, but rather have facilitated time estimation by enabling the inclusion of more species. 
Nonetheless, we feel an urgent need for testing the accuracy and precision of third and fourth generation methods, including their robustness to misspecification of priors in the analysis of large phylogenies and data sets. © The Author(s) 2016. Published by Oxford University Press on behalf of the Society for Molecular Biology and Evolution. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
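The first-generation strict clock that this record describes reduces to simple arithmetic: with a constant substitution rate r per lineage, a pairwise genetic distance d implies a divergence time t = d / (2r). A one-function sketch with hypothetical rate and distance values:

```python
# Strict molecular clock: t = d / (2r). The rate and distance are hypothetical.
def divergence_time(d, r):
    """d: substitutions/site separating two species;
    r: substitutions/site per million years, per lineage."""
    return d / (2.0 * r)

t_myr = divergence_time(d=0.12, r=0.002)  # divergence time in Myr under these assumptions
```

Later generations replace the single constant r with per-branch rate models, which is why they need either rate-equality tests or statistical rate priors.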

  13. The Effects of Weaning Methods on Gut Microbiota Composition and Horse Physiology.

    PubMed

    Mach, Núria; Foury, Aline; Kittelmann, Sandra; Reigner, Fabrice; Moroldo, Marco; Ballester, Maria; Esquerré, Diane; Rivière, Julie; Sallé, Guillaume; Gérard, Philippe; Moisan, Marie-Pierre; Lansade, Léa

    2017-01-01

    Weaning has been described as one of the most stressful events in the life of horses. Given the importance of the interaction between the gut-brain axis and gut microbiota under stress, we evaluated (i) the effect of two different weaning methods on the composition of gut microbiota across time and (ii) how the shifts of gut microbiota composition after weaning affect the host. A total of 34 foals were randomly subjected to a progressive (P) or an abrupt (A) weaning method. In the P method, mares were separated from foals at progressively increasing intervals every day, starting from five min during the fourth week prior to weaning and ending with 6 h during the last week before weaning. In the A method, mares and foals were never separated prior to weaning (0 d). Different host phenotypes and gut microbiota composition were studied across 6 age strata (days -30, 0, 3, 5, 7, and 30 after weaning) by 16S rRNA gene sequencing. Results revealed that the beneficial species belonging to Prevotella, Paraprevotella, and Ruminococcus were more abundant in the A group prior to weaning compared to the P group, suggesting that the gut microbiota in the A cohort was better adapted to weaning. Streptococcus, on the other hand, showed the opposite pattern after weaning. Fungal loads, which are thought to increase the capacity for fermenting the complex polysaccharides from diet, were higher in P relative to A. Beyond the effects of weaning methods, maternal separation at weaning markedly shifted the composition of the gut microbiota in all foals, which fell into three distinct community types at 3 days post-weaning. Most genera in community type 2 (i.e., Eubacterium, Coprococcus, Clostridium XI, and Blautia spp.) were negatively correlated with salivary cortisol levels, but positively correlated with telomere length and N-butyrate production. Average daily gain was also greater in the foals harboring a community type 2 microbiota. 
Therefore, community type 2 is likely to confer better stress response adaptation following weaning. This study identified potential microbial biomarkers that could predict the likelihood for physiological adaptations to weaning in horses, although causality remains to be addressed.

  14. Prior failed ipsilateral percutaneous endovascular intervention in patients with critical limb ischemia predicts poor outcome after lower extremity bypass

    PubMed Central

    Nolan, Brian W.; De Martino, Randall R.; Stone, David H.; Schanzer, Andres; Goodney, Philip P.; Walsh, Daniel W.; Cronenwett, Jack L.

    2017-01-01

    Background Although open surgical bypass remains the standard revascularization strategy for patients with critical limb ischemia (CLI), many centers now perform peripheral endovascular intervention (PVI) as the first-line treatment for these patients. We sought to determine the effect of a prior ipsilateral PVI (iPVI) on the outcome of subsequent lower extremity bypass (LEB) in patients with CLI. Methods A retrospective cohort analysis of all patients undergoing infrainguinal LEB between 2003 and 2009 within hospitals comprising the Vascular Study Group of New England (VSGNE) was performed. Primary study endpoints were major amputation and graft occlusion at 1 year postoperatively. Secondary outcomes included in-hospital major adverse events (MAE), 1-year mortality, and composite 1-year major adverse limb events (MALE). Event rates were determined using life table analyses and comparisons were performed using the log-rank test. Multivariate predictors were determined using a Cox proportional hazards model with multilevel hierarchical adjustment. Results Of 1880 LEBs performed, 32% (n = 603) had a prior infrainguinal revascularization procedure (iPVI, 7%; ipsilateral bypass, 15%; contralateral PVI, 3%; contralateral bypass, 17%). Patients with prior iPVI, compared with those without a prior iPVI, were more likely to be women (32 vs 41%; P = .04), less likely to have tissue loss (52% vs 63%; P = .02), more likely to require arm vein conduit (16% vs 5%; P = .001), and more likely to be on statin (71% vs 54%; P = .01) and beta blocker therapy (92% vs 81%; P = .01) at the time of their bypass procedure. Other demographic factors were similar between these groups. Prior PVI or bypass did not alter 30-day MAE and 1-year mortality after the index bypass. 
    In contrast, 1-year major amputation and 1-year graft occlusion rates were significantly higher in patients who had prior iPVI than in those without (31% vs 20%; P = .046 and 28% vs 18%; P = .009), similar to patients who had a prior ipsilateral bypass (1-year major amputation, 29% vs 20%; P = .022; 1-year graft occlusion, 33% vs 18%; P = .001). Independent multivariate predictors of higher 1-year amputation and graft occlusion rates were prior iPVI, prior ipsilateral bypass, dialysis dependence, prosthetic conduit, and distal (tibial and pedal) bypass target. Conclusions Prior iPVI is highly predictive of poor outcome in patients undergoing LEB for CLI, with higher 1-year amputation and graft occlusion rates than in those without prior revascularization, similar to prior ipsilateral bypass. These findings provide information that may help with the complex decisions surrounding revascularization options in patients with CLI. PMID:21802888

  15. Improvement in rice straw pulp bleaching effluent quality by incorporating oxygen delignification stage prior to elemental chlorine-free bleaching.

    PubMed

    Kaur, Daljeet; Bhardwaj, Nishi K; Lohchab, Rajesh Kumar

    2017-10-01

    Environmental degradation caused by industrial and other developmental activities makes environmental management through improved production processes imperative. Pulp and paper mills are now focusing on using nonwood-based raw materials to protect forest resources. In the present study, rice straw was utilized for pulp production as it is easily and abundantly available as well as rich in carbohydrates (cellulose and hemicelluloses). The soda-anthraquinone method was used for pulp production as it is widely accepted for agro residues. The bleaching process during paper production is the chief source of wastewater generation. The chlorophenolic compounds generated during bleaching are highly toxic, mutagenic, and bioaccumulative in nature. The objectives of the study were to use an oxygen delignification (ODL) stage prior to elemental chlorine-free (ECF) bleaching to reduce wastewater load and to study its impact on bleached pulp characteristics. The ODL stage prior to ECF bleaching improved the optical properties of the pulp in comparison to ECF bleaching alone. When the ODL stage was incorporated prior to bleaching, the tensile index and folding endurance of the pulp were found to be 56.6 ± 1.5 Nm/g and 140, respectively, very high in comparison to ECF alone. A potential reduction of 51, 57, 43, and 53% in BOD3, COD, color, and AOX, respectively, was observed on adding the ODL stage compared to ECF only. Generation of chlorophenolic compounds was reduced significantly. Incorporation of the ODL stage prior to bleaching was found to be highly promising for reducing the toxicity of bleaching effluents and may lead to better management of nearby water resources.

  16. Prospective multi-centre Voxel Based Morphometry study employing scanner specific segmentations: Procedure development using CaliBrain structural MRI data

    PubMed Central

    2009-01-01

    Background Structural Magnetic Resonance Imaging (sMRI) of the brain is employed in the assessment of a wide range of neuropsychiatric disorders. In order to improve statistical power in such studies it is desirable to pool scanning resources from multiple centres. The CaliBrain project was designed to provide for an assessment of scanner differences at three centres in Scotland, and to assess the practicality of pooling scans from multiple-centres. Methods We scanned healthy subjects twice on each of the 3 scanners in the CaliBrain project with T1-weighted sequences. The tissue classifier supplied within the Statistical Parametric Mapping (SPM5) application was used to map the grey and white tissue for each scan. We were thus able to assess within scanner variability and between scanner differences. We have sought to correct for between scanner differences by adjusting the probability mappings of tissue occupancy (tissue priors) used in SPM5 for tissue classification. The adjustment procedure resulted in separate sets of tissue priors being developed for each scanner and we refer to these as scanner specific priors. Results Voxel Based Morphometry (VBM) analyses and metric tests indicated that the use of scanner specific priors reduced tissue classification differences between scanners. However, the metric results also demonstrated that the between scanner differences were not reduced to the level of within scanner variability, the ideal for scanner harmonisation. Conclusion Our results indicate the development of scanner specific priors for SPM can assist in pooling of scan resources from different research centres. This can facilitate improvements in the statistical power of quantitative brain imaging studies. PMID:19445668

  17. A blind deconvolution method based on L1/L2 regularization prior in the gradient space

    NASA Astrophysics Data System (ADS)

    Cai, Ying; Shi, Yu; Hua, Xia

    2018-02-01

    Because of noise, restored images can differ markedly from the true image, making image restoration an ill-posed problem. To address this, a blind deconvolution method based on an L1/L2 regularization prior in the gradient domain is proposed. The method first introduces the ratio of the L1 norm to the L2 norm as prior knowledge and uses this ratio as the penalty term in the high-frequency domain of the image. The penalty is then iteratively updated, and an iterative shrinkage-thresholding algorithm is applied to solve for the high-frequency image. Since gradient-domain information is better suited to estimating the blur kernel, the kernel is estimated in the gradient domain; this step can be computed efficiently in the frequency domain via the fast Fourier transform (FFT). A multi-scale iterative optimization scheme is added to improve the effectiveness of the algorithm. The proposed blind deconvolution method based on L1/L2 regularization priors in the gradient space obtains a unique and stable solution to the restoration problem, preserving the edges and details of the image while ensuring the accuracy of the results.
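The L1/L2 penalty favors sparse gradients: blurring spreads edge energy across many small gradient values, which raises the ratio. A minimal sketch of the measure (simple finite differences stand in for the paper's high-frequency decomposition; the images are hypothetical):

```python
# L1/L2 sparsity measure on image gradients; lower values indicate the
# sparser gradient distribution of a sharp image.
import numpy as np

def l1_over_l2_gradients(img):
    gx = np.diff(img, axis=1).ravel()   # horizontal finite differences
    gy = np.diff(img, axis=0).ravel()   # vertical finite differences
    g = np.concatenate([gx, gy])
    l2 = np.linalg.norm(g)
    return np.abs(g).sum() / l2 if l2 > 0 else 0.0

sharp = np.array([[0.0, 0.0, 1.0, 1.0],
                  [0.0, 0.0, 1.0, 1.0]])
blurred = np.array([[0.0, 0.25, 0.75, 1.0],
                    [0.0, 0.25, 0.75, 1.0]])
r_sharp = l1_over_l2_gradients(sharp)
r_blur = l1_over_l2_gradients(blurred)
```

Minimizing this ratio during deconvolution therefore steers the estimate back toward a sharp image, unlike a plain L1 penalty, which can prefer the blurred solution.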

  18. Subject-Specific Sparse Dictionary Learning for Atlas-Based Brain MRI Segmentation.

    PubMed

    Roy, Snehashis; He, Qing; Sweeney, Elizabeth; Carass, Aaron; Reich, Daniel S; Prince, Jerry L; Pham, Dzung L

    2015-09-01

    Quantitative measurements from segmentations of human brain magnetic resonance (MR) images provide important biomarkers for normal aging and disease progression. In this paper, we propose a patch-based tissue classification method from MR images that uses a sparse dictionary learning approach and atlas priors. Training data for the method consists of an atlas MR image, prior information maps depicting where different tissues are expected to be located, and a hard segmentation. Unlike most atlas-based classification methods that require deformable registration of the atlas priors to the subject, only affine registration is required between the subject and training atlas. A subject-specific patch dictionary is created by learning relevant patches from the atlas. Then the subject patches are modeled as sparse combinations of learned atlas patches leading to tissue memberships at each voxel. The combination of prior information in an example-based framework enables us to distinguish tissues having similar intensities but different spatial locations. We demonstrate the efficacy of the approach on the application of whole-brain tissue segmentation in subjects with healthy anatomy and normal pressure hydrocephalus, as well as lesion segmentation in multiple sclerosis patients. For each application, quantitative comparisons are made against publicly available state-of-the art approaches.
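The patch-modeling step can be caricatured by a single matching-pursuit iteration: select the atlas patch most correlated with the subject patch, and let its tissue label contribute to the membership estimate. The atoms, labels, and patch below are tiny hypothetical vectors, not a learned dictionary.

```python
# One matching-pursuit step over a toy patch dictionary: the best-matching
# atlas patch (atom) carries its tissue label to the subject patch.
import numpy as np

atoms = np.array([[1.0, 0.0, 0.0],      # rows = unit-norm atlas patches
                  [0.0, 1.0, 0.0],
                  [0.0, 0.0, 1.0]])
tissue = ["csf", "gray", "white"]       # tissue label attached to each atom

def best_match(patch):
    scores = atoms @ patch              # correlation of patch with each atom
    k = int(np.argmax(np.abs(scores)))
    return tissue[k], scores[k]

label, coef = best_match(np.array([0.1, 0.9, 0.2]))
```

The actual method solves a sparse-coding problem over many learned atoms and blends the labels of all selected atoms into soft tissue memberships; this single-atom step only illustrates the label-transfer idea.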

  19. Evaluation of prior photorefractive keratectomy in donor tissue.

    PubMed

    Terry, M A; Ousley, P J; Rich, L F; Wilson, D J

    1999-05-01

    To describe a case in which an eye donor had prior bilateral photorefractive keratectomies and to elucidate possible methods of evaluation and screening of donor tissue. Case report. A 62-year-old eye donor was reported to have received radial keratotomy before his death. Further investigation by the eye bank showed a history of photorefractive keratectomy (PRK), not radial keratotomy. The corneas were therefore not used for transplantation, and the eyes were evaluated by slit-lamp examination, photography, corneal topography, and histology. Slit-lamp and photographic examination did not indicate the presence of PRK ablations. Corneal topography mapping with the TMS-1 was relatively ambiguous for identifying PRK flattening, while multiple data formatting of the cornea with the Orbscan resulted in the strongest suggestion of prior PRK. Histologic analysis showed central corneal thinning and loss of Bowman's membrane consistent with PRK. In the absence of a positive donor history of PRK, current methods of screening donor tissue for prior PRK often are insufficient to exclude these corneas from use in transplantation. More refined Placido imagery corneal topography or newer technologies such as the Orbscan may allow more sensitive and specific methods of donor tissue screening.

  20. Impact of video technology on efficiency of pharmacist-provided anticoagulation counseling and patient comprehension.

    PubMed

    Moore, Sarah J; Blair, Elizabeth A; Steeb, David R; Reed, Brent N; Hull, J Heyward; Rodgers, Jo Ellen

    2015-06-01

    Discharge anticoagulation counseling is important for ensuring patient comprehension and optimizing clinical outcomes. As pharmacy resources become increasingly limited, the impact of informational videos on the counseling process becomes more relevant. To evaluate differences in pharmacist time spent counseling and patient comprehension (measured by the Oral Anticoagulation Knowledge [OAK] test) between informational videos and traditional face-to-face (oral) counseling. This prospective, open, parallel-group study at an academic medical center randomized 40 individuals, 17 warfarin-naïve ("New Start") and 23 with prior warfarin use ("Restart"), to receive warfarin discharge education by video or face-to-face counseling. "Teach-back" questions were used in both groups. Although overall pharmacist time was reduced in the video counseling group (P < 0.001), an interaction between prior warfarin use and counseling method (P = 0.012) suggests the difference between counseling methods was smaller in New Start participants. Following adjustment, mean total time was reduced by 8.71 (95% CI = 5.15-12.26) minutes (adjusted P < 0.001) in Restart participants and by 2.31 (-2.19 to 6.81) minutes (adjusted P = 0.472) in New Start participants receiving video counseling. Postcounseling OAK test scores did not differ. Age, gender, socioeconomic status, and years of education were not predictive of total time or OAK test score. Use of informational videos coupled with teach-back questions significantly reduced pharmacist time spent on anticoagulation counseling without compromising short-term patient comprehension, primarily in patients with prior warfarin use. Study results demonstrate that video technology provides an efficient method of anticoagulation counseling while achieving similar comprehension. © The Author(s) 2015.

  1. The combination of short rest and energy drink consumption as fatigue countermeasures during a prolonged drive of professional truck drivers.

    PubMed

    Ronen, Adi; Oron-Gilad, Tal; Gershon, Pnina

    2014-06-01

    One of the major concerns for professional drivers is fatigue. Many studies have evaluated specific fatigue countermeasures, in many cases comparing the efficiency of each method separately. The present study evaluated the effectiveness of rest areas combined with consumption of energy drinks on professional truck drivers during a prolonged simulated drive. Fifteen professional truck drivers participated in three experimental sessions: a control session, in which drivers drank 500 ml of a placebo drink prior to the beginning of the drive; an energy drink session, in which drivers drank 500 ml of an energy drink containing 160 mg of caffeine prior to the beginning of the drive; and an energy drink + rest session, in which drivers drank 500 ml of an energy drink prior to driving and rested for 10 min at a designated rest area zone 100 min into the drive. For all sessions, driving duration was approximately 150 min and consisted of driving on a monotonous, two-way rural road. In addition to driving performance measures, subjective measures and heart rate variability were obtained. Results indicated that consumption of an energy drink (in both sessions) facilitated lower lane position deviations and reduced steering wheel deviations during the first 80-100 min of the drive relative to the control sessions. Resting after 100 min of driving, in addition to the energy drink that was consumed before the drive, enabled the drivers to maintain these abilities throughout the remainder of the driving session. Practical applications: The results of this research indicate the possible added value of combining fatigue countermeasures during a prolonged drive and the importance of the timing of each method's use. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Group-regularized individual prediction: theory and application to pain.

    PubMed

    Lindquist, Martin A; Krishnan, Anjali; López-Solà, Marina; Jepma, Marieke; Woo, Choong-Wan; Koban, Leonie; Roy, Mathieu; Atlas, Lauren Y; Schmidt, Liane; Chang, Luke J; Reynolds Losin, Elizabeth A; Eisenbarth, Hedwig; Ashar, Yoni K; Delk, Elizabeth; Wager, Tor D

    2017-01-15

Multivariate pattern analysis (MVPA) has become an important tool for identifying brain representations of psychological processes and clinical outcomes using fMRI and related methods. Such methods can be used to predict or 'decode' psychological states in individual subjects. Single-subject MVPA approaches, however, are limited by the amount and quality of individual-subject data. In spite of higher spatial resolution, predictive accuracy from single-subject data often does not exceed what can be accomplished using coarser, group-level maps, because single-subject patterns are trained on limited amounts of often-noisy data. Here, we present a method that combines population-level priors, in the form of biomarker patterns developed on prior samples, with single-subject MVPA maps to improve single-subject prediction. Theoretical results and simulations motivate a weighting based on the relative variances of biomarker-based prediction (based on population-level predictive maps from prior groups) and individual-subject, cross-validated prediction. Empirical results predicting pain using brain activity on a trial-by-trial basis (single-trial prediction) across 6 studies (N=180 participants) confirm the theoretical predictions. Regularization based on a population-level biomarker (in this case, the Neurologic Pain Signature, or NPS) improved single-subject prediction accuracy compared with idiographic maps based on the individuals' data alone. The regularization scheme that we propose, which we term group-regularized individual prediction (GRIP), can be applied broadly to within-person MVPA-based prediction. We also show how GRIP can be used to evaluate data quality and provide benchmarks for the appropriateness of population-level maps like the NPS for a given individual or study. Copyright © 2015 Elsevier Inc. All rights reserved.
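The variance-based weighting that motivates GRIP can be illustrated as a simple precision-weighted average of the two predictions. This is a hedged sketch: the function name and all numbers below are illustrative stand-ins, not values or code from the paper.

```python
import numpy as np

def grip_predict(pred_group, pred_indiv, var_group, var_indiv):
    """Combine a population-level (biomarker) prediction with an individual
    cross-validated prediction, weighting each inversely to its estimated
    error variance (illustrative sketch of the weighting idea)."""
    w_group = (1.0 / var_group) / (1.0 / var_group + 1.0 / var_indiv)
    return w_group * pred_group + (1.0 - w_group) * pred_indiv

# Hypothetical single-trial pain predictions for one subject: the individual
# map is noisier (larger error variance), so the group biomarker gets more weight.
group = np.array([3.1, 4.0, 2.5])   # from a population-level predictive map
indiv = np.array([2.7, 4.6, 2.2])   # from the subject's own cross-validated map
combined = grip_predict(group, indiv, var_group=1.0, var_indiv=4.0)
```

When the individual map is well estimated (small `var_indiv`), the weighting shifts toward the idiographic prediction, which is consistent with the paper's use of GRIP as a benchmark for data quality.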

  3. Comparison of haematology, coagulation and clinical chemistry parameters in blood samples from the sublingual vein and vena cava in Sprague-Dawley rats.

    PubMed

    Seibel, J; Bodié, K; Weber, S; Bury, D; Kron, M; Blaich, G

    2010-10-01

The investigation of clinical pathology parameters (haematology, clinical chemistry and coagulation) is an important part of the preclinical evaluation of drug safety. However, the blood sampling method employed should avoid or minimize stress and injury in laboratory animals. In the present study, we compared the clinical pathology results from blood samples collected terminally from the vena cava (VC) immediately before necropsy with samples taken from the sublingual vein (VS) also prior to necropsy in order to determine whether the sampling method has an influence on clinical pathology parameters. Forty-six 12-week-old male Sprague-Dawley rats were assigned to two groups (VC or VS; n = 23 each). All rats were anaesthetized with isoflurane prior to sampling. In the VC group, blood was withdrawn from the inferior VC. For VS sampling, the tongue was gently pulled out and the VS was punctured. The haematology, coagulation and clinical chemistry parameters were compared. Equivalence was established for 13 parameters, such as mean corpuscular volume, white blood cells and calcium. No equivalence was found for the remaining 26 parameters, although they were considered to be similar when compared with the historical data and normal ranges. The most conspicuous finding was that activated prothrombin time was 30.3% less in blood taken from the VC (16.6 ± 0.89 s) than in the VS samples (23.8 ± 1.58 s). In summary, blood sampling from the inferior VC prior to necropsy appears to be a suitable and reliable method for terminal blood sampling that reduces stress and injury to laboratory rats in preclinical drug safety studies.

  4. Applications of the Ultrasonic Serial Number Restoration Technique to Guns and Typical Stolen Articles

    NASA Technical Reports Server (NTRS)

    Young, S. G.

    1976-01-01

    An ultrasonic cavitation method for restoring obliterated serial numbers has been further explored by application to articles involved in police cases. The method was applied successfully to gun parts. In one case portions of numbers were restored after prior failure by other laboratories using chemical etching techniques. The ultrasonic method was not successful on a heavily obliterated and restamped automobile engine block, but it was partially successful on a motorcycle gear-case housing. Additional studies were made on the effect of a larger diameter ultrasonic probe, and on the method's ability to restore numbers obliterated by peening.

  5. Reaffirmed limitations of meta-analytic methods in the study of mild traumatic brain injury: a response to Rohling et al.

    PubMed

    Bigler, Erin D; Farrer, Thomas J; Pertab, Jon L; James, Kelly; Petrie, Jo Ann; Hedges, Dawson W

    2013-01-01

In 2009 Pertab, James, and Bigler published a critique of two prior meta-analyses by Binder, Rohling, and Larrabee (1997) and Frencham, Fox, and Maybery (2005) that showed small effect size differences at least 3 months post-injury in individuals who had sustained a mild traumatic brain injury (mTBI). The Binder et al. and Frencham et al. meta-analyses have been widely cited as showing no lasting effect of mTBI. In their critique, Pertab et al. (2009) pointed out many limitations of these two prior meta-analyses, demonstrating that, depending on how inclusion/exclusion criteria were defined, different meta-analytic findings occur, some supporting the persistence of neuropsychological impairments beyond 3 months. Rohling et al. (2011) have now critiqued Pertab et al. (2009). Herein we respond to the Rohling et al. (2011) critique, reaffirming the original findings of Pertab et al. (2009) and providing additional details concerning the flaws in prior meta-analytic mTBI studies and the effects on neuropsychological performance.

  6. Measurement Uncertainty of Dew-Point Temperature in a Two-Pressure Humidity Generator

    NASA Astrophysics Data System (ADS)

    Martins, L. Lages; Ribeiro, A. Silva; Alves e Sousa, J.; Forbes, Alistair B.

    2012-09-01

    This article describes the measurement uncertainty evaluation of the dew-point temperature when using a two-pressure humidity generator as a reference standard. The estimation of the dew-point temperature involves the solution of a non-linear equation for which iterative solution techniques, such as the Newton-Raphson method, are required. Previous studies have already been carried out using the GUM method and the Monte Carlo method but have not discussed the impact of the approximate numerical method used to provide the temperature estimation. One of the aims of this article is to take this approximation into account. Following the guidelines presented in the GUM Supplement 1, two alternative approaches can be developed: the forward measurement uncertainty propagation by the Monte Carlo method when using the Newton-Raphson numerical procedure; and the inverse measurement uncertainty propagation by Bayesian inference, based on prior available information regarding the usual dispersion of values obtained by the calibration process. The measurement uncertainties obtained using these two methods can be compared with previous results. Other relevant issues concerning this research are the broad application to measurements that require hygrometric conditions obtained from two-pressure humidity generators and, also, the ability to provide a solution that can be applied to similar iterative models. The research also studied the factors influencing both the use of the Monte Carlo method (such as the seed value and the convergence parameter) and the inverse uncertainty propagation using Bayesian inference (such as the pre-assigned tolerance, prior estimate, and standard deviation) in terms of their accuracy and adequacy.
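The forward Monte Carlo propagation through an iterative solver described above can be sketched as follows. The equation solved here is a generic stand-in (not the actual two-pressure generator model), and all constants and distributions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

def newton_solve(f, fprime, x0, tol=1e-10, max_iter=50):
    """Plain Newton-Raphson iteration."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / fprime(x)
        x -= step
        if abs(step) < tol:
            break
    return x

# Hypothetical stand-in for the dew-point equation: solve exp(a - b/T) = p
# for the temperature T, given a measured quantity p.
a, b = 10.0, 2500.0

def t_dew(p):
    f = lambda t: np.exp(a - b / t) - p
    fp = lambda t: np.exp(a - b / t) * b / t**2
    return newton_solve(f, fp, x0=280.0)

# Forward propagation in the style of GUM Supplement 1: sample the input
# distribution and push each draw through the iterative numerical solver.
p_samples = rng.normal(1.2, 0.01, size=5000)
t_samples = np.array([t_dew(p) for p in p_samples])
t_mean, t_std = t_samples.mean(), t_samples.std(ddof=1)
```

Because the Newton-Raphson solve sits inside the Monte Carlo loop, the reported standard uncertainty `t_std` automatically includes the effect of the numerical approximation (down to the chosen tolerance), which is the point the article raises.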

  7. The Effects of Prior Combat Experience on the Expression of Somatic and Affective Symptoms in Deploying Soldiers

    DTIC Science & Technology

    2006-01-01

Journal of Psychosomatic Res... The effects of prior combat experience on the expression of somatic and affective symptoms in deploying soldiers. William... rates of somatic complaints compared with combat-naive soldiers. Methods: Self-reports of posttraumatic stress disorder (PTSD) and affective and somatic ... identical for the experienced and inexperienced groups, scores on the Affective and Somatic scales differed as a function of prior combat history. Previous...

  8. Object Recognition using Feature- and Color-Based Methods

    NASA Technical Reports Server (NTRS)

    Duong, Tuan; Duong, Vu; Stubberud, Allen

    2008-01-01

An improved adaptive method of processing image data in an artificial neural network has been developed to enable automated, real-time recognition of possibly moving objects under changing (including suddenly changing) conditions of illumination and perspective. The method involves a combination of two prior object-recognition methods, one based on adaptive detection of shape features and the other based on adaptive color segmentation, to enable recognition in situations in which either prior method by itself may be inadequate. The chosen prior feature-based method is known as adaptive principal-component analysis (APCA); the chosen prior color-based method is known as adaptive color segmentation (ACOSE). These methods are made to interact with each other in a closed-loop system to obtain an optimal solution of the object-recognition problem in a dynamic environment. One of the results of the interaction is to increase, beyond what would otherwise be possible, the accuracy of the determination of a region of interest (containing an object that one seeks to recognize) within an image. Another result is to provide a minimized adaptive step that can be used to update the results obtained by the two component methods when changes of color and apparent shape occur. The net effect is to enable the neural network to update its recognition output and improve its recognition capability via an adaptive learning sequence. In principle, the improved method could readily be implemented in integrated circuitry to make a compact, low-power, real-time object-recognition system. It has been proposed to demonstrate the feasibility of such a system by integrating a 256-by-256 active-pixel sensor with APCA, ACOSE, and neural processing circuitry on a single chip. It has been estimated that such a system on a chip would have a volume no larger than a few cubic centimeters, could operate at a rate as high as 1,000 frames per second, and would consume on the order of milliwatts of power.

  9. Sixth Grade Students' Development of Historical Perspective: World War II and the Atomic Bombing of Hiroshima and Nagasaki.

    ERIC Educational Resources Information Center

    Ogawa, Masato

    This study investigated how the use of various teaching methods influenced perspective taking skills of sixth grade middle school students during a unit of instruction on World War II. Three questions directed the study: (1) What do students know about World War II prior to a unit of study on World War II; (2) What do students know about World War…

  10. Discovery learning model with geogebra assisted for improvement mathematical visual thinking ability

    NASA Astrophysics Data System (ADS)

    Juandi, D.; Priatna, N.

    2018-05-01

The main goal of this study is to improve the mathematical visual thinking ability of high school students through implementation of the Discovery Learning Model with GeoGebra assistance. The study used a quasi-experimental method with a non-random pretest-posttest control design. The sample consisted of 62 grade XI students in one senior high school in Bandung district. The required data were collected through documentation, observation, written tests, interviews, daily journals, and student worksheets. The results of this study are: 1) the improvement in mathematical visual thinking ability of students who received the Discovery Learning Model with GeoGebra assistance is significantly higher than that of students who received conventional learning; 2) there is a difference in the improvement of students' mathematical visual thinking ability between groups based on prior mathematical knowledge (high, medium, and low) among students who received the treatment; 3) the improvement in mathematical visual thinking ability of the high group is significantly higher than that of the medium and low groups; 4) the quality of the improvement for students with high and low prior knowledge falls in the moderate category, while high-category improvement quality was achieved by students with medium prior knowledge.

  11. Effect of the Availability of Prior Full-Field Digital Mammography and Digital Breast Tomosynthesis Images on the Interpretation of Mammograms

    PubMed Central

    Catullo, Victor J.; Chough, Denise M.; Ganott, Marie A.; Kelly, Amy E.; Shinde, Dilip D.; Sumkin, Jules H.; Wallace, Luisa P.; Bandos, Andriy I.; Gur, David

    2015-01-01

Purpose To assess the effect of and interaction between the availability of prior images and digital breast tomosynthesis (DBT) images in decisions to recall women during mammogram interpretation. Materials and Methods Verbal informed consent was obtained for this HIPAA-compliant institutional review board–approved protocol. Eight radiologists independently interpreted deidentified mammograms obtained in 153 women (age range, 37–83 years; mean age, 53.7 years ± 9.3 [standard deviation]) twice, in a mode-by-reader-by-case balanced, fully crossed study. Each case consisted of current and prior full-field digital mammography (FFDM) images and DBT images that were acquired in our facility between June 2009 and January 2013. For one reading, sequential ratings were provided by using (a) current FFDM images only, (b) current FFDM and DBT images, and (c) current FFDM, DBT, and prior FFDM images. The other reading consisted of (a) current FFDM images only, (b) current and prior FFDM images, and (c) current FFDM, prior FFDM, and DBT images. Fifty verified cancer cases, 60 negative and benign cases (clinically not recalled), and 43 benign cases (clinically recalled) were included. Recall recommendations and the interaction between the effects of prior FFDM and DBT images were assessed by using a generalized linear model accounting for case and reader variability. Results Average recall rates in noncancer cases were significantly reduced with the addition of prior FFDM images, by 34% (145 of 421) and 32% (106 of 333) without and with DBT images, respectively (P < .001). However, this recall reduction was achieved at the cost of a corresponding 7% (23 of 345) and 4% (14 of 353) reduction in sensitivity (P = .006). In contrast, availability of DBT images resulted in a smaller reduction in recall rates (false-positive interpretations) of 19% (76 of 409) and 26% (71 of 276) without and with prior FFDM images, respectively (P = .001). 
Availability of DBT images resulted in 4% (15 of 338) and 8% (25 of 322) increases in sensitivity, respectively (P = .007). The effects of the availability of prior FFDM images or DBT images did not significantly change regardless of the sequence in presentation (P = .81 and P = .47 for specificity and sensitivity, respectively). Conclusion The availability of prior FFDM or DBT images is a largely independent contributing factor in reducing recall recommendations during mammographic interpretation. © RSNA, 2015 PMID:25768673

  12. TU-AB-BRA-09: A Novel Method of Generating Ultrafast Volumetric Cine MRI (VC-MRI) Using Prior 4D-MRI and On-Board Phase-Skipped Encoding Acquisition for Radiotherapy Target Localization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, C; Yin, F; Harris, W

Purpose: To develop a technique generating ultrafast on-board VC-MRI using prior 4D-MRI and on-board phase-skipped encoding k-space acquisition for real-time 3D target tracking of liver and lung radiotherapy. Methods: The end-of-expiration (EOE) volume in 4D-MRI acquired during the simulation was selected as the prior volume. Three major respiratory deformation patterns were extracted through principal component analysis of the deformation field maps (DFMs) generated between EOE and all other phases. The on-board VC-MRI at each instant was considered as a deformation of the prior volume, and the deformation was modeled as a linear combination of the 3 extracted major deformation patterns. To solve the weighting coefficients of the 3 major patterns, a 2D slice was extracted from the VC-MRI volume to match with the 2D on-board sampling data, which was generated by 8-fold phase skipped-encoding k-space acquisition (i.e., sampling 1 phase-encoding line out of every 8 lines) to achieve an ultrafast 16–24 volumes/s frame rate. The method was evaluated using the XCAT digital phantom to simulate lung cancer patients. The 3D volume of the end-of-inhalation (EOI) phase at the treatment day was used as ground-truth on-board VC-MRI, with simulated changes in 1) breathing amplitude and 2) breathing amplitude/phase from the simulation day. A liver cancer patient case was evaluated for in-vivo feasibility demonstration. Results: The comparison between ground truth and estimated on-board VC-MRI shows good agreement. In the XCAT study with changed breathing amplitude, the volume-percent-difference (VPD) between ground-truth and estimated tumor volumes at EOI was 6.28% and the Center-of-Mass Shift (COMS) was 0.82 mm; with changed breathing amplitude and phase, the VPD was 8.50% and the COMS was 0.54 mm. 
The study of the liver cancer patient case also demonstrated promising in vivo feasibility of the proposed method. Conclusion: Preliminary results suggest the feasibility of estimating ultrafast VC-MRI for on-board target localization with phase skipped-encoding k-space acquisition. Research grant from NIH R01-184173.
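The deformation model in this record (principal components of the deformation field maps, with per-frame weights solved from sparsely sampled on-board data) can be sketched on a toy one-dimensional "volume". This is only an illustration of the linear-algebra structure: the array sizes, the random DFMs, and the every-8th-sample pattern are stand-ins for real image registration and 8-fold phase-skipped k-space acquisition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy stand-in: 9 phase-to-phase deformation-field maps (DFMs),
# each flattened to a vector over "voxels".
n_phases, n_voxels = 9, 200
dfms = rng.normal(size=(n_phases, n_voxels))

# Principal component analysis of the DFMs via SVD; keep 3 major patterns.
mean_dfm = dfms.mean(axis=0)
_, _, vt = np.linalg.svd(dfms - mean_dfm, full_matrices=False)
patterns = vt[:3]

# Simulated on-board frame: a true deformation lying in the span of the
# patterns, observed only at a sparse subset of samples (stand-in for
# 8-fold phase-skipped encoding).
true_w = np.array([1.5, -0.8, 0.3])
true_dfm = mean_dfm + true_w @ patterns
observed = np.arange(0, n_voxels, 8)

# Solve the 3 weighting coefficients by least squares on the sampled data.
A = patterns[:, observed].T
w, *_ = np.linalg.lstsq(A, true_dfm[observed] - mean_dfm[observed], rcond=None)
estimated_dfm = mean_dfm + w @ patterns
```

Here the sparse samples still determine the 3 coefficients exactly because the simulated frame lies in the span of the learned patterns; in the actual method the matching is done against undersampled k-space data rather than voxel samples.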

  13. Evaluating marginal likelihood with thermodynamic integration method and comparison with several other numerical methods

    DOE PAGES

    Liu, Peigui; Elshall, Ahmed S.; Ye, Ming; ...

    2016-02-05

Evaluating marginal likelihood is the most critical and computationally expensive task when conducting Bayesian model averaging to quantify parametric and model uncertainties. The evaluation is commonly done by using Laplace approximations to evaluate semianalytical expressions of the marginal likelihood or by using Monte Carlo (MC) methods to evaluate the arithmetic or harmonic mean of a joint likelihood function. This study introduces a new MC method, i.e., thermodynamic integration, which has not been attempted in environmental modeling. Instead of using samples only from the prior parameter space (as in arithmetic mean evaluation) or the posterior parameter space (as in harmonic mean evaluation), the thermodynamic integration method uses samples generated gradually from the prior to the posterior parameter space. This is done through path sampling that conducts Markov chain Monte Carlo simulation with different power coefficient values applied to the joint likelihood function. The thermodynamic integration method is evaluated using three analytical functions by comparing the method with two variants of the Laplace approximation method and three MC methods, including the nested sampling method that was recently introduced into environmental modeling. The thermodynamic integration method outperforms the other methods in terms of accuracy, convergence, and consistency. The thermodynamic integration method is also applied to a synthetic case of groundwater modeling with four alternative models. The application shows that model probabilities obtained using the thermodynamic integration method improve the predictive performance of Bayesian model averaging. Overall, the thermodynamic integration method is mathematically rigorous, and its MC implementation is computationally general for a wide range of environmental problems.
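A minimal sketch of thermodynamic integration, on a toy conjugate model where the marginal likelihood is known in closed form. Direct sampling from each power posterior stands in for the per-rung Markov chain Monte Carlo described above, and all numbers (data size, rung count, sample count) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy conjugate model: prior theta ~ N(0,1), data y_i ~ N(theta,1),
# so the exact marginal likelihood is available for comparison.
n = 10
y = rng.normal(0.5, 1.0, size=n)

def log_lik(theta):
    # log joint likelihood of y, evaluated for a vector of theta draws
    return (-0.5 * n * np.log(2 * np.pi)
            - 0.5 * np.sum((y[None, :] - theta[:, None]) ** 2, axis=1))

# Path sampling: for each power coefficient beta, draw from the power
# posterior p(theta) ∝ prior(theta) * likelihood(theta)^beta.  For this
# conjugate model the power posterior is normal, so we sample it directly
# (a stand-in for running a Markov chain at each rung).
betas = np.linspace(0.0, 1.0, 21)
e_loglik = []
for b in betas:
    prec = 1.0 + b * n
    mean = b * y.sum() / prec
    draws = rng.normal(mean, np.sqrt(1.0 / prec), size=4000)
    e_loglik.append(log_lik(draws).mean())

# Thermodynamic integration: log Z = integral over beta of E_beta[log L],
# approximated by the trapezoid rule along the path.
log_z_ti = sum(0.5 * (e_loglik[i] + e_loglik[i + 1]) * (betas[i + 1] - betas[i])
               for i in range(len(betas) - 1))

# Exact log marginal likelihood for this conjugate model
log_z_exact = (-0.5 * n * np.log(2 * np.pi) - 0.5 * np.log(1.0 + n)
               - 0.5 * (np.sum(y ** 2) - y.sum() ** 2 / (1.0 + n)))
```

The trapezoid rule over the power-coefficient path is the same construction the study uses; the difference is only that there each rung's expectation comes from MCMC samples rather than exact conjugate draws.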

  14. Exploring expectation effects in EMDR: does prior treatment knowledge affect the degrading effects of eye movements on memories?

    PubMed Central

    Littel, Marianne; van Schie, Kevin; van den Hout, Marcel A.

    2017-01-01

Background: Eye movement desensitization and reprocessing (EMDR) is an effective psychological treatment for posttraumatic stress disorder. Recalling a memory while simultaneously making eye movements (EM) decreases the memory's vividness and/or emotionality. It has been argued that non-specific factors, such as treatment expectancy and experimental demand, may contribute to EMDR's effectiveness. Objective: The present study was designed to test whether expectations about the working mechanism of EMDR would alter the memory-attenuating effects of EM. Two experiments were conducted. In Experiment 1, we examined the effects of pre-existing (non-manipulated) knowledge of EMDR in participants with and without prior knowledge. In Experiment 2, we experimentally manipulated prior knowledge by providing participants without prior knowledge with correct or incorrect information about EMDR's working mechanism. Method: Participants in both experiments recalled two aversive, autobiographical memories during brief sets of EM (Recall+EM) or while keeping their eyes stationary (Recall Only). Before and after the intervention, participants scored their memories on vividness and emotionality. A Bayesian approach was used to compare two competing hypotheses on the effects of (existing or given) prior knowledge: (1) prior (correct) knowledge increases the effects of Recall+EM relative to Recall Only, or (2) prior knowledge does not affect the effects of Recall+EM. Results: Recall+EM caused greater reductions in memory vividness and emotionality than Recall Only in all groups, including the incorrect information group. In Experiment 1, both hypotheses were supported by the data: prior knowledge boosted the effects of EM, but only modestly. In Experiment 2, the second hypothesis was clearly supported over the first: providing knowledge of the underlying mechanism of EMDR did not alter the effects of EM. 
Conclusions: Recall+EM appears to be quite robust against the effects of prior expectations. As Recall+EM is the core component of EMDR, expectancy effects probably contribute little to the effectiveness of EMDR treatment. PMID:29038685

  15. Validating hierarchical verbal autopsy expert algorithms in a large data set with known causes of death.

    PubMed

    Kalter, Henry D; Perin, Jamie; Black, Robert E

    2016-06-01

    Physician assessment historically has been the most common method of analyzing verbal autopsy (VA) data. Recently, the World Health Organization endorsed two automated methods, Tariff 2.0 and InterVA-4, which promise greater objectivity and lower cost. A disadvantage of the Tariff method is that it requires a training data set from a prior validation study, while InterVA relies on clinically specified conditional probabilities. We undertook to validate the hierarchical expert algorithm analysis of VA data, an automated, intuitive, deterministic method that does not require a training data set. Using Population Health Metrics Research Consortium study hospital source data, we compared the primary causes of 1629 neonatal and 1456 1-59 month-old child deaths from VA expert algorithms arranged in a hierarchy to their reference standard causes. The expert algorithms were held constant, while five prior and one new "compromise" neonatal hierarchy, and three former child hierarchies were tested. For each comparison, the reference standard data were resampled 1000 times within the range of cause-specific mortality fractions (CSMF) for one of three approximated community scenarios in the 2013 WHO global causes of death, plus one random mortality cause proportions scenario. We utilized CSMF accuracy to assess overall population-level validity, and the absolute difference between VA and reference standard CSMFs to examine particular causes. Chance-corrected concordance (CCC) and Cohen's kappa were used to evaluate individual-level cause assignment. Overall CSMF accuracy for the best-performing expert algorithm hierarchy was 0.80 (range 0.57-0.96) for neonatal deaths and 0.76 (0.50-0.97) for child deaths. Performance for particular causes of death varied, with fairly flat estimated CSMF over a range of reference values for several causes. 
Performance at the individual diagnosis level was also less favorable than that for overall CSMF (neonatal: best CCC = 0.23, range 0.16-0.33; best kappa = 0.29, 0.23-0.35; child: best CCC = 0.40, 0.19-0.45; best kappa = 0.29, 0.07-0.35). Expert algorithms in a hierarchy offer an accessible, automated method for assigning VA causes of death. Overall population-level accuracy is similar to that of more complex machine learning methods, but without need for a training data set from a prior validation study.
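The population-level metric reported above, CSMF accuracy, has a simple closed form: one minus the total absolute CSMF error, divided by the maximum error attainable for the given true fractions. A minimal sketch with made-up cause fractions:

```python
def csmf_accuracy(true_csmf, pred_csmf):
    """CSMF accuracy = 1 - sum|pred - true| / (2 * (1 - min(true))).
    1.0 means perfect agreement; 0.0 is the worst possible assignment."""
    err = sum(abs(p - t) for p, t in zip(pred_csmf, true_csmf))
    return 1.0 - err / (2.0 * (1.0 - min(true_csmf)))

# Hypothetical three-cause example (fractions sum to 1 in each list)
true_fracs = [0.50, 0.30, 0.20]
pred_fracs = [0.40, 0.35, 0.25]
acc = csmf_accuracy(true_fracs, pred_fracs)   # 0.875
```

The denominator rescales by the worst attainable error for the given true fractions, which is what makes values comparable across the resampled CSMF scenarios described in the abstract.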

  16. Integration of existing systematic reviews into new reviews: identification of guidance needs

    PubMed Central

    2014-01-01

    Background An exponential increase in the number of systematic reviews published, and constrained resources for new reviews, means that there is an urgent need for guidance on explicitly and transparently integrating existing reviews into new systematic reviews. The objectives of this paper are: 1) to identify areas where existing guidance may be adopted or adapted, and 2) to suggest areas for future guidance development. Methods We searched documents and websites from healthcare focused systematic review organizations to identify and, where available, to summarize relevant guidance on the use of existing systematic reviews. We conducted informational interviews with members of Evidence-based Practice Centers (EPCs) to gather experiences in integrating existing systematic reviews, including common issues and challenges, as well as potential solutions. Results There was consensus among systematic review organizations and the EPCs about some aspects of incorporating existing systematic reviews into new reviews. Current guidance may be used in assessing the relevance of prior reviews and in scanning references of prior reviews to identify studies for a new review. However, areas of challenge remain. Areas in need of guidance include how to synthesize, grade the strength of, and present bodies of evidence composed of primary studies and existing systematic reviews. For instance, empiric evidence is needed regarding how to quality check data abstraction and when and how to use study-level risk of bias assessments from prior reviews. Conclusions There remain areas of uncertainty for how to integrate existing systematic reviews into new reviews. Methods research and consensus processes among systematic review organizations are needed to develop guidance to address these challenges. PMID:24956937

  17. Adenosine Monophosphate-Based Detection of Bacterial Spores

    NASA Technical Reports Server (NTRS)

    Kern, Roger G.; Chen, Fei; Venkateswaran, Kasthuri; Hattori, Nori; Suzuki, Shigeya

    2009-01-01

A method of rapid detection of bacterial spores is based on the discovery that a heat shock consisting of exposure to a temperature of 100 °C for 10 minutes causes the complete release of adenosine monophosphate (AMP) from the spores. This method could be an alternative to the method described in the immediately preceding article. Unlike that method and related prior methods, the present method does not involve germination and cultivation; this feature is an important advantage because in cases in which the spores are those of pathogens, delays involved in germination and cultivation could increase risks of infection. Also, in comparison with other prior methods that do not involve germination, the present method affords greater sensitivity. At present, the method is embodied in a laboratory procedure, though it would be desirable to implement the method by means of a miniaturized apparatus in order to make it convenient and economical enough to encourage widespread use.

  18. Evaluation of next generation sequencing for the analysis of Eimeria communities in wildlife.

    PubMed

    Vermeulen, Elke T; Lott, Matthew J; Eldridge, Mark D B; Power, Michelle L

    2016-05-01

Next-generation sequencing (NGS) techniques are well established for studying bacterial communities but not yet for microbial eukaryotes. Parasite communities remain poorly studied, due in part to the lack of reliable and accessible molecular methods to analyse eukaryotic communities. We aimed to develop and evaluate a methodology to analyse communities of the protozoan parasite Eimeria from populations of the Australian marsupial Petrogale penicillata (brush-tailed rock-wallaby) using NGS. An oocyst purification method for small sample sizes and a polymerase chain reaction (PCR) protocol for the 18S rRNA locus targeting Eimeria were developed and optimised prior to sequencing on the Illumina MiSeq platform. A data analysis approach was developed by modifying methods from bacterial metagenomics and utilising existing Eimeria sequences in GenBank. Operational taxonomic unit (OTU) assignment at a high similarity threshold (97%) was more accurate at assigning Eimeria contigs into Eimeria OTUs, but at a lower threshold (95%) there was greater resolution between OTU consensus sequences. The assessment of two amplification PCR methods prior to Illumina MiSeq sequencing, single and nested PCR, determined that single PCR was more sensitive to Eimeria, as more Eimeria OTUs were detected in single amplicons. We have developed a simple and cost-effective approach to a data analysis pipeline for community analysis of eukaryotic organisms using Eimeria communities as a model. The pipeline provides a basis for evaluation using other eukaryotic organisms and potential for diverse community analysis studies. Copyright © 2016 Elsevier B.V. All rights reserved.
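Threshold-based OTU assignment of the kind compared in this record can be sketched as greedy centroid clustering. The identity function and the toy 50-bp reads below are illustrative stand-ins for a real pairwise aligner and real amplicon data, and the lower threshold is set to 90% here (rather than the paper's 95%) purely so the toy example shows a merge.

```python
def identity(a, b):
    """Fraction of matching positions for equal-length, pre-aligned
    sequences (toy stand-in for a real pairwise aligner)."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def greedy_otu_cluster(seqs, threshold=0.97):
    """Each sequence joins the first OTU whose centroid it matches at
    >= threshold; otherwise it seeds a new OTU."""
    otus = []  # list of (centroid, members) pairs
    for s in seqs:
        for centroid, members in otus:
            if identity(s, centroid) >= threshold:
                members.append(s)
                break
        else:
            otus.append((s, [s]))
    return otus

base = "ACGT" * 12 + "AC"        # toy 50-bp read
near = base[:-1] + "G"           # 1 mismatch: 98% identity to base
far = "TTTT" + base[4:]          # 3 mismatches: 94% identity to base

otus_97 = greedy_otu_cluster([base, near, far], 0.97)  # divergent read separate
otus_90 = greedy_otu_cluster([base, near, far], 0.90)  # divergent read merged
```

Raising the threshold keeps the divergent read in its own OTU, which mirrors the trade-off the authors report between assignment accuracy at 97% and resolution between consensus sequences at 95%.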

  19. Comparison of Bootstrapping and Markov Chain Monte Carlo for Copula Analysis of Hydrological Droughts

    NASA Astrophysics Data System (ADS)

    Yang, P.; Ng, T. L.; Yang, W.

    2015-12-01

Effective water resources management depends on the reliable estimation of the uncertainty of drought events. Confidence intervals (CIs) are commonly applied to quantify this uncertainty. A CI seeks to be at the minimal length necessary to cover the true value of the estimated variable with the desired probability. In drought analysis, where two or more variables (e.g., duration and severity) are often used to describe a drought, copulas have been found suitable for representing the joint probability behavior of these variables. However, the comprehensive assessment of the parameter uncertainties of copulas of droughts has been largely ignored, and the few studies that have recognized this issue have not explicitly compared the various methods to produce the best CIs. Thus, the objective of this study is to compare the CIs generated using two widely applied uncertainty estimation methods, bootstrapping and Markov Chain Monte Carlo (MCMC). To achieve this objective, (1) the marginal distributions lognormal, Gamma, and Generalized Extreme Value, and the copula functions Clayton, Frank, and Plackett are selected to construct joint probability functions of two drought-related variables; (2) the resulting joint functions are then fitted to 200 sets of simulated realizations of drought events with known distribution and extreme parameters; and (3) from there, using bootstrapping and MCMC, CIs of the parameters are generated and compared. The effect of an informative prior on the CIs generated by MCMC is also evaluated. CIs are produced for different sample sizes (50, 100, and 200) of the simulated drought events for fitting the joint probability functions. Preliminary results assuming lognormal marginal distributions and the Clayton copula function suggest that for cases with small or medium sample sizes (~50-100), MCMC is the superior method if an informative prior exists. 
Where an informative prior is unavailable, for small sample sizes (~50), both bootstrapping and MCMC yield the same level of performance, and for medium sample sizes (~100), bootstrapping is better. For cases with a large sample size (~200), there is little difference between the CIs generated using bootstrapping and MCMC regardless of whether or not an informative prior exists.
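The bootstrapping side of the comparison above can be sketched in a few lines. This is an illustrative reconstruction, not the study's code: the Clayton conditional-inversion sampler, the Kendall's-tau moment estimator, and all numeric values (true theta = 2, sample size 100) are assumptions chosen for the example.

```python
import numpy as np
from scipy.stats import kendalltau

def sample_clayton(n, theta, rng):
    # Conditional-inversion sampler for the Clayton copula (theta > 0).
    u = rng.uniform(size=n)
    w = rng.uniform(size=n)
    v = (u ** -theta * (w ** (-theta / (1 + theta)) - 1) + 1) ** (-1 / theta)
    return u, v

def theta_from_tau(u, v):
    # Moment estimate: invert Kendall's tau, using tau = theta / (theta + 2).
    tau = kendalltau(u, v)[0]
    return 2 * tau / (1 - tau)

def bootstrap_ci(u, v, n_boot=500, alpha=0.05, seed=1):
    # Percentile-bootstrap CI for the copula parameter.
    rng = np.random.default_rng(seed)
    n = len(u)
    est = np.empty(n_boot)
    for b in range(n_boot):
        idx = rng.integers(0, n, size=n)  # resample drought events with replacement
        est[b] = theta_from_tau(u[idx], v[idx])
    return np.percentile(est, [100 * alpha / 2, 100 * (1 - alpha / 2)])

rng = np.random.default_rng(0)
u, v = sample_clayton(100, theta=2.0, rng=rng)  # a "medium" sample in the study's terms
lo, hi = bootstrap_ci(u, v)
```

The MCMC alternative would instead sample theta from its posterior, which is where an informative prior can tighten the interval for small samples.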

  20. Learners' strategies for reconstructing cognitive frameworks and navigating conceptual change from prior conception to consensual genetics knowledge

    NASA Astrophysics Data System (ADS)

    Parrott, Annette M.

Problem. Science teachers are charged with preparing students to become scientifically literate individuals. Teachers are given curriculum that specifies the knowledge that students should come away with; however, they are not necessarily aware of the knowledge with which the student arrives or how best to help them navigate between the two knowledge states. Educators must be aware not only of where their students are conceptually, but also of how their students move from their prior knowledge and naive theories to scientifically acceptable theories. The understanding of how students navigate this course has the potential to revolutionize educational practices. Methods. This study explored how five 9th grade biology students reconstructed their cognitive frameworks and navigated conceptual change from prior conception to consensual genetics knowledge. The research questions investigated were: (1) how do students in the process of changing their naive science theories to accepted science theories describe their journey from prior knowledge to current conception, and (2) what are the methods that students utilize to bridge the gap between alternate and consensual science conceptions to effect conceptual change. Qualitative and quantitative methods were employed to gather and analyze the data. In-depth, semi-structured interviews formed the primary data for probing the context and details of students' conceptual change experience. Primary interview data were coded by thematic analysis. Results and discussion. This study revealed information about students' perceived roles in learning, the role of articulation in the conceptual change process, and ways in which a community of learners aids conceptual change. It was ascertained that students see their role in learning primarily as repeating information until they can add that information to their knowledge.
Students are more likely to consider challenges to their conceptual frameworks and be more motivated to become active participants in constructing their knowledge when they are working collaboratively with peers instead of receiving instruction from their teacher. Articulation was found to be instrumental in aiding learners in identifying their alternate conceptions as well as in revisiting, investigating and reconstructing their conceptual frameworks. Based on the assumptions generated, suggestions were offered to inform pedagogical practice in support of the conceptual change process.

  1. Development and Validation of a Lifecycle-based Prognostics Architecture with Test Bed Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hines, J. Wesley; Upadhyaya, Belle; Sharp, Michael

On-line monitoring and tracking of nuclear plant system and component degradation is being investigated as a method for improving the safety, reliability, and maintainability of aging nuclear power plants. Accurate prediction of the current degradation state of system components and structures is important for accurate estimates of their remaining useful life (RUL). The correct quantification and propagation of both the measurement uncertainty and model uncertainty are necessary for quantifying the uncertainty of the RUL prediction. This research project developed and validated methods to perform RUL estimation throughout the lifecycle of plant components. Prognostic methods should seamlessly operate from beginning of component life (BOL) to end of component life (EOL). We term this "Lifecycle Prognostics." When a component is put into use, the only information available may be past failure times of similar components used in similar conditions, and the predicted failure distribution can be estimated with reliability methods such as Weibull Analysis (Type I Prognostics). As the component operates, it begins to degrade and consume its available life. This life consumption may be a function of system stresses, and the failure distribution should be updated to account for the system operational stress levels (Type II Prognostics). When degradation becomes apparent, this information can be used to again improve the RUL estimate (Type III Prognostics). This research focused on developing prognostics algorithms for the three types of prognostics, developing uncertainty quantification methods for each of the algorithms, and, most importantly, developing a framework using Bayesian methods to transition between prognostic model types and update failure distribution estimates as new information becomes available. The developed methods were then validated on a range of accelerated degradation test beds.
The ultimate goal of prognostics is to provide accurate RUL predictions with as little uncertainty as possible. From a reliability and maintenance standpoint, avoiding failures would improve safety. Calculated risk would decrease, saving money by avoiding unnecessary maintenance. One major bottleneck for data-driven prognostics is the availability of run-to-failure degradation data. Without enough degradation data leading to failure, prognostic models can yield RUL distributions with large uncertainty or mathematically unsound predictions. To address these issues, a "Lifecycle Prognostics" method was developed to create RUL distributions from Beginning of Life (BOL) to End of Life (EOL). This employs established Type I, II, and III prognostic methods, with Bayesian transitioning between each type. Bayesian methods, as opposed to classical frequentist statistics, show how an expected value, a priori, changes with new data to form a posterior distribution. For example, when you purchase a component you have a prior belief, or estimation, of how long it will operate before failing. As you operate it, you may collect information related to its condition that will allow you to update your estimated failure time. Bayesian methods are best used when limited data are available. The use of a prior also means that information is conserved when new data are available. The weightings of the prior belief and information contained in the sampled data are dependent on the variance (uncertainty) of the prior, the variance (uncertainty) of the data, and the amount of measured data (number of samples). If the variance of the prior is small compared to the uncertainty of the data, the prior will be weighted more heavily. However, as more data are collected, the data will be weighted more heavily and will eventually swamp out the prior in calculating the posterior distribution of model parameters.
Fundamentally, Bayesian analysis updates a prior belief with new data to obtain a posterior belief. The general approach to applying the Bayesian method to lifecycle prognostics consisted of identifying the prior, which is the RUL estimate and uncertainty from the previous prognostics type, and combining it with observational data related to the newer prognostics type. The resulting lifecycle prognostics algorithm uses all available information throughout the component lifecycle.
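The precision-weighted updating described in this record can be made concrete with a conjugate normal update. This is a minimal sketch of the general principle, not the project's algorithm; the RUL prior (mean 1000 h, sd 200 h) and the three condition-based estimates are hypothetical numbers chosen for illustration.

```python
import numpy as np

def normal_update(prior_mean, prior_var, data, data_var):
    # Conjugate normal update with known observation variance.
    # Posterior precision = prior precision + n * data precision, so the
    # less uncertain source is weighted more heavily, and as n grows the
    # data eventually swamp the prior -- exactly the behavior described above.
    n = len(data)
    post_prec = 1.0 / prior_var + n / data_var
    post_mean = (prior_mean / prior_var + np.sum(data) / data_var) / post_prec
    return post_mean, 1.0 / post_prec

# Hypothetical Type I (reliability-based) RUL prior of 1000 h (sd 200 h),
# updated with three precise condition-based estimates near 900 h.
post_mean, post_var = normal_update(
    1000.0, 200.0 ** 2, np.array([880.0, 910.0, 895.0]), 50.0 ** 2
)
```

Because the condition data are much more precise than the prior, the posterior mean lands close to the data mean while the posterior variance shrinks well below both inputs.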

  2. Guidelines for the Investigation of Mediating Variables in Business Research.

    PubMed

    MacKinnon, David P; Coxe, Stefany; Baraldi, Amanda N

    2012-03-01

    Business theories often specify the mediating mechanisms by which a predictor variable affects an outcome variable. In the last 30 years, investigations of mediating processes have become more widespread with corresponding developments in statistical methods to conduct these tests. The purpose of this article is to provide guidelines for mediation studies by focusing on decisions made prior to the research study that affect the clarity of conclusions from a mediation study, the statistical models for mediation analysis, and methods to improve interpretation of mediation results after the research study. Throughout this article, the importance of a program of experimental and observational research for investigating mediating mechanisms is emphasized.
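The core statistical model behind guidelines like these is the product-of-coefficients decomposition. The sketch below is a generic illustration (simulated data and effect sizes are invented, and plain least squares stands in for whatever estimator a given study would use): regress the mediator on the predictor (the a path), the outcome on the mediator and predictor (the b and c' paths), and estimate the mediated effect as a*b.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 500
x = rng.normal(size=n)                       # predictor
m = 0.5 * x + rng.normal(size=n)             # mediator (true a = 0.5)
y = 0.7 * m + 0.2 * x + rng.normal(size=n)   # outcome (true b = 0.7, direct c' = 0.2)

def ols(X, y):
    # Ordinary least-squares coefficients.
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)
a = ols(np.column_stack([ones, x]), m)[1]               # X -> M path
b, c_prime = ols(np.column_stack([ones, m, x]), y)[1:]  # M -> Y path and direct path
indirect = a * b  # product-of-coefficients estimate of the mediated effect
```

In practice the sampling distribution of a*b is non-normal, so resampling-based confidence intervals are typically preferred to a simple normal approximation.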

  3. Contaminant removal by wastewater treatment plants in the Stillaguamish River Basin, Washington

    USGS Publications Warehouse

    Barbash, Jack E.; Moran, Patrick W.; Wagner, Richard J.; Wolanek, Michael

    2015-01-01

Human activities in most areas of the developed world typically release nutrients, pharmaceuticals, personal care products, pesticides, and other contaminants into the environment, many of which reach freshwater ecosystems. In urbanized areas, wastewater treatment plants (WWTPs) are critical facilities for collecting and reducing the amounts of wastewater contaminants (WWCs) that ultimately discharge to rivers, coastal areas, and groundwater. Most WWTPs use multiple methods to remove contaminants from wastewater. These include physical methods to remove solid materials (primary treatment), biological and chemical methods to remove most organic matter (secondary treatment), advanced methods to reduce the concentrations of various contaminants such as nitrogen, phosphorus and (or) synthetic organic compounds (tertiary treatment), and disinfection prior to discharge (Metcalf and Eddy, Inc., 1979). This study examined the extent to which 114 organic WWCs were removed by each of three WWTPs, prior to discharge to freshwater and marine ecosystems, in a rapidly developing area in northwestern Washington State. Removal percentages for each WWC were estimated by comparing the concentrations measured in the WWTP influents with those measured in the effluents. The investigation was carried out in the 700-mi² Stillaguamish River Basin, the fifth largest watershed that discharges to Puget Sound (fig. 1).
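The influent/effluent comparison used to estimate removal reduces to a simple percentage. The helper below is a trivial illustration of that calculation; the concentrations shown are hypothetical, not measurements from the study.

```python
def removal_percent(influent, effluent):
    # Percent of a contaminant removed, estimated from paired influent and
    # effluent concentrations (same units for both, e.g. micrograms per liter).
    return 100.0 * (influent - effluent) / influent

# Hypothetical paired concentrations for one contaminant (illustrative only).
print(removal_percent(2.0, 0.5))  # -> 75.0
```

Note that a negative result is possible when the effluent concentration exceeds the influent (e.g., from transformation products or sampling variability), which is one reason paired sampling and repeated measurements matter.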

  4. Incorporating biological information in sparse principal component analysis with application to genomic data.

    PubMed

    Li, Ziyi; Safo, Sandra E; Long, Qi

    2017-07-11

    Sparse principal component analysis (PCA) is a popular tool for dimensionality reduction, pattern recognition, and visualization of high dimensional data. It has been recognized that complex biological mechanisms occur through concerted relationships of multiple genes working in networks that are often represented by graphs. Recent work has shown that incorporating such biological information improves feature selection and prediction performance in regression analysis, but there has been limited work on extending this approach to PCA. In this article, we propose two new sparse PCA methods called Fused and Grouped sparse PCA that enable incorporation of prior biological information in variable selection. Our simulation studies suggest that, compared to existing sparse PCA methods, the proposed methods achieve higher sensitivity and specificity when the graph structure is correctly specified, and are fairly robust to misspecified graph structures. Application to a glioblastoma gene expression dataset identified pathways that are suggested in the literature to be related with glioblastoma. The proposed sparse PCA methods Fused and Grouped sparse PCA can effectively incorporate prior biological information in variable selection, leading to improved feature selection and more interpretable principal component loadings and potentially providing insights on molecular underpinnings of complex diseases.

  5. Analysis of Longitudinal Studies With Repeated Outcome Measures: Adjusting for Time-Dependent Confounding Using Conventional Methods.

    PubMed

    Keogh, Ruth H; Daniel, Rhian M; VanderWeele, Tyler J; Vansteelandt, Stijn

    2018-05-01

Estimation of causal effects of time-varying exposures using longitudinal data is a common problem in epidemiology. When there are time-varying confounders that are themselves affected by prior exposure, which may include past outcomes, standard regression methods can lead to bias. Methods such as inverse probability weighted estimation of marginal structural models have been developed to address this problem. However, in this paper we show how standard regression methods can be used, even in the presence of time-dependent confounding, to estimate the total effect of an exposure on a subsequent outcome by controlling appropriately for prior exposures, outcomes, and time-varying covariates. We refer to the resulting estimation approach as sequential conditional mean models (SCMMs), which can be fitted using generalized estimating equations. We outline this approach and describe how including propensity score adjustment is advantageous. We compare the causal effects being estimated using SCMMs and marginal structural models, and we compare the two approaches using simulations. SCMMs enable more precise inferences, with greater robustness against model misspecification via propensity score adjustment, and easily accommodate continuous exposures and interactions. A new test for direct effects of past exposures on a subsequent outcome is described.
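The key idea (condition on prior exposures, outcomes, and time-varying covariates rather than weight) can be illustrated with a two-time-point simulation. This sketch is an assumption-laden toy, not the paper's method: the data-generating coefficients are invented, and plain least squares stands in for the GEE fit the authors describe.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 2000
# Two time points: prior exposure a0 affects a time-varying covariate l1,
# which confounds the effect of the current exposure a1 on the outcome y.
a0 = rng.normal(size=n)                                   # prior exposure
l1 = 0.6 * a0 + rng.normal(size=n)                        # time-varying confounder
a1 = 0.5 * l1 + rng.normal(size=n)                        # current exposure
y = 0.4 * a1 + 0.3 * l1 + 0.2 * a0 + rng.normal(size=n)   # true effect of a1 is 0.4

def ols(X, y):
    # Ordinary least-squares coefficients.
    return np.linalg.lstsq(X, y, rcond=None)[0]

ones = np.ones(n)
# A naive model omitting the confounder is biased upward here; the
# SCMM-style model conditions on the prior exposure and covariate.
naive = ols(np.column_stack([ones, a1]), y)[1]
adjusted = ols(np.column_stack([ones, a1, l1, a0]), y)[1]
```

In this setup the adjusted coefficient recovers the true effect of the current exposure, while the naive one absorbs part of the confounder's effect.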

  6. Mammogram image quality as a potential contributor to disparities in breast cancer stage at diagnosis: an observational study

    PubMed Central

    2013-01-01

    Background In an ongoing study of racial/ethnic disparities in breast cancer stage at diagnosis, we consented patients to allow us to review their mammogram images, in order to examine the potential role of mammogram image quality on this disparity. Methods In a population-based study of urban breast cancer patients, a single breast imaging specialist (EC) performed a blinded review of the index mammogram that prompted diagnostic follow-up, as well as recent prior mammograms performed approximately one or two years prior to the index mammogram. Seven indicators of image quality were assessed on a five-point Likert scale, where 4 and 5 represented good and excellent quality. These included 3 technologist-associated image quality (TAIQ) indicators (positioning, compression, sharpness), and 4 machine associated image quality (MAIQ) indicators (contrast, exposure, noise and artifacts). Results are based on 494 images examined for 268 patients, including 225 prior images. Results Whereas MAIQ was generally high, TAIQ was more variable. In multivariable models of sociodemographic predictors of TAIQ, less income was associated with lower TAIQ (p < 0.05). Among prior mammograms, lower TAIQ was subsequently associated with later stage at diagnosis, even after adjusting for multiple patient and practice factors (OR = 0.80, 95% CI: 0.65, 0.99). Conclusions Considerable gains could be made in terms of increasing image quality through better positioning, compression and sharpness, gains that could impact subsequent stage at diagnosis. PMID:23621946

  7. Risk Factors for Erosion of Artificial Urinary Sphincters: A Multicenter Prospective Study

    PubMed Central

    Brant, William O.; Erickson, Bradley A.; Elliott, Sean P.; Powell, Christopher; Alsikafi, Nejd; McClung, Christopher; Myers, Jeremy B.; Voelzke, Bryan B.; Smith, Thomas G.; Broghammer, Joshua A.

    2015-01-01

    OBJECTIVE To evaluate the short- to medium-term outcomes after artificial urinary sphincter (AUS) placement from a large, multi-institutional, prospective, follow-up study. We hypothesize that along with radiation, patients with any history of a direct surgery to the urethra will have higher rates of eventual AUS explantation for erosion and/or infection. MATERIALS AND METHODS A prospective outcome analysis was performed on 386 patients treated with AUS placement from April 2009 to December 2012 at 8 institutions with at least 3 months of follow-up. Charts were analyzed for preoperative risk factors and postoperative complications requiring explantation. RESULTS Approximately 50% of patients were considered high risk. High risk was defined as patients having undergone radiation therapy, urethroplasty, multiple treatments for bladder neck contracture or urethral stricture, urethral stent placement, or a history of erosion or infection in a previous AUS. A total of 31 explantations (8.03%) were performed during the follow-up period. Overall explantation rates were higher in those with prior radiation and prior UroLume. Men with prior AUS infection or erosion also had a trend for higher rates of subsequent explantation. Men receiving 3.5-cm cuffs had significantly higher explantation rates than those receiving larger cuffs. CONCLUSION This outcomes study confirms that urethral risk factors, including radiation history, prior AUS erosion, and a history of urethral stent placement, increase the risk of AUS explantation in short-term follow-up. PMID:25109562

  8. PREVALENCE AND CORRELATES OF SUICIDAL BEHAVIOR AMONG NEW SOLDIERS IN THE U.S. ARMY: RESULTS FROM THE ARMY STUDY TO ASSESS RISK AND RESILIENCE IN SERVICEMEMBERS (ARMY STARRS)

    PubMed Central

    Ursano, Robert J.; Heeringa, Steven G.; Stein, Murray B.; Jain, Sonia; Raman, Rema; Sun, Xiaoying; Chiu, Wai Tat; Colpe, Lisa J.; Fullerton, Carol S.; Gilman, Stephen E.; Hwang, Irving; Naifeh, James A.; Nock, Matthew K.; Rosellini, Anthony J.; Sampson, Nancy A.; Schoenbaum, Michael; Zaslavsky, Alan M.; Kessler, Ronald C.

    2016-01-01

    Background The prevalence of suicide among U.S. Army soldiers has risen dramatically in recent years. Prior studies suggest that most soldiers with suicidal behaviors (i.e., ideation, plans, and attempts) had first onsets prior to enlistment. However, those data are based on retrospective self-reports of soldiers later in their Army careers. Unbiased examination of this issue requires investigation of suicidality among new soldiers. Method The New Soldier Study (NSS) of the Army Study to Assess Risk and Resilience in Servicemembers (Army STARRS) used fully structured self-administered measures to estimate preenlistment histories of suicide ideation, plans, and attempts among new soldiers reporting for Basic Combat Training in 2011–2012. Survival models examined sociodemographic correlates of each suicidal outcome. Results Lifetime prevalence estimates of preenlistment suicide ideation, plans, and attempts were 14.1, 2.3, and 1.9%, respectively. Most reported onsets of suicide plans and attempts (73.3–81.5%) occurred within the first year after onset of ideation. Odds of these lifetime suicidal behaviors among new soldiers were positively, but weakly associated with being female, unmarried, religion other than Protestant or Catholic, and a race/ethnicity other than non-Hispanic White, non-Hispanic Black, or Hispanic. Conclusions Lifetime prevalence estimates of suicidal behaviors among new soldiers are consistent with retrospective reports of preenlistment prevalence obtained from soldiers later in their Army careers. Given that prior suicidal behaviors are among the strongest predictors of later suicides, consideration should be given to developing methods of obtaining valid reports of preenlistment suicidality from new soldiers to facilitate targeting of preventive interventions. PMID:25338964

  9. Bayesian analysis of caustic-crossing microlensing events

    NASA Astrophysics Data System (ADS)

    Cassan, A.; Horne, K.; Kains, N.; Tsapras, Y.; Browne, P.

    2010-06-01

Aims: Caustic-crossing binary-lens microlensing events are important anomalous events because they are capable of detecting an extrasolar planet companion orbiting the lens star. Fast and robust modelling methods are thus of prime interest in helping to decide whether a planet is detected by an event. Cassan introduced a new set of parameters to model binary-lens events, which are closely related to properties of the light curve. In this work, we explain how Bayesian priors can be added to this framework, and investigate some interesting options. Methods: We develop a mathematical formulation that allows us to compute analytically the priors on the new parameters, given some previous knowledge about other physical quantities. We explicitly compute the priors for a number of interesting cases, and show how this can be implemented in a fully Bayesian, Markov chain Monte Carlo algorithm. Results: Using Bayesian priors can accelerate microlens fitting codes by reducing the time spent considering physically implausible models, and helps us to discriminate between alternative models based on the physical plausibility of their parameters.

  10. Abdominal multi-organ CT segmentation using organ correlation graph and prediction-based shape and location priors.

    PubMed

    Okada, Toshiyuki; Linguraru, Marius George; Hori, Masatoshi; Summers, Ronald M; Tomiyama, Noriyuki; Sato, Yoshinobu

    2013-01-01

    The paper addresses the automated segmentation of multiple organs in upper abdominal CT data. We propose a framework of multi-organ segmentation which is adaptable to any imaging conditions without using intensity information in manually traced training data. The features of the framework are as follows: (1) the organ correlation graph (OCG) is introduced, which encodes the spatial correlations among organs inherent in human anatomy; (2) the patient-specific organ shape and location priors obtained using OCG enable the estimation of intensity priors from only target data and optionally a number of untraced CT data of the same imaging condition as the target data. The proposed methods were evaluated through segmentation of eight abdominal organs (liver, spleen, left and right kidney, pancreas, gallbladder, aorta, and inferior vena cava) from 86 CT data obtained by four imaging conditions at two hospitals. The performance was comparable to the state-of-the-art method using intensity priors constructed from manually traced data.

  11. The PEWTER Study: Breaking Bad News Communication Skills Training for Counseling Programs

    ERIC Educational Resources Information Center

    Keefe-Cooperman, Kathleen; Savitsky, Devyn; Koshel, Walter; Bhat, Varsha; Cooperman, Jessica

    2018-01-01

    The efficacy of teaching communication skills for breaking bad news in graduate-level counseling programs was examined. A structured model, PEWTER (Prepare, Evaluate, Warning, Telling, Emotional Response, Regrouping; Keefe-Cooperman and Nardi 2004), provides a method for this difficult task. Prior to training in using the model, students reported…

  12. Tests of Alignment among Assessment, Standards, and Instruction Using Generalized Linear Model Regression

    ERIC Educational Resources Information Center

    Fulmer, Gavin W.; Polikoff, Morgan S.

    2014-01-01

    An essential component in school accountability efforts is for assessments to be well-aligned with the standards or curriculum they are intended to measure. However, relatively little prior research has explored methods to determine statistical significance of alignment or misalignment. This study explores analyses of alignment as a special case…

  13. The Politics and Statistics of Value-Added Modeling for Accountability of Teacher Preparation Programs

    ERIC Educational Resources Information Center

    Lincove, Jane Arnold; Osborne, Cynthia; Dillon, Amanda; Mills, Nicholas

    2014-01-01

    Despite questions about validity and reliability, the use of value-added estimation methods has moved beyond academic research into state accountability systems for teachers, schools, and teacher preparation programs (TPPs). Prior studies of value-added measurement for TPPs test the validity of researcher-designed models and find that measuring…

  14. Prekindergarten and Kindergarten Teachers' Perceptions of Childhood Demographic Determinants and Academic Achievement

    ERIC Educational Resources Information Center

    Boyle, Melanie Ellen

    2013-01-01

    The purpose of this study was to examine kindergarten and prekindergarten teachers' perceptions of academic success for children based on the type of care children received prior to beginning kindergarten, as well as other demographics, which could cause variations in academic success. The researcher used a seven section multi-method survey…

  15. Avoiding the Struggle: Instruction That Supports Students' Motivation in Reading and Writing about Content Material

    ERIC Educational Resources Information Center

    Mason, Linda H.; Meadan, Hedda; Hedin, Laura R.; Cramer, Anne Mong

    2012-01-01

    We conducted a mixed methods study to evaluate motivation among 20 fourth-grade students who struggle with reading and writing prior to and after receiving either self-regulated strategy development (SRSD) instruction for expository reading comprehension or SRSD instruction for expository reading comprehension plus informative writing. We…

  16. Effect of Par Frying on Composition and Texture of Breaded and Battered Catfish

    USDA-ARS?s Scientific Manuscript database

Catfish is often consumed as a breaded and battered fried product; however, baking is considered a healthier alternative to frying. One method of improving the texture properties of baked products is to par fry prior to baking. The objective of this study was to examine the effect of par frying ...

  17. How Should Intelligent Tutoring Systems Sequence Multiple Graphical Representations of Fractions? A Multi-Methods Study

    ERIC Educational Resources Information Center

    Rau, M. A.; Aleven, V.; Rummel, N.; Pardos, Z.

    2014-01-01

    Providing learners with multiple representations of learning content has been shown to enhance learning outcomes. When multiple representations are presented across consecutive problems, we have to decide in what sequence to present them. Prior research has demonstrated that interleaving "tasks types" (as opposed to blocking them) can…

  18. Towards Individualized Online Learning: The Design and Development of an Adaptive Web Based Learning Environment

    ERIC Educational Resources Information Center

    Inan, Fethi A.; Flores, Raymond; Ari, Fatih; Arslan-Ari, Ismahan

    2011-01-01

    The purpose of this study was to document the design and development of an adaptive system which individualizes instruction such as content, interfaces, instructional strategies, and resources dependent on two factors, namely student motivation and prior knowledge levels. Combining adaptive hypermedia methods with strategies proposed by…

  19. Using "Fremyella Diplosiphon" as a Model Organism for Genetics-Based Laboratory Exercises

    ERIC Educational Resources Information Center

    Montgomery, Beronda L.

    2011-01-01

    In this pilot study, a genetics-based laboratory exercise using the cyanobacterium Fremyella diplosiphon was developed and trialled with thirteen Natural Sciences undergraduates. Despite most students only having limited prior exposure to molecular genetics laboratory methods, this cohort confirmed that they were able to follow the protocol and…

  20. Genome-wide association study (GWAS) of coleoptile and mesocotyl elongation in rice (Oryza sativa L.)

    USDA-ARS?s Scientific Manuscript database

Direct-seeding of rice without prior pre-germination is gaining popularity in rice growing countries because it requires less water and less labor than transplanting rice seedlings. Slow emergence and poor seedling establishment of direct-seeded rice are the primary drawbacks of this method. Adeq...

  1. Some Methodological Issues with "Draw a Scientist Tests" among Young Children

    ERIC Educational Resources Information Center

    Losh, Susan C.; Wilke, Ryan; Pop, Margareta

    2008-01-01

    Children's stereotypes about scientists have been postulated to affect student science identity and interest in science. Findings from prior studies using "Draw a Scientist Test" methods suggest that students see scientists as largely white, often unattractive, men; one consequence may be that girls and minority students feel a science career is…

  2. Using Blended Learning Design to Enhance Learning Experience in Teacher Education

    ERIC Educational Resources Information Center

    Zhou, Mingming; Chua, Bee Leng

    2016-01-01

    This study examined students' views on a blended learning environment designed for 29 in-service teachers in Singapore enrolled in an educational research method course. Their self-report data highlighted that students' prior knowledge and the amount and difficulty of content covered in the course affected the effectiveness of blended learning…

  3. Teacher Education for Social Change: Transforming a Content Methods Course Block

    ERIC Educational Resources Information Center

    Ritchie, Scott; An, Sohyun; Cone, Neporcha; Bullock, Patricia

    2013-01-01

    This article analyzes data from a qualitative practitioner-research case study in which four university faculty members attempted to disrupt the hegemonic domestication of candidates enrolled in an undergraduate teacher education program. During the semester prior to their student teaching, 16 candidates at a large public university in the…

  4. Academic Vocabulary Learning in First through Third Grade in Low-Income Schools: Effects of Automated Supplemental Instruction

    ERIC Educational Resources Information Center

    Goldstein, Howard; Ziolkowski, Robyn A.; Bojczyk, Kathryn E.; Marty, Ana; Schneider, Naomi; Harpring, Jayme; Haring, Christa D.

    2017-01-01

    Purpose: This study investigated cumulative effects of language learning, specifically whether prior vocabulary knowledge or special education status moderated the effects of academic vocabulary instruction in high-poverty schools. Method: Effects of a supplemental intervention targeting academic vocabulary in first through third grades were…

  5. The Effect of Ear Playing Instruction on Adult Amateur Wind Instrumentalists' Musical Self-Efficacy: An Exploratory Study

    ERIC Educational Resources Information Center

    Hartz, Barry; Bauer, William

    2016-01-01

    The purpose of this mixed methods study was to examine the effect of ear playing instruction on adult amateur wind instrumentalists' musical self-efficacy. Ten volunteer members of a community band in a small town in Ohio completed the "Ear Playing Profile" both prior to and following an eight-week period of instruction in playing by ear…

  6. Change in Self-Rated Health and Mortality among Community-Dwelling Disabled Older Women

    ERIC Educational Resources Information Center

    Han, Beth; Phillips, Caroline; Ferrucci, Luigi; Bandeen-Roche, Karen; Jylha, Marja; Kasper, Judith; Guralnik, Jack M.

    2005-01-01

    Purpose: Our study assessed whether change in self-rated health is a stronger predictor of mortality than baseline self-rated health and the most recent self-rated health (prior to death or loss to follow-up) among disabled older women. Design and Methods: The Women's Health and Aging Study examined disabled older women at baseline and every 6…

  7. Intra- and extra-familial child homicide in Sweden 1992-2012: A population-based study.

    PubMed

    Hedlund, Jonatan; Masterman, Thomas; Sturup, Joakim

    2016-04-01

    Previous studies have shown decreasing child homicide rates in many countries - in Sweden mainly due to a drop in filicide-suicides. This study examines the rate of child homicides during 21 years, with the hypothesis that a decline might be attributable to a decrease in the number of depressive filicide offenders (as defined by a proxy measure). In addition, numerous characteristics of child homicide are presented. All homicide incidents involving 0-14-year-old victims in Sweden during 1992-2012 (n = 90) were identified in an autopsy database. Data from multiple registries, forensic psychiatric evaluations, police reports, verdicts and other sources were collected. Utilizing Poisson regression, we found a 4% annual decrease in child homicides, in accordance with prior studies, but no marked decrease regarding the depressive-offender proxy. Diagnoses from forensic psychiatric evaluations (n = 50) included substance misuse (8%), affective disorders (10%), autism-spectrum disorders (18%), psychotic disorders (28%) and personality disorders (30%). Prior violent offences were more common among offenders in filicides than filicide-suicides (17.8% vs. 6.9%); and about 20% of offenders in each group had previously received psychiatric inpatient care. Aggressive methods of filicide predominated among fathers. Highly lethal methods of filicide (firearms, fire) were more commonly followed by same-method suicide than less lethal methods. Interestingly, a third of the extra-familial offenders had an autism-spectrum disorder. Based on several findings, e.g., the low rate of substance misuse, the study concludes that non-traditional risk factors for violence must be highlighted by healthcare providers. Also, the occurrence of autism-spectrum disorders in the present study is a novel finding that warrants further investigation. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
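The study's "4% annual decrease" comes from a Poisson regression of yearly counts on time, where the annual percent change is exp(slope) - 1. The sketch below illustrates that calculation on simulated counts; the data are invented (drawn with a true 4% decline), not the Swedish registry data, and a direct likelihood minimization stands in for whatever software the authors used.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical yearly homicide counts over a 21-year window, simulated
# with a true 4% annual decline for illustration only.
years = np.arange(21)
rng = np.random.default_rng(3)
counts = rng.poisson(6.0 * np.exp(np.log(0.96) * years))

def neg_loglik(beta):
    # Poisson log-linear model: log E[count] = b0 + b1 * year
    # (negative log-likelihood, dropping the constant log(count!) term).
    mu = np.exp(beta[0] + beta[1] * years)
    return np.sum(mu - counts * np.log(mu))

fit = minimize(neg_loglik, x0=np.array([np.log(counts.mean() + 1.0), 0.0]))
annual_change = 100.0 * (np.exp(fit.x[1]) - 1.0)  # percent change in the rate per year
```

With only ~21 annual observations the slope estimate is noisy, which is why such trend analyses typically report a confidence interval alongside the point estimate.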

  8. Previous pregnancy outcomes and subsequent pregnancy anxiety in a Quebec prospective cohort

    PubMed Central

    Shapiro, Gabriel D.; Séguin, Jean R.; Muckle, Gina; Monnier, Patricia; Fraser, William D.

    2017-01-01

    Introduction: Pregnancy anxiety is an important psychosocial risk factor that may be more strongly associated with adverse birth outcomes than other measures of stress. Better understanding of the upstream predictors and causes of pregnancy anxiety could help to identify women at high risk for adverse maternal and infant outcomes. The objective of the present study was to measure the associations between five past pregnancy outcomes (live preterm birth (PTB), live term birth, miscarriage at <20 weeks, stillbirth at ≥20 weeks, and elective abortion) and pregnancy anxiety at three trimesters in a subsequent pregnancy. Methods: Analyses were conducted using data from the 3D Cohort Study, a Canadian birth cohort. Data on maternal demographic characteristics and pregnancy history for each known previous pregnancy were collected via interviewer-administered questionnaires at study entry. Pregnancy anxiety for the index study pregnancy was measured prospectively by self-administered questionnaire following three prenatal study visits. Results: Of 2366 participants in the 3D Study, 1505 had at least one previous pregnancy. In linear regression analyses with adjustment for confounding variables, prior live term birth was associated with lower pregnancy anxiety in all three trimesters, whereas prior miscarriage was significantly associated with higher pregnancy anxiety in the first trimester. Prior stillbirth was associated with greater pregnancy anxiety in the third trimester. Prior elective abortion was significantly associated with higher pregnancy anxiety scores in the first and second trimesters, with an association of similar magnitude observed in the third trimester. Discussion: Our findings suggest that the outcomes of previous pregnancies should be incorporated, along with demographic and psychosocial characteristics, into conceptual models framing pregnancy anxiety. PMID:28079434

  9. Nonparametric Hierarchical Bayesian Model for Functional Brain Parcellation

    PubMed Central

    Lashkari, Danial; Sridharan, Ramesh; Vul, Edward; Hsieh, Po-Jang; Kanwisher, Nancy; Golland, Polina

    2011-01-01

    We develop a method for unsupervised analysis of functional brain images that learns group-level patterns of functional response. Our algorithm is based on a generative model that comprises two main layers. At the lower level, we express the functional brain response to each stimulus as a binary activation variable. At the next level, we define a prior over the sets of activation variables in all subjects. We use a Hierarchical Dirichlet Process as the prior in order to simultaneously learn the patterns of response that are shared across the group, and to estimate the number of these patterns supported by data. Inference based on this model enables automatic discovery and characterization of salient and consistent patterns in functional signals. We apply our method to data from a study that explores the response of the visual cortex to a collection of images. The discovered profiles of activation correspond to selectivity to a number of image categories such as faces, bodies, and scenes. More generally, our results appear superior to the results of alternative data-driven methods in capturing the category structure in the space of stimuli. PMID:21841977

  10. Prior Heat Stress Effects Fatigue Recovery of the Elbow Flexor Muscles

    PubMed Central

    Iguchi, Masaki; Shields, Richard K.

    2011-01-01

    Introduction: Long-lasting alterations in hormones, neurotransmitters and stress proteins after hyperthermia may be responsible for the impairment in motor performance during muscle fatigue. Methods: Subjects (n = 25) performed a maximal intermittent fatigue task of elbow flexion after sitting in either 73 °C or 26 °C to examine the effects of prior heat stress on fatigue mechanisms. Results: The heat stress increased the tympanic and rectal temperatures by 2.3 °C and 0.82 °C, respectively, but there was full recovery prior to the fatigue task. While prior heat stress had no effects on fatigue-related changes in volitional torque, EMG activity, torque relaxation rate, MEP size and SP duration, prior heat stress acutely increased the pre-fatigue relaxation rate and chronically prevented long-duration fatigue (p < 0.05). Discussion: These findings indicate that prior passive heat stress alone does not alter voluntary activation during fatigue, but prior heat stress and exercise produce longer-term protection against long-duration fatigue. PMID:21674526

  11. Empirical Bayes estimation of proportions with application to cowbird parasitism rates

    USGS Publications Warehouse

    Link, W.A.; Hahn, D.C.

    1996-01-01

    Bayesian models provide a structure for studying collections of parameters such as are considered in the investigation of communities, ecosystems, and landscapes. This structure allows for improved estimation of individual parameters, by considering them in the context of a group of related parameters. Individual estimates are differentially adjusted toward an overall mean, with the magnitude of their adjustment based on their precision. Consequently, Bayesian estimation allows for a more credible identification of extreme values in a collection of estimates. Bayesian models regard individual parameters as values sampled from a specified probability distribution, called a prior. The requirement that the prior be known is often regarded as an unattractive feature of Bayesian analysis and may be the reason why Bayesian analyses are not frequently applied in ecological studies. Empirical Bayes methods provide an alternative approach that incorporates the structural advantages of Bayesian models while requiring a less stringent specification of prior knowledge. Rather than requiring that the prior distribution be known, empirical Bayes methods require only that it be in a certain family of distributions, indexed by hyperparameters that can be estimated from the available data. This structure is of interest per se, in addition to its value in allowing for improved estimation of individual parameters; for example, hypotheses regarding the existence of distinct subgroups in a collection of parameters can be considered under the empirical Bayes framework by allowing the hyperparameters to vary among subgroups. Though empirical Bayes methods have been applied in a variety of contexts, they have received little attention in the ecological literature. We describe the empirical Bayes approach in application to estimation of proportions, using data obtained in a community-wide study of cowbird parasitism rates for illustration. 
Since observed proportions based on small sample sizes are heavily adjusted toward the mean, extreme values among empirical Bayes estimates identify those species for which there is the greatest evidence of extreme parasitism rates. Applying a subgroup analysis to our data on cowbird parasitism rates, we conclude that parasitism rates for Neotropical Migrants as a group are no greater than those of Resident/Short-distance Migrant species in this forest community. Our data and analyses demonstrate that the parasitism rates for certain Neotropical Migrant species are remarkably low (Wood Thrush and Rose-breasted Grosbeak) while those for others are remarkably high (Ovenbird and Red-eyed Vireo).
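
    The shrinkage described above, where imprecise small-sample proportions are pulled more strongly toward the overall mean, can be sketched with a beta-binomial empirical Bayes estimator. The moment-based hyperparameter fit and the nest counts below are illustrative assumptions, not the authors' exact model or data.

```python
import numpy as np

def eb_shrink(successes, trials):
    """Empirical Bayes estimates of proportions under a beta-binomial model.
    A Beta(alpha, beta) prior is fit to the raw proportions by the method of
    moments (this simple fit assumes 0 < var < mean*(1-mean)); each estimate
    is then the posterior mean (y_i + alpha) / (n_i + alpha + beta)."""
    y = np.asarray(successes, float)
    n = np.asarray(trials, float)
    p = y / n
    m, v = p.mean(), p.var(ddof=1)
    common = m * (1 - m) / v - 1          # alpha + beta
    alpha, beta = m * common, (1 - m) * common
    return (y + alpha) / (n + alpha + beta), (alpha, beta)

# Hypothetical (parasitized nests, total nests) per species -- illustrative only
parasitized = [1, 9, 5, 5, 50]
nests = [10, 10, 10, 10, 100]
estimates, (alpha, beta) = eb_shrink(parasitized, nests)
```

    In this toy data, the species observed at 1/10 is pulled noticeably toward the overall mean, while the precisely estimated species at 50/100 barely moves, mirroring the behavior described in the abstract.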

  12. Determining informative priors for cognitive models.

    PubMed

    Lee, Michael D; Vanpaemel, Wolf

    2018-02-01

    The development of cognitive models involves the creative scientific formalization of assumptions, based on theory, observation, and other relevant information. In the Bayesian approach to implementing, testing, and using cognitive models, assumptions can influence both the likelihood function of the model, usually corresponding to assumptions about psychological processes, and the prior distribution over model parameters, usually corresponding to assumptions about the psychological variables that influence those processes. The specification of the prior is unique to the Bayesian context, but often raises concerns that lead to the use of vague or non-informative priors in cognitive modeling. Sometimes the concerns stem from philosophical objections, but more often practical difficulties with how priors should be determined are the stumbling block. We survey several sources of information that can help to specify priors for cognitive models, discuss some of the methods by which this information can be formalized in a prior distribution, and identify a number of benefits of including informative priors in cognitive modeling. Our discussion is based on three illustrative cognitive models, involving memory retention, categorization, and decision making.

  13. Segmentation and tracking of lung nodules via graph-cuts incorporating shape prior and motion from 4D CT.

    PubMed

    Cha, Jungwon; Farhangi, Mohammad Mehdi; Dunlap, Neal; Amini, Amir A

    2018-01-01

    We have developed a robust tool for performing volumetric and temporal analysis of nodules from respiratory gated four-dimensional (4D) CT. The method could prove useful in IMRT of lung cancer. We modified the conventional graph-cuts method by adding an adaptive shape prior as well as motion information within a signed distance function representation to permit more accurate and automated segmentation and tracking of lung nodules in 4D CT data. Active shape models (ASM) with a signed distance function were used to capture the shape prior information, preventing unwanted surrounding tissues from becoming part of the segmented object. The optical flow method was used to estimate the local motion and to extend three-dimensional (3D) segmentation to 4D by warping a prior shape model through time. The algorithm has been applied to segmentation of well-circumscribed, vascularized, and juxtapleural lung nodules from respiratory gated CT data. In all cases, 4D segmentation and tracking for five phases of high-resolution CT data took approximately 10 min on a PC workstation with an AMD Phenom II and 32 GB of memory. The method was trained on 500 breath-held 3D CT datasets from the LIDC database and was tested on 17 4D lung nodule CT datasets consisting of 85 volumetric frames. The validation tests resulted in an average Dice Similarity Coefficient (DSC) of 0.68 for all test data. An important by-product of the method is quantitative volume measurement from 4D CT from end-inspiration to end-expiration, which will also have important diagnostic value. The algorithm performs robust segmentation of lung nodules from 4D CT data. The signed distance ASM provides the shape prior, which is adaptively refined within the iterative graph-cuts framework to best fit the input data, preventing unwanted surrounding tissue from merging with the segmented object. © 2017 American Association of Physicists in Medicine.

  14. Reduction of pain and anxiety prior to botulinum toxin injections with a new topical anesthetic method.

    PubMed

    Weiss, Richard A; Lavin, Phillip T

    2009-01-01

    To evaluate the safety and efficacy of vapocoolants (topical skin refrigerants) to induce skin anesthesia and relieve patient anxiety and pain prior to cosmetic botulinum injections. A paired (split-face) design was used in 52 patients, in which the side of the face (left vs. right) was randomized to receive either vapocoolant spray or no-treatment control, to test the study hypothesis that vapocoolant spray has better anesthetic efficacy than no treatment. A pain and anxiety questionnaire was administered before, during, and after the injections. A considerable percentage of patients either expected pain (35% of naïve patients expected moderate pain) or had experienced pain from their prior treatment (35% had experienced moderate pain). Among naïve patients, 15% had moderate or severe anxiety and among experienced patients, 31% had moderate anxiety. Pain was a factor in delaying the scheduling of cosmetic botulinum toxin treatments in 19% of naïve patients and 31% of experienced patients. Pain reported from actual injections was higher than what was anticipated prior to treatment. There was a significant reduction in pain at injection sites treated with vapocoolant (p < 0.001, paired t test). Overall, 67% of all patients reported that the vapocoolant method had less pain than no anesthesia and 54% preferred vapocoolant for their next treatment. Overall, 6% of all patients would schedule their next botulinum toxin treatment sooner if vapocoolant were available. Vapocoolants represent a safe and effective means to reduce patient discomfort and anxiety before and during botulinum toxin type A treatments for glabellar area indications.

  15. A unified framework for penalized statistical muon tomography reconstruction with edge preservation priors of lp norm type

    NASA Astrophysics Data System (ADS)

    Yu, Baihui; Zhao, Ziran; Wang, Xuewu; Wu, Dufan; Zeng, Zhi; Zeng, Ming; Wang, Yi; Cheng, Jianping

    2016-01-01

    The Tsinghua University MUon Tomography facilitY (TUMUTY) has been built and is used to reconstruct special objects with complex structure. Since fine images are required, the conventional Maximum Likelihood Scattering and Displacement (MLSD) algorithm is employed. However, due to the statistical characteristics of muon tomography and the data incompleteness, the reconstruction is always unstable and accompanied by severe noise. In this paper, we propose a Maximum a Posteriori (MAP) algorithm for muon tomography regularization, where an edge-preserving prior on the scattering density image is introduced into the objective function. The prior takes the lp norm (p>0) of the image gradient magnitude, where p=1 and p=2 correspond to the well-known total-variation (TV) and Gaussian priors, respectively. The optimization transfer principle is utilized to minimize the objective function in a unified framework. At each iteration the problem is transferred to solving a cubic equation through paraboloidal surrogating. To validate the method, the French Test Object (FTO) is imaged by both numerical simulation and TUMUTY. The proposed algorithm is used for the reconstruction, and different norms are studied in detail, including l2, l1, l0.5, and an l2-0.5 mixture norm. Compared with the MLSD method, MAP achieves better image quality in both structure preservation and noise reduction. Furthermore, compared with previous work in which only a one-dimensional image was acquired, we achieve relatively clear three-dimensional images of the FTO, in which the inner air hole and the tungsten shell are visible.
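
    The edge-preserving lp prior and the optimization-transfer idea can be illustrated in one dimension with a denoising toy problem: each iteration minimizes a quadratic (paraboloidal) surrogate of the lp penalty, here via reweighted linear solves. The muon-tomography data term (MLSD) is omitted and replaced by a simple least-squares fit, and all parameter values are hypothetical.

```python
import numpy as np

def lp_denoise(d, lam=1.0, p=1.0, eps=1e-6, iters=50):
    """1-D sketch of edge-preserving MAP estimation: minimise
    0.5*||x - d||^2 + lam * sum_i (|x[i+1]-x[i]|^2 + eps)^(p/2)
    by optimisation transfer (iteratively reweighted quadratic surrogates).
    p=1 approximates the total-variation prior, p=2 the Gaussian prior."""
    n = len(d)
    d = np.asarray(d, float)
    D = np.eye(n - 1, n, k=1) - np.eye(n - 1, n)   # finite-difference operator
    x = d.copy()
    for _ in range(iters):
        diff = D @ x
        w = p * (diff ** 2 + eps) ** (p / 2 - 1)   # surrogate weights (p <= 2)
        A = np.eye(n) + lam * D.T @ (w[:, None] * D)
        x = np.linalg.solve(A, d)                  # exact minimiser of surrogate
    return x
```

    With p=1 a noisy step signal is flattened within each segment while the jump itself, where the weight is small, survives; with p=2 the same code smooths the edge away, which is the contrast between the TV and Gaussian priors described above.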

  16. Does additional prenatal care in the home improve birth outcomes for women with a prior preterm delivery? A randomized clinical trial.

    PubMed

    Lutenbacher, Melanie; Gabbe, Patricia Temple; Karp, Sharon M; Dietrich, Mary S; Narrigan, Deborah; Carpenter, Lavenia; Walsh, William

    2014-07-01

    Women with a history of a prior preterm birth (PTB) have a high probability of a recurrent preterm birth. Some risk factors and health behaviors that contribute to PTB may be amenable to intervention. Home visitation is a promising method to deliver evidence based interventions. We evaluated a system of care designed to reduce preterm births and hospital length of stay in a sample of pregnant women with a history of a PTB. Single site randomized clinical trial. Eligibility: >18 years with prior live birth ≥20-<37 weeks gestation; <24 weeks gestation at enrollment; spoke and read English; received care at regional medical center. All participants (N = 211) received standard prenatal care. Intervention participants (N = 109) also received home visits by certified nurse-midwives guided by protocols for specific risk factors (e.g., depressive symptoms, abuse, smoking). Data were collected via multiple methods and sources, including intervention fidelity assessments. Average age was 27.8 years; mean gestational age at enrollment was 15 weeks. Racial breakdown mirrored local demographics. Most had a partner, high school education, and 62% had Medicaid. No statistically significant group differences were found in gestational age at birth. Intervention participants had a shorter intrapartum length of stay. Enhanced prenatal care by nurse-midwife home visits may limit some risk factors and shorten intrapartum length of stay for women with a prior PTB. This study contributes to knowledge about evidence-based home visit interventions directed at risk factors associated with PTB.

  17. On the regularization for nonlinear tomographic absorption spectroscopy

    NASA Astrophysics Data System (ADS)

    Dai, Jinghang; Yu, Tao; Xu, Lijun; Cai, Weiwei

    2018-02-01

    Tomographic absorption spectroscopy (TAS) has attracted increased research effort recently due to developments in both hardware and new imaging concepts such as nonlinear tomography and compressed sensing. Nonlinear TAS is an emerging modality based on the concept of nonlinear tomography and has been successfully demonstrated both numerically and experimentally. However, all the previous demonstrations were realized using only two orthogonal projections, simply for ease of implementation. In this work, we examine the performance of nonlinear TAS using other beam arrangements and test the effectiveness of the beam optimization technique that has been developed for linear TAS. In addition, so far only a smoothness prior has been adopted and applied in nonlinear TAS. Nevertheless, there are also other useful priors, such as sparsity and model-based priors, which have not yet been investigated. This work aims to show how these priors can be implemented and included in the reconstruction process. Regularization through a Bayesian formulation will be introduced specifically for this purpose, and a method for the determination of a proper regularization factor will be proposed. The comparative studies performed with different beam arrangements and regularization schemes on a few representative phantoms suggest that the beam optimization method developed for linear TAS also works for the nonlinear counterpart, and that the regularization scheme should be selected according to the available a priori information under specific application scenarios so as to achieve the best reconstruction fidelity. Though this work is conducted in the context of nonlinear TAS, it can also provide useful insights for other tomographic modalities.

  18. The high frequency of healthcare use in patients one year prior to a sarcoidosis diagnosis

    PubMed Central

    Gerke, Alicia K.; Tang, Fan; Pendergast, Jane; Cavanaugh, Joseph E.; Polgreen, Philip M.

    2015-01-01

    Background: The clinical presentation of sarcoidosis can be varied. Prior investigations have shown that diagnosis is often delayed over six months, particularly in patients with pulmonary symptoms. Delays may lead to high healthcare use prior to diagnosis. Objective: To investigate healthcare use prior to diagnosis of sarcoidosis for a cohort of insured patients. Methods: We conducted a case-control study using a de-identified limited dataset of private health insurance claims. Cases were identified as persons with sarcoidosis from 2003-2009. Controls with other respiratory-related diagnoses (asthma, chronic obstructive pulmonary disease, pneumonia) were matched by age, gender, and diagnosis date. We compared frequencies of doctor visits, prescriptions, and imaging in the year prior to established diagnosis. Results: We identified 206 cases and 2060 controls and compared healthcare use patterns in the year prior to diagnosis. Among those receiving prescriptions, a larger proportion of cases received two or more antibiotic courses (69% vs. 55%, p=0.0020) or two or more corticosteroid prescriptions (63% vs. 50%, p=0.0137). On average, cases had more doctor visits (14.7 vs. 7.8, p<0.0001), saw more specialties (3.9 vs. 2.1, p<0.0001), and underwent more chest x-rays (2.0 vs. 1.5, p<0.0001). A larger proportion of cases underwent two or more chest x-rays (54% vs. 24%, p<0.0001). Conclusions: Compared with controls with respiratory-related disease, patients with sarcoidosis receive a large amount of healthcare prior to diagnosis, some of which may not be necessary. These results highlight the need for improved diagnostic algorithms to identify patients with sarcoidosis and avoid potentially excessive delays in diagnosis. PMID:25363229

  19. Estimating haplotype frequencies by combining data from large DNA pools with database information.

    PubMed

    Gasbarra, Dario; Kulathinal, Sangita; Pirinen, Matti; Sillanpää, Mikko J

    2011-01-01

    We assume that allele frequency data have been extracted from several large DNA pools, each containing genetic material of up to hundreds of sampled individuals. Our goal is to estimate the haplotype frequencies among the sampled individuals by combining the pooled allele frequency data with prior knowledge about the set of possible haplotypes. Such prior information can be obtained, for example, from a database such as HapMap. We present a Bayesian haplotyping method for pooled DNA based on a continuous approximation of the multinomial distribution. The proposed method is applicable when the sizes of the DNA pools and/or the number of considered loci exceed the limits of several earlier methods. In the example analyses, the proposed model clearly outperforms a deterministic greedy algorithm on real data from the HapMap database. With a small number of loci, the performance of the proposed method is similar to that of an EM-algorithm, which uses a multinormal approximation for the pooled allele frequencies, but which does not utilize prior information about the haplotypes. The method has been implemented using Matlab and the code is available upon request from the authors.
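
    As a rough sketch of the underlying estimation problem, restricted to a known candidate haplotype set as the abstract describes, the following recovers haplotype frequencies from pooled per-locus allele frequencies by exponentiated-gradient descent on a least-squares criterion. This is not the authors' Bayesian multinomial-approximation method; the haplotype set and frequencies in the test are invented for illustration.

```python
import numpy as np

def pooled_haplotype_freqs(H, p, lr=0.05, iters=20000):
    """Estimate haplotype frequencies f (nonnegative, summing to 1) from
    pooled per-locus allele frequencies p, given candidate haplotypes H
    (rows = haplotypes, columns = loci, entries 0/1).  Minimises
    ||H.T @ f - p||^2 over the probability simplex by exponentiated
    gradient descent, which keeps f on the simplex automatically."""
    H = np.asarray(H, float)
    p = np.asarray(p, float)
    K = H.shape[0]
    f = np.full(K, 1.0 / K)                # start from the uniform distribution
    for _ in range(iters):
        grad = 2.0 * H @ (H.T @ f - p)     # gradient of the squared residual
        f = f * np.exp(-lr * grad)         # multiplicative (mirror-descent) step
        f /= f.sum()                       # renormalise onto the simplex
    return f
```

    Note that the solution is only unique when the candidate set and loci make the linear system identifiable; with many candidate haplotypes and few loci, prior information of the kind the abstract describes is what resolves the ambiguity.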

  20. Learning Using Dynamic and Static Visualizations: Students' Comprehension, Prior Knowledge and Conceptual Status of a Biotechnological Method

    NASA Astrophysics Data System (ADS)

    Yarden, Hagit; Yarden, Anat

    2010-05-01

    The importance of biotechnology education at the high-school level has been recognized in a number of international curriculum frameworks around the world. One of the most problematic issues in learning biotechnology has been found to be the biotechnological methods involved. Here, we examine the unique contribution of an animation of the polymerase chain reaction (PCR) in promoting conceptual learning of the biotechnological method among 12th-grade biology majors. All of the students learned about the PCR using still images (n = 83) or the animation (n = 90). A significant advantage to the animation treatment was identified following learning. Students’ prior content knowledge was found to be an important factor for students who learned PCR using still images, serving as an obstacle to learning the PCR method in the case of low prior knowledge. Through analysing students’ discourse, using the framework of the conceptual status analysis, we found that students who learned about PCR using still images faced difficulties in understanding some mechanistic aspects of the method. On the other hand, using the animation gave the students an advantage in understanding those aspects.

  1. Hierarchical Bayesian modeling of ionospheric TEC disturbances as non-stationary processes

    NASA Astrophysics Data System (ADS)

    Seid, Abdu Mohammed; Berhane, Tesfahun; Roininen, Lassi; Nigussie, Melessew

    2018-03-01

    We model regular and irregular variation of ionospheric total electron content as stationary and non-stationary processes, respectively. We apply the developed method to a SCINDA GPS data set observed at Bahir Dar, Ethiopia (11.6°N, 37.4°E). We use hierarchical Bayesian inversion with Gaussian Markov random process priors, and we model the prior parameters in the hyperprior. We use Matérn priors via stochastic partial differential equations, and scaled Inv-χ2 hyperpriors for the hyperparameters. For drawing posterior estimates, we use Markov Chain Monte Carlo methods: Gibbs sampling and Metropolis-within-Gibbs for parameter and hyperparameter estimations, respectively. This allows us to quantify model parameter estimation uncertainties as well. We demonstrate the applicability of the proposed method using a synthetic test case. Finally, we apply the method to the real GPS data set, which we decompose into regular and irregular variation components. The result shows that the approach can be used as an accurate ionospheric disturbance characterization technique that quantifies the total electron content variability with corresponding error uncertainties.
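
    A minimal sketch of the Gibbs half of such a sampler, for a plain normal model with a scaled Inv-χ2 prior on the variance (the hyperprior family named above), is given below. The Matérn/SPDE spatial structure and the Metropolis-within-Gibbs hyperparameter steps of the actual method are omitted, and all prior settings are assumed values for illustration.

```python
import numpy as np

def gibbs_normal(y, tau2=100.0, nu0=1.0, s20=1.0, draws=4000, seed=0):
    """Gibbs sampler for y_i ~ N(theta, sigma2) with theta ~ N(0, tau2)
    and sigma2 ~ Scaled-Inv-chi2(nu0, s20).  Both full conditionals are
    conjugate, so every step is an exact draw; returns (theta, sigma2) pairs."""
    rng = np.random.default_rng(seed)
    y = np.asarray(y, float)
    n = len(y)
    theta, sigma2 = y.mean(), y.var() + 1e-9
    out = np.empty((draws, 2))
    for t in range(draws):
        # theta | sigma2, y : normal with precision n/sigma2 + 1/tau2
        prec = n / sigma2 + 1.0 / tau2
        mean = (y.sum() / sigma2) / prec
        theta = rng.normal(mean, prec ** -0.5)
        # sigma2 | theta, y : scaled inverse-chi2(nu0 + n, s2_n),
        # drawn as nu_n * s2_n / chi2(nu_n)
        nu_n = nu0 + n
        s2_n = (nu0 * s20 + ((y - theta) ** 2).sum()) / nu_n
        sigma2 = nu_n * s2_n / rng.chisquare(nu_n)
        out[t] = theta, sigma2
    return out
```

    The spread of the retained draws directly quantifies estimation uncertainty, which is the appeal of the fully Bayesian treatment in the abstract.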

  2. Fault diagnosis of sensor networked structures with multiple faults using a virtual beam based approach

    NASA Astrophysics Data System (ADS)

    Wang, H.; Jing, X. J.

    2017-07-01

    This paper presents a virtual-beam-based approach for diagnosing multiple faults in complex structures with limited prior knowledge of the faults involved. The "virtual beam", a recently-proposed concept for fault detection in complex structures, is applied; it consists of a chain of sensors representing a vibration energy transmission path embedded in the complex structure. Statistical tests and an adaptive threshold are adopted for fault detection because of the limited prior knowledge of normal operational conditions and fault conditions. To isolate multiple faults within a specific structure or substructure of a more complex one, a "biased running" strategy is developed and embedded within the bacterial-based optimization method to construct effective virtual beams and thus improve the accuracy of localization. The proposed method is easy and efficient to implement for multiple fault localization with limited prior knowledge of normal conditions and faults. With extensive experimental results, it is validated that the proposed method can localize both single and multiple faults more effectively than the classical trust index subtract on negative add on positive (TI-SNAP) method.

  3. An experimental study of nonlinear dynamic system identification

    NASA Technical Reports Server (NTRS)

    Stry, Greselda I.; Mook, D. Joseph

    1990-01-01

    A technique for robust identification of nonlinear dynamic systems is developed and illustrated using both simulations and analog experiments. The technique is based on the Minimum Model Error optimal estimation approach. A detailed literature review is included in which fundamental differences between the current approach and previous work are described. The most significant feature of the current work is the ability to identify nonlinear dynamic systems without prior assumptions regarding the form of the nonlinearities, in contrast to existing nonlinear identification approaches, which usually require detailed assumptions of the nonlinearities. The example illustrations indicate that the method is robust with respect to prior ignorance of the model, and with respect to measurement noise, measurement frequency, and measurement record length.

  4. Differences in preconceptional and prenatal behaviors in women with intended and unintended pregnancies.

    PubMed Central

    Hellerstedt, W L; Pirie, P L; Lando, H A; Curry, S J; McBride, C M; Grothaus, L C; Nelson, J C

    1998-01-01

    OBJECTIVES: This study examined whether pregnancy intention was associated with cigarette smoking, alcohol drinking, use of vitamins, and consumption of caffeinated drinks prior to pregnancy and in early pregnancy. METHODS: Data from a telephone survey of 7174 pregnant women were analyzed. RESULTS: In comparison with women whose pregnancies were intended, women with unintended pregnancies were more likely to report cigarette smoking and less likely to report daily vitamin use. Women with unintended pregnancies were also less likely to decrease consumption of caffeinated beverages or increase daily vitamin use. CONCLUSIONS: Pregnancy intention was associated with health behaviors, prior to pregnancy and in early pregnancy, that may influence pregnancy course and birth outcomes. PMID:9551015

  5. Transferring and generalizing deep-learning-based neural encoding models across subjects.

    PubMed

    Wen, Haiguang; Shi, Junxing; Chen, Wei; Liu, Zhongming

    2018-08-01

    Recent studies have shown the value of using deep learning models for mapping and characterizing how the brain represents and organizes information for natural vision. However, modeling the relationship between deep learning models and the brain (i.e., building encoding models) requires measuring cortical responses to large and diverse sets of natural visual stimuli from single subjects. This requirement limits prior studies to few subjects, making it difficult to generalize findings across subjects or for a population. In this study, we developed new methods to transfer and generalize encoding models across subjects. To train encoding models specific to a target subject, the models trained for other subjects were used as the prior models and were refined efficiently using Bayesian inference with a limited amount of data from the target subject. To train encoding models for a population, the models were progressively trained and updated with incremental data from different subjects. For the proof of principle, we applied these methods to functional magnetic resonance imaging (fMRI) data from three subjects watching tens of hours of naturalistic videos, while a deep residual neural network driven by image recognition was used to model visual cortical processing. Results demonstrate that the methods developed herein provide an efficient and effective strategy to establish both subject-specific and population-wide predictive models of cortical representations of high-dimensional and hierarchical visual features. Copyright © 2018 Elsevier Inc. All rights reserved.
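
    The idea of refining a target subject's encoding model from other subjects' models via Bayesian inference can be sketched as ridge regression toward a prior weight vector: the MAP estimate under a Gaussian prior centred on the transferred weights. This is an illustrative reduction, not the paper's exact procedure; the dimensions, noise levels, and function name below are invented.

```python
import numpy as np

def transfer_ridge(X, y, w_prior, lam=10.0):
    """MAP encoding weights under a Gaussian prior centred on another
    subject's weights w_prior: minimise ||X w - y||^2 + lam*||w - w_prior||^2,
    whose closed-form solution is (X'X + lam I)^{-1} (X'y + lam w_prior)."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y + lam * w_prior)
```

    With few target-subject samples (n smaller than the feature dimension), shrinking toward an informative prior centred near the true weights yields far lower error than shrinking toward zero, which mirrors the data-efficiency argument in the abstract.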

  6. Where to locate a tree plantation within a low rainfall catchment to minimise impacts on groundwater resources

    NASA Astrophysics Data System (ADS)

    Dean, J. F.; Webb, J. A.; Jacobsen, G. E.; Chisari, R.; Dresel, P. E.

    2014-08-01

    Despite the fact that there are many studies that consider the impacts of plantation forestry on water resources, and others that explore the spatial heterogeneity of groundwater recharge in dry regions, the two subjects are rarely combined in forestry management guidelines and legislation. Here we carry out an in-depth analysis of the groundwater and surface water regime in a low rainfall, high evapotranspiration paired catchment study to examine the impact of reforestation, using water table fluctuations and chloride mass balance methods to estimate groundwater recharge. Recharge estimates from the chloride mass balance method most likely represent groundwater recharge regimes prior to the planting of the trees, and most likely prior to widespread land clearance by European settlers. These estimates were complicated by large amounts of recharge occurring as a result of runoff and streamflow in the lower parts of the catchment. Recharge estimates from the water table fluctuation method confirmed that groundwater recharge occurs predominantly in the lowland areas of the study catchment. This leads to the conclusion that spatial variations in recharge are important considerations for locating tree plantations with respect to conserving water resources for downstream users. For dry regions, this means planting trees in the upland parts of the catchments, as recharge is shown to occur predominantly in the lowland areas.

  7. VizieR Online Data Catalog: Wide binaries in Tycho-Gaia: search method (Andrews+, 2017)

    NASA Astrophysics Data System (ADS)

    Andrews, J. J.; Chaname, J.; Agueros, M. A.

    2017-11-01

    Our catalogue of wide binaries identified in the Tycho-Gaia Astrometric Solution catalogue. The Gaia source IDs, Tycho IDs, astrometry, posterior probabilities for both the log-flat prior and power-law prior models, and angular separation are presented. (1 data file).

  8. Nonlinear dynamics and health monitoring of 6-DOF breathing cracked Jeffcott rotor

    NASA Astrophysics Data System (ADS)

    Zhao, Jie; DeSmidt, Hans; Yao, Wei

    2015-04-01

    A Jeffcott rotor is employed to study the nonlinear vibration characteristics of a breathing cracked rotor system and to explore the possibility of further damage identification. This paper extends prior work based on a 4-degree-of-freedom Jeffcott rotor system. With disk tilting and the gyroscopic effect taken into account, the 6-DOF equations of motion are derived, and the crack model is established using the strain energy release rate (SERR) from fracture mechanics. As in the prior work, the damaged stiffness matrix is updated by computing the instantaneous crack closure line through the Zero Stress Intensity Factor method. The breathing crack area is taken as a variable to analyze the breathing behavior in terms of eccentricity phase and shaft speed. Furthermore, the coupled vibration among the lateral, torsional and longitudinal degrees of freedom is studied under torsional/axial excitation. The final part demonstrates the possibility of using the vibration signal of the damaged system for crack diagnosis and health monitoring.

  9. A comparison of between- and within-subjects imitation designs.

    PubMed

    Kressley, Regina A; Knopf, Monika

    2006-12-01

    Two experimental methods, which have dominated the study of declarative memory in preverbal children with imitation tasks, namely the deferred imitation and elicited imitation paradigm, differ in the amount of physical contact with test stimuli afforded infants prior to a test for long-term recall. The current study assessed effects of pre- and post-demonstration contact with test stimuli on deferred imitation of novel, single-step unrelated actions with multiple objects by 8½- and 10½-month-old infants (N=50). The rate of target action completion after a delay remained consistent at both ages across different conditions of prior contact with test stimuli. This study shows that a within-subjects baseline appraisal is valid within certain experimental parameters and offers a more economical alternative. The results show furthermore that different experimental designs utilized to assess deferred imitation are highly comparable for the first year despite differences in determining baseline.

  10. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    PubMed Central

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret the results properly. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate the Bayesian methods explained in this study, a second example considers a series of studies examining the theoretical framework of dynamic interactionism. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396
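    The abstract's own simplified example is not reproduced here, but the prior-to-posterior mechanics it introduces can be illustrated with the standard conjugate Beta-Binomial update for a proportion:

```python
def beta_update(a_prior, b_prior, successes, failures):
    """Conjugate Beta-Binomial update: a Beta(a, b) prior combined with
    binomial data yields a Beta(a + successes, b + failures) posterior."""
    return a_prior + successes, b_prior + failures

# weakly informative prior Beta(2, 2); observe 13 successes in 20 trials
a, b = beta_update(2, 2, 13, 7)
post_mean = a / (a + b)   # (2 + 13) / (2 + 2 + 20) = 0.625
```

    The prior parameters act like pseudo-observations: the stronger the prior (larger a + b), the more data are needed to move the posterior away from it, which is exactly the trade-off the specification of prior knowledge involves.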

  11. Establishing pass/fail criteria for bronchoscopy performance.

    PubMed

    Konge, Lars; Clementsen, Paul; Larsen, Klaus Richter; Arendrup, Henrik; Buchwald, Christian; Ringsted, Charlotte

    2012-01-01

    Several tools have been created to assess competence in bronchoscopy. However, educational guidelines still use an arbitrary number of performed procedures to decide when basic competency is acquired. The purpose of this study was to define pass/fail scores for two bronchoscopy assessment tools, and investigate how these scores relate to physicians' experience regarding the number of bronchoscopy procedures performed. We studied two assessment tools and used two standard setting methods to create cut scores: the contrasting-groups method and the extended Angoff method. In the first we compared bronchoscopy performance scores of 14 novices with the scores of 14 experienced consultants to find the score that best discriminated between the two groups. In the second we asked an expert group of 7 experienced bronchoscopists to judge how a borderline trainee would perform on each item of the test. Using the contrasting-groups method we found a standard that would fail all novices and pass all consultants. A clear pass related to prior experience of 75 procedures. The consequences of using the extended Angoff method were also acceptable: all trainees who had performed less than 50 bronchoscopies failed the test and all consultants passed. A clear pass related to 80 procedures. Our proposed pass/fail scores for these two methods seem appropriate in terms of consequences. Prior experience with the performance of 75 and 80 bronchoscopies, respectively, seemed to ensure basic competency. In the future objective assessment tools could become an important aid in the certification of physicians performing bronchoscopies. Copyright © 2011 S. Karger AG, Basel.
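    The contrasting-groups method described above reduces to finding the cut score that best separates the novice and consultant score distributions. A simple sketch with hypothetical scores (the actual scoring scale and data are not given in the abstract):

```python
import numpy as np

def contrasting_groups_cut(novice_scores, expert_scores):
    """Scan the observed scores as candidate cut points and return the
    one minimizing misclassification (novices passing + experts failing)."""
    best_c, best_err = None, np.inf
    for c in np.unique(np.concatenate([novice_scores, expert_scores])):
        err = np.sum(novice_scores >= c) + np.sum(expert_scores < c)
        if err < best_err:
            best_c, best_err = c, err
    return best_c

# hypothetical assessment scores
novices = np.array([12, 15, 18, 20, 22])
experts = np.array([30, 33, 35, 38, 40])
cut = contrasting_groups_cut(novices, experts)   # 30: fails all novices, passes all experts
```

    When the two distributions overlap, the minimizing cut balances false passes against false fails; checking the consequences of the resulting standard, as the study does, is the recommended final step of any standard-setting exercise.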

  12. Risk of Alcohol-Impaired Driving Recidivism Among First Offenders and Multiple Offenders

    PubMed Central

    Zador, Paul L.; Ahlin, Eileen M.; Howard, Jan M.; Frissell, Kevin C.; Duncan, G. Doug

    2010-01-01

    Objectives. We sought to determine the statewide impact of having prior alcohol-impaired driving violations of any type on the rate of first occurrence or recidivism among drivers with 0, 1, 2, or 3 or more prior violations in Maryland. Methods. We analyzed more than 100 million driver records from 1973 to 2004 and classified all Maryland drivers into 4 groups: those with 0, 1, 2, or 3 or more prior violations. The violation rates for approximately 21 million drivers in these 4 groups were compared for the study period 1999 to 2004. Results. On average, there were 3.4, 24.3, 35.9, and 50.8 violations per 1000 drivers a year among those with 0, 1, 2, or 3 or more priors, respectively. The relative risks for men compared with women among these groups of drivers were 3.8, 1.2, 1.0, and 1.0, respectively. Conclusions. The recidivism rate among first offenders more closely resembles that of second offenders than of nonoffenders. Men and women are at equal risk of recidivating once they have had a first violation documented. Any alcohol-impaired driving violation, not just convictions, is a marker for future recidivism. PMID:19846687
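    The reported figures are simple rates and rate ratios; for example (with hypothetical counts chosen only to match the scale of the reported 3.4 per 1000 figure):

```python
def rate_per_1000(violations, drivers):
    """Annual violations per 1000 drivers."""
    return 1000.0 * violations / drivers

def relative_risk(rate_a, rate_b):
    """Ratio of two rates."""
    return rate_a / rate_b

# hypothetical counts on the scale of the reported Maryland figures
r0 = rate_per_1000(violations=6800, drivers=2_000_000)  # 3.4 per 1000
rr = relative_risk(24.3, 3.4)   # first offenders vs non-offenders, ~7.1
```

    The roughly seven-fold rate among first offenders relative to non-offenders is what underlies the conclusion that first offenders resemble repeat offenders more than they resemble the general driving population.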

  13. The Diagnostic Value of the Vacuum Phenomenon during Hip Arthroscopy

    PubMed Central

    Rath, Ehud; Gortzak, Yair; Schwarzkopf, Ran; Benkovich, Vadim; Cohen, Eugene; Atar, Dan

    2011-01-01

    The diagnostic value of the vacuum phenomenon between the femoral head and the acetabulum, and time frame of its occurrence after application of traction is an important clinical question. The resulting arthrogram may outline the shape, location, and extent of cartilage lesions prior to arthroscopy of the hip joint. The presence, duration, and diagnostic information of the vacuum phenomenon were evaluated in 24 hips that underwent arthroscopy. The operative diagnosis was compared to the results of imaging studies and to findings obtained during a traction trial prior to arthroscopy. Indications for arthroscopy included avascular necrosis, labral tears, loose bodies, osteoarthrosis, and intractable hip pain. In 22 hips the vacuum phenomenon developed within 30 seconds after application of traction. The most important data obtained from the vacuum phenomenon was the location and extent of femoral head articular cartilage detachment and the presence of nonossified loose bodies. The vacuum phenomenon did not reveal labral or acetabular cartilage pathology in any of these patients. The vacuum phenomenon obtained during the trial of traction can add valuable information prior to hip arthroscopy. Femoral head articular cartilage detachment was best documented by this method. The hip arthroscopist should utilize this diagnostic window routinely prior to hip arthroscopy. PMID:24977068

  15. Orthodontic extrusion for pre-implant site enhancement: Principles and clinical guidelines.

    PubMed

    Alsahhaf, Abdulaziz; Att, Wael

    2016-07-01

    The aim of this paper is to provide a concise overview about the principles of pre-implant orthodontic extrusion, describe methods and techniques available and provide clinicians with guidelines about its application. A number of reports describe orthodontic extrusion as a reliable method for pre-implant site enhancement. However, no standard protocols have been provided about the application of this technique. The literature database was searched for studies involving implant site enhancement by means of orthodontic extrusion. Information about the principles, indications and contraindications of this method, type of anchorage, force and time were obtained from the literature. Although the available data are scarce and largely limited to case reports and case series, implant site enhancement by means of orthodontic extrusion seems to be a promising option to improve soft and hard tissue conditions prior to implant placement. Orthodontic extrusion is being implemented as a treatment alternative to enhance hard and soft tissue prior to implant placement. While the current literature does not provide clear guidelines, the decision making for a specific approach seems to be based on the clinician's preferences. Clinical studies are needed to verify the validity of this treatment option. Copyright © 2016 Japan Prosthodontic Society. Published by Elsevier Ltd. All rights reserved.

  16. Comparison between Conventional Mechanical Fixation and Use of Autologous Platelet Rich Plasma (PRP) in Wound Beds Prior to Resurfacing with Split Thickness Skin Graft.

    PubMed

    P Waiker, Veena; Shivalingappa, Shanthakumar

    2015-01-01

    Platelet rich plasma is known for its hemostatic, adhesive and healing properties in view of the multiple growth factors released from the platelets to the site of wound. The primary objective of this study was to use autologous platelet rich plasma (PRP) in wound beds for anchorage of skin grafts instead of conventional methods like sutures, staplers or glue. In a single-center, randomized, controlled, prospective study of nine months' duration, 200 patients with wounds were divided into two equal groups. Autologous PRP was applied on wound beds in the PRP group and conventional methods like staples/sutures used to anchor the skin grafts in a control group. Instant graft adherence to wound bed was statistically significant in the PRP group. Time of first post-graft inspection was delayed, and hematoma, graft edema, discharge from graft site, frequency of dressings and duration of stay in plastic surgery unit were significantly less in the PRP group. Autologous PRP ensured instant skin graft adherence to wound bed in comparison to conventional methods of anchorage. Hence, we recommend the use of autologous PRP routinely on wounds prior to resurfacing to ensure the benefits of early healing.

  17. Spatially adapted augmentation of age-specific atlas-based segmentation using patch-based priors

    NASA Astrophysics Data System (ADS)

    Liu, Mengyuan; Seshamani, Sharmishtaa; Harrylock, Lisa; Kitsch, Averi; Miller, Steven; Chau, Van; Poskitt, Kenneth; Rousseau, Francois; Studholme, Colin

    2014-03-01

    One of the most common approaches to MRI brain tissue segmentation is to employ an atlas prior to initialize an Expectation-Maximization (EM) image labeling scheme using a statistical model of MRI intensities. This prior is commonly derived from a set of manually segmented training data from the population of interest. However, in cases where subject anatomy varies significantly from the prior anatomical average model (for example in the case where extreme developmental abnormalities or brain injuries occur), the prior tissue map does not provide adequate information about the observed MRI intensities to ensure the EM algorithm converges to an anatomically accurate labeling of the MRI. In this paper, we present a novel approach for automatic segmentation of such cases. This approach augments the atlas-based EM segmentation by exploring methods to build a hybrid tissue segmentation scheme that seeks to learn where an atlas prior fails (due to inadequate representation of anatomical variation in the statistical atlas) and utilize an alternative prior derived from a patch driven search of the atlas data. We describe a framework for incorporating this patch-based augmentation of EM (PBAEM) into a 4D age-specific atlas-based segmentation of developing brain anatomy. The proposed approach was evaluated on a set of MRI brain scans of premature neonates with ages ranging from 27.29 to 46.43 gestational weeks (GWs). Results indicated superior performance compared to the conventional atlas-based segmentation method, providing improved segmentation accuracy for gray matter, white matter, ventricles and sulcal CSF regions.

  18. Stepwise group sparse regression (SGSR): gene-set-based pharmacogenomic predictive models with stepwise selection of functional priors.

    PubMed

    Jang, In Sock; Dienstmann, Rodrigo; Margolin, Adam A; Guinney, Justin

    2015-01-01

    Complex mechanisms involving genomic aberrations in numerous proteins and pathways are believed to be a key cause of many diseases such as cancer. With recent advances in genomics, elucidating the molecular basis of cancer at a patient level is now feasible, and has led to personalized treatment strategies whereby a patient is treated according to his or her genomic profile. However, there is growing recognition that existing treatment modalities are overly simplistic, and do not fully account for the deep genomic complexity associated with sensitivity or resistance to cancer therapies. To overcome these limitations, large-scale pharmacogenomic screens of cancer cell lines--in conjunction with modern statistical learning approaches--have been used to explore the genetic underpinnings of drug response. While these analyses have demonstrated the ability to infer genetic predictors of compound sensitivity, to date most modeling approaches have been data-driven, i.e. they do not explicitly incorporate domain-specific knowledge (priors) in the process of learning a model. While a purely data-driven approach offers an unbiased perspective of the data--and may yield unexpected or novel insights--this strategy introduces challenges for both model interpretability and accuracy. In this study, we propose a novel prior-incorporated sparse regression model in which the choice of informative predictor sets is carried out by knowledge-driven priors (gene sets) in a stepwise fashion. Under regularization in a linear regression model, our algorithm is able to incorporate prior biological knowledge across the predictive variables thereby improving the interpretability of the final model with no loss--and often an improvement--in predictive performance. 
We evaluate the performance of our algorithm compared to well-known regularization methods such as LASSO, Ridge and Elastic net regression in the Cancer Cell Line Encyclopedia (CCLE) and Genomics of Drug Sensitivity in Cancer (Sanger) pharmacogenomics datasets, demonstrating that incorporation of the biological priors selected by our model confers improved predictability and interpretability, despite much fewer predictors, over existing state-of-the-art methods.
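    The abstract does not spell out the SGSR algorithm; a greatly simplified numpy sketch of stepwise forward selection over feature groups with ridge regression conveys the idea (the "gene sets" are just index groups here, the stopping rule is a guess, and all data are synthetic):

```python
import numpy as np

def ridge_fit(X, y, lam=1.0):
    """Closed-form ridge regression: w = (X'X + lam*I)^-1 X'y."""
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ y)

def stepwise_group_ridge(X, y, groups, lam=1.0, tol=0.05):
    """Greedy forward selection over predictor groups (e.g. gene sets):
    at each step, add the group giving the largest drop in ridge RSS;
    stop when no group improves the fit by more than a `tol` fraction."""
    selected = []                       # indices into `groups`
    best_rss = float(np.sum(y ** 2))    # RSS of the empty model
    while True:
        best_g, best_new = None, best_rss
        for gi, cols in enumerate(groups):
            if gi in selected:
                continue
            use = sorted({c for s in selected for c in groups[s]} | set(cols))
            w = ridge_fit(X[:, use], y, lam)
            rss = float(np.sum((y - X[:, use] @ w) ** 2))
            if rss < best_new * (1.0 - tol):
                best_new, best_g = rss, gi
        if best_g is None:
            return selected
        selected.append(best_g)
        best_rss = best_new

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 6))
y = X[:, 0] - X[:, 1] + 0.1 * rng.normal(size=60)
groups = [[0, 1], [2, 3], [4, 5]]       # hypothetical "gene sets"
sel = stepwise_group_ridge(X, y, groups)
```

    Selecting whole groups rather than individual predictors is what makes the final model interpretable in terms of pathways rather than isolated genes, which is the design choice the abstract emphasizes.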

  19. Sliding window prior data assisted compressed sensing for MRI tracking of lung tumors.

    PubMed

    Yip, Eugene; Yun, Jihyun; Wachowicz, Keith; Gabos, Zsolt; Rathee, Satyapal; Fallone, B G

    2017-01-01

    Hybrid magnetic resonance imaging and radiation therapy devices are capable of imaging in real-time to track intrafractional lung tumor motion during radiotherapy. Highly accelerated magnetic resonance (MR) imaging methods can potentially reduce system delay time and/or improve imaging spatial resolution, and provide flexibility in imaging parameters. Prior Data Assisted Compressed Sensing (PDACS) has previously been proposed as an acceleration method that combines the advantages of 2D compressed sensing and the KEYHOLE view-sharing technique. However, as PDACS relies on prior data acquired at the beginning of a dynamic imaging sequence, decline in image quality occurs for longer duration scans due to drifts in MR signal. Novel sliding window-based techniques for refreshing prior data are proposed as a solution to this problem. MR acceleration is performed by retrospective removal of data from the fully sampled sets. Six patients with lung tumors are scanned with a clinical 3 T MRI using a balanced steady-state free precession (bSSFP) sequence for 3 min at approximately 4 frames per second, for a total of 650 dynamics. A series of distinct pseudo-random patterns of partial k-space acquisition is generated such that, when combined with other dynamics within a sliding window of 100 dynamics, it covers the entire k-space. The prior data in the sliding window are continuously refreshed to reduce the impact of MR signal drifts. We demonstrate two different ways to utilize the sliding window data: a simple averaging method and a navigator-based method. These two sliding window methods are quantitatively compared against the original PDACS method using three metrics: artifact power, centroid displacement error, and Dice's coefficient. The study is repeated with pseudo 0.5 T images by adding complex, normally distributed noise with a standard deviation that reduces image SNR, relative to original 3 T images, by a factor of 6. 
Without a sliding window implemented, PDACS-reconstructed dynamic datasets showed progressive increases in image artifact power as the 3 min scan progressed. With sliding windows implemented, this increase in artifact power is eliminated. Near the end of a 3 min scan at 3 T SNR and 5× acceleration, implementation of an averaging (navigator) sliding window method improves our metrics in the following ways: artifact power decreases from 0.065 without sliding window to 0.030 (0.031), centroid error decreases from 2.64 to 1.41 mm (1.28 mm), and Dice coefficient agreement increases from 0.860 to 0.912 (0.915). At pseudo 0.5 T SNR, the improvements in metrics are as follows: artifact power decreases from 0.110 without sliding window to 0.0897 (0.0985), centroid error decreases from 2.92 mm to 1.36 mm (1.32 mm), and Dice coefficient agreement increases from 0.851 to 0.894 (0.896). In this work we demonstrated the negative impact of slow changes in MR signal for longer duration PDACS dynamic scans, namely increases in image artifact power and reductions of tumor tracking accuracy. We have also demonstrated that sliding window implementations (i.e., refreshing of prior data) of PDACS are effective solutions to this problem for both 3 T and simulated 0.5 T bSSFP images. © 2016 American Association of Physicists in Medicine.
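    The sliding-window refreshing of prior data can be illustrated with a toy view-sharing reconstruction: unsampled k-space rows of the newest frame are filled from the most recent frame in the window that sampled them. This is a deliberate simplification of PDACS (no compressed-sensing step), with synthetic data:

```python
import numpy as np

def sliding_window_fill(kspace_frames, masks):
    """Toy view-sharing: fill each unsampled k-space row of the newest
    frame with the most recently acquired copy of that row from the
    sliding-window buffer, then inverse-FFT to image space."""
    filled = np.zeros_like(kspace_frames[-1])
    have = np.zeros(filled.shape[0], dtype=bool)
    # newest frame first, then progressively older frames
    for k, m in zip(reversed(kspace_frames), reversed(masks)):
        take = m & ~have        # rows sampled here and not yet filled
        filled[take] = k[take]
        have |= m
    return np.fft.ifft2(filled)

# synthetic 8x8 "image", fully covered by 4 interleaved acquisitions
img = np.zeros((8, 8))
img[2:6, 2:6] = 1.0
k_full = np.fft.fft2(img)
masks = [np.arange(8) % 4 == i for i in range(4)]           # row masks
frames = [np.where(m[:, None], k_full, 0) for m in masks]   # partial k-space
recon = sliding_window_fill(frames, masks)   # ~exact: window covers all rows
```

    Because prior rows always come from the most recent pass through k-space, slow signal drifts over a long scan cannot accumulate in the shared data, which is the failure mode of fixed prior data that the paper addresses.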

  20. A prior feature SVM – MRF based method for mouse brain segmentation

    PubMed Central

    Wu, Teresa; Bae, Min Hyeok; Zhang, Min; Pan, Rong; Badea, Alexandra

    2012-01-01

    We introduce an automated method, called prior feature Support Vector Machine- Markov Random Field (pSVMRF), to segment three-dimensional mouse brain Magnetic Resonance Microscopy (MRM) images. Our earlier work, extended MRF (eMRF) integrated Support Vector Machine (SVM) and Markov Random Field (MRF) approaches, leading to improved segmentation accuracy; however, the computation of eMRF is very expensive, which may limit its performance on segmentation and robustness. In this study pSVMRF reduces training and testing time for SVM, while boosting segmentation performance. Unlike the eMRF approach, where MR intensity information and location priors are linearly combined, pSVMRF combines this information in a nonlinear fashion, and enhances the discriminative ability of the algorithm. We validate the proposed method using MR imaging of unstained and actively stained mouse brain specimens, and compare segmentation accuracy with two existing methods: eMRF and MRF. C57BL/6 mice are used for training and testing, using cross validation. For formalin fixed C57BL/6 specimens, pSVMRF outperforms both eMRF and MRF. The segmentation accuracy for C57BL/6 brains, stained or not, was similar for larger structures like hippocampus and caudate putamen, (~87%), but increased substantially for smaller regions like substantia nigra (from 78.36% to 91.55%), and anterior commissure (from ~50% to ~80%). To test segmentation robustness against increased anatomical variability we add two strains, BXD29 and a transgenic mouse model of Alzheimer's disease. Segmentation accuracy for new strains is 80% for hippocampus, and caudate putamen, indicating that pSVMRF is a promising approach for phenotyping mouse models of human brain disorders. PMID:21988893

  2. Alpha test results for a Housing First eLearning strategy: the value of multiple qualitative methods for intervention design.

    PubMed

    Ahonen, Emily Q; Watson, Dennis P; Adams, Erin L; McGuire, Alan

    2017-01-01

    Detailed descriptions of implementation strategies are lacking, and there is a corresponding dearth of information regarding methods employed in implementation strategy development. This paper describes methods and findings related to the alpha testing of eLearning modules developed as part of the Housing First Technical Assistance and Training (HFTAT) program's development. Alpha testing is an approach for improving the quality of a product prior to beta (i.e., real-world) testing with potential applications for intervention development. Ten participants in two cities tested the modules. We collected data through (1) a structured log where participants were asked to record their experiences as they worked through the modules; (2) a brief online questionnaire delivered at the end of each module; and (3) focus groups. The alpha test provided useful data related to the acceptability and feasibility of eLearning as an implementation strategy, as well as identifying a number of technical issues and bugs. Each of the qualitative methods used provided unique and valuable information. In particular, logs were the most useful for identifying technical issues, and focus groups provided high quality data regarding how the intervention could best be used as an implementation strategy. Alpha testing was a valuable step in intervention development, providing an understanding of issues that would have been more difficult to address at a later stage of the study. As a result, we were able to improve the modules prior to pilot testing of the entire HFTAT. Researchers wishing to alpha test interventions prior to piloting should balance the unique benefits of different data collection approaches with the need to minimize burdens for themselves and participants.

  3. Joint image restoration and location in visual navigation system

    NASA Astrophysics Data System (ADS)

    Wu, Yuefeng; Sang, Nong; Lin, Wei; Shao, Yuanjie

    2018-02-01

    Image location methods are key technologies of visual navigation, but most previous image location methods simply assume ideal inputs without taking into account real-world degradations (e.g. low resolution and blur). In view of such degradations, the conventional image location methods first perform image restoration and then match the restored image on the reference image. However, when restoration and location are handled separately, the defective output of the image restoration can degrade the localization result. In this paper, we present a joint image restoration and location (JRL) method, which utilizes the sparse representation prior to handle the challenging problem of low-quality image location. The sparse representation prior states that the degraded input image, if correctly restored, will have a good sparse representation in terms of the dictionary constructed from the reference image. By iteratively solving the image restoration in pursuit of the sparsest representation, our method can achieve simultaneous restoration and location. Based on such a sparse representation prior, we demonstrate that the image restoration task and the location task can benefit greatly from each other. Extensive experiments on real scene images with Gaussian blur are carried out and our joint model outperforms the conventional methods of treating the two tasks independently.
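    The sparse coding at the heart of such a prior can be illustrated with Orthogonal Matching Pursuit, a standard greedy sparse-coding algorithm (not necessarily the solver the authors use); the tiny dictionary below is contrived so the recovery is exact:

```python
import numpy as np

def omp(D, y, n_nonzero):
    """Orthogonal Matching Pursuit: greedily pick the dictionary atom
    most correlated with the residual, then refit by least squares."""
    residual, idx = y.astype(float).copy(), []
    for _ in range(n_nonzero):
        j = int(np.argmax(np.abs(D.T @ residual)))
        if j not in idx:
            idx.append(j)
        coef, *_ = np.linalg.lstsq(D[:, idx], y, rcond=None)
        residual = y - D[:, idx] @ coef
    a = np.zeros(D.shape[1])
    a[idx] = coef
    return a

# contrived unit-norm dictionary: two basis atoms plus a correlated atom
D = np.array([[1.0, 0.0, 0.6],
              [0.0, 1.0, 0.8]])
y = np.array([3.0, 1.0])        # = 3*atom0 + 1*atom1
a_hat = omp(D, y, n_nonzero=2)  # recovers [3, 1, 0]
```

    In the paper's setting, the dictionary atoms come from patches of the reference image, so a signal that codes sparsely and with small residual is simultaneously well restored and well localized against the reference.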

  4. Efficacy, safety, and improved tolerability of travoprost BAK-free ophthalmic solution compared with prior prostaglandin therapy

    PubMed Central

    Henry, J Charles; Peace, James H; Stewart, Jeanette A; Stewart, William C

    2008-01-01

    Purpose To evaluate the efficacy, safety and tolerability of changing to travoprost BAK-free from prior prostaglandin therapy in patients with primary open-angle glaucoma or ocular hypertension. Design Prospective, multi-center, historical control study. Methods Patients treated with latanoprost or bimatoprost who needed alternative therapy due to tolerability issues were enrolled. Patients were surveyed using the Ocular Surface Disease Index (OSDI) to evaluate OSD symptoms prior to changing to travoprost BAK-free dosed once every evening. Patients were re-evaluated 3 months later. Results In 691 patients, travoprost BAK-free demonstrated improved mean OSDI scores compared to either latanoprost or bimatoprost (p < 0.0001). Patients having any baseline OSD symptoms (n = 235) demonstrated significant improvement after switching to travoprost BAK-free (p < 0.0001). In 70.2% of these patients, symptoms were reduced in severity by at least 1 level. After changing medications to travoprost BAK-free, mean intraocular pressure (IOP) was significantly decreased (p < 0.0001). Overall, 72.4% preferred travoprost BAK-free (p < 0.0001, travoprost BAK-free vs prior therapy). Travoprost BAK-free demonstrated less conjunctival hyperemia than either prior therapy (p < 0.0001). Conclusions Patients previously treated with a BAK-preserved prostaglandin analog who are changed to travoprost BAK-free have clinically and statistically significant improvement in their OSD symptoms, decreased hyperemia, and equal or better IOP control. PMID:19668762

  5. A Bayesian Approach for Image Segmentation with Shape Priors

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chang, Hang; Yang, Qing; Parvin, Bahram

    2008-06-20

    Color and texture have been widely used in image segmentation; however, their performance is often hindered by scene ambiguities, overlapping objects, or missing parts. In this paper, we propose an interactive image segmentation approach with shape prior models within a Bayesian framework. Interactive features, through mouse strokes, reduce ambiguities, and the incorporation of shape priors enhances quality of the segmentation where color and/or texture are not solely adequate. The novelties of our approach are in (i) formulating the segmentation problem in a well-defined Bayesian framework with multiple shape priors, (ii) efficiently estimating parameters of the Bayesian model, and (iii) multi-object segmentation through user-specified priors. We demonstrate the effectiveness of our method on a set of natural and synthetic images.

  6. Evoked prior learning experience and approach to learning as predictors of academic achievement.

    PubMed

    Trigwell, Keith; Ashwin, Paul; Millan, Elena S

    2013-09-01

    In separate studies and research from different perspectives, five factors are found to be among those related to higher quality outcomes of student learning (academic achievement). Those factors are higher self-efficacy, deeper approaches to learning, higher quality teaching, students' perceptions that their workload is appropriate, and greater learning motivation. University learning improvement strategies have been built on these research results. To investigate how students' evoked prior experience, perceptions of their learning environment, and their approaches to learning collectively contribute to academic achievement. This is the first study to investigate motivation and self-efficacy in the same educational context as conceptions of learning, approaches to learning and perceptions of the learning environment. Undergraduate students (773) from the full range of disciplines were part of a group of over 2,300 students who volunteered to complete a survey of their learning experience. On completing their degrees 6 and 18 months later, their academic achievement was matched with their learning experience survey data. A 77-item questionnaire was used to gather students' self-report of their evoked prior experience (self-efficacy, learning motivation, and conceptions of learning), perceptions of learning context (teaching quality and appropriate workload), and approaches to learning (deep and surface). Academic achievement was measured using the English honours degree classification system. Analyses were conducted using correlational and multi-variable (structural equation modelling) methods. The results from the correlation methods confirmed those found in numerous earlier studies. The results from the multi-variable analyses indicated that surface approach to learning was the strongest predictor of academic achievement, with self-efficacy and motivation also found to be directly related. 
In contrast to the correlation results, a deep approach to learning was not related to academic achievement, and teaching quality and conceptions of learning were only indirectly related to achievement. Research aimed at understanding how students experience their learning environment and how that experience relates to the quality of their learning needs to be conducted using a wider range of variables and more sophisticated analytical methods. In this study of one context, some of the relations found in earlier bivariate studies, and on which learning intervention strategies have been built, are not confirmed when more holistic teaching-learning contexts are analysed using multi-variable methods. © 2012 The British Psychological Society.

  7. Bias in the physical examination of patients with lumbar radiculopathy

    PubMed Central

    2010-01-01

    Background No prior studies have examined systematic bias in the musculoskeletal physical examination. The objective of this study was to assess the effects of bias due to prior knowledge of lumbar spine magnetic resonance imaging (MRI) findings on perceived diagnostic accuracy of the physical examination for lumbar radiculopathy. Methods This was a cross-sectional comparison of the performance characteristics of the physical examination with blinding to MRI results (the 'independent group') versus without blinding to MRI results (the 'non-independent group'). The reference standard was the final diagnostic impression of nerve root impingement by the examining physician. Subjects were recruited from a hospital-based outpatient specialty spine clinic. All adults aged 18 and older presenting with lower extremity radiating pain of duration ≤ 12 weeks were evaluated for participation. 154 consecutively recruited subjects with lumbar disk herniation confirmed by lumbar spine MRI were included in this study. Sensitivities and specificities with 95% confidence intervals were calculated in the independent and non-independent groups for the four components of the radiculopathy examination: 1) provocative testing, 2) motor strength testing, 3) pinprick sensory testing, and 4) deep tendon reflex testing. Results The perceived sensitivity of sensory testing was higher with prior knowledge of MRI results (20% vs. 36%; p = 0.05). Sensitivities and specificities for exam components otherwise showed no statistically significant differences between groups. Conclusions Prior knowledge of lumbar MRI results may introduce bias into the pinprick sensory testing component of the physical examination for lumbar radiculopathy. No statistically significant effect of bias was seen for other components of the physical examination. 
The effect of bias due to prior knowledge of lumbar MRI results should be considered when an isolated sensory deficit on examination is used in medical decision-making. Further studies of bias should include surgical clinic populations and other common diagnoses including shoulder, knee and hip pathology. PMID:21118558
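
    The performance measures above can be sketched numerically. The following computes a sensitivity, a specificity, and Wilson score 95% confidence intervals from a 2x2 table; the counts are hypothetical (chosen so that the 36% sensory-testing sensitivity is reproduced), not the study's data.

```python
import math

def wilson_ci(successes, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = successes / n
    denom = 1 + z ** 2 / n
    center = (p + z ** 2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z ** 2 / (4 * n ** 2))
    return center - half, center + half

def sens_spec(tp, fn, tn, fp):
    """Sensitivity and specificity from a 2x2 table of test vs. reference standard."""
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts for one exam component (not the study's data):
sens, spec = sens_spec(tp=18, fn=32, tn=80, fp=24)
lo, hi = wilson_ci(18, 50)
print(f"sensitivity={sens:.2f} (95% CI {lo:.2f}-{hi:.2f}), specificity={spec:.2f}")
```

    With only 50 reference-positive subjects, the Wilson interval spans roughly 0.24-0.50, which illustrates why a difference such as 20% vs. 36% can sit at the edge of statistical significance.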

  8. Detection of Single Standing Dead Trees from Aerial Color Infrared Imagery by Segmentation with Shape and Intensity Priors

    NASA Astrophysics Data System (ADS)

    Polewski, P.; Yao, W.; Heurich, M.; Krzystek, P.; Stilla, U.

    2015-03-01

    Standing dead trees, known as snags, are an essential factor in maintaining biodiversity in forest ecosystems. Combined with their role as carbon sinks, this makes for a compelling reason to study their spatial distribution. This paper presents an integrated method to detect and delineate individual dead tree crowns from color infrared aerial imagery. Our approach consists of two steps that incorporate statistical information about prior distributions of both the image intensities and the shapes of the target objects. In the first step, we perform a Gaussian Mixture Model clustering in the pixel color space with priors on the cluster means, obtaining up to 3 components corresponding to dead trees, living trees, and shadows. We then refine the dead tree regions using a level set segmentation method enriched with a generative model of the dead trees' shape distribution as well as a discriminative model of their pixel intensity distribution. The iterative application of the statistical shape template yields the set of delineated dead crowns. The prior information enforces the consistency of the template's shape variation with the shape manifold defined by manually labeled training examples, which makes it possible to separate crowns located in close proximity and prevents the formation of large crown clusters. Also, the statistical information built into the segmentation gives rise to an implicit detection scheme, because the shape template evolves towards an empty contour if not enough evidence for the object is present in the image. We test our method on 3 sample plots from the Bavarian Forest National Park with reference data obtained by manually marking individual dead tree polygons in the images. Our results are scenario-dependent and range from a correctness/completeness of 0.71/0.81 up to 0.77/1.0, with an average center-of-gravity displacement of 3-5 pixels between the detected and reference polygons.
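
    The clustering step, a Gaussian mixture fit with priors on the cluster means, can be sketched in one dimension with a MAP-EM update that shrinks each cluster's sample mean toward its prior mean. This is a toy analogue of the pixel-color clustering, with invented intensities, fixed component variances, and equal weights; it is not the paper's implementation.

```python
import math, random

def map_em_gmm(xs, mu_prior, tau2=0.05, sigma2=0.04, iters=50):
    """1-D two-component Gaussian mixture fit by EM with Gaussian priors
    N(mu_prior[k], tau2) on the cluster means. Component variances (sigma2)
    and mixing weights are held fixed for brevity."""
    mu = list(mu_prior)                      # start at the prior means
    for _ in range(iters):
        # E-step: responsibilities of each component for each point
        resp = []
        for x in xs:
            d = [math.exp(-(x - m) ** 2 / (2 * sigma2)) for m in mu]
            s = sum(d)
            resp.append([di / s for di in d])
        # M-step: MAP update shrinks each sample mean toward its prior mean
        for k in range(len(mu)):
            nk = sum(r[k] for r in resp)
            xbar = sum(r[k] * x for r, x in zip(resp, xs)) / nk
            mu[k] = (nk / sigma2 * xbar + mu_prior[k] / tau2) / (nk / sigma2 + 1 / tau2)
    return mu

# Hypothetical intensities: "dead tree" pixels near 0.8, "shadow" pixels near 0.2
rng = random.Random(3)
data = [rng.gauss(0.8, 0.1) for _ in range(100)] + [rng.gauss(0.2, 0.1) for _ in range(100)]
mus = map_em_gmm(data, mu_prior=[0.75, 0.25])
print(mus)
```

    With well-separated clusters the data dominate and the fitted means land near 0.8 and 0.2; with few points, the prior means would pull the estimates toward 0.75 and 0.25.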

  9. Comparing interval estimates for small sample ordinal CFA models

    PubMed Central

    Natesan, Prathiba

    2015-01-01

    Robust maximum likelihood (RML) and asymptotically generalized least squares (AGLS) methods have been recommended for fitting ordinal structural equation models. Studies show that some of these methods underestimate standard errors. However, these studies have not investigated the coverage and bias of interval estimates. An estimate with a reasonable standard error could still be severely biased. This can only be known by systematically investigating the interval estimates. The present study compares Bayesian, RML, and AGLS interval estimates of factor correlations in ordinal confirmatory factor analysis (CFA) models for small sample data. Six sample sizes, 3 factor correlations, and 2 factor score distributions (multivariate normal and multivariate mildly skewed) were studied. Two Bayesian prior specifications, informative and relatively less informative, were studied. Undercoverage of confidence intervals and underestimation of standard errors were common in non-Bayesian methods. Underestimated standard errors may lead to inflated Type-I error rates. Non-Bayesian intervals were more often positively biased than negatively biased; that is, most intervals that did not contain the true value were greater than the true value. Some non-Bayesian methods had non-converging and inadmissible solutions for small samples and non-normal data. Bayesian empirical standard error estimates for informative and relatively less informative priors were closer to the average standard errors of the estimates. The coverage of Bayesian credibility intervals was closer to what was expected, with overcoverage in a few cases. Although some Bayesian credibility intervals were wider, they reflected the statistical uncertainty that comes with the data (e.g., small sample). Bayesian point estimates were also more accurate than non-Bayesian estimates. 
The results illustrate the importance of analyzing coverage and bias of interval estimates, and how ignoring interval estimates can be misleading. Therefore, editors and policymakers should continue to emphasize the inclusion of interval estimates in research. PMID:26579002
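
    Coverage of an interval estimator can only be judged by simulation of the kind this study performs. As an illustration (a much simpler setting than ordinal CFA), the sketch below estimates the empirical coverage of a plain z-interval for a normal mean at n = 10, where undercoverage relative to the nominal 95% appears.

```python
import random, statistics, math

def coverage(n=10, true_mu=0.0, sigma=1.0, reps=2000, z=1.96, seed=1):
    """Monte Carlo estimate of the empirical coverage of a z-interval for the
    mean. For small n the z-interval undercovers its nominal level, analogous
    to the undercoverage reported for the non-Bayesian interval estimates."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(reps):
        x = [rng.gauss(true_mu, sigma) for _ in range(n)]
        m = statistics.fmean(x)
        se = statistics.stdev(x) / math.sqrt(n)
        if m - z * se <= true_mu <= m + z * se:
            hits += 1
    return hits / reps

cov = coverage()
print(round(cov, 3))  # typically below the nominal 0.95 for n = 10
```

    The same loop, with the estimator swapped in, is how coverage and interval bias are tabulated in simulation studies of this kind.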

  10. Comparing interval estimates for small sample ordinal CFA models.

    PubMed

    Natesan, Prathiba

    2015-01-01

    Robust maximum likelihood (RML) and asymptotically generalized least squares (AGLS) methods have been recommended for fitting ordinal structural equation models. Studies show that some of these methods underestimate standard errors. However, these studies have not investigated the coverage and bias of interval estimates. An estimate with a reasonable standard error could still be severely biased. This can only be known by systematically investigating the interval estimates. The present study compares Bayesian, RML, and AGLS interval estimates of factor correlations in ordinal confirmatory factor analysis (CFA) models for small sample data. Six sample sizes, 3 factor correlations, and 2 factor score distributions (multivariate normal and multivariate mildly skewed) were studied. Two Bayesian prior specifications, informative and relatively less informative, were studied. Undercoverage of confidence intervals and underestimation of standard errors were common in non-Bayesian methods. Underestimated standard errors may lead to inflated Type-I error rates. Non-Bayesian intervals were more often positively biased than negatively biased; that is, most intervals that did not contain the true value were greater than the true value. Some non-Bayesian methods had non-converging and inadmissible solutions for small samples and non-normal data. Bayesian empirical standard error estimates for informative and relatively less informative priors were closer to the average standard errors of the estimates. The coverage of Bayesian credibility intervals was closer to what was expected, with overcoverage in a few cases. Although some Bayesian credibility intervals were wider, they reflected the statistical uncertainty that comes with the data (e.g., small sample). Bayesian point estimates were also more accurate than non-Bayesian estimates. 
The results illustrate the importance of analyzing coverage and bias of interval estimates, and how ignoring interval estimates can be misleading. Therefore, editors and policymakers should continue to emphasize the inclusion of interval estimates in research.

  11. Association between prostate cancer and urinary calculi: a population-based study.

    PubMed

    Chung, Shiu-Dong; Liu, Shih-Ping; Lin, Herng-Ching

    2013-01-01

    Understanding the reasons underlying the emerging trend and the changing demographics of Asian prostate cancer (PC) has become an important field of study. This study set out to explore the possibility that urinary calculi (UC) and PC may share an association by conducting a case-control study on a population-based database in Taiwan. The cases of this study included 2,900 subjects ≥40 years old who had received their first-time diagnosis of PC and 14,500 randomly selected controls without PC. Conditional logistic regressions were employed to explore the association between PC and having been previously diagnosed with UC. Prior UC was found among 608 (21.0%) cases and 2,037 (14.1%) controls (p<0.001). Conditional logistic regression analysis revealed that compared to controls, the odds ratio (OR) of prior UC for cases was 1.63 (95% CI = 1.47-1.80). Furthermore, we found that cases were more likely than controls to have been previously diagnosed with kidney calculus (OR = 1.71; 95% CI = 1.42-2.05), bladder calculus (OR = 2.06; 95% CI = 1.32-3.23), unspecified calculus (OR = 1.66; 95% CI = 1.37-2.00), and UC at ≥2 locations (OR = 1.73; 95% CI = 1.47-2.02). However, there was no significant relationship between PC and prior ureter calculus. Among patients with UC, PC risk did not differ significantly by UC treatment method. This investigation detected an association between PC and prior UC. These results highlight a potential target population for PC screening.
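
    The headline association can be reproduced approximately from the counts reported in the abstract. The sketch below computes the crude (unconditional) odds ratio with a Woolf 95% confidence interval; it lands close to the reported conditional estimate of 1.63 (95% CI = 1.47-1.80), which additionally adjusts for the matching.

```python
import math

def odds_ratio(a, b, c, d):
    """Crude odds ratio with Woolf (log-based) 95% CI for a 2x2 table:
    a = exposed cases, b = unexposed cases,
    c = exposed controls, d = unexposed controls."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = or_ * math.exp(-1.96 * se_log)
    hi = or_ * math.exp(1.96 * se_log)
    return or_, lo, hi

# Counts from the abstract: 608/2,900 cases and 2,037/14,500 controls with prior UC
or_, lo, hi = odds_ratio(608, 2900 - 608, 2037, 14500 - 2037)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```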

  12. Outdoor fine particles and nonfatal strokes: systematic review and meta-analysis.

    PubMed

    Shin, Hwashin H; Fann, Neal; Burnett, Richard T; Cohen, Aaron; Hubbell, Bryan J

    2014-11-01

    Epidemiologic studies find that long- and short-term exposure to fine particles (PM2.5) is associated with adverse cardiovascular outcomes, including ischemic and hemorrhagic strokes. However, few systematic reviews or meta-analyses have synthesized these results. We reviewed epidemiologic studies that estimated the risks of nonfatal strokes attributable to ambient PM2.5. To pool risks among studies, we used a random-effects model and 2 Bayesian approaches. The first Bayesian approach assumes a normal prior that allows risks to be zero, positive, or negative. The second assumes a gamma prior, where risks can only be positive. This second approach is proposed when the number of studies pooled is small and there is toxicological or clinical literature to support a causal relation. We identified 20 studies suitable for quantitative meta-analysis. Evidence for publication bias is limited. The frequentist meta-analysis produced pooled risk ratios of 1.06 (95% confidence interval = 1.00-1.13) and 1.007 (1.003-1.010) for long- and short-term effects, respectively. The Bayesian meta-analysis found a posterior mean risk ratio of 1.08 (95% posterior interval = 0.96-1.26) and 1.008 (1.003-1.013) from a normal prior, and of 1.05 (1.02-1.10) and 1.008 (1.004-1.013) from a gamma prior, for long- and short-term effects, respectively, per 10 μg/m³ PM2.5. Sufficient evidence exists to develop a concentration-response relation for short- and long-term exposures to PM2.5 and stroke incidence. Long-term exposures to PM2.5 result in a higher risk ratio than short-term exposures, regardless of the pooling method. The evidence for short-term PM2.5-related ischemic stroke is especially strong.
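
    The frequentist random-effects pooling referred to above is commonly done with the DerSimonian-Laird estimator. A minimal sketch with invented per-study risk ratios (not the review's data):

```python
import math

def dersimonian_laird(log_rr, se):
    """Random-effects pooling of per-study log risk ratios (DerSimonian-Laird).
    Returns the pooled RR and its 95% CI on the ratio scale."""
    w = [1 / s ** 2 for s in se]
    fixed = sum(wi * yi for wi, yi in zip(w, log_rr)) / sum(w)
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, log_rr))
    df = len(log_rr) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)            # between-study variance
    w_re = [1 / (s ** 2 + tau2) for s in se]
    pooled = sum(wi * yi for wi, yi in zip(w_re, log_rr)) / sum(w_re)
    se_pooled = math.sqrt(1 / sum(w_re))
    return (math.exp(pooled),
            math.exp(pooled - 1.96 * se_pooled),
            math.exp(pooled + 1.96 * se_pooled))

# Hypothetical per-study risk ratios and standard errors (illustrative only):
rr, lo, hi = dersimonian_laird([math.log(x) for x in (1.02, 1.25, 0.95)],
                               [0.03, 0.06, 0.05])
print(f"pooled RR = {rr:.3f} (95% CI {lo:.3f}-{hi:.3f})")
```

    When the heterogeneity statistic q does not exceed its degrees of freedom, tau2 is truncated to zero and the estimator reduces to fixed-effect pooling.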

  13. The inSIGHT study: costs and effects of routine hysteroscopy prior to a first IVF treatment cycle. A randomised controlled trial

    PubMed Central

    2012-01-01

    Background In in vitro fertilization (IVF) and intracytoplasmic sperm injection (ICSI) treatment, a large drop-off occurs between embryo transfer and the occurrence of pregnancy. The implantation rate per embryo transferred is only 30%. Studies have shown that minor intrauterine abnormalities can be found in 11–45% of infertile women with a normal transvaginal sonography or hysterosalpingography. Two randomised controlled trials have indicated that detection and treatment of these abnormalities by office hysteroscopy after two failed IVF cycles leads to a 9–13% increase in pregnancy rate. Therefore, screening of all infertile women for intracavitary pathology prior to the start of IVF/ICSI is increasingly advocated. In the absence of a scientific basis for such a policy, this study will assess the effects and costs of screening for and treatment of unsuspected intrauterine abnormalities by routine office hysteroscopy, with or without saline infusion sonography (SIS), prior to a first IVF/ICSI cycle. Methods/design Multicenter randomised controlled trial in asymptomatic subfertile women, indicated for a first IVF/ICSI treatment cycle, with normal findings at transvaginal sonography. Women with recurrent miscarriages, prior hysteroscopy treatment and intermenstrual blood loss will not be included. Participants will be randomised for a routine fertility work-up with additional (SIS and) hysteroscopy with on-the-spot-treatment of predefined intrauterine abnormalities versus the regular fertility work-up without additional diagnostic tests. The primary study outcome is the cumulative ongoing pregnancy rate resulting in live birth achieved within 18 months of IVF/ICSI treatment after randomisation. Secondary study outcome parameters are the cumulative implantation rate; cumulative miscarriage rate; patient preference and patient tolerance of a SIS and hysteroscopy procedure. 
All data will be analysed according to the intention-to-treat principle, using univariate and multivariate logistic regression and Cox regression. Cost-effectiveness analysis will be performed to evaluate the costs of the additional tests as a routine procedure. In total, 700 patients will be included in this study. Discussion The results of this study will help to clarify the significance of hysteroscopy prior to IVF treatment. Trial registration NCT01242852 PMID:22873367

  14. Methods to control for unmeasured confounding in pharmacoepidemiology: an overview.

    PubMed

    Uddin, Md Jamal; Groenwold, Rolf H H; Ali, Mohammed Sanni; de Boer, Anthonius; Roes, Kit C B; Chowdhury, Muhammad A B; Klungel, Olaf H

    2016-06-01

    Background Unmeasured confounding is one of the principal problems in pharmacoepidemiologic studies. Several methods have been proposed to detect or control for unmeasured confounding either at the study design phase or the data analysis phase. Aim of the Review To provide an overview of commonly used methods to detect or control for unmeasured confounding and to provide recommendations for proper application in pharmacoepidemiology. Methods/Results Methods to control for unmeasured confounding in the design phase of a study are case-only designs (e.g., case-crossover, case-time-control, self-controlled case series) and the prior event rate ratio adjustment method. Methods that can be applied in the data analysis phase include the negative control method, the perturbation variable method, instrumental variable methods, sensitivity analysis, and ecological analysis. A separate group of methods are those in which additional information on confounders is collected from a substudy. The latter group includes external adjustment, propensity score calibration, two-stage sampling, and multiple imputation. Conclusion As the performance and application of the methods to handle unmeasured confounding may differ across studies and across databases, we stress the importance of using both statistical evidence and substantial clinical knowledge for interpretation of the study results.

  15. Uncertainty plus prior equals rational bias: an intuitive Bayesian probability weighting function.

    PubMed

    Fennell, John; Baddeley, Roland

    2012-10-01

    Empirical research has shown that when making choices based on probabilistic options, people behave as if they overestimate small probabilities, underestimate large probabilities, and treat positive and negative outcomes differently. These distortions have been modeled using a nonlinear probability weighting function, which is found in several nonexpected utility theories, including rank-dependent models and prospect theory; here, we propose a Bayesian approach to the probability weighting function and, with it, a psychological rationale. In the real world, uncertainty is ubiquitous and, accordingly, the optimal strategy is to combine probability statements with prior information using Bayes' rule. First, we show that any reasonable prior on probabilities leads to 2 of the observed effects: overweighting of low probabilities and underweighting of high probabilities. We then investigate 2 plausible kinds of priors: informative priors based on previous experience and uninformative priors of ignorance. Individually, these priors potentially lead to large problems of bias and inefficiency, respectively; however, when combined using Bayesian model comparison methods, both forms of prior can be applied adaptively, gaining the efficiency of empirical priors and the robustness of ignorance priors. We illustrate this for the simple case of generic good and bad options, using Internet blogs to estimate the relevant priors of inference. Given this combined ignorant/informative prior, the Bayesian probability weighting function is not only robust and efficient but also matches all of the major characteristics of the distortions found in empirical research. PsycINFO Database Record (c) 2012 APA, all rights reserved.
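
    The core claim, that combining a stated probability with any reasonable prior via Bayes' rule yields overweighting of small probabilities and underweighting of large ones, can be illustrated with a toy posterior-mean weighting function (a sketch under simplified assumptions, not the paper's exact model):

```python
def weight(p, n=10, a=1.0, b=1.0):
    """Posterior-mean probability after treating a stated probability p as
    n pseudo-observations and combining it with a Beta(a, b) prior.
    Posterior mean of a Beta-Binomial update: (n*p + a) / (n + a + b)."""
    return (n * p + a) / (n + a + b)

for p in (0.01, 0.5, 0.99):
    print(p, round(weight(p), 3))
# Small probabilities are pulled up and large ones pulled down, mirroring
# the over/underweighting pattern described above; p = 0.5 is unchanged
# under the symmetric Beta(1, 1) prior.
```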

  16. Network inference using informative priors

    PubMed Central

    Mukherjee, Sach; Speed, Terence P.

    2008-01-01

    Recent years have seen much interest in the study of systems characterized by multiple interacting components. A class of statistical models called graphical models, in which graphs are used to represent probabilistic relationships between variables, provides a framework for formal inference regarding such systems. In many settings, the object of inference is the network structure itself. This problem of “network inference” is well known to be a challenging one. However, in scientific settings there is very often existing information regarding network connectivity. A natural idea then is to take account of such information during inference. This article addresses the question of incorporating prior information into network inference. We focus on directed models called Bayesian networks, and use Markov chain Monte Carlo to draw samples from posterior distributions over network structures. We introduce prior distributions on graphs capable of capturing information regarding network features including edges, classes of edges, degree distributions, and sparsity. We illustrate our approach in the context of systems biology, applying our methods to network inference in cancer signaling. PMID:18799736
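
    A toy version of the sampling scheme described above: Metropolis-Hastings over edge sets, with an additive log-prior combining a sparsity penalty and a bonus for edges with prior support. For brevity this sketch ignores the acyclicity constraint of Bayesian networks and samples from the prior alone (no data likelihood); all names and weights are illustrative.

```python
import random, math

def log_prior(edges, lam=1.0, favored=frozenset()):
    """Log prior over graph structures: a sparsity penalty on the number of
    edges plus a bonus for edges with existing (e.g., biological) support."""
    return -lam * len(edges) + 0.5 * sum(1 for e in edges if e in favored)

def mh_sample(nodes, steps=20000, seed=0, **kw):
    """Metropolis-Hastings over edge sets: propose toggling one random edge
    (a symmetric proposal), accept with probability min(1, prior ratio).
    Returns estimated inclusion probabilities per directed edge."""
    rng = random.Random(seed)
    candidates = [(i, j) for i in nodes for j in nodes if i != j]
    edges, lp = set(), log_prior(set(), **kw)
    counts = {c: 0 for c in candidates}
    for _ in range(steps):
        e = rng.choice(candidates)
        proposal = set(edges) ^ {e}          # toggle edge e
        lp_new = log_prior(proposal, **kw)
        if math.log(rng.random()) < lp_new - lp:
            edges, lp = proposal, lp_new
        for present in edges:
            counts[present] += 1
    return {c: k / steps for c, k in counts.items()}

marg = mh_sample(("A", "B", "C"), favored=frozenset({("A", "B")}))
print(marg[("A", "B")], marg[("B", "C")])
# The favored edge has a higher inclusion probability under the prior.
```

    In a full implementation the acceptance ratio would also include the marginal likelihood of the data under each structure; the edge-wise prior enters exactly as above.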

  17. Network inference using informative priors.

    PubMed

    Mukherjee, Sach; Speed, Terence P

    2008-09-23

    Recent years have seen much interest in the study of systems characterized by multiple interacting components. A class of statistical models called graphical models, in which graphs are used to represent probabilistic relationships between variables, provides a framework for formal inference regarding such systems. In many settings, the object of inference is the network structure itself. This problem of "network inference" is well known to be a challenging one. However, in scientific settings there is very often existing information regarding network connectivity. A natural idea then is to take account of such information during inference. This article addresses the question of incorporating prior information into network inference. We focus on directed models called Bayesian networks, and use Markov chain Monte Carlo to draw samples from posterior distributions over network structures. We introduce prior distributions on graphs capable of capturing information regarding network features including edges, classes of edges, degree distributions, and sparsity. We illustrate our approach in the context of systems biology, applying our methods to network inference in cancer signaling.

  18. Normal-faulting stress state associated with low differential stress in an overriding plate in northeast Japan prior to the 2011 Mw 9.0 Tohoku earthquake

    NASA Astrophysics Data System (ADS)

    Otsubo, Makoto; Miyakawa, Ayumu; Imanishi, Kazutoshi

    2018-03-01

    Spatial and temporal variations in inland crustal stress prior to the 2011 Mw 9.0 Tohoku earthquake are investigated using focal mechanism solutions for shallow seismicity in Iwaki City, Japan. The multiple inverse method of stress tensor inversion detected two normal-faulting stress states that dominate in different regions. The stress field around Iwaki City changed from a NNW-SSE-trending triaxial extensional stress (stress regime A) to a NW-SE-trending axial tension (stress regime B) between 2005 and 2008. These stress changes may be the result of accumulated extensional stress associated with co- and post-seismic deformation due to the M7 class earthquakes. In this study we suggest that the stress state around Iwaki City prior to the 2011 Tohoku earthquake may have been extensional with a low differential stress. High pore pressure is required to cause earthquakes under such small differential stresses.

  19. Violence Against Women in Mexico: A Study of Abuse Before and During Pregnancy

    PubMed Central

    Castro, Roberto; Peek-Asa, Corinne; Ruiz, Agustin

    2003-01-01

    Objective. We identified the prevalence and types of violence experienced by pregnant women, the ways victimization changed during pregnancy from the year prior to pregnancy, and factors associated with violence during pregnancy. Methods. We interviewed 914 pregnant women treated in health clinics in Mexico about violence during and prior to pregnancy, violence during childhood and against their own children, and other socioeconomic indicators. Results. Approximately one quarter of the women experienced violence during pregnancy. The severity of emotional violence increased during pregnancy, whereas physical and sexual violence decreased. The strongest predictors of abuse were violence prior to pregnancy, low socioeconomic status, parental violence witnessed by women in childhood, and violence in the abusive partner’s childhood. The probability of violence during pregnancy for women experiencing all of these factors was 61%. Conclusions. Violence is common among pregnant women, but pregnancy does not appear to be an initiating factor. Intergenerational violence is highly predictive of violence during pregnancy. PMID:12835194

  20. Bayesian image reconstruction for improving detection performance of muon tomography.

    PubMed

    Wang, Guobao; Schultz, Larry J; Qi, Jinyi

    2009-05-01

    Muon tomography is a novel technology that is being developed for detecting high-Z materials in vehicles or cargo containers. Maximum likelihood methods have been developed for reconstructing the scattering density image from muon measurements. However, the instability of maximum likelihood estimation often results in noisy images and low detectability of high-Z targets. In this paper, we propose using regularization to improve the image quality of muon tomography. We formulate the muon reconstruction problem in a Bayesian framework by introducing a prior distribution on scattering density images. An iterative shrinkage algorithm is derived to maximize the log posterior distribution. At each iteration, the algorithm obtains the maximum a posteriori update by shrinking an unregularized maximum likelihood update. Inverse quadratic shrinkage functions are derived for generalized Laplacian priors and inverse cubic shrinkage functions are derived for generalized Gaussian priors. Receiver operating characteristic studies using simulated data demonstrate that the Bayesian reconstruction can greatly improve the detection performance of muon tomography.
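
    The shrink-the-ML-update structure can be illustrated with the classic soft-threshold shrinkage that arises from a plain (non-generalized) Laplacian prior; the paper derives inverse quadratic and inverse cubic shrinkage functions for its generalized priors, but the per-iteration pattern of shrinking an unregularized maximum likelihood update is the same.

```python
def soft_shrink(x, t):
    """Soft-threshold shrinkage: the MAP update for a Laplacian prior pulls
    the unregularized ML update x toward zero by threshold t, zeroing small
    values outright (which is what suppresses noise in the image)."""
    return max(x - t, 0.0) if x > 0 else min(x + t, 0.0)

# Hypothetical unregularized ML updates for three voxels (illustrative only):
shrunk = [soft_shrink(x, 0.1) for x in (0.05, -0.3, 2.4)]
print([round(v, 10) for v in shrunk])
# The small update is zeroed; the large ones are reduced by the threshold.
```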

  1. Methods of and apparatus for recording images occurring just prior to a rapid, random event

    DOEpatents

    Kelley, Edward F.

    1994-01-01

    An apparatus and a method are disclosed for recording images of events in a medium wherein the images that are recorded are of conditions existing just prior to and during the occurrence of an event that triggers recording of these images. The apparatus and method use an optical delay path that employs a spherical focusing mirror facing a circular array of flat return mirrors around a central flat mirror. The image is reflected in a symmetric pattern that balances the astigmatism created by the spherical mirror. Delays on the order of hundreds of nanoseconds are possible.

  2. The Use of Virtual Reality Computer Simulation in Learning Port-A Cath Injection

    ERIC Educational Resources Information Center

    Tsai, Sing-Ling; Chai, Sin-Kuo; Hsieh, Li-Feng; Lin, Shirling; Taur, Fang-Meei; Sung, Wen-Hsu; Doong, Ji-Liang

    2008-01-01

    Cost-benefit management trends in Taiwan healthcare settings have led nurses to perform more invasive skills, such as Port-A cath administration of medications. Accordingly, nurses must be well prepared prior to instruction by the mentor-and-supervision method. The purpose of the current study was to develop a computer-assisted protocol using virtual…

  3. Long-Term Stability of Membership in a Wechsler Intelligence Scale for Children--Third Edition (WISC-III) Subtest Core Profile Taxonomy

    ERIC Educational Resources Information Center

    Borsuk, Ellen R.; Watkins, Marley W.; Canivez, Gary L.

    2006-01-01

    Although often applied in practice, clinically based cognitive subtest profile analysis has failed to achieve empirical support. Nonlinear multivariate subtest profile analysis may have benefits over clinically based techniques, but the psychometric properties of these methods must be studied prior to their implementation and interpretation. The…

  4. The Effects of an Integrated Reading Comprehension Strategy: A Culturally Responsive Teaching Approach for Fifth-Grade Students' Reading Comprehension

    ERIC Educational Resources Information Center

    Bui, Yvonne N.; Fagan, Yvette M.

    2013-01-01

    The study evaluated the effects of the Integrated Reading Comprehension Strategy on two levels. The Integrated Reading Comprehension Strategy integrated story grammar instruction and story maps, prior knowledge and prediction method, and word webs through a culturally responsive teaching framework; the Integrated Reading Comprehension Strategy…

  5. Exploring the Relationships between Facilitation Methods, Students' Sense of Community, and Their Online Behaviors

    ERIC Educational Resources Information Center

    Phirangee, Krystle; Epp, Carrie Demmans; Hewitt, Jim

    2016-01-01

    The popularity of online learning has boomed over the last few years, pushing instructors to consider the best ways to design their courses to support student learning needs and participation. Prior research suggests the need for instructor facilitation to provide this guidance and support, whereas other studies have suggested peer facilitation…

  6. Factors Influencing the Effectiveness of Note Taking on Computer-Based Graphic Organizers

    ERIC Educational Resources Information Center

    Crooks, Steven M.; White, David R.; Barnard, Lucy

    2007-01-01

    Previous research on graphic organizer (GO) note taking has shown that this method is most effective when the GO is presented to the student partially complete with provided notes. This study extended prior research by investigating the effects of provided note type (summary vs. verbatim) and GO bite size (large vs. small) on the transfer…

  7. Interpartner Conflict and Child Abuse Risk among African American and Latino Adolescent Parenting Couples

    ERIC Educational Resources Information Center

    Moore, David R.; Florsheim, Paul

    2008-01-01

    Objective: The goal of this study was to identify links between observed conflict interactions and risk for child abuse and harsh parenting among a multiethnic sample of adolescent mothers (14-19 years) and young fathers (14-24 years). Methods: Prior to childbirth (T1), observation-based relationship data were collected from 154 expectant…

  8. The Impact of Five Missing Data Treatments on a Cross-Classified Random Effects Model

    ERIC Educational Resources Information Center

    Hoelzle, Braden R.

    2012-01-01

    The present study compared the performance of five missing data treatment methods within a Cross-Classified Random Effects Model environment under various levels and patterns of missing data given a specified sample size. Prior research has shown the varying effect of missing data treatment options within the context of numerous statistical…

  9. Effects of Speech Practice on Fast Mapping in Monolingual and Bilingual Speakers

    ERIC Educational Resources Information Center

    Kan, Pui Fong; Sadagopan, Neeraja; Janich, Lauren; Andrade, Marixa

    2014-01-01

    Purpose: This study examines the effects of the levels of speech practice on fast mapping in monolingual and bilingual speakers. Method: Participants were 30 English-speaking monolingual and 30 Spanish-English bilingual young adults. Each participant was randomly assigned to 1 of 3 practice conditions prior to the fast-mapping task: (a) intensive…

  10. Drug Prevention by Increasing Self-Esteem: Influence of Teaching Approaches and Gender on Different Consumption Groups

    ERIC Educational Resources Information Center

    Heyne, Thomas; Bogner, Franz X.

    2013-01-01

    Our study focused on an educational intervention designed to increase the self-esteem of low-achieving eighth graders. The intervention was a substance-specific life skills program built upon teacher-centered versus student-centered teaching methods. A cluster analysis identified four consumption groups prior to the intervention: A potentially…

  11. Combining Primary Prevention and Risk Reduction Approaches in Sexual Assault Protection Programming

    ERIC Educational Resources Information Center

    Menning, Chadwick; Holtzman, Mellisa

    2015-01-01

    Objective: The object of this study is to extend prior evaluations of Elemental, a sexual assault protection program that combines primary prevention and risk reduction strategies within a single program. Participants and Methods: During 2012 and 2013, program group and control group students completed pretest, posttest, and 6-week and 6-month…

  12. Child Abuse in Blended Households: Reports from Runaway and Homeless Youth

    ERIC Educational Resources Information Center

    McRee, Nick

    2008-01-01

    Objective: Building upon prior research that reveals an elevated risk of abuse to children in blended households, the study considers whether risk of abuse varies by the type of non-related parent figure (i.e., stepparent, adoptive parent, or cohabiting adult) in residence. Method: A sample of 40,000 youths who sought services from runaway and…

  13. Creativity, the Individual and Society: A Teaching Case Study within a High-Technology Firm.

    ERIC Educational Resources Information Center

    Edelson, Paul J.

    An innovative method for teaching creativity and leadership to adults was presented to engineers and executives within a high-technology corporation who wished to overcome fear of failure and the inhibiting influences of stress within their industry. The methodology developed was based upon prior research conducted in the area of self-directed…

  14. Waterpipe Smoking among Students in One US University: Predictors of an Intention to Quit

    ERIC Educational Resources Information Center

    Abughosh, Susan; Wu, I-Hsuan; Rajan, Suja; Peters, Ronald J.; Essien, E. James

    2012-01-01

    Objective: To examine the intention to quit waterpipe smoking among college students. Participants: A total of 276 University of Houston students identified through an online survey administered in February 2011. Participants indicated they had smoked a waterpipe in the month prior to the survey. Methods: Cross-sectional study. Questions included…

  15. Field Suppression of the peachtree borer, Synanthedon exitiosa, using Steinernema carpocapsae: Effects of irrigation, a sprayable gel and application method

    USDA-ARS's Scientific Manuscript database

    The peachtree borer, Synanthedon exitiosa, is a major pest of stone fruit trees in North America. In prior studies, the entomopathogenic nematode, S. carpocapsae, caused substantial reductions in S. exitiosa damage when applied by watering can to peach trees that were irrigated regularly. Here we ...

  16. Promoting Uptake of the HPV Vaccine: The Knowledge and Views of School Staff

    ERIC Educational Resources Information Center

    Rose, Sally B.; Lanumata, Tolotea; Lawton, Beverley A.

    2011-01-01

    Background: School-based human papillomavirus (HPV)/cervical cancer vaccination programs have been implemented widely, but few studies have investigated the knowledge and views of school staff about this new vaccine. Methods: Prior to the introduction of the HPV vaccine in 2009, we surveyed staff at 14 socioeconomically diverse schools to assess…

  17. Talking to Learn: A Mixed-Methods Study of a Professional Development Program for Teachers of English Language Learners

    ERIC Educational Resources Information Center

    Shea, Lauren M.

    2012-01-01

    Most teachers of English language learners (ELLs) have had virtually no specialized, in-service training in adapting instruction for their students. Prior research fails to investigate the impact of professional development (PD) specifically designed for teachers of ELLs. This dissertation examines a PD program that attempted to prepare teachers…

  18. Gender Differences among Israeli Adolescents in Residential Drug Treatment

    ERIC Educational Resources Information Center

    Isralowitz, Richard; Reznik, Alex

    2007-01-01

    Aims: The use of licit and illicit drugs is considered to be primarily a male problem. Numerous studies, however, question the extent of gender differences. This article reports on last-30-day drug use and related problem behaviour among male and female youth prior to residential treatment. Methods: Self-report data were collected from 95 male and…

  19. Effects of two-stage and total vs. fence-line weaning on the physiology and performance of beef calves

    USDA-ARS's Scientific Manuscript database

    Calves weaned using a two-stage method where nursing is prevented between cow-calf pairs prior to separation (Stage 1) experience less weaning stress after separation (Stage 2) based on behavior and growth measures. The aim of this study was to document changes in various physiological measures of s...

  20. What Are Confidence Judgments Made of? Students' Explanations for Their Confidence Ratings and What that Means for Calibration

    ERIC Educational Resources Information Center

    Dinsmore, Daniel L.; Parkinson, Meghan M.

    2013-01-01

    Although calibration has been widely studied, questions remain about how best to capture confidence ratings, how to calculate continuous variable calibration indices, and on what exactly students base their reported confidence ratings. Undergraduates in a research methods class completed a prior knowledge assessment, two sets of readings and…
