Science.gov

Sample records for bayesian statistical control

  1. Philosophy and the practice of Bayesian statistics

    PubMed Central

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2015-01-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. PMID:22364575

  2. Philosophy and the practice of Bayesian statistics.

    PubMed

    Gelman, Andrew; Shalizi, Cosma Rohilla

    2013-02-01

    A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework.

  3. Bayesian versus 'plain-vanilla Bayesian' multitarget statistics

    NASA Astrophysics Data System (ADS)

    Mahler, Ronald P. S.

    2004-08-01

    Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems, and that FISST is therefore mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. Then I demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous (indeed, not even Bayesian), denigrates FISST concepts while unwittingly assuming them, and has resulted in a succession of algorithms afflicted by inherent -- but less than candidly acknowledged -- computational "logjams."

  4. Approximate Bayesian computation with functional statistics.

    PubMed

    Soubeyrand, Samuel; Carpentier, Florence; Guiton, François; Klein, Etienne K

    2013-03-26

    Functional statistics are commonly used to characterize spatial patterns in general and spatial genetic structures in population genetics in particular. Such functional statistics also enable the estimation of parameters of spatially explicit (and genetic) models. Recently, Approximate Bayesian Computation (ABC) has been proposed to estimate model parameters from functional statistics. However, applying ABC with functional statistics may be cumbersome because of the high dimension of the set of statistics and the dependencies among them. To tackle this difficulty, we propose an ABC procedure that relies on an optimized weighted distance between observed and simulated functional statistics. We applied this procedure to a simple step model, a spatial point process characterized by its pair correlation function, and a pollen dispersal model characterized by genetic differentiation as a function of distance. These applications showed how the optimized weighted distance improved estimation accuracy. In the discussion, we consider the application of the proposed ABC procedure to functional statistics characterizing non-spatial processes.
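
    As a concrete illustration of the idea, the sketch below runs plain ABC rejection sampling with a weighted Euclidean distance between observed and simulated summary statistics. The toy Normal data model, the quantile-function summary, the inverse-variance weights, and the tolerance rule are all stand-ins chosen for illustration; they are not the authors' step, point-process, or pollen dispersal models, nor their optimized weighting scheme.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def summary(x, probs=np.linspace(0.05, 0.95, 19)):
        """Functional summary statistic: empirical quantile function on a grid."""
        return np.quantile(x, probs)

    def simulate(theta, n=200):
        """Toy data model: Normal(mu, sigma); a stand-in for a spatial/genetic simulator."""
        mu, sigma = theta
        return rng.normal(mu, sigma, size=n)

    # "Observed" data generated from unknown parameters (here mu=1.0, sigma=2.0).
    obs = rng.normal(1.0, 2.0, size=200)
    s_obs = summary(obs)

    # Pilot simulations set the weights (inverse variance of each statistic),
    # a simple stand-in for the paper's optimized weighted distance.
    pilot = np.array([summary(simulate(rng.uniform([-5, 0.1], [5, 5]))) for _ in range(500)])
    weights = 1.0 / (pilot.var(axis=0) + 1e-12)

    def distance(s_sim, s_obs):
        return np.sqrt(np.sum(weights * (s_sim - s_obs) ** 2))

    # Tolerance chosen as the 5% quantile of pilot distances.
    tol = np.quantile([distance(summary(simulate(rng.uniform([-5, 0.1], [5, 5]))), s_obs)
                       for _ in range(1000)], 0.05)

    # ABC rejection sampling: keep parameter draws whose simulated summaries
    # fall within the tolerance of the observed summaries.
    accepted = []
    while len(accepted) < 500:
        theta = rng.uniform([-5, 0.1], [5, 5])        # uniform prior on (mu, sigma)
        if distance(summary(simulate(theta)), s_obs) <= tol:
            accepted.append(theta)

    posterior = np.array(accepted)
    print("approximate posterior mean (mu, sigma):", posterior.mean(axis=0))
    ```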

  5. Computational statistics using the Bayesian Inference Engine

    NASA Astrophysics Data System (ADS)

    Weinberg, Martin D.

    2013-09-01

    This paper introduces the Bayesian Inference Engine (BIE), a general, parallel, optimized software package for parameter inference and model selection. This package is motivated by the analysis needs of modern astronomical surveys and the need to organize and reuse expensive derived data. The BIE is the first platform for computational statistics designed explicitly to enable Bayesian update and model comparison for astronomical problems. Bayesian update is based on the representation of high-dimensional posterior distributions using metric-ball-tree based kernel density estimation. Among its algorithmic offerings, the BIE emphasizes hybrid tempered Markov chain Monte Carlo schemes that robustly sample multimodal posterior distributions in high-dimensional parameter spaces. Moreover, the BIE implements a full persistence or serialization system that stores the byte-level image of the running inference and previously characterized posterior distributions for later use. Two new algorithms to compute the marginal likelihood from the posterior distribution, developed for and implemented in the BIE, enable model comparison for complex models and data sets. Finally, the BIE was designed to be a collaborative platform for applying Bayesian methodology to astronomy. It includes an extensible, object-oriented framework that implements every aspect of the Bayesian inference workflow. By providing a variety of statistical algorithms for all phases of the inference problem, a scientist may explore a variety of approaches with a single model and data implementation. Additional technical details and downloads are available from http://www.astro.umass.edu/bie. The BIE is distributed under the GNU General Public License.

  6. Teaching Bayesian Statistics to Undergraduate Students through Debates

    ERIC Educational Resources Information Center

    Stewart, Sepideh; Stewart, Wayne

    2014-01-01

    This paper describes a lecturer's approach to teaching Bayesian statistics to students who were only exposed to the classical paradigm. The study shows how the lecturer extended himself by making use of ventriloquist dolls to grab hold of students' attention and embed important ideas in revealing the differences between the Bayesian and…

  7. A pleiotropy-informed Bayesian false discovery rate adapted to a shared control design finds new disease associations from GWAS summary statistics.

    PubMed

    Liley, James; Wallace, Chris

    2015-02-01

    Genome-wide association studies (GWAS) have been successful in identifying single nucleotide polymorphisms (SNPs) associated with many traits and diseases. However, at existing sample sizes, these variants explain only part of the estimated heritability. Leverage of GWAS results from related phenotypes may improve detection without the need for larger datasets. The Bayesian conditional false discovery rate (cFDR) constitutes an upper bound on the expected false discovery rate (FDR) across a set of SNPs whose p values for two diseases are both less than two disease-specific thresholds. Calculation of the cFDR requires only summary statistics and has several advantages over traditional GWAS analysis. However, existing methods require distinct control samples between studies. Here, we extend the technique to allow for some or all controls to be shared, increasing applicability. Several different SNP sets can be defined with the same cFDR value, and we show that the expected FDR across the union of these sets may exceed the expected FDR in any single set. We describe a procedure to establish an upper bound for the expected FDR among the union of such sets of SNPs. We apply our technique to pairwise analysis of p values from ten autoimmune diseases with variable sharing of controls, enabling discovery of 59 SNP-disease associations that do not reach GWAS significance after genomic control in individual datasets. Most of the SNPs we highlight have previously been confirmed using replication studies or larger GWAS, a useful validation of our technique; we report eight SNP-disease associations across five diseases not previously declared. Our technique extends and strengthens the previous algorithm, and establishes robust limits on the expected FDR. This approach can improve SNP detection in GWAS, and give insight into shared aetiology between phenotypically related conditions.
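
    The sketch below illustrates the basic empirical conditional FDR estimator for a pair of p-value vectors on simulated data. It shows only the core cFDR idea; the shared-control adjustment and the FDR bound over unions of rejection regions developed in the paper are not implemented here.

    ```python
    import numpy as np
    from scipy.stats import norm

    def empirical_cfdr(p1, p2):
        """Basic empirical conditional FDR for each SNP:
        cFDR_i ~ p1_i * #{j: p2_j <= p2_i} / #{j: p1_j <= p1_i and p2_j <= p2_i}.
        This simple estimator ignores any sharing of controls between the two studies."""
        p1, p2 = np.asarray(p1), np.asarray(p2)
        cfdr = np.empty_like(p1)
        for i in range(p1.size):
            joint = np.count_nonzero((p1 <= p1[i]) & (p2 <= p2[i]))
            cond = np.count_nonzero(p2 <= p2[i])
            cfdr[i] = min(1.0, p1[i] * cond / joint)
        return cfdr

    # Toy data: 5,000 SNPs, the first 100 associated with both traits.
    rng = np.random.default_rng(1)
    z1, z2 = rng.normal(size=5000), rng.normal(size=5000)
    z1[:100] += 3.0
    z2[:100] += 3.0
    p1, p2 = 2 * norm.sf(np.abs(z1)), 2 * norm.sf(np.abs(z2))

    cfdr = empirical_cfdr(p1, p2)
    print("SNPs with cFDR < 0.01:", int(np.sum(cfdr < 0.01)))
    ```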

  8. Bayesian Analysis of Order-Statistics Models for Ranking Data.

    ERIC Educational Resources Information Center

    Yu, Philip L. H.

    2000-01-01

    Studied the order-statistics models, extending the usual normal order-statistics model into one in which the underlying random variables followed a multivariate normal distribution. Used a Bayesian approach and the Gibbs sampling technique. Applied the proposed method to analyze presidential election data from the American Psychological…

  9. Bayesian statistics in environmental engineering planning

    SciTech Connect

    Englehardt, J.D.; Simon, T.W.

    1999-07-01

    Today's engineer must be able to quantify both uncertainty due to information limitations, and the variability of natural processes, in order to determine risk. Nowhere is this emphasis on risk assessment more evident than in environmental engineering. The use of Bayesian inference for the rigorous assessment of risk based on available information is reviewed in this paper. Several example environmental engineering planning applications are presented: (1) assessment of losses involving the evaluation of proposed revisions to the South Florida Building Code after Hurricane Andrew; (2) development of a model to predict oil spill consequences due to proposed changes in the oil transportation network in the Gulf of Mexico; (3) studies of ambient concentrations of perchloroethylene surrounding dry cleaners and of tire particulates in residential areas near roadways in Miami, FL; (4) risk assessment from contaminated soils at a cleanup of an old transformer dump site.

  10. A BAYESIAN STATISTICAL APPROACH FOR THE EVALUATION OF CMAQ

    EPA Science Inventory

    Bayesian statistical methods are used to evaluate Community Multiscale Air Quality (CMAQ) model simulations of sulfate aerosol over a section of the eastern US for 4-week periods in summer and winter 2001. The observed data come from two U.S. Environmental Protection Agency data ...

  11. Some Bayesian statistical techniques useful in estimating frequency and density

    USGS Publications Warehouse

    Johnson, D.H.

    1977-01-01

    This paper presents some elementary applications of Bayesian statistics to problems faced by wildlife biologists. Bayesian confidence limits for frequency of occurrence are shown to be generally superior to classical confidence limits. Population density can be estimated from frequency data if the species is sparsely distributed relative to the size of the sample plot. For other situations, limits are developed based on the normal distribution and prior knowledge that the density is non-negative, which ensures that the lower confidence limit is non-negative. Conditions are described under which Bayesian confidence limits are superior to those calculated with classical methods; examples are also given on how prior knowledge of the density can be used to sharpen inferences drawn from a new sample.
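
    A minimal sketch of the kind of Bayesian interval the paper advocates for frequency of occurrence, using a conjugate Beta prior; the Jeffreys prior and the example counts are illustrative choices, not values from the paper.

    ```python
    from scipy.stats import beta

    def bayes_frequency_interval(occupied, total, a=0.5, b=0.5, level=0.95):
        """Posterior credible interval for the frequency of occurrence.
        With a Beta(a, b) prior and x occupied plots out of n, the posterior is
        Beta(a + x, b + n - x). a = b = 0.5 is the Jeffreys prior."""
        post = beta(a + occupied, b + total - occupied)
        lo, hi = post.ppf((1 - level) / 2), post.ppf(1 - (1 - level) / 2)
        return post.mean(), (lo, hi)

    # Hypothetical example: species detected on 7 of 50 sample plots.
    mean, (lo, hi) = bayes_frequency_interval(7, 50)
    print(f"posterior mean frequency = {mean:.3f}, 95% credible interval = ({lo:.3f}, {hi:.3f})")
    ```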

  12. Human Motion Retrieval Based on Statistical Learning and Bayesian Fusion

    PubMed Central

    Xiao, Qinkun; Song, Ren

    2016-01-01

    A novel motion retrieval approach based on statistical learning and Bayesian fusion is presented. The approach includes two primary stages. (1) In the learning stage, fuzzy clustering is first used to obtain representative frames of the motions, and the gesture features of the motions are extracted to build a motion feature database. Based on this database and statistical learning, the probability distribution function of each motion class is obtained. (2) In the retrieval stage, the query motion's features are first extracted as in stage (1). Similarity measurements are then conducted using a novel method that combines category-based motion similarity distances with similarity distances based on canonical correlation analysis. The two motion distances are fused using Bayesian estimation, and the retrieval results are ranked according to the fused values. The effectiveness of the proposed method is verified experimentally. PMID:27732673

  13. Bayesian statistical analysis of protein side-chain rotamer preferences.

    PubMed Central

    Dunbrack, R. L.; Cohen, F. E.

    1997-01-01

    We present a Bayesian statistical analysis of the conformations of side chains in proteins from the Protein Data Bank. This is an extension of the backbone-dependent rotamer library, and includes rotamer populations and average chi angles for a full range of phi, psi values. The Bayesian analysis used here provides a rigorous statistical method for taking account of varying amounts of data. Bayesian statistics requires the assumption of a prior distribution for parameters over their range of possible values. This prior distribution can be derived from previous data or from pooling some of the present data. The prior distribution is combined with the data to form the posterior distribution, which is a compromise between the prior distribution and the data. For the chi 2, chi 3, and chi 4 rotamer prior distributions, we assume that the probability of each rotamer type is dependent only on the previous chi rotamer in the chain. For the backbone-dependence of the chi 1 rotamers, we derive prior distributions from the product of the phi-dependent and psi-dependent probabilities. Molecular mechanics calculations with the CHARMM22 potential show a strong similarity with the experimental distributions, indicating that proteins attain their lowest energy rotamers with respect to local backbone-side-chain interactions. The new library is suitable for use in homology modeling, protein folding simulations, and the refinement of X-ray and NMR structures. PMID:9260279
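
    The conjugate Dirichlet-multinomial update below sketches how prior pseudocounts and observed rotamer counts are combined into posterior probabilities, the "compromise between the prior distribution and the data" described above. The three-rotamer example, the prior frequencies, and the prior weight are invented for illustration and are not taken from the library itself.

    ```python
    import numpy as np

    def posterior_rotamer_probs(counts, prior_probs, prior_weight=20.0):
        """Dirichlet-multinomial update: posterior mean rotamer probabilities.
        counts       -- observed counts of each chi1 rotamer (e.g. g-, t, g+) in one phi/psi bin
        prior_probs  -- prior rotamer probabilities (e.g. backbone-independent frequencies)
        prior_weight -- effective number of prior observations (pseudocounts)."""
        counts = np.asarray(counts, dtype=float)
        alpha = prior_weight * np.asarray(prior_probs, dtype=float)   # Dirichlet prior
        post = alpha + counts                                          # Dirichlet posterior
        return post / post.sum()                                       # posterior mean

    # Sparse bin: only 5 observations, so the prior dominates.
    print(posterior_rotamer_probs([1, 0, 4], prior_probs=[0.45, 0.25, 0.30]))
    # Well-populated bin: 500 observations, so the data dominate.
    print(posterior_rotamer_probs([300, 50, 150], prior_probs=[0.45, 0.25, 0.30]))
    ```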

  14. Bayesian modeling of flexible cognitive control

    PubMed Central

    Jiang, Jiefeng; Heller, Katherine; Egner, Tobias

    2014-01-01

    “Cognitive control” describes endogenous guidance of behavior in situations where routine stimulus-response associations are suboptimal for achieving a desired goal. The computational and neural mechanisms underlying this capacity remain poorly understood. We examine recent advances stemming from the application of a Bayesian learner perspective that provides optimal prediction for control processes. In reviewing the application of Bayesian models to cognitive control, we note that an important limitation in current models is a lack of a plausible mechanism for the flexible adjustment of control over conflict levels changing at varying temporal scales. We then show that flexible cognitive control can be achieved by a Bayesian model with a volatility-driven learning mechanism that modulates dynamically the relative dependence on recent and remote experiences in its prediction of future control demand. We conclude that the emergent Bayesian perspective on computational mechanisms of cognitive control holds considerable promise, especially if future studies can identify neural substrates of the variables encoded by these models, and determine the nature (Bayesian or otherwise) of their neural implementation. PMID:24929218

  15. Many faces of entropy or Bayesian statistical mechanics.

    PubMed

    Starikov, Evgeni B

    2010-11-15

    Some 80-90 years ago, George A. Linhart, unlike A. Einstein, P. Debye, M. Planck and W. Nernst, managed to derive a very simple, but ultimately general mathematical formula for heat capacity versus temperature from fundamental thermodynamic principles, using what we would nowadays dub a "Bayesian approach to probability". Moreover, he successfully applied his result to fit the experimental data for diverse substances in their solid state over a rather broad temperature range. Nevertheless, Linhart's work was undeservedly forgotten, although it represents a valid and fresh standpoint on thermodynamics and statistical physics, which may have a significant implication for academic and applied science.

  16. Bayesian statistics and information fusion for GPS-denied navigation

    NASA Astrophysics Data System (ADS)

    Copp, Brian Lee

    It is well known that satellite navigation systems are vulnerable to disruption due to jamming, spoofing, or obstruction of the signal. The desire for robust navigation of aircraft in GPS-denied environments has motivated the development of feature-aided navigation systems, in which measurements of environmental features are used to complement the dead reckoning solution produced by an inertial navigation system. Examples of environmental features which can be exploited for navigation include star positions, terrain elevation, terrestrial wireless signals, and features extracted from photographic data. Feature-aided navigation represents a particularly challenging estimation problem because the measurements are often strongly nonlinear, and the quality of the navigation solution is limited by the knowledge of nuisance parameters which may be difficult to model accurately. As a result, integration approaches based on the Kalman filter and its variants may fail to give adequate performance. This project develops a framework for the integration of feature-aided navigation techniques using Bayesian statistics. In this approach, the probability density function for aircraft horizontal position (latitude and longitude) is approximated by a two-dimensional point mass function defined on a rectangular grid. Nuisance parameters are estimated using a hypothesis based approach (Multiple Model Adaptive Estimation) which continuously maintains an accurate probability density even in the presence of strong nonlinearities. The effectiveness of the proposed approach is illustrated by the simulated use of terrain referenced navigation and wireless time-of-arrival positioning to estimate a reference aircraft trajectory. Monte Carlo simulations have shown that accurate position estimates can be obtained in terrain referenced navigation even with a strongly nonlinear altitude bias. The integration of terrain referenced and wireless time-of-arrival measurements is described along with
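
    A minimal one-dimensional sketch of the grid-based (point-mass) Bayes filter described above: the probability mass function over position is propagated through a motion model and multiplied by a nonlinear measurement likelihood, here a toy terrain-height measurement. The grid, terrain profile, and noise levels are invented; a real terrain-referenced system would use a two-dimensional latitude/longitude grid and measured terrain data.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)

    # Grid over one horizontal coordinate (km); a real system would use a 2-D lat/lon grid.
    grid = np.linspace(0.0, 100.0, 1001)
    dx = grid[1] - grid[0]
    pmf = np.ones_like(grid) / grid.size                 # uniform prior over position

    def terrain_height(x):
        """Toy terrain profile (m) used as the nonlinear measurement function."""
        return 200.0 + 80.0 * np.sin(0.15 * x) + 30.0 * np.cos(0.31 * x)

    def predict(pmf, velocity, dt, process_sigma):
        """Shift the point-mass function by the dead-reckoned displacement and blur it
        with Gaussian process noise (grid convolution; edge wrap-around is ignored here)."""
        pmf = np.roll(pmf, int(round(velocity * dt / dx)))
        k_x = np.arange(-4 * process_sigma, 4 * process_sigma + dx, dx)
        kernel = np.exp(-0.5 * (k_x / process_sigma) ** 2)
        pmf = np.convolve(pmf, kernel / kernel.sum(), mode="same")
        return pmf / pmf.sum()

    def update(pmf, measured_height, meas_sigma):
        """Multiply the prior mass by the measurement likelihood at every grid point."""
        lik = np.exp(-0.5 * ((measured_height - terrain_height(grid)) / meas_sigma) ** 2)
        post = pmf * lik
        return post / post.sum()

    true_x = 20.0
    for _ in range(30):                                  # 30 filter cycles
        true_x += 1.5                                    # aircraft moves 1.5 km per cycle
        pmf = predict(pmf, velocity=1.5, dt=1.0, process_sigma=0.5)
        z = terrain_height(true_x) + rng.normal(0.0, 5.0)
        pmf = update(pmf, z, meas_sigma=5.0)

    print(f"true position: {true_x:.1f} km, MAP estimate: {grid[np.argmax(pmf)]:.1f} km")
    ```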

  17. Bayesian Sensitivity Analysis of Statistical Models with Missing Data

    PubMed Central

    ZHU, HONGTU; IBRAHIM, JOSEPH G.; TANG, NIANSHENG

    2013-01-01

    Methods for handling missing data depend strongly on the mechanism that generated the missing values, such as missing completely at random (MCAR) or missing at random (MAR), as well as other distributional and modeling assumptions at various stages. It is well known that the resulting estimates and tests may be sensitive to these assumptions as well as to outlying observations. In this paper, we introduce various perturbations to modeling assumptions and individual observations, and then develop a formal sensitivity analysis to assess these perturbations in the Bayesian analysis of statistical models with missing data. We develop a geometric framework, called the Bayesian perturbation manifold, to characterize the intrinsic structure of these perturbations. We propose several intrinsic influence measures to perform sensitivity analysis and quantify the effect of various perturbations to statistical models. We use the proposed sensitivity analysis procedure to systematically investigate the tenability of the non-ignorable missing at random (NMAR) assumption. Simulation studies are conducted to evaluate our methods, and a dataset is analyzed to illustrate the use of our diagnostic measures. PMID:24753718

  18. Defining statistical perceptions with an empirical Bayesian approach

    NASA Astrophysics Data System (ADS)

    Tajima, Satohiro

    2013-04-01

    Extracting statistical structures (including textures or contrasts) from a natural stimulus is a central challenge in both biological and engineering contexts. This study interprets the process of statistical recognition in terms of hyperparameter estimations and free-energy minimization procedures with an empirical Bayesian approach. This mathematical interpretation resulted in a framework for relating physiological insights in animal sensory systems to the functional properties of recognizing stimulus statistics. We applied the present theoretical framework to two typical models of natural images that are encoded by a population of simulated retinal neurons, and demonstrated that the resulting cognitive performances could be quantified with the Fisher information measure. The current enterprise yielded predictions about the properties of human texture perception, suggesting that the perceptual resolution of image statistics depends on visual field angles, internal noise, and neuronal information processing pathways, such as the magnocellular, parvocellular, and koniocellular systems. Furthermore, the two conceptually similar natural-image models were found to yield qualitatively different predictions, striking a note of warning against confusing the two models when describing a natural image.

  19. Teaching Bayesian Statistics in a Health Research Methodology Program

    ERIC Educational Resources Information Center

    Pullenayegum, Eleanor M.; Thabane, Lehana

    2009-01-01

    Despite the appeal of Bayesian methods in health research, they are not widely used. This is partly due to a lack of courses in Bayesian methods at an appropriate level for non-statisticians in health research. Teaching such a course can be challenging because most statisticians have been taught Bayesian methods using a mathematical approach, and…

  20. Statistical process control

    SciTech Connect

    Oakland, J.S.

    1986-01-01

    Addressing the increasing importance for firms to have a thorough knowledge of statistically based quality control procedures, this book presents the fundamentals of statistical process control (SPC) in a non-mathematical, practical way. It provides real-life examples and data drawn from a wide variety of industries. The foundations of good quality management and process control, and control of conformance and consistency during production are given. Offers clear guidance to those who wish to understand and implement modern SPC techniques.
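
    A minimal sketch of the kind of control chart such SPC texts describe: an individuals (X) chart with the centre line at the process mean and control limits at plus or minus three times a short-term sigma estimated from the average moving range. The measurements are made up.

    ```python
    import numpy as np

    def individuals_chart_limits(x):
        """Shewhart individuals chart: centre line and 3-sigma control limits.
        Sigma is estimated from the average moving range (MR-bar / 1.128, the d2 constant for n=2)."""
        x = np.asarray(x, dtype=float)
        centre = x.mean()
        mr_bar = np.mean(np.abs(np.diff(x)))      # average moving range
        sigma = mr_bar / 1.128
        return centre, centre - 3 * sigma, centre + 3 * sigma

    # Hypothetical measurements of a process characteristic.
    data = [9.8, 10.1, 10.0, 9.9, 10.2, 10.4, 9.7, 10.0, 10.1, 9.9]
    centre, lcl, ucl = individuals_chart_limits(data)
    out_of_control = [v for v in data if not lcl <= v <= ucl]
    print(f"CL={centre:.2f}, LCL={lcl:.2f}, UCL={ucl:.2f}, points beyond limits: {out_of_control}")
    ```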

  1. Online Detection and Modeling of Safety Boundaries for Aerospace Applications Using Bayesian Statistics

    NASA Technical Reports Server (NTRS)

    He, Yuning

    2015-01-01

    The behavior of complex aerospace systems is governed by numerous parameters. For safety analysis it is important to understand how the system behaves with respect to these parameter values. In particular, understanding the boundaries between safe and unsafe regions is of major importance. In this paper, we describe a hierarchical Bayesian statistical modeling approach for the online detection and characterization of such boundaries. Our method for classification with active learning uses a particle filter-based model and a boundary-aware metric for best performance. From a library of candidate shapes incorporated with domain expert knowledge, the location and parameters of the boundaries are estimated using advanced Bayesian modeling techniques. The results of our boundary analysis are then provided in a form understandable by the domain expert. We illustrate our approach using a simulation model of a NASA neuro-adaptive flight control system, as well as a system for the detection of separation violations in the terminal airspace.

  2. Bayesian Statistical Approach To Binary Asteroid Orbit Determination

    NASA Astrophysics Data System (ADS)

    Dmitrievna Kovalenko, Irina; Stoica, Radu S.

    2015-08-01

    Orbit determination from observations is one of the classical problems in celestial mechanics. Deriving the trajectory of a binary asteroid with high precision is considerably harder than deriving that of a single asteroid. Here we present a method of orbit determination based on Markov chain Monte Carlo (MCMC). This method can be used for preliminary orbit determination from a relatively small number of observations, or for adjustment of a previously determined orbit. The problem consists in determining a conditional a posteriori probability density given the observations. Applying Bayesian statistics, the a posteriori probability density of the binary asteroid's orbital parameters is proportional to the product of the a priori and likelihood probability densities. The likelihood function is related to the noise probability density and can be calculated from O-C deviations (Observed minus Calculated positions). The optionally used a priori probability density takes into account information about the population of discovered asteroids and is used to constrain the phase space of possible orbits. As the MCMC method, the Metropolis-Hastings algorithm has been applied with a globally convergent coefficient. The sequence of possible orbits is generated by sampling each orbital parameter and applying the acceptance criterion. The method allows one to determine the phase space of possible orbits over each parameter; it can also be used to derive the single orbit with the highest posterior probability density of orbital elements.
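
    A minimal Metropolis-Hastings sketch in the spirit of the approach described: the log-posterior is the sum of a bounded flat log-prior and a Gaussian log-likelihood of the O-C residuals. The two-parameter sinusoidal "orbit model", the synthetic observations, and the proposal settings are placeholders for the real Keplerian binary-asteroid model.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)

    def model(theta, t):
        """Placeholder orbit model: predicted separation vs. time.
        A real implementation would solve the Keplerian two-body problem."""
        a, period = theta
        return a * np.cos(2 * np.pi * t / period)

    # Synthetic observations (O) generated from "true" parameters plus noise.
    t_obs = np.linspace(0, 10, 25)
    sigma = 0.05
    y_obs = model((1.2, 3.7), t_obs) + rng.normal(0, sigma, t_obs.size)

    def log_posterior(theta):
        a, period = theta
        if not (0.1 < a < 10.0 and 0.5 < period < 20.0):   # flat prior with bounds
            return -np.inf
        resid = y_obs - model(theta, t_obs)                # O - C residuals
        return -0.5 * np.sum((resid / sigma) ** 2)         # Gaussian log-likelihood

    def metropolis_hastings(theta0, n_steps=20000, step=(0.02, 0.05)):
        theta = np.array(theta0, dtype=float)
        logp = log_posterior(theta)
        chain = []
        for _ in range(n_steps):
            prop = theta + rng.normal(0, step)             # symmetric random-walk proposal
            logp_prop = log_posterior(prop)
            if np.log(rng.uniform()) < logp_prop - logp:   # accept / reject
                theta, logp = prop, logp_prop
            chain.append(theta.copy())
        return np.array(chain)

    chain = metropolis_hastings((1.1, 3.6))
    burned = chain[5000:]                                  # discard burn-in
    print("posterior mean:", burned.mean(axis=0), " posterior std:", burned.std(axis=0))
    ```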

  3. Bayesian statistical ionospheric tomography improved by incorporating ionosonde measurements

    NASA Astrophysics Data System (ADS)

    Norberg, Johannes; Virtanen, Ilkka I.; Roininen, Lassi; Vierinen, Juha; Orispää, Mikko; Kauristie, Kirsti; Lehtinen, Markku S.

    2016-04-01

    We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters and use the Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient tomographic inversion algorithm with clear probabilistic interpretation. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero-mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT ultra-high-frequency incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that in comparison to the alternative prior information sources, ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the altitude distribution of electron density. With an ionosonde at continuous disposal, the presented method enhances stand-alone near-real-time ionospheric tomography for the given conditions significantly.
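
    A minimal sketch of the linear-Gaussian Bayesian inversion underlying this kind of tomography: for measurements m = A x + e with prior x ~ N(x_prior, C) and noise e ~ N(0, R), the posterior mean and covariance have closed forms. The tiny toy geometry (a few random "ray" weightings of a one-dimensional electron-density profile) is invented; the actual method builds the prior mean and covariance from ionosonde data and uses sparse Gaussian Markov random field approximations rather than dense matrices.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Unknown: electron density in 20 altitude cells; measurements: 8 weighted sums
    # ("ray integrals") of those cells, standing in for beacon-satellite data.
    n_cells, n_rays = 20, 8
    A = rng.uniform(0.0, 1.0, size=(n_rays, n_cells))                 # toy geometry matrix

    x_true = np.exp(-0.5 * ((np.arange(n_cells) - 12) / 3.0) ** 2)    # bump-shaped profile
    y = A @ x_true + rng.normal(0, 0.05, n_rays)                      # noisy measurements

    # Prior mean and covariance; in the paper these come from ionosonde measurements.
    x_prior = np.exp(-0.5 * ((np.arange(n_cells) - 10) / 4.0) ** 2)
    dist = np.abs(np.subtract.outer(np.arange(n_cells), np.arange(n_cells)))
    C = 0.2**2 * np.exp(-dist / 3.0)             # smooth prior covariance
    R = 0.05**2 * np.eye(n_rays)                 # measurement noise covariance

    # Closed-form Gaussian posterior:
    #   x_post = x_prior + C A^T (A C A^T + R)^(-1) (y - A x_prior)
    #   C_post = C - C A^T (A C A^T + R)^(-1) A C
    K = C @ A.T @ np.linalg.inv(A @ C @ A.T + R)
    x_post = x_prior + K @ (y - A @ x_prior)
    C_post = C - K @ A @ C

    print("prior RMS error    :", np.sqrt(np.mean((x_prior - x_true) ** 2)))
    print("posterior RMS error:", np.sqrt(np.mean((x_post - x_true) ** 2)))
    print("mean posterior std :", np.sqrt(np.diag(C_post)).mean())
    ```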

  4. Thermal conductance of thin film YIG determined using Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Euler, C.; Hołuj, P.; Langner, T.; Kehlberger, A.; Vasyuchka, V. I.; Kläui, M.; Jakob, G.

    2015-09-01

    Thin film YIG (Y3Fe5O12) is a prototypical material for experiments on thermally generated pure spin currents and the spin Seebeck effect. The 3ω method is an established technique to measure the cross-plane thermal conductance of thin films, but cannot be used in YIG/GGG (Gd3Ga5O12) systems in its standard form. We use two-dimensional modeling of heat transport and introduce a technique based on Bayesian statistics to evaluate measurement data taken with the 3ω method. Our analysis method allows us to study material systems that have not been accessible with the conventional 3ω analysis. Temperature-dependent thermal conductance data of thin film YIG are of major importance for experiments in the field of spin caloritronics. Here we show data between room temperature and 10 K for films covering a wide thickness range, as well as the magnetic field effect on the thermal conductance between 10 and 50 K.

  5. Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation

    NASA Technical Reports Server (NTRS)

    Jefferys, William H.; Berger, James O.

    1992-01-01

    'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.

  6. Bayesian Statistical Analysis of Circadian Oscillations in Fibroblasts

    PubMed Central

    Cohen, Andrew L.; Leise, Tanya L.; Welsh, David K.

    2012-01-01

    Precise determination of a noisy biological oscillator’s period from limited experimental data can be challenging. The common practice is to calculate a single number (a point estimate) for the period of a particular time course. Uncertainty is inherent in any statistical estimator applied to noisy data, so our confidence in such point estimates depends on the quality and quantity of the data. Ideally, a period estimation method should both produce an accurate point estimate of the period and measure the uncertainty in that point estimate. A variety of period estimation methods are known, but few assess the uncertainty of the estimates, and a measure of uncertainty is rarely reported in the experimental literature. We compare the accuracy of point estimates using six common methods, only one of which can also produce uncertainty measures. We then illustrate the advantages of a new Bayesian method for estimating period, which outperforms the other six methods in accuracy of point estimates for simulated data and also provides a measure of uncertainty. We apply this method to analyze circadian oscillations of gene expression in individual mouse fibroblast cells and compute the number of cells and sampling duration required to reduce the uncertainty in period estimates to a desired level. This analysis indicates that, due to the stochastic variability of noisy intracellular oscillators, achieving a narrow margin of error can require an impractically large number of cells. In addition, we use a hierarchical model to determine the distribution of intrinsic cell periods, thereby separating the variability due to stochastic gene expression within each cell from the variability in period across the population of cells. PMID:22982138

  7. Bayesian nonparametric adaptive control using Gaussian processes.

    PubMed

    Chowdhary, Girish; Kingravi, Hassan A; How, Jonathan P; Vela, Patricio A

    2015-03-01

    Most current model reference adaptive control (MRAC) methods rely on parametric adaptive elements, in which the number of parameters of the adaptive element are fixed a priori, often through expert judgment. An example of such an adaptive element is radial basis function networks (RBFNs), with RBF centers preallocated based on the expected operating domain. If the system operates outside of the expected operating domain, this adaptive element can become noneffective in capturing and canceling the uncertainty, thus rendering the adaptive controller only semiglobal in nature. This paper investigates a Gaussian process-based Bayesian MRAC architecture (GP-MRAC), which leverages the power and flexibility of GP Bayesian nonparametric models of uncertainty. The GP-MRAC does not require the centers to be preallocated, can inherently handle measurement noise, and enables MRAC to handle a broader set of uncertainties, including those that are defined as distributions over functions. We use stochastic stability arguments to show that GP-MRAC guarantees good closed-loop performance with no prior domain knowledge of the uncertainty. Online implementable GP inference methods are compared in numerical simulations against RBFN-MRAC with preallocated centers and are shown to provide better tracking and improved long-term learning.

  8. Quantification and propagation of disciplinary uncertainty via Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Mantis, George Constantine

    2002-08-01

    Several needs exist in the military, commercial, and civil sectors for new hypersonic systems. These needs remain unfulfilled, due in part to the uncertainty encountered in designing these systems. This uncertainty takes a number of forms, including disciplinary uncertainty, that which is inherent in the analytical tools utilized during the design process. Yet, few efforts to date empower the designer with the means to account for this uncertainty within the disciplinary analyses. In the current state-of-the-art in design, the effects of this unquantifiable uncertainty significantly increase the risks associated with new design efforts. Typically, the risk proves too great to allow a given design to proceed beyond the conceptual stage. To that end, the research encompasses the formulation and validation of a new design method, a systematic process for probabilistically assessing the impact of disciplinary uncertainty. The method implements Bayesian Statistics theory to quantify this source of uncertainty, and propagate its effects to the vehicle system level. Comparison of analytical and physical data for existing systems, modeled a priori in the given analysis tools, leads to quantification of uncertainty in those tools' calculation of discipline-level metrics. Then, after exploration of the new vehicle's design space, the quantified uncertainty is propagated probabilistically through the design space. This ultimately results in the assessment of the impact of disciplinary uncertainty on the confidence in the design solution: the final shape and variability of the probability functions defining the vehicle's system-level metrics. Although motivated by the hypersonic regime, the proposed treatment of uncertainty applies to any class of aerospace vehicle, just as the problem itself affects the design process of any vehicle. A number of computer programs comprise the environment constructed for the implementation of this work. Application to a single

  9. Ice Shelf Modeling: A Cross-Polar Bayesian Statistical Approach

    NASA Astrophysics Data System (ADS)

    Kirchner, N.; Furrer, R.; Jakobsson, M.; Zwally, H. J.

    2010-12-01

    Ice streams interlink glacial terrestrial and marine environments: embedded in a grounded inland ice such as the Antarctic Ice Sheet or the paleo ice sheets covering extensive parts of the Eurasian and Amerasian Arctic respectively, ice streams are major drainage agents facilitating the discharge of substantial portions of continental ice into the ocean. At their seaward side, ice streams can either extend onto the ocean as floating ice tongues (such as the Drygalski Ice Tongue/East Antarctica), or feed large ice shelves (as is the case for e.g. the Siple Coast and the Ross Ice Shelf/West Antarctica). The flow behavior of ice streams has been recognized to be intimately linked with configurational changes in their attached ice shelves; in particular, ice shelf disintegration is associated with rapid ice stream retreat and increased mass discharge from the continental ice mass, contributing eventually to sea level rise. Investigations of ice stream retreat mechanisms are, however, incomplete if based on terrestrial records only: rather, the dynamics of ice shelves (and, eventually, the impact of the ocean on the latter) must be accounted for. However, since floating ice shelves leave hardly any traces behind when melting, uncertainty regarding the spatio-temporal distribution and evolution of ice shelves in times prior to instrumented and recorded observation is high, thus calling for a statistical modeling approach. Complementing ongoing large-scale numerical modeling efforts (Pollard & DeConto, 2009), we model the configuration of ice shelves by using a Bayesian Hierarchical Modeling (BHM) approach. We adopt a cross-polar perspective accounting for the fact that currently, ice shelves exist mainly along the coastline of Antarctica (and are virtually non-existent in the Arctic), while Arctic Ocean ice shelves repeatedly impacted the Arctic ocean basin during former glacial periods. Modeled Arctic ocean ice shelf configurations are compared with geological spatial

  10. TOWARDS A BAYESIAN PERSPECTIVE ON STATISTICAL DISCLOSURE LIMITATION

    EPA Science Inventory

    National statistical offices and other organizations collect data on individual subjects (person, businesses, organizations), typically while assuring the subject that data pertaining to them will be held confidential. These data provide the raw material for statistical data pro...

  11. Group sequential control of overall toxicity incidents in clinical trials - non-Bayesian and Bayesian approaches.

    PubMed

    Yu, Jihnhee; Hutson, Alan D; Siddiqui, Adnan H; Kedron, Mary A

    2016-02-01

    In some small clinical trials, toxicity is not a primary endpoint; however, it often has dire effects on patients' quality of life and is even life-threatening. For such clinical trials, rigorous control of the overall incidence of adverse events is desirable, while simultaneously collecting safety information. In this article, we propose group sequential toxicity monitoring strategies to control overall toxicity incidents below a certain level as opposed to performing hypothesis testing, which can be incorporated into an existing study design based on the primary endpoint. We consider two sequential methods: a non-Bayesian approach in which stopping rules are obtained based on the 'future' probability of an excessive toxicity rate; and a Bayesian adaptation modifying the proposed non-Bayesian approach, which can use the information obtained at interim analyses. Through an extensive Monte Carlo study, we show that the Bayesian approach often provides better control of the overall toxicity rate than the non-Bayesian approach. We also investigate adequate toxicity estimation after the studies. We demonstrate the applicability of our proposed methods in controlling the symptomatic intracranial hemorrhage rate for treating acute ischemic stroke patients.
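
    A minimal sketch of a Bayesian interim toxicity rule in the spirit of the Bayesian adaptation described: at each look, a Beta-binomial posterior for the overall toxicity rate is computed and the trial is flagged for stopping if the posterior probability that the rate exceeds an acceptable level is too high. The prior, the rate limit, the probability threshold, and the interim counts are all invented.

    ```python
    from scipy.stats import beta

    def stop_for_toxicity(n_toxic, n_treated, rate_limit=0.15,
                          prior_a=1.0, prior_b=1.0, prob_threshold=0.80):
        """Beta-binomial interim rule: with a Beta(prior_a, prior_b) prior, the posterior
        for the toxicity rate is Beta(prior_a + x, prior_b + n - x). Flag stopping if
        P(rate > rate_limit | data) exceeds prob_threshold."""
        post = beta(prior_a + n_toxic, prior_b + n_treated - n_toxic)
        p_exceed = post.sf(rate_limit)
        return p_exceed, p_exceed > prob_threshold

    # Interim looks after cohorts of 10 patients (cumulative counts, hypothetical).
    for n_treated, n_toxic in [(10, 2), (20, 4), (30, 7)]:
        p, stop = stop_for_toxicity(n_toxic, n_treated)
        print(f"n={n_treated:2d}, toxicities={n_toxic}: P(rate>0.15)={p:.2f}, stop={stop}")
    ```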

  12. Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.

    PubMed

    Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale

    2016-08-01

    Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty.

  13. Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.

    PubMed

    Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale

    2016-08-01

    Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty. PMID:27566774

  14. Statistical detection of EEG synchrony using empirical Bayesian inference.

    PubMed

    Singh, Archana K; Asoh, Hideki; Takeda, Yuji; Phillips, Steven

    2015-01-01

    There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high-dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit complex dependence structure between hypotheses that vary in spectral, temporal and spatial dimension. Previously, we showed that a hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis to compute FDR as a posterior probability that an observed statistic belongs to a null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach for PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that locFDR can effectively control false positives without compromising the power of PLV synchrony inference. The application of locFDR to the experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries. PMID:25822617
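
    A generic sketch of the two-group local FDR idea applied to a vector of z-valued synchrony statistics: locfdr(z) = pi0 * f0(z) / f(z), with a theoretical N(0,1) null, pi0 conservatively set to 1, and the mixture density estimated by a Gaussian kernel. This is only the basic construction; the empirical-null estimation and the PLV-specific pipeline used in the paper are not reproduced here.

    ```python
    import numpy as np
    from scipy.stats import norm, gaussian_kde

    def local_fdr(z, pi0=1.0):
        """Local false discovery rate: pi0 * f0(z) / f(z), with a theoretical N(0,1) null
        and a kernel estimate of the mixture density f. pi0 = 1 is conservative."""
        z = np.asarray(z, dtype=float)
        f = gaussian_kde(z)(z)                 # estimated mixture density at each z
        f0 = norm.pdf(z)                       # theoretical null density
        return np.minimum(1.0, pi0 * f0 / f)

    # Toy example: 2,000 synchrony z statistics, 5% from a shifted (synchronous) component.
    rng = np.random.default_rng(5)
    z = np.concatenate([rng.normal(0, 1, 1900), rng.normal(3.5, 1, 100)])
    lfdr = local_fdr(z)
    print("tests with locfdr < 0.2:", int(np.sum(lfdr < 0.2)))
    ```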

  15. BAYESIAN STATISTICAL APPROACHES FOR THE EVALUATION OF CMAQ

    EPA Science Inventory

    This research focuses on the application of spatial statistical techniques for the evaluation of the Community Multiscale Air Quality (CMAQ) model. The upcoming release version of the CMAQ model was run for the calendar year 2001 and is in the process of being evaluated by EPA an...

  16. Elicitation by design in ecology: using expert opinion to inform priors for Bayesian statistical models.

    PubMed

    Choy, Samantha Low; O'Leary, Rebecca; Mengersen, Kerrie

    2009-01-01

    Bayesian statistical modeling has several benefits within an ecological context. In particular, when observed data are limited in sample size or representativeness, then the Bayesian framework provides a mechanism to combine observed data with other "prior" information. Prior information may be obtained from earlier studies, or in their absence, from expert knowledge. This use of the Bayesian framework reflects the scientific "learning cycle," where prior or initial estimates are updated when new data become available. In this paper we outline a framework for statistical design of expert elicitation processes for quantifying such expert knowledge, in a form suitable for input as prior information into Bayesian models. We identify six key elements: determining the purpose and motivation for using prior information; specifying the relevant expert knowledge available; formulating the statistical model; designing effective and efficient numerical encoding; managing uncertainty; and designing a practical elicitation protocol. We demonstrate this framework applies to a variety of situations, with two examples from the ecological literature and three from our experience. Analysis of these examples reveals several recurring important issues affecting practical design of elicitation in ecological problems.

  17. Applications of Bayesian Statistics to Problems in Gamma-Ray Bursts

    NASA Technical Reports Server (NTRS)

    Meegan, Charles A.

    1997-01-01

    This presentation will describe two applications of Bayesian statistics to Gamma Ray Bursts (GRBs). The first attempts to quantify the evidence for a cosmological versus galactic origin of GRBs using only the observations of the dipole and quadrupole moments of the angular distribution of bursts. The cosmological hypothesis predicts isotropy, while the galactic hypothesis is assumed to produce a uniform probability distribution over positive values for these moments. The observed isotropic distribution indicates that the Bayes factor for the cosmological hypothesis over the galactic hypothesis is about 300. Another application of Bayesian statistics is in the estimation of chance associations of optical counterparts with galaxies. The Bayesian approach is preferred to frequentist techniques here because the Bayesian approach easily accounts for galaxy mass distributions and because one can incorporate three disjoint hypotheses: (1) bursts come from galactic centers, (2) bursts come from galaxies in proportion to luminosity, and (3) bursts do not come from external galaxies. This technique was used in the analysis of the optical counterpart to GRB970228.

  18. Bayesian Bigot? Statistical Discrimination, Stereotypes, and Employer Decision Making.

    PubMed

    Pager, Devah; Karafin, Diana

    2009-01-01

    Much of the debate over the underlying causes of discrimination centers on the rationality of employer decision making. Economic models of statistical discrimination emphasize the cognitive utility of group estimates as a means of dealing with the problems of uncertainty. Sociological and social-psychological models, by contrast, question the accuracy of group-level attributions. Although mean differences may exist between groups on productivity-related characteristics, these differences are often inflated in their application, leading to much larger differences in individual evaluations than would be warranted by actual group-level trait distributions. In this study, the authors examine the nature of employer attitudes about black and white workers and the extent to which these views are calibrated against their direct experiences with workers from each group. They use data from fifty-five in-depth interviews with hiring managers to explore employers' group-level attributions and their direct observations to develop a model of attitude formation and employer learning.

  19. Bayesian Bigot? Statistical Discrimination, Stereotypes, and Employer Decision Making

    PubMed Central

    Pager, Devah; Karafin, Diana

    2010-01-01

    Much of the debate over the underlying causes of discrimination centers on the rationality of employer decision making. Economic models of statistical discrimination emphasize the cognitive utility of group estimates as a means of dealing with the problems of uncertainty. Sociological and social-psychological models, by contrast, question the accuracy of group-level attributions. Although mean differences may exist between groups on productivity-related characteristics, these differences are often inflated in their application, leading to much larger differences in individual evaluations than would be warranted by actual group-level trait distributions. In this study, the authors examine the nature of employer attitudes about black and white workers and the extent to which these views are calibrated against their direct experiences with workers from each group. They use data from fifty-five in-depth interviews with hiring managers to explore employers’ group-level attributions and their direct observations to develop a model of attitude formation and employer learning. PMID:20686633

  20. Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination

    NASA Astrophysics Data System (ADS)

    Chiesa, D.; Previtali, E.; Sisti, M.

    2014-04-01

    In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). In order to evaluate the neutron flux spectrum, subdivided in energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with Bayesian statistical analysis, including the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the problem statistical model and solve it. The energy group fluxes and their uncertainties are then determined with great accuracy and the correlations between the groups are analyzed. Finally, the dependence of the results on the prior distribution choice and on the group cross section data is investigated to confirm the reliability of the analysis.

  1. Fine Mapping Causal Variants with an Approximate Bayesian Method Using Marginal Test Statistics.

    PubMed

    Chen, Wenan; Larrabee, Beth R; Ovsyannikova, Inna G; Kennedy, Richard B; Haralambieva, Iana H; Poland, Gregory A; Schaid, Daniel J

    2015-07-01

    Two recently developed fine-mapping methods, CAVIAR and PAINTOR, demonstrate better performance over other fine-mapping methods. They also have the advantage of using only the marginal test statistics and the correlation among SNPs. Both methods leverage the fact that the marginal test statistics asymptotically follow a multivariate normal distribution and are likelihood based. However, their relationship with Bayesian fine mapping, such as BIMBAM, is not clear. In this study, we first show that CAVIAR and BIMBAM are actually approximately equivalent to each other. This leads to a fine-mapping method using marginal test statistics in the Bayesian framework, which we call CAVIAR Bayes factor (CAVIARBF). Another advantage of the Bayesian framework is that it can answer both association and fine-mapping questions. We also used simulations to compare CAVIARBF with other methods under different numbers of causal variants. The results showed that both CAVIARBF and BIMBAM have better performance than PAINTOR and other methods. Compared to BIMBAM, CAVIARBF has the advantage of using only marginal test statistics and takes about one-quarter to one-fifth of the running time. We applied different methods on two independent cohorts of the same phenotype. Results showed that CAVIARBF, BIMBAM, and PAINTOR selected the same top 3 SNPs; however, CAVIARBF and BIMBAM had better consistency in selecting the top 10 ranked SNPs between the two cohorts. Software is available at https://bitbucket.org/Wenan/caviarbf.

  2. Bridging the gap between GLUE and formal statistical approaches: approximate Bayesian computation

    NASA Astrophysics Data System (ADS)

    Sadegh, M.; Vrugt, J. A.

    2013-12-01

    In recent years, a strong debate has emerged in the hydrologic literature regarding how to properly treat nontraditional error residual distributions and quantify parameter and predictive uncertainty. Particularly, there is strong disagreement whether such uncertainty framework should have its roots within a proper statistical (Bayesian) context using Markov chain Monte Carlo (MCMC) simulation techniques, or whether such a framework should be based on a quite different philosophy and implement informal likelihood functions and simplistic search methods to summarize parameter and predictive distributions. This paper is a follow-up of our previous work published in Vrugt and Sadegh (2013) and demonstrates that approximate Bayesian computation (ABC) bridges the gap between formal and informal statistical model-data fitting approaches. The ABC methodology has recently emerged in the fields of biology and population genetics and relaxes the need for an explicit likelihood function in favor of one or multiple different summary statistics that measure the distance of each model simulation to the data. This paper further studies the theoretical and numerical equivalence of formal and informal Bayesian approaches using discharge and forcing data from different watersheds in the United States, in particular generalized likelihood uncertainty estimation (GLUE). We demonstrate that the limits of acceptability approach of GLUE is a special variant of ABC if each discharge observation of the calibration data set is used as a summary diagnostic.

  3. [Statistical process control in healthcare].

    PubMed

    Anhøj, Jacob; Bjørn, Brian

    2009-05-18

    Statistical process control (SPC) is a branch of statistical science which comprises methods for the study of process variation. Common cause variation is inherent in any process and predictable within limits. Special cause variation is unpredictable and indicates change in the process. The run chart is a simple tool for analysis of process variation. Run chart analysis may reveal anomalies that suggest shifts or unusual patterns that are attributable to special cause variation. PMID:19454196

  4. Boosting Bayesian parameter inference of stochastic differential equation models with methods from statistical physics

    NASA Astrophysics Data System (ADS)

    Albert, Carlo; Ulzega, Simone; Stoop, Ruedi

    2016-04-01

    Measured time-series of both precipitation and runoff are known to exhibit highly non-trivial statistical properties. For making reliable probabilistic predictions in hydrology, it is therefore desirable to have stochastic models with output distributions that share these properties. When parameters of such models have to be inferred from data, we also need to quantify the associated parametric uncertainty. For non-trivial stochastic models, however, this latter step is typically very demanding, both conceptually and numerically, and almost never done in hydrology. Here, we demonstrate that methods developed in statistical physics make a large class of stochastic differential equation (SDE) models amenable to a full-fledged Bayesian parameter inference. For concreteness we demonstrate these methods by means of a simple yet non-trivial toy SDE model. We consider a natural catchment that can be described by a linear reservoir, at the scale of observation. All the neglected processes are assumed to happen at much shorter time-scales and are therefore modeled with a Gaussian white noise term, the standard deviation of which is assumed to scale linearly with the system state (water volume in the catchment). Even for constant input, the outputs of this simple non-linear SDE model show a wealth of desirable statistical properties, such as fat-tailed distributions and long-range correlations. Standard algorithms for Bayesian inference fail, for models of this kind, because their likelihood functions are extremely high-dimensional intractable integrals over all possible model realizations. The use of Kalman filters is illegitimate due to the non-linearity of the model. Particle filters could be used but become increasingly inefficient with growing number of data points. Hamiltonian Monte Carlo algorithms allow us to translate this inference problem to the problem of simulating the dynamics of a statistical mechanics system and give us access to the most sophisticated methods

  5. UNITY: Confronting Supernova Cosmology's Statistical and Systematic Uncertainties in a Unified Bayesian Framework

    NASA Astrophysics Data System (ADS)

    Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The

    2015-11-01

    While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.

  6. Dissolution curve comparisons through the F(2) parameter, a Bayesian extension of the f(2) statistic.

    PubMed

    Novick, Steven; Shen, Yan; Yang, Harry; Peterson, John; LeBlond, Dave; Altan, Stan

    2015-01-01

    Dissolution (or in vitro release) studies constitute an important aspect of pharmaceutical drug development. One important use of such studies is to justify a biowaiver for post-approval changes, which requires establishing equivalence between the new and old product. We propose a statistically rigorous modeling approach for this purpose based on the estimation of what we refer to as the F2 parameter, an extension of the commonly used f2 statistic. A Bayesian test procedure is proposed in relation to a set of composite hypotheses that capture the similarity requirement on the absolute mean differences between test and reference dissolution profiles. Several examples are provided to illustrate the application. Results of our simulation study comparing the performance of f2 and the proposed method show that our Bayesian approach is comparable to or in many cases superior to the f2 statistic as a decision rule. Further useful extensions of the method, such as the use of continuous-time dissolution modeling, are considered.
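
    For reference, the classical f2 similarity factor that the proposed F2 parameter extends can be computed as in the sketch below; the dissolution profiles are hypothetical, and the Bayesian test procedure described in the abstract is not reproduced here.

      import numpy as np

      def f2_similarity(reference, test):
          """Classical f2 similarity factor between two dissolution profiles
          (percent dissolved at matched time points); values of 50 or more are
          conventionally read as similar profiles."""
          r = np.asarray(reference, dtype=float)
          t = np.asarray(test, dtype=float)
          msd = np.mean((r - t) ** 2)                      # mean squared difference
          return 50.0 * np.log10(100.0 / np.sqrt(1.0 + msd))

      ref  = [20, 40, 60, 80, 90, 95]   # hypothetical % dissolved, reference lot
      test = [18, 37, 58, 79, 91, 96]   # hypothetical % dissolved, test lot
      print(round(f2_similarity(ref, test), 1))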

  7. A Bayesian Formulation of Behavioral Control

    ERIC Educational Resources Information Center

    Huys, Quentin J. M.; Dayan, Peter

    2009-01-01

    Helplessness, a belief that the world is not subject to behavioral control, has long been central to our understanding of depression, and has influenced cognitive theories, animal models and behavioral treatments. However, despite its importance, there is no fully accepted definition of helplessness or behavioral control in psychology or…

  8. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network

    PubMed Central

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-01

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. ASTF is proposed to extract weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain, evaluating the similarity between a reference (noise) signal and the original signal and removing the components with high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis. In this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method. PMID:26761006

  9. Combined Bayesian statistics and load duration curve method for bacteria nonpoint source loading estimation.

    PubMed

    Shen, Jian; Zhao, Yuan

    2010-01-01

    Nonpoint source load estimation is an essential part of the development of the bacterial total maximum daily load (TMDL) mandated by the Clean Water Act. However, the currently widely used watershed-receiving water modeling approach is usually associated with a high level of uncertainty and requires long-term observational data and intensive training effort. The load duration curve (LDC) method recommended by the EPA provides a simpler way to estimate bacteria loading. This method, however, does not take into consideration the specific fate and transport mechanisms of the pollutant and cannot address the uncertainty. In this study, a Bayesian statistical approach is applied to the Escherichia coli TMDL development of a stream on the Eastern Shore of Virginia to inversely estimate watershed bacteria loads from the in-stream monitoring data. The mechanism of bacteria transport is incorporated. The effects of temperature, bottom slope, and flow on allowable and existing load calculations are discussed. The uncertainties associated with load estimation are also fully described. Our method combines the merits of LDC, mechanistic modeling, and Bayesian statistics, while overcoming some of the shortcomings associated with these methods. It is a cost-effective tool for bacteria TMDL development and can be modified and applied to multi-segment streams as well. PMID:19781737

  10. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network.

    PubMed

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-01

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. ASTF is proposed to extract weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain, evaluating the similarity between a reference (noise) signal and the original signal and removing the components with high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis. In this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method.

  11. Intelligent Condition Diagnosis Method Based on Adaptive Statistic Test Filter and Diagnostic Bayesian Network.

    PubMed

    Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing

    2016-01-01

    A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. ASTF is proposed to extract weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain, evaluating the similarity between a reference (noise) signal and the original signal and removing the components with high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis. In this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method. PMID:26761006

  12. Assessing compositional variability through graphical analysis and Bayesian statistical approaches: case studies on transgenic crops.

    PubMed

    Harrigan, George G; Harrison, Jay M

    2012-01-01

    New transgenic (GM) crops are subjected to extensive safety assessments that include compositional comparisons with conventional counterparts as a cornerstone of the process. The influence of germplasm, location, environment, and agronomic treatments on compositional variability is, however, often obscured in these pair-wise comparisons. Furthermore, classical statistical significance testing can often provide an incomplete and over-simplified summary of highly responsive variables such as crop composition. In order to more clearly describe the influence of the numerous sources of compositional variation we present an introduction to two alternative but complementary approaches to data analysis and interpretation. These include i) exploratory data analysis (EDA) with its emphasis on visualization and graphics-based approaches and ii) Bayesian statistical methodology that provides easily interpretable and meaningful evaluations of data in terms of probability distributions. The EDA case-studies include analyses of herbicide-tolerant GM soybean and insect-protected GM maize and soybean. Bayesian approaches are presented in an analysis of herbicide-tolerant GM soybean. Advantages of these approaches over classical frequentist significance testing include the more direct interpretation of results in terms of probabilities pertaining to quantities of interest and no confusion over the application of corrections for multiple comparisons. It is concluded that a standardized framework for these methodologies could provide specific advantages through enhanced clarity of presentation and interpretation in comparative assessments of crop composition.

  13. Fast SAR image change detection using Bayesian approach based difference image and modified statistical region merging.

    PubMed

    Zhang, Han; Ni, Weiping; Yan, Weidong; Bian, Hui; Wu, Junzheng

    2014-01-01

    A novel fast SAR image change detection method is presented in this paper. Based on a Bayesian approach, the prior information that speckle follows the Nakagami distribution is incorporated into the difference image (DI) generation process. The new DI performs much better than the familiar log ratio (LR) DI as well as the cumulant-based Kullback-Leibler divergence (CKLD) DI. The statistical region merging (SRM) approach is introduced to the change detection context for the first time. A new clustering procedure with the region variance as the statistical inference variable is presented to tailor SRM to SAR image change detection, with only two classes in the final map, the unchanged and changed classes. The most prominent advantages of the proposed modified SRM (MSRM) method are its ability to cope with noise corruption and its quick implementation. Experimental results show that the proposed method is superior in both change detection accuracy and operational efficiency.

  14. Fast SAR Image Change Detection Using Bayesian Approach Based Difference Image and Modified Statistical Region Merging

    PubMed Central

    Ni, Weiping; Yan, Weidong; Bian, Hui; Wu, Junzheng

    2014-01-01

    A novel fast SAR image change detection method is presented in this paper. Based on a Bayesian approach, the prior information that speckle follows the Nakagami distribution is incorporated into the difference image (DI) generation process. The new DI performs much better than the familiar log ratio (LR) DI as well as the cumulant-based Kullback-Leibler divergence (CKLD) DI. The statistical region merging (SRM) approach is introduced to the change detection context for the first time. A new clustering procedure with the region variance as the statistical inference variable is presented to tailor SRM to SAR image change detection, with only two classes in the final map, the unchanged and changed classes. The most prominent advantages of the proposed modified SRM (MSRM) method are its ability to cope with noise corruption and its quick implementation. Experimental results show that the proposed method is superior in both change detection accuracy and operational efficiency. PMID:25258740

  15. Statistical Inference at Work: Statistical Process Control as an Example

    ERIC Educational Resources Information Center

    Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia

    2008-01-01

    To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…

  16. Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics

    NASA Technical Reports Server (NTRS)

    Pohorille, Andrew

    2006-01-01

    The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to an apparent non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered as a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described
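
    A minimal sketch of one of the techniques mentioned, parallel tempering, is given below for a bimodal one-dimensional target; the target density, temperature ladder, step size, and run length are illustrative assumptions rather than anything from the record.

      import numpy as np

      rng = np.random.default_rng(7)

      def log_target(x):
          """Bimodal target: equal mixture of N(-4, 1) and N(+4, 1)."""
          return np.logaddexp(-0.5 * (x + 4.0) ** 2, -0.5 * (x - 4.0) ** 2)

      def parallel_tempering(n_iter=20_000, betas=(1.0, 0.5, 0.2, 0.05), step=1.0):
          n_chains = len(betas)
          x = np.zeros(n_chains)
          samples = []
          for _ in range(n_iter):
              # Metropolis update within each tempered chain (target ** beta).
              for i, beta in enumerate(betas):
                  prop = x[i] + step * rng.standard_normal()
                  if np.log(rng.uniform()) < beta * (log_target(prop) - log_target(x[i])):
                      x[i] = prop
              # Propose a swap between a random pair of adjacent temperatures.
              i = rng.integers(n_chains - 1)
              log_alpha = (betas[i] - betas[i + 1]) * (log_target(x[i + 1]) - log_target(x[i]))
              if np.log(rng.uniform()) < log_alpha:
                  x[i], x[i + 1] = x[i + 1], x[i]
              samples.append(x[0])            # keep only the beta = 1 (cold) chain
          return np.array(samples)

      s = parallel_tempering()
      print(np.mean(s < 0), np.mean(s > 0))   # both modes should be visited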

  17. Evaluation of Oceanic Transport Statistics By Use of Transient Tracers and Bayesian Methods

    NASA Astrophysics Data System (ADS)

    Trossman, D. S.; Thompson, L.; Mecking, S.; Bryan, F.; Peacock, S.

    2013-12-01

    Key variables that quantify the time scales over which atmospheric signals penetrate into the oceanic interior and their uncertainties are computed using Bayesian methods and transient tracers from both models and observations. First, the mean residence times, subduction rates, and formation rates of Subtropical Mode Water (STMW) and Subpolar Mode Water (SPMW) in the North Atlantic and Subantarctic Mode Water (SAMW) in the Southern Ocean are estimated by combining a model and observations of chlorofluorocarbon-11 (CFC-11) via Bayesian Model Averaging (BMA), a statistical technique that weights model estimates according to how closely they agree with observations. Second, a Bayesian method is presented to find two oceanic transport parameters associated with the age distribution of ocean waters, the transit-time distribution (TTD), by combining an eddying global ocean model's estimate of the TTD with hydrographic observations of CFC-11, temperature, and salinity. Uncertainties associated with objectively mapping irregularly spaced bottle data are quantified by making use of a thin-plate spline and then propagated via the two Bayesian techniques. It is found that the subduction of STMW, SPMW, and SAMW is mostly an advective process, but up to about one-third of STMW subduction likely owes to non-advective processes. Also, while the formation of STMW is mostly due to subduction, the formation of SPMW is mostly due to other processes. About half of the formation of SAMW is due to subduction and half is due to other processes. A combination of air-sea flux, acting on relatively short time scales, and turbulent mixing, acting on a wide range of time scales, is likely the dominant SPMW erosion mechanism. Air-sea flux is likely responsible for most STMW erosion, and turbulent mixing is likely responsible for most SAMW erosion. Two oceanic transport parameters, the mean age of a water parcel and the half-variance associated with the TTD, estimated using the model's tracers as

  18. Integrating quantitative PCR and Bayesian statistics in quantifying human adenoviruses in small volumes of source water.

    PubMed

    Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D

    2014-02-01

    Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, samples in small volume (e.g. 1L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR, but might have existed based on Bayesian inference. Our integrated approach that quantifies uncertainty provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods. PMID:24140696

  19. Integrating quantitative PCR and Bayesian statistics in quantifying human adenoviruses in small volumes of source water.

    PubMed

    Wu, Jianyong; Gronewold, Andrew D; Rodriguez, Roberto A; Stewart, Jill R; Sobsey, Mark D

    2014-02-01

    Rapid quantification of viral pathogens in drinking and recreational water can help reduce waterborne disease risks. For this purpose, samples in small volume (e.g. 1L) are favored because of the convenience of collection, transportation and processing. However, the results of viral analysis are often subject to uncertainty. To overcome this limitation, we propose an approach that integrates Bayesian statistics, efficient concentration methods, and quantitative PCR (qPCR) to quantify viral pathogens in water. Using this approach, we quantified human adenoviruses (HAdVs) in eighteen samples of source water collected from six drinking water treatment plants. HAdVs were found in seven samples. In the other eleven samples, HAdVs were not detected by qPCR, but might have existed based on Bayesian inference. Our integrated approach that quantifies uncertainty provides a better understanding than conventional assessments of potential risks to public health, particularly in cases when pathogens may present a threat but cannot be detected by traditional methods.

  20. Crossing statistic: Bayesian interpretation, model selection and resolving dark energy parametrization problem

    SciTech Connect

    Shafieloo, Arman

    2012-05-01

    By introducing Crossing functions and hyper-parameters I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or to assume any particular form of parametrization for cosmological quantities such as the luminosity distance, the Hubble parameter, or the equation of state of dark energy. Instead, the hyper-parameters of the Crossing functions act as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without putting priors on the underlying actual model of the universe and its parameters; hence the issue of dark energy parametrization is resolved. It is also shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method when testing cosmological models against data with high uncertainties.

  1. Bayesian Software Health Management for Aircraft Guidance, Navigation, and Control

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mbaya, Timmy; Menghoel, Ole

    2011-01-01

    Modern aircraft, both piloted fly-by-wire commercial aircraft and UAVs, increasingly depend on highly complex safety-critical software systems with many sensors and computer-controlled actuators. Despite careful design and V&V of the software, severe incidents have happened due to malfunctioning software. In this paper, we discuss the use of Bayesian networks (BNs) to monitor the health of the on-board software and sensor system, and to perform advanced on-board diagnostic reasoning. We focus on the approach to developing reliable and robust health models for the combined software and sensor systems.
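
    As a toy illustration of this kind of diagnostic reasoning, the sketch below evaluates a tiny hand-specified Bayesian network by brute-force enumeration; the node names, probabilities, and structure are hypothetical and far simpler than the health models described in the record.

      from itertools import product

      # Hypothetical two-fault network: SW (software healthy?) and SEN (sensor healthy?)
      # are root causes, IN_RANGE (signal within expected bounds?) is the observed symptom.
      P_SW = {True: 0.99, False: 0.01}
      P_SEN = {True: 0.95, False: 0.05}
      P_IN_RANGE = {                      # P(IN_RANGE=True | SW, SEN)
          (True, True): 0.99,
          (True, False): 0.30,
          (False, True): 0.20,
          (False, False): 0.10,
      }

      def posterior_sw_healthy(in_range_observed):
          """P(software healthy | observed symptom), by enumerating the joint distribution."""
          num = den = 0.0
          for sw, sen in product([True, False], repeat=2):
              p_obs = P_IN_RANGE[(sw, sen)] if in_range_observed else 1.0 - P_IN_RANGE[(sw, sen)]
              joint = P_SW[sw] * P_SEN[sen] * p_obs
              den += joint
              if sw:
                  num += joint
          return num / den

      # An out-of-range signal lowers the posterior that the software is healthy.
      print(posterior_sw_healthy(in_range_observed=False))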

  2. Automated parameter estimation for biological models using Bayesian statistical model checking

    PubMed Central

    2015-01-01

    Background Probabilistic models have gained widespread acceptance in the systems biology community as a useful way to represent complex biological systems. Such models are developed using existing knowledge of the structure and dynamics of the system, experimental observations, and inferences drawn from statistical analysis of empirical data. A key bottleneck in building such models is that some system variables cannot be measured experimentally. These variables are incorporated into the model as numerical parameters. Determining values of these parameters that justify existing experiments and provide reliable predictions when model simulations are performed is a key research problem. Domain experts usually estimate the values of these parameters by fitting the model to experimental data. Model fitting is usually expressed as an optimization problem that requires minimizing a cost-function which measures some notion of distance between the model and the data. This optimization problem is often solved by combining local and global search methods that tend to perform well for the specific application domain. When some prior information about parameters is available, methods such as Bayesian inference are commonly used for parameter learning. Choosing the appropriate parameter search technique requires detailed domain knowledge and insight into the underlying system. Results Using an agent-based model of the dynamics of acute inflammation, we demonstrate a novel parameter estimation algorithm by discovering the amount and schedule of doses of bacterial lipopolysaccharide that guarantee a set of observed clinical outcomes with high probability. We synthesized values of twenty-eight unknown parameters such that the parameterized model instantiated with these parameter values satisfies four specifications describing the dynamic behavior of the model. Conclusions We have developed a new algorithmic technique for discovering parameters in complex stochastic models of

  3. Ionosonde measurements in Bayesian statistical ionospheric tomography with incoherent scatter radar validation

    NASA Astrophysics Data System (ADS)

    Norberg, J.; Virtanen, I. I.; Roininen, L.; Vierinen, J.; Orispää, M.; Kauristie, K.; Lehtinen, M. S.

    2015-09-01

    We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with a prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters, and use Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient and statistically clear inversion algorithm for tomography. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT UHF incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the height distribution of electron density, and outperform the alternative prior information sources. With an ionosonde at continuous disposal, the presented method significantly enhances stand-alone near-real-time ionospheric tomography for the given conditions.
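
    The core computation behind Bayesian statistical inversion with a Gaussian prior can be sketched as follows; the ray-geometry matrix, prior mean and covariance, and noise level are illustrative assumptions, and the sparse Gaussian Markov random field machinery used in the study is not reproduced.

      import numpy as np

      def gaussian_linear_inversion(A, y, mu_prior, cov_prior, cov_noise):
          """Posterior mean/covariance for the linear Gaussian model y = A x + e,
          x ~ N(mu_prior, cov_prior), e ~ N(0, cov_noise): the basic building block
          of Bayesian statistical tomography."""
          Se_inv = np.linalg.inv(cov_noise)
          Sp_inv = np.linalg.inv(cov_prior)
          cov_post = np.linalg.inv(A.T @ Se_inv @ A + Sp_inv)
          mu_post = cov_post @ (A.T @ Se_inv @ y + Sp_inv @ mu_prior)
          return mu_post, cov_post

      # Toy setup: 20 unknown "electron density" pixels observed through 8 ray integrals.
      rng = np.random.default_rng(3)
      n_pix, n_rays = 20, 8
      A = rng.uniform(0.0, 1.0, size=(n_rays, n_pix))          # ray geometry matrix
      x_true = np.sin(np.linspace(0, np.pi, n_pix))             # smooth truth
      y = A @ x_true + 0.05 * rng.standard_normal(n_rays)       # noisy line integrals

      mu_post, cov_post = gaussian_linear_inversion(
          A, y,
          mu_prior=np.zeros(n_pix),                              # e.g. zero or model-based mean
          cov_prior=0.5 * np.eye(n_pix),
          cov_noise=0.05 ** 2 * np.eye(n_rays),
      )
      print(np.round(mu_post[:5], 2))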

  4. Rating locomotive crew diesel emission exposure profiles using statistics and Bayesian Decision Analysis.

    PubMed

    Hewett, Paul; Bullock, William H

    2014-01-01

    For more than 20 years CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Hygiene Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m(3). The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m(3). With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. When compared to the previous American
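
    A minimal sketch of the parametric descriptive statistics behind such exposure ratings (geometric mean and standard deviation, estimated 95th percentile, and exceedance fraction under a lognormal assumption) is shown below; the measurements are hypothetical, the guideline value is taken from the abstract, and the full Bayesian Decision Analysis categorization is not implemented.

      import numpy as np
      from scipy import stats

      def lognormal_exposure_summary(x, oel):
          """Geometric mean/SD, estimated 95th percentile, and exceedance fraction,
          assuming the exposure measurements are lognormally distributed."""
          logs = np.log(x)
          mu, sigma = logs.mean(), logs.std(ddof=1)
          gm, gsd = np.exp(mu), np.exp(sigma)
          x95 = np.exp(mu + 1.645 * sigma)                 # estimated 95th percentile
          exceedance = 1.0 - stats.norm.cdf((np.log(oel) - mu) / sigma)
          return gm, gsd, x95, exceedance

      # Hypothetical elemental-carbon samples (mg/m^3) against the 0.020 mg/m^3 guideline.
      ec = [0.004, 0.006, 0.009, 0.011, 0.005, 0.007, 0.013, 0.008, 0.010, 0.006]
      print(lognormal_exposure_summary(ec, oel=0.020))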

  5. Finding the optimal statistical model to describe target motion during radiotherapy delivery—a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Herschtal, A.; Foroudi, F.; Greer, P. B.; Eade, T. N.; Hindson, B. R.; Kron, T.

    2012-05-01

    Early approaches to characterizing errors in target displacement during a fractionated course of radiotherapy assumed that the underlying fraction-to-fraction variability in target displacement, known as the ‘treatment error’ or ‘random error’, could be regarded as constant across patients. More recent approaches have modelled target displacement allowing for differences in random error between patients. However, until recently it has not been feasible to compare the goodness of fit of alternate models of random error rigorously. This is because the large volumes of real patient data necessary to distinguish between alternative models have only very recently become available. This work uses real-world displacement data collected from 365 patients undergoing radical radiotherapy for prostate cancer to compare five candidate models for target displacement. The simplest model assumes constant random errors across patients, while other models allow for random errors that vary according to one of several candidate distributions. Bayesian statistics and Markov Chain Monte Carlo simulation of the model parameters are used to compare model goodness of fit. We conclude that modelling the random error as inverse gamma distributed provides a clearly superior fit over all alternatives considered. This finding can facilitate more accurate margin recipes and correction strategies.

  6. Bayesian Analysis of Two Stellar Populations in Galactic Globular Clusters. I. Statistical and Computational Methods

    NASA Astrophysics Data System (ADS)

    Stenning, D. C.; Wagner-Kaiser, R.; Robinson, E.; van Dyk, D. A.; von Hippel, T.; Sarajedini, A.; Stein, N.

    2016-07-01

    We develop a Bayesian model for globular clusters composed of multiple stellar populations, extending earlier statistical models for open clusters composed of simple (single) stellar populations. Specifically, we model globular clusters with two populations that differ in helium abundance. Our model assumes a hierarchical structuring of the parameters in which physical properties—age, metallicity, helium abundance, distance, absorption, and initial mass—are common to (i) the cluster as a whole or to (ii) individual populations within a cluster, or are unique to (iii) individual stars. An adaptive Markov chain Monte Carlo (MCMC) algorithm is devised for model fitting that greatly improves convergence relative to its precursor non-adaptive MCMC algorithm. Our model and computational tools are incorporated into an open-source software suite known as BASE-9. We use numerical studies to demonstrate that our method can recover parameters of two-population clusters, and also show how model misspecification can potentially be identified. As a proof of concept, we analyze the two stellar populations of globular cluster NGC 5272 using our model and methods. (BASE-9 is available from GitHub: https://github.com/argiopetech/base/releases).

  7. Bayesian Atmospheric Radiative Transfer (BART): Model, Statistics Driver, and Application to HD 209458b

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Harrington, Joseph; Blecic, Jasmina; Stemm, Madison M.; Lust, Nate B.; Foster, Andrew S.; Rojo, Patricio M.; Loredo, Thomas J.

    2014-11-01

    Multi-wavelength secondary-eclipse and transit depths probe the thermo-chemical properties of exoplanets. In recent years, several research groups have developed retrieval codes to analyze the existing data and study the prospects of future facilities. However, the scientific community has limited access to these packages. Here we premiere the open-source Bayesian Atmospheric Radiative Transfer (BART) code. We discuss the key aspects of the radiative-transfer algorithm and the statistical package. The radiation code includes line databases for all HITRAN molecules, high-temperature H2O, TiO, and VO, and includes a preprocessor for adding additional line databases without recompiling the radiation code. Collision-induced absorption lines are available for H2-H2 and H2-He. The parameterized thermal and molecular abundance profiles can be modified arbitrarily without recompilation. The generated spectra are integrated over arbitrary bandpasses for comparison to data. BART's statistical package, Multi-core Markov-chain Monte Carlo (MC3), is a general-purpose MCMC module. MC3 implements the Differential-Evolution Markov-chain Monte Carlo algorithm (ter Braak 2006, 2009). MC3 converges 20-400 times faster than the usual Metropolis-Hastings MCMC algorithm, and in addition uses the Message Passing Interface (MPI) to parallelize the MCMC chains. We apply the BART retrieval code to the HD 209458b data set to estimate the planet's temperature profile and molecular abundances. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
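
    The differential-evolution MCMC idea attributed to ter Braak can be sketched as follows on a toy two-parameter Gaussian posterior; this is not the MC3 implementation, and the chain count, jump scale, target density, and run length are illustrative assumptions.

      import numpy as np

      rng = np.random.default_rng(11)

      def log_post(theta):
          """Toy posterior: a correlated 2-D Gaussian standing in for a retrieval posterior."""
          cov = np.array([[1.0, 0.8], [0.8, 1.0]])
          return -0.5 * theta @ np.linalg.solve(cov, theta)

      def de_mc(n_chains=8, n_gen=4000, d=2, eps=1e-4):
          """Differential-evolution MCMC: each chain's proposal is its current state plus
          gamma times the difference of two other randomly chosen chains."""
          gamma = 2.38 / np.sqrt(2 * d)
          x = rng.standard_normal((n_chains, d))
          lp = np.array([log_post(xi) for xi in x])
          chain = []
          for _ in range(n_gen):
              for i in range(n_chains):
                  r1, r2 = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
                  prop = x[i] + gamma * (x[r1] - x[r2]) + eps * rng.standard_normal(d)
                  lp_prop = log_post(prop)
                  if np.log(rng.uniform()) < lp_prop - lp[i]:
                      x[i], lp[i] = prop, lp_prop
              chain.append(x.copy())
          return np.concatenate(chain[n_gen // 2:])       # discard burn-in

      samples = de_mc()
      print(np.cov(samples.T).round(2))                   # should roughly recover the target covariance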

  8. Bayesian statistical approaches to compositional analyses of transgenic crops 2. Application and validation of informative prior distributions.

    PubMed

    Harrison, Jay M; Breeze, Matthew L; Berman, Kristina H; Harrigan, George G

    2013-03-01

    Bayesian approaches to evaluation of crop composition data allow simpler interpretations than traditional statistical significance tests. An important advantage of Bayesian approaches is that they allow formal incorporation of previously generated data through prior distributions in the analysis steps. This manuscript describes key steps to ensure meaningful and transparent selection and application of informative prior distributions. These include (i) review of previous data in the scientific literature to form the prior distributions, (ii) proper statistical model specification and documentation, (iii) graphical analyses to evaluate the fit of the statistical model to new study data, and (iv) sensitivity analyses to evaluate the robustness of results to the choice of prior distribution. The validity of the prior distribution for any crop component is critical to acceptance of Bayesian approaches to compositional analyses and would be essential for studies conducted in a regulatory setting. Selection and validation of prior distributions for three soybean isoflavones (daidzein, genistein, and glycitein) and two oligosaccharides (raffinose and stachyose) are illustrated in a comparative assessment of data obtained on GM and non-GM soybean seed harvested from replicated field sites at multiple locations in the US during the 2009 growing season. PMID:23261475

  9. Bayesian statistical approaches to compositional analyses of transgenic crops 2. Application and validation of informative prior distributions.

    PubMed

    Harrison, Jay M; Breeze, Matthew L; Berman, Kristina H; Harrigan, George G

    2013-03-01

    Bayesian approaches to evaluation of crop composition data allow simpler interpretations than traditional statistical significance tests. An important advantage of Bayesian approaches is that they allow formal incorporation of previously generated data through prior distributions in the analysis steps. This manuscript describes key steps to ensure meaningful and transparent selection and application of informative prior distributions. These include (i) review of previous data in the scientific literature to form the prior distributions, (ii) proper statistical model specification and documentation, (iii) graphical analyses to evaluate the fit of the statistical model to new study data, and (iv) sensitivity analyses to evaluate the robustness of results to the choice of prior distribution. The validity of the prior distribution for any crop component is critical to acceptance of Bayesian approaches to compositional analyses and would be essential for studies conducted in a regulatory setting. Selection and validation of prior distributions for three soybean isoflavones (daidzein, genistein, and glycitein) and two oligosaccharides (raffinose and stachyose) are illustrated in a comparative assessment of data obtained on GM and non-GM soybean seed harvested from replicated field sites at multiple locations in the US during the 2009 growing season.

  10. Bayesian Statistical Analysis of Historical and Late Holocene Rates of Sea-Level Change

    NASA Astrophysics Data System (ADS)

    Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin

    2014-05-01

    A fundamental concern associated with climate change is the rate at which sea levels are rising. Studies of past sea level (particularly beyond the instrumental data range) allow modern sea-level rise to be placed in a more complete context. Considering this, we perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level rise, to determine when modern rates of sea-level rise began and to observe how these rates have been changing over time. Many of the current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and it can ignore uncertainties that arise as part of the data collection exercise. This can lead to overconfidence in the sea-level trends being characterized. The proposed Bayesian model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea-level are changing over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, this is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary, in this case, for the model to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method provides a flexible fit and it allows for the direct estimation of the rate process with full consideration of all sources of uncertainty. Analysis of tide-gauge datasets and proxy reconstructions in this way means that changing rates of sea level can be estimated more comprehensively and accurately than previously possible. The model captures the continuous and dynamic evolution of sea-level change and results show that not only are modern sea levels rising but that the rates

  11. A Bayesian statistical assessment of representative samples for asteroidal or meteoritical material

    NASA Astrophysics Data System (ADS)

    Carter, Jonathan N.; Sephton, Mark A.

    2013-06-01

    Primitive substances in asteroid and meteorite materials represent a record of early solar system evolution. To allow the study of these materials, they must be collected and transferred to the laboratory. Collection during sample return missions requires an assessment of the size of samples needed. Meteorite falls or finds must be subdivided into appropriate subsamples for analysis by successive generations of scientists. It is essential, therefore, to determine a representative mass or volume at which the collected or allocated sample is representative of the whole. For the first time, we have used a Bayesian statistical approach and a selected meteorite sample, Murchison, to identify a recommended smallest sample mass that can be used without interferences from sampling bias. Enhancing background knowledge to inform sample selection and analysis is an effective means of increasing the probability of obtaining a positive scientific outcome. The influence of the subdivision mechanism when preparing samples for distribution has also been examined. Assuming a similar size distribution of fragments to that of the Murchison meteorite, cubes can be as representative as fragments, but at orders of magnitude smaller sizes. We find that: (1) at all defined probabilities (90%, 95%, and 99%), nanometer-sized particles (where the axes of a three-dimensional sample are less than a nanometer in length) are never representative of the whole; (2) at the intermediate and highest defined probabilities (95% and 99%), micrometer-sized particles are never representative of the whole; and (3) for micrometer-sized samples, the only sample that is representative of the whole is a cube and then only at a 90% probability. The difference between cubes and fragments becomes less important as sample size increases and any >0.5 mm-sized sample will be representative of the whole with a probability of 99.9%. The results provide guidance for sample return mission planners and curators or

  12. Image Denoising via Bayesian Estimation of Statistical Parameter Using Generalized Gamma Density Prior in Gaussian Noise Model

    NASA Astrophysics Data System (ADS)

    Kittisuwan, Pichid

    2015-03-01

    The application of image processing in industry has shown remarkable success over the last decade, for example, in security and telecommunication systems. The denoising of natural image corrupted by Gaussian noise is a classical problem in image processing. So, image denoising is an indispensable step during image processing. This paper is concerned with dual-tree complex wavelet-based image denoising using Bayesian techniques. One of the cruxes of the Bayesian image denoising algorithms is to estimate the statistical parameter of the image. Here, we employ maximum a posteriori (MAP) estimation to calculate local observed variance with generalized Gamma density prior for local observed variance and Laplacian or Gaussian distribution for noisy wavelet coefficients. Evidently, our selection of prior distribution is motivated by efficient and flexible properties of generalized Gamma density. The experimental results show that the proposed method yields good denoising results.

  13. An overview of component qualification using Bayesian statistics and energy methods.

    SciTech Connect

    Dohner, Jeffrey Lynn

    2011-09-01

    The overview below is designed to give the reader a limited understanding of Bayesian and Maximum Likelihood (MLE) estimation; a basic understanding of some of the mathematical tools for evaluating the quality of an estimate; an introduction to energy methods; and a limited discussion of damage potential. The discussion then goes on to present a limited account of how energy methods and Bayesian estimation are used together to qualify components. Example problems with solutions have been supplied as a learning aid. Bold letters are used to represent random variables; un-bolded letters represent deterministic values. A concluding section presents a discussion of attributes and concerns.

  14. Improved parameterization of managed grassland in a global process-based vegetation model using Bayesian statistics

    NASA Astrophysics Data System (ADS)

    Rolinski, S.; Müller, C.; Lotze-Campen, H.; Bondeau, A.

    2010-12-01

    information on boundary conditions such as water and light availability or temperature sensitivity. Based on the given limitation factors, a number of sensitive parameters are chosen, e.g. for the phenological development, biomass allocation, and different management regimes. These are subjected to a sensitivity analysis and Bayesian parameter evaluation using the R package FME (Soetart & Petzoldt, Journal of Statistical Software, 2010). Given the extremely different climatic conditions at the FluxNet grass sites, the premises for the global sensitivity analysis are very promising.

  15. Statistical Physics for Adaptive Distributed Control

    NASA Technical Reports Server (NTRS)

    Wolpert, David H.

    2005-01-01

    A viewgraph presentation on statistical physics for distributed adaptive control is shown. The topics include: 1) The Golden Rule; 2) Advantages; 3) Roadmap; 4) What is Distributed Control? 5) Review of Information Theory; 6) Iterative Distributed Control; 7) Minimizing L(q) Via Gradient Descent; and 8) Adaptive Distributed Control.

  16. Statistical analysis using the Bayesian nonparametric method for irradiation embrittlement of reactor pressure vessels

    NASA Astrophysics Data System (ADS)

    Takamizawa, Hisashi; Itoh, Hiroto; Nishiyama, Yutaka

    2016-10-01

    In order to understand neutron irradiation embrittlement in high fluence regions, statistical analysis using the Bayesian nonparametric (BNP) method was performed for the Japanese surveillance and material test reactor irradiation database. The BNP method is essentially expressed as an infinite summation of normal distributions, with the input data subdivided into clusters that share statistical parameters, such as mean and standard deviation, which are estimated for each cluster to predict shifts in ductile-to-brittle transition temperature (DBTT). The clusters typically depend on chemical compositions, irradiation conditions, and the degree of irradiation embrittlement. Specific variables contributing to the irradiation embrittlement include the content of Cu, Ni, P, Si, and Mn in the pressure vessel steels, neutron flux, neutron fluence, and irradiation temperatures. It was found that the measured shifts of DBTT correlated well with the calculated ones. Data associated with the same materials were subdivided into the same clusters even as neutron fluence increased. Comparing cluster IDs 2 and 6, embrittlement of high-Cu-bearing materials (>0.07 wt%) was larger than that of low-Cu-bearing materials (<0.07 wt%). This is attributed to irradiation-induced Cu-enriched clusters, as well as those that are irradiation-enhanced [4]. A similar feature is recognized for cluster IDs 5 and 8 in materials with a higher Ni content. A flux effect at higher flux was demonstrated for cluster ID 3, comprising MTR irradiation in the high flux region (≥1 × 10¹³ n/cm²·s) [44]. For cluster ID 10, the classification is also governed by a flux effect, where embrittlement is accelerated in high-Cu-bearing materials irradiated at lower flux levels (less than 5 × 10⁹ n/cm²·s). This is possibly due to increased thermal-equilibrium vacancies [44,45]. Based on all the above considerations, it was ascertained that data belonging to the same cluster ID
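
    In the same spirit, a Dirichlet-process mixture can be fitted with an off-the-shelf routine as sketched below; the four-variable synthetic data set merely stands in for the surveillance database, and the use of scikit-learn's BayesianGaussianMixture is an assumption for illustration rather than the authors' specific BNP formulation.

      import numpy as np
      from sklearn.mixture import BayesianGaussianMixture

      # Hypothetical stand-in for the database: rows of
      # [Cu (wt%), Ni (wt%), fluence (arbitrary units), measured DBTT shift].
      rng = np.random.default_rng(5)
      low_cu  = np.column_stack([rng.normal(0.05, 0.01, 60), rng.normal(0.6, 0.1, 60),
                                 rng.uniform(1, 5, 60), rng.normal(20, 5, 60)])
      high_cu = np.column_stack([rng.normal(0.15, 0.02, 60), rng.normal(0.6, 0.1, 60),
                                 rng.uniform(1, 5, 60), rng.normal(60, 10, 60)])
      X = np.vstack([low_cu, high_cu])

      # A Dirichlet-process mixture lets the data choose how many clusters are active.
      dpgmm = BayesianGaussianMixture(
          n_components=10,                                   # upper bound, not the answer
          weight_concentration_prior_type="dirichlet_process",
          covariance_type="full",
          random_state=0,
      ).fit(X)

      labels = dpgmm.predict(X)
      print(sorted(np.unique(labels)), np.round(dpgmm.weights_, 2))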

  17. Bayesian Statistical Inference in Ion-Channel Models with Exact Missed Event Correction.

    PubMed

    Epstein, Michael; Calderhead, Ben; Girolami, Mark A; Sivilotti, Lucia G

    2016-07-26

    The stochastic behavior of single ion channels is most often described as an aggregated continuous-time Markov process with discrete states. For ligand-gated channels each state can represent a different conformation of the channel protein or a different number of bound ligands. Single-channel recordings show only whether the channel is open or shut: states of equal conductance are aggregated, so transitions between them have to be inferred indirectly. The requirement to filter noise from the raw signal further complicates the modeling process, as it limits the time resolution of the data. The consequence of the reduced bandwidth is that openings or shuttings that are shorter than the resolution cannot be observed; these are known as missed events. Postulated models fitted using filtered data must therefore explicitly account for missed events to avoid bias in the estimation of rate parameters and to assess parameter identifiability accurately. In this article, we present the first, to our knowledge, Bayesian modeling of ion-channels with exact missed events correction. Bayesian analysis represents uncertain knowledge of the true value of model parameters by considering these parameters as random variables. This allows us to gain a full appreciation of parameter identifiability and uncertainty when estimating values for model parameters. However, Bayesian inference is particularly challenging in this context as the correction for missed events increases the computational complexity of the model likelihood. Nonetheless, we successfully implemented a two-step Markov chain Monte Carlo method that we called "BICME", which performs Bayesian inference in models of realistic complexity. The method is demonstrated on synthetic and real single-channel data from muscle nicotinic acetylcholine channels. We show that parameter uncertainty can be characterized more accurately than with maximum-likelihood methods. Our code for performing inference in these ion channel

  18. Predicting uncertainty in future marine ice sheet volume using Bayesian statistical methods

    NASA Astrophysics Data System (ADS)

    Davis, A. D.

    2015-12-01

    The marine ice instability can trigger rapid retreat of marine ice streams. Recent observations suggest that marine ice systems in West Antarctica have begun retreating. However, unknown ice dynamics, computationally intensive mathematical models, and uncertain parameters in these models make predicting retreat rate and ice volume difficult. In this work, we fuse current observational data with ice stream/shelf models to develop probabilistic predictions of future grounded ice sheet volume. Given observational data (e.g., thickness, surface elevation, and velocity) and a forward model that relates uncertain parameters (e.g., basal friction and basal topography) to these observations, we use a Bayesian framework to define a posterior distribution over the parameters. A stochastic predictive model then propagates uncertainties in these parameters to uncertainty in a particular quantity of interest (QoI)---here, the volume of grounded ice at a specified future time. While the Bayesian approach can in principle characterize the posterior predictive distribution of the QoI, the computational cost of both the forward and predictive models makes this effort prohibitively expensive. To tackle this challenge, we introduce a new Markov chain Monte Carlo method that constructs convergent approximations of the QoI target density in an online fashion, yielding accurate characterizations of future ice sheet volume at significantly reduced computational cost.Our second goal is to attribute uncertainty in these Bayesian predictions to uncertainties in particular parameters. Doing so can help target data collection, for the purpose of constraining the parameters that contribute most strongly to uncertainty in the future volume of grounded ice. For instance, smaller uncertainties in parameters to which the QoI is highly sensitive may account for more variability in the prediction than larger uncertainties in parameters to which the QoI is less sensitive. We use global sensitivity

  19. On the limitations of standard statistical modeling in biological systems: a full Bayesian approach for biology.

    PubMed

    Gomez-Ramirez, Jaime; Sanz, Ricardo

    2013-09-01

    One of the most important scientific challenges today is the quantitative and predictive understanding of biological function. Classical mathematical and computational approaches have been enormously successful in modeling inert matter, but they may be inadequate to address inherent features of biological systems. We address the conceptual and methodological obstacles that lie in the inverse problem in biological systems modeling. We introduce a full Bayesian approach (FBA), a theoretical framework to study biological function, in which probability distributions are conditional on biophysical information that physically resides in the biological system that is studied by the scientist.

  20. An absolute chronology for early Egypt using radiocarbon dating and Bayesian statistical modelling.

    PubMed

    Dee, Michael; Wengrow, David; Shortland, Andrew; Stevenson, Alice; Brock, Fiona; Girdland Flink, Linus; Bronk Ramsey, Christopher

    2013-11-01

    The Egyptian state was formed prior to the existence of verifiable historical records. Conventional dates for its formation are based on the relative ordering of artefacts. This approach is no longer considered sufficient for cogent historical analysis. Here, we produce an absolute chronology for Early Egypt by combining radiocarbon and archaeological evidence within a Bayesian paradigm. Our data cover the full trajectory of Egyptian state formation and indicate that the process occurred more rapidly than previously thought. We provide a timeline for the First Dynasty of Egypt of generational-scale resolution that concurs with prevailing archaeological analysis and produce a chronometric date for the foundation of Egypt that distinguishes between historical estimates.

  1. Quantifying Trace Amounts of Aggregates in Biopharmaceuticals Using Analytical Ultracentrifugation Sedimentation Velocity: Bayesian Analyses and F Statistics.

    PubMed

    Wafer, Lucas; Kloczewiak, Marek; Luo, Yin

    2016-07-01

    Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected, sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique.
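
    The generic extra-sum-of-squares F test underlying such comparisons of nested fits can be sketched as follows; the residual sums of squares, parameter counts, and scan length are hypothetical numbers, and SEDFIT's own F-statistic implementation may differ in detail.

      import numpy as np
      from scipy import stats

      def extra_sum_of_squares_f_test(rss_simple, p_simple, rss_complex, p_complex, n):
          """F test comparing a simpler fit (e.g. without an extra peak) against a more
          complex nested fit; a small p-value supports keeping the extra species."""
          df_num = p_complex - p_simple
          df_den = n - p_complex
          f_stat = ((rss_simple - rss_complex) / df_num) / (rss_complex / df_den)
          return f_stat, stats.f.sf(f_stat, df_num, df_den)

      # Hypothetical residual sums of squares from two nested fits of n = 500 scan points:
      # one without and one with a trace high-molar-mass peak.
      print(extra_sum_of_squares_f_test(rss_simple=1.95, p_simple=4,
                                        rss_complex=1.60, p_complex=6, n=500))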

  2. An absolute chronology for early Egypt using radiocarbon dating and Bayesian statistical modelling

    PubMed Central

    Dee, Michael; Wengrow, David; Shortland, Andrew; Stevenson, Alice; Brock, Fiona; Girdland Flink, Linus; Bronk Ramsey, Christopher

    2013-01-01

    The Egyptian state was formed prior to the existence of verifiable historical records. Conventional dates for its formation are based on the relative ordering of artefacts. This approach is no longer considered sufficient for cogent historical analysis. Here, we produce an absolute chronology for Early Egypt by combining radiocarbon and archaeological evidence within a Bayesian paradigm. Our data cover the full trajectory of Egyptian state formation and indicate that the process occurred more rapidly than previously thought. We provide a timeline for the First Dynasty of Egypt of generational-scale resolution that concurs with prevailing archaeological analysis and produce a chronometric date for the foundation of Egypt that distinguishes between historical estimates. PMID:24204188

  3. Databased comparison of Sparse Bayesian Learning and Multiple Linear Regression for statistical downscaling of low flow indices

    NASA Astrophysics Data System (ADS)

    Joshi, Deepti; St-Hilaire, André; Daigle, Anik; Ouarda, Taha B. M. J.

    2013-04-01

    This study attempts to compare the performance of two statistical downscaling frameworks in downscaling hydrological indices (descriptive statistics) characterizing the low flow regimes of three rivers in Eastern Canada - Moisie, Romaine and Ouelle. The statistical models selected are Relevance Vector Machine (RVM), an implementation of Sparse Bayesian Learning, and the Automated Statistical Downscaling tool (ASD), an implementation of Multiple Linear Regression. Inputs to both frameworks involve climate variables significantly (α = 0.05) correlated with the indices. These variables were processed using Canonical Correlation Analysis and the resulting canonical variate scores were used as input to RVM to estimate the selected low flow indices. In ASD, the significantly correlated climate variables were subjected to backward stepwise predictor selection and the selected predictors were subsequently used to estimate the selected low flow indices using Multiple Linear Regression. With respect to the correlation between climate variables and the selected low flow indices, it was observed that all indices are influenced, primarily, by wind components (Vertical, Zonal and Meridional) and humidity variables (Specific and Relative Humidity). The downscaling performance of the framework involving RVM was found to be better than ASD in terms of Relative Root Mean Square Error, Relative Mean Absolute Bias and Coefficient of Determination. In all cases, the former resulted in less variability of the performance indices between calibration and validation sets, implying better generalization ability than for the latter.

  4. Statistical process control in nursing research.

    PubMed

    Polit, Denise F; Chaboyer, Wendy

    2012-02-01

    In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. PMID:22095634
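
    A minimal sketch of the kind of chart described above: an individuals control chart with 3-sigma limits estimated from the average moving range, where points beyond the limits flag potential special-cause variation. The data, the size of the post-intervention shift and the baseline length are illustrative, not drawn from the study.

```python
# Individuals (X) control chart sketch: limits from the average moving range.
import numpy as np

rng = np.random.default_rng(1)
baseline = rng.normal(50.0, 2.0, size=20)      # pre-intervention observations
post = rng.normal(40.0, 2.0, size=10)          # shifted process after an intervention
series = np.concatenate([baseline, post])

moving_range = np.abs(np.diff(baseline))
sigma_hat = moving_range.mean() / 1.128        # d2 constant for moving ranges of size 2
center = baseline.mean()
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

for i, x in enumerate(series, start=1):
    flag = "special cause?" if (x > ucl or x < lcl) else ""
    print(f"obs {i:2d}: {x:6.2f}  {flag}")
print(f"center = {center:.2f}, LCL = {lcl:.2f}, UCL = {ucl:.2f}")
```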

  5. Statistical quality control for VLSIC fabrication processes

    SciTech Connect

    Mozumder, P.K.

    1989-01-01

    As the complexity of VLSICs increases and device dimensions shrink, random fluctuations become the main factor limiting the parametric yield. Whenever a new process is developed, the initial yields are low. The rate of climbing the learning curve is slow, i.e., the time necessary to bring the yield above an economically acceptable value can be unacceptably long, resulting in lost revenue and competitive edge in the market. The slow rates of climbing the learning curve and the low initial yields can be countered by using design methodologies that take into account the random fluctuations in the fabrication processes, and by using statistical on-line and off-line control during wafer fabrication. An integrated CAD-CAM approach with profit maximization as the objective is necessary to design and fabricate present-day VLSICs. In this thesis the author proposes a methodology for monitoring and statistically controlling VLSIC manufacturing processes as part of an integrated CAD-CAM system. Present-day statistical quality control systems fail to function satisfactorily due to the lack of in-situ and in-line data and the absence of statistical techniques that take into account the multi-dimensionality of the data. A concerted effort has to be made to increase the number of in-situ parameters that are measured during the fabrication process using new-generation equipment and sensors. Algorithms for identifying the minimal set of observable in-situ and in-line parameters that have to be measured to monitor the fabrication process are presented. The methodology for statistical quality control is based on the exploration of the multivariate distribution of the observed in-process parameters in the region of acceptability specified by the customer. Criteria for comparing the distributions of the normal process to that of the process under control are used to make the quality control decisions.

  6. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional, non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  7. Applied Behavior Analysis and Statistical Process Control?

    ERIC Educational Resources Information Center

    Hopkins, B. L.

    1995-01-01

    Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…

  8. Statistical process control for total quality

    NASA Astrophysics Data System (ADS)

    Ali, Syed W.

    1992-06-01

    The paper explains the techniques and applications of statistical process control (SPC). Examples of control charts used in the Poseidon program of the NASA ocean topography experiment (TOPEX) and a brief discussion of Taguchi methods are presented. It is noted that SPC involves everyone in process improvement by providing objective, workable data. It permits continuous improvement instead of merely aiming for all parts to be within a tolerance band.

  9. Comparing Trend and Gap Statistics across Tests: Distributional Change Using Ordinal Methods and Bayesian Inference

    ERIC Educational Resources Information Center

    Denbleyker, John Nickolas

    2012-01-01

    The shortcomings of the proportion above cut (PAC) statistic used so prominently in the educational landscape render it a very problematic measure for making correct inferences with student test data. The limitations of PAC-based statistics are more pronounced with cross-test comparisons due to their dependency on cut-score locations. A better…

  10. Bayesian statistics applied to the location of the source of explosions at Stromboli Volcano, Italy

    USGS Publications Warehouse

    Saccorotti, G.; Chouet, B.; Martini, M.; Scarpa, R.

    1998-01-01

    We present a method for determining the location and spatial extent of the source of explosions at Stromboli Volcano, Italy, based on a Bayesian inversion of the slowness vector derived from frequency-slowness analyses of array data. The method searches for source locations that minimize the error between the expected and observed slowness vectors. For a given set of model parameters, the conditional probability density function of slowness vectors is approximated by a Gaussian distribution of expected errors. The method is tested with synthetics using a five-layer velocity model derived for the north flank of Stromboli and a smoothed velocity model derived from a power-law approximation of the layered structure. Application to data from Stromboli allows for a detailed examination of uncertainties in source location due to experimental errors and incomplete knowledge of the Earth model. Although the solutions are not constrained in the radial direction, excellent resolution is achieved in both transverse and depth directions. Under the assumption that the horizontal extent of the source does not exceed the crater dimension, the 90% confidence region in the estimate of the explosive source location corresponds to a small volume extending from a depth of about 100 m to a maximum depth of about 300 m beneath the active vents, with a maximum likelihood source region located in the 120- to 180-m-depth interval.
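
    The sketch below shows the general shape of such a Bayesian grid search: each trial source position is mapped to a predicted data vector by a forward model and scored with a Gaussian error distribution. The uniform-velocity forward model, observed values, uncertainty and grid are invented stand-ins for the layered velocity models used in the study.

```python
# Schematic Bayesian grid search for a 2-D source position (offset, depth).
import numpy as np

obs = np.array([0.30, 0.25])      # observed slowness-vector components (s/km), toy values
sigma = 0.03                      # assumed Gaussian slowness uncertainty
v = 2.0                           # km/s, uniform velocity assumed in the toy forward model

xs = np.linspace(-1.0, 1.0, 201)  # transverse offset of trial sources (km)
zs = np.linspace(0.05, 0.60, 56)  # depth of trial sources (km)

log_post = np.empty((xs.size, zs.size))
for i, x in enumerate(xs):
    for j, z in enumerate(zs):
        r = np.hypot(x, z)
        pred = np.array([x, z]) / (r * v)   # toy forward model: ray direction / velocity
        log_post[i, j] = -0.5 * np.sum((pred - obs) ** 2) / sigma**2

post = np.exp(log_post - log_post.max())
post /= post.sum()
i_best, j_best = np.unravel_index(np.argmax(post), post.shape)
print(f"most probable source: offset = {xs[i_best]:+.2f} km, depth = {zs[j_best]:.2f} km")
```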

  11. Multiple LacI-mediated loops revealed by Bayesian statistics and tethered particle motion

    PubMed Central

    Johnson, Stephanie; van de Meent, Jan-Willem; Phillips, Rob; Wiggins, Chris H.; Lindén, Martin

    2014-01-01

    The bacterial transcription factor LacI loops DNA by binding to two separate locations on the DNA simultaneously. Despite being one of the best-studied model systems for transcriptional regulation, the number and conformations of loop structures accessible to LacI remain unclear, though the importance of multiple coexisting loops has been implicated in interactions between LacI and other cellular regulators of gene expression. To probe this issue, we have developed a new analysis method for tethered particle motion, a versatile and commonly used in vitro single-molecule technique. Our method, vbTPM, performs variational Bayesian inference in hidden Markov models. It learns the number of distinct states (i.e. DNA–protein conformations) directly from tethered particle motion data with better resolution than existing methods, while easily correcting for common experimental artifacts. Studying short (roughly 100 bp) LacI-mediated loops, we provide evidence for three distinct loop structures, more than previously reported in single-molecule studies. Moreover, our results confirm that changes in LacI conformation and DNA-binding topology both contribute to the repertoire of LacI-mediated loops formed in vitro, and provide qualitatively new input for models of looping and transcriptional regulation. We expect vbTPM to be broadly useful for probing complex protein–nucleic acid interactions. PMID:25120267

  12. Statistical Process Control In Photolithography Applications

    NASA Astrophysics Data System (ADS)

    Pritchard, Lois B.

    1987-04-01

    Recently there have been numerous papers, articles and books on the benefits and rewards of Statistical Process Control for manufacturing processes. Models are used that quite adequately describe methods appropriate for the factory situation where many discrete and identical items are turned out and where a limited number of parameters are inspected along the line. Photolithographic applications often require different statistical models from the usual factory methods. The difficulties encountered in getting started with SPC lie in determining: 1. what parameters should be tracked 2. what statistical model is appropriate for each of those parameters 3. how to use the models chosen. This paper describes three statistical models that, among them, account for most operations within a photolithographic manufacturing application. The process of determining which model is appropriate is described, along with the basic rules that may be used in making the determination. In addition, the application of each method is shown, and action instructions are covered. Initially the "x-bar, R" model is described. This model is the one most often found in off-the-shelf software packages, and enjoys wide applications in equipment tracking, besides general use process control. Secondly the "x, moving-R" model is described. This is appropriate where a series of measurements of the same parameter is taken on a single item, perhaps at different locations, such as in dimensional uniformity control for wafers or photomasks. In this case, each "x" is a single observation, or a number of measurements of a single observation, as opposed to a mean value taken in a sampling scheme. Thirdly a model for a Poisson distribution is described, which tends to fit defect density data, particulate counts, where count data is accumulated per unit or per unit time. The purpose of the paper is to briefly describe the included models, for those with little or no background in statistics, to enable them to
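
    The sketch below computes control limits for the three models named above (x-bar/R, individuals/moving-R, and a Poisson c chart) using standard SPC constants; the data are simulated placeholders rather than photolithography measurements.

```python
# Control limits for the three chart models described above (illustrative data).
import numpy as np

rng = np.random.default_rng(0)

# 1) x-bar / R chart: subgroups of repeated measurements of the same parameter
subgroups = rng.normal(1.00, 0.02, size=(25, 5))          # 25 subgroups of size 5
xbar, R = subgroups.mean(axis=1), np.ptp(subgroups, axis=1)
A2, D3, D4 = 0.577, 0.0, 2.114                            # standard constants for n = 5
print("x-bar chart limits:", xbar.mean() - A2 * R.mean(), xbar.mean() + A2 * R.mean())
print("R chart limits:    ", D3 * R.mean(), D4 * R.mean())

# 2) x / moving-R chart: single observations (e.g. per-site uniformity readings)
x = rng.normal(0.50, 0.01, size=30)
sigma_hat = np.abs(np.diff(x)).mean() / 1.128             # d2 for moving ranges of size 2
print("individuals chart limits:", x.mean() - 3 * sigma_hat, x.mean() + 3 * sigma_hat)

# 3) c chart for Poisson count data (e.g. defects or particles per unit)
counts = rng.poisson(4.0, size=40)
c_bar = counts.mean()
print("c chart limits:", max(c_bar - 3 * np.sqrt(c_bar), 0.0), c_bar + 3 * np.sqrt(c_bar))
```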

  13. Applying statistical process control to the adaptive rate control problem

    NASA Astrophysics Data System (ADS)

    Manohar, Nelson R.; Willebeek-LeMair, Marc H.; Prakash, Atul

    1997-12-01

    Due to the heterogeneity and shared resource nature of today's computer network environments, the end-to-end delivery of multimedia requires adaptive mechanisms to be effective. We present a framework for the adaptive streaming of heterogeneous media. We introduce the application of online statistical process control (SPC) to the problem of dynamic rate control. In SPC, the goal is to establish (and preserve) a state of statistical quality control (i.e., controlled variability around a target mean) over a process. We consider the end-to-end streaming of multimedia content over the internet as the process to be controlled. First, at each client, we measure process performance and apply statistical quality control (SQC) with respect to application-level requirements. Then, we guide an adaptive rate control (ARC) problem at the server based on the statistical significance of trends and departures on these measurements. We show this scheme facilitates handling of heterogeneous media. Last, because SPC is designed to monitor long-term process performance, we show that our online SPC scheme could be used to adapt to various degrees of long-term (network) variability (i.e., statistically significant process shifts as opposed to short-term random fluctuations). We develop several examples and analyze its statistical behavior and guarantees.
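
    A toy sketch of the control loop described above: a client-side quality measurement is monitored against 3-sigma limits, and the sending rate is adjusted only when a statistically significant departure is signalled rather than on every fluctuation. The class name, thresholds and delay figures are invented for illustration and are not the authors' algorithm.

```python
# SPC-gated rate adaptation sketch: react to special-cause shifts, ignore noise.
import numpy as np

rng = np.random.default_rng(7)

class SPCRateController:
    def __init__(self, target, sigma, rate, step=0.1):
        self.ucl = target + 3 * sigma      # upper control limit on measured delay
        self.lcl = target - 3 * sigma      # lower control limit
        self.rate = rate
        self.step = step

    def update(self, delay_ms):
        if delay_ms > self.ucl:            # statistically significant congestion: back off
            self.rate *= (1 - self.step)
        elif delay_ms < self.lcl:          # delay significantly below target: probe upward
            self.rate *= (1 + self.step)
        return self.rate                   # common-cause variation: leave the rate alone

ctrl = SPCRateController(target=80.0, sigma=5.0, rate=1.5)        # rate in Mbit/s (assumed)
delays = np.concatenate([rng.normal(80, 5, 30), rng.normal(110, 5, 10)])
for d in delays:
    rate = ctrl.update(d)
print(f"rate after the simulated congestion episode: {rate:.2f} Mbit/s")
```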

  14. Neural network uncertainty assessment using Bayesian statistics: a remote sensing application

    NASA Technical Reports Server (NTRS)

    Aires, F.; Prigent, C.; Rossow, W. B.

    2004-01-01

    Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can be used effectively to represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. A regularization strategy based on principal component

  15. How Reliable is Bayesian Model Averaging Under Noisy Data? Statistical Assessment and Implications for Robust Model Selection

    NASA Astrophysics Data System (ADS)

    Schöniger, Anneli; Wöhling, Thomas; Nowak, Wolfgang

    2014-05-01

    Bayesian model averaging ranks the predictive capabilities of alternative conceptual models based on Bayes' theorem. The individual models are weighted by their posterior probability of being the best one in the considered set of models. Finally, their predictions are combined into a robust weighted average and the predictive uncertainty can be quantified. This rigorous procedure does not, however, yet account for possible instabilities due to measurement noise in the calibration data set. This is a major drawback, since posterior model weights may suffer a lack of robustness related to the uncertainty in noisy data, which may compromise the reliability of model ranking. We present a new statistical concept to account for measurement noise as a source of uncertainty for the weights in Bayesian model averaging. Our suggested upgrade reflects the limited information content of data for the purpose of model selection. It allows us to assess the significance of the determined posterior model weights, the confidence in model selection, and the accuracy of the quantified predictive uncertainty. Our approach rests on a brute-force Monte Carlo framework. We determine the robustness of model weights against measurement noise by repeatedly perturbing the observed data with random realizations of measurement error. Then, we analyze the induced variability in posterior model weights and introduce this "weighting variance" as an additional term into the overall prediction uncertainty analysis scheme. We further determine the theoretical upper limit in performance of the model set which is imposed by measurement noise. As an extension to the merely relative model ranking, this analysis provides a measure of absolute model performance. To finally decide whether better data or longer time series are needed to ensure a robust basis for model selection, we resample the measurement time series and assess the convergence of model weights for increasing time series length. We illustrate
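
    A minimal sketch of the brute-force Monte Carlo idea described above: perturb the calibration data with random measurement error, recompute the Bayesian model weights, and summarize the induced "weighting variance". The two candidate models and all numbers are invented; the study's hydrological models are not reproduced here.

```python
# Robustness of Bayesian model-averaging weights under measurement noise (toy example).
import numpy as np

rng = np.random.default_rng(42)
x = np.linspace(0, 1, 20)
truth = 1.0 + 2.0 * x
sigma_noise = 0.3
data = truth + rng.normal(0, sigma_noise, x.size)

# Two invented candidate "models" with fixed predictions
models = {"linear": 1.0 + 2.0 * x, "flat": np.full_like(x, truth.mean())}

def bma_weights(y):
    logL = {name: -0.5 * np.sum((y - pred) ** 2) / sigma_noise**2
            for name, pred in models.items()}
    m = max(logL.values())
    w = {name: np.exp(v - m) for name, v in logL.items()}
    total = sum(w.values())
    return {name: v / total for name, v in w.items()}

# Repeatedly perturb the data with noise realizations and track the weight of one model
replicates = []
for _ in range(500):
    perturbed = data + rng.normal(0, sigma_noise, x.size)
    replicates.append(bma_weights(perturbed)["linear"])

replicates = np.array(replicates)
print(f"weight of 'linear': mean {replicates.mean():.3f}, "
      f"std (sqrt of weighting variance) {replicates.std():.3f}")
```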

  16. Analyzing Genome-Wide Association Studies with an FDR Controlling Modification of the Bayesian Information Criterion

    PubMed Central

    Dolejsi, Erich; Bodenstorfer, Bernhard; Frommlet, Florian

    2014-01-01

    The prevailing method of analyzing GWAS data is still to test each marker individually, although from a statistical point of view it is quite obvious that in case of complex traits such single marker tests are not ideal. Recently several model selection approaches for GWAS have been suggested, most of them based on LASSO-type procedures. Here we will discuss an alternative model selection approach which is based on a modification of the Bayesian Information Criterion (mBIC2) which was previously shown to have certain asymptotic optimality properties in terms of minimizing the misclassification error. Heuristic search strategies are introduced which attempt to find the model which minimizes mBIC2, and which are efficient enough to allow the analysis of GWAS data. Our approach is implemented in a software package called MOSGWA. Its performance in case control GWAS is compared with the two algorithms HLASSO and d-GWASelect, as well as with single marker tests, where we performed a simulation study based on real SNP data from the POPRES sample. Our results show that MOSGWA performs slightly better than HLASSO, where specifically for more complex models MOSGWA is more powerful with only a slight increase in Type I error. On the other hand according to our simulations GWASelect does not at all control the type I error when used to automatically determine the number of important SNPs. We also reanalyze the GWAS data from the Wellcome Trust Case-Control Consortium and compare the findings of the different procedures, where MOSGWA detects for complex diseases a number of interesting SNPs which are not found by other methods. PMID:25061809

  17. Beginning a statistical process control program

    SciTech Connect

    Davis, H.D.; Burnett, M.

    1989-01-01

    Statistical Process Control (SPC) has in recent years become a "hot" topic in the manufacturing world. It has been touted as the means by which Japanese manufacturers have moved to the forefront of world-class quality, and subsequent financial power. Is SPC a business-saving strategy? What is SPC? What is the cost of quality, and can we afford it? Is SPC applicable to the petroleum refining and petrochemical manufacturing industry, or are these manufacturing operations so deterministic by nature that the statistics only show the accuracy and precision of the laboratory work? If SPC is worthwhile, how do we get started, and what problems can we expect to encounter? If we begin an SPC program, how will it benefit us? These questions are addressed by the author. The view presented here is a management perspective with emphasis on rationale and implementation methods.

  18. Applied behavior analysis and statistical process control?

    PubMed Central

    Hopkins, B L

    1995-01-01

    This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156

  19. Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.

    PubMed

    Patri, Jean-François; Diard, Julien; Perrier, Pascal

    2015-12-01

    The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way. PMID:26497359

  20. Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.

    PubMed

    Patri, Jean-François; Diard, Julien; Perrier, Pascal

    2015-12-01

    The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way.

  1. Controlling statistical moments of stochastic dynamical networks

    NASA Astrophysics Data System (ADS)

    Bielievtsov, Dmytro; Ladenbauer, Josef; Obermayer, Klaus

    2016-07-01

    We consider a general class of stochastic networks and ask which network nodes need to be controlled, and how, to stabilize and switch between desired metastable (target) states in terms of the first and second statistical moments of the system. We first show that it is sufficient to directly interfere with a subset of nodes which can be identified using information about the graph of the network only. Then we develop a suitable method for feedback control which acts on that subset of nodes and preserves the covariance structure of the desired target state. Finally, we demonstrate our theoretical results using a stochastic Hopfield network and a global brain model. Our results are applicable to a variety of (model) networks and further our understanding of the relationship between network structure and collective dynamics for the benefit of effective control.

  2. Controlling statistical moments of stochastic dynamical networks.

    PubMed

    Bielievtsov, Dmytro; Ladenbauer, Josef; Obermayer, Klaus

    2016-07-01

    We consider a general class of stochastic networks and ask which network nodes need to be controlled, and how, to stabilize and switch between desired metastable (target) states in terms of the first and second statistical moments of the system. We first show that it is sufficient to directly interfere with a subset of nodes which can be identified using information about the graph of the network only. Then we develop a suitable method for feedback control which acts on that subset of nodes and preserves the covariance structure of the desired target state. Finally, we demonstrate our theoretical results using a stochastic Hopfield network and a global brain model. Our results are applicable to a variety of (model) networks and further our understanding of the relationship between network structure and collective dynamics for the benefit of effective control. PMID:27575147

  3. Statistical quality control through overall vibration analysis

    NASA Astrophysics Data System (ADS)

    Carnero, M. a. Carmen; González-Palma, Rafael; Almorza, David; Mayorga, Pedro; López-Escobar, Carlos

    2010-05-01

    The present study introduces the concept of statistical quality control in automotive wheel bearings manufacturing processes. Defects on products under analysis can have a direct influence on passengers' safety and comfort. At present, the use of vibration analysis on machine tools for quality control purposes is not very extensive in manufacturing facilities. Noise and vibration are common quality problems in bearings. These failure modes likely occur under certain operating conditions and do not require high vibration amplitudes but relate to certain vibration frequencies. The vibration frequencies are affected by the type of surface problems (chattering) of ball races that are generated through grinding processes. The purpose of this paper is to identify grinding process variables that affect the quality of bearings by using statistical principles in the field of machine tools. In addition, an evaluation of the quality results of the finished parts under different combinations of process variables is assessed. This paper intends to establish the foundations to predict the quality of the products through the analysis of self-induced vibrations during the contact between the grinding wheel and the parts. To achieve this goal, the overall self-induced vibration readings under different combinations of process variables are analysed using statistical tools. The analysis of data and design of experiments follows a classical approach, considering all potential interactions between variables. The analysis of data is conducted through analysis of variance (ANOVA) for data sets that meet normality and homoscedasticity criteria. This paper utilizes different statistical tools to support the conclusions, such as chi-squared, Shapiro-Wilk, symmetry, kurtosis, Cochran, Bartlett, Hartley, and Kruskal-Wallis tests. The analysis presented is the starting point to extend the use of predictive techniques (vibration analysis) for quality control. This paper demonstrates the existence

  4. Bayesian Clinical Trials in Action

    PubMed Central

    Lee, J. Jack; Chu, Caleb T.

    2012-01-01

    Although the frequentist paradigm has been the predominant approach to clinical trial design since the 1940s, it has several notable limitations. The alternative Bayesian paradigm has been greatly enhanced by advancements in computational algorithms and computer hardware. Compared to its frequentist counterpart, the Bayesian framework has several unique advantages, and its incorporation into clinical trial design is occurring more frequently. Using an extensive literature review to assess how Bayesian methods are used in clinical trials, we find them most commonly used for dose finding, efficacy monitoring, toxicity monitoring, diagnosis/decision making, and for studying pharmacokinetics/pharmacodynamics. The additional infrastructure required for implementing Bayesian methods in clinical trials may include specialized software programs to run the study design, simulation, and analysis, and Web-based applications, which are particularly useful for timely data entry and analysis. Trial success requires not only the development of proper tools but also timely and accurate execution of data entry, quality control, adaptive randomization, and Bayesian computation. The relative merit of the Bayesian and frequentist approaches continues to be the subject of debate in statistics. However, more evidence can be found showing the convergence of the two camps, at least at the practical level. Ultimately, better clinical trial methods lead to more efficient designs, lower sample sizes, more accurate conclusions, and better outcomes for patients enrolled in the trials. Bayesian methods offer attractive alternatives for better trials. More such trials should be designed and conducted to refine the approach and demonstrate its real benefit in action. PMID:22711340
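
    As a small, hedged illustration of one ingredient listed above (efficacy monitoring), the sketch below applies a Beta-Binomial model: after each cohort it computes the posterior probability that the response rate exceeds an assumed historical control rate and stops early if that probability is high. The prior, control rate, cohort sizes and stopping threshold are all invented.

```python
# Beta-Binomial efficacy monitoring sketch (illustrative thresholds only).
from scipy.stats import beta

def posterior_prob_better(responses, n, p_control=0.30, a0=1.0, b0=1.0):
    """P(response rate > p_control | data) under a Beta(a0, b0) prior."""
    return beta.sf(p_control, a0 + responses, b0 + n - responses)

cohorts = [(4, 10), (9, 20), (16, 30)]      # cumulative (responses, patients), invented
for responses, n in cohorts:
    p = posterior_prob_better(responses, n)
    decision = "stop for efficacy" if p > 0.95 else "continue"
    print(f"n={n:2d}, responses={responses:2d}: P(rate > 0.30) = {p:.3f} -> {decision}")
```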

  5. Refining calibration and predictions of a Bayesian statistical-dynamical model for long term avalanche forecasting using dendrochronological reconstructions

    NASA Astrophysics Data System (ADS)

    Eckert, Nicolas; Schläppy, Romain; Jomelli, Vincent; Naaim, Mohamed

    2013-04-01

    A crucial step in proposing relevant long-term mitigation measures in long-term avalanche forecasting is the accurate definition of high-return-period avalanches. Recently, "statistical-dynamical" approaches combining a numerical model with stochastic operators describing the variability of its inputs and outputs have emerged. Their main interest is to take into account the topographic dependency of snow avalanche runout distances, and to constrain the correlation structure between the model's variables by physical rules, so as to simulate the different marginal distributions of interest (pressure, flow depth, etc.) with reasonable realism. Bayesian methods have been shown to be well adapted to model inference, getting rid of identifiability problems thanks to prior information. An important problem which has virtually never been considered before is the validation of the predictions resulting from a statistical-dynamical approach (or from any other engineering method for computing extreme avalanches). In hydrology, independent "fossil" data such as flood deposits in caves are sometimes compared with design discharges corresponding to high return periods. Hence, the aim of this work is to implement a similar comparison between high-return-period avalanches obtained with a statistical-dynamical approach and independent validation data resulting from careful dendrogeomorphological reconstructions. To do so, an up-to-date statistical model based on the depth-averaged equations and the classical Voellmy friction law is used on a well-documented case study. First, parameter values resulting from another path are applied, and the dendrological validation sample shows that this approach fails to provide realistic predictions for the case study. This may be due to the strongly bounded behaviour of runouts in this case (the extreme of their distribution is identified as belonging to the Weibull attraction domain). Second, local calibration on the available avalanche

  6. Statistical Process Control for KSC Processing

    NASA Technical Reports Server (NTRS)

    Ford, Roger G.; Delgado, Hector; Tilley, Randy

    1996-01-01

    The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course complete with animation and video excerpts from the course when it was taught at KSC was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, due to the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished as well as an evaluation of SPC software for KSC use in the future. A final accomplishment was the orientation of the author to NASA changes, terminology, data formats, and new NASA task definitions, which will allow future consultation when the need arises.

  7. Two levels of Bayesian model averaging for optimal control of stochastic systems

    NASA Astrophysics Data System (ADS)

    Darwen, Paul J.

    2013-02-01

    Bayesian model averaging provides the best possible estimate of a model, given the data. This article uses that approach twice: once to get a distribution of plausible models of the world, and again to find a distribution of plausible control functions. The resulting ensemble gives control instructions different from simply taking the single best-fitting model and using it to find a single lowest-error control function for that single model. The only drawback is, of course, the need for more computer time: this article demonstrates that the required computer time is feasible. The test problem here is from flood control and risk management.

  8. A statistical process control case study.

    PubMed

    Ross, Thomas K

    2006-01-01

    Statistical process control (SPC) charts can be applied to a wide number of health care applications, yet widespread use has not occurred. The greatest obstacle preventing wider use is the lack of quality management training that health care workers receive. The technical nature of the SPC guarantees that without explicit instruction this technique will not come into widespread use. Reviews of health care quality management texts inform the reader that SPC charts should be used to improve delivery processes and outcomes often without discussing how they are created. Conversely, medical research frequently reports the improved outcomes achieved after analyzing SPC charts. This article is targeted between these 2 positions: it reviews the SPC technique and presents a tool and data so readers can construct SPC charts. After tackling the case, it is hoped that the readers will collect their own data and apply the same technique to improve processes in their own organization. PMID:17047496

  9. Planetary micro-rover operations on Mars using a Bayesian framework for inference and control

    NASA Astrophysics Data System (ADS)

    Post, Mark A.; Li, Junquan; Quine, Brendan M.

    2016-03-01

    With the recent progress toward the application of commercially-available hardware to small-scale space missions, it is now becoming feasible for groups of small, efficient robots based on low-power embedded hardware to perform simple tasks on other planets in the place of large-scale, heavy and expensive robots. In this paper, we describe design and programming of the Beaver micro-rover developed for Northern Light, a Canadian initiative to send a small lander and rover to Mars to study the Martian surface and subsurface. For a small, hardware-limited rover to handle an uncertain and mostly unknown environment without constant management by human operators, we use a Bayesian network of discrete random variables as an abstraction of expert knowledge about the rover and its environment, and inference operations for control. A framework for efficient construction and inference into a Bayesian network using only the C language and fixed-point mathematics on embedded hardware has been developed for the Beaver to make intelligent decisions with minimal sensor data. We study the performance of the Beaver as it probabilistically maps a simple outdoor environment with sensor models that include uncertainty. Results indicate that the Beaver and other small and simple robotic platforms can make use of a Bayesian network to make intelligent decisions in uncertain planetary environments.
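
    The sketch below is a tiny Python analogue of the kind of discrete Bayesian network inference described above: two hidden causes explain a sensor reading, and the posterior hazard probability is obtained by enumeration before a go/no-go decision. The network structure, probability tables and decision threshold are invented; the Beaver's own framework is written in C with fixed-point arithmetic.

```python
# Toy discrete Bayesian network: infer P(hazardous slope | tilt alarm) by enumeration.
from itertools import product

P_hazard = {True: 0.2, False: 0.8}                 # prior: slope is hazardous (invented)
P_soft   = {True: 0.3, False: 0.7}                 # prior: soil is soft (invented)
# Conditional probability of a tilt alarm given (hazard, soft) -- invented table
P_alarm = {(True, True): 0.95, (True, False): 0.80,
           (False, True): 0.40, (False, False): 0.05}

def posterior_hazard(alarm_observed=True):
    joint = {}
    for hazard, soft in product([True, False], repeat=2):
        p = P_hazard[hazard] * P_soft[soft]
        p *= P_alarm[(hazard, soft)] if alarm_observed else 1 - P_alarm[(hazard, soft)]
        joint[(hazard, soft)] = p
    total = sum(joint.values())
    return sum(p for (hazard, _), p in joint.items() if hazard) / total

p = posterior_hazard(alarm_observed=True)
print(f"P(hazardous slope | tilt alarm) = {p:.2f}")
print("decision:", "reroute" if p > 0.5 else "proceed")
```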

  10. Statistical process control for IMRT dosimetric verification.

    PubMed

    Breen, Stephen L; Moseley, Douglas J; Zhang, Beibei; Sharpe, Michael B

    2008-10-01

    Patient-specific measurements are typically used to validate the dosimetry of intensity-modulated radiotherapy (IMRT). To evaluate the dosimetric performance over time of our IMRT process, we have used statistical process control (SPC) concepts to analyze the measurements from 330 head and neck (H&N) treatment plans. The objectives of the present work are to: (i) Review the dosimetric measurements of a large series of consecutive head and neck treatment plans to better understand appropriate dosimetric tolerances; (ii) analyze the results with SPC to develop action levels for measured discrepancies; (iii) develop estimates for the number of measurements that are required to describe IMRT dosimetry in the clinical setting; and (iv) evaluate with SPC a new beam model in our planning system. H&N IMRT cases were planned with the PINNACLE treatment planning system versions 6.2b or 7.6c (Philips Medical Systems, Madison, WI) and treated on Varian (Palo Alto, CA) or Elekta (Crawley, UK) linacs. As part of regular quality assurance, plans were recalculated on a 20-cm-diam cylindrical phantom, and ion chamber measurements were made in high-dose volumes (the PTV with highest dose) and in low-dose volumes (spinal cord organ-at-risk, OR). Differences between the planned and measured doses were recorded as a percentage of the planned dose. Differences were stable over time. Measurements with PINNACLE3 6.2b and Varian linacs showed a mean difference of 0.6% for PTVs (n=149, range, -4.3% to 6.6%), while OR measurements showed a larger systematic discrepancy (mean 4.5%, range -4.5% to 16.3%) that was due to well-known limitations of the MLC model in the earlier version of the planning system. Measurements with PINNACLE3 7.6c and Varian linacs demonstrated a mean difference of 0.2% for PTVs (n=160, range, -3.0%, to 5.0%) and -1.0% for ORs (range -5.8% to 4.4%). The capability index (ratio of specification range to range of the data) was 1.3 for the PTV data, indicating that almost
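
    The sketch below computes the two SPC quantities the abstract leans on, individuals-chart control limits and a capability index, from a short list of plan-versus-measurement percentage differences; the differences and the ±5% specification limits are assumed for illustration only, not clinical tolerances.

```python
# Control limits and capability index from plan-vs-measurement % differences (toy data).
import numpy as np

diffs = np.array([0.6, -1.2, 2.3, 0.1, -0.8, 1.9, 0.4, -2.1, 1.1, 0.7])  # assumed values

center = diffs.mean()
sigma_hat = np.abs(np.diff(diffs)).mean() / 1.128      # moving-range estimate of sigma
ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat

spec_low, spec_high = -5.0, 5.0                        # assumed tolerance on dose difference
cap_text = (spec_high - spec_low) / np.ptp(diffs)      # "spec range / data range" as in the text
cp = (spec_high - spec_low) / (6 * sigma_hat)          # conventional process-capability index

print(f"center = {center:+.2f}%, control limits = [{lcl:+.2f}%, {ucl:+.2f}%]")
print(f"capability (spec range / data range) = {cap_text:.2f}, Cp = {cp:.2f}")
```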

  11. Statistical process control for IMRT dosimetric verification

    SciTech Connect

    Breen, Stephen L.; Moseley, Douglas J.; Zhang, Beibei; Sharpe, Michael B.

    2008-10-15

    Patient-specific measurements are typically used to validate the dosimetry of intensity-modulated radiotherapy (IMRT). To evaluate the dosimetric performance over time of our IMRT process, we have used statistical process control (SPC) concepts to analyze the measurements from 330 head and neck (H and N) treatment plans. The objectives of the present work are to: (i) Review the dosimetric measurements of a large series of consecutive head and neck treatment plans to better understand appropriate dosimetric tolerances; (ii) analyze the results with SPC to develop action levels for measured discrepancies; (iii) develop estimates for the number of measurements that are required to describe IMRT dosimetry in the clinical setting; and (iv) evaluate with SPC a new beam model in our planning system. H and N IMRT cases were planned with the PINNACLE3 treatment planning system versions 6.2b or 7.6c (Philips Medical Systems, Madison, WI) and treated on Varian (Palo Alto, CA) or Elekta (Crawley, UK) linacs. As part of regular quality assurance, plans were recalculated on a 20-cm-diam cylindrical phantom, and ion chamber measurements were made in high-dose volumes (the PTV with highest dose) and in low-dose volumes (spinal cord organ-at-risk, OR). Differences between the planned and measured doses were recorded as a percentage of the planned dose. Differences were stable over time. Measurements with PINNACLE3 6.2b and Varian linacs showed a mean difference of 0.6% for PTVs (n=149, range, -4.3% to 6.6%), while OR measurements showed a larger systematic discrepancy (mean 4.5%, range -4.5% to 16.3%) that was due to well-known limitations of the MLC model in the earlier version of the planning system. Measurements with PINNACLE3 7.6c and Varian linacs demonstrated a mean difference of 0.2% for PTVs (n=160, range, -3.0%, to 5.0%) and -1.0% for ORs (range -5.8% to 4.4%). The capability index (ratio of specification range to range of the data) was 1.3 for the PTV

  12. Statistical process control for IMRT dosimetric verification.

    PubMed

    Breen, Stephen L; Moseley, Douglas J; Zhang, Beibei; Sharpe, Michael B

    2008-10-01

    Patient-specific measurements are typically used to validate the dosimetry of intensity-modulated radiotherapy (IMRT). To evaluate the dosimetric performance over time of our IMRT process, we have used statistical process control (SPC) concepts to analyze the measurements from 330 head and neck (H&N) treatment plans. The objectives of the present work are to: (i) Review the dosimetric measurements of a large series of consecutive head and neck treatment plans to better understand appropriate dosimetric tolerances; (ii) analyze the results with SPC to develop action levels for measured discrepancies; (iii) develop estimates for the number of measurements that are required to describe IMRT dosimetry in the clinical setting; and (iv) evaluate with SPC a new beam model in our planning system. H&N IMRT cases were planned with the PINNACLE treatment planning system versions 6.2b or 7.6c (Philips Medical Systems, Madison, WI) and treated on Varian (Palo Alto, CA) or Elekta (Crawley, UK) linacs. As part of regular quality assurance, plans were recalculated on a 20-cm-diam cylindrical phantom, and ion chamber measurements were made in high-dose volumes (the PTV with highest dose) and in low-dose volumes (spinal cord organ-at-risk, OR). Differences between the planned and measured doses were recorded as a percentage of the planned dose. Differences were stable over time. Measurements with PINNACLE3 6.2b and Varian linacs showed a mean difference of 0.6% for PTVs (n=149, range, -4.3% to 6.6%), while OR measurements showed a larger systematic discrepancy (mean 4.5%, range -4.5% to 16.3%) that was due to well-known limitations of the MLC model in the earlier version of the planning system. Measurements with PINNACLE3 7.6c and Varian linacs demonstrated a mean difference of 0.2% for PTVs (n=160, range, -3.0%, to 5.0%) and -1.0% for ORs (range -5.8% to 4.4%). The capability index (ratio of specification range to range of the data) was 1.3 for the PTV data, indicating that almost

  13. Measuring the Mass of a Galaxy: An evaluation of the performance of Bayesian mass estimates using statistical simulation

    NASA Astrophysics Data System (ADS)

    Eadie, Gwendolyn Marie

    This research uses a Bayesian approach to study the biases that may occur when kinematic data is used to estimate the mass of a galaxy. Data is simulated from the Hernquist (1990) distribution functions (DFs) for velocity dispersions of the isotropic, constant anisotropic, and anisotropic Osipkov (1979) and Merritt (1985) type, and then analysed using the isotropic Hernquist model. Biases are explored when i) the model and data come from the same DF, ii) the model and data come from the same DF but tangential velocities are unknown, iii) the model and data come from different DFs, and iv) the model and data come from different DFs and the tangential velocities are unknown. Mock observations are also created from the Gauthier (2006) simulations and analysed with the isotropic Hernquist model. No bias was found in situation (i), a slight positive bias was found in (ii), a negative bias was found in (iii), and a large positive bias was found in (iv). The mass estimate of the Gauthier system when tangential velocities were unknown was nearly correct, but the mass profile was not described well by the isotropic Hernquist model. When the Gauthier data was analysed with the tangential velocities, the mass of the system was overestimated. The code created for the research runs three parallel Markov Chains for each data set, uses the Gelman-Rubin statistic to assess convergence, and combines the converged chains into a single sample of the posterior distribution for each data set. The code also includes two ways to deal with nuisance parameters. One is to marginalize over the nuisance parameter at every step in the chain, and the other is to sample the nuisance parameters using a hybrid-Gibbs sampler. When tangential velocities, v(t), are unobserved in the analyses above, they are sampled as nuisance parameters in the Markov Chain. The v(t) estimates from the Markov chains did a poor job of estimating the true tangential velocities. However, the posterior samples of v
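
    The sketch below shows the Gelman-Rubin convergence check mentioned above, computed for several parallel chains of a single parameter; the chains are synthetic normal draws rather than output of the galaxy-mass analysis.

```python
# Gelman-Rubin potential scale reduction factor for m parallel chains of one parameter.
import numpy as np

rng = np.random.default_rng(3)
m, n = 3, 2000                                        # chains, draws per chain
chains = rng.normal(12.0, 0.5, size=(m, n))           # synthetic samples (e.g. a log-mass)

chain_means = chains.mean(axis=1)
chain_vars = chains.var(axis=1, ddof=1)
W = chain_vars.mean()                                 # within-chain variance
B = n * chain_means.var(ddof=1)                       # between-chain variance
var_hat = (n - 1) / n * W + B / n                     # pooled posterior-variance estimate
R_hat = np.sqrt(var_hat / W)
print(f"Gelman-Rubin R-hat = {R_hat:.3f} (values near 1 indicate convergence)")
```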

  14. High-resolution chronology for the Mesoamerican urban center of Teotihuacan derived from Bayesian statistics of radiocarbon and archaeological data

    NASA Astrophysics Data System (ADS)

    Beramendi-Orosco, Laura E.; Gonzalez-Hernandez, Galia; Urrutia-Fucugauchi, Jaime; Manzanilla, Linda R.; Soler-Arechalde, Ana M.; Goguitchaishvili, Avto; Jarboe, Nick

    2009-03-01

    A high-resolution 14C chronology for the Teopancazco archaeological site in the Teotihuacan urban center of Mesoamerica was generated by Bayesian analysis of 33 radiocarbon dates and detailed archaeological information related to occupation stratigraphy, pottery and archaeomagnetic dates. The calibrated intervals obtained using the Bayesian model are up to ca. 70% shorter than those obtained with individual calibrations. For some samples, this is a consequence of plateaus in the part of the calibration curve covered by the sample dates (2500 to 1450 14C yr BP). Effects of outliers are explored by comparing the results from a Bayesian model that incorporates radiocarbon data for two outlier samples with the same model excluding them. The effect of outliers was more significant than expected. Inclusion of radiocarbon dates from two altered contexts, 500 14C yr earlier than those for the first occupational phase, results in ages calculated by the model earlier than the archaeological records. The Bayesian chronology excluding these outliers separates the first two Teopancazco occupational phases and suggests that ending of the Xolalpan phase was around cal AD 550, 100 yr earlier than previously estimated and in accordance with previously reported archaeomagnetic dates from lime plasters for the same site.

  15. Applying Statistical Process Quality Control Methodology to Educational Settings.

    ERIC Educational Resources Information Center

    Blumberg, Carol Joyce

    A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (Range), X (individual observations), MR (moving…

  16. Bayesian inversion of marine controlled source electromagnetic data offshore Vancouver Island, Canada

    NASA Astrophysics Data System (ADS)

    Gehrmann, Romina A. S.; Schwalenberg, Katrin; Riedel, Michael; Spence, George D.; Spieß, Volkhard; Dosso, Stan E.

    2016-01-01

    This paper applies nonlinear Bayesian inversion to marine controlled source electromagnetic (CSEM) data collected near two sites of the Integrated Ocean Drilling Program (IODP) Expedition 311 on the northern Cascadia Margin to investigate subseafloor resistivity structure related to gas hydrate deposits and cold vents. The Cascadia margin, off the west coast of Vancouver Island, Canada, has a large accretionary prism where sediments are under pressure due to convergent plate boundary tectonics. Gas hydrate deposits and cold vent structures have previously been investigated by various geophysical methods and seabed drilling. Here, we invert time-domain CSEM data collected at Sites U1328 and U1329 of IODP Expedition 311 using Bayesian methods to derive subsurface resistivity model parameters and uncertainties. The Bayesian information criterion is applied to determine the amount of structure (number of layers in a depth-dependent model) that can be resolved by the data. The parameter space is sampled with the Metropolis-Hastings algorithm in principal-component space, utilizing parallel tempering to ensure wider and efficient sampling and convergence. Nonlinear inversion allows analysis of uncertain acquisition parameters such as time delays between receiver and transmitter clocks as well as input electrical current amplitude. Marginalizing over these instrument parameters in the inversion accounts for their contribution to the geophysical model uncertainties. One-dimensional inversion of time-domain CSEM data collected at measurement sites along a survey line allows interpretation of the subsurface resistivity structure. The data sets can be generally explained by models with 1 to 3 layers. Inversion results at U1329, at the landward edge of the gas hydrate stability zone, indicate a sediment unconformity as well as potential cold vents which were previously unknown. The resistivities generally increase upslope due to sediment erosion along the slope. Inversion

  17. Analysis of Feature Intervisibility and Cumulative Visibility Using GIS, Bayesian and Spatial Statistics: A Study from the Mandara Mountains, Northern Cameroon

    PubMed Central

    Wright, David K.; MacEachern, Scott; Lee, Jaeyong

    2014-01-01

    The locations of diy-geδ-bay (DGB) sites in the Mandara Mountains, northern Cameroon are hypothesized to occur as a function of their ability to see and be seen from points on the surrounding landscape. A series of geostatistical, two-way and Bayesian logistic regression analyses were performed to test two hypotheses related to the intervisibility of the sites to one another and their visual prominence on the landscape. We determine that the intervisibility of the sites to one another is highly statistically significant when compared to 10 stratified-random permutations of DGB sites. Bayesian logistic regression additionally demonstrates that the visibility of the sites to points on the surrounding landscape is statistically significant. The location of sites appears to have also been selected on the basis of lower slope than random permutations of sites. Using statistical measures, many of which are not commonly employed in archaeological research, to evaluate aspects of visibility on the landscape, we conclude that the placement of DGB sites improved their conspicuousness for enhanced ritual, social cooperation and/or competition purposes. PMID:25383883

  18. Bayesian learning

    NASA Technical Reports Server (NTRS)

    Denning, Peter J.

    1989-01-01

    In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.

  19. Artificial Intelligence Approach to Support Statistical Quality Control Teaching

    ERIC Educational Resources Information Center

    Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno

    2006-01-01

    Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…

  20. Control Statistics Process Data Base V4

    1998-05-07

    The check standard database program, CSP_CB, is a menu-driven program that can acquire measurement data for check standards having a parameter dependence (such as frequency) or no parameter dependence (for example, mass measurements). The program may be run stand-alone or loaded as a subprogram to a Basic program already in memory. The software was designed to require little additional work on the part of the user. To facilitate this design goal, the program is entirely menu-driven. In addition, the user does have control of file names and parameters within a definition file which sets up the basic scheme of file names.

  1. A Simple 2D Non-Parametric Resampling Statistical Approach to Assess Confidence in Species Identification in DNA Barcoding—An Alternative to Likelihood and Bayesian Approaches

    PubMed Central

    Jin, Qian; He, Li-Jun; Zhang, Ai-Bing

    2012-01-01

    In the recent worldwide campaign for the global biodiversity inventory via DNA barcoding, a simple and easily used measure of confidence for assigning sequences to species in DNA barcoding has not been established so far, although the likelihood ratio test and the Bayesian approach had been proposed to address this issue from a statistical point of view. The TDR (Two Dimensional non-parametric Resampling) measure newly proposed in this study offers users a simple and easy approach to evaluate the confidence of species membership in DNA barcoding projects. We assessed the validity and robustness of the TDR approach using datasets simulated under coalescent models, and an empirical dataset, and found that TDR measure is very robust in assessing species membership of DNA barcoding. In contrast to the likelihood ratio test and Bayesian approach, the TDR method stands out due to simplicity in both concepts and calculations, with little in the way of restrictive population genetic assumptions. To implement this approach we have developed a computer program package (TDR1.0beta) freely available from ftp://202.204.209.200/education/video/TDR1.0beta.rar. PMID:23239988

  2. IZI: INFERRING THE GAS PHASE METALLICITY (Z) AND IONIZATION PARAMETER (q) OF IONIZED NEBULAE USING BAYESIAN STATISTICS

    SciTech Connect

    Blanc, Guillermo A.; Kewley, Lisa; Vogt, Frédéric P. A.; Dopita, Michael A.

    2015-01-10

    We present a new method for inferring the metallicity (Z) and ionization parameter (q) of H II regions and star-forming galaxies using strong nebular emission lines (SELs). We use Bayesian inference to derive the joint and marginalized posterior probability density functions for Z and q given a set of observed line fluxes and an input photoionization model. Our approach allows the use of arbitrary sets of SELs and the inclusion of flux upper limits. The method provides a self-consistent way of determining the physical conditions of ionized nebulae that is not tied to the arbitrary choice of a particular SEL diagnostic and uses all the available information. Unlike theoretically calibrated SEL diagnostics, the method is flexible and not tied to a particular photoionization model. We describe our algorithm, validate it against other methods, and present a tool that implements it called IZI. Using a sample of nearby extragalactic H II regions, we assess the performance of commonly used SEL abundance diagnostics. We also use a sample of 22 local H II regions having both direct and recombination line (RL) oxygen abundance measurements in the literature to study discrepancies in the abundance scale between different methods. We find that oxygen abundances derived through Bayesian inference using currently available photoionization models in the literature can be in good (∼30%) agreement with RL abundances, although some models perform significantly better than others. We also confirm that abundances measured using the direct method are typically ∼0.2 dex lower than both RL and photoionization-model-based abundances.
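
    IZI itself is a published tool built on detailed photoionization model grids; as a rough sketch of the underlying calculation only (with a fabricated toy grid standing in for a real photoionization model), one can tabulate predicted line ratios on a (Z, q) grid, evaluate a Gaussian likelihood for the observed fluxes, and normalize to obtain the joint and marginalized posteriors.

      # Sketch of grid-based Bayesian inference for (Z, q) from strong-line ratios.
      # The "model grid" below is a fabricated toy; IZI uses real photoionization models.
      import numpy as np

      logZ = np.linspace(-1.0, 0.5, 60)        # log(Z/Zsun) grid
      logq = np.linspace(6.5, 8.5, 60)         # log ionization parameter grid
      ZZ, QQ = np.meshgrid(logZ, logq, indexing="ij")

      def model_ratios(z, q):
          """Toy emission-line ratios as smooth functions of (Z, q); placeholder only."""
          oiii_hb = 10 ** (0.3 - 1.2 * z + 0.4 * (q - 7.5))
          nii_ha = 10 ** (-0.6 + 1.0 * z)
          return np.stack([oiii_hb, nii_ha])

      obs = np.array([1.8, 0.25])              # observed line ratios
      err = np.array([0.2, 0.04])              # 1-sigma uncertainties

      pred = model_ratios(ZZ, QQ)              # shape (2, 60, 60)
      chi2 = (((pred - obs[:, None, None]) / err[:, None, None]) ** 2).sum(axis=0)
      post = np.exp(-0.5 * (chi2 - chi2.min()))
      post /= post.sum()                       # joint posterior P(Z, q | fluxes), flat prior

      pZ = post.sum(axis=1)                    # marginalized over q
      pq = post.sum(axis=0)                    # marginalized over Z
      print("posterior mean log(Z/Zsun):", (logZ * pZ).sum())
      print("posterior mean log q      :", (logq * pq).sum())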

  3. Statistical approach to linewidth control in a logic fab

    NASA Astrophysics Data System (ADS)

    Pitter, Michael; Doleschel, Bernhard; Eibl, Ludwig; Steinkirchner, Erwin; Grassmann, Andreas

    1999-04-01

    We designed an adaptive line width controller specially tailored to the needs of a highly diversified logic fab. Simulations of different controller types fed with historic CD data show advantages of an SPC-based controller over a run-by-run controller. This result confirms the SPC assumption that as long as a process is in statistical control, changing the process parameters will only increase the variability of the output.
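
    The claim in the last sentence echoes Deming's funnel argument: adjusting an in-control process after every run only adds variance. A minimal simulation of that comparison (with invented process parameters and control logic, not the fab's actual controllers) is sketched below.

      # Compare a run-by-run controller (adjust after every lot) with an SPC-gated
      # controller (adjust only on an out-of-control signal). Toy process, illustrative.
      import numpy as np

      rng = np.random.default_rng(1)
      target, sigma, n_runs = 100.0, 1.0, 5000

      def simulate(gated, k=3.0):
          offset, out = 0.0, []
          for _ in range(n_runs):
              cd = target + offset + rng.normal(0.0, sigma)   # measured CD for this lot
              out.append(cd)
              dev = cd - target
              if gated:
                  if abs(dev) > k * sigma:     # react only to an SPC signal
                      offset -= dev
              else:
                  offset -= dev                # run-by-run: full correction every lot
          return np.std(out)

      print("run-by-run std:", round(simulate(gated=False), 3))
      print("SPC-gated  std:", round(simulate(gated=True), 3))
      # For a process already in statistical control, the run-by-run correction
      # inflates the output standard deviation by roughly a factor of sqrt(2).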

  4. Estimability and simple dynamical analyses of range (range-rate range-difference) observations to artificial satellites. [laser range observations to LAGEOS using non-Bayesian statistics

    NASA Technical Reports Server (NTRS)

    Vangelder, B. H. W.

    1978-01-01

    Non-Bayesian statistics were used in simulation studies centered around laser range observations to LAGEOS. The capabilities of satellite laser ranging, especially in connection with relative station positioning, are evaluated. The satellite measurement system under investigation may fall short in precise determinations of the earth's orientation (precession and nutation) and the earth's rotation, as opposed to systems such as very long baseline interferometry (VLBI) and lunar laser ranging (LLR). Relative station positioning, determination of (differential) polar motion, positioning of stations with respect to the earth's center of mass, and determination of the earth's gravity field should be easily realized by satellite laser ranging (SLR). The last two features should be considered as best (or solely) determinable by SLR, in contrast to VLBI and LLR.

  5. A Bayesian hierarchical approach for combining case-control and prospective studies.

    PubMed

    Müller, P; Parmigiani, G; Schildkraut, J; Tardella, L

    1999-09-01

    Motivated by the absolute risk predictions required in medical decision making and patient counseling, we propose an approach for the combined analysis of case-control and prospective studies of disease risk factors. The approach is hierarchical to account for parameter heterogeneity among studies and among sampling units of the same study. It is based on modeling the retrospective distribution of the covariates given the disease outcome, a strategy that greatly simplifies both the combination of prospective and retrospective studies and the computation of Bayesian predictions in the hierarchical case-control context. Retrospective modeling differentiates our approach from most current strategies for inference on risk factors, which are based on the assumption of a specific prospective model. To ensure modeling flexibility, we propose using a mixture model for the retrospective distributions of the covariates. This leads to a general nonlinear regression family for the implied prospective likelihood. After introducing and motivating our proposal, we present simple results that highlight its relationship with existing approaches, develop Markov chain Monte Carlo methods for inference and prediction, and present an illustration using ovarian cancer data. PMID:11315018

  6. An examination of Bayesian statistical approaches to modeling change in cognitive decline in an Alzheimer’s disease population

    PubMed Central

    Bartolucci, Al; Bae, Sejong; Singh, Karan; Griffith, H. Randall

    2009-01-01

    The mini mental state examination (MMSE) is a common tool for measuring cognitive decline in Alzheimer's Disease (AD) subjects. Subjects are usually observed for a specified period of time or until death to determine the trajectory of the decline, which for the most part appears to be linear. However, it may be noted that the decline may not be modeled by a single linear model over a specified period of time. There may be a point, called a change point, where the rate or gradient of the decline may change depending on the length of time of observation. A Bayesian approach is used to model the trajectory and determine an appropriate posterior estimate of the change point as well as the predicted model of decline before and after the change point. Estimates of the appropriate parameters as well as their posterior credible regions or regions of interest are established. Coherent prior-to-posterior analysis using mainly non-informative priors for the parameters of interest is provided. This approach is applied to an existing AD database. PMID:20161460
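
    As a hedged sketch of this kind of change-point analysis (not the authors' model, priors, or data), one can place a discrete uniform prior on the change-point location of a piecewise-linear decline and compare segmented least-squares fits across candidate change points; the regression coefficients are profiled out here rather than integrated, which a full Bayesian treatment would do.

      # Sketch: posterior over a single change point in a piecewise-linear decline,
      # discrete uniform prior on the change point, Gaussian errors with known sigma.
      # Illustrative only; coefficients are profiled (plug-in MLE), not marginalized
      # as they would be in a full Bayesian hierarchical model.
      import numpy as np

      rng = np.random.default_rng(0)
      t = np.arange(0, 10, 0.5)                   # years of follow-up (toy data)
      true_cp, sigma = 4.0, 1.0
      mmse = 28 - 0.5 * t - 2.5 * np.clip(t - true_cp, 0, None) + rng.normal(0, sigma, t.size)

      def segmented_design(t, cp):
          # columns: intercept, slope before cp, extra slope after cp
          return np.column_stack([np.ones_like(t), t, np.clip(t - cp, 0, None)])

      candidates = t[2:-2]                        # keep change points away from the edges
      loglik = []
      for cp in candidates:
          X = segmented_design(t, cp)
          beta, *_ = np.linalg.lstsq(X, mmse, rcond=None)
          resid = mmse - X @ beta
          loglik.append(-0.5 * np.sum(resid ** 2) / sigma ** 2)

      loglik = np.array(loglik)
      post = np.exp(loglik - loglik.max())
      post /= post.sum()                          # posterior over the change point
      print("posterior mode of the change point:", candidates[np.argmax(post)])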

  7. Towards Validation of an Adaptive Flight Control Simulation Using Statistical Emulation

    NASA Technical Reports Server (NTRS)

    He, Yuning; Lee, Herbert K. H.; Davies, Misty D.

    2012-01-01

    Traditional validation of flight control systems is based primarily upon empirical testing. Empirical testing is sufficient for simple systems in which (a) the behavior is approximately linear and (b) humans are in-the-loop and responsible for off-nominal flight regimes. A different possible concept of operation is to use adaptive flight control systems with online learning neural networks (OLNNs) in combination with a human pilot for off-nominal flight behavior (such as when a plane has been damaged). Validating these systems is difficult because the controller is changing during the flight in a nonlinear way, and because the pilot and the control system have the potential to co-adapt in adverse ways; traditional empirical methods are unlikely to provide any guarantees in this case. Additionally, the time it takes to find unsafe regions within the flight envelope using empirical testing means that the time between adaptive controller design iterations is large. This paper describes a new concept for validating adaptive control systems using methods based on Bayesian statistics. This validation framework allows the analyst to build nonlinear models with modal behavior, and to have an uncertainty estimate for the difference between the behaviors of the model and the system under test.

  8. Using Paper Helicopters to Teach Statistical Process Control

    ERIC Educational Resources Information Center

    Johnson, Danny J.

    2011-01-01

    This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…

  9. Manufacturing Squares: An Integrative Statistical Process Control Exercise

    ERIC Educational Resources Information Center

    Coy, Steven P.

    2016-01-01

    In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…

  10. Statistical Design Model (SDM) of satellite thermal control subsystem

    NASA Astrophysics Data System (ADS)

    Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi

    2016-07-01

    Thermal control is the satellite subsystem whose main task is keeping the satellite components within their survival and operating temperature ranges. The capability of the thermal control subsystem plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is part of satellite design. On the other hand, owing to the lack of information published by companies and designers, the subsystem still lacks a specific design process, even though it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem using the SDM design method, which analyses statistical data with a particular procedure. To implement the SDM method a complete database is required; we therefore first collect spacecraft data and create a database, then extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced and the hardware used in this subsystem and its variants is surveyed. Next, different statistical models are presented and briefly compared. Finally, a particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified through a case study: comparison between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results shows the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware

  11. Reinforcement learning for adaptive threshold control of restorative brain-computer interfaces: a Bayesian simulation.

    PubMed

    Bauer, Robert; Gharabaghi, Alireza

    2015-01-01

    Restorative brain-computer interfaces (BCI) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show a large variability, or even inability, of brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of restorative BCIs usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information-theory, we provided an explanation for the achieved benefits of adaptive threshold setting. PMID:25729347

  14. Statistical process control in Deep Space Network operation

    NASA Technical Reports Server (NTRS)

    Hodder, J. A.

    2002-01-01

    This report describes how the Deep Space Mission System (DSMS) Operations Program Office at the Jet Propulsion Laboratory (JPL) uses Statistical Process Control (SPC) to monitor performance and evaluate initiatives for improving processes on the National Aeronautics and Space Administration's (NASA) Deep Space Network (DSN).

  15. Statistical Process Control in the Practice of Program Evaluation.

    ERIC Educational Resources Information Center

    Posavac, Emil J.

    1995-01-01

    A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)

  16. Statistical Process Control. Impact and Opportunities for Ohio.

    ERIC Educational Resources Information Center

    Brown, Harold H.

    The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…

  17. Statistical Process Control. A Summary. FEU/PICKUP Project Report.

    ERIC Educational Resources Information Center

    Owen, M.; Clark, I.

    A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…

  18. Tool compensation using statistical process control on complex milling operations

    SciTech Connect

    Reilly, J.M.

    1994-03-01

    In today's competitive manufacturing environment, many companies increasingly rely on numerical control (NC) mills to produce products at a reasonable cost. Typically, this is done by producing as many features as possible at each machining operation to minimize the total number of shop hours invested per part. Consequently, the number of cutting tools involved in one operation can become quite large since NC mills have the capacity to use in excess of 100 cutting tools. As the number of cutting tools increases, the difficulty of applying optimum tool compensation grows exponentially, quickly overwhelming machine operators and engineers. A systematic method of managing tool compensation is required. The name statistical process control (SPC) suggests a technique in which statistics are used to stabilize and control a machining operation. Feedback and control theory, the study of the stabilization of electronic and mechanical systems, states that control can be established by way of a feedback network. If these concepts were combined, SPC would stabilize and control manufacturing operations through the incorporation of statistically processed feedback. In its simplest application, SPC has been used as a tool to analyze inspection data. In its most mature application, SPC can be the link that applies process feedback. The approach involves: (1) identifying the significant process variables adjusted by the operator; (2) developing mathematical relationships that convert strategic part measurements into variable adjustments; and (3) implementing SPC charts that record required adjustment to each variable.

  19. Practical Bayesian tomography

    NASA Astrophysics Data System (ADS)

    Granade, Christopher; Combes, Joshua; Cory, D. G.

    2016-03-01

    In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we address all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby (2012 Phys. Rev. A 85 052120) and by Ferrie (2014 New J. Phys. 16 093035), to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first priors on quantum states and channels that allow for including useful experimental insight. Finally, we develop a method that allows tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.

  20. A Statistical Project Control Tool for Engineering Managers

    NASA Technical Reports Server (NTRS)

    Bauch, Garland T.

    2001-01-01

    This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer). The literature review also pointed to project success factors, traditional project control tools, and performance measures, which are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failures are increasing; existing methods are limited and more systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs produced with the SPCT method, plotting the results of 3 successful projects and 3 failed projects, are reviewed, with success and failure being defined by the owner.

  1. Statistical inference in behavior analysis: Experimental control is better

    PubMed Central

    Perone, Michael

    1999-01-01

    Statistical inference promises automatic, objective, reliable assessments of data, independent of the skills or biases of the investigator, whereas the single-subject methods favored by behavior analysts often are said to rely too much on the investigator's subjective impressions, particularly in the visual analysis of data. In fact, conventional statistical methods are difficult to apply correctly, even by experts, and the underlying logic of null-hypothesis testing has drawn criticism since its inception. By comparison, single-subject methods foster direct, continuous interaction between investigator and subject and development of strong forms of experimental control that obviate the need for statistical inference. Treatment effects are demonstrated in experimental designs that incorporate replication within and between subjects, and the visual analysis of data is adequate when integrated into such designs. Thus, single-subject methods are ideal for shaping—and maintaining—the kind of experimental practices that will ensure the continued success of behavior analysis. PMID:22478328

  2. Statistical physics of human beings in games: Controlled experiments

    NASA Astrophysics Data System (ADS)

    Liang, Yuan; Huang, Ji-Ping

    2014-07-01

    It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.

  3. Predicting Time Series Outputs and Time-to-Failure for an Aircraft Controller Using Bayesian Modeling

    NASA Technical Reports Server (NTRS)

    He, Yuning

    2015-01-01

    Safety of unmanned aerial systems (UAS) is paramount, but the large number of dynamically changing controller parameters makes it hard to determine if the system is currently stable, and the time before loss of control if not. We propose a hierarchical statistical model using Treed Gaussian Processes to predict (i) whether a flight will be stable (success) or become unstable (failure), (ii) the time-to-failure if unstable, and (iii) time series outputs for flight variables. We first classify the current flight input into success or failure types, and then use separate models for each class to predict the time-to-failure and time series outputs. As different inputs may cause failures at different times, we have to model variable length output curves. We use a basis representation for curves and learn the mappings from input to basis coefficients. We demonstrate the effectiveness of our prediction methods on a NASA neuro-adaptive flight control system.

  4. CRN5EXP: Expert system for statistical quality control

    NASA Technical Reports Server (NTRS)

    Hentea, Mariana

    1991-01-01

    The purpose of the Expert System CRN5EXP is to assist in checking the quality of the coils at two very important mills: Hot Rolling and Cold Rolling in a steel plant. The system interprets the statistical quality control charts, and diagnoses and predicts the quality of the steel. Measurements of process control variables are recorded in a database, and sample statistics such as the mean and the range are computed and plotted on a control chart. The chart is analyzed through patterns using the C Language Integrated Production System (CLIPS) and a forward chaining technique to reach a conclusion about the causes of defects and to take management measures for the improvement of the quality control techniques. The Expert System combines the certainty factors associated with the process control variables to predict the quality of the steel. The paper presents the approach to extract data from the database, the reason to combine certainty factors, the architecture, and the use of the Expert System. However, the interpretation of control chart patterns requires the human expert's knowledge, which lends itself to expert system rules.

  5. Utilizing effective statistical process control limits for critical dimension metrology

    NASA Astrophysics Data System (ADS)

    Buser, Joel T.

    2002-12-01

    To accurately control critical dimension (CD) metrology in a standard real-time solution across a multi-site operation there is a need to collect measure-to-measure and day-to-day variation across all sites. Each individual site's needs, technologies, and resources can affect the final solution. A preferred statistical process control (SPC) solution for testing measure-to-measure and day-to-day variation is the traditional Mean and Range chart. However, replicating the full measurement process needed for the Mean and Range chart in real-time can strain resources. To solve this problem, an initially proposed measurement methodology was to isolate a point of interest, measure the CD feature n number of times, and continue to the next feature; however, the interdependencies in measure-to-measure variation caused by this methodology resulted in exceedingly narrow control limits. This paper explains how traditional solutions to narrow control limits are statistically problematic and explores the approach of computing control limits for the Mean chart utilizing the moving range of sample means to estimate sigma instead of the traditional range method. Tool monitoring data from multiple CD metrology tools are reported and compared against control limits calculated by the traditional approach, engineering limits, and the suggested approach. The data indicate that the suggested approach is the most accurate of the three solutions.
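
    A hedged sketch of the computation the paper recommends: estimate sigma for the Mean chart from the moving range of successive subgroup means (dividing by d2 = 1.128 for a moving range of two) rather than from the within-subgroup range, so the limits reflect day-to-day as well as measure-to-measure variation. The data below are simulated, not tool-monitoring data.

      # Xbar-chart limits computed from the moving range of successive subgroup means,
      # as an alternative to the classical within-subgroup range (Rbar/d2) estimate.
      import numpy as np

      def xbar_limits_from_moving_range(subgroups):
          """subgroups: 2D array, one row per measurement subgroup."""
          means = subgroups.mean(axis=1)
          mr = np.abs(np.diff(means))          # moving range of the subgroup means
          sigma_means = mr.mean() / 1.128      # d2 for a moving range of size 2
          center = means.mean()
          return center - 3 * sigma_means, center, center + 3 * sigma_means

      # toy data: 25 daily subgroups of 5 CD measurements (nm) with a day-to-day shift
      rng = np.random.default_rng(7)
      data = rng.normal(180.0, 1.5, size=(25, 5)) + rng.normal(0.0, 1.0, size=(25, 1))
      lcl, cl, ucl = xbar_limits_from_moving_range(data)
      print(f"LCL={lcl:.2f}  CL={cl:.2f}  UCL={ucl:.2f}")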

  6. Bayesian methods in reliability

    NASA Astrophysics Data System (ADS)

    Sander, P.; Badoux, R.

    1991-11-01

    The present proceedings from a course on Bayesian methods in reliability encompasses Bayesian statistical methods and their computational implementation, models for analyzing censored data from nonrepairable systems, the traits of repairable systems and growth models, the use of expert judgment, and a review of the problem of forecasting software reliability. Specific issues addressed include the use of Bayesian methods to estimate the leak rate of a gas pipeline, approximate analyses under great prior uncertainty, reliability estimation techniques, and a nonhomogeneous Poisson process. Also addressed are the calibration sets and seed variables of expert judgment systems for risk assessment, experimental illustrations of the use of expert judgment for reliability testing, and analyses of the predictive quality of software-reliability growth models such as the Weibull order statistics.

  7. Statistical process control using optimized neural networks: a case study.

    PubMed

    Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid

    2014-09-01

    The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart demonstrates that the process has altered by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two aspects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network and radial basis function, are investigated. Based on an experimental study, the best classifier is chosen in order to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced based on the cuckoo optimization algorithm (COA) to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. PMID:24210290

  8. Statistical learning and adaptive decision-making underlie human response time variability in inhibitory control

    PubMed Central

    Ma, Ning; Yu, Angela J.

    2015-01-01

    Response time (RT) is an oft-reported behavioral measure in psychological and neurocognitive experiments, but the high level of observed trial-to-trial variability in this measure has often limited its usefulness. Here, we combine computational modeling and psychophysics to examine the hypothesis that fluctuations in this noisy measure reflect dynamic computations in human statistical learning and corresponding cognitive adjustments. We present data from the stop-signal task (SST), in which subjects respond to a go stimulus on each trial, unless instructed not to by a subsequent, infrequently presented stop signal. We model across-trial learning of stop signal frequency, P(stop), and stop-signal onset time, SSD (stop-signal delay), with a Bayesian hidden Markov model, and within-trial decision-making with an optimal stochastic control model. The combined model predicts that RT should increase with both expected P(stop) and SSD. The human behavioral data (n = 20) bear out this prediction, showing P(stop) and SSD both to be significant, independent predictors of RT, with P(stop) being a more prominent predictor in 75% of the subjects, and SSD being more prominent in the remaining 25%. The results demonstrate that humans indeed readily internalize environmental statistics and adjust their cognitive/behavioral strategy accordingly, and that subtle patterns in RT variability can serve as a valuable tool for validating models of statistical learning and decision-making. More broadly, the modeling tools presented in this work can be generalized to a large body of behavioral paradigms, in order to extract insights about cognitive and neural processing from apparently quite noisy behavioral measures. We also discuss how this behaviorally validated model can then be used to conduct model-based analysis of neural data, in order to help identify specific brain areas for representing and encoding key computational quantities in learning and decision-making. PMID:26321966
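
    The across-trial learning model in the paper is a Bayesian hidden Markov model; a much simpler stand-in that captures the flavour, with parameters that are invented here rather than taken from the study, is a Beta-Bernoulli estimate of P(stop) with exponential forgetting toward the prior.

      # Simplified trial-by-trial estimate of P(stop) with exponential forgetting.
      # Stand-in for the paper's Bayesian hidden Markov model; parameters are invented.
      import numpy as np

      def track_p_stop(is_stop_trial, alpha0=1.0, beta0=1.0, decay=0.9):
          """Return the predicted P(stop) before each trial."""
          a, b = alpha0, beta0
          preds = []
          for stop in is_stop_trial:
              preds.append(a / (a + b))              # prediction before seeing the trial
              # forget toward the prior, then update with the new observation
              a = decay * a + (1 - decay) * alpha0 + stop
              b = decay * b + (1 - decay) * beta0 + (1 - stop)
          return np.array(preds)

      rng = np.random.default_rng(3)
      trials = (rng.random(200) < 0.25).astype(int)  # 25% stop trials
      p_hat = track_p_stop(trials)
      print("mean predicted P(stop):", p_hat.mean().round(3))
      # The paper's model predicts that RT increases with this expected P(stop).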

  9. Statistics

    Cancer.gov

    Links to sources of cancer-related statistics, including the Surveillance, Epidemiology and End Results (SEER) Program, SEER-Medicare datasets, cancer survivor prevalence data, and the Cancer Trends Progress Report.

  10. Bayesian estimation of prevalence of paratuberculosis in dairy herds enrolled in a voluntary Johne's Disease Control Programme in Ireland.

    PubMed

    McAloon, Conor G; Doherty, Michael L; Whyte, Paul; O'Grady, Luke; More, Simon J; Messam, Locksley L McV; Good, Margaret; Mullowney, Peter; Strain, Sam; Green, Martin J

    2016-06-01

    Bovine paratuberculosis is a disease characterised by chronic granulomatous enteritis which manifests clinically as a protein-losing enteropathy causing diarrhoea, hypoproteinaemia, emaciation and, eventually, death. Some evidence exists to suggest a possible zoonotic link, and a national voluntary Johne's Disease Control Programme was initiated by Animal Health Ireland in 2013. The objective of this study was to estimate the herd-level true prevalence (HTP) and animal-level true prevalence (ATP) of paratuberculosis in Irish herds enrolled in the national voluntary JD control programme during 2013-14. Two datasets were used in this study. The first dataset had been collected in Ireland during 2005 (5822 animals from 119 herds), and was used to construct model priors. Model priors were updated with a primary (2013-14) dataset which included test records from 99,101 animals in 1039 dairy herds and was generated as part of the national voluntary JD control programme. The posterior estimate of HTP from the final Bayesian model was 0.23-0.34 with 95% probability. Across all herds, the median ATP was found to be 0.032 (0.009, 0.145). This study represents the first use of Bayesian methodology to estimate the prevalence of paratuberculosis in Irish dairy herds. The HTP estimate was higher than previous Irish estimates but still lower than estimates from other major dairy producing countries. PMID:27237395
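
    The general shape of such an estimate is easy to sketch: the true prevalence is linked to the observed test-positive count through the test's sensitivity and specificity, and uncertainty in those test characteristics is integrated over. The sketch below uses invented counts and invented Beta priors, not the study's data, priors, or herd-level structure.

      # Sketch: Bayesian estimation of animal-level true prevalence from an imperfect
      # test, averaging over prior uncertainty in sensitivity and specificity.
      import numpy as np
      from scipy import stats

      rng = np.random.default_rng(0)
      n_tested, n_positive = 2000, 90                  # toy test results

      pi_grid = np.linspace(0.001, 0.4, 400)           # candidate true prevalences
      se = rng.beta(30, 70, size=3000)                 # prior draws for sensitivity (~0.30)
      sp = rng.beta(985, 15, size=3000)                # prior draws for specificity (~0.985)

      # apparent prevalence for every (pi, Se, Sp) combination
      ap = pi_grid[:, None] * se[None, :] + (1 - pi_grid[:, None]) * (1 - sp[None, :])
      like = stats.binom.pmf(n_positive, n_tested, ap) # likelihood of the observed count
      post = like.mean(axis=1)                         # average over Se/Sp uncertainty
      post /= post.sum()

      mean_pi = (pi_grid * post).sum()
      cdf = np.cumsum(post)
      lo, hi = pi_grid[np.searchsorted(cdf, 0.025)], pi_grid[np.searchsorted(cdf, 0.975)]
      print(f"true prevalence: mean {mean_pi:.3f}, 95% credible interval ({lo:.3f}, {hi:.3f})")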

  11. Statistical process control for hospitals: methodology, user education, and challenges.

    PubMed

    Matthes, Nikolas; Ogunbo, Samuel; Pennington, Gaither; Wood, Nell; Hart, Marilyn K; Hart, Robert F

    2007-01-01

    The health care industry is slowly embracing the use of statistical process control (SPC) to monitor and study causes of variation in health care processes. While the statistics and principles underlying the use of SPC are relatively straightforward, there is a need to be cognizant of the perils that await the user who is not well versed in the key concepts of SPC. This article introduces the theory behind SPC methodology, describes successful tactics for educating users, and discusses the challenges associated with encouraging adoption of SPC among health care professionals. To illustrate these benefits and challenges, this article references the National Hospital Quality Measures, presents critical elements of SPC curricula, and draws examples from hospitals that have successfully embedded SPC into their overall approach to performance assessment and improvement. PMID:17627215

  12. Monte Carlo Bayesian Inference on a Statistical Model of Sub-gridcolumn Moisture Variability Using High-resolution Cloud Observations. Part II: Sensitivity Tests and Results

    NASA Technical Reports Server (NTRS)

    da Silva, Arlindo M.; Norris, Peter M.

    2013-01-01

    Part I presented a Monte Carlo Bayesian method for constraining a complex statistical model of GCM sub-gridcolumn moisture variability using high-resolution MODIS cloud data, thereby permitting large-scale model parameter estimation and cloud data assimilation. This part performs some basic testing of this new approach, verifying that it does indeed significantly reduce mean and standard deviation biases with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud top pressure, and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the OMI instrument. Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows finite jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. This paper also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in the cloud observables on cloud vertical structure, beyond cloud top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard (1998) provides some help in this respect, by better honoring inversion structures in the background state.

  13. Statistical Process Control of a Kalman Filter Model

    PubMed Central

    Gamse, Sonja; Nobakht-Ersi, Fereydoun; Sharifi, Mohammad A.

    2014-01-01

    For the evaluation of measurement data, different functional and stochastic models can be used. In the case of time series, a Kalman filtering (KF) algorithm can be implemented. In this case, a very well-known stochastic model, which includes statistical tests in the domain of measurements and in the system state domain, is used. Because the output results depend strongly on the input model parameters and the normal distribution of residuals is not always fulfilled, it is very important to perform all possible tests on the output results. In this contribution, we give a detailed description of the evaluation of the Kalman filter model. We describe indicators of inner confidence, such as controllability and observability, the determinant of the state transition matrix, the properties of the a posteriori system state covariance matrix, and the properties of the Kalman gain matrix. The statistical tests include the convergence of the standard deviations of the system state components and the normal distribution of residuals, in addition to the standard tests. In particular, computing the controllability and observability matrices and checking the normal distribution of residuals are not standard procedures in the implementation of KF. Practical implementation is done on geodetic kinematic observations. PMID:25264959
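
    Some of the inner-confidence checks described, such as building the observability and controllability matrices and testing the innovations for normality, are straightforward to script. The sketch below uses a generic constant-velocity model and simulated residuals, not the geodetic kinematic model of the paper.

      # Sketch: inner-confidence checks for a Kalman filter model -- observability and
      # controllability matrices plus a normality test on innovation residuals.
      import numpy as np
      from scipy import stats

      dt = 1.0
      F = np.array([[1.0, dt], [0.0, 1.0]])    # state transition (position, velocity)
      H = np.array([[1.0, 0.0]])               # only position is observed
      G = np.array([[0.5 * dt**2], [dt]])      # process-noise input matrix

      n = F.shape[0]
      observability = np.vstack([H @ np.linalg.matrix_power(F, k) for k in range(n)])
      controllability = np.hstack([np.linalg.matrix_power(F, k) @ G for k in range(n)])
      print("rank of observability matrix  :", np.linalg.matrix_rank(observability), "of", n)
      print("rank of controllability matrix:", np.linalg.matrix_rank(controllability), "of", n)
      print("determinant of state transition matrix:", np.linalg.det(F))

      # normality check on innovations (simulated here; in practice use filter output)
      innovations = np.random.default_rng(2).normal(0.0, 1.0, 500)
      stat, p_value = stats.shapiro(innovations)
      print(f"Shapiro-Wilk p-value for the residuals: {p_value:.3f}")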

  14. Yield enhancement in micromechanical sensor fabrication using statistical process control

    NASA Astrophysics Data System (ADS)

    Borenstein, Jeffrey T.; Preble, Douglas M.

    1997-09-01

    Statistical process control (SPC) has gained wide acceptance in recent years as an essential tool for yield improvement in the microelectronics industry. In both manufacturing and research and development settings, statistical methods are extremely useful in process control and optimization. Here we describe the recent implementation of SPC in the micromachining fabrication process at Draper. A wide array of micromachined silicon sensors, including gyroscopes, accelerometers, and microphones, are routinely fabricated at Draper, often with rapidly changing designs and processes. In spite of Draper's requirements for rapid turnaround and relatively small, short production runs, SPC has turned out to be a critical component of the product development process. This paper describes the multipronged SPC approach we have developed and tailored to the particular requirements of an R & D micromachining process line. Standard tools such as Pareto charts, histograms, and cause-and-effect diagrams have been deployed to troubleshoot yield and performance problems in the micromachining process, and several examples of their use are described. More rigorous approaches, such as the use of control charts for variables and attributes, have been instituted with considerable success. The software package CornerstoneR was selected to handle the SPC program at Draper. We describe the highly automated process now in place for monitoring key processes, including diffusion, oxidation, photolithography, and etching. In addition to the process monitoring, gauge capability is applied to critical metrology tools on a regular basis. Applying these tools in the process line has resulted in sharply improved yields and shortened process cycles.

  15. Statistical process control program at a ceramics vendor facility

    SciTech Connect

    Enke, G.M.

    1992-12-01

    Development of a statistical process control (SPC) program at a ceramics vendor location was deemed necessary to improve product quality, reduce manufacturing flowtime, and reduce quality costs borne by AlliedSignal Inc., Kansas City Division (KCD), and the vendor. Because of the lack of available KCD manpower and the required time schedule for the project, it was necessary for the SPC program to be implemented by an external contractor. Approximately a year after the program had been installed, the original baseline was reviewed so that the success of the project could be determined.

  16. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    ERIC Educational Resources Information Center

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A. G.

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are…

  17. Bayesian neural adjustment of inhibitory control predicts emergence of problem stimulant use

    PubMed Central

    Stewart, Jennifer L.; Zhang, Shunan; Tapert, Susan F.; Yu, Angela J.; Paulus, Martin P.

    2015-01-01

    Bayesian ideal observer models quantify individuals’ context- and experience-dependent beliefs and expectations about their environment, which provides a powerful approach (i) to link basic behavioural mechanisms to neural processing; and (ii) to generate clinical predictors for patient populations. Here, we focus on (ii) and determine whether individual differences in the neural representation of the need to stop in an inhibitory task can predict the development of problem use (i.e. abuse or dependence) in individuals experimenting with stimulants. One hundred and fifty-seven non-dependent occasional stimulant users, aged 18–24, completed a stop-signal task while undergoing functional magnetic resonance imaging. These individuals were prospectively followed for 3 years and evaluated for stimulant use and abuse/dependence symptoms. At follow-up, 38 occasional stimulant users met criteria for a stimulant use disorder (problem stimulant users), while 50 had discontinued use (desisted stimulant users). We found that those individuals who showed greater neural responses associated with Bayesian prediction errors, i.e. the difference between actual and expected need to stop on a given trial, in right medial prefrontal cortex/anterior cingulate cortex, caudate, anterior insula, and thalamus were more likely to exhibit problem use 3 years later. Importantly, these computationally based neural predictors outperformed clinical measures and non-model based neural variables in predicting clinical status. In conclusion, young adults who show exaggerated brain processing underlying whether to ‘stop’ or to ‘go’ are more likely to develop stimulant abuse. Thus, Bayesian cognitive models provide both a computational explanation and potential predictive biomarkers of belief processing deficits in individuals at risk for stimulant addiction. PMID:26336910

  18. Bayesian neural adjustment of inhibitory control predicts emergence of problem stimulant use.

    PubMed

    Harlé, Katia M; Stewart, Jennifer L; Zhang, Shunan; Tapert, Susan F; Yu, Angela J; Paulus, Martin P

    2015-11-01

    Bayesian ideal observer models quantify individuals' context- and experience-dependent beliefs and expectations about their environment, which provides a powerful approach (i) to link basic behavioural mechanisms to neural processing; and (ii) to generate clinical predictors for patient populations. Here, we focus on (ii) and determine whether individual differences in the neural representation of the need to stop in an inhibitory task can predict the development of problem use (i.e. abuse or dependence) in individuals experimenting with stimulants. One hundred and fifty-seven non-dependent occasional stimulant users, aged 18-24, completed a stop-signal task while undergoing functional magnetic resonance imaging. These individuals were prospectively followed for 3 years and evaluated for stimulant use and abuse/dependence symptoms. At follow-up, 38 occasional stimulant users met criteria for a stimulant use disorder (problem stimulant users), while 50 had discontinued use (desisted stimulant users). We found that those individuals who showed greater neural responses associated with Bayesian prediction errors, i.e. the difference between actual and expected need to stop on a given trial, in right medial prefrontal cortex/anterior cingulate cortex, caudate, anterior insula, and thalamus were more likely to exhibit problem use 3 years later. Importantly, these computationally based neural predictors outperformed clinical measures and non-model based neural variables in predicting clinical status. In conclusion, young adults who show exaggerated brain processing underlying whether to 'stop' or to 'go' are more likely to develop stimulant abuse. Thus, Bayesian cognitive models provide both a computational explanation and potential predictive biomarkers of belief processing deficits in individuals at risk for stimulant addiction. PMID:26336910

  19. A Bayesian semiparametric approach for incorporating longitudinal information on exposure history for inference in case-control studies.

    PubMed

    Bhadra, Dhiman; Daniels, Michael J; Kim, Sungduk; Ghosh, Malay; Mukherjee, Bhramar

    2012-06-01

    In a typical case-control study, exposure information is collected at a single time point for the cases and controls. However, case-control studies are often embedded in existing cohort studies containing a wealth of longitudinal exposure history about the participants. Recent medical studies have indicated that incorporating past exposure history, or a constructed summary measure of cumulative exposure derived from the past exposure history, when available, may lead to more precise and clinically meaningful estimates of the disease risk. In this article, we propose a flexible Bayesian semiparametric approach to model the longitudinal exposure profiles of the cases and controls and then use measures of cumulative exposure based on a weighted integral of this trajectory in the final disease risk model. The estimation is done via a joint likelihood. In the construction of the cumulative exposure summary, we introduce an influence function, a smooth function of time to characterize the association pattern of the exposure profile on the disease status with different time windows potentially having differential influence/weights. This enables us to analyze how the present disease status of a subject is influenced by his/her past exposure history conditional on the current ones. The joint likelihood formulation allows us to properly account for uncertainties associated with both stages of the estimation process in an integrated manner. Analysis is carried out in a hierarchical Bayesian framework using reversible jump Markov chain Monte Carlo algorithms. The proposed methodology is motivated by, and applied to a case-control study of prostate cancer where longitudinal biomarker information is available for the cases and controls. PMID:22313248

  20. Statistical process control based chart for information systems security

    NASA Astrophysics Data System (ADS)

    Khan, Mansoor S.; Cui, Lirong

    2015-07-01

    Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions into information systems. We put forward the concept of applying statistical process control (SPC) to intrusions in computer networks and information systems. In this article we propose an exponentially weighted moving average (EWMA) type quality monitoring scheme. Our proposed scheme has only one parameter, which differentiates it from past versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We give a comparison of the proposed scheme with existing EWMA schemes and the p chart; finally, we provide some recommendations for future work.
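
    The paper's one-parameter scheme is not reproduced here; purely as a hedged sketch of the general EWMA-chart machinery it builds on, the code below computes the textbook EWMA statistic with its time-varying control limits and flags the first out-of-control observation on a simulated traffic metric.

      # Textbook EWMA control chart: z_t = lam*x_t + (1-lam)*z_{t-1}, with the usual
      # time-varying L-sigma limits. Illustrative; not the paper's specific scheme.
      import numpy as np

      def ewma_chart(x, target, sigma, lam=0.2, L=3.0):
          z = np.empty_like(x, dtype=float)
          prev = target
          alarms = []
          for t, xt in enumerate(x):
              prev = lam * xt + (1 - lam) * prev
              z[t] = prev
              width = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (t + 1))))
              if abs(prev - target) > width:
                  alarms.append(t)
          return z, alarms

      rng = np.random.default_rng(5)
      x = rng.normal(0.0, 1.0, 200)            # a monitored network/security metric
      x[120:] += 0.8                           # small sustained shift, e.g. anomalous traffic
      z, alarms = ewma_chart(x, target=0.0, sigma=1.0)
      print("first alarm at observation:", alarms[0] if alarms else None)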

  1. Bayesian demography 250 years after Bayes

    PubMed Central

    Bijak, Jakub; Bryant, John

    2016-01-01

    Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889

  4. BIE: Bayesian Inference Engine

    NASA Astrophysics Data System (ADS)

    Weinberg, Martin D.

    2013-12-01

    The Bayesian Inference Engine (BIE) is an object-oriented library of tools written in C++ designed explicitly to enable Bayesian update and model comparison for astronomical problems. To facilitate "what if" exploration, BIE provides a command line interface (written with Bison and Flex) to run input scripts. The output of the code is a simulation of the Bayesian posterior distribution, from which summary statistics can be determined, e.g. by taking moments or computing confidence intervals. All of these quantities are fundamentally integrals, and the Markov chain approach produces variates θ distributed according to P(θ|D), so moments are trivially obtained by averaging over the ensemble of variates.
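
    The point about posterior summaries being integrals is easy to make concrete: given Markov chain draws θ distributed according to P(θ|D), moments are sample averages and credible intervals are sample quantiles. The draws below are simulated rather than produced by BIE.

      # Posterior summaries from MCMC draws: moments are sample means, credible
      # intervals are sample quantiles. Draws here are simulated for illustration.
      import numpy as np

      rng = np.random.default_rng(42)
      theta = rng.normal(loc=2.0, scale=0.5, size=20000)   # stand-in for MCMC variates

      post_mean = theta.mean()                             # E[theta | D]
      post_var = theta.var(ddof=1)                         # Var[theta | D]
      ci_lo, ci_hi = np.quantile(theta, [0.025, 0.975])    # 95% credible interval
      print(f"mean={post_mean:.3f}  var={post_var:.3f}  95% CI=({ci_lo:.3f}, {ci_hi:.3f})")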

  6. Post hoc Analysis for Detecting Individual Rare Variant Risk Associations Using Probit Regression Bayesian Variable Selection Methods in Case-Control Sequencing Studies.

    PubMed

    Larson, Nicholas B; McDonnell, Shannon; Albright, Lisa Cannon; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham; MacInnis, Robert; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catolona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J

    2016-09-01

    Rare variants (RVs) have been shown to be significant contributors to complex disease risk. By definition, these variants have very low minor allele frequencies and traditional single-marker methods for statistical analysis are underpowered for typical sequencing study sample sizes. Multimarker burden-type approaches attempt to identify aggregation of RVs across case-control status by analyzing relatively small partitions of the genome, such as genes. However, it is generally the case that the aggregative measure would be a mixture of causal and neutral variants, and these omnibus tests do not directly provide any indication of which RVs may be driving a given association. Recently, Bayesian variable selection approaches have been proposed to identify RV associations from a large set of RVs under consideration. Although these approaches have been shown to be powerful at detecting associations at the RV level, there are often computational limitations on the total quantity of RVs under consideration and compromises are necessary for large-scale application. Here, we propose a computationally efficient alternative formulation of this method using a probit regression approach specifically capable of simultaneously analyzing hundreds to thousands of RVs. We evaluate our approach to detect causal variation on simulated data and examine sensitivity and specificity in instances of high RV dimensionality as well as apply it to pathway-level RV analysis results from a prostate cancer (PC) risk case-control sequencing study. Finally, we discuss potential extensions and future directions of this work. PMID:27312771
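
    The probit machinery that such samplers build on can be sketched with Albert-Chib data augmentation. The code below is a generic illustration of Bayesian probit regression for a binary case-control outcome on simulated data with an assumed Gaussian prior; it is not the authors' variable-selection algorithm:

    ```python
    import numpy as np
    from scipy.stats import truncnorm

    rng = np.random.default_rng(1)

    # Simulated design matrix (n subjects x p rare variants) and case status.
    n, p = 500, 20
    X = rng.binomial(1, 0.05, size=(n, p)).astype(float)
    beta_true = np.zeros(p)
    beta_true[:2] = 1.5                                   # two causal variants (assumed)
    y = (X @ beta_true + rng.normal(size=n) > 0).astype(int)

    tau2 = 4.0                                            # prior variance for beta (assumed)
    V = np.linalg.inv(X.T @ X + np.eye(p) / tau2)         # posterior covariance of beta | z
    beta = np.zeros(p)
    draws = []

    for it in range(2000):
        # 1) Sample latent z_i | beta, y_i from a truncated normal (Albert & Chib, 1993).
        mu = X @ beta
        lower = np.where(y == 1, 0.0, -np.inf)            # z > 0 for cases
        upper = np.where(y == 1, np.inf, 0.0)             # z <= 0 for controls
        z = truncnorm.rvs(lower - mu, upper - mu, loc=mu, scale=1.0, random_state=rng)
        # 2) Sample beta | z from its multivariate normal full conditional.
        beta = rng.multivariate_normal(V @ (X.T @ z), V)
        if it >= 500:                                     # discard burn-in
            draws.append(beta)

    print("posterior means:", np.round(np.mean(draws, axis=0), 2))
    ```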

  7. Resist Profile Control Obtained Through A Desirability Function And Statistically Designed Experiments

    NASA Astrophysics Data System (ADS)

    Bell, Kenneth L.; Christensen, Lorna D.

    1989-07-01

    This paper describes a technique used to determine an optimized microlithographic process using statistical methods, which included a statistically designed experiment (SDE); a desirability function, d(θ*); and a rigorous daily statistical process control (SPC) program.
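
    The abstract does not define d(θ*); one common construction in response-surface work is the Derringer-Suich composite desirability, sketched below with assumed target ranges for two hypothetical resist-profile responses:

    ```python
    import math

    def desirability_target(y, low, target, high, s=1.0, t=1.0):
        """Derringer-Suich 'target is best' desirability: 1 at the target, 0 outside [low, high]."""
        if y <= low or y >= high:
            return 0.0
        if y <= target:
            return ((y - low) / (target - low)) ** s
        return ((high - y) / (high - target)) ** t

    # Hypothetical responses: resist sidewall angle (degrees) and linewidth bias (nm).
    d_angle = desirability_target(y=86.0, low=80.0, target=88.0, high=90.0)
    d_bias = desirability_target(y=12.0, low=0.0, target=10.0, high=25.0)

    # The composite desirability is the geometric mean of the individual desirabilities.
    D = math.sqrt(d_angle * d_bias)
    print(f"d_angle = {d_angle:.2f}, d_bias = {d_bias:.2f}, composite D = {D:.2f}")
    ```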

  8. LOWER LEVEL INFERENCE CONTROL IN STATISTICAL DATABASE SYSTEMS

    SciTech Connect

    Lipton, D.L.; Wong, H.K.T.

    1984-02-01

    An inference is the process of transforming unclassified data values into confidential data values. Most previous research in inference control has studied the use of statistical aggregates to deduce individual records. However, several other types of inference are also possible. Unknown functional dependencies may be apparent to users who have 'expert' knowledge about the characteristics of a population. Some correlations between attributes may be concluded from 'commonly-known' facts about the world. To counter these threats, security managers should use random sampling of databases of similar populations, as well as expert systems. 'Expert' users of the DATABASE SYSTEM may form inferences from the variable performance of the user interface. Users may observe on-line turn-around time, accounting statistics, the error messages received, and the point at which an interactive protocol sequence fails. One may obtain information about the frequency distributions of attribute values, and the validity of data object names, from this information. At the back-end of a database system, improved software engineering practices will reduce opportunities to bypass functional units of the database system. The term 'DATA OBJECT' should be expanded to incorporate those data object types which generate new classes of threats. The security of DATABASES and DATABASE SYSTEMS must be recognized as separate but related problems. Thus, by increased awareness of lower level inferences, system security managers may effectively nullify the threat posed by lower level inferences.

  9. Statistically Controlling for Confounding Constructs Is Harder than You Think

    PubMed Central

    Westfall, Jacob; Yarkoni, Tal

    2016-01-01

    Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement-level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest—in some cases approaching 100%—when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity. PMID:27031707
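
    A minimal simulation in the spirit of this argument (all parameter values below are assumptions, not the authors'): the focal predictor has no incremental effect, yet regressing the outcome on it plus an unreliable measure of the confounding construct produces a badly inflated Type I error rate.

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(42)
    n, reps, rel = 500, 2000, 0.7        # sample size, number of simulations, covariate reliability
    alpha, false_pos = 0.05, 0

    for _ in range(reps):
        T = rng.normal(size=n)                                         # true confounding construct
        X = T + rng.normal(size=n)                                     # focal predictor: no direct effect on Y
        Y = T + rng.normal(size=n)                                     # outcome driven only by T
        C = np.sqrt(rel) * T + np.sqrt(1 - rel) * rng.normal(size=n)   # unreliable measure of T

        Z = np.column_stack([np.ones(n), X, C])                        # regress Y on intercept, X, C
        beta, *_ = np.linalg.lstsq(Z, Y, rcond=None)
        resid = Y - Z @ beta
        s2 = resid @ resid / (n - Z.shape[1])
        se_x = np.sqrt(s2 * np.linalg.inv(Z.T @ Z)[1, 1])
        p = 2 * stats.t.sf(abs(beta[1] / se_x), df=n - Z.shape[1])     # test of the 'incremental' effect of X
        false_pos += p < alpha

    print(f"empirical Type I error: {false_pos / reps:.2f} (nominal {alpha})")
    ```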

  10. Application of statistical process control to qualitative molecular diagnostic assays.

    PubMed

    O'Brien, Cathal P; Finn, Stephen P

    2014-01-01

    Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data. PMID:25988159
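
    A sketch of the general idea (the paper's exact procedure and thresholds are not reproduced here): compare the observed mutation frequency in a run of samples against the expected frequency using a binomial confidence interval, and flag the assay when the expected value falls outside it.

    ```python
    from scipy.stats import binomtest

    expected_freq = 0.40        # assumed historical mutation frequency for this assay
    positives, n = 25, 100      # mutation-positive results observed in the latest n samples

    result = binomtest(positives, n, p=expected_freq)
    ci = result.proportion_ci(confidence_level=0.95)     # Clopper-Pearson interval by default

    print(f"observed frequency = {positives / n:.2f}, 95% CI = ({ci.low:.2f}, {ci.high:.2f})")
    if not (ci.low <= expected_freq <= ci.high):
        print("Expected frequency lies outside the interval: investigate possible assay drift.")
    ```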

  11. Advances in Bayesian Modeling in Educational Research

    ERIC Educational Resources Information Center

    Levy, Roy

    2016-01-01

    In this article, I provide a conceptually oriented overview of Bayesian approaches to statistical inference and contrast them with frequentist approaches that currently dominate conventional practice in educational research. The features and advantages of Bayesian approaches are illustrated with examples spanning several statistical modeling…

  12. Geological Controls on Glacier Surging?: Statistics and Speculation

    NASA Astrophysics Data System (ADS)

    Flowers, G. E.; Crompton, J. W.

    2015-12-01

    Glacier surging represents an end-member behavior in the spectrum of ice dynamics, involving marked acceleration and high flow speeds due to abrupt changes in basal mechanics. Though much effort has been devoted to understanding the role of basal hydrology and thermal regime in fast glacier flow, fewer studies have addressed the potential role of the geologic substrate. One interesting observation is that surge-type glaciers appear almost universally associated with unconsolidated (till) beds, and several large-scale statistical studies have revealed correlations between glacier surging and bedrock properties. We revisit this relationship using field measurements. We selected 20 individual glaciers for sampling in a 40x40 km region of the St. Elias Mountains of Yukon, Canada. Eleven of these glaciers are known to surge and nine are not. The 20 study glaciers are underlain by lithologies that we have broadly classified into two types: metasedimentary only and mixed metasedimentary-granodiorite. We characterized geological and geotechnical properties of the bedrock in each basin, and analyzed the hydrochemistry and mineralogy and grain size distribution (GSD) of the suspended sediments in the proglacial streams. Here we focus on some intriguing results of the GSD analysis. Using statistical techniques, including significance testing and principal component analysis, we find that: (1) lithology determines GSD for non-surge-type glaciers, with metasedimentary basins associated with finer mean grain sizes and mixed-lithology basins with coarser mean grain sizes, but (2) the GSDs associated with surge-type glaciers are intermediate between the distributions described above, and are statistically indistinguishable between metasedimentary and mixed lithology basins. The latter suggests either that surge-type glaciers in our study area occur preferentially in basins where various processes conspire to produce a characteristic GSD, or that the surge cycle itself exerts an

  13. Statistical process control testing of electronic security equipment

    SciTech Connect

    Murray, D.W.; Spencer, D.D.

    1994-06-01

    Statistical Process Control testing of manufacturing processes began back in the 1940s with the development of process control charts by Dr. Walter A. Shewhart. Sandia National Laboratories has developed an application of the SPC method for performance testing of electronic security equipment. This paper documents the evaluation of this testing methodology applied to electronic security equipment and an associated laptop computer-based system for obtaining and analyzing the test data. Sandia developed this SPC sensor performance testing method primarily for use on portal metal detectors, but has evaluated it for testing of an exterior intrusion detection sensor and other electronic security devices. This method is an alternative to the traditional binomial (alarm or no-alarm) performance testing. The limited amount of information in binomial data drives the number of tests necessary to meet regulatory requirements to unnecessarily high levels. For example, a requirement of a 0.85 probability of detection with a 90% confidence requires a minimum of 19 alarms out of 19 trials. By extracting and analyzing measurement (variables) data whenever possible instead of the more typical binomial data, the user becomes more informed about equipment health with fewer tests (as low as five per periodic evaluation).
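
    The 19-out-of-19 figure can be reproduced with a short calculation. The sketch below uses the Clopper-Pearson lower bound for an all-pass test sequence and assumes the stated 90% confidence is two-sided (i.e., a 95% one-sided bound), which is one common reading of such requirements:

    ```python
    from scipy.stats import beta

    def lower_bound_all_pass(n, one_sided_conf=0.95):
        """Clopper-Pearson lower confidence bound on detection probability when all n trials alarm."""
        # With x = n successes the bound solves p**n = 1 - conf, i.e. p = (1 - conf)**(1/n),
        # which is the (1 - conf) quantile of a Beta(n, 1) distribution.
        return beta.ppf(1 - one_sided_conf, n, 1)

    for n in range(16, 21):
        print(n, round(lower_bound_all_pass(n), 4))
    # n = 18 gives a bound just below 0.85; n = 19 is the smallest all-pass sequence meeting Pd >= 0.85.
    ```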

  14. Bayesian Inference: with ecological applications

    USGS Publications Warehouse

    Link, William A.; Barker, Richard J.

    2010-01-01

    This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies, as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.

  15. Experimental adaptive Bayesian tomography

    NASA Astrophysics Data System (ADS)

    Kravtsov, K. S.; Straupe, S. S.; Radchenko, I. V.; Houlsby, N. M. T.; Huszár, F.; Kulik, S. P.

    2013-06-01

    We report an experimental realization of an adaptive quantum state tomography protocol. Our method takes advantage of a Bayesian approach to statistical inference and is naturally tailored for adaptive strategies. For pure states, we observe close to N^-1 scaling of infidelity with overall number of registered events, while the best nonadaptive protocols allow for N^-1/2 scaling only. Experiments are performed for polarization qubits, but the approach is readily adapted to any dimension.

  16. Impact angle control of interplanetary shock geoeffectiveness: A statistical study

    NASA Astrophysics Data System (ADS)

    Oliveira, Denny M.; Raeder, Joachim

    2015-06-01

    We present a survey of interplanetary (IP) shocks using Wind and ACE satellite data from January 1995 to December 2013 to study how IP shock geoeffectiveness is controlled by IP shock impact angles. A shock list covering one and a half solar cycles is compiled. The yearly number of IP shocks is found to correlate well with the monthly sunspot number. We use data from SuperMAG, a large chain of more than 300 geomagnetic stations, to study geoeffectiveness triggered by IP shocks. The SuperMAG SML index, an enhanced version of the familiar AL index, is used in our statistical analysis. The jumps of the SML index triggered by IP shock impacts on the Earth's magnetosphere are investigated in terms of IP shock orientation and speed. We find that, in general, strong (high speed) and almost frontal (small impact angle) shocks are more geoeffective than inclined shocks with low speed. The strongest correlation (correlation coefficient R = 0.78) occurs for fixed IP shock speed and for varied IP shock impact angle. We attribute this result, predicted previously with simulations, to the fact that frontal shocks compress the magnetosphere symmetrically from all sides, which is a favorable condition for the release of magnetic energy stored in the magnetotail, which in turn can produce moderate to strong auroral substorms, which are then observed by ground-based magnetometers.

  17. A Statistical Process Control Method for Semiconductor Manufacturing

    NASA Astrophysics Data System (ADS)

    Kubo, Tomoaki; Ino, Tomomi; Minami, Kazuhiro; Minami, Masateru; Homma, Tetsuya

    To maintain stable operation of semiconductor fabrication lines, statistical process control (SPC) methods are recognized to be effective. However, in semiconductor fabrication lines, there exist a huge number of process state signals to be monitored, and these signals contain both normally and non-normally distributed data. Therefore, if we try to apply SPC methods to those signals, we need one which satisfies three requirements: 1) It can deal with both normally distributed data, and non-normally distributed data, 2) It can be set up automatically, 3) It can be easily understood by engineers and technicians. In this paper, we propose a new SPC method which satisfies these three requirements at the same time. This method uses similar rules to the Shewhart chart, but can deal with non-normally distributed data by introducing “effective standard deviations”. Usefulness of this method is demonstrated by comparing false alarm ratios to that of the Shewhart chart method. In the demonstration, we use various kinds of artificially generated data, and real data observed in a chemical vapor deposition (CVD) process tool in a semiconductor fabrication line.
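
    The abstract does not spell out how the "effective standard deviations" are computed; one common surrogate for non-normal data, shown below purely as an illustration of the idea, derives an effective sigma from the empirical 0.135th and 99.865th percentiles (the points that bound ±3 sigma for a normal distribution) and applies Shewhart-style limits to it:

    ```python
    import numpy as np

    rng = np.random.default_rng(7)
    # Hypothetical non-normally distributed process signal (e.g., a skewed chamber reading).
    signal = rng.gamma(shape=2.0, scale=1.5, size=5000)

    center = np.median(signal)
    p_lo, p_hi = np.percentile(signal, [0.135, 99.865])
    sigma_eff = (p_hi - p_lo) / 6.0           # "effective" sigma: the span covering +/-3 sigma

    ucl = center + 3.0 * sigma_eff
    lcl = max(center - 3.0 * sigma_eff, 0.0)  # clip at zero for a non-negative signal
    new_value = 14.2                          # hypothetical new observation
    print(f"LCL = {lcl:.2f}, UCL = {ucl:.2f}, alarm = {not (lcl <= new_value <= ucl)}")
    ```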

  18. A Bayesian approach to strengthen inference for case-control studies with multiple error-prone exposure assessments.

    PubMed

    Zhang, Jing; Cole, Stephen R; Richardson, David B; Chu, Haitao

    2013-11-10

    In case-control studies, exposure assessments are almost always error-prone. In the absence of a gold standard, two or more assessment approaches are often used to classify people with respect to exposure. Each imperfect assessment tool may lead to misclassification of exposure assignment; the exposure misclassification may be differential with respect to case status or not; and, the errors in exposure classification under the different approaches may be independent (conditional upon the true exposure status) or not. Although methods have been proposed to study diagnostic accuracy in the absence of a gold standard, these methods are infrequently used in case-control studies to correct exposure misclassification that is simultaneously differential and dependent. In this paper, we proposed a Bayesian method to estimate the measurement-error corrected exposure-disease association, accounting for both differential and dependent misclassification. The performance of the proposed method is investigated using simulations, which show that the proposed approach works well, as well as an application to a case-control study assessing the association between asbestos exposure and mesothelioma.

  19. Bayesian model selection techniques as decision support for shaping a statistical analysis plan of a clinical trial: An example from a vertigo phase III study with longitudinal count data as primary endpoint

    PubMed Central

    2012-01-01

    Background A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results The instruments under study provide excellent

  20. Performance Monitoring and Assessment of Neuro-Adaptive Controllers for Aerospace Applications Using a Bayesian Approach

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Guenther, Kurt; Hodgkinson, John; Jacklin, Stephen; Richard, Michael; Schumann, Johann; Soares, Fola

    2005-01-01

    Modern exploration missions require modern control systems: control systems that can handle catastrophic changes in the system's behavior, compensate for slow deterioration in sustained operations, and support fast system identification. Adaptive controllers based upon neural networks have these capabilities, but they can only be used safely if proper verification and validation (V&V) can be done. In this paper we present our V&V approach and simulation results within NASA's Intelligent Flight Control Systems (IFCS).

  1. Operation assistance for the Bio-Remote environmental control system using a Bayesian Network-based prediction model.

    PubMed

    Shibanoki, Taro; Nakamura, Go; Shima, Keisuke; Chin, Takaaki; Tsuji, Toshio

    2015-08-01

    This paper proposes a Bayesian Network (BN) based prediction model for layer-based selection and its application to operation assistance for the environmental control system Bio-Remote (BR). In the proposed method, each node of the BN model is associated with the layer-based selection function and corresponds to an individual operation command, appliance, etc., and previous logs of operation commands and time divisions are used as input factors to predict the user's intended operation. The prediction results are displayed on the layer-based selection interface of the BR, and the number of operations and the time taken for them can be reduced with the proposed prediction model. In the experiments, life-logs were collected from a cervical spinal injury patient who used the BR in daily life, and the proposed model was trained on these recorded life-logs. The prediction accuracy for control devices of the BR system using the proposed model was 84.3 ± 6.5%. The results indicated that the proposed prediction model could be useful for operation assistance in the BR system. PMID:26736472
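
    A drastically simplified, hypothetical stand-in for the idea (not the authors' BN model): estimate P(next command | previous command, time slot) from operation logs by counting, and offer the most probable command as the assisted selection.

    ```python
    from collections import Counter, defaultdict

    # Hypothetical operation log entries: (time_slot, previous_command, next_command).
    log = [
        ("morning", "tv_on", "channel_up"),
        ("morning", "tv_on", "channel_up"),
        ("morning", "tv_on", "volume_up"),
        ("evening", "lights_on", "tv_on"),
        ("evening", "lights_on", "tv_on"),
        ("evening", "tv_on", "volume_up"),
    ]

    counts = defaultdict(Counter)
    for slot, prev_cmd, next_cmd in log:
        counts[(slot, prev_cmd)][next_cmd] += 1

    def predict(slot, prev_cmd):
        """Return the most probable next command and its estimated probability."""
        c = counts[(slot, prev_cmd)]
        if not c:
            return None, 0.0
        cmd, k = c.most_common(1)[0]
        return cmd, k / sum(c.values())

    print(predict("morning", "tv_on"))    # ('channel_up', 0.666...)
    ```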

  2. UNIFORMLY MOST POWERFUL BAYESIAN TESTS

    PubMed Central

    Johnson, Valen E.

    2014-01-01

    Uniformly most powerful tests are statistical hypothesis tests that provide the greatest power against a fixed null hypothesis among all tests of a given size. In this article, the notion of uniformly most powerful tests is extended to the Bayesian setting by defining uniformly most powerful Bayesian tests to be tests that maximize the probability that the Bayes factor, in favor of the alternative hypothesis, exceeds a specified threshold. Like their classical counterpart, uniformly most powerful Bayesian tests are most easily defined in one-parameter exponential family models, although extensions outside of this class are possible. The connection between uniformly most powerful tests and uniformly most powerful Bayesian tests can be used to provide an approximate calibration between p-values and Bayes factors. Finally, issues regarding the strong dependence of resulting Bayes factors and p-values on sample size are discussed. PMID:24659829
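
    For a one-sided test of a normal mean with known variance, the UMPBT alternative and the induced calibration between the Bayes factor threshold γ and a p-value can be computed directly. The sketch below follows the standard one-parameter exponential-family result; the numbers are illustrative and not taken from the article:

    ```python
    import numpy as np
    from scipy.stats import norm

    def umpbt_one_sided_z(gamma, n, sigma=1.0):
        """UMPBT quantities for H0: mu = 0 vs mu > 0 with known sigma and sample size n."""
        mu1 = sigma * np.sqrt(2.0 * np.log(gamma) / n)    # alternative maximizing P(BF > gamma)
        z_crit = np.sqrt(2.0 * np.log(gamma))             # at mu1, BF > gamma  <=>  z > sqrt(2 ln gamma)
        p_value = norm.sf(z_crit)                         # one-sided p-value at that threshold
        return mu1, z_crit, p_value

    for gamma in (5, 10, 50):
        mu1, z_crit, p = umpbt_one_sided_z(gamma, n=100)
        print(f"gamma = {gamma:3d}: mu1 = {mu1:.3f}, z threshold = {z_crit:.3f}, p at threshold = {p:.4f}")
    ```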

  3. 77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-02

    ...: Statistical process control is the application of statistical methods to the monitoring, or quality control... monitors manufacturing procedures, validation summaries, and quality control data prior to licensure and... at implementation and then monitor these processes on a regular basis, using quality control...

  4. Performance Monitoring and Assessment of Neuro-Adaptive Controllers for Aerospace Applications Using a Bayesian Approach

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Jacklin, Stephen; Schumann, Johann; Guenther, Kurt; Richard, Michael; Soares, Fola

    2005-01-01

    Modern aircraft, UAVs, and robotic spacecraft pose substantial requirements on controllers in the light of ever increasing demands for reusability, affordability, and reliability. The individual systems (which are often nonlinear) must be controlled safely and reliably in environments where it is virtually impossible to analyze, ahead of time, all the important and possible scenarios and environmental factors. For example, system components (e.g., gyros, bearings of reaction wheels, valves) may deteriorate or break during autonomous UAV operation or long-lasting space missions, leading to a sudden, drastic change in vehicle performance. Manual repair or replacement is not an option in such cases. Instead, the system must be able to cope with equipment failure and deterioration. Controllability of the system must be retained as well as possible, or re-established as fast as possible, with a minimum of deactivation or shutdown of the system being controlled. In such situations the control engineer has to employ adaptive control systems that automatically sense and correct themselves whenever drastic disturbances and/or severe changes in the plant or environment occur.

  5. Bayesian inference for an emerging arboreal epidemic in the presence of control

    PubMed Central

    Parry, Matthew; Gibson, Gavin J.; Parnell, Stephen; Gottwald, Tim R.; Irey, Michael S.; Gast, Timothy C.; Gilligan, Christopher A.

    2014-01-01

    The spread of Huanglongbing through citrus groves is used as a case study for modeling an emerging epidemic in the presence of a control. Specifically, the spread of the disease is modeled as a susceptible-exposed-infectious-detected-removed epidemic, where the exposure and infectious times are not observed, detection times are censored, removal times are known, and the disease is spreading through a heterogeneous host population with trees of different age and susceptibility. We show that it is possible to characterize the disease transmission process under these conditions. Two innovations in our work are (i) accounting for control measures via time dependence of the infectious process and (ii) including seasonal and host age effects in the model of the latent period. By estimating parameters in different subregions of a large commercially cultivated orchard, we establish a temporal pattern of invasion, host age dependence of the dispersal parameters, and a close to linear relationship between primary and secondary infectious rates. The model can be used to simulate Huanglongbing epidemics to assess economic costs and potential benefits of putative control scenarios. PMID:24711393

  7. Bayesian Analysis of Individual Level Personality Dynamics.

    PubMed

    Cripps, Edward; Wood, Robert E; Beckmann, Nadin; Lau, John; Beckmann, Jens F; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415

  8. Bayesian Analysis of Individual Level Personality Dynamics

    PubMed Central

    Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann

    2016-01-01

    A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415

  9. Bayesian stable isotope mixing models

    EPA Science Inventory

    In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixtur...

  10. A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research

    PubMed Central

    van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG

    2014-01-01

    Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396

  11. Bayesian Integrated Microbial Forensics

    SciTech Connect

    Jarman, Kristin H.; Kreuzer-Martin, Helen W.; Wunschel, David S.; Valentine, Nancy B.; Cliff, John B.; Petersen, Catherine E.; Colburn, Heather A.; Wahl, Karen L.

    2008-06-01

    In the aftermath of the 2001 anthrax letters, researchers have been exploring ways to predict the production environment of unknown source microorganisms. Different mass spectral techniques are being developed to characterize components of a microbe’s culture medium including water, carbon and nitrogen sources, metal ions added, and the presence of agar. Individually, each technique has the potential to identify one or two ingredients in a culture medium recipe. However, by integrating data from multiple mass spectral techniques, a more complete characterization is possible. We present a Bayesian statistical approach to integrated microbial forensics and illustrate its application on spores grown in different culture media.

  12. Statistical process control (SPC) for coordinate measurement machines

    SciTech Connect

    Escher, R.N.

    2000-01-04

    The application of process capability analysis, using designed experiments, and gage capability studies as they apply to coordinate measurement machine (CMM) uncertainty analysis and control will be demonstrated. The use of control standards in designed experiments, and the use of range charts and moving range charts to separate measurement error into its discrete components, will be discussed. The method used to monitor and analyze the components of repeatability and reproducibility will be presented, with specific emphasis on how to use control charts to determine and monitor CMM performance and capability, and stay within your uncertainty assumptions.

  13. The application of statistical process control to the development of CIS-based photovoltaics

    NASA Astrophysics Data System (ADS)

    Wieting, R. D.

    1996-01-01

    This paper reviews the application of Statistical Process Control (SPC) as well as other statistical methods to the development of thin film CuInSe2-based module fabrication processes. These methods have rigorously demonstrated the reproducibility of a number of individual process steps in module fabrication and led to the identification of previously unrecognized sources of process variation. A process exhibiting good statistical control with 11.4% mean module efficiency has been demonstrated.

  14. Using Statistical Process Control to Make Data-Based Clinical Decisions.

    ERIC Educational Resources Information Center

    Pfadt, Al; Wheeler, Donald J.

    1995-01-01

    Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…

  15. Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings

    ERIC Educational Resources Information Center

    Omar, M. Hafidz

    2010-01-01

    Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…

  16. Methods of Statistical Control for Groundwater Quality Indicators

    NASA Astrophysics Data System (ADS)

    Yankovich, E.; Nevidimova, O.; Yankovich, K.

    2016-06-01

    The article describes the results of groundwater quality control. The controlled quality indicators included the following microelements: barium, manganese, iron, mercury, iodine, chromium, strontium, etc. Quality control charts (an X-bar chart and an R chart) were built. The maximum permissible concentrations of the components in drinking water and the lower limits of their biologically significant concentrations were selected as the upper and lower threshold limits, respectively. Analysis of the charts shows that the levels of microelement content in the water in the study area are stable. Most elements in the groundwater are present in concentrations that are significant for the human organisms consuming the water. For example, such elements as Ba, Mn, and Fe have concentrations that exceed the maximum permissible levels for drinking water.

  17. Perception, illusions and Bayesian inference.

    PubMed

    Nour, Matthew M; Nour, Joseph M

    2015-01-01

    Descriptive psychopathology makes a distinction between veridical perception and illusory perception. In both cases a perception is tied to a sensory stimulus, but in illusions the perception is of a false object. This article re-examines this distinction in light of new work in theoretical and computational neurobiology, which views all perception as a form of Bayesian statistical inference that combines sensory signals with prior expectations. Bayesian perceptual inference can solve the 'inverse optics' problem of veridical perception and provides a biologically plausible account of a number of illusory phenomena, suggesting that veridical and illusory perceptions are generated by precisely the same inferential mechanisms.

  18. Active Control for Statistically Stationary Turbulent PremixedFlame Simulations

    SciTech Connect

    Bell, J.B.; Day, M.S.; Grcar, J.F.; Lijewski, M.J.

    2005-08-30

    The speed of propagation of a premixed turbulent flame correlates with the intensity of the turbulence encountered by the flame. One consequence of this property is that premixed flames in both laboratory experiments and practical combustors require some type of stabilization mechanism to prevent blow-off and flashback. The stabilization devices often introduce a level of geometric complexity that is prohibitive for detailed computational studies of turbulent flame dynamics. Furthermore, the stabilization introduces additional fluid mechanical complexity into the overall combustion process that can complicate the analysis of fundamental flame properties. To circumvent these difficulties we introduce a feedback control algorithm that allows us to computationally stabilize a turbulent premixed flame in a simple geometric configuration. For the simulations, we specify turbulent inflow conditions and dynamically adjust the integrated fueling rate to control the mean location of the flame in the domain. We outline the numerical procedure, and illustrate the behavior of the control algorithm on methane flames at various equivalence ratios in two dimensions. The simulation data are used to study the local variation in the speed of propagation due to flame surface curvature.
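
    As a toy illustration of the kind of feedback loop described (the paper's actual control law is not reproduced here; the flame-response surrogate and gain below are invented), a simple integral controller can hold the mean flame position at a set point by adjusting the fueling rate:

    ```python
    # Integral feedback sketch: adjust the integrated fueling rate so the mean flame
    # position tracks a set point. The response model and gain are purely illustrative.
    target = 0.5        # desired mean flame location (fraction of domain length)
    pos = 0.8           # current mean flame position
    fuel = 1.0          # integrated fueling rate (arbitrary units)
    gain = 0.3          # integral feedback gain

    for step in range(30):
        fuel += gain * (pos - target)          # flame too far downstream -> add fuel
        # Surrogate flame response: position relaxes toward a fuel-dependent equilibrium.
        pos = 0.7 * pos + 0.3 * (1.3 - fuel)
        if step % 10 == 9:
            print(f"step {step + 1:2d}: fuel = {fuel:.3f}, mean flame position = {pos:.3f}")
    ```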

  19. A Comparison of Bayesian Monte Carlo Markov Chain and Maximum Likelihood Estimation Methods for the Statistical Analysis of Geodetic Time Series

    NASA Astrophysics Data System (ADS)

    Olivares, G.; Teferle, F. N.

    2013-12-01

    Geodetic time series provide information which helps to constrain theoretical models of geophysical processes. It is well established that such time series, for example from GPS, superconducting gravity or mean sea level (MSL), contain time-correlated noise which is usually assumed to be a combination of a long-term stochastic process (characterized by a power-law spectrum) and random noise. Therefore, when fitting a model to geodetic time series it is essential to also estimate the stochastic parameters beside the deterministic ones. Often the stochastic parameters include the power amplitudes of both time-correlated and random noise, as well as the spectral index of the power-law process. To date, the most widely used method for obtaining these parameter estimates is based on maximum likelihood estimation (MLE). We present an integration method, the Bayesian Markov Chain Monte Carlo (MCMC) method, which, by using Markov chains, provides a sample of the posterior distribution of all parameters so that, using Monte Carlo integration, all parameters and their uncertainties are estimated simultaneously. This algorithm automatically optimizes the Markov chain step size and estimates the convergence state by spectral analysis of the chain. We assess the MCMC method through comparison with MLE, using the recently released GPS position time series from JPL, and apply it also to the MSL time series from the Revised Local Reference data base of the PSMSL. Although the parameter estimates for both methods are fairly equivalent, they suggest that the MCMC method has some advantages over MLE: for example, it provides the spectral index uncertainty without further computation, is computationally stable, and detects multimodality.

  20. GASP cloud encounter statistics - Implications for laminar flow control flight

    NASA Technical Reports Server (NTRS)

    Jasperson, W. H.; Nastrom, G. D.; Davis, R. E.; Holdeman, J. D.

    1984-01-01

    The cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP) is analyzed in order to derive the probability of cloud encounter at altitudes normally flown by commercial airliners, for application to a determination of the feasibility of Laminar Flow Control (LFC) on long-range routes. The probability of cloud encounter is found to vary significantly with season. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover. The cloud encounter data are shown to be consistent with the classical midlatitude cyclone model with more clouds encountered in highs than in lows. Aircraft measurements of route-averaged time-in-clouds fit a gamma probability distribution model which is applied to estimate the probability of extended cloud encounter, and the associated loss of LFC effectiveness along seven high-density routes. The probability is demonstrated to be low.
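
    To illustrate the kind of calculation implied (the GASP data and fitted parameters are not reproduced here; the numbers below are synthetic), one can fit a gamma distribution to route-averaged time-in-cloud fractions and evaluate the probability of exceeding an extended-encounter threshold:

    ```python
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(3)
    # Synthetic route-averaged time-in-cloud fractions standing in for the GASP observations.
    time_in_cloud = rng.gamma(shape=0.8, scale=0.05, size=400)

    # Fit a two-parameter gamma model (location fixed at zero) by maximum likelihood.
    shape, loc, scale = stats.gamma.fit(time_in_cloud, floc=0)

    threshold = 0.20    # hypothetical "extended cloud encounter" fraction of a route
    p_exceed = stats.gamma.sf(threshold, shape, loc=loc, scale=scale)
    print(f"fitted shape = {shape:.2f}, scale = {scale:.3f}, P(time-in-cloud > {threshold}) = {p_exceed:.4f}")
    ```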

  1. What Is the Probability You Are a Bayesian?

    ERIC Educational Resources Information Center

    Wulff, Shaun S.; Robinson, Timothy J.

    2014-01-01

    Bayesian methodology continues to be widely used in statistical applications. As a result, it is increasingly important to introduce students to Bayesian thinking at early stages in their mathematics and statistics education. While many students in upper level probability courses can recite the differences in the Frequentist and Bayesian…

  2. Bayesian superresolution

    NASA Astrophysics Data System (ADS)

    Isakson, Steve Wesley

    2001-12-01

    Well-known principles of physics explain why resolution restrictions occur in images produced by optical diffraction-limited systems. The limitations involved are present in all diffraction-limited imaging systems, including acoustical and microwave. In most circumstances, however, prior knowledge about the object and the imaging system can lead to resolution improvements. In this dissertation I outline a method to incorporate prior information into the process of reconstructing images to superresolve the object beyond the above limitations. This dissertation research develops the details of this methodology. The approach can provide the most-probable global solution employing a finite number of steps in both far-field and near-field images. In addition, in order to overcome the effects of noise present in any imaging system, this technique provides a weighted image that quantifies the likelihood of various imaging solutions. By utilizing Bayesian probability, the procedure is capable of incorporating prior information about both the object and the noise to overcome the resolution limitation present in many imaging systems. Finally I will present an imaging system capable of detecting the evanescent waves missing from far-field systems, thus improving the resolution further.

  3. Interrupted Time Series Versus Statistical Process Control in Quality Improvement Projects.

    PubMed

    Andersson Hagiwara, Magnus; Andersson Gäre, Boel; Elg, Mattias

    2016-01-01

    To measure the effect of quality improvement interventions, it is appropriate to use analysis methods that measure data over time. Examples of such methods include statistical process control analysis and interrupted time series with segmented regression analysis. This article compares the use of statistical process control analysis and interrupted time series with segmented regression analysis for evaluating the longitudinal effects of quality improvement interventions, using an example study on an evaluation of a computerized decision support system.
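
    As a compact reminder of what segmented regression for an interrupted time series estimates (a level change and a slope change at the intervention), here is a generic sketch on synthetic monthly data; the variable names and effect sizes are assumptions:

    ```python
    import numpy as np

    rng = np.random.default_rng(5)
    months = np.arange(48)
    post = (months >= 24).astype(float)                    # indicator for the post-intervention period
    time_after = np.where(months >= 24, months - 24, 0.0)  # time elapsed since the intervention

    # Synthetic outcome with a baseline trend, a level drop, and a slope change at month 24.
    y = 50 + 0.2 * months - 4.0 * post - 0.3 * time_after + rng.normal(0, 1.5, months.size)

    # Segmented regression design: intercept, baseline slope, level change, slope change.
    X = np.column_stack([np.ones_like(months, dtype=float), months, post, time_after])
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    for name, c in zip(["intercept", "baseline slope", "level change", "slope change"], coef):
        print(f"{name:>15s}: {c:+.2f}")
    ```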

  4. Application of statistical process control charts to monitor changes in animal production systems.

    PubMed

    De Vries, A; Reneau, J K

    2010-04-01

    Statistical process control (SPC) is a method of monitoring, controlling, and improving a process through statistical analysis. An important SPC tool is the control chart, which can be used to detect changes in production processes, including animal production systems, with a statistical level of confidence. This paper introduces the philosophy and types of control charts, design and performance issues, and provides a review of control chart applications in animal production systems found in the literature from 1977 to 2009. Primarily Shewhart and cumulative sum control charts have been described in animal production systems, with examples found in poultry, swine, dairy, and beef production systems. Examples include monitoring of growth, disease incidence, water intake, milk production, and reproductive performance. Most applications describe charting outcome variables, but more examples of control charts applied to input variables are needed, such as compliance to protocols, feeding practice, diet composition, and environmental factors. Common challenges for applications in animal production systems are the identification of the best statistical model for the common cause variability, grouping of data, selection of type of control chart, the cost of false alarms and lack of signals, and difficulty identifying the special causes when a change is signaled. Nevertheless, carefully constructed control charts are powerful methods to monitor animal production systems. Control charts might also supplement randomized controlled trials. PMID:20081080

  5. Quality Control of High-Dose-Rate Brachytherapy: Treatment Delivery Analysis Using Statistical Process Control

    SciTech Connect

    Able, Charles M.; Bright, Megan; Frizzell, Bart

    2013-03-01

    Purpose: Statistical process control (SPC) is a quality control method used to ensure that a process is well controlled and operates with little variation. This study determined whether SPC was a viable technique for evaluating the proper operation of a high-dose-rate (HDR) brachytherapy treatment delivery system. Methods and Materials: A surrogate prostate patient was developed using Vyse ordnance gelatin. A total of 10 metal oxide semiconductor field-effect transistors (MOSFETs) were placed from prostate base to apex. Computed tomography guidance was used to accurately position the first detector in each train at the base. The plan consisted of 12 needles with 129 dwell positions delivering a prescribed peripheral dose of 200 cGy. Sixteen accurate treatment trials were delivered as planned. Subsequently, a number of treatments were delivered with errors introduced, including wrong patient, wrong source calibration, wrong connection sequence, single needle displaced inferiorly 5 mm, and entire implant displaced 2 mm and 4 mm inferiorly. Two process behavior charts (PBC), an individual and a moving range chart, were developed for each dosimeter location. Results: There were 4 false positives resulting from 160 measurements from 16 accurately delivered treatments. For the inaccurately delivered treatments, the PBC indicated that measurements made at the periphery and apex (regions of high-dose gradient) were much more sensitive to treatment delivery errors. All errors introduced were correctly identified by either the individual or the moving range PBC in the apex region. Measurements at the urethra and base were less sensitive to errors. Conclusions: SPC is a viable method for assessing the quality of HDR treatment delivery. Further development is necessary to determine the most effective dose sampling, to ensure reproducible evaluation of treatment delivery accuracy.
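
    The individual (X) and moving-range (MR) process behavior charts used here follow standard formulas; the sketch below computes the usual limits (constants 2.66 and 3.267 for a moving range of two) for a hypothetical series of readings from one dosimeter location:

    ```python
    import numpy as np

    # Hypothetical per-delivery dose readings (cGy) at one dosimeter location.
    doses = np.array([201.3, 198.7, 200.5, 199.2, 202.1, 200.8, 199.9, 201.0,
                      198.4, 200.2, 201.6, 199.5, 200.9, 200.1, 199.7, 200.4])

    mr = np.abs(np.diff(doses))            # moving ranges of consecutive readings
    x_bar, mr_bar = doses.mean(), mr.mean()

    # Standard individuals (X) and moving-range (MR) chart limits for a moving range of size 2.
    x_ucl, x_lcl = x_bar + 2.66 * mr_bar, x_bar - 2.66 * mr_bar
    mr_ucl = 3.267 * mr_bar

    print(f"X chart:  CL = {x_bar:.1f}, LCL = {x_lcl:.1f}, UCL = {x_ucl:.1f}")
    print(f"MR chart: CL = {mr_bar:.2f}, UCL = {mr_ucl:.2f}")
    print("points beyond limits:", np.where((doses < x_lcl) | (doses > x_ucl))[0].tolist())
    ```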

  6. The application of statistical process control to the development of CIS-based photovoltaics

    SciTech Connect

    Wieting, R.D.

    1996-01-01

    This paper reviews the application of Statistical Process Control (SPC) as well as other statistical methods to the development of thin film CuInSe2-based module fabrication processes. These methods have rigorously demonstrated the reproducibility of a number of individual process steps in module fabrication and led to the identification of previously unrecognized sources of process variation. A process exhibiting good statistical control with 11.4% mean module efficiency has been demonstrated. © 1996 American Institute of Physics.

  7. Multileaf collimator performance monitoring and improvement using semiautomated quality control testing and statistical process control

    SciTech Connect

    Létourneau, Daniel; McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.

    2014-12-15

    Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves

  8. Toward improved prediction of the bedrock depth underneath hillslopes: Bayesian inference of the bottom-up control hypothesis using high-resolution topographic data

    NASA Astrophysics Data System (ADS)

    Gomes, Guilherme J. C.; Vrugt, Jasper A.; Vargas, Eurípedes A.

    2016-04-01

    The depth to bedrock controls a myriad of processes by influencing subsurface flow paths, erosion rates, soil moisture, and water uptake by plant roots. As hillslope interiors are very difficult and costly to illuminate and access, the topography of the bedrock surface is largely unknown. This essay is concerned with the prediction of spatial patterns in the depth to bedrock (DTB) using high-resolution topographic data, numerical modeling, and Bayesian analysis. Our DTB model builds on the bottom-up control on fresh-bedrock topography hypothesis of Rempe and Dietrich (2014) and includes a mass movement and bedrock-valley morphology term to extend the usefulness and general applicability of the model. We reconcile the DTB model with field observations using Bayesian analysis with the DREAM algorithm. We investigate explicitly the benefits of using spatially distributed parameter values to account implicitly, and in a relatively simple way, for rock mass heterogeneities that are very difficult, if not impossible, to characterize adequately in the field. We illustrate our method using an artificial data set of bedrock depth observations and then evaluate our DTB model with real-world data collected at the Papagaio river basin in Rio de Janeiro, Brazil. Our results demonstrate that the DTB model predicts accurately the observed bedrock depth data. The posterior mean DTB simulation is shown to be in good agreement with the measured data. The posterior prediction uncertainty of the DTB model can be propagated forward through hydromechanical models to derive probabilistic estimates of factors of safety.

  9. Comparing energy sources for surgical ablation of atrial fibrillation: a Bayesian network meta-analysis of randomized, controlled trials.

    PubMed

    Phan, Kevin; Xie, Ashleigh; Kumar, Narendra; Wong, Sophia; Medi, Caroline; La Meir, Mark; Yan, Tristan D

    2015-08-01

    Simplified maze procedures involving radiofrequency, cryoenergy and microwave energy sources have been increasingly utilized for surgical treatment of atrial fibrillation as an alternative to the traditional cut-and-sew approach. In the absence of direct comparisons, a Bayesian network meta-analysis is another alternative to assess the relative effect of different treatments, using indirect evidence. A Bayesian meta-analysis of indirect evidence was performed using 16 published randomized trials identified from 6 databases. Rank probability analysis was used to rank each intervention in terms of its probability of having the best outcome. Sinus rhythm prevalence beyond the 12-month follow-up was similar between the cut-and-sew, microwave and radiofrequency approaches, which were all ranked better than cryoablation (respectively, 39, 36, and 25 vs 1%). The cut-and-sew maze was ranked worst in terms of mortality outcomes compared with microwave, radiofrequency and cryoenergy (2 vs 19, 34, and 24%, respectively). The cut-and-sew maze procedure was associated with significantly lower stroke rates compared with microwave ablation [odds ratio <0.01; 95% confidence interval 0.00, 0.82], and ranked the best in terms of pacemaker requirements compared with microwave, radiofrequency and cryoenergy (81 vs 14, 1, and <0.01%, respectively). Bayesian rank probability analysis shows that the cut-and-sew approach is associated with the best outcomes in terms of sinus rhythm prevalence and stroke outcomes, and remains the gold standard approach for AF treatment. Given the limitations of indirect comparison analysis, these results should be viewed with caution and not over-interpreted.

  10. A Bayesian tutorial for data assimilation

    NASA Astrophysics Data System (ADS)

    Wikle, Christopher K.; Berliner, L. Mark

    2007-06-01

    Data assimilation is the process by which observational data are fused with scientific information. The Bayesian paradigm provides a coherent probabilistic approach for combining information, and thus is an appropriate framework for data assimilation. Viewing data assimilation as a problem in Bayesian statistics is not new. However, the field of Bayesian statistics is rapidly evolving and new approaches for model construction and sampling have been utilized recently in a wide variety of disciplines to combine information. This article includes a brief introduction to Bayesian methods. Paying particular attention to data assimilation, we review linkages to optimal interpolation, kriging, Kalman filtering, smoothing, and variational analysis. Discussion is provided concerning Monte Carlo methods for implementing Bayesian analysis, including importance sampling, particle filtering, ensemble Kalman filtering, and Markov chain Monte Carlo sampling. Finally, hierarchical Bayesian modeling is reviewed. We indicate how this approach can be used to incorporate significant physically based prior information into statistical models, thereby accounting for uncertainty. The approach is illustrated in a simplified advection-diffusion model.
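
    The core Bayesian combination of a prior forecast with an observation reduces, for a scalar Gaussian state, to the familiar Kalman update; the numbers below are purely illustrative:

    ```python
    # Scalar Gaussian data assimilation: combine a prior (forecast) with an observation.
    prior_mean, prior_var = 15.0, 4.0     # forecast of the state and its error variance
    obs, obs_var = 17.2, 1.0              # observation and its error variance

    # The Kalman gain weights the observation by the relative precisions.
    gain = prior_var / (prior_var + obs_var)
    post_mean = prior_mean + gain * (obs - prior_mean)
    post_var = (1.0 - gain) * prior_var

    print(f"posterior mean = {post_mean:.2f}, posterior variance = {post_var:.2f}")
    # Equivalent precision-weighted form:
    # post_mean = (prior_mean / prior_var + obs / obs_var) / (1 / prior_var + 1 / obs_var)
    ```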

  11. Bayesian analysis for kaon photoproduction

    SciTech Connect

    Marsainy, T. Mart, T.

    2014-09-25

    We have investigated the contribution of the nucleon resonances in the kaon photoproduction process by using an established statistical decision making method, i.e. the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes prior information and experimental data into account. The result indicates that certain resonances have larger probabilities to contribute to the process.

  12. Bayesian classification theory

    NASA Technical Reports Server (NTRS)

    Hanson, Robin; Stutz, John; Cheeseman, Peter

    1991-01-01

    The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework and using various mathematical and algorithmic approximations, the AutoClass system searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit or share model parameters through a class hierarchy. We summarize the mathematical foundations of AutoClass.
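
    A heavily simplified flavor of the mixture-based classification that AutoClass performs can be conveyed with expectation-maximization on a one-dimensional, two-class Gaussian mixture. AutoClass additionally places priors on the parameters and searches over the number of classes, which this sketch does not attempt; the data and initial guesses are invented.

      import numpy as np

      rng = np.random.default_rng(2)
      data = np.concatenate([rng.normal(-2.0, 1.0, 300), rng.normal(3.0, 0.7, 200)])

      # Initial guesses for mixture weights, means, and standard deviations.
      w = np.array([0.5, 0.5])
      mu = np.array([-1.0, 1.0])
      sd = np.array([1.0, 1.0])

      def normal_pdf(x, m, s):
          return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

      for _ in range(100):
          # E-step: posterior responsibility of each class for each point.
          dens = np.stack([w[k] * normal_pdf(data, mu[k], sd[k]) for k in range(2)])
          resp = dens / dens.sum(axis=0)
          # M-step: re-estimate parameters from the responsibility-weighted data.
          n_k = resp.sum(axis=1)
          w = n_k / data.size
          mu = (resp * data).sum(axis=1) / n_k
          sd = np.sqrt((resp * (data - mu[:, None]) ** 2).sum(axis=1) / n_k)

      print(w, mu, sd)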

  13. An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection

    ERIC Educational Resources Information Center

    Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant

    2006-01-01

    An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…

  14. Disciplined Decision Making in an Interdisciplinary Environment: Some Implications for Clinical Applications of Statistical Process Control.

    ERIC Educational Resources Information Center

    Hantula, Donald A.

    1995-01-01

    Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…

  15. Bayesian structural equation modeling in sport and exercise psychology.

    PubMed

    Stenling, Andreas; Ivarsson, Andreas; Johnson, Urban; Lindwall, Magnus

    2015-08-01

    Bayesian statistics is on the rise in mainstream psychology, but applications in sport and exercise psychology research are scarce. In this article, the foundations of Bayesian analysis are introduced, and we illustrate how to apply Bayesian structural equation modeling in a sport and exercise psychology setting. More specifically, we contrasted a confirmatory factor analysis on the Sport Motivation Scale II estimated with the most commonly used estimator, maximum likelihood, and a Bayesian approach with weakly informative priors for cross-loadings and correlated residuals. The results indicated that the model with Bayesian estimation and weakly informative priors provided a good fit to the data, whereas the model estimated with a maximum likelihood estimator did not produce a well-fitting model. The reasons for this discrepancy between maximum likelihood and Bayesian estimation are discussed as well as potential advantages and caveats with the Bayesian approach.

  16. Bayesian structural equation modeling in sport and exercise psychology.

    PubMed

    Stenling, Andreas; Ivarsson, Andreas; Johnson, Urban; Lindwall, Magnus

    2015-08-01

    Bayesian statistics is on the rise in mainstream psychology, but applications in sport and exercise psychology research are scarce. In this article, the foundations of Bayesian analysis are introduced, and we illustrate how to apply Bayesian structural equation modeling in a sport and exercise psychology setting. More specifically, we contrasted a confirmatory factor analysis on the Sport Motivation Scale II estimated with the most commonly used estimator, maximum likelihood, and a Bayesian approach with weakly informative priors for cross-loadings and correlated residuals. The results indicated that the model with Bayesian estimation and weakly informative priors provided a good fit to the data, whereas the model estimated with a maximum likelihood estimator did not produce a well-fitting model. The reasons for this discrepancy between maximum likelihood and Bayesian estimation are discussed as well as potential advantages and caveats with the Bayesian approach. PMID:26442771

  17. [Application of multivariate statistical analysis and thinking in quality control of Chinese medicine].

    PubMed

    Liu, Na; Li, Jun; Li, Bao-Guo

    2014-11-01

    The quality control of Chinese medicine has always been a hot spot and a difficult point in the development of traditional Chinese medicine (TCM), and it is one of the key problems restricting the modernization and internationalization of Chinese medicine. Multivariate statistical analysis is an analytical method well suited to the characteristics of TCM and has been used widely in studies of the quality control of TCM. It is applied to the multiple, mutually correlated indicators and variables that appear in quality control studies in order to uncover the hidden laws and relationships in the data, which can then serve decision-making and enable effective quality evaluation of TCM. In this paper, the application of multivariate statistical analysis in the quality control of Chinese medicine is summarized, which could provide the basis for its further study. PMID:25775806

  18. Statistical process control: An approach to quality assurance in the production of vitrified nuclear waste

    SciTech Connect

    Pulsipher, B.A.; Kuhn, W.L.

    1987-02-01

    Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate them into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs.

  19. Research on statistical process control for solvent residual quantity of packaging materials

    NASA Astrophysics Data System (ADS)

    Xiao, Yingzhe; Huang, Yanan

    2013-03-01

    Statistical Process Control (SPC) and its basic controlling tool, the control chart, are discussed in this paper in light of the development of quality management, the current situation of quality management in Chinese packaging enterprises, and the necessity of applying SPC. On this basis, the X-R control chart is used to analyze and control the solvent residual in the compounding process. This work may allow field personnel to find shortcomings in quality control in time by noticing the corresponding fluctuations and slow variations in the process. In addition, SPC also provides an objective basis for quality management personnel to assess the quality of semi-finished products and finished products.
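
    For readers unfamiliar with the X-R chart mentioned above, the sketch below computes Xbar and R chart limits for subgroups of five measurements, using the standard constants A2 = 0.577, D3 = 0 and D4 = 2.114 for subgroup size 5. The subgroup data stand in for solvent residual readings and are invented.

      import numpy as np

      rng = np.random.default_rng(3)
      # 20 subgroups of 5 hypothetical solvent-residual measurements (mg/m^2).
      subgroups = rng.normal(5.0, 0.3, size=(20, 5))

      xbar = subgroups.mean(axis=1)                        # subgroup means
      r = subgroups.max(axis=1) - subgroups.min(axis=1)    # subgroup ranges

      xbar_bar, r_bar = xbar.mean(), r.mean()
      A2, D3, D4 = 0.577, 0.0, 2.114                       # control-chart constants for n = 5

      xbar_limits = (xbar_bar - A2 * r_bar, xbar_bar + A2 * r_bar)
      r_limits = (D3 * r_bar, D4 * r_bar)

      print("Xbar chart: CL=%.3f, LCL=%.3f, UCL=%.3f" % (xbar_bar, *xbar_limits))
      print("R chart:    CL=%.3f, LCL=%.3f, UCL=%.3f" % (r_bar, *r_limits))
      out_of_control = np.where((xbar < xbar_limits[0]) | (xbar > xbar_limits[1]))[0]
      print("subgroups out of control:", out_of_control)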

  20. The Bayesian bridge between simple and universal kriging

    SciTech Connect

    Omre, H.; Halvorsen, K.B. )

    1989-10-01

    Kriging techniques are suited well for evaluation of continuous, spatial phenomena. Bayesian statistics are characterized by using prior qualified guesses on the model parameters. By merging kriging techniques and Bayesian theory, prior guesses may be used in a spatial setting. Partial knowledge of model parameters defines a continuum of models between what is named simple and universal kriging in geostatistical terminology. The Bayesian approach to kriging is developed and discussed, and a case study concerning depth conversion of seismic reflection times is presented.

  1. Intraplate volcanism controlled by back-arc and continental structures in NE Asia inferred from transdimensional Bayesian ambient noise tomography

    NASA Astrophysics Data System (ADS)

    Kim, Seongryong; Tkalčić, Hrvoje; Rhie, Junkee; Chen, Youlin

    2016-08-01

    Intraplate volcanism adjacent to active continental margins is not simply explained by plate tectonics or plume interaction. Recent volcanoes in northeast (NE) Asia, including NE China and the Korean Peninsula, are characterized by heterogeneous tectonic structures and geochemical compositions. Here we apply a transdimensional Bayesian tomography to estimate high-resolution images of group and phase velocity variations (with periods between 8 and 70 s). The method provides robust estimations of velocity maps, and the reliability of results is tested through carefully designed synthetic recovery experiments. Our maps reveal two sublithospheric low-velocity anomalies that connect back-arc regions (in Japan and Ryukyu Trench) with current margins of continental lithosphere where the volcanoes are distributed. Combined with evidence from previous geochemical and geophysical studies, we argue that the volcanoes are related to the low-velocity structures associated with back-arc processes and preexisting continental lithosphere.

  2. Application of machine learning and expert systems to Statistical Process Control (SPC) chart interpretation

    NASA Technical Reports Server (NTRS)

    Shewhart, Mark

    1991-01-01

    Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
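
    The interpretation step that the hybrid expert system automates amounts to checking a chart against pattern rules. A minimal rule checker in the spirit of the Western Electric rules is sketched below; the two rules shown (a point beyond the three-sigma limits, and eight consecutive points on one side of the centerline) are only a small subset of what such a system would encode, and the data are synthetic.

      import numpy as np

      def check_rules(values, center, sigma):
          """Flag two classic control-chart patterns in a series of plotted points."""
          alerts = []
          for i, v in enumerate(values):
              if abs(v - center) > 3.0 * sigma:                    # Rule 1: point beyond 3-sigma limits
                  alerts.append((i, "point beyond 3-sigma limit"))
          side = np.sign(np.asarray(values) - center)
          run = 0
          for i, s in enumerate(side):
              run = run + 1 if i > 0 and s != 0 and s == side[i - 1] else 1
              if run >= 8:                                         # Rule 2: run of 8 on one side
                  alerts.append((i, "8 consecutive points on one side of centerline"))
          return alerts

      rng = np.random.default_rng(4)
      points = list(rng.normal(0.0, 1.0, 20)) + [0.5 + abs(x) for x in rng.normal(0.5, 0.3, 9)]
      print(check_rules(points, center=0.0, sigma=1.0))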

  3. Human-centered sensor-based Bayesian control: Increased energy efficiency and user satisfaction in commercial lighting

    NASA Astrophysics Data System (ADS)

    Granderson, Jessica Ann

    2007-12-01

    The need for sustainable, efficient energy systems is the motivation that drove this research, which targeted the design of an intelligent commercial lighting system. Lighting in commercial buildings consumes approximately 13% of all the electricity generated in the US. Advanced lighting controls1 intended for use in commercial office spaces have proven to save up to 45% in electricity consumption. However, they currently comprise only a fraction of the market share, resulting in a missed opportunity to conserve energy. The research goals driving this dissertation relate directly to barriers hindering widespread adoption---increase user satisfaction, and provide increased energy savings through more sophisticated control. To satisfy these goals an influence diagram was developed to perform daylighting actuation. This algorithm was designed to balance the potentially conflicting lighting preferences of building occupants, with the efficiency desires of building facilities management. A supervisory control policy was designed to implement load shedding under a demand response tariff. Such tariffs offer incentives for customers to reduce their consumption during periods of peak demand, trough price reductions. In developing the value function occupant user testing was conducted to determine that computer and paper tasks require different illuminance levels, and that user preferences are sufficiently consistent to attain statistical significance. Approximately ten facilities managers were also interviewed and surveyed to isolate their lighting preferences with respect to measures of lighting quality and energy savings. Results from both simulation and physical implementation and user testing indicate that the intelligent controller can increase occupant satisfaction, efficiency, cost savings, and management satisfaction, with respect to existing commercial daylighting systems. Several important contributions were realized by satisfying the research goals. A general

  4. Adaptive statistic tracking control based on two-step neural networks with time delays.

    PubMed

    Yi, Yang; Guo, Lei; Wang, Hong

    2009-03-01

    This paper presents a new type of control framework for dynamical stochastic systems, called statistic tracking control (STC). The system considered is general and non-Gaussian and the tracking objective is the statistical information of a given target probability density function (pdf), rather than a deterministic signal. The control aims at making the statistical information of the output pdfs follow that of a target pdf. For such a control framework, a variable structure adaptive tracking control strategy is first established using two-step neural network models. Following the B-spline neural network approximation to the integrated performance function, the concerned problem is transformed into the tracking of given weights. The dynamic neural network (DNN) is employed to identify the unknown nonlinear dynamics between the control input and the weights related to the integrated function. To achieve the required control objective, an adaptive controller based on the proposed DNN is developed so as to track a reference trajectory. Stability analysis for both the identification and tracking errors is developed via the use of Lyapunov stability criterion. Simulations are given to demonstrate the efficiency of the proposed approach. PMID:19179249

  5. Bayesian inference on proportional elections.

    PubMed

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to estimate the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software.

  6. Bayesian Inference on Proportional Elections

    PubMed Central

    Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio

    2015-01-01

    Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied to proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to estimate the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied to data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
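
    To make the Monte Carlo idea above concrete, the sketch below estimates the probability that a small party wins at least one seat in a single district under a D'Hondt-style highest-averages allocation, drawing vote shares from a Dirichlet posterior. The allocation rule, the poll counts and the 10-seat district are illustrative assumptions, not the exact Brazilian seat-distribution rules used in the paper.

      import numpy as np

      def dhondt_seats(votes, n_seats):
          """Allocate seats by the D'Hondt highest-averages method."""
          seats = np.zeros(len(votes), dtype=int)
          for _ in range(n_seats):
              quotients = votes / (seats + 1)
              seats[np.argmax(quotients)] += 1
          return seats

      rng = np.random.default_rng(5)
      poll_counts = np.array([420, 310, 180, 60, 30])    # hypothetical poll results for 5 parties
      n_seats, n_sims = 10, 20000

      wins_seat = 0
      for _ in range(n_sims):
          shares = rng.dirichlet(poll_counts + 1)        # posterior draw of vote shares (uniform prior)
          seats = dhondt_seats(shares, n_seats)
          if seats[3] >= 1:                              # does party 4 gain representation?
              wins_seat += 1

      print("P(party 4 wins at least one seat) =", wins_seat / n_sims)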

  7. Mainstreaming Remedial Mathematics Students in Introductory Statistics: Results Using a Randomized Controlled Trial

    ERIC Educational Resources Information Center

    Logue, Alexandra W.; Watanabe-Rose, Mari

    2014-01-01

    This study used a randomized controlled trial to determine whether students, assessed by their community colleges as needing an elementary algebra (remedial) mathematics course, could instead succeed at least as well in a college-level, credit-bearing introductory statistics course with extra support (a weekly workshop). Researchers randomly…

  8. Using Statistical Process Control Charts to Study Stuttering Frequency Variability during a Single Day

    ERIC Educational Resources Information Center

    Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann

    2013-01-01

    Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…

  9. Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.

    ERIC Educational Resources Information Center

    Dunlap, Dale

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…

  10. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC I. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…

  11. Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)

    NASA Technical Reports Server (NTRS)

    Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)

    1999-01-01

    This paper presents a cooperative effort where the Software Engineering Institute and the Space Shuttle Onboard Software Project could experiment applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.

  12. Project T.E.A.M. (Technical Education Advancement Modules). Introduction to Statistical Process Control.

    ERIC Educational Resources Information Center

    Billings, Paul H.

    This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…

  13. The Application of Bayesian Analysis to Issues in Developmental Research

    ERIC Educational Resources Information Center

    Walker, Lawrence J.; Gustafson, Paul; Frimer, Jeremy A.

    2007-01-01

    This article reviews the concepts and methods of Bayesian statistical analysis, which can offer innovative and powerful solutions to some challenging analytical problems that characterize developmental research. In this article, we demonstrate the utility of Bayesian analysis, explain its unique adeptness in some circumstances, address some…

  14. A Bayesian Analysis of Finite Mixtures in the LISREL Model.

    ERIC Educational Resources Information Center

    Zhu, Hong-Tu; Lee, Sik-Yum

    2001-01-01

    Proposes a Bayesian framework for estimating finite mixtures of the LISREL model. The model augments the observed data of the manifest variables with the latent variables and allocation variables and uses the Gibbs sampler to obtain the Bayesian solution. Discusses other associated statistical inferences. (SLD)

  15. Using Alien Coins to Test Whether Simple Inference Is Bayesian

    ERIC Educational Resources Information Center

    Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.

    2016-01-01

    Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…

  16. Implementation of Statistical Process Control for Proteomic Experiments via LC MS/MS

    PubMed Central

    Bereman, Michael S.; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N.; MacCoss, Michael J.

    2014-01-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution); and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies. PMID:24496601

  17. Implementation of Statistical Process Control for Proteomic Experiments Via LC MS/MS

    NASA Astrophysics Data System (ADS)

    Bereman, Michael S.; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N.; MacCoss, Michael J.

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.

  18. Implementation of statistical process control for proteomic experiments via LC MS/MS.

    PubMed

    Bereman, Michael S; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N; MacCoss, Michael J

    2014-04-01

    Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies. PMID:24496601

  19. Statistical process control for referrals by general practitioner at Health Insurance Organization clinics in Alexandria.

    PubMed

    Abdel Wahab, Moataza M; Nofal, Laila M; Guirguis, Wafaa W; Mahdy, Nehad H

    2004-01-01

    Quality control is the application of statistical techniques to a process in an effort to identify and minimize both random and non-random sources of variation. The present study aimed at the application of Statistical Process Control (SPC) to analyze the referrals by General Practitioners (GPs) at Health Insurance Organization (HIO) clinics in Alexandria. A retrospective analysis of records and cross-sectional interviews with 180 GPs were done. Using control charts (p charts), the present study confirmed the presence of substantial variation in referral rates from GPs to specialists; more than 60% of the variation was of the special-cause type, which revealed that the referral process in Alexandria (HIO) was completely out of statistical control. Control charts for referrals by GPs classified by different GP characteristics or organizational factors revealed much variation, which suggested that the variation was at the level of individual GPs. Furthermore, a p chart was constructed for each GP separately, which yielded fewer points out of control (outliers), with an average of 4 points per GP. For 26 GPs there were no points out of control; those GPs were slightly older than those having points out of control, but otherwise there was no significant difference between them. The revised p chart for those 26 GPs together yielded a centerline of 9.7%, an upper control limit of 12.0% and a lower control limit of 7.4%. Those limits were in good agreement with the limits specified by HIO; they can be suggested as the new specification limits after some training programs. PMID:17265609
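
    The p chart used in this study has simple limits: with average referral proportion p-bar and subgroup size n, the limits are p-bar ± 3·sqrt(p-bar·(1 − p-bar)/n). A short sketch with invented monthly referral counts for one GP is given below; the counts are placeholders, not the HIO data.

      import numpy as np

      # Hypothetical monthly data for one GP: consultations and referrals.
      n_consults = np.array([150, 160, 140, 155, 150, 145, 160, 150])
      n_referrals = np.array([14, 15, 12, 18, 13, 31, 15, 14])

      p = n_referrals / n_consults
      p_bar = n_referrals.sum() / n_consults.sum()           # centerline

      # Control limits vary with subgroup size; clip the lower limit at zero.
      sigma = np.sqrt(p_bar * (1.0 - p_bar) / n_consults)
      ucl = p_bar + 3.0 * sigma
      lcl = np.clip(p_bar - 3.0 * sigma, 0.0, None)

      for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
          flag = "OUT OF CONTROL" if (pi < lo or pi > hi) else "in control"
          print(f"month {month}: p={pi:.3f}, limits=({lo:.3f}, {hi:.3f}) -> {flag}")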

  20. Bayesian Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Hobson, Michael P.; Jaffe, Andrew H.; Liddle, Andrew R.; Mukherjee, Pia; Parkinson, David

    2014-02-01

    Preface; Part I. Methods: 1. Foundations and algorithms John Skilling; 2. Simple applications of Bayesian methods D. S. Sivia and Steve Rawlings; 3. Parameter estimation using Monte Carlo sampling Antony Lewis and Sarah Bridle; 4. Model selection and multi-model inference Andrew R. Liddle, Pia Mukherjee and David Parkinson; 5. Bayesian experimental design and model selection forecasting Roberto Trotta, Martin Kunz, Pia Mukherjee and David Parkinson; 6. Signal separation in cosmology M. P. Hobson, M. A. J. Ashdown and V. Stolyarov; Part II. Applications: 7. Bayesian source extraction M. P. Hobson, Graça Rocha and R. Savage; 8. Flux measurement Daniel Mortlock; 9. Gravitational wave astronomy Neil Cornish; 10. Bayesian analysis of cosmic microwave background data Andrew H. Jaffe; 11. Bayesian multilevel modelling of cosmological populations Thomas J. Loredo and Martin A. Hendry; 12. A Bayesian approach to galaxy evolution studies Stefano Andreon; 13. Photometric redshift estimation: methods and applications Ofer Lahav, Filipe B. Abdalla and Manda Banerji; Index.

  1. Bayesian Methods in Cosmology

    NASA Astrophysics Data System (ADS)

    Hobson, Michael P.; Jaffe, Andrew H.; Liddle, Andrew R.; Mukherjee, Pia; Parkinson, David

    2009-12-01

    Preface; Part I. Methods: 1. Foundations and algorithms John Skilling; 2. Simple applications of Bayesian methods D. S. Sivia and Steve Rawlings; 3. Parameter estimation using Monte Carlo sampling Antony Lewis and Sarah Bridle; 4. Model selection and multi-model inference Andrew R. Liddle, Pia Mukherjee and David Parkinson; 5. Bayesian experimental design and model selection forecasting Roberto Trotta, Martin Kunz, Pia Mukherjee and David Parkinson; 6. Signal separation in cosmology M. P. Hobson, M. A. J. Ashdown and V. Stolyarov; Part II. Applications: 7. Bayesian source extraction M. P. Hobson, Graça Rocha and R. Savage; 8. Flux measurement Daniel Mortlock; 9. Gravitational wave astronomy Neil Cornish; 10. Bayesian analysis of cosmic microwave background data Andrew H. Jaffe; 11. Bayesian multilevel modelling of cosmological populations Thomas J. Loredo and Martin A. Hendry; 12. A Bayesian approach to galaxy evolution studies Stefano Andreon; 13. Photometric redshift estimation: methods and applications Ofer Lahav, Filipe B. Abdalla and Manda Banerji; Index.

  2. A suite of RS/1 procedures for chemical laboratory statistical quality control and Shewhart control charting

    SciTech Connect

    Shanahan, K.L.

    1990-09-01

    A suite of RS/1 procedures for Shewhart control charting in chemical laboratories is described. The suite uses the RS series product QCA (Quality Control Analysis) for chart construction and analysis. The suite prompts users for data in a user friendly fashion and adds the data to or creates the control charts. All activities are time stamped. Facilities for generating monthly or contiguous time segment summary charts are included. The suite is currently in use at Westinghouse Savannah River Company.

  3. Statistical multiplexing of VBR MPEG sources under credit-based flow control

    NASA Astrophysics Data System (ADS)

    Khorsandi, Siavash; Leon-Garcia, Alberto

    1996-03-01

    Due to statistical multiplexing in ATM networks, a large number of cells may be lost during the periods of network congestion. It is a common perception that feedback congestion control mechanisms do not work well for delay sensitive applications such as video transfer. The proposed approaches to avoid congestion in video applications are mainly based on constant bit-rate transmission. However, these schemes impose a delay in the order of a frame time. Besides, the network utilization is reduced since bandwidth allocation at peak rate is necessary. Variable bit rate (VBR) coding of video signals is more efficient both in terms of coding delay and bandwidth utilization. In this paper, we demonstrate that using credit-based flow control together with a selective cell discarding mechanism, VBR video signals coded according to the MPEG standard can be statistically multiplexed with a very high efficiency. Both cell delay and cell loss guarantees can be made while achieving a high network utilization. A throughput of up to 83 percent has been achieved with a cell loss rate of under 10^-5 and maximum end-to-end cell queuing delay of 15 milliseconds in the statistical multiplexing scenarios under consideration. Since credit-based flow control works well for data applications, its successful deployment for video applications will pave the way for an integrated congestion control protocol.

  4. Bayesian Modeling of a Human MMORPG Player

    NASA Astrophysics Data System (ADS)

    Synnaeve, Gabriel; Bessière, Pierre

    2011-03-01

    This paper describes an application of Bayesian programming to the control of an autonomous avatar in a multiplayer role-playing game (the example is based on World of Warcraft). We model a particular task, which consists of choosing what to do and to select which target in a situation where allies and foes are present. We explain the model in Bayesian programming and show how we could learn the conditional probabilities from data gathered during human-played sessions.

  5. A Unified Statistical Rain-Attenuation Model for Communication Link Fade Predictions and Optimal Stochastic Fade Control Design Using a Location-Dependent Rain-Statistic Database

    NASA Technical Reports Server (NTRS)

    Manning, Robert M.

    1990-01-01

    A static and dynamic rain-attenuation model is presented which describes the statistics of attenuation on an arbitrarily specified satellite link for any location for which there are long-term rainfall statistics. The model may be used in the design of the optimal stochastic control algorithms to mitigate the effects of attenuation and maintain link reliability. A rain-statistics data base is compiled, which makes it possible to apply the model to any location in the continental U.S. with a resolution of 0-5 degrees in latitude and longitude. The model predictions are compared with experimental observations, showing good agreement.

  6. Bayesian second law of thermodynamics

    NASA Astrophysics Data System (ADS)

    Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason

    2016-08-01

    We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ) + 〈Q〉_{F|m} ≥ 0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m}, and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.

  7. Bayesian second law of thermodynamics.

    PubMed

    Bartolotta, Anthony; Carroll, Sean M; Leichenauer, Stefan; Pollack, Jason

    2016-08-01

    We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ)+〈Q〉_{F|m}≥0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m} and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples. PMID:27627241

  8. A Bayesian Approach to Interactive Retrieval

    ERIC Educational Resources Information Center

    Tague, Jean M.

    1973-01-01

    A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles are applied: use of prior and sample information about the relationship of document descriptions to query relevance; maximization of expected value of a utility function, to the problem of optimally restructuring search strategies in an…

  9. BART: Bayesian Atmospheric Radiative Transfer fitting code

    NASA Astrophysics Data System (ADS)

    Cubillos, Patricio; Blecic, Jasmina; Harrington, Joseph; Rojo, Patricio; Lust, Nate; Bowman, Oliver; Stemm, Madison; Foster, Andrew; Loredo, Thomas J.; Fortney, Jonathan; Madhusudhan, Nikku

    2016-08-01

    BART implements a Bayesian, Monte Carlo-driven, radiative-transfer scheme for extracting parameters from spectra of planetary atmospheres. BART combines a thermochemical-equilibrium code, a one-dimensional line-by-line radiative-transfer code, and the Multi-core Markov-chain Monte Carlo statistical module to constrain the atmospheric temperature and chemical-abundance profiles of exoplanets.

  10. Bayesian Mediation Analysis

    ERIC Educational Resources Information Center

    Yuan, Ying; MacKinnon, David P.

    2009-01-01

    In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…

  11. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)

    SciTech Connect

    Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre

    2009-04-15

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center ({+-}4% of deviation between the calculated and measured doses) by calculating a control process capability (C{sub pc}) index. The C{sub pc} index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should

  12. A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).

    PubMed

    Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre

    2009-04-01

    The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
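
    Of the three charts named above, the EWMA chart is the least standard; the sketch below shows how its statistic and time-varying limits are computed, using invented per-patient dose deviations and the common parameter choices lambda = 0.2 and L = 3.

      import numpy as np

      rng = np.random.default_rng(6)
      # Hypothetical per-patient dose deviations (%) between calculation and measurement.
      deviations = np.concatenate([rng.normal(0.0, 1.0, 30), rng.normal(1.5, 1.0, 10)])

      lam, L = 0.2, 3.0
      mu0, sigma0 = 0.0, 1.0                       # target mean and process standard deviation

      z = np.empty_like(deviations)
      z_prev = mu0
      for i, x in enumerate(deviations):
          z_prev = lam * x + (1.0 - lam) * z_prev  # EWMA recursion
          z[i] = z_prev

      i = np.arange(1, deviations.size + 1)
      half_width = L * sigma0 * np.sqrt(lam / (2.0 - lam) * (1.0 - (1.0 - lam) ** (2 * i)))
      ucl, lcl = mu0 + half_width, mu0 - half_width

      out = (z > ucl) | (z < lcl)
      first_alarm = np.argmax(out) if np.any(out) else None
      print("first out-of-control point:", first_alarm)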

  13. Software For Multivariate Bayesian Classification

    NASA Technical Reports Server (NTRS)

    Saul, Ronald; Laird, Philip; Shelton, Robert

    1996-01-01

    PHD general-purpose classifier computer program. Uses Bayesian methods to classify vectors of real numbers, based on combination of statistical techniques that include multivariate density estimation, Parzen density kernels, and EM (Expectation Maximization) algorithm. By means of simple graphical interface, user trains classifier to recognize two or more classes of data and then uses it to identify new data. Written in ANSI C for Unix systems and optimized for online classification applications. Embedded in another program, or runs by itself using simple graphical user interface. Online help files make program easy to use.
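
    In the spirit of the brief above, though not the PHD program itself, the sketch below classifies vectors by combining Gaussian (Parzen) kernel density estimates for each class with class priors via Bayes' rule. The data, bandwidth and two-class setup are illustrative.

      import numpy as np

      rng = np.random.default_rng(7)
      class_a = rng.normal([0.0, 0.0], 1.0, size=(100, 2))     # training vectors, class A
      class_b = rng.normal([3.0, 3.0], 1.0, size=(80, 2))      # training vectors, class B

      def parzen_density(x, train, bandwidth=0.5):
          """Average of Gaussian kernels centered at the training vectors."""
          d = train.shape[1]
          sq = np.sum((train - x) ** 2, axis=1)
          norm = (2.0 * np.pi * bandwidth ** 2) ** (d / 2.0)
          return np.mean(np.exp(-0.5 * sq / bandwidth ** 2) / norm)

      def classify(x, classes, priors):
          # Posterior is proportional to prior times class-conditional density.
          scores = {name: priors[name] * parzen_density(x, train) for name, train in classes.items()}
          return max(scores, key=scores.get)

      classes = {"A": class_a, "B": class_b}
      priors = {"A": 100 / 180, "B": 80 / 180}
      print(classify(np.array([0.5, 0.2]), classes, priors))   # expected: "A"
      print(classify(np.array([2.8, 3.1]), classes, priors))   # expected: "B"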

  14. Bayesian Probability Theory

    NASA Astrophysics Data System (ADS)

    von der Linden, Wolfgang; Dose, Volker; von Toussaint, Udo

    2014-06-01

    Preface; Part I. Introduction: 1. The meaning of probability; 2. Basic definitions; 3. Bayesian inference; 4. Combinatorics; 5. Random walks; 6. Limit theorems; 7. Continuous distributions; 8. The central limit theorem; 9. Poisson processes and waiting times; Part II. Assigning Probabilities: 10. Transformation invariance; 11. Maximum entropy; 12. Qualified maximum entropy; 13. Global smoothness; Part III. Parameter Estimation: 14. Bayesian parameter estimation; 15. Frequentist parameter estimation; 16. The Cramer-Rao inequality; Part IV. Testing Hypotheses: 17. The Bayesian way; 18. The frequentist way; 19. Sampling distributions; 20. Bayesian vs frequentist hypothesis tests; Part V. Real World Applications: 21. Regression; 22. Inconsistent data; 23. Unrecognized signal contributions; 24. Change point problems; 25. Function estimation; 26. Integral equations; 27. Model selection; 28. Bayesian experimental design; Part VI. Probabilistic Numerical Techniques: 29. Numerical integration; 30. Monte Carlo methods; 31. Nested sampling; Appendixes; References; Index.

  15. Theoretical evaluation of the detectability of random lesions in bayesian emission reconstruction

    SciTech Connect

    Qi, Jinyi

    2003-05-01

    Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal to noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution and noise trade-off and hence affect detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on the recent progress on the theoretical analysis of image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for the maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results.

  16. Testing Bayesian models of human coincidence timing.

    PubMed

    Miyazaki, Makoto; Nozaki, Daichi; Nakajima, Yasoichi

    2005-07-01

    A sensorimotor control task often requires an accurate estimation of the timing of the arrival of an external target (e.g., when hitting a pitched ball). Conventional studies of human timing processes have ignored the stochastic features of target timing: e.g., the speed of the pitched ball is not generally constant, but is variable. Interestingly, based on Bayesian theory, it has been recently shown that the human sensorimotor system achieves the optimal estimation by integrating sensory information with prior knowledge of the probabilistic structure of the target variation. In this study, we tested whether Bayesian integration is also implemented while performing a coincidence-timing type of sensorimotor task by manipulating the trial-by-trial variability (i.e., the prior distribution) of the target timing. As a result, within several hundred trials of learning, subjects were able to generate systematic timing behavior according to the width of the prior distribution, as predicted by the optimal Bayesian model. Considering the previous studies showing that the human sensorimotor system uses Bayesian integration in spacing and force-grading tasks, our result indicates that Bayesian integration is fundamental to all aspects of human sensorimotor control. Moreover, it was noteworthy that the subjects could adjust their behavior both when the prior distribution was switched from wide to narrow and vice versa, although the adjustment was slower in the former case. Based on a comparison with observations in a previous study, we discuss the flexibility and adaptability of Bayesian sensorimotor learning.
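
    For the Gaussian case, the optimal Bayesian estimate described above has a closed form: the posterior mean is a precision-weighted average of the sensed timing and the mean of the prior (the trial-by-trial target distribution), so a narrower prior pulls the estimate more strongly toward its mean. The numbers below are illustrative, not the experimental values.

      # Closed-form Bayesian integration for a Gaussian prior and Gaussian sensory likelihood.
      def bayes_timing_estimate(sensed_ms, sensory_sd, prior_mean_ms, prior_sd):
          w = prior_sd**2 / (prior_sd**2 + sensory_sd**2)   # weight placed on the sensory evidence
          return w * sensed_ms + (1.0 - w) * prior_mean_ms

      # A narrow prior (low target variability) pulls the estimate strongly toward the prior mean;
      # a wide prior leaves the estimate close to the sensed timing.
      print(bayes_timing_estimate(sensed_ms=520.0, sensory_sd=50.0, prior_mean_ms=600.0, prior_sd=20.0))
      print(bayes_timing_estimate(sensed_ms=520.0, sensory_sd=50.0, prior_mean_ms=600.0, prior_sd=200.0))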

  17. The DWPF product composition control system at Savannah River: Statistical process control algorithm

    SciTech Connect

    Postles, R.L.; Brown, K.G.

    1991-01-01

    The DWPF Process batch-blends aqueous radwaste (PHA) with solid radwaste (Sludge) in a waste receipt vessel (the SRAT). The resulting SRAT-Batch is transferred to the next process vessel (the SME) and there blended with ground glass (Frit) to produce a batch of feed slurry. The SME-Batch is passed to a subsequent hold tank (the MFT) which feeds a Melter continuously. The Melter produces a molten glass wasteform which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic Repository. The Repository will require that the glass wasteform be resistant to leaching by any underground water that might contact it. In addition, there are processing constraints on Viscosity and Liquidus Temperature of the melt. The Product Composition Control System (PCCS) is the system intended to ensure that the melt will be Processible and that the glass wasteform will be Acceptable. Within the PCCS, the SPC Algorithm is the device which guides control of the DWPF process. The SPC Algorithm is needed to control the multivariate DWPF process in the face of uncertainties (variances and covariances) which arise from this process and its supply, sampling, modeling, and measurement systems.

  18. The DWPF product composition control system at Savannah River: Statistical process control algorithm

    SciTech Connect

    Postles, R.L.; Brown, K.G.

    1991-12-31

    The DWPF Process batch-blends aqueous radwaste (PHA) with solid radwaste (Sludge) in a waste receipt vessel (the SRAT). The resulting SRAT-Batch is transferred to the next process vessel (the SME) and there blended with ground glass (Frit) to produce a batch of feed slurry. The SME-Batch is passed to a subsequent hold tank (the MFT) which feeds a Melter continuously. The Melter produces a molten glass wasteform which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic Repository. The Repository will require that the glass wasteform be resistant to leaching by any underground water that might contact it. In addition, there are processing constraints on Viscosity and Liquidus Temperature of the melt. The Product Composition Control System (PCCS) is the system intended to ensure that the melt will be Processible and that the glass wasteform will be Acceptable. Within the PCCS, the SPC Algorithm is the device which guides control of the DWPF process. The SPC Algorithm is needed to control the multivariate DWPF process in the face of uncertainties (variances and covariances) which arise from this process and its supply, sampling, modeling, and measurement systems.

  19. Evaluating Statistical Process Control (SPC) techniques and computing the uncertainty of force calibrations

    NASA Technical Reports Server (NTRS)

    Navard, Sharon E.

    1989-01-01

    In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.

  20. "APEC Blue" association with emission control and meteorological conditions detected by multi-scale statistics

    NASA Astrophysics Data System (ADS)

    Wang, Ping; Dai, Xin-Gang

    2016-09-01

    The term "APEC Blue" has been created to describe the clear sky days since the Asia-Pacific Economic Cooperation (APEC) summit held in Beijing during November 5-11, 2014. The duration of the APEC Blue is detected from November 1 to November 14 (hereafter Blue Window) by moving t test in statistics. Observations show that APEC Blue corresponds to low air pollution with respect to PM2.5, PM10, SO2, and NO2 under strict emission-control measures (ECMs) implemented in Beijing and surrounding areas. Quantitative assessment shows that ECM is more effective on reducing aerosols than the chemical constituents. Statistical investigation has revealed that the window also resulted from intensified wind variability, as well as weakened static stability of atmosphere (SSA). The wind and ECMs played key roles in reducing air pollution during November 1-7 and 11-13, and strict ECMs and weak SSA become dominant during November 7-10 under weak wind environment. Moving correlation manifests that the emission reduction for aerosols can increase the apparent wind cleanup effect, leading to significant negative correlations of them, and the period-wise changes in emission rate can be well identified by multi-scale correlations basing on wavelet decomposition. In short, this case study manifests statistically how human interference modified air quality in the mega city through controlling local and surrounding emissions in association with meteorological condition.

  1. Pedestrian dynamics via Bayesian networks

    NASA Astrophysics Data System (ADS)

    Venkat, Ibrahim; Khader, Ahamad Tajudin; Subramanian, K. G.

    2014-06-01

    Studies on pedestrian dynamics have vital applications in crowd control management relevant to organizing safer large scale gatherings, including pilgrimages. Reasoning about pedestrian motion via computational intelligence techniques could be posed as a potential research problem within the realms of Artificial Intelligence. In this contribution, we propose a "Bayesian Network Model for Pedestrian Dynamics" (BNMPD) to reason about the vast uncertainty imposed by pedestrian motion. With reference to key findings from the literature, which include simulation studies, we systematically identify the various factors that could contribute to the prediction of crowd flow status. The proposed model unifies these factors in a cohesive manner using Bayesian Networks (BNs) and serves as a sophisticated probabilistic tool to simulate vital cause and effect relationships entailed in the pedestrian domain.
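
    To give a flavor of how such a network reasons, the sketch below builds a toy two-parent Bayesian network, Density and WalkwayWidth -> FlowStatus, with made-up conditional probability tables, and computes the posterior over flow status given evidence by brute-force enumeration. The variables and numbers are hypothetical, not the factors identified for the BNMPD.

      from itertools import product

      # Toy priors and conditional probability table; all numbers are invented for illustration.
      p_density = {"low": 0.6, "high": 0.4}
      p_width = {"narrow": 0.3, "wide": 0.7}
      # P(FlowStatus = "congested" | Density, WalkwayWidth)
      p_congested = {
          ("low", "narrow"): 0.2, ("low", "wide"): 0.05,
          ("high", "narrow"): 0.9, ("high", "wide"): 0.5,
      }

      def posterior_flow(evidence):
          """P(FlowStatus | evidence) by enumerating the unobserved parent variables."""
          weights = {"congested": 0.0, "free": 0.0}
          for d, w in product(p_density, p_width):
              if evidence.get("Density", d) != d or evidence.get("WalkwayWidth", w) != w:
                  continue
              joint = p_density[d] * p_width[w]
              weights["congested"] += joint * p_congested[(d, w)]
              weights["free"] += joint * (1.0 - p_congested[(d, w)])
          total = sum(weights.values())
          return {k: v / total for k, v in weights.items()}

      print(posterior_flow({"Density": "high"}))            # flow status given high density
      print(posterior_flow({"WalkwayWidth": "narrow"}))     # flow status given a narrow walkway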

  2. Small sample properties of an adaptive filter with application to low volume statistical process control

    SciTech Connect

    Crowder, S.V.; Eshleman, L.

    1998-08-01

    In many manufacturing environments such as the nuclear weapons complex, emphasis has shifted from the regular production and delivery of large orders to infrequent small orders. However, the challenge to maintain the same high quality and reliability standards while building much smaller lot sizes remains. To meet this challenge, specific areas need more attention, including fast and on-target process start-up, low volume statistical process control, process characterization with small experiments, and estimating reliability given few actual performance tests of the product. In this paper the authors address the issue of low volume statistical process control. They investigate an adaptive filtering approach to process monitoring with a relatively short time series of autocorrelated data. The emphasis is on estimation and minimization of mean squared error rather than the traditional hypothesis testing and run length analyses associated with process control charting. The authors develop an adaptive filtering technique that assumes initial process parameters are unknown, and updates the parameters as more data become available. Using simulation techniques, they study the data requirements (the length of a time series of autocorrelated data) necessary to adequately estimate process parameters. They show that far fewer data values are needed than is typically recommended for process control applications. They also demonstrate the techniques with a case study from the nuclear weapons manufacturing complex.

  3. Small Sample Properties of an Adaptive Filter with Application to Low Volume Statistical Process Control

    SciTech Connect

    CROWDER, STEPHEN V.

    1999-09-01

    In many manufacturing environments such as the nuclear weapons complex, emphasis has shifted from the regular production and delivery of large orders to infrequent small orders. However, the challenge to maintain the same high quality and reliability standards while building much smaller lot sizes remains. To meet this challenge, specific areas need more attention, including fast and on-target process start-up, low volume statistical process control, process characterization with small experiments, and estimating reliability given few actual performance tests of the product. In this paper we address the issue of low volume statistical process control. We investigate an adaptive filtering approach to process monitoring with a relatively short time series of autocorrelated data. The emphasis is on estimation and minimization of mean squared error rather than the traditional hypothesis testing and run length analyses associated with process control charting. We develop an adaptive filtering technique that assumes initial process parameters are unknown, and updates the parameters as more data become available. Using simulation techniques, we study the data requirements (the length of a time series of autocorrelated data) necessary to adequately estimate process parameters. We show that far fewer data values are needed than is typically recommended for process control applications. We also demonstrate the techniques with a case study from the nuclear weapons manufacturing complex.
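
    The sketch below is a much-simplified illustration of the adaptive filtering idea described in this and the preceding record (it is not the authors' filter): the mean and AR(1) coefficient of a short autocorrelated series are updated recursively with an exponential discount as each observation arrives, and the one-step-ahead prediction errors are returned as the quantity one might chart. The simulated data, discount factor, and starting values are assumptions.

      import numpy as np

      def adaptive_ar1_monitor(x, discount=0.9):
          """Recursively update mean and AR(1) coefficient estimates as data arrive.

          Returns one-step-ahead prediction errors, which can be charted when only a
          short series of autocorrelated observations is available.  This is a
          simplified illustration, not the filter developed in the report.
          """
          x = np.asarray(x, dtype=float)
          mean, phi = x[0], 0.0           # crude starting values; parameters are unknown a priori
          s_xy = s_xx = 1e-6              # exponentially discounted sums for the AR(1) slope
          errors = []
          for t in range(1, x.size):
              pred = mean + phi * (x[t - 1] - mean)
              errors.append(x[t] - pred)
              # exponentially discounted updates of the parameter estimates
              mean = discount * mean + (1 - discount) * x[t]
              s_xy = discount * s_xy + (x[t - 1] - mean) * (x[t] - mean)
              s_xx = discount * s_xx + (x[t - 1] - mean) ** 2
              phi = np.clip(s_xy / s_xx, -0.99, 0.99)
          return np.array(errors)

      # Simulate a short AR(1) series around a target of 10.
      rng = np.random.default_rng(1)
      x = np.empty(40)
      x[0] = 10.0
      for t in range(1, 40):
          x[t] = 10.0 + 0.6 * (x[t - 1] - 10.0) + rng.normal(0, 1)

      err = adaptive_ar1_monitor(x)
      print("one-step prediction error std after 40 points:", round(float(err.std()), 3))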

  4. Moving beyond qualitative evaluations of Bayesian models of cognition.

    PubMed

    Hemmer, Pernille; Tauber, Sean; Steyvers, Mark

    2015-06-01

    Bayesian models of cognition provide a powerful way to understand the behavior and goals of individuals from a computational point of view. Much of the focus in the Bayesian cognitive modeling approach has been on qualitative model evaluations, where predictions from the models are compared to data that is often averaged over individuals. In many cognitive tasks, however, there are pervasive individual differences. We introduce an approach to directly infer individual differences related to subjective mental representations within the framework of Bayesian models of cognition. In this approach, Bayesian data analysis methods are used to estimate cognitive parameters and motivate the inference process within a Bayesian cognitive model. We illustrate this integrative Bayesian approach on a model of memory. We apply the model to behavioral data from a memory experiment involving the recall of heights of people. A cross-validation analysis shows that the Bayesian memory model with inferred subjective priors predicts withheld data better than a Bayesian model where the priors are based on environmental statistics. In addition, the model with inferred priors at the individual subject level led to the best overall generalization performance, suggesting that individual differences are important to consider in Bayesian models of cognition.

  5. Use of statistical methods in industrial water pollution control regulations in the United States.

    PubMed

    Kahn, H D; Rubin, M B

    1989-11-01

    This paper describes the process for developing regulations limiting the discharge of pollutants from industrial sources into the waters of the United States. The process includes studies and surveys of the industry to define products, processes, wastewater sources and characteristics, appropriate subcategorization, and control technologies in use. Limitations on the amounts of pollutants that may be discharged in treated wastewater are based on statistical analysis of physical and chemical analytical data characterizing the performance capability of technologies in use in the industry. A general discussion of the statistical approach employed is provided along with some examples based on work performed to support recently promulgated regulations. The determination of regulatory discharge limitations, based on estimates of percentiles of lognormal distributions of measured pollutant concentrations in treated wastewater, is presented. Modifications to account for different averaging periods and detection limit observations are discussed. PMID:24243169
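
    The following Python sketch illustrates the kind of calculation described above: a daily-maximum limit set at an upper percentile of a lognormal distribution fitted to effluent concentrations. The data, the 99th-percentile choice, and the pollutant are hypothetical; real limits also involve averaging periods and detection-limit adjustments as noted in the abstract.

      import numpy as np
      from scipy import stats

      # Hypothetical long-term effluent concentrations (mg/L) from a well-operated treatment system.
      rng = np.random.default_rng(2)
      concentrations = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=200)

      # Fit a lognormal distribution: work with the logs, then back-transform.
      mu = np.mean(np.log(concentrations))
      sigma = np.std(np.log(concentrations), ddof=1)

      # Daily-maximum limit as the estimated 99th percentile of the fitted lognormal;
      # a monthly-average limit would instead use a percentile of the distribution of monthly means.
      daily_max_limit = np.exp(mu + sigma * stats.norm.ppf(0.99))
      print(f"estimated 99th percentile (daily maximum limit): {daily_max_limit:.2f} mg/L")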

  6. Using statistical process control for monitoring the prevalence of hospital-acquired pressure ulcers.

    PubMed

    Kottner, Jan; Halfens, Ruud

    2010-05-01

    Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grade 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods, prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited. PMID:20511685
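
    A minimal sketch of the P chart idea discussed above, assuming hypothetical yearly counts of at-risk patients and nosocomial pressure ulcers (none of these numbers come from the Dutch surveys):

      import numpy as np

      # Hypothetical yearly counts: patients at risk and patients with a nosocomial pressure ulcer.
      n_at_risk = np.array([950, 1010, 980, 1020, 990, 1000, 970, 1005, 995, 985, 1015])
      n_ulcers  = np.array([ 95,  101,  88,  112,  90,   85,  70,   80,  75,  60,   55])

      p = n_ulcers / n_at_risk
      p_bar = n_ulcers.sum() / n_at_risk.sum()           # overall (centre-line) proportion

      # P-chart limits vary with each year's sample size.
      sigma = np.sqrt(p_bar * (1 - p_bar) / n_at_risk)
      ucl = p_bar + 3 * sigma
      lcl = np.clip(p_bar - 3 * sigma, 0, None)

      for year, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1998):
          flag = "out of control" if (pi > hi or pi < lo) else "common-cause variation"
          print(f"{year}: p={pi:.3f}  limits=({lo:.3f}, {hi:.3f})  -> {flag}")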

  7. Bayesian Estimation of Thermonuclear Reaction Rates

    NASA Astrophysics Data System (ADS)

    Iliadis, C.; Anderson, K. S.; Coc, A.; Timmes, F. X.; Starrfield, S.

    2016-11-01

    The problem of estimating non-resonant astrophysical S-factors and thermonuclear reaction rates, based on measured nuclear cross sections, is of major interest for nuclear energy generation, neutrino physics, and element synthesis. Many different methods have been applied to this problem in the past, almost all of them based on traditional statistics. Bayesian methods, on the other hand, are now in widespread use in the physical sciences. In astronomy, for example, Bayesian statistics is applied to the observation of extrasolar planets, gravitational waves, and Type Ia supernovae. However, nuclear physics, in particular, has been slow to adopt Bayesian methods. We present astrophysical S-factors and reaction rates based on Bayesian statistics. We develop a framework that incorporates robust parameter estimation, systematic effects, and non-Gaussian uncertainties in a consistent manner. The method is applied to the reactions d(p,γ)³He, ³He(³He,2p)⁴He, and ³He(α,γ)⁷Be, important for deuterium burning, solar neutrinos, and Big Bang nucleosynthesis.

  8. A hybrid Bayesian hierarchical model combining cohort and case-control studies for meta-analysis of diagnostic tests: Accounting for partial verification bias.

    PubMed

    Ma, Xiaoye; Chen, Yong; Cole, Stephen R; Chu, Haitao

    2014-05-26

    To account for between-study heterogeneity in meta-analysis of diagnostic accuracy studies, bivariate random effects models have been recommended to jointly model the sensitivities and specificities. As study design and population vary, the definition of disease status or severity could differ across studies. Consequently, sensitivity and specificity may be correlated with disease prevalence. To account for this dependence, a trivariate random effects model had been proposed. However, the proposed approach can only include cohort studies with information estimating study-specific disease prevalence. In addition, some diagnostic accuracy studies only select a subset of samples to be verified by the reference test. It is known that ignoring unverified subjects may lead to partial verification bias in the estimation of prevalence, sensitivities, and specificities in a single study. However, the impact of this bias on a meta-analysis has not been investigated. In this paper, we propose a novel hybrid Bayesian hierarchical model combining cohort and case-control studies and correcting partial verification bias at the same time. We investigate the performance of the proposed methods through a set of simulation studies. Two case studies on assessing the diagnostic accuracy of gadolinium-enhanced magnetic resonance imaging in detecting lymph node metastases and of adrenal fluorine-18 fluorodeoxyglucose positron emission tomography in characterizing adrenal masses are presented.

  9. Variable selection in Bayesian generalized linear-mixed models: an illustration using candidate gene case-control association studies.

    PubMed

    Tsai, Miao-Yu

    2015-03-01

    The problem of variable selection in the generalized linear-mixed models (GLMMs) is pervasive in statistical practice. For the purpose of variable selection, many methodologies for determining the best subset of explanatory variables currently exist according to the model complexity and differences between applications. In this paper, we develop a "higher posterior probability model with bootstrap" (HPMB) approach to select explanatory variables without fitting all possible GLMMs involving a small or moderate number of explanatory variables. Furthermore, to save computational load, we propose an efficient approximation approach with Laplace's method and Taylor's expansion to approximate intractable integrals in GLMMs. Simulation studies and an application of HapMap data provide evidence that this selection approach is computationally feasible and reliable for exploring true candidate genes and gene-gene associations, after adjusting for complex structures among clusters.

  10. Monitoring Actuarial Present Values of Term Life Insurance By a Statistical Process Control Chart

    NASA Astrophysics Data System (ADS)

    Hafidz Omar, M.

    2015-06-01

    Tracking the performance of a life insurance or similar insurance policy using a standard statistical process control chart is complex because of many factors. In this work, we present the difficulties in doing so. However, with some modifications of the SPC charting framework, the difficulty becomes manageable for actuaries. We therefore propose monitoring a simpler but natural actuarial quantity that is typically found in recursion formulas of reserves, profit testing, as well as present values. We share some simulation results for the monitoring process and discuss some advantages of this approach.

  11. Statistical process control for AR(1) or non-Gaussian processes using wavelets coefficients

    NASA Astrophysics Data System (ADS)

    Cohen, A.; Tiplica, T.; Kobi, A.

    2015-11-01

    Autocorrelation and non-normality of process characteristic variables are two main difficulties that industrial engineers must face when implementing control charting techniques. This paper presents new results regarding the probability distribution of wavelet coefficients. Firstly, we highlight that wavelet coefficients can strongly reduce the autocorrelation of the original data and are approximately normally distributed, especially in the case of the Haar wavelet. We used an AR(1) model with positive autoregressive parameters to simulate autocorrelated data. Illustrative examples are presented to show the properties of the wavelet coefficients. Secondly, the distributional parameters of the wavelet coefficients are derived, showing that wavelet coefficients have interesting statistical properties for SPC purposes.
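
    The sketch below illustrates the decorrelation property highlighted above: level-1 Haar detail coefficients of a simulated AR(1) series show a much smaller lag-1 autocorrelation than the raw data. The autoregressive parameter and series length are arbitrary choices for the example.

      import numpy as np

      def lag1_autocorr(x):
          x = np.asarray(x, dtype=float)
          x = x - x.mean()
          return np.sum(x[1:] * x[:-1]) / np.sum(x * x)

      def haar_detail(x):
          """Level-1 Haar detail coefficients: scaled differences of non-overlapping pairs."""
          x = np.asarray(x, dtype=float)
          x = x[: len(x) // 2 * 2]                     # drop a trailing point if the length is odd
          return (x[0::2] - x[1::2]) / np.sqrt(2.0)

      # Simulate an AR(1) process with a positive autoregressive parameter.
      rng = np.random.default_rng(3)
      phi, n = 0.8, 2000
      y = np.empty(n)
      y[0] = rng.normal()
      for t in range(1, n):
          y[t] = phi * y[t - 1] + rng.normal()

      d = haar_detail(y)
      print(f"lag-1 autocorrelation of raw data:           {lag1_autocorr(y):+.3f}")
      print(f"lag-1 autocorrelation of Haar detail coeffs: {lag1_autocorr(d):+.3f}")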

  12. Statistical process control program at a ceramics vendor facility. Final report

    SciTech Connect

    Enke, G.M.

    1992-12-01

    Development of a statistical process control (SPC) program at a ceramics vendor location was deemed necessary to improve product quality, reduce manufacturing flowtime, and reduce quality costs borne by AlliedSignal Inc., Kansas City Division (KCD), and the vendor. Because of the lack of available KCD manpower and the required time schedule for the project, it was necessary for the SPC program to be implemented by an external contractor. Approximately a year after the program had been installed, the original baseline was reviewed so that the success of the project could be determined.

  13. Preliminary Retrospective Analysis of Daily Tomotherapy Output Constancy Checks Using Statistical Process Control

    PubMed Central

    Menghi, Enrico; Marcocci, Francesco; Bianchini, David

    2016-01-01

    The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, an out-of-control situation occurred in correspondence with the principal maintenance intervention on the treatment system. In particular, process capability indices showed a decreasing percentage of points in control, which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system. PMID:26848962
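
    As a generic illustration of the chart and capability calculations mentioned above (not the study's data or tolerances), the sketch below builds an individuals X-chart from the average moving range and computes Cp/Cpk against an assumed ±3% specification on daily output deviations.

      import numpy as np

      # Hypothetical daily output checks, expressed as % deviation from the reference value.
      rng = np.random.default_rng(4)
      output = rng.normal(0.3, 0.6, 250)

      # Individuals (X) chart: centre line and 3-sigma limits estimated from the average moving range.
      mr = np.abs(np.diff(output))
      sigma_hat = mr.mean() / 1.128                  # d2 constant for a moving range of two observations
      cl = output.mean()
      ucl, lcl = cl + 3 * sigma_hat, cl - 3 * sigma_hat

      # Capability against assumed specification limits of +/-3% (tolerance chosen for illustration).
      usl, lsl = 3.0, -3.0
      cp  = (usl - lsl) / (6 * sigma_hat)
      cpk = min(usl - cl, cl - lsl) / (3 * sigma_hat)

      n_out = int((np.abs(output - cl) > 3 * sigma_hat).sum())
      print(f"X-chart limits: ({lcl:.2f}%, {ucl:.2f}%), points out of control: {n_out}")
      print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")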

  14. Computationally efficient Bayesian inference for inverse problems.

    SciTech Connect

    Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.

    2007-10-01

    Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
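
    The toy sketch below conveys the surrogate idea in one dimension, under heavy simplification: an "expensive" forward model is replaced by a polynomial fit over the prior range, and a random-walk Metropolis sampler is run against the surrogate posterior. The forward model, prior, noise level, and sampler settings are all invented for illustration and are far simpler than the stochastic spectral formulation described above.

      import numpy as np

      rng = np.random.default_rng(5)

      def forward(theta):
          """Stand-in for an expensive forward model (e.g., a PDE solve)."""
          return theta + 0.3 * np.sin(2.0 * theta)

      # Synthetic data generated at a "true" parameter value with observational noise.
      theta_true, noise = 1.2, 0.05
      data = forward(theta_true) + rng.normal(0, noise)

      # Build a cheap polynomial surrogate of the forward model over the prior range [0, 3].
      train_theta = np.linspace(0.0, 3.0, 25)
      coeffs = np.polyfit(train_theta, forward(train_theta), deg=8)

      def surrogate(theta):
          return np.polyval(coeffs, theta)

      def log_post(theta):
          if not 0.0 <= theta <= 3.0:                # uniform prior on [0, 3]
              return -np.inf
          return -0.5 * ((data - surrogate(theta)) / noise) ** 2

      # Random-walk Metropolis on the surrogate posterior (no further forward-model calls needed).
      samples, theta, lp = [], 1.5, log_post(1.5)
      for _ in range(20000):
          prop = theta + rng.normal(0, 0.1)
          lp_prop = log_post(prop)
          if np.log(rng.random()) < lp_prop - lp:
              theta, lp = prop, lp_prop
          samples.append(theta)

      samples = np.array(samples[5000:])             # discard burn-in
      print(f"posterior mean {samples.mean():.3f}, sd {samples.std():.3f} (true value {theta_true})")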

  15. Space Shuttle RTOS Bayesian Network

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry; Beling, Peter A.

    2001-01-01

    With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores

  16. Disciplined decision making in an interdisciplinary environment: some implications for clinical applications of statistical process control.

    PubMed Central

    Hantula, D A

    1995-01-01

    This paper explores some of the implications the statistical process control (SPC) methodology described by Pfadt and Wheeler (1995) may have for analyzing more complex performances and contingencies in human services or health care environments at an organizational level. Service delivery usually occurs in an organizational system that is characterized by functional structures, high levels of professionalism, subunit optimization, and organizational suboptimization. By providing a standard set of criteria and decision rules, SPC may provide a common interface for data-based decision making, may bring decision making under the control of the contingencies that are established by these rules rather than the immediate contingencies of data fluctuation, and may attenuate escalation of failing treatments. SPC is culturally consistent with behavior analysis, sharing an emphasis on data-based decisions, measurement over time, and graphic analysis of data, as well as a systemic view of organizations. PMID:7592155

  17. Using a statistical process control chart during the quality assessment of cancer registry data.

    PubMed

    Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia

    2011-01-01

    Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during diagnosis years of 2001 and 2002, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data. PMID:22223059

  18. Implementation of Statistical Process Control: Evaluating the Mechanical Performance of a Candidate Silicone Elastomer Docking Seal

    NASA Technical Reports Server (NTRS)

    Oravec, Heather Ann; Daniels, Christopher C.

    2014-01-01

    The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but is also to perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chance of this and future space seals to satisfy or exceed design specifications.

  19. An evaluation of Bayesian techniques for controlling model complexity and selecting inputs in a neural network for short-term load forecasting.

    PubMed

    Hippert, Henrique S; Taylor, James W

    2010-04-01

    Artificial neural networks have frequently been proposed for electricity load forecasting because of their capabilities for the nonlinear modelling of large multivariate data sets. Modelling with neural networks is not an easy task though; two of the main challenges are defining the appropriate level of model complexity, and choosing the input variables. This paper evaluates techniques for automatic neural network modelling within a Bayesian framework, as applied to six samples containing daily load and weather data for four different countries. We analyse input selection as carried out by the Bayesian 'automatic relevance determination', and the usefulness of the Bayesian 'evidence' for the selection of the best structure (in terms of number of neurones), as compared to methods based on cross-validation.

  20. Characterization of a Saccharomyces cerevisiae fermentation process for production of a therapeutic recombinant protein using a multivariate Bayesian approach.

    PubMed

    Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan

    2016-05-01

    The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step to implement the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain under a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to the characterization of other production processes and to quantifying a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016.

  1. Bayesian Analysis on Meta-analysis of Case-control Studies Accounting for Within-study Correlation

    PubMed Central

    Chen, Yong; Chu, Haitao; Luo, Sheng; Nie, Lei; Chen, Sining

    2013-01-01

    In retrospective studies, odds ratio is often used as the measure of association. Under independent beta prior assumption, the exact posterior distribution of odds ratio given a single 2 × 2 table has been derived in the literature. However, independence between risks within the same study may be an oversimplified assumption because cases and controls in the same study are likely to share some common factors and thus to be correlated. Furthermore, in a meta-analysis of case-control studies, investigators usually have multiple 2×2 tables. In this paper, we first extend the published results on a single 2×2 table to allow within study prior correlation while retaining the advantage of closed form posterior formula, and then extend the results to multiple 2 × 2 tables and regression setting. The hyperparameters, including within study correlation, are estimated via an empirical Bayes approach. The overall odds ratio and the exact posterior distribution of the study-specific odds ratio are inferred based on the estimated hyperparameters. We conduct simulation studies to verify our exact posterior distribution formulas and investigate the finite sample properties of the inference for the overall odds ratio. The results are illustrated through a twin study for genetic heritability and a meta-analysis for the association between the N-acetyltransferase 2 (NAT2) acetylation status and colorectal cancer. PMID:22143403
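
    For orientation only, the sketch below shows the simpler independent-prior baseline that the paper generalizes: with independent Beta(1, 1) priors on the exposure probabilities of cases and controls in a single hypothetical 2 × 2 table, the posterior of the odds ratio is obtained by Monte Carlo rather than by the closed-form results derived in the paper.

      import numpy as np

      # Hypothetical single 2x2 table: exposed counts among cases and controls.
      cases_exposed, cases_total = 30, 100
      controls_exposed, controls_total = 15, 100

      # Independent Beta(1, 1) priors on the exposure probabilities of cases and controls.
      rng = np.random.default_rng(6)
      n_draws = 100_000
      p1 = rng.beta(1 + cases_exposed, 1 + cases_total - cases_exposed, n_draws)
      p0 = rng.beta(1 + controls_exposed, 1 + controls_total - controls_exposed, n_draws)

      # Posterior draws of the odds ratio and a 95% credible interval.
      odds_ratio = (p1 / (1 - p1)) / (p0 / (1 - p0))
      lo, med, hi = np.percentile(odds_ratio, [2.5, 50, 97.5])
      print(f"posterior median OR = {med:.2f}, 95% credible interval = ({lo:.2f}, {hi:.2f})")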

  2. Bayesian regularization of neural networks.

    PubMed

    Burden, Frank; Winkler, Dave

    2008-01-01

    Bayesian regularized artificial neural networks (BRANNs) are more robust than standard back-propagation nets and can reduce or eliminate the need for lengthy cross-validation. Bayesian regularization is a mathematical process that converts a nonlinear regression into a "well-posed" statistical problem in the manner of a ridge regression. The advantage of BRANNs is that the models are robust and the validation process, which scales as O(N²) in normal regression methods, such as back propagation, is unnecessary. These networks provide solutions to a number of problems that arise in QSAR modeling, such as choice of model, robustness of model, choice of validation set, size of validation effort, and optimization of network architecture. They are difficult to overtrain, since evidence procedures provide an objective Bayesian criterion for stopping training. They are also difficult to overfit, because the BRANN calculates and trains on a number of effective network parameters or weights, effectively turning off those that are not relevant. This effective number is usually considerably smaller than the number of weights in a standard fully connected back-propagation neural net. Automatic relevance determination (ARD) of the input variables can be used with BRANNs, and this allows the network to "estimate" the importance of each input. The ARD method ensures that irrelevant or highly correlated indices used in the modeling are neglected as well as showing which are the most important variables for modeling the activity data. This chapter outlines the equations that define the BRANN method plus a flowchart for producing a BRANN-QSAR model. Some results of the use of BRANNs on a number of data sets are illustrated and compared with other linear and nonlinear models.
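
    BRANN-style automatic relevance determination applies to neural network weights; as a lighter-weight illustration of the same relevance-determination idea, the sketch below uses scikit-learn's ARDRegression (a Bayesian linear model, assumed available) on synthetic descriptor data in which only three of ten inputs matter.

      import numpy as np
      from sklearn.linear_model import ARDRegression

      # Synthetic "QSAR-like" data: 10 descriptors, only the first 3 actually drive the activity.
      rng = np.random.default_rng(7)
      X = rng.normal(size=(200, 10))
      y = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.8 * X[:, 2] + rng.normal(0, 0.3, 200)

      # Automatic relevance determination: one precision hyperparameter per weight;
      # irrelevant descriptors get large precisions and their weights shrink towards zero.
      model = ARDRegression()
      model.fit(X, y)

      for i, (w, prec) in enumerate(zip(model.coef_, model.lambda_)):
          print(f"descriptor {i}: weight = {w:+.3f}, weight precision = {prec:.1f}")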

  3. A Case Study of Ion Implant In-Line Statistical Process Control

    NASA Astrophysics Data System (ADS)

    Zhao, Zhiyong; Ramczyk, Kenneth; Hall, Darcy; Wang, Linda

    2005-09-01

    Ion implantation is one of the most critical processes in the front-end-of-line for ULSI manufacturing. With more complexity in device layout, the fab cycle time can only be expected to be longer. To ensure yield and consistent device performance it is very beneficial to have a Statistical Process Control (SPC) practice that can detect tool issues to prevent excursions. Also, implanters may abort a process due to run-time issues, which requires human intervention to disposition the lot. Since device wafers have a fixed flow plan and can only be annealed at certain points in the manufacturing process, it is not practical to use a four-point probe to check such implants. The pattern recognition option on some metrology tools, such as ThermaWave (TWave), allows users to check an open area on device wafers for implant information. These two reasons prompted this work to look into the sensitivity of TWave to different implant processes and the possibility of setting up an SPC practice in a high-volume manufacturing fab. In this work, the authors compare the test wafer results with those of device wafers with variations in implant conditions such as dose, implant angle, and energy. The intention of this work is to correlate analytical measurements such as sheet resistance (Rs) and Secondary Ion Mass Spectrometry (SIMS) with device data such as electrical testing and sort yield. For a ±1.5% TWave control limit with the implant processes tested in this work, this translates to about 0.5° in implant angle control or a 2% to 8% dose change. The dose sensitivity is limited since the tested processes are deep-layer implants. Based on the statistical calculations, we assess that the experimental error bar is within 1% of the measured values.

  4. Person Fit Based on Statistical Process Control in an Adaptive Testing Environment. Research Report 98-13.

    ERIC Educational Resources Information Center

    van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.

    Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…

  5. Comparative efficacy and tolerability of duloxetine, pregabalin, and milnacipran for the treatment of fibromyalgia: a Bayesian network meta-analysis of randomized controlled trials.

    PubMed

    Lee, Young Ho; Song, Gwan Gyu

    2016-05-01

    The aim of this study was to assess the relative efficacy and tolerability of duloxetine, pregabalin, and milnacipran at the recommended doses in patients with fibromyalgia. Randomized controlled trials (RCTs) examining the efficacy and safety of duloxetine 60 mg, pregabalin 300 mg, pregabalin 150 mg, milnacipran 200 mg, and milnacipran 100 mg compared to placebo in patients with fibromyalgia were included in this Bayesian network meta-analysis. Nine RCTs including 5140 patients met the inclusion criteria. The proportion of patients with >30 % improvement from baseline in pain was significantly higher in the duloxetine 60 mg, pregabalin 300 mg, milnacipran 100 mg, and milnacipran 200 mg groups than in the placebo group [pairwise odds ratio (OR) 2.33, 95 % credible interval (CrI) 1.50-3.67; OR 1.68, 95 % CrI 1.25-2.28; OR 1.62, 95 % CrI 1.16-2.25; and OR 1.61; 95 % CrI 1.15-2.24, respectively]. Ranking probability based on the surface under the cumulative ranking curve (SUCRA) indicated that duloxetine 60 mg had the highest probability of being the best treatment for achieving the response level (SUCRA = 0.9431), followed by pregabalin 300 mg (SUCRA = 0.6300), milnacipran 100 mg (SUCRA = 0.5680), milnacipran 200 mg (SUCRA = 0.5617), pregabalin 150 mg (SUCRA = 0.2392), and placebo (SUCRA = 0.0580). The risk of withdrawal due to adverse events was lower in the placebo group than in the pregabalin 300 mg, duloxetine 60 mg, milnacipran 100 mg, and milnacipran 200 mg groups. However, there was no significant difference in the efficacy and tolerability between the medications at the recommended doses. Duloxetine 60 mg, pregabalin 300 mg, milnacipran 100 mg, and milnacipran 200 mg were more efficacious than placebo. However, there was no significant difference in the efficacy and tolerability between the medications at the recommended doses. PMID:27000046
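
    The sketch below shows how SUCRA values of the kind reported above can be computed from posterior draws of treatment effects; the four treatments and the posterior samples are fabricated for illustration and do not reproduce the meta-analysis results.

      import numpy as np

      # Hypothetical posterior samples of treatment effects (e.g., log-odds of response vs placebo)
      # for four treatments; larger is better.  These draws are illustrative only.
      rng = np.random.default_rng(8)
      names = ["duloxetine 60 mg", "pregabalin 300 mg", "milnacipran 100 mg", "placebo"]
      draws = np.column_stack([
          rng.normal(0.85, 0.15, 10_000),
          rng.normal(0.52, 0.15, 10_000),
          rng.normal(0.48, 0.15, 10_000),
          rng.normal(0.00, 0.10, 10_000),
      ])

      # Rank treatments within each posterior draw (rank 1 = best).
      ranks = (-draws).argsort(axis=1).argsort(axis=1) + 1
      n_treat = draws.shape[1]

      # SUCRA = (n - mean rank) / (n - 1): 1 for a treatment that is always best, 0 for always worst.
      sucra = (n_treat - ranks.mean(axis=0)) / (n_treat - 1)
      for name, s in sorted(zip(names, sucra), key=lambda t: -t[1]):
          print(f"{name:20s} SUCRA = {s:.3f}")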

  6. Editorial: Bayesian benefits for child psychology and psychiatry researchers.

    PubMed

    Oldehinkel, Albertine J

    2016-09-01

    For many scientists, performing statistical tests has become an almost automated routine. However, p-values are frequently used and interpreted incorrectly; and even when used appropriately, p-values tend to provide answers that do not match researchers' questions and hypotheses well. Bayesian statistics present an elegant and often more suitable alternative. The Bayesian approach has rarely been applied in child psychology and psychiatry research so far, but the development of user-friendly software packages and tutorials has placed it well within reach now. Because Bayesian analyses require a more refined definition of hypothesized probabilities of possible outcomes than the classical approach, going Bayesian may offer the additional benefit of sparking the development and refinement of theoretical models in our field. PMID:27535649

  8. Statistical characterization of negative control data in the Ames Salmonella/microsome test.

    PubMed Central

    Hamada, C; Wada, T; Sakamoto, Y

    1994-01-01

    A statistical characterization of negative control data in the Ames Salmonella/microsome reverse mutation test was performed using data obtained at Takeda Analytical Research Laboratories during January 1989 to April 1990. The lot-to-lot variability of bacterial stock cultures and day-to-day variability of experiments were small for Salmonella typhimurium strains TA1535 and TA1537 and Escherichia coli WP2uvrA, but they were larger for S. typhimurium TA100. The number of revertant colonies for all test strains studied here followed Poisson distributions within the same day. The two-fold rule, an empirical method for evaluating Ames Salmonella/microsome test results, has been widely used in Japan. This two-fold rule was evaluated statistically. The comparison-wise type I error rate was less than 0.05 for TA98, TA100, TA1535, TA1537, and WP2uvrA. Moreover, this rule is particularly conservative for TA100, for which the type I error rate was nearly 0. PMID:8187699
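
    The conservativeness of the two-fold rule noted above can be checked directly: if a treated plate follows the same Poisson distribution as the negative control, the comparison-wise type I error is P(X ≥ 2μ). The sketch below evaluates this for hypothetical control means (the strain names are kept, but the means are illustrative).

      import math
      from scipy import stats

      # Hypothetical mean numbers of spontaneous revertant colonies per plate for several strains.
      control_means = {"TA98": 25, "TA100": 110, "TA1535": 12, "TA1537": 8, "WP2uvrA": 20}

      # Two-fold rule: call a treated plate "positive" when its count is at least twice the
      # negative-control mean.  If the treated plate actually follows the same Poisson
      # distribution as the control, the false-positive (type I error) rate is P(X >= 2*mu).
      for strain, mu in control_means.items():
          threshold = math.ceil(2 * mu)
          type1 = stats.poisson.sf(threshold - 1, mu)      # P(X >= threshold)
          print(f"{strain:8s} mu = {mu:5.1f}  ->  P(count >= {threshold}) = {type1:.2e}")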

  9. IMPORTANCE OF MATERIAL BALANCES AND THEIR STATISTICAL EVALUATION IN RUSSIAN MATERIAL, PROTECTION, CONTROL AND ACCOUNTING

    SciTech Connect

    FISHBONE,L.G.

    1999-07-25

    While substantial work has been performed in the Russian MPC&A Program, much more needs to be done at Russian nuclear facilities to complete four necessary steps. These are (1) periodically measuring the physical inventory of nuclear material, (2) continuously measuring the flows of nuclear material, (3) using the results to close the material balance, particularly at bulk processing facilities, and (4) statistically evaluating any apparent loss of nuclear material. The periodic closing of material balances provides an objective test of the facility's system of nuclear material protection, control and accounting. The statistical evaluation using the uncertainties associated with individual measurement systems involved in the calculation of the material balance provides a fair standard for concluding whether the apparent loss of nuclear material means a diversion or whether the facility's accounting system needs improvement. In particular, if unattractive flow material at a facility is not measured well, the accounting system cannot readily detect the loss of attractive material if the latter substantially derives from the former.
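
    A minimal sketch of the kind of statistical evaluation described above: compute the material unaccounted for (MUF) from a single balance period, propagate assumed independent measurement uncertainties, and test whether the apparent loss is significant. All quantities are hypothetical.

      import math
      from scipy import stats

      # Hypothetical material balance for one period, in kilograms of nuclear material.
      beginning_inventory, receipts = 120.40, 35.20
      ending_inventory, shipments   = 119.95, 35.30

      muf = beginning_inventory + receipts - ending_inventory - shipments

      # Propagate the (assumed independent) 1-sigma measurement uncertainties of each term.
      u = {"beginning": 0.15, "receipts": 0.08, "ending": 0.15, "shipments": 0.08}
      sigma_muf = math.sqrt(sum(v ** 2 for v in u.values()))

      # One-sided test: is the apparent loss larger than measurement uncertainty can explain?
      z = muf / sigma_muf
      p_value = stats.norm.sf(z)
      print(f"MUF = {muf:.2f} kg, sigma_MUF = {sigma_muf:.2f} kg, z = {z:.2f}, p = {p_value:.3f}")
      print("conclusion:", "investigate possible loss" if z > stats.norm.ppf(0.95)
            else "consistent with measurement uncertainty")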

  10. Statistical and graphical methods for quality control determination of high-throughput screening data.

    PubMed

    Gunter, Bert; Brideau, Christine; Pikounis, Bill; Liaw, Andy

    2003-12-01

    High-throughput screening (HTS) is used in modern drug discovery to screen hundreds of thousands to millions of compounds on selected protein targets. It is an industrial-scale process relying on sophisticated automation and state-of-the-art detection technologies. Quality control (QC) is an integral part of the process and is used to ensure good quality data and minimize assay variability while maintaining assay sensitivity. The authors describe new QC methods and show numerous real examples from their biologist-friendly Stat Server HTS application, a custom-developed software tool built from the commercially available S-PLUS and Stat Server statistical analysis and server software. This system remotely processes HTS data using powerful and sophisticated statistical methodology but insulates users from the technical details by outputting results in a variety of readily interpretable graphs and tables. It allows users to visualize HTS data and examine assay performance during the HTS campaign to quickly react to or avoid quality problems.

  11. Statistics of plastic events in post-yield strain-controlled amorphous solids

    NASA Astrophysics Data System (ADS)

    Dubey, Awadhesh K.; Hentschel, H. George E.; Procaccia, Itamar; Singh, Murari

    2016-06-01

    Amorphous solids yield in strain-controlled protocols at a critical value of the strain. For larger strains the stress and energy display a generic complex serrated signal with elastic segments punctuated by sharp energy and stress plastic drops having a wide range of magnitudes. Here we provide a theory of the scaling properties of such serrated signals taking into account the system-size dependence. We show that the statistics are not homogeneous: they separate sharply to a regime of "small" and "large" drops, each endowed with its own scaling properties. A scaling theory is first derived solely by data analysis, showing a somewhat complex picture. But after considering the physical interpretation one discovers that the scaling behavior and the scaling exponents are in fact very simple and universal.

  12. Automatic detection of health changes using statistical process control techniques on measured transfer times of elderly.

    PubMed

    Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom

    2015-01-01

    It has been shown that gait speed and transfer times are good measures of functional ability in elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual reviewing by healthcare workers. This reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques: tabular CUSUM, standardized CUSUM and EWMA, known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method with a detection accuracy of 82% and an average detection time of 9.64 days. PMID:26737425
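
    The sketch below applies a tabular CUSUM and an EWMA chart, two of the techniques evaluated above, to simulated transfer times with a shift introduced on day 60; the in-control parameters, chart constants, and data are assumptions for this example.

      import numpy as np

      # Hypothetical daily sit-to-stand transfer times (seconds); performance degrades after day 60.
      rng = np.random.default_rng(9)
      times = np.concatenate([rng.normal(2.0, 0.2, 60), rng.normal(2.4, 0.2, 40)])

      mu0, sigma = 2.0, 0.2                      # in-control mean and standard deviation (assumed known)

      # Tabular CUSUM with reference value k = 0.5*sigma and decision interval h = 5*sigma.
      k, h = 0.5 * sigma, 5 * sigma
      c_plus, cusum_alarm = 0.0, None
      for day, x in enumerate(times):
          c_plus = max(0.0, c_plus + (x - mu0) - k)
          if cusum_alarm is None and c_plus > h:
              cusum_alarm = day

      # EWMA with smoothing constant lambda = 0.2 and control-limit width L = 3.
      lam, L = 0.2, 3.0
      z, ewma_alarm = mu0, None
      for day, x in enumerate(times):
          z = lam * x + (1 - lam) * z
          limit = L * sigma * np.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * (day + 1))))
          if ewma_alarm is None and abs(z - mu0) > limit:
              ewma_alarm = day

      print(f"CUSUM signals on day {cusum_alarm}, EWMA on day {ewma_alarm} (true shift at day 60)")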

  13. Prewarning of the China National Aquatics Center using Johnson transformation based statistical process control

    NASA Astrophysics Data System (ADS)

    Zhang, Deyi; Bao, Yuequan; Li, Hui; Ou, Jinping

    2009-07-01

    Structural health monitoring (SHM) is regarded as an effective technique for structural damage diagnosis, safety and integrity assessment, and service life evaluation. SHM techniques based on vibration modal parameters are ineffective for the health maintenance of space structures, while the statistical process control (SPC) technique is a simple and effective tool for monitoring the operational process of structures. Therefore, employing strain measurements from optical fiber Bragg grating (OFBG) sensors, a Johnson transformation based SPC approach is proposed in this paper to monitor the structural health state and detect unexpected excitations online. The large and complicated space structure, the China National Aquatics Center, is employed as an example to verify the proposed method both numerically and experimentally. It is found that the Johnson transformation can effectively improve the quality of SPC for the SHM process, and that the method can clearly and effectively monitor the structural health state and detect unexpected external loads on the structure.

  14. Evaluation of Dimensional Measurement Systems Applied to Statistical Control of the Manufacturing Process

    NASA Astrophysics Data System (ADS)

    Villeta, M.; Sanz-Lobera, A.; González, C.; Sebastián, M. A.

    2009-11-01

    The implementation of Statistical Process Control, SPC for short, requires the use of measurement systems. The inherent variability of these systems influences the reliability of the measurement results obtained and, as a consequence, the SPC results. This paper investigates the influence of measurement uncertainty on the analysis of process capability. It seeks to reduce the effect of measurement uncertainty in order to approach the capability that the productive process really has. Both processes centered at a nominal value and off-center processes are considered, and a criterion is proposed to validate the adequacy of the dimensional measurement systems used in an SPC implementation.
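
    As a simple illustration of how measurement uncertainty masks true process capability, the sketch below uses the relation sigma_observed^2 = sigma_process^2 + sigma_measurement^2 to back out a capability index corrected for gauge variance; the tolerance and standard deviations are hypothetical and the paper's actual validation criterion is not reproduced here.

      import math

      # Hypothetical values: tolerance band and standard deviations (same units, e.g. mm).
      usl, lsl = 10.30, 10.10                 # specification limits
      sigma_observed = 0.020                  # standard deviation seen on the control chart
      sigma_measurement = 0.012               # 1-sigma uncertainty of the dimensional measurement system

      # Observed variance is the sum of true process variance and measurement variance.
      sigma_process = math.sqrt(max(sigma_observed ** 2 - sigma_measurement ** 2, 0.0))

      cp_observed = (usl - lsl) / (6 * sigma_observed)
      cp_process  = (usl - lsl) / (6 * sigma_process) if sigma_process > 0 else float("inf")

      print(f"Cp from observed data:            {cp_observed:.2f}")
      print(f"Cp after removing gauge variance: {cp_process:.2f}")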

  15. Simulation on a car interior aerodynamic noise control based on statistical energy analysis

    NASA Astrophysics Data System (ADS)

    Chen, Xin; Wang, Dengfeng; Ma, Zhengdong

    2012-09-01

    How to simulate interior aerodynamic noise accurately is an important question in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces is shown to be the key factor for controlling car interior aerodynamic noise at high frequencies and high speeds. In this paper, a detailed statistical energy analysis (SEA) model is built, and the vibro-acoustic power inputs are loaded onto the model to obtain valid results for car interior noise analysis. The model is a solid foundation for further optimization of car interior noise control. After the subsystems whose power contributions are most sensitive for car interior noise are identified by comprehensive SEA analysis, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing results show that the interior acoustic performance can be improved by using the detailed SEA model, which comprises more than 80 subsystems, together with the calculation of unsteady aerodynamic pressure on body surfaces and improvements to the sound and damping properties of materials. A reduction of more than 2 dB is achieved at center frequencies above 800 Hz in the spectrum. The proposed optimization method, which integrates the detailed SEA model with unsteady computational fluid dynamics (CFD) and acoustic contribution sensitivity analysis, can serve as a reference for car interior aerodynamic noise control.
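
    The sketch below reduces the SEA power-balance idea to two coupled subsystems (a panel driven by unsteady aerodynamic pressure and the interior cavity) at a single frequency band; all loss factors, input powers, and cavity properties are made-up values, in contrast to the detailed 80-plus-subsystem model described above.

      import numpy as np

      # Minimal two-subsystem SEA power balance at one frequency band:
      # subsystem 1 = roof panel (structure), subsystem 2 = interior cavity (acoustic volume).
      omega = 2 * np.pi * 1000.0          # band centre frequency, rad/s
      eta1, eta2 = 0.01, 0.02             # damping loss factors (made-up values)
      eta12, eta21 = 0.002, 0.0005        # coupling loss factors (made-up values)
      P_in = np.array([0.1, 0.0])         # external input power (W): turbulent pressure drives the panel

      # Power balance: P_i = omega * [(eta_i + eta_ij) * E_i - eta_ji * E_j]
      A = omega * np.array([
          [eta1 + eta12, -eta21],
          [-eta12,        eta2 + eta21],
      ])
      E = np.linalg.solve(A, P_in)        # subsystem energies (J)

      # Convert cavity energy to a sound pressure level (air at room conditions).
      rho, c, V, p_ref = 1.21, 343.0, 3.0, 2e-5
      p2_mean_square = E[1] * rho * c ** 2 / V
      spl = 10 * np.log10(p2_mean_square / p_ref ** 2)
      print(f"panel energy = {E[0]:.2e} J, cavity energy = {E[1]:.2e} J, cavity SPL = {spl:.1f} dB")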

  16. Statistical process control in the hybrid microelectronics manufacturing industry: A Navy view point

    NASA Astrophysics Data System (ADS)

    Azu, Charles C., Jr.

    1993-04-01

    The U.S. Navy is concerned with receiving high quality hybrid microelectronic circuits. The Navy recognizes that in order to obtain high quality circuits a manufacturer must have an effective statistical process control (SPC) program implemented. The implementation of effective SPC programs is an objective of the military hybrid microelectronics industry. Often the smaller manufacturers in the industry have little SPC implementation, while the larger manufacturers have practices originally developed for the control of other product lines outside the hybrid technology area. The industry recognizes that SPC programs will result in the high quality hybrid microcircuits which the U.S. Navy requires. In the past, the goal of the industry had not been to put in place effective process control methods, but merely to meet the government military standards on the quality of the hybrids produced. This practice, at best, resulted in a 'hit or miss' situation when it came to hybrid microcircuit assemblies meeting military standards. The U.S. Navy, through its MicroCIM program, has been challenging and working with the industry in the area of SPC practice methods. The major limitation so far has been a lack of available sensors for the real-time collection of effective SPC data on the factory floor. This paper discusses the Navy's efforts in bringing about effective SPC programs in the military hybrid manufacturing industry.

  17. Particle identification in ALICE: a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Adam, J.; Adamová, D.; Aggarwal, M. M.; Aglieri Rinella, G.; Agnello, M.; Agrawal, N.; Ahammed, Z.; et al. (ALICE Collaboration)

    2016-05-01

    We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time of flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high-purity samples of identified particles in the decay channels K0S → π-π+, φ → K-K+, and Λ → p π- in p-Pb collisions at √(s_NN) = 5.02 TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected pT spectra of pions, kaons, protons, and D0 mesons in pp collisions at √(s) = 7 TeV. In all cases, the results using Bayesian PID were found to be consistent with previous measurements performed by ALICE using a standard PID approach. For the measurement of D0 → K-π+, it was found that a Bayesian PID approach gave a higher signal-to-background ratio and a similar or larger statistical significance when compared with standard PID selections, despite a reduced identification efficiency. Finally, we present an exploratory study of the measurement of Λc+ → p K-π+ in pp collisions at √(s) = 7 TeV, using the Bayesian approach for the identification of its decay products.
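
    The record above combines detector signals through Bayes' theorem. As a minimal illustration of that idea only (not the ALICE implementation), the sketch below folds hypothetical Gaussian dE/dx and time-of-flight responses together with assumed species abundances to obtain posterior probabilities for the pion, kaon, and proton hypotheses; every numerical value is invented.

```python
import numpy as np

# Hypothetical expected detector signals for each species (arbitrary units).
# These numbers are illustrative only, not ALICE calibration values.
species = ["pion", "kaon", "proton"]
expected_dedx = {"pion": 1.0, "kaon": 1.6, "proton": 2.4}    # expected dE/dx
expected_tof = {"pion": 12.0, "kaon": 13.5, "proton": 15.0}  # expected flight time (ns)
sigma_dedx, sigma_tof = 0.15, 0.4                            # assumed resolutions
priors = {"pion": 0.7, "kaon": 0.2, "proton": 0.1}           # assumed abundances

def gaussian(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

def bayesian_pid(measured_dedx, measured_tof):
    """Posterior P(species | dE/dx, TOF) assuming independent Gaussian responses."""
    unnorm = {}
    for s in species:
        likelihood = (gaussian(measured_dedx, expected_dedx[s], sigma_dedx)
                      * gaussian(measured_tof, expected_tof[s], sigma_tof))
        unnorm[s] = priors[s] * likelihood
    total = sum(unnorm.values())
    return {s: v / total for s, v in unnorm.items()}

print(bayesian_pid(measured_dedx=1.55, measured_tof=13.3))  # favours the kaon hypothesis
```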

  18. Statistical process control analysis for patient-specific IMRT and VMAT QA.

    PubMed

    Sanghangthum, Taweap; Suriyapee, Sivalee; Srisatit, Somyot; Pawlicki, Todd

    2013-05-01

    This work applied statistical process control to establish the control limits of the % gamma pass of patient-specific intensity modulated radiotherapy (IMRT) and volumetric modulated arc therapy (VMAT) quality assurance (QA), and to evaluate the efficiency of the QA process by using the process capability index (Cpml). A total of 278 IMRT QA plans in nasopharyngeal carcinoma were measured with MapCHECK, while 159 VMAT QA plans were undertaken with ArcCHECK. A 6 MV beam with nine fields was used for the IMRT plans, and 2.5 arcs were used to generate the VMAT plans. The gamma (3%/3 mm) criterion was used to evaluate the QA plans. The % gamma passes were plotted on a control chart. The first 50 data points were employed to calculate the control limits. The Cpml was calculated to evaluate the capability of the IMRT/VMAT QA process. The results showed higher systematic errors in IMRT QA than in VMAT QA due to the more complicated setup used in IMRT QA. The variation of random errors was also larger in IMRT QA than in VMAT QA because the VMAT plan has greater continuity of dose distribution. The average % gamma pass was 93.7% ± 3.7% for IMRT and 96.7% ± 2.2% for VMAT. The Cpml value was 1.60 for IMRT QA and 1.99 for VMAT QA, which implied that the VMAT QA process was more accurate than the IMRT QA process. Our lower control limit for the % gamma pass of IMRT is 85.0%, while the limit for VMAT is 90%. Both the IMRT and VMAT QA processes are of good quality because their Cpml values are higher than 1.0.
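
    A rough sketch of the two calculations described above, under stated assumptions: individuals-chart control limits estimated from the first 50 points (sigma from the average moving range), and a one-sided, Cpm-style capability index against a lower specification limit with a target of 100%. The data are simulated, and the Cpml formula here may differ in detail from the paper's definition.

```python
import numpy as np

rng = np.random.default_rng(0)
gamma_pass = np.clip(rng.normal(96.5, 2.0, size=278), 0, 100)  # simulated % gamma pass results

# Individuals (X) chart limits from the first 50 points, using the average
# moving range to estimate short-term sigma (sigma_hat = MRbar / 1.128).
baseline = gamma_pass[:50]
mr = np.abs(np.diff(baseline))
sigma_hat = mr.mean() / 1.128
center = baseline.mean()
lcl, ucl = center - 3 * sigma_hat, center + 3 * sigma_hat

# One-sided capability index against a lower specification limit (LSL),
# penalising deviation from a target T (Cpm-style definition, assumed here).
LSL, T = 90.0, 100.0
mu, sigma = gamma_pass.mean(), gamma_pass.std(ddof=1)
cpml = (mu - LSL) / (3 * np.sqrt(sigma**2 + (mu - T) ** 2))

print(f"center={center:.1f}%, LCL={lcl:.1f}%, UCL={ucl:.1f}%, Cpml={cpml:.2f}")
out_of_control = np.where((gamma_pass < lcl) | (gamma_pass > ucl))[0]
print("points outside the control limits:", out_of_control)
```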

  19. Using balance statistics to determine the optimal number of controls in matching studies.

    PubMed

    Linden, Ariel; Samuels, Steven J

    2013-10-01

    When a randomized controlled trial is not feasible, investigators typically turn to matching techniques as an alternative approach to evaluate the effectiveness of health care interventions. Matching studies are designed to minimize imbalances on measured pre-intervention characteristics, thereby reducing bias in estimates of treatment effects. Generally, a matching ratio up to 4:1 (control to treatment) elicits the lowest bias. However, when matching techniques are used in prospective studies, investigators try to maximize the number of controls matched to each treated individual to increase the likelihood that a sufficient sample size will remain after attrition. In this paper, we describe a systematic approach to managing the trade-off between minimizing bias and maximizing matched sample size. Our approach includes the following three steps: (1) run the desired matching algorithm, starting with 1:1 (one control to one treated individual) matching and iterating until the maximum desired number of potential controls per treated subject is reached; (2) for each iteration, test for covariate balance; and (3) generate numeric summaries and graphical plots of the balance statistics across all iterations in order to determine the optimal solution. We demonstrate the implementation of this approach with data from a medical home pilot programme and with a simulation study of populations of 100,000 in which 1000 individuals receive the intervention. We advocate undertaking this methodical approach in matching studies to ensure that the optimal matching solution is identified. Doing so will raise the overall quality of the literature and increase the likelihood of identifying effective interventions.
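
    The three-step procedure lends itself to a short sketch. The code below is a simplified stand-in, not the authors' software: it uses greedy nearest-neighbour matching on a single simulated score and summarizes balance with the absolute standardized mean difference for each matching ratio from 1:1 to 5:1.

```python
import numpy as np

rng = np.random.default_rng(1)

def std_diff(x_t, x_c):
    """Absolute standardized mean difference between treated and control values."""
    pooled_sd = np.sqrt((x_t.var(ddof=1) + x_c.var(ddof=1)) / 2)
    return abs(x_t.mean() - x_c.mean()) / pooled_sd

# Simulated covariate/score: treated units have slightly higher values on average.
n_treat, n_ctrl = 200, 5000
x_treat = rng.normal(0.3, 1.0, n_treat)
x_ctrl = rng.normal(0.0, 1.0, n_ctrl)

balance_by_ratio = {}
for k in range(1, 6):  # 1:1 up to 5:1 matching
    used = set()
    matched = []
    for xt in x_treat:
        # Greedy nearest neighbour: take the k closest controls not yet used.
        order = np.argsort(np.abs(x_ctrl - xt))
        picks = [j for j in order if j not in used][:k]
        used.update(picks)
        matched.extend(x_ctrl[picks])
    balance_by_ratio[k] = std_diff(x_treat, np.array(matched))

for k, d in balance_by_ratio.items():
    print(f"{k}:1 matching -> |standardized mean difference| = {d:.3f}")
```

    Plotting the balance statistic against the matching ratio corresponds to step (3): the largest ratio whose balance stays below a chosen threshold (0.1 is a common rule of thumb) would be taken as the optimal solution.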

  20. Statistical quality control for volumetric modulated arc therapy (VMAT) delivery by using the machine's log data

    NASA Astrophysics Data System (ADS)

    Cheong, Kwang-Ho; Lee, Me-Yeon; Kang, Sei-Kwon; Yoon, Jai-Woong; Park, Soah; Hwang, Taejin; Kim, Haeyoung; Kim, Kyoung Ju; Han, Tae Jin; Bae, Hoonsik

    2015-07-01

    The aim of this study is to set up statistical quality control for monitoring volumetric modulated arc therapy (VMAT) delivery errors by using the machine's log data. Eclipse and a Clinac iX linac with the RapidArc system (Varian Medical Systems, Palo Alto, USA) were used for delivery of the VMAT plans. During the delivery of the RapidArc fields, the machine records the accuracy of the delivered monitor units (MUs) and of the gantry angle positions; the standard deviations of the MU (σMU: dosimetric error) and of the gantry angle (σGA: geometric error) are displayed on the console monitor after completion of the RapidArc delivery. In the present study, the log data were first analyzed to confirm their validity and usability; then, statistical process control (SPC) was applied to monitor the σMU and the σGA in a timely manner for all RapidArc fields: a total of 195 arc fields for 99 patients. The MU and the GA were recorded twice for all fields, that is, first during the patient-specific plan QA and then again during the first treatment. The σMU and σGA time series were quite stable irrespective of the treatment site; however, the σGA strongly depended on the gantry's rotation speed. The σGA of the RapidArc delivery for stereotactic body radiation therapy (SBRT) was smaller than that for typical VMAT. Therefore, SPC was applied to SBRT cases and general cases separately. Moreover, the accuracy of the potentiometer of the gantry rotation is important because the σGA can change dramatically depending on its condition. By applying SPC to the σMU and σGA, we could monitor the delivery error efficiently. However, the upper and lower limits of SPC need to be determined carefully with full knowledge of the machine and the log data.
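
    As a hedged sketch of the monitoring step, the code below computes individuals-chart control limits for σGA separately for SBRT and conventional VMAT fields, mirroring the separate treatment described above; the log-file values are simulated and the group means and spreads are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

# Simulated per-field log-file standard deviations of gantry angle (degrees).
sigma_ga = {
    "SBRT": rng.normal(0.25, 0.04, 60),          # slower gantry -> smaller geometric error
    "conventional": rng.normal(0.45, 0.08, 135), # typical VMAT fields
}

def individuals_limits(x):
    """Center line and 3-sigma limits of an individuals chart (sigma from moving range)."""
    mr_bar = np.abs(np.diff(x)).mean()
    sigma_hat = mr_bar / 1.128
    center = x.mean()
    return center, center - 3 * sigma_hat, center + 3 * sigma_hat

for group, values in sigma_ga.items():
    center, lcl, ucl = individuals_limits(values)
    print(f"{group:>12}: CL={center:.3f} deg, LCL={max(lcl, 0):.3f}, UCL={ucl:.3f}")
```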

  1. Quantification Of Margins And Uncertainties: A Bayesian Approach (full Paper)

    SciTech Connect

    Wallstrom, Timothy C

    2008-01-01

    Quantification of Margins and Uncertainties (QMU) is 'a formalism for dealing with the reliability of complex technical systems, and the confidence which can be placed in estimates of that reliability' (Eardley et al., 2005). In this paper, we show how QMU may be interpreted in the framework of Bayesian statistical inference, using a probabilistic network. The Bayesian approach clarifies the probabilistic underpinnings of the formalism and shows how the formalism can be used for decision-making.

  2. A Bayesian Belief Network Approach to Explore Alternative Decisions for Sediment Control and water Storage Capacity at Lago Lucchetti, Puerto Rico

    EPA Science Inventory

    A Bayesian belief network (BBN) was developed to characterize the effects of sediment accumulation on the water storage capacity of Lago Lucchetti (located in southwest Puerto Rico) and to forecast the life expectancy (usefulness) of the reservoir under different management scena...

  3. Bayesian least squares deconvolution

    NASA Astrophysics Data System (ADS)

    Asensio Ramos, A.; Petit, P.

    2015-11-01

    Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
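
    A toy version of the underlying linear model may help fix ideas. Assuming the usual LSD setup, the observed spectrum is a line-pattern matrix times a common profile plus noise, and a Gaussian process prior on the profile gives a Gaussian posterior in closed form. The sketch below is not the authors' accelerated algorithm; the line positions, weights, noise level, and GP hyperparameters are all invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy setup: n_pix spectrum pixels, n_v velocity bins of the common LSD profile.
n_pix, n_v = 400, 21
velocity = np.linspace(-10, 10, n_v)

# Line-pattern (mask) matrix M: each spectral line contributes a shifted, weighted
# copy of the LSD profile. Here we scatter 40 lines with random weights.
M = np.zeros((n_pix, n_v))
for _ in range(40):
    start = rng.integers(0, n_pix - n_v)
    M[start:start + n_v, :] += rng.uniform(0.2, 1.0) * np.eye(n_v)

true_profile = 0.05 * np.exp(-0.5 * (velocity / 3.0) ** 2)  # hidden common signal
noise_sigma = 0.02
y = M @ true_profile + rng.normal(0, noise_sigma, n_pix)

# Squared-exponential GP prior on the LSD profile (length scale and amplitude assumed).
ell, amp = 3.0, 0.05
K = amp**2 * np.exp(-0.5 * (velocity[:, None] - velocity[None, :]) ** 2 / ell**2)
K += 1e-10 * np.eye(n_v)  # jitter for numerical stability

# Gaussian posterior: cov = (M^T M / sigma^2 + K^-1)^-1, mean = cov M^T y / sigma^2.
A = M.T @ M / noise_sigma**2 + np.linalg.inv(K)
post_cov = np.linalg.inv(A)
post_mean = post_cov @ (M.T @ y) / noise_sigma**2
post_std = np.sqrt(np.diag(post_cov))

print("recovered peak amplitude:", post_mean.max(), "+/-", post_std[np.argmax(post_mean)])
```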

  4. Childhood autism in India: A case-control study using tract-based spatial statistics analysis

    PubMed Central

    Assis, Zarina Abdul; Bagepally, Bhavani Shankara; Saini, Jitender; Srinath, Shoba; Bharath, Rose Dawn; Naidu, Purushotham R.; Gupta, Arun Kumar

    2015-01-01

    Context: Autism is a serious behavioral disorder among young children that now occurs at epidemic rates in developing countries like India. We have used tract-based spatial statistics (TBSS) of diffusion tensor imaging (DTI) measures to investigate the microstructure of the primary neurocircuitry involved in autism spectrum disorders, compared to typically developing children. Objective: To evaluate the various white matter tracts in Indian autistic children compared to controls using TBSS. Materials and Methods: A prospective, case-control, voxel-based, whole-brain DTI analysis using TBSS was performed. The study included 19 autistic children (mean age 8.7 years ± 3.84, 16 males and 3 females) and 34 controls (mean age 12.38 ± 3.76, all males). Fractional anisotropy (FA), mean diffusivity (MD), radial diffusivity (RD), and axial diffusivity (AD) values were used as outcome variables. Results: Compared to the control group, TBSS demonstrated multiple areas of markedly reduced FA involving multiple long white matter tracts, the entire corpus callosum, bilateral posterior thalami, and bilateral optic tracts (OTs). Notably, there were no voxels where FA was significantly increased in the autism group. Increased RD was also noted in these regions, suggesting an underlying myelination defect. The MD was elevated in many of the projection and association fibers, and notably in the OTs. There were no significant changes in the AD in these regions, indicating no significant axonal injury. There was no significant correlation between the FA values and the Childhood Autism Rating Scale. Conclusion: This is a first-of-its-kind study evaluating DTI findings in autistic children in India. In our study, DTI has shown a significant fault in the underlying intricate brain wiring system in autism. OT abnormality is a novel finding and needs further research. PMID:26600581

  5. Immunonutrition Support for Patients Undergoing Surgery for Gastrointestinal Malignancy: Preoperative, Postoperative, or Perioperative? A Bayesian Network Meta-Analysis of Randomized Controlled Trials.

    PubMed

    Song, Guo-Min; Tian, Xu; Zhang, Lei; Ou, Yang-Xiang; Yi, Li-Juan; Shuai, Ting; Zhou, Jian-Guo; Zeng, Zi; Yang, Hong-Ling

    2015-07-01

    Enteral immunonutrition (EIN) has been established as an important modality for preventing postoperative infectious and noninfectious complications, enhancing host immunity, and eventually improving the prognosis of gastrointestinal (GI) cancer patients undergoing surgery. However, which of the different support routes is the optimum option remains unclear. To evaluate the effects of different EIN support regimes for patients who underwent selective surgery for resectable GI malignancy, a Bayesian network meta-analysis (NMA) of randomized controlled trials (RCTs) was conducted. PubMed, EMBASE, and the Cochrane Central Register of Controlled Trials (CENTRAL) were searched electronically through the end of December 2014. Moreover, we manually checked the reference lists of eligible trials and reviews and retrieved unpublished literature. RCTs which investigated the comparative effects of EIN versus standard enteral nutrition (EN), or of different EIN regimes, were included if clinical outcome information could be extracted from them. A total of 27 RCTs were incorporated into this study. Pair-wise meta-analyses suggested that preoperative (relative risk [RR], 0.58; 95% confidence interval [CI], 0.43-0.78), postoperative (RR, 0.63; 95% CI, 0.52-0.76), and perioperative EIN methods (RR, 0.46; 95% CI, 0.34-0.62) reduced the incidence of postoperative infectious complications compared with standard EN. Moreover, perioperative EIN (RR, 0.65; 95% CI, 0.44-0.95) reduced the incidence of postoperative noninfectious complications, and postoperative (mean difference [MD], -2.38; 95% CI, -3.4 to -1.31) and perioperative EIN (MD, -2.64; 95% CI, -3.28 to -1.99) also shortened the length of postoperative hospitalization compared with standard EN. The NMA found that EIN support effectively improved the clinical outcomes of patients who underwent selective surgery for GI cancer compared with standard EN. Our results suggest EIN support is a promising alternative for

  6. Bayesian Exploratory Factor Analysis

    PubMed Central

    Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi

    2014-01-01

    This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517

  7. Bayesian methods, maximum entropy, and quantum Monte Carlo

    SciTech Connect

    Gubernatis, J.E.; Silver, R.N. ); Jarrell, M. )

    1991-01-01

    We heuristically discuss the application of the method of maximum entropy to the extraction of dynamical information from imaginary-time, quantum Monte Carlo data. The discussion emphasizes the utility of a Bayesian approach to statistical inference and the importance of statistically well-characterized data. 14 refs.

  8. Bayesian Networks for Social Modeling

    SciTech Connect

    Whitney, Paul D.; White, Amanda M.; Walsh, Stephen J.; Dalton, Angela C.; Brothers, Alan J.

    2011-03-28

    This paper describes a body of work developed over the past five years. The work addresses the use of Bayesian network (BN) models for representing and predicting social/organizational behaviors. The topics covered include model construction, validation, and use. These topics span the bulk of the lifetime of such a model, beginning with construction, moving to validation and other aspects of model ‘critiquing’, and finally demonstrating how the modeling approach might be used to inform policy analysis. To conclude, we discuss limitations of using BNs for this activity and suggest remedies to address those limitations. The primary benefits of using a well-developed computational, mathematical, and statistical modeling structure, such as a BN, are (1) there are significant computational, theoretical, and capability bases on which to build, and (2) the ability to empirically critique the model, and potentially to evaluate competing models for a social/behavioral phenomenon.

  9. Controlling Time-Dependent Confounding by Health Status and Frailty: Restriction Versus Statistical Adjustment

    PubMed Central

    McGrath, Leah J.; Ellis, Alan R.; Brookhart, M. Alan

    2015-01-01

    Nonexperimental studies of preventive interventions are often biased because of the healthy-user effect and, in frail populations, because of confounding by functional status. Bias is evident when estimating influenza vaccine effectiveness, even after adjustment for claims-based indicators of illness. We explored bias reduction methods while estimating vaccine effectiveness in a cohort of adult hemodialysis patients. Using the United States Renal Data System and linked data from a commercial dialysis provider, we estimated vaccine effectiveness using a Cox proportional hazards marginal structural model of all-cause mortality before and during 3 influenza seasons in 2005/2006 through 2007/2008. To improve confounding control, we added frailty indicators to the model, measured time-varying confounders at different time intervals, and restricted the sample in multiple ways. Crude and baseline-adjusted marginal structural models remained strongly biased. Restricting to a healthier population removed some unmeasured confounding; however, this reduced the sample size, resulting in wide confidence intervals. We estimated an influenza vaccine effectiveness of 9% (hazard ratio = 0.91, 95% confidence interval: 0.72, 1.15) when bias was minimized through cohort restriction. In this study, the healthy-user bias could not be controlled through statistical adjustment; however, sample restriction reduced much of the bias. PMID:25868551

  10. Statistical Methods for Quality Control of Steel Coils Manufacturing Process using Generalized Linear Models

    NASA Astrophysics Data System (ADS)

    García-Díaz, J. Carlos

    2009-11-01

    Fault detection and diagnosis is an important problem in process engineering, since process equipment is subject to malfunctions during operation. Galvanized steel is a value-added product, furnishing effective performance by combining the corrosion resistance of zinc with the strength and formability of steel. Fault detection and diagnosis is an important problem in continuous hot dip galvanizing, and the increasingly stringent quality requirements of the automotive industry have also demanded ongoing efforts in process control to make the process more robust. When faults occur, they change the relationship among the observed variables. This work compares different statistical regression models proposed in the literature for estimating the quality of galvanized steel coils on the basis of short time histories. Data for 26 batches were available. Five variables were selected for monitoring the process: the steel strip velocity, four bath temperatures, and the bath level. The entire data set, consisting of 48 galvanized steel coils, was divided into two sets: a training set of 25 conforming coils and a second set of 23 nonconforming coils. Logistic regression is a modeling tool in which the dependent variable is categorical; in most applications, the dependent variable is binary. The results show that logistic generalized linear models do provide good estimates of coil quality and can be useful for quality control in the manufacturing process.
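
    A minimal sketch of the kind of model the record describes, assuming the monitored variables named above and a binary conforming/nonconforming label; the data are simulated, so the fitted coefficients and scores mean nothing beyond illustrating the workflow.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(4)

# Simulated process data: strip velocity, four bath temperatures, bath level.
n = 200
X = np.column_stack([
    rng.normal(100, 10, n),   # strip velocity (m/min)
    rng.normal(460, 5, n),    # bath temperature 1 (deg C)
    rng.normal(455, 5, n),    # bath temperature 2
    rng.normal(450, 5, n),    # bath temperature 3
    rng.normal(445, 5, n),    # bath temperature 4
    rng.normal(2.0, 0.2, n),  # bath level (m)
])
# Nonconforming coils (y = 1) made more likely by high velocity and low temperature.
logit = 0.08 * (X[:, 0] - 100) - 0.15 * (X[:, 1] - 460)
y = (rng.uniform(size=n) < 1 / (1 + np.exp(-logit))).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)
model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```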

  11. A new method to obtain uniform distribution of ground control points based on regional statistical information

    NASA Astrophysics Data System (ADS)

    Ma, Chao; An, Wei; Deng, Xinpu

    2015-10-01

    Ground Control Points (GCPs) are an important source of fundamental data in the geometric correction of remote sensing imagery. The quantity, accuracy, and distribution of GCPs are three factors which may affect the accuracy of geometric correction. It is generally required that the distribution of GCPs be uniform, so that they can fully control the accuracy over the mapped region. In this paper, we establish an objective standard for evaluating the uniformity of the GCPs' distribution based on regional statistical information (RSI), and obtain an optimal distribution of GCPs. This sampling method is called RSIS for short in this work. The image is equally partitioned into regions in several different manners, and the number of GCPs in each region is counted; these counts form a vector called the RSI vector in this work. The uniformity of the GCPs' distribution can then be evaluated by a scalar quantity derived from the RSI vector, and an optimal distribution of GCPs is obtained by searching for the configuration that minimizes this quantity. In this paper, simulated annealing is employed to search for the optimal distribution of GCPs, i.e., the one with the minimum value of this quantity. Experiments are carried out to test the method proposed in this paper, with simple random sampling and universal kriging model-based sampling as the compared sampling designs. The experiments indicate that this method is highly recommended as a new GCP sampling design method for the geometric correction of remotely sensed imagery.
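
    The RSI construction is only loosely specified in the abstract, so the sketch below is one hedged interpretation: the image is partitioned into 2x2, 3x3, and 4x4 grids, per-cell GCP counts are concatenated into an RSI vector, its standard deviation serves as a lower-is-better uniformity score, and simulated annealing searches for a subset of candidate GCPs that minimizes it. All sizes, counts, and the cooling schedule are assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def rsi_vector(points, width, height, partitions=(2, 3, 4)):
    """Counts of points in each cell, for several equal partitions of the image."""
    counts = []
    for p in partitions:
        hist, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                    bins=[p, p], range=[[0, width], [0, height]])
        counts.append(hist.ravel())
    return np.concatenate(counts)

def uniformity_score(points, width, height):
    """Lower is more uniform: spread of the per-cell counts across all partitions."""
    return rsi_vector(points, width, height).std()

# Candidate GCPs scattered over a 1000 x 1000 image; we want to select 20 of them.
candidates = rng.uniform(0, 1000, size=(200, 2))
n_select, width, height = 20, 1000, 1000

current = rng.choice(len(candidates), n_select, replace=False)
current_score = uniformity_score(candidates[current], width, height)
temperature = 1.0
for step in range(5000):
    proposal = current.copy()
    # Swap one selected point for one unselected point.
    out_idx = rng.integers(n_select)
    unused = np.setdiff1d(np.arange(len(candidates)), proposal)
    proposal[out_idx] = rng.choice(unused)
    score = uniformity_score(candidates[proposal], width, height)
    if score < current_score or rng.uniform() < np.exp((current_score - score) / temperature):
        current, current_score = proposal, score
    temperature *= 0.999  # geometric cooling schedule

print("final uniformity score:", round(current_score, 3))
```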

  12. Exploring the use of statistical process control methods to assess course changes

    NASA Astrophysics Data System (ADS)

    Vollstedt, Ann-Marie

    This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to evaluate. While traditional statistical analysis tools such as ANOVA (analysis of variance) are useful, they are somewhat time consuming and are subject to error because they are based on grades, which are influenced by numerous variables independent of student ability and effort (e.g., grade inflation and curving). Additionally, grades are currently the only measure of quality in most engineering courses even though most faculty agree that grades do not accurately reflect student quality. Based on a literature search, quality was defined in this study as content knowledge, cognitive level, self-efficacy, and critical thinking. Nineteen treatments were applied to a pair of freshman classes in an effort to increase these qualities. The qualities were measured via quiz grades, essays, surveys, and online critical thinking tests. Results from the quality tests were adjusted and filtered prior to analysis. All test results were subjected to Chauvenet's criterion in order to detect and remove outlying data. In addition to removing outliers from the data sets, individual course grades needed adjustment to accommodate the large portion of the grade that was defined by group work. A new method was developed to adjust grades within each group based on the residual of the individual grades within the group and the portion of the course grade defined by group work. It was found that the grade adjustment method agreed 78% of the time with the manual grade changes instructors made in 2009, and also increased the correlation between group grades and individual grades. Using these adjusted grades, Statistical Process Control

  13. Decision generation tools and Bayesian inference

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas

    2014-05-01

    Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDGs based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related networks and organized crime ones.

  14. Varying prior information in Bayesian inversion

    NASA Astrophysics Data System (ADS)

    Walker, Matthew; Curtis, Andrew

    2014-06-01

    Bayes' rule is used to combine likelihood and prior probability distributions. The former represents knowledge derived from new data, the latter represents pre-existing knowledge; the Bayesian combination is the so-called posterior distribution, representing the resultant new state of knowledge. While varying the likelihood due to differing data observations is common, there are also situations where the prior distribution must be changed or replaced repeatedly. For example, in mixture density neural network (MDN) inversion, using current methods the neural network employed for inversion needs to be retrained every time prior information changes. We develop a method of prior replacement to vary the prior without re-training the network. Thus the efficiency of MDN inversions can be increased, typically by orders of magnitude when applied to geophysical problems. We demonstrate this for the inversion of seismic attributes in a synthetic subsurface geological reservoir model. We also present results which suggest that prior replacement can be used to control the statistical properties (such as variance) of the final estimate of the posterior in more general (e.g., Monte Carlo based) inverse problem solutions.
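
    The reweighting identity behind prior replacement can be written in a few lines: because the posterior is proportional to likelihood times prior, a posterior stored under an old prior can be converted to the posterior under a new prior by multiplying by the ratio of the new to the old prior and renormalizing, without re-evaluating the likelihood (the expensive, network-training step in the MDN case). A discrete-grid sketch with invented densities:

```python
import numpy as np

# Discrete model grid and a fixed likelihood p(d | m); values are invented.
m = np.linspace(0, 10, 201)
likelihood = np.exp(-0.5 * ((m - 6.0) / 0.8) ** 2)

def normalize(p):
    return p / p.sum()

# Posterior computed under an original (broad) prior.
prior_old = normalize(np.ones_like(m))                        # uniform on the grid
posterior_old = normalize(likelihood * prior_old)

# Prior replacement: reweight the stored posterior by p_new(m) / p_old(m).
prior_new = normalize(np.exp(-0.5 * ((m - 4.0) / 1.0) ** 2))  # new, informative prior
posterior_new = normalize(posterior_old * prior_new / prior_old)

# Check against the direct computation likelihood * new prior.
posterior_direct = normalize(likelihood * prior_new)
print("max difference:", np.abs(posterior_new - posterior_direct).max())
```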

  15. A sensorimotor paradigm for Bayesian model selection.

    PubMed

    Genewein, Tim; Braun, Daniel A

    2012-01-01

    Sensorimotor control is thought to rely on predictive internal models in order to cope efficiently with uncertain environments. Recently, it has been shown that humans not only learn different internal models for different tasks, but that they also extract common structure between tasks. This raises the question of how the motor system selects between different structures or models, when each model can be associated with a range of different task-specific parameters. Here we design a sensorimotor task that requires subjects to compensate for visuomotor shifts in a three-dimensional virtual reality setup, where one of the dimensions can be mapped to a model variable and the other dimension to the parameter variable. By introducing probe trials that are neutral in the parameter dimension, we can directly test for model selection. We found that model selection procedures based on Bayesian statistics provided a better explanation for subjects' choice behavior than simple non-probabilistic heuristics. Our experimental design lends itself to the general study of model selection in a sensorimotor context, as it allows model and parameter variables to be queried separately from subjects. PMID:23125827

  16. A Bayesian approach to reliability and confidence

    NASA Technical Reports Server (NTRS)

    Barnes, Ron

    1989-01-01

    The historical evolution of NASA's interest in quantitative measures of reliability assessment is outlined. The introduction of some quantitative methodologies into the Vehicle Reliability Branch of the Safety, Reliability and Quality Assurance (SR and QA) Division at Johnson Space Center (JSC) was noted along with the development of the Extended Orbiter Duration--Weakest Link study which will utilize quantitative tools for a Bayesian statistical analysis. Extending the earlier work of NASA sponsor, Richard Heydorn, researchers were able to produce a consistent Bayesian estimate for the reliability of a component and hence by a simple extension for a system of components in some cases where the rate of failure is not constant but varies over time. Mechanical systems in general have this property since the reliability usually decreases markedly as the parts degrade over time. While they have been able to reduce the Bayesian estimator to a simple closed form for a large class of such systems, the form for the most general case needs to be attacked by the computer. Once a table is generated for this form, researchers will have a numerical form for the general solution. With this, the corresponding probability statements about the reliability of a system can be made in the most general setting. Note that the utilization of uniform Bayesian priors represents a worst case scenario in the sense that as researchers incorporate more expert opinion into the model, they will be able to improve the strength of the probability calculations.
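
    The closed-form estimators referred to above are not reproduced here, but the flavour of Bayesian reliability updating with a uniform prior (the "worst case" prior mentioned) can be shown with a minimal conjugate Beta-Binomial example for pass/fail test data; the test counts are illustrative.

```python
from scipy.stats import beta

# Uniform prior on the component reliability R: Beta(1, 1).
a, b = 1.0, 1.0

# Observed test results: 48 successes out of 50 trials (numbers are illustrative).
successes, failures = 48, 2
a_post, b_post = a + successes, b + failures

posterior = beta(a_post, b_post)
print("posterior mean reliability:", posterior.mean())
print("90% credible interval:", posterior.interval(0.90))
print("P(R > 0.95):", 1 - posterior.cdf(0.95))
```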

  17. Bayesian networks in neuroscience: a survey.

    PubMed

    Bielza, Concha; Larrañaga, Pedro

    2014-01-01

    Bayesian networks are a type of probabilistic graphical model that lies at the intersection of statistics and machine learning. They have been shown to be powerful tools to encode dependence relationships among the variables of a domain under uncertainty. Thanks to their generality, Bayesian networks can accommodate continuous and discrete variables, as well as temporal processes. In this paper we review Bayesian networks and how they can be learned automatically from data by means of structure learning algorithms. Also, we examine how a user can take advantage of these networks for reasoning by exact or approximate inference algorithms that propagate the given evidence through the graphical structure. Despite their applicability in many fields, they have been little used in neuroscience, where their use has focused on specific problems, like functional connectivity analysis from neuroimaging data. Here we survey key research in neuroscience where Bayesian networks have been used with different aims: to discover associations between variables, to perform probabilistic reasoning over the model, and to classify new observations with and without supervision. The networks are learned from data of any kind (morphological, electrophysiological, -omics, and neuroimaging), thereby broadening the scope (molecular, cellular, structural, functional, cognitive, and medical) of the brain aspects to be studied.

  20. A study of finite mixture model: Bayesian approach on financial time series data

    NASA Astrophysics Data System (ADS)

    Phoong, Seuk-Yen; Ismail, Mohd Tahir

    2014-07-01

    Recently, statisticians have emphasized fitting finite mixture models using Bayesian methods. A finite mixture model represents a statistical distribution as a mixture of component distributions, and the Bayesian method is a statistical approach used to fit such mixture models. Bayesian methods are widely used because their asymptotic properties provide remarkable results; in addition, they show a consistency characteristic, meaning that the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is selected by using the Bayesian Information Criterion. Identifying the number of components is important because a wrong choice may lead to invalid results. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines, and Indonesia. The results showed that there is a negative relationship between rubber prices and stock market prices for all selected countries.
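
    As a sketch of the component-selection step, the code below scores candidate numbers of components with the Bayesian Information Criterion. Note the mixtures here are fitted by EM (scikit-learn's GaussianMixture) as a stand-in; the paper's fully Bayesian fitting is not reproduced, and the two-regime returns are simulated.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(6)

# Simulated two-regime returns standing in for the price data discussed above.
data = np.concatenate([rng.normal(0.02, 0.01, 300),
                       rng.normal(-0.03, 0.04, 200)]).reshape(-1, 1)

# Select the number of mixture components with the Bayesian Information Criterion.
bics = {}
for k in range(1, 6):
    gm = GaussianMixture(n_components=k, random_state=0).fit(data)
    bics[k] = gm.bic(data)

best_k = min(bics, key=bics.get)
print("BIC by k:", {k: round(v, 1) for k, v in bics.items()})
print("selected number of components:", best_k)
```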

  1. Bayesian Threshold Estimation

    ERIC Educational Resources Information Center

    Gustafson, S. C.; Costello, C. S.; Like, E. C.; Pierce, S. J.; Shenoy, K. N.

    2009-01-01

    Bayesian estimation of a threshold time (hereafter simply threshold) for the receipt of impulse signals is accomplished given the following: 1) data, consisting of the number of impulses received in a time interval from zero to one and the time of the largest time impulse; 2) a model, consisting of a uniform probability density of impulse time…

  2. Spectral Bayesian Knowledge Tracing

    ERIC Educational Resources Information Center

    Falakmasir, Mohammad; Yudelson, Michael; Ritter, Steve; Koedinger, Ken

    2015-01-01

    Bayesian Knowledge Tracing (BKT) has been in wide use for modeling student skill acquisition in Intelligent Tutoring Systems (ITS). BKT tracks and updates a student's latent mastery of a skill as a probability distribution over a binary variable. BKT does so by accounting for observed student successes in applying the skill correctly, where success is…
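
    The standard BKT update can be stated compactly; the sketch below uses the usual slip, guess, and transit parameters with invented values, so it illustrates the bookkeeping rather than any fitted model from the paper.

```python
def bkt_update(p_mastery, correct, p_slip=0.1, p_guess=0.2, p_transit=0.15):
    """One standard BKT step: condition on the observation, then apply learning."""
    if correct:
        evidence = p_mastery * (1 - p_slip)
        posterior = evidence / (evidence + (1 - p_mastery) * p_guess)
    else:
        evidence = p_mastery * p_slip
        posterior = evidence / (evidence + (1 - p_mastery) * (1 - p_guess))
    return posterior + (1 - posterior) * p_transit

p = 0.3  # prior probability that the skill is already mastered (p-init, assumed)
for outcome in [True, True, False, True, True]:
    p = bkt_update(p, outcome)
    print(f"observed {'correct' if outcome else 'incorrect'} -> P(mastery) = {p:.3f}")
```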

  3. Predicting coastal cliff erosion using a Bayesian probabilistic model

    USGS Publications Warehouse

    Hapke, C.; Plant, N.

    2010-01-01

    Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70-90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale. © 2010.

  4. Bayesian stratified sampling to assess corpus utility

    SciTech Connect

    Hochberg, J.; Scovel, C.; Thomas, T.; Hall, S.

    1998-12-01

    This paper describes a method for asking statistical questions about a large text corpus. The authors exemplify the method by addressing the question, "What percentage of Federal Register documents are real documents, of possible interest to a text researcher or analyst?" They estimate an answer to this question by evaluating 200 documents selected from a corpus of 45,820 Federal Register documents. Bayesian analysis and stratified sampling are used to reduce the sampling uncertainty of the estimate from over 3,100 documents to fewer than 1,000. A possible application of the method is to establish baseline statistics used to estimate recall rates for information retrieval systems.
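
    A minimal sketch of combining stratified sampling with Bayesian estimation, under assumptions: three hypothetical strata with known sizes, a Beta(1, 1) prior on each stratum's proportion of "real" documents, and Monte Carlo draws combined with the stratum weights to get a posterior for the corpus-wide proportion. The strata and counts are invented, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical strata (e.g., grouped by document length), with stratum sizes in the
# corpus and the evaluated sample results (real vs. not real); all numbers invented.
strata = {
    "short":  {"N": 20000, "sampled": 60, "real": 30},
    "medium": {"N": 18000, "sampled": 80, "real": 68},
    "long":   {"N": 7820,  "sampled": 60, "real": 57},
}
N_total = sum(s["N"] for s in strata.values())

# Beta(1, 1) prior per stratum; draw posterior samples of each stratum proportion
# and combine them with the known stratum weights.
draws = 100_000
overall = np.zeros(draws)
for s in strata.values():
    p_stratum = rng.beta(1 + s["real"], 1 + s["sampled"] - s["real"], size=draws)
    overall += (s["N"] / N_total) * p_stratum

lo, hi = np.percentile(overall, [2.5, 97.5])
print(f"posterior mean proportion of real documents: {overall.mean():.3f}")
print(f"95% credible interval: ({lo:.3f}, {hi:.3f})")
print(f"i.e. roughly {lo * N_total:.0f} to {hi * N_total:.0f} documents")
```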

  5. Statistical methods for launch vehicle guidance, navigation, and control (GN&C) system design and analysis

    NASA Astrophysics Data System (ADS)

    Rose, Michael Benjamin

    A novel trajectory and attitude control and navigation analysis tool for powered ascent is developed. The tool is capable of rapid trade-space analysis and is designed to ultimately reduce turnaround time for launch vehicle design, mission planning, and redesign work. It is streamlined to quickly determine trajectory and attitude control dispersions, propellant dispersions, orbit insertion dispersions, and navigation errors and their sensitivities to sensor errors, actuator execution uncertainties, and random disturbances. The tool is developed by applying both Monte Carlo and linear covariance analysis techniques to a closed-loop, launch vehicle guidance, navigation, and control (GN&C) system. The nonlinear dynamics and flight GN&C software models of a closed-loop, six-degree-of-freedom (6-DOF), Monte Carlo simulation are formulated and developed. The nominal reference trajectory (NRT) for the proposed lunar ascent trajectory is defined and generated. The Monte Carlo truth models and GN&C algorithms are linearized about the NRT, the linear covariance equations are formulated, and the linear covariance simulation is developed. The performance of the launch vehicle GN&C system is evaluated using both Monte Carlo and linear covariance techniques and their trajectory and attitude control dispersion, propellant dispersion, orbit insertion dispersion, and navigation error results are validated and compared. Statistical results from linear covariance analysis are generally within 10% of Monte Carlo results, and in most cases the differences are less than 5%. This is an excellent result given the many complex nonlinearities that are embedded in the ascent GN&C problem. Moreover, the real value of this tool lies in its speed, where the linear covariance simulation is 1036.62 times faster than the Monte Carlo simulation. Although the application and results presented are for a lunar, single-stage-to-orbit (SSTO), ascent vehicle, the tools, techniques, and mathematical

  6. Bayesian natural selection and the evolution of perceptual systems.

    PubMed Central

    Geisler, Wilson S; Diehl, Randy L

    2002-01-01

    In recent years, there has been much interest in characterizing statistical properties of natural stimuli in order to better understand the design of perceptual systems. A fruitful approach has been to compare the processing of natural stimuli in real perceptual systems with that of ideal observers derived within the framework of Bayesian statistical decision theory. While this form of optimization theory has provided a deeper understanding of the information contained in natural stimuli as well as of the computational principles employed in perceptual systems, it does not directly consider the process of natural selection, which is ultimately responsible for design. Here we propose a formal framework for analysing how the statistics of natural stimuli and the process of natural selection interact to determine the design of perceptual systems. The framework consists of two complementary components. The first is a maximum fitness ideal observer, a standard Bayesian ideal observer with a utility function appropriate for natural selection. The second component is a formal version of natural selection based upon Bayesian statistical decision theory. Maximum fitness ideal observers and Bayesian natural selection are demonstrated in several examples. We suggest that the Bayesian approach is appropriate not only for the study of perceptual systems but also for the study of many other systems in biology. PMID:12028784

  7. Feasibility study of using statistical process control to customized quality assurance in proton therapy

    SciTech Connect

    Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho; Kim, Tae Hyun; Kim, Gwe-Ya

    2014-09-15

    Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and, to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.

  8. Confidence limits for contribution plots in multivariate statistical process control using bootstrap estimates.

    PubMed

    Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund

    2016-02-18

    In Multivariate Statistical Process Control, when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying those process variables that were affected by or might be the cause of the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This might result in false readings purely due to the differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under Normal Operating Conditions, where confidence limits (CLs) for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide confidence limits for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build confidence limits for all contribution plots in online PCA-based MSPC. The new strategy to estimate CLs is compared to the previously reported CLs for contribution plots. An industrial batch process dataset was used to illustrate the concepts.
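
    A hedged sketch of the general strategy (not the authors' exact procedure): fit a PCA model on Normal Operating Conditions data, define per-variable contributions to the squared prediction error, bootstrap the training runs to obtain a percentile-based limit for each variable's contribution, and flag variables of a new run that exceed their limits. All dimensions and the injected fault are simulated.

```python
import numpy as np

rng = np.random.default_rng(8)

# Training data from Normal Operating Conditions (NOC): 100 runs, 8 process variables.
n, p, n_comp = 100, 8, 2
scores_true = rng.normal(size=(n, n_comp))
loadings_true = rng.normal(size=(p, n_comp))
X = scores_true @ loadings_true.T + 0.3 * rng.normal(size=(n, p))

# PCA model of the NOC data (mean-centred, loadings from the SVD).
mu = X.mean(axis=0)
Xc = X - mu
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)
P = Vt[:n_comp].T  # retained loadings (p x n_comp)

def spe_contributions(x):
    """Per-variable contributions to the squared prediction error of one sample."""
    residual = (x - mu) - P @ (P.T @ (x - mu))
    return residual**2

# Bootstrap the NOC runs to estimate a 95th-percentile limit per variable contribution.
n_boot = 2000
boot_pctl = np.empty((n_boot, p))
for b in range(n_boot):
    sample = X[rng.integers(0, n, n)]
    contrib = np.array([spe_contributions(x) for x in sample])
    boot_pctl[b] = np.percentile(contrib, 95, axis=0)
limits = boot_pctl.mean(axis=0)

# Judge a new (faulty) run: variable 3 is disturbed.
x_new = X[0].copy()
x_new[3] += 2.0
flags = spe_contributions(x_new) > limits
print("variables exceeding their contribution limits:", np.where(flags)[0])
```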

  9. Practical guidelines for applying statistical process control to blood component production.

    PubMed

    Beckman, N; Nightingale, M J; Pamphilon, D

    2009-12-01

    Legislation, guidelines and recommendations for blood components related to statistical process control (SPC) and the selection of a quality monitoring (QM) sampling regimen are subject to misinterpretation and lack practical guidance on implementation. The aim of this article is: to review and interpret applicable European legislation and guidelines and to develop an SPC strategy that meets these requirements; and to provide practical guidance on the selection and application of appropriate techniques and the interpretation of resultant blood component QM data. A methodology is presented which utilizes: an algorithm to select an appropriate quality-monitoring strategy for the blood component parameter under consideration; a range of straightforward, validated SPC techniques for variable data and an assessment of process capability (Cpk) and blood component parameter 'criticality' to determine the sampling regimen. The methodology was applied to routine National Health Service Blood and Transplant (NHSBT) blood component data for 2005-2006. Cpk values ranged from 0.22 to >3 and their predicted non-conformance rates were close to those observed (23 to <0.001%). Required sample size ranged from 0.01 to 10%. Chosen techniques identified significant deviation from 'as validated' performance within an appropriate time-scale. Thus the methodology was straightforward to apply and prompted the choice of a clinically and operationally appropriate sampling regimen and analysis for each blood component parameter. This evidence-based, targeted use of SPC for blood component monitoring provides an essential focus on processes with a low capability in achieving their specifications. PMID:19761545

  10. Statistical estimate of mercury removal efficiencies for air pollution control devices of municipal solid waste incinerators.

    PubMed

    Takahashi, Fumitake; Kida, Akiko; Shimaoka, Takayuki

    2010-10-15

    Although representative removal efficiencies of gaseous mercury for air pollution control devices (APCDs) are important for preparing more reliable atmospheric emission inventories of mercury, they are still uncertain because they depend sensitively on many factors, such as the type of APCD, gas temperature, and mercury speciation. In this study, representative removal efficiencies of gaseous mercury for several types of APCDs used in municipal solid waste incineration (MSWI) were derived using a statistical method. 534 data points on mercury removal efficiencies for APCDs used in MSWI were collected. APCDs were categorized as fixed-bed absorber (FA), wet scrubber (WS), electrostatic precipitator (ESP), and fabric filter (FF), and their hybrid systems. The data series for all APCD types were log-normally distributed. The average removal efficiency with a 95% confidence interval was estimated for each APCD. The FA, WS, and FF with carbon and/or dry sorbent injection systems had average removal efficiencies of 75% to 82%. On the other hand, the ESP with or without dry sorbent injection had lower removal efficiencies of up to 22%. The type of dry sorbent injection in the FF system, dry or semi-dry, made less than a 1% difference to the removal efficiency. The injection of activated carbon or carbon-containing fly ash in the FF system made less than a 3% difference. Estimation errors of the removal efficiency were especially high for the ESP. The national average removal efficiency of APCDs in Japanese MSWI plants was estimated on the basis of incineration capacity. Owing to the replacement of old APCDs for dioxin control, the national average removal efficiency increased from 34.5% in 1991 to 92.5% in 2003, resulting in an additional emission reduction of about 0.86 Mg in 2003. Applying the methodology of this study to other important emission sources, such as coal-fired power plants, will contribute to better emission inventories.

  11. Structural damage detection using extended Kalman filter combined with statistical process control

    NASA Astrophysics Data System (ADS)

    Jin, Chenhao; Jang, Shinae; Sun, Xiaorong

    2015-04-01

    Traditional modal-based methods, which identify damage based upon changes in the vibration characteristics of the structure on a global basis, have received considerable attention in the past decades. However, the effectiveness of the modal-based methods depends on the type of damage and the accuracy of the structural model, and these methods may also have difficulties when applied to complex structures. The extended Kalman filter (EKF) algorithm, which has the capability to estimate parameters and catch abrupt changes, is currently used in continuous and automatic structural damage detection to overcome the disadvantages of traditional methods. Structural parameters are typically slow-changing variables under the effects of operational and environmental conditions; thus it would be difficult to observe structural damage and quantify the damage in real time with the EKF alone. In this paper, Statistical Process Control (SPC) is combined with the EKF method in order to overcome this difficulty. Based on historical measurements of damage-sensitive features involved in the state-space dynamic models, the extended Kalman filter (EKF) algorithm is used to produce real-time estimates of these features as well as their standard deviations, which can then be used to form control ranges for SPC to detect any abnormality in the selected features. Moreover, the confidence levels of the detection can be adjusted by choosing different multiples of sigma and different numbers of adjacent out-of-range points. The proposed method is tested using simulated data of a three-story linear building under different damage scenarios, and numerical results demonstrate the high damage detection accuracy and light computation of the presented method.

  12. An improvement in IMRT QA results and beam matching in linacs using statistical process control.

    PubMed

    Gagneur, Justin D; Ezzell, Gary A

    2014-01-01

    The purpose of this study is to apply the principles of statistical process control (SPC) in the context of patient specific intensity-modulated radiation therapy (IMRT) QA to set clinic-specific action limits and evaluate the impact of changes to the multileaf collimator (MLC) calibrations on IMRT QA results. Ten months of IMRT QA data with 247 patient QAs collected on three beam-matched linacs were retrospectively analyzed with a focus on the gamma pass rate (GPR) and the average ratio between the measured and planned doses. Initial control charts and action limits were calculated. Based on this data, changes were made to the leaf gap parameter for the MLCs to improve the consistency between linacs. This leaf gap parameter is tested monthly using a MLC sweep test. A follow-up dataset with 424 unique QAs were used to evaluate the impact of the leaf gap parameter change. The initial data average GPR was 98.6% with an SPC action limit of 93.7%. The average ratio of doses was 1.003, with an upper action limit of 1.017 and a lower action limit of 0.989. The sweep test results for the linacs were -1.8%, 0%, and +1.2% from nominal. After the adjustment of the leaf gap parameter, all sweep test results were within 0.4% of nominal. Subsequently, the average GPR was 99.4% with an SPC action limit of 97.3%. The average ratio of doses was 0.997 with an upper action limit of 1.011 and a lower action limit of 0.981. Applying the principles of SPC to IMRT QA allowed small differences between closely matched linacs to be identified and reduced. Ongoing analysis will monitor the process and be used to refine the clinical action limits for IMRT QA. PMID:25207579

  13. Statistical Optics

    NASA Astrophysics Data System (ADS)

    Goodman, Joseph W.

    2000-07-01

    The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I RIchard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research

  14. A Bayesian Measurement Error Model for Misaligned Radiographic Data

    SciTech Connect

    Lennox, Kristin P.; Glascoe, Lee G.

    2013-09-06

    An understanding of the inherent variability in micro-computed tomography (micro-CT) data is essential to tasks such as statistical process control and the validation of radiographic simulation tools. The data present unique challenges to variability analysis due to the relatively low resolution of radiographs, and also due to minor variations from run to run which can result in misalignment or magnification changes between repeated measurements of a sample. Positioning changes artificially inflate the variability of the data in ways that mask true physical phenomena. We present a novel Bayesian nonparametric regression model that incorporates both additive and multiplicative measurement error in addition to heteroscedasticity to address this problem. We also use this model to assess the effects of sample thickness and sample position on measurement variability for an aluminum specimen. Supplementary materials for this article are available online.

  15. A Bayesian Measurement Error Model for Misaligned Radiographic Data

    DOE PAGES

    Lennox, Kristin P.; Glascoe, Lee G.

    2013-09-06

    An understanding of the inherent variability in micro-computed tomography (micro-CT) data is essential to tasks such as statistical process control and the validation of radiographic simulation tools. The data present unique challenges to variability analysis due to the relatively low resolution of radiographs, and also due to minor variations from run to run which can result in misalignment or magnification changes between repeated measurements of a sample. Positioning changes artificially inflate the variability of the data in ways that mask true physical phenomena. We present a novel Bayesian nonparametric regression model that incorporates both additive and multiplicative measurement error in addition to heteroscedasticity to address this problem. We also use this model to assess the effects of sample thickness and sample position on measurement variability for an aluminum specimen. Supplementary materials for this article are available online.

  16. Bayesian multimodel inference for dose-response studies

    USGS Publications Warehouse

    Link, W.A.; Albers, P.H.

    2007-01-01

    Statistical inference in dose-response studies is model-based: The analyst posits a mathematical model of the relation between exposure and response, estimates parameters of the model, and reports conclusions conditional on the model. Such analyses rarely include any accounting for the uncertainties associated with model selection. The Bayesian inferential system provides a convenient framework for model selection and multimodel inference. In this paper we briefly describe the Bayesian paradigm and Bayesian multimodel inference. We then present a family of models for multinomial dose-response data and apply Bayesian multimodel inferential methods to the analysis of data on the reproductive success of American kestrels (Falco sparverius) exposed to various sublethal dietary concentrations of methylmercury.
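
    A minimal sketch of the multimodel idea (a simplification, not the multinomial models used in the paper): fit two candidate binomial dose-response models, approximate their marginal likelihoods with BIC, and convert the results into posterior model probabilities. The doses, group sizes, and response counts below are invented.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.special import expit

# Hypothetical dose-response data: dose level, group size, number responding.
dose = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
n = np.array([20, 20, 20, 20, 20])
y = np.array([2, 3, 5, 9, 14])

def neg_loglik(p):
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(y * np.log(p) + (n - y) * np.log(1 - p))

# Model 1: constant response probability.  Model 2: logistic dose-response.
nll_const = lambda th: neg_loglik(expit(th[0]) * np.ones_like(dose))
nll_logit = lambda th: neg_loglik(expit(th[0] + th[1] * dose))

fits = {"constant": (minimize(nll_const, [0.0]), 1),
        "logistic": (minimize(nll_logit, [0.0, 0.0]), 2)}

# BIC = 2*negloglik + k*log(N); marginal likelihood approx. exp(-BIC/2); equal prior model weights.
N = n.sum()
bic = {name: 2 * fit.fun + k * np.log(N) for name, (fit, k) in fits.items()}
weights = np.exp(-0.5 * (np.array(list(bic.values())) - min(bic.values())))
weights /= weights.sum()
for name, w in zip(bic, weights):
    print(f"{name:9s} BIC = {bic[name]:6.1f}  approx. posterior model probability = {w:.3f}")
```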

  17. Quantum-Like Representation of Non-Bayesian Inference

    NASA Astrophysics Data System (ADS)

    Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.

    2013-01-01

    This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. Several experimental studies have produced statistical data that cannot be described by classical probability theory, and the decision-making process generating these data cannot be reduced to classical Bayesian inference. To address this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented classical Bayesian inference in a natural way within the framework of quantum mechanics. Using this representation, in this paper we discuss non-Bayesian (irrational) inference that is biased by effects such as quantum interference. Further, we describe the "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of usual Bayesian inference.

  18. Bayesian Case-deletion Model Complexity and Information Criterion

    PubMed Central

    Zhu, Hongtu; Ibrahim, Joseph G.; Chen, Qingxia

    2015-01-01

    We establish a connection between Bayesian case influence measures for assessing the influence of individual observations and Bayesian predictive methods for evaluating the predictive performance of a model and comparing different models fitted to the same dataset. Based on such a connection, we formally propose a new set of Bayesian case-deletion model complexity (BCMC) measures for quantifying the effective number of parameters in a given statistical model. Its properties in linear models are explored. Adding some functions of BCMC to a conditional deviance function leads to a Bayesian case-deletion information criterion (BCIC) for comparing models. We systematically investigate some properties of BCIC and its connection with other information criteria, such as the Deviance Information Criterion (DIC). We illustrate the proposed methodology on linear mixed models with simulations and a real data example. PMID:26180578

  19. Bayesian Correlation Analysis for Sequence Count Data

    PubMed Central

    Lau, Nelson; Perkins, Theodore J.

    2016-01-01

    Evaluating the similarity of different measured variables is a fundamental task of statistics, and a key part of many bioinformatics algorithms. Here we propose a Bayesian scheme for estimating the correlation between different entities’ measurements based on high-throughput sequencing data. These entities could be different genes or miRNAs whose expression is measured by RNA-seq, different transcription factors or histone marks whose expression is measured by ChIP-seq, or even combinations of different types of entities. Our Bayesian formulation accounts for both measured signal levels and uncertainty in those levels, due to varying sequencing depth in different experiments and to varying absolute levels of individual entities, both of which affect the precision of the measurements. In comparison with a traditional Pearson correlation analysis, we show that our Bayesian correlation analysis retains high correlations when measurement confidence is high, but suppresses correlations when measurement confidence is low—especially for entities with low signal levels. In addition, we consider the influence of priors on the Bayesian correlation estimate. Perhaps surprisingly, we show that naive, uniform priors on entities’ signal levels can lead to highly biased correlation estimates, particularly when different experiments have widely varying sequencing depths. However, we propose two alternative priors that provably mitigate this problem. We also prove that, like traditional Pearson correlation, our Bayesian correlation calculation constitutes a kernel in the machine learning sense, and thus can be used as a similarity measure in any kernel-based machine learning algorithm. We demonstrate our approach on two RNA-seq datasets and one miRNA-seq dataset. PMID:27701449
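
    A much-simplified sketch of this idea (not the authors' model): treat each entity's read count in each experiment as Poisson with a Gamma prior on its rate, draw posterior samples of the per-experiment rates, and average the Pearson correlation over the posterior draws so that low-count, low-confidence measurements contribute less. Counts, depths, and prior parameters below are invented.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical read counts for two genes across 8 experiments with very different depths.
depth  = np.array([1e6, 1e6, 5e4, 5e4, 2e6, 2e6, 1e5, 1e5])
gene_a = np.array([120, 130, 4, 9, 260, 240, 15, 11])
gene_b = np.array([95, 110, 2, 12, 210, 190, 9, 14])

def bayes_corr(x, y, depth, a0=1.0, b0=1.0, n_draws=2000):
    """Average Pearson correlation over posterior draws of per-experiment Poisson rates."""
    # Gamma(a0, b0) prior + Poisson counts  =>  Gamma(a0 + count, b0 + depth) posterior per cell.
    rx = rng.gamma(a0 + x, 1.0 / (b0 + depth), size=(n_draws, len(x)))
    ry = rng.gamma(a0 + y, 1.0 / (b0 + depth), size=(n_draws, len(y)))
    return float(np.mean([np.corrcoef(rx[i], ry[i])[0, 1] for i in range(n_draws)]))

naive = np.corrcoef(gene_a / depth, gene_b / depth)[0, 1]
print(f"plain Pearson correlation of normalized counts : {naive:.3f}")
print(f"posterior-averaged (Bayesian) correlation      : {bayes_corr(gene_a, gene_b, depth):.3f}")
```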

  20. Statistical examination of laser therapy effects in controlled double-blind clinical trial

    NASA Astrophysics Data System (ADS)

    Boerner, Ewa; Podbielska, Halina

    2001-10-01

    For the evaluation of the therapy effects, a double-blind clinical trial followed by statistical analysis was performed. The analysis showed that laser therapy with IR radiation has a significant influence on the decrease of the level of pain in the examined group of patients suffering from various diseases of the locomotor system: the level of pain of patients undergoing laser therapy was statistically lower than that of patients undergoing placebo therapy. The same tests were performed to evaluate the range of movement. Although placebo therapy also contributed to an increase in the range of movement, a statistically significant effect was found only in the group treated with laser.

  1. Validating an automated outcomes surveillance application using data from a terminated randomized, controlled trial (OPUS [TIMI-16]).

    PubMed

    Matheny, Michael E; Morrow, David A; Ohno-Machado, Lucila; Cannon, Christopher P; Resnic, Frederic S

    2007-01-01

    We sought to validate an automated outcomes surveillance system (DELTA) using OPUS (TIMI-16), a multi-center randomized, controlled trial that was stopped early due to elevated mortality in one of the two intervention arms. Methodologies that were incorporated into the application (Statistical Process Control [SPC] and Bayesian Updating Statistics [BUS]) were compared with standard Data Safety Monitoring Board (DSMB) protocols. PMID:18694141

  2. Bayesian truthing and experimental validation in homeland security and defense

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Forrester, Thomas; Wang, Wenjian; Kostrzewski, Andrew; Pradhan, Ranjit

    2014-05-01

    In this paper we discuss relations between Bayesian Truthing (experimental validation), Bayesian statistics, and Binary Sensing in the context of selected Homeland Security and Intelligence, Surveillance, Reconnaissance (ISR) optical and nonoptical application scenarios. The basic Figure of Merit (FoM) is the Positive Predictive Value (PPV), complemented by the false-positive and false-negative rates. By using these simple binary statistics, we can analyze, classify, and evaluate a broad variety of events including: ISR; natural disasters; QC; and terrorism-related, GIS-related, law enforcement-related, and other C3I events.
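
    The PPV mentioned above is a direct application of Bayes' rule to binary detection statistics; a minimal sketch with invented sensitivity, specificity, and prevalence values:

```python
def positive_predictive_value(sensitivity, specificity, prevalence):
    """PPV = P(event | alarm), from Bayes' rule applied to binary detection statistics."""
    true_pos = sensitivity * prevalence
    false_pos = (1.0 - specificity) * (1.0 - prevalence)
    return true_pos / (true_pos + false_pos)

# Hypothetical screening scenario: a rare event and a fairly good detector.
ppv = positive_predictive_value(sensitivity=0.95, specificity=0.99, prevalence=0.001)
print(f"PPV = {ppv:.3f}")   # ~0.09: most alarms are false when the event is rare
```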

  3. On Bayesian analysis of on-off measurements

    NASA Astrophysics Data System (ADS)

    Nosek, Dalibor; Nosková, Jana

    2016-06-01

    We propose an analytical solution to the on-off problem within the framework of Bayesian statistics. Both the statistical significance for the discovery of new phenomena and credible intervals on model parameters are presented in a consistent way. We use a large enough family of prior distributions of relevant parameters. The proposed analysis is designed to provide Bayesian solutions that can be used for any number of observed on-off events, including zero. The procedure is checked using Monte Carlo simulations. The usefulness of the method is demonstrated on examples from γ-ray astronomy.
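
    The paper derives an analytical solution; as a purely numerical illustration of the same setup, the sketch below evaluates the posterior for the expected source counts s on a grid, given N_on counts in the on-region, N_off counts in the off-region, exposure ratio alpha, and flat priors. The counts and alpha are invented.

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical on-off counts and exposure ratio alpha = t_on / t_off.
n_on, n_off, alpha = 12, 30, 0.2

s_grid = np.linspace(0.0, 20.0, 401)   # candidate expected source counts (on-region)
b_grid = np.linspace(0.0, 60.0, 601)   # candidate expected background counts (off-region)
S, B = np.meshgrid(s_grid, b_grid, indexing="ij")

# Likelihood: N_on ~ Poisson(s + alpha*b), N_off ~ Poisson(b); flat priors on s and b.
like = poisson.pmf(n_on, S + alpha * B) * poisson.pmf(n_off, B)

post_s = like.sum(axis=1)               # marginalize the background numerically
post_s /= post_s.sum()                  # normalize on the grid

mean_s = np.sum(s_grid * post_s)
cdf = np.cumsum(post_s)
lo, hi = s_grid[np.searchsorted(cdf, 0.05)], s_grid[np.searchsorted(cdf, 0.95)]
print(f"posterior mean source counts = {mean_s:.2f}, 90% credible interval = [{lo:.2f}, {hi:.2f}]")
```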

  4. Application of Snowfall and Wind Statistics to Snow Transport Modeling for Snowdrift Control in Minnesota.

    NASA Astrophysics Data System (ADS)

    Shulski, Martha D.; Seeley, Mark W.

    2004-11-01

    Models were utilized to determine the snow accumulation season (SAS) and to quantify windblown snow for the purpose of snowdrift control for locations in Minnesota. The models require mean monthly temperature, snowfall, density of snow, and wind frequency distribution statistics. Temperature and precipitation data were obtained from local cooperative observing sites, and wind data came from Automated Surface Observing System (ASOS)/Automated Weather Observing System (AWOS) sites in the region. The temperature-based algorithm used to define the SAS reveals a geographic variability in the starting and ending dates of the season, which is determined by latitude and elevation. Mean seasonal snowfall shows a geographic distribution that is affected by topography and proximity to Lake Superior. Mean snowfall density also exhibits variability, with lower-density snow events displaced to higher-latitude positions. Seasonal wind frequencies show a strong bimodal distribution with peaks from the northwest and southeast vector direction, with an exception for locations in close proximity to the Lake Superior shoreline. In addition, for western and south-central Minnesota there is a considerably higher frequency of wind speeds above the mean snow transport threshold of 7 m s⁻¹. As such, this area is more conducive to higher potential snow transport totals. Snow relocation coefficients in this area are in the range of 0.4–0.9, and, according to the empirical models used in this analysis, this range implies that actual snow transport is 40%–90% of the total potential in south-central and western areas of the state.


  5. Independent assessment to continue improvement: Implementing statistical process control at the Hanford Site

    SciTech Connect

    Hu, T.A.; Lo, J.C.

    1994-11-01

    A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between the team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the roles of surveillance management, engineering, and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance, we believe that the value independent assessment adds to the system lies in the continuous improvement activities that follow it. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement.

  6. Advanced Bayesian Method for Planetary Surface Navigation

    NASA Technical Reports Server (NTRS)

    Center, Julian

    2015-01-01

    Autonomous Exploration, Inc., has developed an advanced Bayesian statistical inference method that leverages current computing technology to produce a highly accurate surface navigation system. The method combines dense stereo vision and high-speed optical flow to implement visual odometry (VO) to track faster rover movements. The Bayesian VO technique improves performance by using all image information rather than corner features only. The method determines what can be learned from each image pixel and weighs the information accordingly. This capability improves performance in shadowed areas that yield only low-contrast images. The error characteristics of the visual processing are complementary to those of a low-cost inertial measurement unit (IMU), so the combination of the two capabilities provides highly accurate navigation. The method increases NASA mission productivity by enabling faster rover speed and accuracy. On Earth, the technology will permit operation of robots and autonomous vehicles in areas where the Global Positioning System (GPS) is degraded or unavailable.

  7. Sum statistics for the joint detection of multiple disease loci in case-control association studies with SNP markers.

    PubMed

    Wille, Anja; Hoh, Josephine; Ott, Jurg

    2003-12-01

    In complex traits, multiple disease loci presumably interact to produce the disease. For this reason, even with high-resolution single nucleotide polymorphism (SNP) marker maps, it has been difficult to map susceptibility loci by conventional locus-by-locus methods. Fine mapping strategies are needed that allow for the simultaneous detection of interacting disease loci while handling large numbers of densely spaced markers. For this purpose, sum statistics were recently proposed as a first-stage analysis method for case-control association studies with SNPs. Via sums of single-marker statistics, information over multiple disease-associated markers is combined and, with a global significance value alpha, a small set of "interesting" markers is selected for further analysis. Here, the statistical properties of such approaches are examined by computer simulation. It is shown that sum statistics can often be successfully applied when marker-by-marker approaches fail to detect association. Compared with Bonferroni or False Discovery Rate (FDR) procedures, sum statistics have greater power, and more disease loci can be detected. However, in studies with tightly linked markers, simple sum statistics can be suboptimal, since the intermarker correlation is ignored. A method is presented that takes the correlation structure among marker loci into account when marker statistics are combined.
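
    A minimal sketch of the sum-statistic idea (not the authors' exact procedure): compute a per-SNP association statistic, sum the k largest values, and assess global significance by permuting the case-control labels. Genotypes, sample sizes, and k are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical genotypes (0/1/2 minor-allele counts) for cases and controls.
n_cases, n_controls, n_snps, k = 200, 200, 100, 5
genotypes = rng.integers(0, 3, size=(n_cases + n_controls, n_snps))
status = np.array([1] * n_cases + [0] * n_controls)

def allelic_chi2(genotypes, status):
    """Per-SNP 1-df statistics comparing minor-allele frequencies in cases vs. controls."""
    case_alleles = genotypes[status == 1].sum(axis=0)
    ctrl_alleles = genotypes[status == 0].sum(axis=0)
    case_total, ctrl_total = 2 * (status == 1).sum(), 2 * (status == 0).sum()
    p_case, p_ctrl = case_alleles / case_total, ctrl_alleles / ctrl_total
    p_pool = (case_alleles + ctrl_alleles) / (case_total + ctrl_total)
    var = p_pool * (1 - p_pool) * (1 / case_total + 1 / ctrl_total)
    return (p_case - p_ctrl) ** 2 / var

def sum_statistic(genotypes, status, k):
    return np.sort(allelic_chi2(genotypes, status))[-k:].sum()   # sum of k largest statistics

observed = sum_statistic(genotypes, status, k)

# Global significance by permutation of the case-control labels.
perms = np.array([sum_statistic(genotypes, rng.permutation(status), k) for _ in range(999)])
p_global = (1 + np.sum(perms >= observed)) / (1 + len(perms))
print(f"observed sum statistic = {observed:.2f}, permutation p-value = {p_global:.3f}")
```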

  8. Quantum Bayesian implementation

    NASA Astrophysics Data System (ADS)

    Wu, Haoyang

    2013-02-01

    Mechanism design is a reverse problem of game theory. Nash implementation and Bayesian implementation are two important parts of mechanism design theory. The former one corresponds to a setting with complete information, whereas the latter one corresponds to a setting with incomplete information. A recent work Wu (Int J Quantum Inf 9:615-623, 2011) shows that when an additional condition is satisfied, the traditional sufficient conditions for Nash implementation will fail in a quantum domain. Inspired by this work, in this paper we will propose that the traditional sufficient conditions for Bayesian implementation will also fail if agents use quantum strategies to send messages to the designer through channels (e.g., Internet, cable etc) and two additional conditions are satisfied.

  9. Hierarchical Approximate Bayesian Computation

    PubMed Central

    Turner, Brandon M.; Van Zandt, Trisha

    2013-01-01

    Approximate Bayesian computation (ABC) is a powerful technique for estimating the posterior distribution of a model’s parameters. It is especially important when the model to be fit has no explicit likelihood function, which happens for computational (or simulation-based) models such as those that are popular in cognitive neuroscience and other areas in psychology. However, ABC is usually applied only to models with few parameters. Extending ABC to hierarchical models has been difficult because high-dimensional hierarchical models add computational complexity that conventional ABC cannot accommodate. In this paper we summarize some current approaches for performing hierarchical ABC and introduce a new algorithm called Gibbs ABC. This new algorithm incorporates well-known Bayesian techniques to improve the accuracy and efficiency of the ABC approach for estimation of hierarchical models. We then use the Gibbs ABC algorithm to estimate the parameters of two models of signal detection, one with and one without a tractable likelihood function. PMID:24297436
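
    The basic, non-hierarchical rejection step that Gibbs ABC and related algorithms build on fits in a few lines. The sketch below estimates a Gaussian mean using only forward simulation; the prior, summary statistic, and tolerance are chosen purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# "Observed" data from a model whose likelihood we pretend is unavailable.
observed = rng.normal(2.0, 1.0, size=100)
obs_summary = observed.mean()                 # summary statistic

def abc_rejection(n_draws, tolerance):
    """Draw mu from the prior, simulate data, keep mu if the simulated summary is close."""
    accepted = []
    for _ in range(n_draws):
        mu = rng.normal(0.0, 5.0)             # prior on mu
        sim = rng.normal(mu, 1.0, size=100)   # forward simulation only -- no likelihood used
        if abs(sim.mean() - obs_summary) < tolerance:
            accepted.append(mu)
    return np.array(accepted)

posterior = abc_rejection(n_draws=20_000, tolerance=0.1)
print(f"accepted {len(posterior)} draws, posterior mean = {posterior.mean():.2f}")
```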

  10. Using Statistical Process Control for detecting anomalies in multivariate spatiotemporal Earth Observations

    NASA Astrophysics Data System (ADS)

    Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus

    2016-04-01

    The number of available Earth observations (EOs) is currently substantially increasing. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues might result in anomalous multivariate data constellations and have to be identified before corrupting subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to some statistical process control (SPC) algorithm. The basic idea is to raise an alarm when these sensor data depict some anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. In fact, the industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of different algorithms typically applied by SPC systems and evaluate their capability to detect, e.g., known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatiotemporal events and directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu
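
    One of the simplest multivariate SPC detectors of the kind compared here is the Hotelling T² chart; a minimal sketch on a synthetic multivariate series with an injected anomalous episode (all data and thresholds invented):

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(42)

# Synthetic multivariate time series standing in for several Earth-observation variables.
n_time, n_vars = 500, 4
cov = np.array([[1.0, 0.6, 0.3, 0.0],
                [0.6, 1.0, 0.4, 0.1],
                [0.3, 0.4, 1.0, 0.2],
                [0.0, 0.1, 0.2, 1.0]])
data = rng.multivariate_normal(np.zeros(n_vars), cov, size=n_time)
data[300:305] += 3.0                          # injected anomalous episode

# Estimate the in-control mean and covariance from a reference period.
ref = data[:250]
mu, sigma_inv = ref.mean(axis=0), np.linalg.inv(np.cov(ref, rowvar=False))

# Hotelling T^2 statistic per observation; chi-square approximation for the control limit.
centered = data - mu
t2 = np.einsum("ij,jk,ik->i", centered, sigma_inv, centered)
limit = chi2.ppf(0.999, df=n_vars)

print("out-of-control time steps:", np.flatnonzero(t2 > limit))
```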

  11. Efficient Bayesian Phase Estimation.

    PubMed

    Wiebe, Nathan; Granade, Chris

    2016-07-01

    We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method. PMID:27419551

  12. Efficient Bayesian Phase Estimation

    NASA Astrophysics Data System (ADS)

    Wiebe, Nathan; Granade, Chris

    2016-07-01

    We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method.

  13. Probability, statistics, and computational science.

    PubMed

    Beerenwinkel, Niko; Siebourg, Juliane

    2012-01-01

    In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters.

  14. Bayesian Methods for Analyzing Structural Equation Models with Covariates, Interaction, and Quadratic Latent Variables

    ERIC Educational Resources Information Center

    Lee, Sik-Yum; Song, Xin-Yuan; Tang, Nian-Sheng

    2007-01-01

    The analysis of interaction among latent variables has received much attention. This article introduces a Bayesian approach to analyze a general structural equation model that accommodates the general nonlinear terms of latent variables and covariates. This approach produces a Bayesian estimate that has the same statistical optimal properties as a…

  15. Investigation of changes in characteristics of hydrological time series by Bayesian methods

    NASA Astrophysics Data System (ADS)

    Rao, A. Ramachandra; Tirtotjondro, Wahju

    1996-11-01

    A review of the literature reveals that intervention analysis and spectrum-based methods cannot adequately quantify changes in hydrologic time series. A Bayesian method is used to investigate the statistical significance of observed changes in hydrologic time series, and the results are reported herein. The Bayesian method is superior to the previous methods.

  16. A critique of statistical hypothesis testing in clinical research

    PubMed Central

    Raha, Somik

    2011-01-01

    Many have documented the difficulty of using the current paradigm of Randomized Controlled Trials (RCTs) to test and validate the effectiveness of alternative medical systems such as Ayurveda. This paper critiques the applicability of RCTs for all clinical knowledge-seeking endeavors, of which Ayurveda research is a part. This is done by examining statistical hypothesis testing, the underlying foundation of RCTs, from a practical and philosophical perspective. In the philosophical critique, the two main worldviews of probability are that of the Bayesian and the frequentist. The frequentist worldview is a special case of the Bayesian worldview requiring the unrealistic assumptions of knowing nothing about the universe and believing that all observations are unrelated to each other. Many have claimed that the first belief is necessary for science, and this claim is debunked by comparing variations in learning with different prior beliefs. Moving beyond the Bayesian and frequentist worldviews, the notion of hypothesis testing itself is challenged on the grounds that a hypothesis is an unclear distinction, and assigning a probability on an unclear distinction is an exercise that does not lead to clarity of action. This critique is of the theory itself and not any particular application of statistical hypothesis testing. A decision-making frame is proposed as a way of both addressing this critique and transcending ideological debates on probability. An example of a Bayesian decision-making approach is shown as an alternative to statistical hypothesis testing, utilizing data from a past clinical trial that studied the effect of Aspirin on heart attacks in a sample population of doctors. As a big reason for the prevalence of RCTs in academia is legislation requiring it, the ethics of legislating the use of statistical methods for clinical research is also examined. PMID:22022152

  17. A critique of statistical hypothesis testing in clinical research.

    PubMed

    Raha, Somik

    2011-07-01

    Many have documented the difficulty of using the current paradigm of Randomized Controlled Trials (RCTs) to test and validate the effectiveness of alternative medical systems such as Ayurveda. This paper critiques the applicability of RCTs for all clinical knowledge-seeking endeavors, of which Ayurveda research is a part. This is done by examining statistical hypothesis testing, the underlying foundation of RCTs, from a practical and philosophical perspective. In the philosophical critique, the two main worldviews of probability are that of the Bayesian and the frequentist. The frequentist worldview is a special case of the Bayesian worldview requiring the unrealistic assumptions of knowing nothing about the universe and believing that all observations are unrelated to each other. Many have claimed that the first belief is necessary for science, and this claim is debunked by comparing variations in learning with different prior beliefs. Moving beyond the Bayesian and frequentist worldviews, the notion of hypothesis testing itself is challenged on the grounds that a hypothesis is an unclear distinction, and assigning a probability on an unclear distinction is an exercise that does not lead to clarity of action. This critique is of the theory itself and not any particular application of statistical hypothesis testing. A decision-making frame is proposed as a way of both addressing this critique and transcending ideological debates on probability. An example of a Bayesian decision-making approach is shown as an alternative to statistical hypothesis testing, utilizing data from a past clinical trial that studied the effect of Aspirin on heart attacks in a sample population of doctors. As a big reason for the prevalence of RCTs in academia is legislation requiring it, the ethics of legislating the use of statistical methods for clinical research is also examined.

  18. Bayesian Attractor Learning

    NASA Astrophysics Data System (ADS)

    Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory

    2016-04-01

    Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data. One can view these systems as parametrized dynamical systems. We address the question of learnability of dynamical systems with respect to both short term (vector field) and long term (attractor) behavior. In particular we are interested in learning in the imperfect model class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and we define a joint log-likelihood that consists of two terms, one is the vector field error and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (like so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.

  19. Integrated Bayesian Experimental Design

    NASA Astrophysics Data System (ADS)

    Fischer, R.; Dreier, H.; Dinklage, A.; Kurzan, B.; Pasch, E.

    2005-11-01

    Any scientist planning experiments wants to optimize the design of a future experiment with respect to best performance within the scheduled experimental scenarios. Bayesian Experimental Design (BED) aims at finding optimal experimental settings based on an information-theoretic utility function. Optimal design parameters are found by maximizing an expected utility function in which the future data and the parameters of the physical scenarios of interest are marginalized. The goal of the Integrated Bayesian Experimental Design (IBED) concept is to combine experiments as early as the design phase so that each can exploit the benefits of the others. The Bayesian Integrated Data Analysis (IDA) concept of linking interdependent measurements, to provide a validated database and to exploit synergetic effects, is used to design meta-diagnostics. An example is given by the Thomson scattering (TS) and interferometry (IF) diagnostics, considered individually and as a combined set. In finding the optimal experimental design for the meta-diagnostic comprising TS and IF, the strengths of both experiments can be combined to synergistically increase the reliability of the results.

  20. 75 FR 6209 - Guidance for Industry and Food and Drug Administration; Guidance for the Use of Bayesian...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-02-08

    ... Use of Bayesian Statistics in Medical Device Clinical Trials.'' This guidance summarizes FDA's current... device clinical trials. DATES: Submit electronic or written comments on agency guidances at any time... Use of Bayesian Statistics in Medical Device Clinical Trials'' to the Division of Small...

  1. Model parameter updating using Bayesian networks

    SciTech Connect

    Treml, C. A.; Ross, Timothy J.

    2004-01-01

    This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to that of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty, only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.

  2. Using Statistical Process Control Charts to Identify the Steroids Era in Major League Baseball: An Educational Exercise

    ERIC Educational Resources Information Center

    Hill, Stephen E.; Schvaneveldt, Shane J.

    2011-01-01

    This article presents an educational exercise in which statistical process control charts are constructed and used to identify the Steroids Era in American professional baseball. During this period (roughly 1993 until the present), numerous baseball players were alleged or proven to have used banned, performance-enhancing drugs. Also observed…

  3. Economic Statistical Design of integrated X-bar-S control chart with Preventive Maintenance and general failure distribution.

    PubMed

    Caballero Morales, Santiago Omar

    2013-01-01

    The application of Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices to achieve high product quality, small frequency of failures, and cost reduction in a production process. However there are some points that have not been explored in depth about its joint application. First, most SPC is performed with the X-bar control chart which does not fully consider the variability of the production process. Second, many studies of design of control charts consider just the economic aspect while statistical restrictions must be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates and reductions in the sampling frequency of units for testing under SPC.
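
    The X-bar-S chart at the core of this design uses standard control-limit factors; the sketch below derives them from the unbiasing constant c4 for a chosen subgroup size and applies them to invented subgroup data (statistical limits only, without the economic cost model of the paper).

```python
import numpy as np
from scipy.special import gammaln

rng = np.random.default_rng(7)

# Hypothetical process data: m subgroups of size n.
m, n = 25, 5
subgroups = rng.normal(10.0, 0.5, size=(m, n))

xbar = subgroups.mean(axis=1)
s = subgroups.std(axis=1, ddof=1)
xbar_bar, s_bar = xbar.mean(), s.mean()

# Unbiasing constant c4(n) = sqrt(2/(n-1)) * Gamma(n/2) / Gamma((n-1)/2).
c4 = np.sqrt(2.0 / (n - 1)) * np.exp(gammaln(n / 2) - gammaln((n - 1) / 2))

# Standard X-bar-S limit factors derived from c4.
A3 = 3.0 / (c4 * np.sqrt(n))
B3 = max(0.0, 1.0 - 3.0 * np.sqrt(1.0 - c4**2) / c4)
B4 = 1.0 + 3.0 * np.sqrt(1.0 - c4**2) / c4

print(f"X-bar chart: CL={xbar_bar:.3f}  LCL={xbar_bar - A3 * s_bar:.3f}  UCL={xbar_bar + A3 * s_bar:.3f}")
print(f"S chart    : CL={s_bar:.3f}  LCL={B3 * s_bar:.3f}  UCL={B4 * s_bar:.3f}")
```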

  4. Economic Statistical Design of Integrated X-bar-S Control Chart with Preventive Maintenance and General Failure Distribution

    PubMed Central

    Caballero Morales, Santiago Omar

    2013-01-01

    The application of Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices to achieve high product quality, small frequency of failures, and cost reduction in a production process. However there are some points that have not been explored in depth about its joint application. First, most SPC is performed with the X-bar control chart which does not fully consider the variability of the production process. Second, many studies of design of control charts consider just the economic aspect while statistical restrictions must be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates and reductions in the sampling frequency of units for testing under SPC. PMID:23527082

  5. Fieldcrest Cannon, Inc. Advanced Technical Preparation. Statistical Process Control (SPC). PRE-SPC 11: SPC & Graphs. Instructor Book.

    ERIC Educational Resources Information Center

    Averitt, Sallie D.

    This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…

  6. Economic Statistical Design of integrated X-bar-S control chart with Preventive Maintenance and general failure distribution.

    PubMed

    Caballero Morales, Santiago Omar

    2013-01-01

    The application of Preventive Maintenance (PM) and Statistical Process Control (SPC) are important practices to achieve high product quality, small frequency of failures, and cost reduction in a production process. However there are some points that have not been explored in depth about its joint application. First, most SPC is performed with the X-bar control chart which does not fully consider the variability of the production process. Second, many studies of design of control charts consider just the economic aspect while statistical restrictions must be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates and reductions in the sampling frequency of units for testing under SPC. PMID:23527082

  7. Bayesian methods for the design and analysis of noninferiority trials.

    PubMed

    Gamalo-Siebers, Margaret; Gao, Aijun; Lakshminarayanan, Mani; Liu, Guanghan; Natanegara, Fanni; Railkar, Radha; Schmidli, Heinz; Song, Guochen

    2016-01-01

    The gold standard for evaluating treatment efficacy of a medical product is a placebo-controlled trial. However, when the use of placebo is considered to be unethical or impractical, a viable alternative for evaluating treatment efficacy is through a noninferiority (NI) study where a test treatment is compared to an active control treatment. The minimal objective of such a study is to determine whether the test treatment is superior to placebo. An assumption is made that if the active control treatment remains efficacious, as was observed when it was compared against placebo, then a test treatment that has comparable efficacy with the active control, within a certain range, must also be superior to placebo. Because of this assumption, the design, implementation, and analysis of NI trials present challenges for sponsors and regulators. In designing and analyzing NI trials, substantial historical data are often required on the active control treatment and placebo. Bayesian approaches provide a natural framework for synthesizing the historical data in the form of prior distributions that can effectively be used in design and analysis of a NI clinical trial. Despite a flurry of recent research activities in the area of Bayesian approaches in medical product development, there are still substantial gaps in recognition and acceptance of Bayesian approaches in NI trial design and analysis. The Bayesian Scientific Working Group of the Drug Information Association provides a coordinated effort to target the education and implementation issues on Bayesian approaches for NI trials. In this article, we provide a review of both frequentist and Bayesian approaches in NI trials, and elaborate on the implementation for two common Bayesian methods including hierarchical prior method and meta-analytic-predictive approach. Simulations are conducted to investigate the properties of the Bayesian methods, and some real clinical trial examples are presented for illustration.
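
    A deliberately simplified sketch of the Bayesian NI idea (it omits the hierarchical and meta-analytic-predictive machinery discussed in the article): historical active-control information is folded into a Beta prior, both arms are updated with current-trial data, and the posterior probability that the test arm is no worse than the control by more than the margin is computed by simulation. All counts, the margin, and the prior are invented.

```python
import numpy as np

rng = np.random.default_rng(3)

# Invented current-trial data: responders / patients per arm, and the NI margin.
test_x, test_n = 82, 100
ctrl_x, ctrl_n = 85, 100
margin = 0.10

# Vague Beta(1, 1) prior for the test arm; a Beta prior for the control arm that loosely
# encodes hypothetical historical data (about 170 responders out of 200 past patients).
test_post = rng.beta(1 + test_x, 1 + test_n - test_x, size=200_000)
ctrl_post = rng.beta(1 + 170 + ctrl_x, 1 + 30 + ctrl_n - ctrl_x, size=200_000)

# Posterior probability of noninferiority: P(p_test > p_control - margin | data).
prob_ni = np.mean(test_post > ctrl_post - margin)
print(f"posterior probability of noninferiority = {prob_ni:.3f}")
```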

  8. Evaluating the performances of statistical and neural network based control charts

    NASA Astrophysics Data System (ADS)

    Teoh, Kok Ban; Ong, Hong Choon

    2015-10-01

    Control charts are used widely in many fields, but the traditional control chart is no longer adequate for detecting a sudden change in a process. Run rules built into the Shewhart X-bar control chart, as well as the Exponentially Weighted Moving Average (EWMA) control chart, the Cumulative Sum (CUSUM) control chart, and neural network based control charts, have been introduced to overcome this sensitivity limitation of the traditional chart. In this study, the average run length (ARL) and median run length (MRL) of these control charts are computed for shifts in the process mean. We show that interpretations based only on the ARL can be misleading; the MRL is therefore also used to evaluate the performance of the control charts. From this study, the neural network based control chart is found to perform better than the Shewhart X-bar chart with run rules and the EWMA and CUSUM control charts.
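
    ARL and MRL for a basic chart can be estimated by Monte Carlo simulation; the sketch below does this for a standardized Shewhart individuals chart with 3-sigma limits under several mean shifts (illustrative only; it does not reproduce the run rules or neural network charts of the study).

```python
import numpy as np

rng = np.random.default_rng(0)

def run_length(shift, limit=3.0, max_obs=100_000):
    """Observations until a standardized Shewhart chart signals, for a given mean shift."""
    for t in range(1, max_obs + 1):
        if abs(rng.normal(shift, 1.0)) > limit:
            return t
    return max_obs

def arl_mrl(shift, n_sim=5_000):
    rls = np.array([run_length(shift) for _ in range(n_sim)])
    return rls.mean(), np.median(rls)

for shift in (0.0, 0.5, 1.0, 2.0):
    arl, mrl = arl_mrl(shift)
    print(f"shift = {shift:3.1f} sigma:  ARL = {arl:7.1f}   MRL = {mrl:6.1f}")
```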

  9. Application of a statistical procedure for the control of yeast production

    SciTech Connect

    Michimasa, K.; Toshihiro, S.; Toshiomi, Y.; Hisaharu, T.

    1984-08-01

    In many fermentation processes, it is difficult to measure all of the state variables needed to indicate the culture state and to predict state changes. The use of time-series data of measurable state variables, in place of unmeasurable variables, is proposed to describe the fermentation process. A method for selecting the independent variables of the regression analysis, based on the statistical measures PSS and AIC, is employed. These statistical procedures were applied to fed-batch culture for yeast production under aerobic conditions, and their effectiveness was confirmed experimentally. 12 references.

  10. Bayesian Quantitative Electrophysiology and Its Multiple Applications in Bioengineering

    PubMed Central

    Barr, Roger C.; Nolte, Loren W.; Pollard, Andrew E.

    2014-01-01

    Bayesian interpretation of observations began in the early 1700s, and scientific electrophysiology began in the late 1700s. For two centuries these two fields developed mostly separately. In part that was because quantitative Bayesian interpretation, in principle a powerful method of relating measurements to their underlying sources, often required too many steps to be feasible with hand calculation in real applications. As computer power became widespread in the later 1900s, Bayesian models and interpretation moved rapidly but unevenly from the domain of mathematical statistics into applications. Use of Bayesian models now is growing rapidly in electrophysiology. Bayesian models are well suited to the electrophysiological environment, allowing a direct and natural way to express what is known (and unknown) and to evaluate which one of many alternatives is most likely the source of the observations, and the closely related receiver operating characteristic (ROC) curve is a powerful tool in making decisions. Yet, in general, many people would ask what such models are for, in electrophysiology, and what particular advantages such models provide. So to examine this question in particular, this review identifies a number of electrophysiological papers in bio-engineering arising from questions in several organ systems to see where Bayesian electrophysiological models or ROC curves were important to the results that were achieved. PMID:22275206

  11. Treatment of control data in lunar phototriangulation. [application of statistical procedures and development of mathematical and computer techniques

    NASA Technical Reports Server (NTRS)

    Wong, K. W.

    1974-01-01

    In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data on the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.

  12. Eliminating the influence of serial correlation on statistical process control charts using trend free pre-whitening (TFPW) method

    NASA Astrophysics Data System (ADS)

    Desa, Nor Hasliza Mat; Jemain, Abdul Aziz

    2013-11-01

    A key assumption of traditional statistical process control (SPC) techniques is that the observations, or time series data, are normally and independently distributed. The presence of serial autocorrelation results in a number of problems, including an increase in the Type I error rate and thereby an increase in the expected number of false alarms in the process observations. The independence assumption is often violated in practice because of serial correlation in the observations. The aim of this paper is therefore to demonstrate, using hospital admission data, the influence of serial correlation on statistical control charts. The trend-free pre-whitening (TFPW) method is applied as an alternative way to obtain a residual series whose values are statistically uncorrelated with each other. A data set of daily hospital admissions for respiratory and cardiovascular diseases covering the period from 1 January 2009 to 31 December 2009 (365 days) was used. Results showed that the TFPW method is a simple and useful way to remove the influence of serial correlation from the hospital admission data. It can be concluded that control charts based on the residual series perform better than those based on the original hospital admission series, which is affected by serial correlation.
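
    A minimal sketch of the general pre-whitening idea (detrend, estimate the lag-1 autocorrelation, remove it, then chart the residuals); the daily-admission series below is simulated, and the chart is a plain 3-sigma individuals chart rather than the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(11)

# Simulated daily admission counts with a linear trend and AR(1) serial correlation.
n_days = 365
t = np.arange(n_days)
noise = np.zeros(n_days)
for i in range(1, n_days):
    noise[i] = 0.6 * noise[i - 1] + rng.normal(0.0, 3.0)
admissions = 50 + 0.02 * t + noise

# Step 1: remove the linear trend (ordinary least squares here; Sen's slope is also common).
slope, intercept = np.polyfit(t, admissions, 1)
detrended = admissions - (intercept + slope * t)

# Step 2: estimate the lag-1 autocorrelation and pre-whiten.
r1 = np.corrcoef(detrended[:-1], detrended[1:])[0, 1]
residuals = detrended[1:] - r1 * detrended[:-1]

# Step 3: apply an individuals control chart to the (approximately uncorrelated) residuals.
center, sigma = residuals.mean(), residuals.std(ddof=1)
out_of_control = np.flatnonzero(np.abs(residuals - center) > 3 * sigma)
print(f"estimated lag-1 autocorrelation = {r1:.2f}; out-of-control days: {out_of_control}")
```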

  13. Computational Approaches for Developing Informative Prior Distributions for Bayesian Calibration of PBPK Models

    EPA Science Inventory

    Using Bayesian statistical methods to quantify uncertainty and variability in human PBPK model predictions for use in risk assessments requires prior distributions (priors), which characterize what is known or believed about parameters’ values before observing in vivo data. Expe...

  14. Computational Approaches for Developing Informative Prior Distributions for Bayesian Calibration of PBPK Models (Book Chapter)

    EPA Science Inventory

    Using Bayesian statistical methods to quantify uncertainty and variability in human physiologically-based pharmacokinetic (PBPK) model predictions for use in risk assessments requires prior distributions (priors), which characterize what is known or believed about parameters’ val...

  15. [Logistic regression against a divergent Bayesian network].

    PubMed

    Sánchez Trujillo, Noel Antonio

    2015-02-03

    This article is a discussion about two statistical tools used for prediction and causality assessment: logistic regression and Bayesian networks. Using data from a simulated example of a study assessing factors that might predict pulmonary emphysema (where fingertip pigmentation and smoking are considered), we posed the following questions. Is pigmentation a confounding, causal or predictive factor? Is there perhaps another factor, like smoking, that confounds? Is there a synergy between pigmentation and smoking? The results, in terms of prediction, are similar with the two techniques; regarding causation, differences arise. We conclude that, in decision-making, the combination of a statistical tool used with common sense and prior evidence, which may take years or even centuries to accumulate, is better than the automatic and exclusive use of statistical resources.

  16. Elite Athletes Refine Their Internal Clocks: A Bayesian Analysis.

    PubMed

    Chen, Yin-Hua; Verdinelli, Isabella; Cesari, Paola

    2016-07-01

    This paper carries out a full Bayesian analysis for a data set examined in Chen & Cesari (2015). These data were collected to assess people's ability to evaluate short intervals of time. Chen & Cesari (2015) showed evidence of the existence of two independent internal clocks for evaluating time intervals below and above the second. We reexamine the same question here by performing a complete Bayesian statistical analysis of the data. The Bayesian approach can be used to analyze these data thanks to the specific trial design. Data were obtained from evaluations of time ranges by two groups of individuals. More specifically, information gathered from a nontrained group (considered as baseline) allowed us to build a prior distribution for the parameter(s) of interest, and data from the trained group determined the likelihood function. This paper's main goals are (i) showing how the Bayesian inferential method can be used in statistical analyses and (ii) showing that the Bayesian methodology gives additional support to the findings presented in Chen & Cesari (2015) regarding the existence of two internal clocks in assessing duration of time intervals.

  17. The radiology task: Bayesian theory and perception.

    PubMed

    Donovan, T; Manning, D J

    2007-06-01

    The use of a Bayesian framework to understand how radiologists search images for pathology is important as it formalizes, mathematically, how visual and cognitive processes control eye movements by modelling the ideal searcher against which human performance can be compared. It is important that the interpretation of medical images is understood so that new developments in the ways images are presented and the use of image processing software are matched to human abilities and limitations.

  18. Model Criticism of Bayesian Networks with Latent Variables.

    ERIC Educational Resources Information Center

    Williamson, David M.; Mislevy, Robert J.; Almond, Russell G.

    This study investigated statistical methods for identifying errors in Bayesian networks (BN) with latent variables, as found in intelligent cognitive assessments. BN, commonly used in artificial intelligence systems, are promising mechanisms for scoring constructed-response examinations. The success of an intelligent assessment or tutoring system…

  19. Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization

    ERIC Educational Resources Information Center

    Gelman, Andrew; Lee, Daniel; Guo, Jiqiang

    2015-01-01

    Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models and can be called from the command line, R, Python, Matlab, or Julia and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…

  20. A Comparison of Imputation Methods for Bayesian Factor Analysis Models

    ERIC Educational Resources Information Center

    Merkle, Edgar C.

    2011-01-01

    Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amiable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…

  1. Augmenting Data with Published Results in Bayesian Linear Regression

    ERIC Educational Resources Information Center

    de Leeuw, Christiaan; Klugkist, Irene

    2012-01-01

    In most research, linear regression analyses are performed without taking into account published results (i.e., reported summary statistics) of similar previous studies. Although the prior density in Bayesian linear regression could accommodate such prior knowledge, formal models for doing so are absent from the literature. The goal of this…

  2. Model Comparison of Bayesian Semiparametric and Parametric Structural Equation Models

    ERIC Educational Resources Information Center

    Song, Xin-Yuan; Xia, Ye-Mao; Pan, Jun-Hao; Lee, Sik-Yum

    2011-01-01

    Structural equation models have wide applications. One of the most important issues in analyzing structural equation models is model comparison. This article proposes a Bayesian model comparison statistic, namely the "L[subscript nu]"-measure for both semiparametric and parametric structural equation models. For illustration purposes, we consider…

  3. Personalized Multi-Student Improvement Based on Bayesian Cybernetics

    ERIC Educational Resources Information Center

    Kaburlasos, Vassilis G.; Marinagi, Catherine C.; Tsoukalas, Vassilis Th.

    2008-01-01

    This work presents innovative cybernetics (feedback) techniques based on Bayesian statistics for drawing questions from an Item Bank towards personalized multi-student improvement. A novel software tool, namely "Module for Adaptive Assessment of Students" (or, "MAAS" for short), implements the proposed (feedback) techniques. In conclusion, a pilot…

  4. Searching Algorithm Using Bayesian Updates

    ERIC Educational Resources Information Center

    Caudle, Kyle

    2010-01-01

    In late October 1967, the USS Scorpion was lost at sea, somewhere between the Azores and Norfolk Virginia. Dr. Craven of the U.S. Navy's Special Projects Division is credited with using Bayesian Search Theory to locate the submarine. Bayesian Search Theory is a straightforward and interesting application of Bayes' theorem which involves searching…
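
    The Bayesian search update alluded to here (searching a cell and failing to find the target lowers that cell's probability and raises the others') is a one-line application of Bayes' theorem; a minimal grid sketch with invented prior and detection probabilities:

```python
import numpy as np

# Hypothetical search grid: prior probability the wreck lies in each cell, and the
# probability of detecting it during a search of the cell it actually occupies.
prior = np.array([0.30, 0.25, 0.20, 0.15, 0.10])
p_detect = np.array([0.8, 0.7, 0.6, 0.5, 0.4])

def update_after_failed_search(belief, p_detect, cell):
    """Posterior over cells after searching one cell and not finding the target."""
    posterior = belief.copy()
    # P(miss | target in searched cell) = 1 - p_detect; a miss elsewhere is uninformative.
    posterior[cell] *= 1.0 - p_detect[cell]
    return posterior / posterior.sum()

belief = prior
for step in range(3):
    cell = int(np.argmax(belief * p_detect))   # search where success is currently most likely
    belief = update_after_failed_search(belief, p_detect, cell)
    print(f"step {step + 1}: searched cell {cell}, posterior = {np.round(belief, 3)}")
```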

  5. Meteorological Data Assimilation by Adaptive Bayesian Optimization.

    NASA Astrophysics Data System (ADS)

    Purser, Robert James

    1992-01-01

    The principal aim of this research is the elucidation of the Bayesian statistical principles that underlie the theory of objective meteorological analysis. In particular, emphasis is given to aspects of data assimilation that can benefit from an iterative numerical strategy. Two such aspects that are given special consideration are statistical validation of the covariance profiles and nonlinear initialization. A new economic algorithm is presented, based on the imposition of a sparse matrix structure for all covariances and precisions held during the computations. It is shown that very large datasets may be accommodated using this structure and a good linear approximation to the analysis equations established without the need to unnaturally fragment the problem. Since the integrity of the system of analysis equations is preserved, it is a relatively straight-forward matter to extend the basic analysis algorithm to one that incorporates a check on the plausibility of the statistical model assumed for background errors--the so-called "validation" problem. Two methods of validation are described within the sparse matrix framework: the first is essentially a direct extension of the Bayesian principles to embrace, not only the regular analysis variables, but also the parameters that determine the precise form of the covariance functions; the second technique is the non-Bayesian method of generalized cross validation adapted for use within the sparse matrix framework. The later part of this study is concerned with the establishment of a consistent dynamical balance within a forecast model--the initialization problem. The formal principles of the modern theory of initialization are reviewed and a critical examination is made of the concept of the "slow manifold". It is demonstrated, in accordance with more complete nonlinear models, that even within a simple three-mode linearized system, the notion that a universal slow manifold exists is untenable. It is therefore argued

  6. PACE – the first placebo controlled trial of paracetamol for acute low back pain: statistical analysis plan

    PubMed Central

    2013-01-01

    Background Paracetamol (acetaminophen) is recommended in most clinical practice guidelines as the first choice of treatment for low back pain; however, there is limited evidence to support this recommendation. The PACE trial is the first placebo-controlled trial of paracetamol for acute low back pain. This article describes the statistical analysis plan. Results PACE is a randomized, double-dummy, placebo-controlled trial that investigates and compares the effect of paracetamol taken in two regimens for the treatment of low back pain. The protocol has been published. The analysis plan was completed blind to study group and finalized prior to initiation of analyses. All data collected as part of the trial were reviewed, without stratification by group, and classified by baseline characteristics, process of care and trial outcomes. Trial outcomes were classified as primary and secondary outcomes. Appropriate descriptive statistics and statistical testing of between-group differences, where relevant, have been planned and described. Conclusions A standard analysis plan was developed for the results of the PACE study. This plan comprehensively describes the data captured and the pre-determined statistical tests of the relevant outcome measures. The plan demonstrates transparent and verifiable use of the data collected. This a priori plan will be followed to ensure that rigorous standards of data analysis are adhered to. Trial registration Australia and New Zealand Clinical Trials Registry ACTRN12609000966291 PMID:23937999

  7. Adaptive Dynamic Bayesian Networks

    SciTech Connect

    Ng, B M

    2007-10-26

    A discrete-time Markov process can be compactly modeled as a dynamic Bayesian network (DBN)--a graphical model with nodes representing random variables and directed edges indicating causality between variables. Each node has a probability distribution, conditional on the variables represented by the parent nodes. A DBN's graphical structure encodes fixed conditional dependencies between variables. But in real-world systems, conditional dependencies between variables may be unknown a priori or may vary over time. Model errors can result if the DBN fails to capture all possible interactions between variables. Thus, we explore the representational framework of adaptive DBNs, whose structure and parameters can change from one time step to the next: a distribution's parameters and its set of conditional variables are dynamic. This work builds on recent work in nonparametric Bayesian modeling, such as hierarchical Dirichlet processes, infinite-state hidden Markov networks and structured priors for Bayes net learning. In this paper, we will explain the motivation for our interest in adaptive DBNs, show how popular nonparametric methods are combined to formulate the foundations for adaptive DBNs, and present preliminary results.
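
    A toy fixed-structure DBN helps make the contrast concrete. The two-variable model, its conditional probability tables and the simulation loop below are illustrative inventions; an adaptive DBN of the kind described here would instead re-estimate the parent sets and parameters (for example with nonparametric Bayesian priors) from one time step to the next rather than keep them fixed.

      import random

      # Toy two-variable dynamic Bayesian network: weather[t] -> weather[t+1],
      # weather[t] -> sprinkler[t]. CPTs are plain dictionaries keyed by parent values.
      P_WEATHER_NEXT = {  # P(weather[t+1] | weather[t])
          "sunny": {"sunny": 0.8, "rainy": 0.2},
          "rainy": {"sunny": 0.4, "rainy": 0.6},
      }
      P_SPRINKLER = {     # P(sprinkler[t] | weather[t])
          "sunny": {"on": 0.5, "off": 0.5},
          "rainy": {"on": 0.1, "off": 0.9},
      }

      def sample(dist):
          """Draw one value from a {value: probability} dictionary."""
          r, acc = random.random(), 0.0
          for value, p in dist.items():
              acc += p
              if r < acc:
                  return value
          return value

      def simulate(steps, weather="sunny"):
          """Roll the DBN forward; structure and parameters stay fixed in this sketch."""
          trajectory = []
          for _ in range(steps):
              sprinkler = sample(P_SPRINKLER[weather])
              trajectory.append((weather, sprinkler))
              weather = sample(P_WEATHER_NEXT[weather])
          return trajectory

      print(simulate(5))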

  8. Neural representation of swallowing is retained with age. A functional neuroimaging study validated by classical and Bayesian inference.

    PubMed

    Windel, Anne-Sophie; Mihai, Paul Glad; Lotze, Martin

    2015-06-01

    We investigated the neural representation of swallowing in two age groups for a total of 51 healthy participants (seniors: average age 64 years; young adults: average age 24 years) using high spatial resolution functional magnetic resonance imaging (fMRI). Two statistical comparisons (classical and Bayesian inference) revealed no significant differences between subject groups, apart from higher cortical activation for the seniors in the frontal pole 1 of Brodmann's Area 10 using Bayesian inference. Seniors vs. young participants showed longer reaction times and higher skin conductance response (SCR) during swallowing. We found a positive association of SCR and fMRI-activation only among seniors in areas processing sensorimotor performance, arousal and emotional perception. The results indicate that the highly automated swallowing network retains its functionality with age. However, seniors with higher SCR during swallowing appear to also engage areas involved in attention control and emotional regulation, possibly suggesting increased attention and emotional demands during task performance.

  9. A Bayesian Approach to Identifying New Risk Factors for Dementia

    PubMed Central

    Wen, Yen-Hsia; Wu, Shihn-Sheng; Lin, Chun-Hung Richard; Tsai, Jui-Hsiu; Yang, Pinchen; Chang, Yang-Pei; Tseng, Kuan-Hua

    2016-01-01

    Dementia is one of the most disabling and burdensome health conditions worldwide. In this study, we identified new potential risk factors for dementia from nationwide longitudinal population-based data by using Bayesian statistics. We first tested the consistency of the results obtained using Bayesian statistics with those obtained using classical frequentist probability for 4 recognized risk factors for dementia, namely severe head injury, depression, diabetes mellitus, and vascular diseases. Then, we used Bayesian statistics to verify 2 new potential risk factors for dementia, namely hearing loss and senile cataract, determined from Taiwan's National Health Insurance Research Database. We included a total of 6546 (6.0%) patients diagnosed with dementia. We observed older age, female sex, and lower income as independent risk factors for dementia. Moreover, we verified the 4 recognized risk factors for dementia in the older Taiwanese population; their odds ratios (ORs) ranged from 1.207 to 3.469. Furthermore, we observed that hearing loss (OR = 1.577) and senile cataract (OR = 1.549) were associated with an increased risk of dementia. We found that the results obtained using Bayesian statistics for assessing risk factors for dementia, such as head injury, depression, diabetes mellitus, and vascular diseases, were consistent with those obtained using classical frequentist probability. Moreover, hearing loss and senile cataract were found to be potential risk factors for dementia in the older Taiwanese population. Bayesian statistics could help clinicians explore other potential risk factors for dementia and develop appropriate treatment strategies for these patients. PMID:27227925

  10. A statistical learning strategy for closed-loop control of fluid flows

    NASA Astrophysics Data System (ADS)

    Guéniat, Florimond; Mathelin, Lionel; Hussaini, M. Yousuff

    2016-04-01

    This work discusses a closed-loop control strategy for complex systems utilizing scarce and streaming data. A discrete embedding space is first built using hash functions applied to the sensor measurements, from which a Markov process model is derived, approximating the complex system's dynamics. A control strategy is then learned using reinforcement learning once rewards relevant to the control objective are identified. This method is designed for experimental configurations, requiring no computations or prior knowledge of the system, and enjoys intrinsic robustness. It is illustrated on two systems: the control of the transitions of a Lorenz'63 dynamical system, and the control of the drag of a cylinder flow. The method is shown to perform well.
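
    A minimal sketch of this kind of pipeline, under assumed interfaces: sensor measurements are hashed into discrete states by coarse binning (standing in for the paper's hash functions), and a policy is learned with tabular Q-learning. The env_reset and env_step callbacks are hypothetical placeholders for the experiment or simulation being controlled.

      import random
      from collections import defaultdict

      def hash_state(measurements, n_bins=8, lo=-1.0, hi=1.0):
          """Map a continuous sensor vector to a discrete state by coarse binning."""
          return tuple(min(n_bins - 1, max(0, int((m - lo) / (hi - lo) * n_bins)))
                       for m in measurements)

      def q_learning(env_step, env_reset, actions, episodes=200,
                     alpha=0.1, gamma=0.95, eps=0.1):
          """Tabular Q-learning on the hashed state space.

          env_reset() -> measurements and env_step(action) -> (measurements, reward, done)
          are hypothetical callbacks for the plant or simulation being controlled."""
          Q = defaultdict(float)
          for _ in range(episodes):
              s, done = hash_state(env_reset()), False
              while not done:
                  # epsilon-greedy action selection
                  a = (random.choice(actions) if random.random() < eps
                       else max(actions, key=lambda act: Q[(s, act)]))
                  m, r, done = env_step(a)
                  s_next = hash_state(m)
                  best_next = max(Q[(s_next, act)] for act in actions)
                  # standard temporal-difference update
                  Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
                  s = s_next
          return Q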

  11. Statistical process control applied to the liquid-fed ceramic melter process

    SciTech Connect

    Pulsipher, B.A.; Kuhn, W.L.

    1987-09-01

    In this report, an application of control charts to the apparent feed composition of a Liquid-Fed Ceramic Melter (LFCM) is demonstrated by using results from a simulation of the LFCM system. Usual applications of control charts require the assumption of uncorrelated observations over time. This assumption is violated in the LFCM system because of the heels left in tanks from previous batches. Methods for dealing with this problem have been developed to create control charts for individual batches sent to the feed preparation tank (FPT). These control charts are capable of detecting changes in the process average as well as changes in the process variation. All numbers reported in this document were derived from a simulated demonstration of a plausible LFCM system. In practice, site-specific data must be used as input to a simulation tailored to that site. These data directly affect all variance estimates used to develop control charts. 64 refs., 3 figs., 2 tabs.
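
    The simplest version of such a chart is the Shewhart individuals chart sketched below. It assumes independent observations, which is precisely the assumption the report's correlated-batch treatment relaxes, and the feed-composition values are invented for illustration.

      import numpy as np

      def individuals_chart(x):
          """Shewhart individuals (X) chart with moving-range-based limits.

          Returns the center line, UCL/LCL, and indices of out-of-control points.
          d2 = 1.128 is the standard bias-correction constant for a moving range of 2."""
          x = np.asarray(x, dtype=float)
          mr = np.abs(np.diff(x))                    # moving ranges of successive batches
          sigma_hat = mr.mean() / 1.128              # estimated process standard deviation
          center = x.mean()
          ucl, lcl = center + 3 * sigma_hat, center - 3 * sigma_hat
          out = np.where((x > ucl) | (x < lcl))[0]
          return center, ucl, lcl, out

      # Hypothetical apparent feed-composition values (wt% of one oxide) per batch.
      batches = [12.1, 12.3, 11.9, 12.0, 12.2, 12.1, 12.0, 13.4]
      print(individuals_chart(batches))              # the last batch falls above the UCL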

  12. Bayesian Exchangeability, Benefit Transfer, and Research Efficiency

    NASA Astrophysics Data System (ADS)

    Atkinson, Scott E.; Crocker, Thomas D.; Shogren, Jason F.

    1992-03-01

    We offer an economic model of the policymaker's site- or time-specific benefit estimate extrapolation problem when she must weigh the potential gains from an increase in the accuracy and the precision of her agents' estimates against the costs of conducting their assessments. If Bayesian exchangeability is treated as a maintained hypothesis, we suggest that empirical Bayes estimators offer a powerful way to increase the economic efficiency of extrapolation. Finally, we employ a hedonic study of pollution control benefits to illustrate a Bayesian diagnostic that allows the hypothesis of exchangeability to be tested rather than taken as maintained. The power of the diagnostic arises from its ability to identify those sources of parameter variability most likely to discourage extrapolations.
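
    Under the exchangeability assumption, the empirical Bayes idea amounts to shrinking each site-specific estimate toward a pooled mean in proportion to its sampling noise. The sketch below uses a simple method-of-moments variant with invented benefit figures; it is not the paper's hedonic model or diagnostic.

      import numpy as np

      def empirical_bayes_shrinkage(estimates, std_errors):
          """Shrink site-specific estimates toward the pooled mean (exchangeability)."""
          y = np.asarray(estimates, float)
          v = np.asarray(std_errors, float) ** 2
          grand_mean = np.average(y, weights=1.0 / v)
          # Between-site variance: excess of observed spread over average sampling variance.
          tau2 = max(0.0, np.var(y, ddof=1) - v.mean())
          shrink = v / (v + tau2) if tau2 > 0 else np.ones_like(v)
          return shrink * grand_mean + (1.0 - shrink) * y

      # Hypothetical benefit estimates (e.g., $ per household) from three study sites.
      print(empirical_bayes_shrinkage([120.0, 95.0, 150.0], [20.0, 15.0, 30.0]))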

  13. Elements of Statistics

    NASA Astrophysics Data System (ADS)

    Grégoire, G.

    2016-05-01

    This chapter is devoted to two objectives. The first one is to answer the request expressed by attendees of the first Astrostatistics School (Annecy, October 2013) to be provided with an elementary vademecum of statistics that would facilitate understanding of the given courses. In this spirit we recall very basic notions, that is, definitions and properties that we think are sufficient to benefit from the courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, due to the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors, and Gaussian mixture models.

  14. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources.

    PubMed

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-01-01

    Young's double-slit or two-beam interference is of fundamental importance to understand various interference effects, in which the stationary phase difference between two beams plays the key role in the first-order coherence. Different from the case of first-order coherence, in the high-order optical coherence the statistic behavior of the optical phase will play the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we showed that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistic behavior of the relative phase difference between two point sources. Synchronous position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of point sources and M is an integer not larger than N. Interestingly, we found that the synchronous position Nth-order interference fringe fingerprints the statistic trace of random phase fluctuation of two classical point sources; therefore, it provides an effective way to characterize the statistic properties of phase fluctuation for incoherent light sources.

  15. Active control on high-order coherence and statistic characterization on random phase fluctuation of two classical point sources

    PubMed Central

    Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan

    2016-01-01

    Young’s double-slit or two-beam interference is of fundamental importance to understand various interference effects, in which the stationary phase difference between two beams plays the key role in the first-order coherence. Different from the case of first-order coherence, in the high-order optical coherence the statistic behavior of the optical phase will play the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we showed that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistic behavior of the relative phase difference between two point sources. Synchronous position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, in which λ is the wavelength of point sources and M is an integer not larger than N. Interestingly, we found that the synchronous position Nth-order interference fringe fingerprints the statistic trace of random phase fluctuation of two classical point sources; therefore, it provides an effective way to characterize the statistic properties of phase fluctuation for incoherent light sources. PMID:27021589

  16. Human Balance out of Equilibrium: Nonequilibrium Statistical Mechanics in Posture Control

    NASA Astrophysics Data System (ADS)

    Lauk, Michael; Chow, Carson C.; Pavlik, Ann E.; Collins, James J.

    1998-01-01

    During quiet standing, the human body sways in a stochastic manner. Here we show that the fluctuation-dissipation theorem can be applied to the human postural control system. That is, the dynamic response of the postural system to a weak mechanical perturbation can be predicted from the fluctuations exhibited by the system under quasistatic conditions. We also show that the estimated correlation and response functions can be described by a simple stochastic model consisting of a pinned polymer. These findings suggest that the postural control system utilizes the same control mechanisms under quiet-standing and dynamic conditions.

  17. Advanced statistical process control of a chemical vapor tungsten deposition process on an Applied Materials Centura reactor

    NASA Astrophysics Data System (ADS)

    Stefani, Jerry A.; Poarch, Scott; Saxena, Sharad; Mozumder, P. K.

    1994-09-01

    An advanced multivariable off-line process control system, which combines traditional Statistical Process Control (SPC) with feedback control, has been applied to the CVD tungsten process on an Applied Materials Centura reactor. The goal of the model-based controller is to compensate for shifts in the process and maintain the wafer state responses on target. In the present application the controller employs measurements made on test wafers by off-line metrology tools to track the process behavior. This is accomplished by using model-based SPC, which compares the measurements with predictions obtained from empirically-derived process models. For CVD tungsten, a physically-based modeling approach was employed based on the kinetically-limited H2 reduction of WF6. On detecting a statistically significant shift in the process, the controller calculates adjustments to the settings to bring the process responses back on target. To achieve this, a few additional test wafers are processed at slightly different settings than the nominal. This local experiment allows the models to be updated to reflect the current process performance. The model updates are expressed as multiplicative or additive changes in the process inputs and a change in the model constant. This approach for model updating not only tracks the present process/equipment state, but it also provides some diagnostic capability regarding the cause of the process shift. The updated models are used by an optimizer to compute new settings to bring the responses back to target. The optimizer is capable of incrementally entering controllables into the strategy, reflecting the degree to which the engineer desires to manipulate each setting. The capability of the controller to compensate for shifts in the CVD tungsten process has been demonstrated. Targets for film bulk resistivity and deposition rate were maintained while satisfying constraints on film stress and WF6 conversion efficiency.

  18. Bayesian inference in geomagnetism

    NASA Technical Reports Server (NTRS)

    Backus, George E.

    1988-01-01

    The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the earth core-mantle boundary from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.

  19. Statistical model and control of a ring-and-roller type grinding mill

    SciTech Connect

    Cho, S.

    1992-01-01

    This project explores the use of a ring-and-roller grinding mill for fine grinding or pulverizing. Since the efficiency of this system is low, small improvements in product throughput or energy consumption can result in substantial economic improvements. To better understand this system, experiments were conducted for limestone and cement clinker. From these experiments models were formulated for product throughput, energy consumption and product size distribution. A control loop was implemented to reduce the effect of non-uniform feeding. Steady-state interactions were analyzed using the Relative Gain Array (RGA) method. Next, a multi-input and multi-output (MIMO) control system was designed using the Inverse Nyquist Array (INA) method to remove variable interactions. The robustness of this controller was explored. Finally, a self-tuning PI controller was designed using the pole placement method to improve system performance. The results of this study provide models for implementation on ring-and-roller grinding mills.

  20. The utilization of six sigma and statistical process control techniques in surgical quality improvement.

    PubMed

    Sedlack, Jeffrey D

    2010-01-01

    Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases, and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hr in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are both expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. PMID:20946422

  1. [Prudent use price controls in Chinese medicines market: based on statistical data analysis].

    PubMed

    Yang, Guang; Wang, Nuo; Huang, Lu-Qi; Qiu, Hong-Yan; Guo, Lan-Ping

    2014-01-01

    A dispute about the price-reduction policy for traditional Chinese medicine (TCM) has recently arisen. This article analyzes Chinese statistical data for 1995-2011; the results show that the main responsibility for expensive health care has no direct relationship with drug prices. The price index of TCM rose significantly more slowly than overall medicine prices, the production margins of TCM, which are affected by raw material prices, have been diminishing since 1995, and continued price reductions will further depress the profits of the TCM industry. Because raw materials of varying quality differ greatly in price, forcing medicine prices down will push enterprises to use inferior materials in order to maintain corporate profits. These results provide guidance for medicine price management. PMID:24754184

  2. Statistical Comparison of Far-Field Noise Events in a Controlled Flow Ma = 0.6 Jet

    NASA Astrophysics Data System (ADS)

    Freedland, Graham; Lewalle, Jacques

    2013-11-01

    We compare distributions of acoustic events in controlled and uncontrolled high speed jets. By examining far-field acoustic signals from three microphones and using continuous wavelets, sources of noise can be identified through cross-correlation of the different far-field signals. From the events found, four properties (wavelet magnitude, Strouhal number and lags between two pairs of microphones) were tabulated. Each test gives over 10,000 events, which were sorted into histograms that approximate the statistical distributions of properties. This is used to determine what influence the addition of synthetic jet actuators has on the properties of the flow of the jet. A qualitative analysis of the distributions using quantile-quantile plots helps in the understanding of the distributions of sources. A quantitative analysis using the Anderson-Darling and Kolmogorov-Smirnov tests establishes statistically significant differences between the baseline and control cases. The authors thank Dr. Mark Glauser, Dr. Kerwin Low and the Syracuse Jet Group for the use of their data, Professor Dongliang Wang of Upstate Medical University for his suggestion of statistical methods, and Spectral Energies LLC (through an SBIR grant from AFRL) for their support.

  3. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of Bayesian statistics for inference in SFA over traditional SFA, which uses classical statistics alone. The Bayesian methods overcome some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we find that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the large advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services.

  4. Estimating size and scope economies in the Portuguese water sector using the Bayesian stochastic frontier analysis.

    PubMed

    Carvalho, Pedro; Marques, Rui Cunha

    2016-02-15

    This study searches for economies of size and scope in the Portuguese water sector, applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). The study demonstrates the usefulness and advantages of Bayesian statistics for inference in SFA over traditional SFA, which uses classical statistics alone. The Bayesian methods overcome some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we find that there are important economies of output density, economies of size, economies of vertical integration and economies of scope in the Portuguese water sector, pointing to the large advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. PMID:26674686

  5. Bayesian estimation of isotopic age differences

    SciTech Connect

    Curl, R.L.

    1988-08-01

    Isotopic dating is subject to uncertainties arising from counting statistics and experimental errors. These uncertainties are additive when an isotopic age difference is calculated. If large, they can lead to no significant age difference by classical statistics. In many cases, relative ages are known because of stratigraphic order or other clues. Such information can be used to establish a Bayes estimate of age difference which will include prior knowledge of age order. Age measurement errors are assumed to be log-normal and a noninformative but constrained bivariate prior for two true ages in known order is adopted. True-age ratio is distributed as a truncated log-normal variate. Its expected value gives an age-ratio estimate, and its variance provides credible intervals. Bayesian estimates of ages are different and in correct order even if measured ages are identical or reversed in order. For example, age measurements on two samples might both yield 100 ka with coefficients of variation of 0.2. Bayesian estimates are 22.7 ka for age difference with a 75% credible interval of (4.4, 43.7) ka.
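
    A crude Monte Carlo version of this idea can be sketched as follows: measurement errors are treated as log-normal, the known age order is imposed by rejecting draws that violate it, and the sample values mirror the abstract's example. This is only a stand-in for the paper's analytic treatment of the truncated log-normal true-age ratio.

      import numpy as np

      rng = np.random.default_rng(0)

      def age_difference_posterior(age1, age2, cv=0.2, n=200_000):
          """Monte Carlo sketch of an order-constrained age-difference estimate."""
          sigma = np.sqrt(np.log(1.0 + cv**2))        # log-normal shape from the CV
          t1 = age1 * rng.lognormal(0.0, sigma, n)
          t2 = age2 * rng.lognormal(0.0, sigma, n)
          keep = t2 > t1                              # enforce known stratigraphic order
          diff = t2[keep] - t1[keep]
          # Mean age difference and a central 75% credible interval.
          return np.mean(diff), np.percentile(diff, [12.5, 87.5])

      # Both measurements give 100 ka with 20% coefficients of variation, as in the abstract.
      print(age_difference_posterior(100.0, 100.0))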

  6. Bayesian inference and Markov chain Monte Carlo in imaging

    NASA Astrophysics Data System (ADS)

    Higdon, David M.; Bowsher, James E.

    1999-05-01

    Over the past 20 years, many problems in Bayesian inference that were previously intractable have become fairly routine to deal with using a computationally intensive technique for exploring the posterior distribution called Markov chain Monte Carlo (MCMC). Primarily because of insufficient computing capabilities, most MCMC applications have been limited to rather standard statistical models. However, with the computing power of modern workstations, a fully Bayesian approach with MCMC is now possible for many imaging applications. Such an approach can be quite useful because it leads not only to 'point' estimates of an underlying image or emission source, but it also gives a means for quantifying uncertainties regarding the image. This paper gives an overview of Bayesian image analysis and focuses on applications relevant to medical imaging. Particular focus is on prior image models and on outlining MCMC methods for these models.

  7. Bayesian failure probability model sensitivity study. Final report

    SciTech Connect

    Not Available

    1986-05-30

    The Office of the Manager, National Communications System (OMNCS) has developed a system-level approach for estimating the effects of High-Altitude Electromagnetic Pulse (HEMP) on the connectivity of telecommunications networks. This approach incorporates a Bayesian statistical model which estimates the HEMP-induced failure probabilities of telecommunications switches and transmission facilities. The purpose of this analysis is to address the sensitivity of the Bayesian model. This is done by systematically varying two model input parameters--the number of observations, and the equipment failure rates. Throughout the study, a non-informative prior distribution is used. The sensitivity of the Bayesian model to the noninformative prior distribution is investigated from a theoretical mathematical perspective.
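
    The qualitative effect of the number of observations on a Bayesian failure-probability estimate can be illustrated with a conjugate Beta-Binomial toy model under a Jeffreys prior. This is only a stand-in for the OMNCS model, whose structure is not reproduced here, and the counts are invented.

      from scipy import stats

      def failure_probability_posterior(failures, trials, a=0.5, b=0.5):
          """Posterior for a failure probability under a (near) non-informative Beta prior.

          a = b = 0.5 is the Jeffreys prior; the conjugate posterior is
          Beta(a + failures, b + trials - failures)."""
          post = stats.beta(a + failures, b + trials - failures)
          return post.mean(), post.interval(0.95)

      # Same observed failure rate (20%), increasing numbers of observations:
      # the posterior mean stays near 0.2 while the 95% interval tightens.
      for trials in (5, 50, 500):
          print(trials, failure_probability_posterior(failures=trials // 5, trials=trials))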

  8. Development of Statistical Process Control Methodology for an Environmentally Compliant Surface Cleaning Process in a Bonding Laboratory

    NASA Technical Reports Server (NTRS)

    Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.

    1997-01-01

    Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC) including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-91 3NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26 inches Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995, subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor

  9. A Novel Hybrid Statistical Particle Swarm Optimization for Multimodal Functions and Frequency Control of Hybrid Wind-Solar System

    NASA Astrophysics Data System (ADS)

    Verma, Harish Kumar; Jain, Cheshta

    2016-09-01

    In this article, a hybrid algorithm of particle swarm optimization (PSO) with statistical parameter (HSPSO) is proposed. Basic PSO for shifted multimodal problems have low searching precision due to falling into a number of local minima. The proposed approach uses statistical characteristics to update the velocity of the particle to avoid local minima and help particles to search global optimum with improved convergence. The performance of the newly developed algorithm is verified using various standard multimodal, multivariable, shifted hybrid composition benchmark problems. Further, the comparative analysis of HSPSO with variants of PSO is tested to control frequency of hybrid renewable energy system which comprises solar system, wind system, diesel generator, aqua electrolyzer and ultra capacitor. A significant improvement in convergence characteristic of HSPSO algorithm over other variants of PSO is observed in solving benchmark optimization and renewable hybrid system problems.

  10. Identifying the controls of wildfire activity in Namibia using multivariate statistics

    NASA Astrophysics Data System (ADS)

    Mayr, Manuel; Le Roux, Johan; Samimi, Cyrus

    2015-04-01

    data mining techniques to select a conceivable set of variables by their explanatory value and to remove redundancy. We will then apply two multivariate statistical methods suitable to a large variety of data types and frequently used for (non-linear) causative factor identification: Non-metric Multidimensional Scaling (NMDS) and Regression Trees. The assumed value of these analyses is i) to determine the most important predictor variables of fire activity in Namibia, ii) to decipher their complex interactions in driving fire variability in Namibia, and iii) to compare the performance of two state-of-the-art statistical methods. References: Le Roux, J. (2011): The effect of land use practices on the spatial and temporal characteristics of savanna fires in Namibia. Doctoral thesis at the University of Erlangen-Nuremberg/Germany - 155 pages.

  11. Bayesian networks as a tool for epidemiological systems analysis

    NASA Astrophysics Data System (ADS)

    Lewis, F. I.

    2012-11-01

    Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but additionally, and empirically, to separate these into those directly and indirectly dependent on one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.

  12. Cortical hierarchies perform Bayesian causal inference in multisensory perception.

    PubMed

    Rohe, Tim; Noppeney, Uta

    2015-02-01

    To form a veridical percept of the environment, the brain needs to integrate sensory signals from a common source but segregate those from independent sources. Thus, perception inherently relies on solving the "causal inference problem." Behaviorally, humans solve this problem optimally as predicted by Bayesian Causal Inference; yet, the underlying neural mechanisms are unexplored. Combining psychophysics, Bayesian modeling, functional magnetic resonance imaging (fMRI), and multivariate decoding in an audiovisual spatial localization task, we demonstrate that Bayesian Causal Inference is performed by a hierarchy of multisensory processes in the human brain. At the bottom of the hierarchy, in auditory and visual areas, location is represented on the basis that the two signals are generated by independent sources (= segregation). At the next stage, in posterior intraparietal sulcus, location is estimated under the assumption that the two signals are from a common source (= forced fusion). Only at the top of the hierarchy, in anterior intraparietal sulcus, the uncertainty about the causal structure of the world is taken into account and sensory signals are combined as predicted by Bayesian Causal Inference. Characterizing the computational operations of signal interactions reveals the hierarchical nature of multisensory perception in human neocortex. It unravels how the brain accomplishes Bayesian Causal Inference, a statistical computation fundamental for perception and cognition. Our results demonstrate how the brain combines information in the face of uncertainty about the underlying causal structure of the world.
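
    The Bayesian Causal Inference computation referred to here is commonly written as posterior model averaging over the common-cause (C = 1) and independent-cause (C = 2) hypotheses. The equations below give the generic form for an auditory location estimate, with x_A and x_V the auditory and visual signals; the notation is assumed rather than copied from the paper.

      \[
      P(C{=}1 \mid x_A, x_V) \;=\;
      \frac{p(x_A, x_V \mid C{=}1)\,p(C{=}1)}
           {p(x_A, x_V \mid C{=}1)\,p(C{=}1) + p(x_A, x_V \mid C{=}2)\,p(C{=}2)},
      \]
      \[
      \hat{S}_A \;=\; P(C{=}1 \mid x_A, x_V)\,\hat{S}_{\mathrm{fused}}
      \;+\; \bigl(1 - P(C{=}1 \mid x_A, x_V)\bigr)\,\hat{S}_{A,\mathrm{segregated}} .
      \]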

  13. Searching chemical space with the Bayesian Idea Generator.

    PubMed

    van Hoorn, Willem P; Bell, Andrew S

    2009-10-01

    The Pfizer Global Virtual Library (PGVL) is defined as a set of compounds that could be synthesized using validated protocols and monomers. However, it is too large (10^12 compounds) to search by brute-force methods for close analogues of a given input structure. In this paper the Bayesian Idea Generator is described, which is based on a novel application of Bayesian statistics to narrow down the search space to a prioritized set of existing library arrays (the default is 16). For each of these libraries the 6 closest neighbors are retrieved from the existing compound file, resulting in a screenable hypothesis of 96 compounds. Using the Bayesian models for library space, the Pfizer file of singleton compounds has been mapped to library space and is optionally searched as well. The method is >99% accurate in retrieving known library provenance from an independent test set. The compounds retrieved strike a balance between similarity and diversity, resulting in frequent scaffold hops. Four examples of how the Bayesian Idea Generator has been successfully used in drug discovery are provided. The methodology of the Bayesian Idea Generator can be used for any collection of compounds containing distinct clusters, and an example using compound vendor catalogues has been included.

  14. Evaluation of Bayesian network to classify clustered microcalcifications

    NASA Astrophysics Data System (ADS)

    Patrocinio, Ana C.; Schiabel, Homero; Romero, Roseli A. F.

    2004-05-01

    The purpose of this work is the evaluation and analysis of Bayesian network models for classifying clusters of microcalcifications, in order to supply a second opinion to specialists in the detection of breast diseases by mammography. Bayesian networks are statistical techniques which provide explanations about the inferences and influences among the features and classes of a given problem. Investigating this technique will therefore help in obtaining more detailed diagnostic information in a CAD scheme. From regions of interest (ROIs) containing clusters of microcalcifications, a detailed pixel-by-pixel image analysis was performed; in this step, shape descriptors were extracted using geometric measures (Hu invariant moments, second- and third-order moments, and radius of gyration), together with an irregularity measure, compactness, area and perimeter. Using software for constructing Bayesian network models, different Bayesian network classifier models were generated with the extracted features as inputs, and tests were performed to build the classifier and to verify the behavior and probabilistic influences of the features. The validation results of the generated network models correspond to an average of 10 tests made with 6 different database sub-groups. The first validation results showed 83.17% correct classifications.

  15. On the Adaptive Control of the False Discovery Rate in Multiple Testing with Independent Statistics.

    ERIC Educational Resources Information Center

    Benjamini, Yoav; Hochberg, Yosef

    2000-01-01

    Presents an adaptive approach to multiple significance testing based on the procedure of Y. Benjamini and Y. Hochberg (1995) that first estimates the number of true null hypotheses and then uses that estimate in the Benjamini and Hochberg procedure. Uses the new procedure in examples from educational and behavioral studies and shows its control of…
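
    A compact sketch of the two-stage idea (estimate the number of true nulls, then run Benjamini-Hochberg at the correspondingly relaxed level) is given below. The m0 estimator shown is a simple lambda-threshold version chosen for brevity, not the specific estimator of Benjamini and Hochberg (2000), and the p-values are simulated.

      import numpy as np

      def adaptive_bh(pvalues, q=0.05):
          """Two-stage adaptive Benjamini-Hochberg sketch; returns the rejection cutoff."""
          p = np.sort(np.asarray(pvalues, float))
          m = len(p)
          lam = 0.5
          # Stage 1: estimate the number of true null hypotheses.
          m0_hat = max(1, min(m, int(np.ceil((p > lam).sum() / (1.0 - lam)))))
          # Stage 2: BH step-up with m0_hat in place of m.
          thresh = q * np.arange(1, m + 1) / m0_hat
          below = np.nonzero(p <= thresh)[0]
          if below.size == 0:
              return 0.0                              # reject nothing
          return p[below.max()]                       # reject all p-values at or below this

      # Hypothetical p-values: a few small signals among many uniform nulls.
      rng = np.random.default_rng(1)
      pvals = np.concatenate([rng.uniform(0, 0.002, 5), rng.uniform(0, 1, 95)])
      print(adaptive_bh(pvals, q=0.05))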

  16. A New Method for the Statistical Control of Rating Error in Performance Ratings.

    ERIC Educational Resources Information Center

    Bannister, Brendan D.; And Others

    1987-01-01

    To control for response bias in student ratings of college teachers, an index of rater error was used that was theoretically independent of actual performance. Partialing out the effects of this extraneous response bias enhanced validity, but partialing out overall effectiveness resulted in reduced convergent and discriminant validities.…

  17. Analyzing Data from a Pretest-Posttest Control Group Design: The Importance of Statistical Assumptions

    ERIC Educational Resources Information Center

    Zientek, Linda; Nimon, Kim; Hammack-Brown, Bryn

    2016-01-01

    Purpose: Among the gold standards in human resource development (HRD) research are studies that test theoretically developed hypotheses and use experimental designs. A somewhat typical experimental design would involve collecting pretest and posttest data on individuals assigned to a control or experimental group. Data from such a design that…

  18. Sparsity and the Bayesian perspective

    NASA Astrophysics Data System (ADS)

    Starck, J.-L.; Donoho, D. L.; Fadili, M. J.; Rassat, A.

    2013-04-01

    Sparsity has recently been introduced in cosmology for weak-lensing and cosmic microwave background (CMB) data analysis for different applications such as denoising, component separation, or inpainting (i.e., filling the missing data or the mask). Although it gives very nice numerical results, CMB sparse inpainting has been severely criticized by top researchers in cosmology using arguments derived from a Bayesian perspective. In an attempt to understand their point of view, we realize that interpreting a regularization penalty term as a prior in a Bayesian framework can lead to erroneous conclusions. This paper is by no means against the Bayesian approach, which has proven to be very useful for many applications, but warns against a Bayesian-only interpretation in data analysis, which can be misleading in some cases.

  19. Statistical Modelling for Controlled Drug Delivery Systems and its Applications in HPMC based Hydrogels

    NASA Astrophysics Data System (ADS)

    Ghosal, Kajal; Chandra, Aniruddha

    2010-10-01

    Different concentrations of hydrophobically modified hydroxypropyl methylcellulose (HPMC, 60 M grade) and conventional hydrophilic hydroxypropyl methylcellulose (50 cPs) were used to prepare four topical hydrogel formulations using a model non-steroidal anti-inflammatory drug (NSAID), diclofenac potassium (DP). For all the formulations, the suitability of different common empirical (zero-order, first-order, and Higuchi), semi-empirical (Ritger-Peppas and Peppas-Sahlin), and some new statistical (logistic, log-logistic, Weibull, Gumbel, and generalized extreme value distribution) models to describe the drug release profile were tested through non-linear least-squares curve fitting. The general-purpose mathematical analysis tool MATLAB was used for this purpose. Further, instead of the widely used transformed linear fit method, direct fitting was used in the paper to avoid any sort of truncation and transformation errors. The results revealed that the log-logistic distribution, among all the models that were investigated, was the best fit for the hydrophobic formulations. For the hydrophilic cases, the semi-empirical models and Weibull distribution worked best, although the log-logistic also showed a close fit.
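
    An analogous direct (non-linearized) fit can be reproduced in a few lines of Python. The Weibull release model, the sampling times and the release percentages below are invented stand-ins; the paper's own fitting was done in MATLAB across a wider set of models.

      import numpy as np
      from scipy.optimize import curve_fit

      def weibull_release(t, f_max, t_scale, beta):
          """Weibull cumulative-release model: F(t) = f_max * (1 - exp(-(t/t_scale)^beta))."""
          return f_max * (1.0 - np.exp(-(t / t_scale) ** beta))

      # Hypothetical cumulative % drug released at sampling times (hours).
      t = np.array([0.5, 1, 2, 4, 6, 8, 10, 12], dtype=float)
      released = np.array([8, 15, 27, 45, 58, 68, 75, 80], dtype=float)

      # Direct non-linear least-squares fit (no linearizing transformation).
      params, _ = curve_fit(weibull_release, t, released, p0=[90.0, 5.0, 1.0])
      print(dict(zip(["f_max", "t_scale", "beta"], params)))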

  20. Statistical decision theory and trade-offs in the control of motor response.

    PubMed

    Trommershäuser, Julia; Maloney, Laurence T; Landy, Michael S

    2003-01-01

    We present a novel approach to the modeling of motor responses based on statistical decision theory. We begin with the hypothesis that subjects are ideal motion planners who choose movement trajectories to minimize expected loss. We derive predictions of the hypothesis for movement in environments where contact with specified regions carries rewards or penalties. The model predicts shifts in a subject's aiming point in response to changes in the reward and penalty structure of the environment and with changes in the subject's uncertainty in carrying out planned movements. We tested some of these predictions in an experiment where subjects were rewarded if they succeeded in touching a target region on a computer screen within a specified time limit. Near the target was a penalty region which, if touched, resulted in a penalty. We varied distance between the penalty region and the target and the cost of hitting the penalty region. Subjects shift their mean points of contact with the computer screen in response to changes in penalties and location of the penalty region relative to the target region in qualitative agreement with the predictions of the hypothesis. Thus, movement planning takes into account extrinsic costs and the subject's own motor uncertainty.
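
    The expected-loss calculation behind these predictions can be sketched in one dimension: with Gaussian endpoint variability, the planner scores each candidate aim point by the reward-weighted probability of landing in the target minus the penalty-weighted probability of landing in the penalty region. The interval positions, sigma, reward and penalty values below are hypothetical, and the grid search is only a stand-in for the paper's analysis.

      import numpy as np
      from scipy.stats import norm

      def expected_gain(aim, sigma, target, penalty, reward=100.0, cost=-500.0):
          """Expected gain for a 1-D aim point with Gaussian endpoint variability.

          target and penalty are (lo, hi) intervals on the screen, assumed disjoint."""
          p_target = norm.cdf(target[1], aim, sigma) - norm.cdf(target[0], aim, sigma)
          p_penalty = norm.cdf(penalty[1], aim, sigma) - norm.cdf(penalty[0], aim, sigma)
          return reward * p_target + cost * p_penalty

      # Penalty region just to the left of the target; motor noise sigma in mm.
      target, penalty, sigma = (0.0, 9.0), (-9.0, 0.0), 4.0
      aims = np.linspace(-5.0, 15.0, 401)
      best = aims[np.argmax([expected_gain(a, sigma, target, penalty) for a in aims])]
      print(f"optimal aim point: {best:.2f} mm (shifted away from the penalty region)")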

  1. Controlling False Discovery Rate in Signal Space for Transformation-Invariant Thresholding of Statistical Maps.

    PubMed

    Li, Junning; Shi, Yonggang; Toga, Arthur W

    2015-01-01

    Thresholding statistical maps with appropriate correction for multiple testing remains a critical and challenging problem in brain mapping. Since the false discovery rate (FDR) criterion was introduced to the neuroimaging community a decade ago, various improvements have been proposed. However, a highly desirable feature, transformation invariance, has not been adequately addressed, especially for voxel-based FDR. Thresholding applied after spatial transformation is not necessarily equivalent to transformation applied after thresholding in the original space. We find this problem closely related to another important issue: spatial correlation of signals. A Gaussian random vector-valued image after normalization is a random map from a Euclidean space to a high-dimensional unit sphere. Instead of defining the FDR measure in the image's Euclidean space, we define it in the signals' hyper-spherical space, whose measure not only reflects the intrinsic "volume" of the signals' randomness but also remains invariant under the images' spatial transformation. Experiments with synthetic and real images demonstrate that our method achieves transformation invariance and significantly minimizes the bias introduced by the choice of template images.

  2. Statistical tools and control of internal lubricant content of inhalation grade HPMC capsules during manufacture.

    PubMed

    Ayala, Guillermo; Díez, Fernando; Gassó, María T; Jones, Brian E; Martín-Portugués, Rafael; Ramiro-Aparicio, Juan

    2016-04-30

    The internal lubricant content (ILC) of inhalation-grade HPMC capsules is a key factor in ensuring good powder release when the patient inhales a medicine from a dry powder inhaler (DPI). Powder release from capsules has been shown to be influenced by the ILC; the characteristics used to measure this are the emitted dose, fine particle fraction and mass median aerodynamic diameter. In addition, the ILC level is critical for capsule shell manufacture because the lubricant is an essential part of a process that cannot work without it. A designed experiment was applied to the manufacture of inhalation capsules with the required ILC. A full factorial model was used to identify the controlling factors, and from this a linear model has been proposed to improve control of the process. PMID:26899981

  3. Social Self-Control Is a Statistically Nonredundant Correlate of Adolescent Substance Use.

    PubMed

    Sussman, Steve; Chou, Chih-Ping; Pang, Raina D; Kirkpatrick, Matthew; Guillot, Casey R; Stone, Matthew; Khoddam, Rubin; Riggs, Nathaniel R; Unger, Jennifer B; Leventhal, Adam M

    2016-05-11

    The social self-control scale (SSCS), which taps provocative behavior in social situations, was compared with five potentially overlapping measures (i.e., temperament-related impulsivity, psychomotor agitation-related self-control, perceived social competence, and rash action in response to negative and positive affectively charged states) as correlates of tobacco use and other drug use among a sample of 3,356 ninth-grade youth in Southern California high schools. While there was a lot of shared variance among the measures, the SSCS was incrementally associated with both categories of drug use over and above alternate constructs previously implicated in adolescent drug use. Hence, SSC may relate to adolescent drug use through an etiological pathway unique from other risk constructs. Given that youth who tend to alienate others through provocative social behavior are at risk for multiple drug use, prevention programming to modify low SSC may be warranted.

  4. Diffusion-controlled electron transfer processes and power-law statistics of fluorescence intermittency of nanoparticles.

    PubMed

    Tang, Jau; Marcus, R A

    2005-09-01

    A mechanism involving diffusion-controlled electron transfer processes in Debye and non-Debye dielectric media is proposed to elucidate the power-law distribution for the lifetime of a blinking quantum dot. This model leads to two complementary regimes of power law with a sum of the exponents equal to 2, and to a specific value for the exponent in terms of a distribution of the diffusion correlation times. It also links the exponential bending tail with energetic and kinetic parameters.

  5. BATSE gamma-ray burst line search. 2: Bayesian consistency methodology

    NASA Technical Reports Server (NTRS)

    Band, D. L.; Ford, L. A.; Matteson, J. L.; Briggs, M.; Paciesas, W.; Pendleton, G.; Preece, R.; Palmer, D.; Teegarden, B.; Schaefer, B.

    1994-01-01

    We describe a Bayesian methodology to evaluate the consistency between the reported Ginga and Burst and Transient Source Experiment (BATSE) detections of absorption features in gamma-ray burst spectra. Currently no features have been detected by BATSE, but this methodology will still be applicable if and when such features are discovered. The Bayesian methodology permits the comparison of hypotheses regarding the two detectors' observations and makes explicit the subjective aspects of our analysis (e.g., the quantification of our confidence in detector performance). We also present non-Bayesian consistency statistics. Based on preliminary calculations of line detectability, we find that both the Bayesian and non-Bayesian techniques show that the BATSE and Ginga observations are consistent given our understanding of these detectors.

  6. Controlled trials of vitamin D, causality and type 2 statistical error.

    PubMed

    Gillie, Oliver

    2016-02-01

    Two recent studies published in The Lancet (Autier et al. (2013) Lancet Diabetes Endocrinol 2, 76-89 and Bolland et al. (2014) Lancet Diabetes Endocrinol 2, 307-320) have concluded that low levels of vitamin D are not a cause but a consequence of ill health brought about by reduced exposure to the sun, an association known as 'reverse causality'. The scientific evidence and reasoning for these conclusions are examined here and found to be faulty. A null result in a clinical trial of vitamin D in adults need not lead to a conclusion of reverse causation when low vitamin D is found in observational studies of the same disease earlier in life. To assume an explanation of reverse causality has close similarities with type 2 statistical error. For example, a null result in providing vitamin D for treatment of adult bones that are deformed in the pattern of the rachitic rosary would not alter the observation that lack of vitamin D can cause rickets in childhood and may have lasting consequences if not cured with vitamin D. Other examples of diseases considered on a lifetime basis from conception to adulthood are used to further illustrate the issue, which is evidently not obvious and is far from trivial. It is concluded that deficiency of vitamin D in cohort studies, especially at critical times such as pregnancy and early life, can be the cause of a number of important diseases. Denial of the possible benefits of vitamin D, as suggested by insistent interpretation of studies with reverse causation, may lead to serious harms, some of which are listed.

  7. Single-agent maintenance therapy for advanced non-small cell lung cancer (NSCLC): a systematic review and Bayesian network meta-analysis of 26 randomized controlled trials

    PubMed Central

    Zeng, Xiaoning; Ma, Yuan

    2016-01-01

    Background The benefit of maintenance therapy has been confirmed in patients with non-progressing non-small cell lung cancer (NSCLC) after first-line therapy by many trials and meta-analyses. However, since few head-to-head trials between different regimens have been reported, clinicians still have little guidance on how to select the most efficacious single-agent regimen. Hence, we present a network meta-analysis to assess the comparative treatment efficacy of several single-agent maintenance therapy regimens for stage III/IV NSCLC. Methods A comprehensive literature search of public databases and conference proceedings was performed. Randomized clinical trials (RCTs) meeting the eligible criteria were integrated into a Bayesian network meta-analysis. The primary outcome was overall survival (OS) and the secondary outcome was progression free survival (PFS). Results A total of 26 trials covering 7,839 patients were identified, of which 24 trials were included in the OS analysis, while 23 trials were included in the PFS analysis. Switch-racotumomab-alum vaccine and switch-pemetrexed were identified as the most efficacious regimens based on OS (HR, 0.64; 95% CrI, 0.45–0.92) and PFS (HR, 0.54; 95% CrI, 0.26–1.04) separately. According to the rank order based on OS, switch-racotumomab-alum vaccine had the highest probability as the most effective regimen (52%), while switch-pemetrexed ranked first (34%) based on PFS. Conclusions Several single-agent maintenance therapy regimens can prolong OS and PFS for stage III/IV NSCLC. Switch-racotumomab-alum vaccine maintenance therapy may be the most optimal regimen, but should be confirmed by additional evidence. PMID:27781159

  8. Applications of Bayesian spectrum representation in acoustics

    NASA Astrophysics Data System (ADS)

    Botts, Jonathan M.

    framework. The application to reflection data is useful for representing frequency-dependent impedance boundaries in finite difference acoustic simulations. Furthermore, since the filter transfer function is a parametric model, it can be modified to incorporate arbitrary frequency weighting and account for the band-limited nature of measured reflection spectra. Finally, the model is modified to compensate for dispersive error in the finite difference simulation, from the filter design process. Stemming from the filter boundary problem, the implementation of pressure sources in finite difference simulation is addressed in order to assure that schemes properly converge. A class of parameterized source functions is proposed and shown to offer straightforward control of residual error in the simulation. Guided by the notion that the solution to be approximated affects the approximation error, sources are designed which reduce residual dispersive error to the size of round-off errors. The early part of a room impulse response can be characterized by a series of isolated plane waves. Measured with an array of microphones, plane waves map to a directional response of the array or spatial intensity map. Probabilistic inversion of this response results in estimates of the number and directions of image source arrivals. The model-based inversion is shown to avoid ambiguities associated with peak-finding or inspection of the spatial intensity map. For this problem, determining the number of arrivals in a given frame is critical for properly inferring the state of the sound field. This analysis is effectively compression of the spatial room response, which is useful for analysis or encoding of the spatial sound field. Parametric, model-based formulations of these problems enhance the solution in all cases, and a Bayesian interpretation provides a principled approach to model comparison and parameter estimation.

  9. Evaluation of Various Radar Data Quality Control Algorithms Based on Accumulated Radar Rainfall Statistics

    NASA Technical Reports Server (NTRS)

    Robinson, Michael; Steiner, Matthias; Wolff, David B.; Ferrier, Brad S.; Kessinger, Cathy; Einaudi, Franco (Technical Monitor)

    2000-01-01

    The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. A fundamental and extremely important step in creating high-quality GV products is radar data quality control. Quality control (QC) processing of TRMM GV radar data is based on some automated procedures, but the current QC algorithm is not fully operational and requires significant human interaction to assure satisfactory results. Moreover, the TRMM GV QC algorithm, even with continuous manual tuning, still can not completely remove all types of spurious echoes. In an attempt to improve the current operational radar data QC procedures of the TRMM GV effort, an intercomparison of several QC algorithms has been conducted. This presentation will demonstrate how various radar data QC algorithms affect accumulated radar rainfall products. In all, six different QC algorithms will be applied to two months of WSR-88D radar data from Melbourne, Florida. Daily, five-day, and monthly accumulated radar rainfall maps will be produced for each quality-controlled data set. The QC algorithms will be evaluated and compared based on their ability to remove spurious echoes without removing significant precipitation. Strengths and weaknesses of each algorithm will be assessed based on their ability to mitigate both erroneous additions and reductions in rainfall accumulation from spurious echo contamination and true precipitation removal, respectively. Contamination from individual spurious echo categories will be quantified to further diagnose the abilities of each radar QC algorithm. Finally, a cost-benefit analysis will be conducted to determine if a more automated QC algorithm is a viable alternative to the current, labor-intensive QC algorithm employed by TRMM GV.

  10. Statistical optimization of controlled release microspheres containing cetirizine hydrochloride as a model for water soluble drugs.

    PubMed

    El-Say, Khalid M; El-Helw, Abdel-Rahim M; Ahmed, Osama A A; Hosny, Khaled M; Ahmed, Tarek A; Kharshoum, Rasha M; Fahmy, Usama A; Alsawahli, Majed

    2015-01-01

    The purpose was to improve the encapsulation efficiency of cetirizine hydrochloride (CTZ) microspheres as a model for water soluble drugs and control its release by applying response surface methodology. A 3³ Box-Behnken design was used to determine the effect of drug/polymer ratio (X1), surfactant concentration (X2) and stirring speed (X3), on the mean particle size (Y1), percentage encapsulation efficiency (Y2) and cumulative percent drug released for 12 h (Y3). Emulsion solvent evaporation (ESE) technique was applied utilizing Eudragit RS100 as coating polymer and Span 80 as surfactant. All formulations were evaluated for micromeritic properties and morphologically characterized by scanning electron microscopy (SEM). The relative bioavailability of the optimized microspheres was compared with a CTZ marketed product after oral administration to healthy human volunteers using a double blind, randomized, cross-over design. The results revealed that the mean particle sizes of the microspheres ranged from 62 to 348 µm and the efficiency of entrapment ranged from 36.3% to 70.1%. The optimized CTZ microspheres exhibited a slow and controlled release over 12 h. The pharmacokinetic data of the optimized CTZ microspheres showed a prolonged tmax, a decreased Cmax, and an AUC0-∞ value of 3309 ± 211 ng h/ml, indicating improved relative bioavailability of 169.4% compared with the marketed tablets.
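
    As an illustration only (not the authors' actual analysis), the sketch below builds the coded 15-run, 3-factor Box-Behnken design and fits a full quadratic response-surface model by ordinary least squares in Python; the response values are simulated and purely hypothetical.

        import numpy as np
        from itertools import combinations

        def box_behnken_3(center_runs=3):
            """Coded 3-factor Box-Behnken design: +/-1 on pairs of factors plus centre points."""
            runs = []
            for i, j in combinations(range(3), 2):
                for a in (-1, 1):
                    for b in (-1, 1):
                        row = [0.0, 0.0, 0.0]
                        row[i], row[j] = a, b
                        runs.append(row)
            return np.array(runs + [[0.0, 0.0, 0.0]] * center_runs)

        X = box_behnken_3()          # columns: X1 drug/polymer ratio, X2 surfactant, X3 stirring (coded)
        rng = np.random.default_rng(0)
        y = 60 + 8*X[:, 0] - 5*X[:, 1] + 3*X[:, 2] - 4*X[:, 0]**2 + rng.normal(0, 1.5, len(X))  # hypothetical EE%

        # full quadratic model: intercept, linear, two-way interaction and squared terms
        A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 2],
                             X[:, 0]*X[:, 1], X[:, 0]*X[:, 2], X[:, 1]*X[:, 2],
                             X[:, 0]**2, X[:, 1]**2, X[:, 2]**2])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        print(np.round(coef, 2))     # fitted response-surface coefficients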

  11. Financial Management and Control for Decision Making in Urban Local Bodies in India Using Statistical Techniques

    NASA Astrophysics Data System (ADS)

    Bhattacharyya, Sidhakam; Bandyopadhyay, Gautam

    2010-10-01

    The council of most of the Urban Local Bodies (ULBs) has a limited scope for decision making in the absence of an appropriate financial control mechanism. The information about the expected amount of own fund during a particular period is of great importance for decision making. Therefore, in this paper, efforts are made to present a set of findings and to establish a model for estimating receipts of own sources and payments thereof using multiple regression analysis. Data for sixty months from a reputed ULB in West Bengal have been considered for ascertaining the regression models. This can be used as a part of the financial management and control procedure by the council to estimate the effect on own fund. In our study we have considered two models using multiple regression analysis. "Model I" comprises total adjusted receipts as the dependent variable and selected individual receipts as the independent variables. Similarly, "Model II" consists of total adjusted payments as the dependent variable and selected individual payments as independent variables. The combined result of Model I and Model II is the surplus or deficit affecting own fund. This may be applied for decision-making purposes by the council.
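
    A minimal sketch of the kind of regression model described ("Model I"), with entirely hypothetical receipt heads and figures standing in for the ULB data:

        import numpy as np

        rng = np.random.default_rng(0)
        months = 60
        # hypothetical individual receipt heads (in lakh rupees) over sixty months
        tax = rng.gamma(9.0, 10.0, months)
        rental = rng.gamma(4.0, 5.0, months)
        fees = rng.gamma(3.0, 4.0, months)
        total_receipts = 12 + 1.1*tax + 0.9*rental + 1.3*fees + rng.normal(0, 5, months)

        # Model I: total adjusted receipts regressed on the individual receipts
        X = np.column_stack([np.ones(months), tax, rental, fees])
        beta, *_ = np.linalg.lstsq(X, total_receipts, rcond=None)
        expected_receipts = X @ beta          # estimate feeding the own-fund calculation
        print(dict(zip(["intercept", "tax", "rental", "fees"], np.round(beta, 2))))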

  12. Approximate Bayesian multibody tracking.

    PubMed

    Lanz, Oswald

    2006-09-01

    Visual tracking of multiple targets is a challenging problem, especially when efficiency is an issue. Occlusions, if not properly handled, are a major source of failure. Solutions supporting principled occlusion reasoning have been proposed but are as yet impractical for online applications. This paper presents a new solution which effectively manages the trade-off between reliable modeling and computational efficiency. The Hybrid Joint-Separable (HJS) filter is derived from a joint Bayesian formulation of the problem, and shown to be efficient while optimal in terms of compact belief representation. Computational efficiency is achieved by employing a Markov random field approximation to joint dynamics and an incremental algorithm for posterior update with an appearance likelihood that implements a physically-based model of the occlusion process. A particle filter implementation is proposed which achieves accurate tracking during partial occlusions, while in cases of complete occlusion, tracking hypotheses are bound to estimated occlusion volumes. Experiments show that the proposed algorithm is efficient, robust, and able to resolve long-term occlusions between targets with identical appearance. PMID:16929730
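
    The HJS filter itself cannot be reproduced in a few lines, but the toy bootstrap particle filter below (1-D random-walk target with Gaussian observation noise, all parameters hypothetical) illustrates the propagate-weight-resample cycle on which such trackers are built:

        import numpy as np

        def particle_filter(obs, n_particles=500, q=0.5, r=1.0, seed=0):
            """Minimal bootstrap (SIR) particle filter for a 1-D random-walk state."""
            rng = np.random.default_rng(seed)
            particles = rng.normal(0.0, 1.0, n_particles)
            estimates = []
            for z in obs:
                particles = particles + rng.normal(0.0, q, n_particles)   # propagate (dynamics model)
                w = np.exp(-0.5 * ((z - particles) / r) ** 2)              # observation likelihood
                w /= w.sum()
                estimates.append(np.sum(w * particles))                    # posterior-mean estimate
                particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample
            return np.array(estimates)

        # toy usage: track a slowly drifting target from noisy measurements
        rng = np.random.default_rng(1)
        truth = np.cumsum(rng.normal(0, 0.5, 100))
        estimates = particle_filter(truth + rng.normal(0, 1.0, 100))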

  14. Rediscovery of Good-Turing estimators via Bayesian nonparametrics.

    PubMed

    Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye

    2016-03-01

    The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, designs of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, which is a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library. PMID:26224325

  15. Statistical Colocalization of Genetic Risk Variants for Related Autoimmune Diseases in the Context of Common Controls

    PubMed Central

    Fortune, Mary D.; Guo, Hui; Burren, Oliver; Schofield, Ellen; Walker, Neil M.; Ban, Maria; Sawcer, Stephen J.; Bowes, John; Worthington, Jane; Barton, Ann; Eyre, Steve; Todd, John A.; Wallace, Chris

    2015-01-01

    Identifying whether potential causal variants for related diseases are shared can identify overlapping etiologies of multifactorial disorders. Colocalization methods disentangle shared and distinct causal variants. However, existing approaches require independent datasets. Here we extend two colocalization methods to allow for the shared control design commonly used in comparison of genome-wide association study results across diseases. Our analysis of four autoimmune diseases, type 1 diabetes (T1D), rheumatoid arthritis, celiac disease and multiple sclerosis, revealed 90 regions that were associated with at least one disease, 33 (37%) of which were associated with two or more disorders. Nevertheless, for 14 of these 33 shared regions there was evidence that causal variants differed. We identified novel disease associations in 11 regions previously associated with one or more of the other three disorders. Four of eight T1D-specific regions contained known type 2 diabetes candidate genes: COBL, GLIS3, RNLS and BCAR1, suggesting a shared cellular etiology. PMID:26053495

  16. A parameter-tuned genetic algorithm for statistically constrained economic design of multivariate CUSUM control charts: a Taguchi loss approach

    NASA Astrophysics Data System (ADS)

    Niaki, Seyed Taghi Akhavan; Javad Ershadi, Mohammad

    2012-12-01

    In this research, the main parameters of the multivariate cumulative sum (CUSUM) control chart (the reference value k, the control limit H, the sample size n and the sampling interval h) are determined by minimising the Lorenzen-Vance cost function [Lorenzen, T.J., and Vance, L.C. (1986), 'The Economic Design of Control Charts: A Unified Approach', Technometrics, 28, 3-10], in which the external costs of employing the chart are added. In addition, the model is statistically constrained to achieve desired in-control and out-of-control average run lengths. The Taguchi loss approach is used to model the problem and a genetic algorithm, whose main parameters are tuned using response surface methodology (RSM), is proposed to solve it. At the end, sensitivity analyses on the main parameters of the cost function are presented and their practical conclusions are drawn. The results show that RSM significantly improves the performance of the proposed algorithm and that the external costs of applying the chart, which are due to real-world constraints, do not increase the average total loss very much.
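
    For background, the sketch below simulates run lengths of a one-sided univariate CUSUM for a given reference value k and decision limit H; the multivariate chart in the paper generalizes this recursion, and the design values used here are only illustrative:

        import numpy as np

        def cusum_run_length(k, h, shift=0.0, rng=None):
            """Run length of an upper CUSUM on standard-normal data with an optional mean shift."""
            rng = rng or np.random.default_rng()
            s, t = 0.0, 0
            while True:
                t += 1
                s = max(0.0, s + rng.normal(shift, 1.0) - k)
                if s > h:
                    return t

        rng = np.random.default_rng(0)
        # with k = 0.5 and h close to 4.77 the in-control ARL is near 370 for the univariate chart
        arl0 = np.mean([cusum_run_length(0.5, 4.77, 0.0, rng) for _ in range(2000)])
        arl1 = np.mean([cusum_run_length(0.5, 4.77, 1.0, rng) for _ in range(2000)])
        print(f"ARL0 ~ {arl0:.0f}, ARL1 ~ {arl1:.1f}")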

  17. Fiducial registration error as a statistical process control metric in image-guidance radiotherapy with fiducial markers.

    PubMed

    Ung, N M; Wee, L

    2011-12-01

    Portal imaging of implanted fiducial markers has been in use for image-guided radiotherapy (IGRT) of prostate cancer, with ample attention to localization accuracy and organ motion. The geometric uncertainties in point-based rigid-body matching algorithms during localization of prostate fiducial markers can be quantified in terms of a fiducial registration error (FRE). In this study, the aim is to demonstrate how statistical process control (SPC) can be used to intercept potential problems with rigid-body matching algorithms in a retrospective study of FRE for a pilot cohort of 34 patients with fiducial markers. A procedure for estimating control parameters of a SPC control chart (x-chart) from a small number of initial observations (N) of FRE was implemented. The sensitivity analysis of N on the number of 'in-control' and 'out-of-control' x-charts was also performed. Uncorrected rotational offsets of an individual patient were examined to elucidate possible correlations with the behaviours of an x-chart. Four specific types of qualitative x-chart behaviour have been observed. The number of out-of-control processes was insensitive to the choice of N, provided N ≥ 5. Residual errors of rigid-body registration were contributed from uncorrected rotational offsets in 5 out of 15 'out-of-control' x-charts. Out-of-control x-charts were also shown to be correlated with potential changes in the IGRT processes, which may compromise the quality of the radiation treatment delivery. The SPC methodology, implemented in the form of individually customized x-charts, has been shown to be a useful tool for monitoring process reliability during fiducial-based IGRT for prostate cancer. PMID:22080792

  18. Fiducial registration error as a statistical process control metric in image-guidance radiotherapy with fiducial markers

    NASA Astrophysics Data System (ADS)

    Ung, N. M.; Wee, L.

    2011-12-01

    Portal imaging of implanted fiducial markers has been in use for image-guided radiotherapy (IGRT) of prostate cancer, with ample attention to localization accuracy and organ motion. The geometric uncertainties in point-based rigid-body matching algorithms during localization of prostate fiducial markers can be quantified in terms of a fiducial registration error (FRE). In this study, the aim is to demonstrate how statistical process control (SPC) can be used to intercept potential problems with rigid-body matching algorithms in a retrospective study of FRE for a pilot cohort of 34 patients with fiducial markers. A procedure for estimating control parameters of a SPC control chart (x-chart) from a small number of initial observations (N) of FRE was implemented. The sensitivity analysis of N on the number of 'in-control' and 'out-of-control' x-charts was also performed. Uncorrected rotational offsets of an individual patient were examined to elucidate possible correlations with the behaviours of an x-chart. Four specific types of qualitative x-chart behaviour have been observed. The number of out-of-control processes was insensitive to the choice of N, provided N >= 5. Residual errors of rigid-body registration were contributed from uncorrected rotational offsets in 5 out of 15 'out-of-control' x-charts. Out-of-control x-charts were also shown to be correlated with potential changes in the IGRT processes, which may compromise the quality of the radiation treatment delivery. The SPC methodology, implemented in the form of individually customized x-charts, has been shown to be a useful tool for monitoring process reliability during fiducial-based IGRT for prostate cancer.
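
    A minimal sketch of the individuals (x) chart construction described, with the centre line and limits estimated from the first N FRE observations via the average moving range; the FRE values are hypothetical:

        import numpy as np

        def x_chart_limits(fre, n_initial=5):
            """Individuals chart parameters from the first N observations (d2 = 1.128 for moving ranges of 2)."""
            baseline = np.asarray(fre[:n_initial], dtype=float)
            centre = baseline.mean()
            sigma = np.abs(np.diff(baseline)).mean() / 1.128
            lcl = max(0.0, centre - 3 * sigma)        # FRE cannot be negative
            return centre, lcl, centre + 3 * sigma

        fre = [1.2, 1.0, 1.4, 1.1, 1.3, 1.2, 2.6, 1.1]          # hypothetical FRE series (mm) for one patient
        centre, lcl, ucl = x_chart_limits(fre, n_initial=5)
        out_of_control = [i for i, v in enumerate(fre) if not lcl <= v <= ucl]
        print(centre, lcl, ucl, out_of_control)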

  19. Modeling controlled nutrient release from a population of polymer coated fertilizers: statistically based model for diffusion release.

    PubMed

    Shaviv, Avi; Raban, Smadar; Zaidel, Elina

    2003-05-15

    A statistically based model for describing the release from a population of polymer coated controlled release fertilizer (CRF) granules by the diffusion mechanism was constructed. The model is based on a mathematical-mechanistic description of the release from a single granule of a coated CRF accounting for its complex and nonlinear nature. The large variation within populations of coated CRFs poses the need for a statistically based approach to integrate over the release from the individual granules within a given population for which the distribution and range of granule radii and coating thickness are known. The model was constructed and verified using experimentally determined parameters and release curves of polymer-coated CRFs. A sensitivity analysis indicated the importance of water permeability in controlling the lag period and that of solute permeability in governing the rate of linear release and the total duration of the release. Increasing the mean values of normally distributed granule radii or coating thickness increases the lag period and the period of linear release. The variation of radii and coating thickness, within realistic ranges, affects the release only when the standard deviation is very large or when water permeability is reduced without affecting solute permeability. The model provides an effective tool for designing and improving agronomic and environmental effectiveness of polymer-coated CRFs. PMID:12785533
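
    The paper's mechanistic single-granule model is not reproduced here, but the sketch below shows the statistical integration step it motivates: granule radii and coating thicknesses are sampled from normal distributions and a purely hypothetical lag-plus-linear single-granule release curve is averaged over the population.

        import numpy as np

        rng = np.random.default_rng(0)

        def single_granule_release(t, radius, coating, k_w=0.005, k_s=0.001):
            """Hypothetical stand-in for the single-granule model: a lag period set by water
            permeability (k_w) followed by roughly linear release set by solute permeability (k_s)."""
            lag = coating / (k_w * radius)                 # days before release starts
            rate = k_s * radius / coating                  # fractional release per day
            return np.clip(rate * (t - lag), 0.0, 1.0)

        radii = rng.normal(1.5, 0.15, 5000)                # granule radii (mm)
        coatings = rng.normal(0.05, 0.008, 5000)           # coating thickness (mm)
        t = np.linspace(0, 90, 200)                        # days
        curves = np.array([single_granule_release(t, r, c) for r, c in zip(radii, coatings)])
        population_release = curves.mean(axis=0)           # release integrated over the population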

  1. Bayesian Inference for Functional Dynamics Exploring in fMRI Data.

    PubMed

    Guo, Xuan; Liu, Bing; Chen, Le; Chen, Guantao; Pan, Yi; Zhang, Jing

    2016-01-01

    This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among the variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, Bayesian Magnitude Change Point Model (BMCPM), Bayesian Connectivity Change Point Model (BCCPM), and Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more delicate Bayesian inference models will be emerging and play increasingly important roles in modeling brain functions in the years to come.

  3. Bayesian Inference for Functional Dynamics Exploring in fMRI Data

    PubMed Central

    Guo, Xuan; Liu, Bing; Chen, Le; Chen, Guantao

    2016-01-01

    This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among the variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, Bayesian Magnitude Change Point Model (BMCPM), Bayesian Connectivity Change Point Model (BCCPM), and Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more delicate Bayesian inference models will be emerging and play increasingly important roles in modeling brain functions in the years to come. PMID:27034708
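
    As a toy illustration of the magnitude change point idea (much simpler than the reviewed models), the sketch below computes the exact posterior over a single change point in a Gaussian series with known noise level, a conjugate normal prior on each segment mean, and a uniform prior on the change point location:

        import numpy as np
        from scipy.stats import norm

        def segment_log_ml(x, sigma=1.0, m0=0.0, v0=10.0):
            """Log marginal likelihood of one segment: x_i ~ N(mu, sigma^2), mu ~ N(m0, v0)."""
            x = np.asarray(x, float)
            n, xbar = len(x), x.mean()
            s = np.sum((x - xbar) ** 2)
            return (-0.5 * n * np.log(2 * np.pi * sigma**2) - s / (2 * sigma**2)
                    + 0.5 * np.log(2 * np.pi * sigma**2 / n)
                    + norm.logpdf(xbar, m0, np.sqrt(sigma**2 / n + v0)))

        def changepoint_posterior(x, sigma=1.0):
            """Posterior over a single change point location (uniform prior over 1..n-1)."""
            logp = np.array([segment_log_ml(x[:t], sigma) + segment_log_ml(x[t:], sigma)
                             for t in range(1, len(x))])
            p = np.exp(logp - logp.max())
            return p / p.sum()

        rng = np.random.default_rng(0)
        series = np.concatenate([rng.normal(0, 1, 60), rng.normal(1.0, 1, 40)])  # shift at t = 60
        print("MAP change point:", np.argmax(changepoint_posterior(series)) + 1)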

  4. Statistical colocalization of genetic risk variants for related autoimmune diseases in the context of common controls.

    PubMed

    Fortune, Mary D; Guo, Hui; Burren, Oliver; Schofield, Ellen; Walker, Neil M; Ban, Maria; Sawcer, Stephen J; Bowes, John; Worthington, Jane; Barton, Anne; Eyre, Steve; Todd, John A; Wallace, Chris

    2015-07-01

    Determining whether potential causal variants for related diseases are shared can identify overlapping etiologies of multifactorial disorders. Colocalization methods disentangle shared and distinct causal variants. However, existing approaches require independent data sets. Here we extend two colocalization methods to allow for the shared-control design commonly used in comparison of genome-wide association study results across diseases. Our analysis of four autoimmune diseases--type 1 diabetes (T1D), rheumatoid arthritis, celiac disease and multiple sclerosis--identified 90 regions that were associated with at least one disease, 33 (37%) of which were associated with 2 or more disorders. Nevertheless, for 14 of these 33 shared regions, there was evidence that the causal variants differed. We identified new disease associations in 11 regions previously associated with one or more of the other 3 disorders. Four of eight T1D-specific regions contained known type 2 diabetes (T2D) candidate genes (COBL, GLIS3, RNLS and BCAR1), suggesting a shared cellular etiology. PMID:26053495

  5. A Bayesian framework for comparative quantitative genetics

    PubMed Central

    Ovaskainen, Otso; Cano, José Manuel; Merilä, Juha

    2008-01-01

    Bayesian approaches have been extensively used in animal breeding sciences, but similar approaches in the context of evolutionary quantitative genetics have been rare. We compared the performance of Bayesian and frequentist approaches in estimation of quantitative genetic parameters (viz. matrices of additive and dominance variances) in datasets typical of evolutionary studies and traits differing in their genetic architecture. Our results illustrate that it is difficult to disentangle the relative roles of different genetic components from small datasets, and that ignoring, e.g. dominance is likely to lead to biased estimates of additive variance. We suggest that a natural summary statistic for G-matrix comparisons can be obtained by examining how different the underlying multinormal probability distributions are, and illustrate our approach with data on the common frog (Rana temporaria). Furthermore, we derive a simple Monte Carlo method for computation of fraternity coefficients needed for the estimation of dominance variance, and use the pedigree of a natural Siberian jay (Perisoreus infaustus) population to illustrate that the commonly used approximate values can be substantially biased. PMID:18211881

  6. Bayesian Cosmic Web Reconstruction: BARCODE for Clusters

    NASA Astrophysics Data System (ADS)

    Patrick Bos, E. G.; van de Weygaert, Rien; Kitaura, Francisco; Cautun, Marius

    2016-10-01

    We describe the Bayesian BARCODE formalism, which has been designed for the reconstruction of the Cosmic Web in a given volume on the basis of the sampled galaxy cluster distribution. It is based on the realization that the massive compact clusters are responsible for the major share of the large-scale tidal force field shaping the anisotropic and, in particular, filamentary features in the Cosmic Web. Given the nonlinearity of the constraints imposed by the cluster configurations, we resort to a state-of-the-art constrained reconstruction technique to find a properly statistically sampled realization of the original initial density and velocity field in the same cosmic region. Ultimately, the subsequent gravitational evolution of these initial conditions towards the implied Cosmic Web configuration can be followed on the basis of a proper analytical model or an N-body computer simulation. The BARCODE formalism includes an implicit treatment of redshift space distortions. This enables a direct reconstruction on the basis of observational data, without the need for a correction of redshift space artifacts. In this contribution we provide a general overview of the connection of the Cosmic Web with clusters and a description of the Bayesian BARCODE formalism. We conclude with a presentation of its successful performance on test runs based on a simulated large-scale matter distribution, in physical space as well as in redshift space.

  7. Bayesian Analysis of High Dimensional Classification

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Subhadeep; Liang, Faming

    2009-12-01

    Modern data mining and bioinformatics have presented an important playground for statistical learning techniques, where the number of input variables is possibly much larger than the sample size of the training data. In supervised learning, logistic regression or probit regression can be used to model a binary output and form perceptron classification rules based on Bayesian inference. In these cases, there is considerable interest in searching for sparse models in the high-dimensional regression/classification setup. We first discuss two common challenges for analyzing high-dimensional data. The first is the curse of dimensionality: the complexity of many existing algorithms scales exponentially with the dimensionality of the space, and by virtue of that the algorithms soon become computationally intractable and therefore inapplicable in many real applications. The second is multicollinearity among the predictors, which severely slows down the algorithms. In order to make Bayesian analysis operational in high dimensions we propose a novel 'hierarchical stochastic approximation Monte Carlo' algorithm (HSAMC), which overcomes the curse of dimensionality and the multicollinearity of predictors in high dimensions, and also possesses a self-adjusting mechanism to avoid local minima separated by high energy barriers. Models and methods are illustrated by simulations inspired by the field of genomics. Numerical results indicate that HSAMC can work as a general model selection sampler in high-dimensional complex model spaces.

  8. Bayesian design strategies for synthetic biology

    PubMed Central

    Barnes, Chris P.; Silk, Daniel; Stumpf, Michael P. H.

    2011-01-01

    We discuss how statistical inference techniques can be applied in the context of designing novel biological systems. Bayesian techniques have found widespread application and acceptance in the systems biology community, where they are used for both parameter estimation and model selection. Here we show that the same approaches can also be used in order to engineer synthetic biological systems by inferring the structure and parameters that are most likely to give rise to the dynamics that we require a system to exhibit. Problems that are shared between applications in systems and synthetic biology include the vast potential spaces that need to be searched for suitable models and model parameters; the complex forms of likelihood functions; and the interplay between noise at the molecular level and nonlinearity in the dynamics owing to often complex feedback structures. In order to meet these challenges, we have to develop suitable inferential tools and here, in particular, we illustrate the use of approximate Bayesian computation and unscented Kalman filtering-based approaches. These partly complementary methods allow us to tackle a number of recurring problems in the design of biological systems. After a brief exposition of these two methodologies, we focus on their application to oscillatory systems. PMID:23226588
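
    A minimal approximate Bayesian computation (ABC) rejection sketch, with a toy noisy oscillation standing in for a real biochemical model; the prior, tolerance and distance used here are all illustrative choices:

        import numpy as np

        def abc_rejection(data, prior_sampler, simulate, distance, eps, n_draws=20000, seed=0):
            """Keep prior draws whose simulated data fall within eps of the observed data."""
            rng = np.random.default_rng(seed)
            accepted = []
            for _ in range(n_draws):
                theta = prior_sampler(rng)
                if distance(simulate(theta, rng), data) < eps:
                    accepted.append(theta)
            return np.array(accepted)

        # toy design target: recover the frequency of an oscillation observed with noise
        t = np.linspace(0, 10, 200)
        model = lambda f, r: np.sin(2 * np.pi * f * t) + r.normal(0, 0.2, t.size)
        observed = model(0.7, np.random.default_rng(1))
        posterior = abc_rejection(observed,
                                  prior_sampler=lambda r: r.uniform(0.1, 2.0),
                                  simulate=model,
                                  distance=lambda a, b: np.sqrt(np.mean((a - b) ** 2)),
                                  eps=0.4)
        print(posterior.size, "accepted; posterior mean frequency:", posterior.mean().round(3))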

  9. Bayesian response adaptive randomization using longitudinal outcomes.

    PubMed

    Hatayama, Tomoyoshi; Morita, Satoshi; Sakamaki, Kentaro

    2015-01-01

    The response adaptive randomization (RAR) method is used to increase the number of patients assigned to more efficacious treatment arms in clinical trials. In many trials evaluating longitudinal patient outcomes, RAR methods based only on the final measurement may not benefit significantly from RAR because of its delayed initiation. We propose a Bayesian RAR method to improve RAR performance by accounting for longitudinal patient outcomes (longitudinal RAR). We use a Bayesian linear mixed effects model to analyze longitudinal continuous patient outcomes for calculating a patient allocation probability. In addition, we aim to mitigate the loss of statistical power because of large patient allocation imbalances by embedding adjusters into the patient allocation probability calculation. Using extensive simulation we compared the operating characteristics of our proposed longitudinal RAR method with those of the RAR method based only on the final measurement and with an equal randomization method. Simulation results showed that our proposed longitudinal RAR method assigned more patients to the presumably superior treatment arm compared with the other two methods. In addition, the embedded adjuster effectively worked to prevent extreme patient allocation imbalances. However, our proposed method may not function adequately when the treatment effect difference is moderate or less, and still needs to be modified to deal with unexpectedly large departures from the presumed longitudinal data model.

  10. Organism-level models: When mechanisms and statistics fail us

    NASA Astrophysics Data System (ADS)

    Phillips, M. H.; Meyer, J.; Smith, W. P.; Rockhill, J. K.

    2014-03-01

    Purpose: To describe the unique characteristics of models that represent the entire course of radiation therapy at the organism level and to highlight the uses to which such models can be put. Methods: At the level of an organism, traditional model-building runs into severe difficulties. We do not have sufficient knowledge to devise a complete biochemistry-based model. Statistical model-building fails due to the vast number of variables and the inability to control many of them in any meaningful way. Finally, building surrogate models, such as animal-based models, can result in excluding some of the most critical variables. Bayesian probabilistic models (Bayesian networks) provide a useful alternative that has the advantages of being mathematically rigorous, incorporating the knowledge that we do have, and being practical. Results: Bayesian networks representing radiation therapy pathways for prostate cancer and head & neck cancer were used to highlight the important aspects of such models and some techniques of model-building. A more specific model representing the treatment of occult lymph nodes in head & neck cancer was provided as an example of how such a model can inform clinical decisions. A model of the possible role of PET imaging in brain cancer was used to illustrate the means by which clinical trials can be modelled in order to come up with a trial design that will have meaningful outcomes. Conclusions: Probabilistic models are currently the most useful approach to representing the entire therapy outcome process.
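
    To make the approach concrete, a toy three-node network with made-up conditional probabilities (loosely echoing the occult-nodes example, but not the actual clinical model) can be queried by brute-force enumeration:

        # Hypothetical CPTs: OccultNodes -> PETpositive and OccultNodes -> RegionalControl
        p_nodes = {1: 0.25, 0: 0.75}                               # prior P(occult nodal disease)
        p_pet = {1: {1: 0.80, 0: 0.20}, 0: {1: 0.10, 0: 0.90}}     # P(PET result | nodes)
        p_ctrl = {1: {1: 0.70, 0: 0.30}, 0: {1: 0.95, 0: 0.05}}    # P(regional control | nodes)

        def joint(nodes, pet, ctrl):
            return p_nodes[nodes] * p_pet[nodes][pet] * p_ctrl[nodes][ctrl]

        def query(pet_observed):
            """P(occult nodes) and P(regional control) given the PET finding, by enumeration."""
            z = sum(joint(n, pet_observed, c) for n in (0, 1) for c in (0, 1))
            p_nodes_given_pet = sum(joint(1, pet_observed, c) for c in (0, 1)) / z
            p_ctrl_given_pet = sum(joint(n, pet_observed, 1) for n in (0, 1)) / z
            return p_nodes_given_pet, p_ctrl_given_pet

        print(query(pet_observed=1))   # posteriors after a positive PET
        print(query(pet_observed=0))   # posteriors after a negative PET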

  11. Quality control and statistical modeling for environmental epigenetics: A study on in utero lead exposure and DNA methylation at birth

    PubMed Central

    Goodrich, Jaclyn M; Sánchez, Brisa N; Dolinoy, Dana C; Zhang, Zhenzhen; Hernández-Ávila, Mauricio; Hu, Howard; Peterson, Karen E; Téllez-Rojo, Martha M

    2015-01-01

    DNA methylation data assayed using pyrosequencing techniques are increasingly being used in human cohort studies to investigate associations between epigenetic modifications at candidate genes and exposures to environmental toxicants and to examine environmentally-induced epigenetic alterations as a mechanism underlying observed toxicant-health outcome associations. For instance, in utero lead (Pb) exposure is a neurodevelopmental toxicant of global concern that has also been linked to altered growth in human epidemiological cohorts; a potential mechanism of this association is through alteration of DNA methylation (e.g., at growth-related genes). However, because the associations between toxicants and DNA methylation might be weak, using appropriate quality control and statistical methods is important to increase reliability and power of such studies. Using a simulation study, we compared potential approaches to estimate toxicant-DNA methylation associations that varied by how methylation data were analyzed (repeated measures vs. averaging all CpG sites) and by method to adjust for batch effects (batch controls vs. random effects). We demonstrate that correcting for batch effects using plate controls yields unbiased associations, and that explicitly modeling the CpG site-specific variances and correlations among CpG sites increases statistical power. Using the recommended approaches, we examined the association between DNA methylation (in LINE-1 and growth related genes IGF2, H19 and HSD11B2) and 3 biomarkers of Pb exposure (Pb concentrations in umbilical cord blood, maternal tibia, and maternal patella), among mother-infant pairs of the Early Life Exposures in Mexico to Environmental Toxicants (ELEMENT) cohort (n = 247). Those with 10 μg/g higher patella Pb had, on average, 0.61% higher IGF2 methylation (P = 0.05). Sex-specific trends between Pb and DNA methylation (P < 0.1) were observed among girls including a 0.23% increase in HSD11B2 methylation with 10

  12. Quality control and statistical modeling for environmental epigenetics: a study on in utero lead exposure and DNA methylation at birth.

    PubMed

    Goodrich, Jaclyn M; Sánchez, Brisa N; Dolinoy, Dana C; Zhang, Zhenzhen; Hernández-Ávila, Mauricio; Hu, Howard; Peterson, Karen E; Téllez-Rojo, Martha M

    2015-01-01

    DNA methylation data assayed using pyrosequencing techniques are increasingly being used in human cohort studies to investigate associations between epigenetic modifications at candidate genes and exposures to environmental toxicants and to examine environmentally-induced epigenetic alterations as a mechanism underlying observed toxicant-health outcome associations. For instance, in utero lead (Pb) exposure is a neurodevelopmental toxicant of global concern that has also been linked to altered growth in human epidemiological cohorts; a potential mechanism of this association is through alteration of DNA methylation (e.g., at growth-related genes). However, because the associations between toxicants and DNA methylation might be weak, using appropriate quality control and statistical methods is important to increase reliability and power of such studies. Using a simulation study, we compared potential approaches to estimate toxicant-DNA methylation associations that varied by how methylation data were analyzed (repeated measures vs. averaging all CpG sites) and by method to adjust for batch effects (batch controls vs. random effects). We demonstrate that correcting for batch effects using plate controls yields unbiased associations, and that explicitly modeling the CpG site-specific variances and correlations among CpG sites increases statistical power. Using the recommended approaches, we examined the association between DNA methylation (in LINE-1 and growth related genes IGF2, H19 and HSD11B2) and 3 biomarkers of Pb exposure (Pb concentrations in umbilical cord blood, maternal tibia, and maternal patella), among mother-infant pairs of the Early Life Exposures in Mexico to Environmental Toxicants (ELEMENT) cohort (n = 247). Those with 10 μg/g higher patella Pb had, on average, 0.61% higher IGF2 methylation (P = 0.05). Sex-specific trends between Pb and DNA methylation (P < 0.1) were observed among girls including a 0.23% increase in HSD11B2 methylation with 10

  14. The Use of Statistical Process Control-Charts for Person-Fit Analysis on Computerized Adaptive Testing. LSAC Research Report Series.

    ERIC Educational Resources Information Center

    Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.

    In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…

  15. A program for the Bayesian Neural Network in the ROOT framework

    NASA Astrophysics Data System (ADS)

    Zhong, Jiahang; Huang, Run-Sheng; Lee, Shih-Chang

    2011-12-01

    We present a Bayesian Neural Network algorithm implemented in the TMVA package (Hoecker et al., 2007 [1]), within the ROOT framework (Brun and Rademakers, 1997 [2]). Compared to the conventional use of a neural network as a discriminator, this new implementation has advantages as a non-parametric regression tool, particularly for fitting probabilities. It provides functionalities including cost function selection, complexity control and uncertainty estimation. An example of such an application in High Energy Physics is shown. The algorithm is available with ROOT releases later than 5.29. Program summary: Program title: TMVA-BNN Catalogue identifier: AEJX_v1_0 Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJX_v1_0.html Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland Licensing provisions: BSD license No. of lines in distributed program, including test data, etc.: 5094 No. of bytes in distributed program, including test data, etc.: 1,320,987 Distribution format: tar.gz Programming language: C++ Computer: Any computer system or cluster with C++ compiler and UNIX-like operating system Operating system: Most UNIX/Linux systems. The application programs were thoroughly tested under Fedora and Scientific Linux CERN. Classification: 11.9 External routines: ROOT package version 5.29 or higher (http://root.cern.ch) Nature of problem: Non-parametric fitting of multivariate distributions Solution method: An implementation of a neural network following the Bayesian statistical interpretation. Uses the Laplace approximation for the Bayesian marginalizations. Provides the functionalities of automatic complexity control and uncertainty estimation. Running time: Time consumption for the training depends substantially on the size of the input sample, the NN topology, the number of training iterations, etc. For the example in this manuscript, about 7 min was used on a PC/Linux with 2.0 GHz processors.

  16. Bayesian shared frailty models for regional inference about wildlife survival

    USGS Publications Warehouse

    Heisey, D.M.

    2012-01-01

    One can joke that 'exciting statistics' is an oxymoron, but it is neither a joke nor an exaggeration to say that these are exciting times to be involved in statistical ecology. As Halstead et al.'s (2012) paper nicely exemplifies, recently developed Bayesian analyses can now be used to extract insights from data using techniques that would have been unavailable to the ecological researcher just a decade ago. Some object to this, implying that the subjective priors of the Bayesian approach are the pathway to perdition (e.g. Lele & Dennis, 2009). It is reasonable to ask whether these new approaches are really giving us anything that we could not obtain with traditional tried-and-true frequentist approaches. I believe the answer is a clear yes.

  17. Determination of the EEDF using a Bayesian analysis framework

    NASA Astrophysics Data System (ADS)

    Poznic, Dominic; Samarian, Alex; James, Brian

    2013-10-01

    A statistical analysis framework is presented that determines the electron energy distribution function (EEDF) of an argon discharge plasma from optical emission spectra and Langmuir probe data. The analysis framework is based on Bayesian inference, in which data are treated in a rigorously statistical manner, that naturally includes all sources of uncertainty. The framework is designed to allow models describing different data sets from the same system to be combined in a straightforward manner. Spectral line intensities are described using a collisional-radiative model, while Langmuir probe data are described with a simple 1D Langmuir probe model. The models are inverted and combined using Bayesian probability theory in a joint analysis of both data sets. This framework was tested using data simulated by the two models from a known set of plasma conditions. The testing confirmed the ability of the framework to determine non-Maxwellian EEDFs and use multiple data sets to increase the accuracy of results.

  18. Statistical Methods for Establishing Quality Control Ranges for Antibacterial Agents in Clinical and Laboratory Standards Institute Susceptibility Testing

    PubMed Central

    Turnidge, John; Bordash, Gerry

    2007-01-01

    Quality control (QC) ranges for antimicrobial agents against QC strains for both dilution and disk diffusion testing are currently set by the Clinical and Laboratory Standards Institute (CLSI), using data gathered in predefined structured multilaboratory studies, so-called tier 2 studies. The ranges are finally selected by the relevant CLSI subcommittee, based largely on visual inspection and a few simple rules. We have developed statistical methods for analyzing the data from tier 2 studies and applied them to QC strain-antimicrobial agent combinations from 178 dilution testing data sets and 48 disk diffusion data sets, including a method for identifying possible outlier data from individual laboratories. The methods are based on the fact that dilution testing MIC data were log-normally distributed and disk diffusion zone diameter data were normally distributed. For dilution testing, compared to QC ranges actually set by CLSI, calculated ranges were identical in 68% of cases, narrower in 7% of cases, and wider in 14% of cases. For disk diffusion testing, calculated ranges were identical to CLSI ranges in 33% of cases, narrower in 8% of cases, and 1 to 2 mm wider in 58% of cases. Possible outliers were detected in 8% of the dilution test data but in none of the disk diffusion data. Application of statistical techniques to the analysis of QC tier 2 data and the setting of QC ranges is relatively simple to perform on spreadsheets, and the output enhances the current CLSI methods for setting of QC ranges. PMID:17438045

  19. Application of statistical process control and process capability analysis procedures in orbiter processing activities at the Kennedy Space Center

    NASA Technical Reports Server (NTRS)

    Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.

    1994-01-01

    Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
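
    A minimal sketch of the process capability indices referred to, applied to hypothetical measurement data against hypothetical specification limits:

        import numpy as np

        def process_capability(x, lsl, usl):
            """Cp compares the spec width to the 6-sigma spread; Cpk penalises off-centre processes."""
            x = np.asarray(x, float)
            mu, sigma = x.mean(), x.std(ddof=1)
            cp = (usl - lsl) / (6 * sigma)
            cpk = min(usl - mu, mu - lsl) / (3 * sigma)
            return cp, cpk

        rng = np.random.default_rng(0)
        measurements = rng.normal(5.1, 0.12, 50)      # hypothetical checkpoint measurements
        print(process_capability(measurements, lsl=4.7, usl=5.5))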

  20. Statistical process control charts for attribute data involving very large sample sizes: a review of problems and solutions.

    PubMed

    Mohammed, Mohammed A; Panesar, Jagdeep S; Laney, David B; Wilson, Richard

    2013-04-01

    The use of statistical process control (SPC) charts in healthcare is increasing. The primary purpose of SPC is to distinguish between common-cause variation which is attributable to the underlying process, and special-cause variation which is extrinsic to the underlying process. This is important because improvement under common-cause variation requires action on the process, whereas special-cause variation merits an investigation to first find the cause. Nonetheless, when dealing with attribute or count data (eg, number of emergency admissions) involving very large sample sizes, traditional SPC charts often produce tight control limits with most of the data points appearing outside the control limits. This can give a false impression of common and special-cause variation, and potentially misguide the user into taking the wrong actions. Given the growing availability of large datasets from routinely collected databases in healthcare, there is a need to present a review of this problem (which arises because traditional attribute charts only consider within-subgroup variation) and its solutions (which consider within and between-subgroup variation), which involve the use of the well-established measurements chart and the more recently developed attribute charts based on Laney's innovative approach. We close by making some suggestions for practice. PMID:23365140
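
    A sketch of the Laney p'-chart adjustment discussed, in which the classical binomial limits are inflated by sigma_z, an estimate of between-subgroup variation taken from the moving range of the z-scores; the counts and denominators below are hypothetical:

        import numpy as np

        def laney_p_prime_limits(counts, sizes):
            counts, sizes = np.asarray(counts, float), np.asarray(sizes, float)
            p = counts / sizes
            pbar = counts.sum() / sizes.sum()
            sigma_p = np.sqrt(pbar * (1 - pbar) / sizes)   # within-subgroup (binomial) standard deviation
            z = (p - pbar) / sigma_p
            sigma_z = np.abs(np.diff(z)).mean() / 1.128    # between-subgroup inflation factor
            ucl = pbar + 3 * sigma_p * sigma_z
            lcl = np.clip(pbar - 3 * sigma_p * sigma_z, 0, None)
            return p, pbar, lcl, ucl

        counts = [812, 774, 801, 835, 790, 822, 808, 841]                 # e.g. monthly emergency admissions
        sizes = [10140, 9880, 10210, 10630, 9950, 10480, 10020, 10890]    # attendances (large denominators)
        p, pbar, lcl, ucl = laney_p_prime_limits(counts, sizes)
        print(np.sum((p < lcl) | (p > ucl)), "points signalled")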

  1. Modeling of manufacturing sensitivity and of statistically based process control requirements for a 0.18 μm NMOS device

    NASA Astrophysics Data System (ADS)

    Zeitzoff, P. M.; Tasch, A. F.; Moore, W. E.; Khan, S. A.; Angelo, D.

    1998-11-01

    Random statistical variations during the IC manufacturing process have an important influence on yield and performance, particularly as technology is scaled into the deep submicron regime. A simulation-based approach to modeling the impact of these variations on a 0.18μm NMOSFET is presented. The result of this modeling is a special Monte Carlo simulation code that can be used to predict the statistical variation of key device electrical characteristics and to determine the reduction in these variations resulting from improved process control. In addition, the level of process control needed to satisfy specified statistical targets for the NMOSFET electrical performance was analyzed. Meeting these targets requires tight control of five key parameters: the gate length (optimal statistical variation is 9% or less), the gate oxide thickness (optimal statistical variation is 5% or less), the shallow source/drain extension junction depth (optimal statistical variation is 5% or less), the channel dose (optimal statistical variation is 7.5% or less), and the spacer width (optimal statistical variation is 8% or less).

  2. Bayesian planet searches in radial velocity data

    NASA Astrophysics Data System (ADS)

    Gregory, Phil

    2015-08-01

    Intrinsic stellar variability caused by magnetic activity and convection has become the main limiting factor for planet searches in both transit and radial velocity (RV) data. New spectrographs are under development, like ESPRESSO and EXPRES, that aim to improve RV precision by a factor of approximately 100 over the current best spectrographs, HARPS and HARPS-N. This will greatly exacerbate the challenge of distinguishing planetary signals from stellar activity induced RV signals. At the same time good progress has been made in simulating stellar activity signals. At the Porto 2014 meeting, “Towards Other Earths II,” Xavier Dumusque challenged the community to a large scale blind test using the simulated RV data to understand the limitations of present solutions to deal with stellar signals and to select the best approach. My talk will focus on some of the statistical lessons learned from this challenge with an emphasis on Bayesian methodology.

  3. Bayesian Lasso for Semiparametric Structural Equation Models

    PubMed Central

    Guo, Ruixin; Zhu, Hongtu; Chow, Sy-Miin; Ibrahim, Joseph G.

    2011-01-01

    There has been great interest in developing nonlinear structural equation models and associated statistical inference procedures, including estimation and model selection methods. In this paper a general semiparametric structural equation model (SSEM) is developed in which the structural equation is composed of nonparametric functions of exogenous latent variables and fixed covariates on a set of latent endogenous variables. A basis representation is used to approximate these nonparametric functions in the structural equation and the Bayesian Lasso method coupled with a Markov Chain Monte Carlo (MCMC) algorithm is used for simultaneous estimation and model selection. The proposed method is illustrated using a simulation study and data from the Affective Dynamics and Individual Differences (ADID) study. Results demonstrate that our method can accurately estimate the unknown parameters and correctly identify the true underlying model. PMID:22376150

  4. SU-C-BRD-01: A Statistical Modeling Method for Quality Control of Intensity- Modulated Radiation Therapy Planning

    SciTech Connect

    Gao, S; Meyer, R; Shi, L; D'Souza, W; Zhang, H

    2014-06-15

    Purpose: To apply a statistical modeling approach, threshold modeling (TM), for quality control of intensity-modulated radiation therapy (IMRT) treatment plans. Methods: A quantitative measure, which was the weighted sum of violations of dose/dose-volume constraints, was first developed to represent the quality of each IMRT plan. The threshold modeling approach, which is an extension of extreme value theory in statistics and an effective way to model extreme values, was then applied to analyze the quality of the plans summarized by our quantitative measure. Our approach modeled the plans generated by planners as a series of independent and identically distributed random variables and described their behavior when the plan quality was controlled below a certain threshold. We tested our approach retrospectively with five locally advanced head and neck cancer patients. Two statistics were incorporated for numerical analysis: the probability of quality improvement (PQI) of the plans and the expected amount of improvement on the quantitative measure (EQI). Results: After clinical planners generated 15 plans for each patient, we applied our approach to obtain the PQI and EQI as if the planners were to generate an additional 15 plans. For two of the patients, the PQI was significantly higher than for the other three (0.17 and 0.18 compared to 0.08, 0.01 and 0.01). The actual percentage of the additional 15 plans that outperformed the best of the initial 15 plans was 20% and 27%, compared to 11%, 0% and 0%. The EQI for these two patients was 34.5 and 32.9, while for the remaining three patients it was 9.9, 1.4 and 6.6. The actual improvements obtained were 28.3 and 20.5, compared to 6.2, 0 and 0. Conclusion: TM is capable of reliably identifying the potential quality improvement of IMRT plans. It provides clinicians with an effective tool to assess the trade-off between extra planning effort and achievable plan quality. This work was supported in part by NIH/NCI grant CA130814.
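
    The abstract does not give the TM implementation, but a generic peaks-over-threshold fit conveys the idea: exceedances of a plan-quality score above a threshold are modelled with a generalized Pareto distribution, from which the chance that a further plan beats the best score so far can be estimated (all numbers are hypothetical, and the quality score is treated as higher-is-better for simplicity):

        import numpy as np
        from scipy.stats import genpareto

        rng = np.random.default_rng(0)
        scores = rng.gamma(shape=4.0, scale=2.0, size=30)    # hypothetical plan-quality scores
        u = np.quantile(scores, 0.6)                         # threshold; only exceedances feed the tail model
        excess = scores[scores > u] - u
        c, _, scale = genpareto.fit(excess, floc=0)          # shape and scale of the generalized Pareto tail
        best = scores.max()
        p_beat_best = (excess.size / scores.size) * genpareto.sf(best - u, c, loc=0, scale=scale)
        print(f"estimated probability that a new plan beats the current best: {p_beat_best:.3f}")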

  5. Bayesian methods for assessing system reliability: models and computation.

    SciTech Connect

    Graves, T. L.; Hamada, Michael,

    2004-01-01

    There are many challenges with assessing the reliability of a system today. These challenges arise because a system may be aging and full system tests may be too expensive or can no longer be performed. Without full system testing, one must integrate (1) all science and engineering knowledge, models and simulations, (2) information and data at various levels of the system, e.g., subsystems and components and (3) information and data from similar systems, subsystems and components. The analyst must work with various data types and how the data are collected, account for measurement bias and uncertainty, deal with model and simulation uncertainty and incorporate expert knowledge. Bayesian hierarchical modeling provides a rigorous way to combine information from multiple sources and different types of information. However, an obstacle to applying Bayesian methods is the need to develop new software to analyze novel statistical models. We discuss a new statistical modeling environment, YADAS, that facilitates the development of Bayesian statistical analyses. It includes classes that help analysts specify new models, as well as classes that support the creation of new analysis algorithms. We illustrate these concepts using several examples.

  6. A Bayesian Nonparametric Approach to Test Equating

    ERIC Educational Resources Information Center

    Karabatsos, George; Walker, Stephen G.

    2009-01-01

    A Bayesian nonparametric model is introduced for score equating. It is applicable to all major equating designs, and has advantages over previous equating models. Unlike the previous models, the Bayesian model accounts for positive dependence between distributions of scores from two tests. The Bayesian model and the previous equating models are…

  7. Bayesian Model Averaging for Propensity Score Analysis

    ERIC Educational Resources Information Center

    Kaplan, David; Chen, Jianshen

    2013-01-01

    The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…

  8. Candidate gene association study in pediatric acute lymphoblastic leukemia evaluated by Bayesian network based Bayesian multilevel analysis of relevance

    PubMed Central

    2012-01-01

    Background We carried out a candidate gene association study in pediatric acute lymphoblastic leukemia (ALL) to identify possible genetic risk factors in a Hungarian population. Methods The results were evaluated with traditional statistical methods and with our newly developed Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA) method. We collected genomic DNA and clinical data from 543 children, who underwent chemotherapy due to ALL, and 529 healthy controls. Altogether 66 single nucleotide polymorphisms (SNPs) in 19 candidate genes were genotyped. Results With logistic regression, we identified 6 SNPs in the ARID5B and IKZF1 genes associated with an increased risk of B-cell ALL, and two SNPs in the STAT3 gene, which decreased the risk of hyperdiploid ALL. Because the associated SNPs were in linkage in each gene, these associations corresponded to one signal per gene. The odds ratios (ORs) associated with the tag SNPs were: OR = 1.69, P = 2.22×10⁻⁷ for rs4132601 (IKZF1), OR = 1.53, P = 1.95×10⁻⁵ for rs10821936 (ARID5B) and OR = 0.64, P = 2.32×10⁻⁴ for rs12949918 (STAT3). With the BN-BMLA we confirmed the findings of the frequentist-based method and obtained additional information about the nature of the relations between the SNPs and the disease. For example, rs10821936 in ARID5B and rs17405722 in STAT3 showed a weak interaction, and in the T-cell lineage sample group, gender showed a weak interaction with three SNPs in three genes. In the hyperdiploid patient group the BN-BMLA detected a strong interaction among SNPs in the NOTCH1, STAT1, STAT3 and BCL2 genes. Evaluating the survival rate of the patients with ALL, the BN-BMLA showed that besides risk groups and subtypes, genetic variations in the BAX and CEBPA genes might also influence the probability of survival of the patients. Conclusions In the present study we confirmed the roles of genetic variations in ARID5B and IKZF1 in the susceptibility to B-cell ALL

  9. Bayesian phylogenetic estimation of fossil ages

    PubMed Central

    Drummond, Alexei J.; Stadler, Tanja

    2016-01-01

    Recent advances have allowed for both morphological fossil evidence and molecular sequences to be integrated into a single combined inference of divergence dates under the rule of Bayesian probability. In particular, the fossilized birth–death tree prior and the Lewis-Mk model of discrete morphological evolution allow for the estimation of both divergence times and phylogenetic relationships between fossil and extant taxa. We exploit this statistical framework to investigate the internal consistency of these models by producing phylogenetic estimates of the age of each fossil in turn, within two rich and well-characterized datasets of fossil and extant species (penguins and canids). We find that the estimation accuracy of fossil ages is generally high with credible intervals seldom excluding the true age and median relative error in the two datasets of 5.7% and 13.2%, respectively. The median relative standard error (RSD) was 9.2% and 7.2%, respectively, suggesting good precision, although with some outliers. In fact, in the two datasets we analyse, the phylogenetic estimate of fossil age is on average less than 2 Myr from the mid-point age of the geological strata from which it was excavated. The high level of internal consistency found in our analyses suggests that the Bayesian statistical model employed is an adequate fit for both the geological and morphological data, and provides evidence from real data that the framework used can accurately model the evolution of discrete morphological traits coded from fossil and extant taxa. We anticipate that this approach will have diverse applications beyond divergence time dating, including dating fossils that are temporally unconstrained, testing of the ‘morphological clock', and for uncovering potential model misspecification and/or data errors when controversial phylogenetic hypotheses are obtained based on combined divergence dating analyses. This article is part of the themed issue ‘Dating species divergences

  10. Bayesian phylogenetic estimation of fossil ages.

    PubMed

    Drummond, Alexei J; Stadler, Tanja

    2016-07-19

    Recent advances have allowed for both morphological fossil evidence and molecular sequences to be integrated into a single combined inference of divergence dates under the rule of Bayesian probability. In particular, the fossilized birth-death tree prior and the Lewis-Mk model of discrete morphological evolution allow for the estimation of both divergence times and phylogenetic relationships between fossil and extant taxa. We exploit this statistical framework to investigate the internal consistency of these models by producing phylogenetic estimates of the age of each fossil in turn, within two rich and well-characterized datasets of fossil and extant species (penguins and canids). We find that the estimation accuracy of fossil ages is generally high with credible intervals seldom excluding the true age and median relative error in the two datasets of 5.7% and 13.2%, respectively. The median relative standard error (RSD) was 9.2% and 7.2%, respectively, suggesting good precision, although with some outliers. In fact, in the two datasets we analyse, the phylogenetic estimate of fossil age is on average less than 2 Myr from the mid-point age of the geological strata from which it was excavated. The high level of internal consistency found in our analyses suggests that the Bayesian statistical model employed is an adequate fit for both the geological and morphological data, and provides evidence from real data that the framework used can accurately model the evolution of discrete morphological traits coded from fossil and extant taxa. We anticipate that this approach will have diverse applications beyond divergence time dating, including dating fossils that are temporally unconstrained, testing of the 'morphological clock', and for uncovering potential model misspecification and/or data errors when controversial phylogenetic hypotheses are obtained based on combined divergence dating analyses.This article is part of the themed issue 'Dating species divergences using

  11. Why traditional statistical process control charts for attribute data should be viewed alongside an xmr-chart.

    PubMed

    Mohammed, Mohammed A; Worthington, Peter

    2013-03-01

    The use of statistical process control (SPC) charts in healthcare is increasing. The general advice when plotting SPC charts is to begin by selecting the right chart. This advice, in the case of attribute data, may be limiting our insights into the underlying process and consequently be potentially misleading. Given the general lack of awareness that additional insights may be obtained by using more than one SPC chart, there is a need to review this issue and make some recommendations. Under purely common cause variation the control limits on the xmr-chart and traditional attribute charts (e.g., p-chart, c-chart, u-chart) will be in close agreement, indicating that the observed variation (xmr-chart) is consistent with the underlying Binomial model (p-chart) or Poisson model (c-chart, u-chart). However, when there is a material difference between the limits from the xmr-chart and the attribute chart, this too constitutes a signal of an underlying systematic special cause of variation. We use one simulation and two case studies to demonstrate these ideas and show the utility of plotting the SPC chart for attribute data alongside an xmr-chart. We conclude that the combined use of attribute charts and xmr-charts, which requires little additional effort, is a useful strategy because it is less likely to mislead us and more likely to give us the insight to do the right thing. PMID:23104897
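
    The recommended side-by-side comparison is easy to reproduce. The sketch below, using simulated Binomial data, computes traditional p-chart limits and xmr-chart (individuals chart) limits for the same sequence of proportions; under pure common-cause variation the two sets of limits should roughly agree.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical monthly data: number of events out of n cases per month.
n = 200
p_true = 0.10
events = rng.binomial(n, p_true, size=24)
p = events / n

# Traditional p-chart limits (Binomial-based), using the pooled proportion.
p_bar = p.mean()
sigma_p = np.sqrt(p_bar * (1 - p_bar) / n)
p_ucl, p_lcl = p_bar + 3 * sigma_p, max(p_bar - 3 * sigma_p, 0.0)

# xmr-chart (individuals chart) limits on the same proportions, based on
# the average moving range (the usual 2.66 * mean moving range rule).
mr = np.abs(np.diff(p))
x_ucl = p_bar + 2.66 * mr.mean()
x_lcl = max(p_bar - 2.66 * mr.mean(), 0.0)

print(f"p-chart limits:   [{p_lcl:.4f}, {p_ucl:.4f}]")
print(f"xmr-chart limits: [{x_lcl:.4f}, {x_ucl:.4f}]")
# Under pure common-cause (Binomial) variation the two sets of limits should
# roughly agree; a material difference would itself signal a special cause.
```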

  12. Bayesian Integration of Spatial Information

    ERIC Educational Resources Information Center

    Cheng, Ken; Shettleworth, Sara J.; Huttenlocher, Janellen; Rieser, John J.

    2007-01-01

    Spatial judgments and actions are often based on multiple cues. The authors review a multitude of phenomena on the integration of spatial cues in diverse species to consider how nearly optimally animals combine the cues. Under the banner of Bayesian perception, cues are sometimes combined and weighted in a near optimal fashion. In other instances…

  13. Evidence cross-validation and Bayesian inference of MAST plasma equilibria

    NASA Astrophysics Data System (ADS)

    von Nessi, G. T.; Hole, M. J.; Svensson, J.; Appel, L.

    2012-01-01

    In this paper, current profiles for plasma discharges on the mega-ampere spherical tokamak are directly calculated from pickup coil, flux loop, and motional Stark effect observations via methods based on the statistical theory of Bayesian analysis. By representing toroidal plasma current as a series of axisymmetric current beams with rectangular cross-section and inferring the current for each one of these beams, flux-surface geometry and q-profiles are subsequently calculated by elementary application of the Biot-Savart law. The use of this plasma model in the context of Bayesian analysis was pioneered by Svensson and Werner on the joint European torus [Svensson and Werner, Plasma Phys. Controlled Fusion 50(8), 085002 (2008)]. In this framework, linear forward models are used to generate diagnostic predictions, and the probability distribution for the currents in the collection of plasma beams is subsequently calculated directly via application of Bayes' formula. In this work, we introduce a new diagnostic technique to identify and remove outlier observations associated with diagnostics falling out of calibration or suffering from an unidentified malfunction. These modifications enable good agreement between Bayesian inference of the last-closed flux surface and other corroborating data, such as that from force balance considerations using EFIT++ [Appel et al., "A unified approach to equilibrium reconstruction", Proceedings of the 33rd EPS Conference on Plasma Physics (Rome, Italy, 2006)]. In addition, this analysis also yields errors on the plasma current profile and flux-surface geometry as well as directly predicting the Shafranov shift of the plasma core.

  14. Bayesian Alternation during Tactile Augmentation

    PubMed Central

    Goeke, Caspar M.; Planera, Serena; Finger, Holger; König, Peter

    2016-01-01

    A large number of studies suggest that the integration of multisensory signals by humans is well-described by Bayesian principles. However, there are very few reports about cue combination between a native and an augmented sense. In particular, we asked whether adult participants are able to integrate an augmented sensory cue with existing native sensory information. Hence, for the purpose of this study, we built a tactile augmentation device. We then compared different hypotheses of how untrained adult participants combine information from a native and an augmented sense. In a two-interval forced choice (2IFC) task, while subjects were blindfolded and seated on a rotating platform, our sensory augmentation device translated information on whole body yaw rotation into tactile stimulation. Three conditions were realized: tactile stimulation only (augmented condition), rotation only (native condition), and both augmented and native information (bimodal condition). Participants had to choose the one of two consecutive rotations with the higher angular rotation. For the analysis, we fitted the participants' responses with a probit model and calculated the just noticeable difference (JND). Then, we compared several models for predicting bimodal from unimodal responses. An objective Bayesian alternation model yielded a better prediction (reduced χ² = 1.67) than the Bayesian integration model (reduced χ² = 4.34). A non-Bayesian winner-takes-all (WTA) model, which used either only native or only augmented values per subject for prediction, showed slightly higher accuracy (reduced χ² = 1.64). However, the performance of the Bayesian alternation model could be substantially improved (reduced χ² = 1.09) by utilizing subjective weights obtained from a questionnaire. As a result, the subjective Bayesian alternation model predicted bimodal performance most accurately among all tested models. These results suggest that information from augmented and existing sensory modalities in

  15. Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.

    PubMed

    Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian

    2016-05-01

    Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies where there is a considerable amount of data from historical control groups, which has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group as part of a full Bayesian analysis of the next study or using a predictive distribution to replace a control group entirely. We use quality control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd.
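
    A fully Bayesian meta-analytic predictive (MAP) prior requires a hierarchical model and MCMC; the sketch below is a crude empirical stand-in that conveys the idea. It pools invented historical control-group means with a DerSimonian-Laird heterogeneity estimate, forms a predictive distribution for the control mean of a new study, and expresses the prior as an approximate effective number of animals under an assumed per-animal standard deviation.

```python
import numpy as np

# Hypothetical historical control-group means and standard errors from
# repeated in vivo studies (invented numbers, for illustration only).
means = np.array([12.1, 10.8, 11.5, 13.0, 11.9, 12.4])
ses = np.array([0.9, 1.1, 0.8, 1.0, 0.9, 1.2])

# DerSimonian-Laird estimate of between-study heterogeneity tau^2,
# a crude empirical stand-in for a fully Bayesian meta-analysis.
w = 1.0 / ses**2
mu_fe = np.sum(w * means) / np.sum(w)
q = np.sum(w * (means - mu_fe) ** 2)
tau2 = max(0.0, (q - (len(means) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

# Meta-analytic predictive distribution for the control mean in a NEW study:
# uncertainty in the overall mean plus between-study variation.
w_re = 1.0 / (ses**2 + tau2)
mu_re = np.sum(w_re * means) / np.sum(w_re)
var_pred = 1.0 / np.sum(w_re) + tau2
print(f"predictive prior for new control group: mean {mu_re:.2f}, sd {np.sqrt(var_pred):.2f}")

# Approximate 'effective number of animals' of this prior, assuming a
# hypothetical per-animal SD of 2.5 (information scales as n / sigma^2).
sigma_animal = 2.5
n_effective = sigma_animal**2 / var_pred
print(f"prior worth roughly {n_effective:.1f} animals")
```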

  16. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
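
    The core reinterpretation behind BUS can be shown on a toy problem. In the sketch below, the classical rejection-sampling view of Bayes' rule is written as a limit-state function g <= 0, so posterior sampling becomes estimation of a 'rare event'; here the event is handled by brute-force Monte Carlo rather than FORM, importance sampling or subset simulation, and the model is a one-parameter Gaussian example, not an engineering system.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Toy problem: prior theta ~ N(0, 1); one noisy observation y of theta.
y, noise_sd = 1.5, 0.5
loglik = lambda th: stats.norm.logpdf(y, loc=th, scale=noise_sd)

# BUS rests on the rejection-sampling view of Bayes' rule: accept a prior
# draw theta when  u < L(theta) / c,  with c >= max L. Writing the acceptance
# condition as a "limit-state function"
#   g(theta, u) = ln(u) + ln(c) - ln L(theta) <= 0
# turns posterior sampling into estimating a rare "failure" event, so
# reliability methods can be reused. Here we just use plain Monte Carlo on g.
log_c = stats.norm.logpdf(y, loc=y, scale=noise_sd)   # max log-likelihood

theta = rng.normal(size=200_000)
u = rng.uniform(size=theta.size)
accept = np.log(u) + log_c - loglik(theta) <= 0        # g <= 0

posterior = theta[accept]
print(f"acceptance rate (the 'rare event' probability): {accept.mean():.4f}")
print(f"posterior mean = {posterior.mean():.3f} "
      f"(analytic {y / (1 + noise_sd**2):.3f})")
```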

  17. Evolving a Bayesian Classifier for ECG-based Age Classification in Medical Applications.

    PubMed

    Wiggins, M; Saad, A; Litt, B; Vachtsevanos, G

    2008-01-01

    OBJECTIVE: To classify patients by age based upon information extracted from their electro-cardiograms (ECGs). To develop and compare the performance of Bayesian classifiers. METHODS AND MATERIAL: We present a methodology for classifying patients according to statistical features extracted from their ECG signals using a genetically evolved Bayesian network classifier. Continuous signal feature variables are converted to a discrete symbolic form by thresholding, to lower the dimensionality of the signal. This simplifies calculation of conditional probability tables for the classifier, and makes the tables smaller. Two methods of network discovery from data were developed and compared: the first using a greedy hill-climb search and the second employed evolutionary computing using a genetic algorithm (GA). RESULTS AND CONCLUSIONS: The evolved Bayesian network performed better (86.25% AUC) than both the one developed using the greedy algorithm (65% AUC) and the naïve Bayesian classifier (84.75% AUC). The methodology for evolving the Bayesian classifier can be used to evolve Bayesian networks in general thereby identifying the dependencies among the variables of interest. Those dependencies are assumed to be non-existent by naïve Bayesian classifiers. Such a classifier can then be used for medical applications for diagnosis and prediction purposes.

  18. Evolving a Bayesian Classifier for ECG-based Age Classification in Medical Applications

    PubMed Central

    Wiggins, M.; Saad, A.; Litt, B.; Vachtsevanos, G.

    2010-01-01

    Objective To classify patients by age based upon information extracted from their electro-cardiograms (ECGs). To develop and compare the performance of Bayesian classifiers. Methods and Material We present a methodology for classifying patients according to statistical features extracted from their ECG signals using a genetically evolved Bayesian network classifier. Continuous signal feature variables are converted to a discrete symbolic form by thresholding, to lower the dimensionality of the signal. This simplifies calculation of conditional probability tables for the classifier, and makes the tables smaller. Two methods of network discovery from data were developed and compared: the first using a greedy hill-climb search and the second employed evolutionary computing using a genetic algorithm (GA). Results and Conclusions The evolved Bayesian network performed better (86.25% AUC) than both the one developed using the greedy algorithm (65% AUC) and the naïve Bayesian classifier (84.75% AUC). The methodology for evolving the Bayesian classifier can be used to evolve Bayesian networks in general thereby identifying the dependencies among the variables of interest. Those dependencies are assumed to be non-existent by naïve Bayesian classifiers. Such a classifier can then be used for medical applications for diagnosis and prediction purposes. PMID:22010038
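
    The naive Bayes baseline referred to in both records, with continuous features discretized by thresholding, can be written compactly. The sketch below uses purely synthetic stand-ins for ECG features; the evolved Bayesian-network classifier of the paper would additionally learn dependencies between features, which this baseline assumes away.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical "ECG features" for a binary age-class problem (young/old);
# purely synthetic stand-ins for the paper's statistical ECG features.
n = 400
age_class = rng.integers(0, 2, size=n)                     # 0 = young, 1 = old
hr_var = rng.normal(60 - 15 * age_class, 10, size=n)       # heart-rate variability
qt_len = rng.normal(380 + 20 * age_class, 25, size=n)      # QT interval (ms)

# Discretize continuous features by thresholding, as in the paper, which
# keeps the conditional probability tables small.
X = np.column_stack([hr_var > np.median(hr_var), qt_len > np.median(qt_len)]).astype(int)

# Naive Bayes classifier: assumes features are independent given the class
# (the assumption a learned Bayesian-network structure would relax).
def fit_naive_bayes(X, y):
    classes = np.unique(y)
    prior = np.array([(y == c).mean() for c in classes])
    # P(feature = 1 | class), with add-one (Laplace) smoothing
    cond = np.array([(X[y == c].sum(axis=0) + 1) / ((y == c).sum() + 2) for c in classes])
    return classes, prior, cond

def predict(X, classes, prior, cond):
    logp = np.log(prior) + X @ np.log(cond).T + (1 - X) @ np.log(1 - cond).T
    return classes[np.argmax(logp, axis=1)]

classes, prior, cond = fit_naive_bayes(X[:300], age_class[:300])
acc = (predict(X[300:], classes, prior, cond) == age_class[300:]).mean()
print(f"hold-out accuracy of the naive Bayes baseline: {acc:.2f}")
```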

  19. SU-E-CAMPUS-T-04: Statistical Process Control for Patient-Specific QA in Proton Beams

    SciTech Connect

    LAH, J; SHIN, D; Kim, G

    2014-06-15

    Purpose: To evaluate and improve the reliability of the proton QA process and to provide an optimal customized tolerance level using statistical process control (SPC) methodology, with the aim of suggesting suitable guidelines for the patient-specific QA process. Methods: We investigated the constancy of the dose output and range to see whether they were within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis, using process capability indices, to suggest suitable guidelines for patient-specific QA in proton beams. Patient QA plans were classified into 6 treatment sites: head and neck (41 cases), spinal cord (29 cases), lung (28 cases), liver (30 cases), pancreas (26 cases), and prostate (24 cases). Results: The deviations of the dose output and range in the daily QA process were ±0.84% and ±0.19%, respectively. Our results show that the patient-specific range measurements are capable at a specification limit of ±2% in all treatment sites except spinal cord cases. In spinal cord cases, comparison of the process capability indices (Cp, Cpm, Cpk ≥1, but Cpmk ≤1) indicated that the process is capable but not centered: the process mean deviates from its target value. The UCL (upper control limit), CL (center line) and LCL (lower control limit) for spinal cord cases were 1.37%, −0.27% and −1.89%, respectively. On the other hand, the range differences in prostate cases showed good agreement between calculated and measured values; the UCL, CL and LCL for prostate cases were 0.57%, −0.11% and −0.78%, respectively. Conclusion: SPC methodology has potential as a useful tool to customize optimal tolerance levels and to suggest suitable guidelines for patient-specific QA in clinical proton beams.
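
    The capability indices and control limits quoted above follow standard SPC formulas. The sketch below computes Cp, Cpk, Cpm and Cpmk together with individuals-chart limits for a set of invented range differences against the ±2% specification limits mentioned in the abstract.

```python
import numpy as np

rng = np.random.default_rng(5)

# Hypothetical range differences (measured - calculated, in %) for one
# treatment site; the specification limits follow the abstract's ±2%.
diff = rng.normal(loc=-0.1, scale=0.45, size=30)
lsl, usl, target = -2.0, 2.0, 0.0

mu, sigma = diff.mean(), diff.std(ddof=1)

# Classical capability indices used in the abstract.
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)
cpm = (usl - lsl) / (6 * np.sqrt(sigma**2 + (mu - target) ** 2))
cpmk = min(usl - mu, mu - lsl) / (3 * np.sqrt(sigma**2 + (mu - target) ** 2))

# Individuals-chart control limits (CL ± 3 sigma), as quoted per site.
ucl, cl, lcl = mu + 3 * sigma, mu, mu - 3 * sigma

print(f"Cp={cp:.2f} Cpk={cpk:.2f} Cpm={cpm:.2f} Cpmk={cpmk:.2f}")
print(f"UCL={ucl:.2f}%  CL={cl:.2f}%  LCL={lcl:.2f}%")
```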

  1. Computer program for prediction of fuel consumption statistical data for an upper stage three-axes stabilized on-off control system

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A FORTRAN-coded computer program and method for predicting reaction control fuel consumption statistics for a three-axis stabilized rocket vehicle upper stage are described. A Monte Carlo approach is used, made more efficient by closed-form estimates of impulses. The effects of rocket motor thrust misalignment, static unbalance, aerodynamic disturbances, and deviations in trajectory, mass properties and control system characteristics are included. This routine can be applied to many types of on-off reaction-controlled vehicles. The pseudorandom number generation and statistical analysis subroutines, including the output histograms, can be used for other Monte Carlo analysis problems.

  2. A Bayesian approach to probabilistic sensitivity analysis in structured benefit-risk assessment.

    PubMed

    Waddingham, Ed; Mt-Isa, Shahrul; Nixon, Richard; Ashby, Deborah

    2016-01-01

    Quantitative decision models such as multiple criteria decision analysis (MCDA) can be used in benefit-risk assessment to formalize trade-offs between benefits and risks, providing transparency to the assessment process. There is however no well-established method for propagating uncertainty of treatment effects data through such models to provide a sense of the variability of the benefit-risk balance. Here, we present a Bayesian statistical method that directly models the outcomes observed in randomized placebo-controlled trials and uses this to infer indirect comparisons between competing active treatments. The resulting treatment effects estimates are suitable for use within the MCDA setting, and it is possible to derive the distribution of the overall benefit-risk balance through Markov Chain Monte Carlo simulation. The method is illustrated using a case study of natalizumab for relapsing-remitting multiple sclerosis.

  3. Assessment of adherence to the statistical components of consolidated standards of reporting trials statement for quality of reports on randomized controlled trials from five pharmacology journals

    PubMed Central

    Satpute, Sachin; Mehta, Manthan; Bhete, Sandeep; Kurle, Dnyneshwar

    2016-01-01

    Background: The Consolidated Standards of Reporting Trials (CONSORT) statement is a device to standardize reporting and improve the quality of reports of controlled trials. However, little attention is paid to the statistical components of the CONSORT checklist. The present study evaluates randomized controlled trials (RCTs) published in five high-impact pharmacology journals with respect to their statistical methods. Methods: RCTs published in 2013 and 2014 in five pharmacology journals with high impact factors, The Journal of Clinical Pharmacology (JCP), British Journal of Clinical Pharmacology (BJCP), European Journal of Clinical Pharmacology (EJCP), Journal of Pharmacology & Pharmacotherapeutics (JPP) and Indian Journal of Pharmacology (IJP), were assessed for adherence to the statistical components of the CONSORT statement. Results: Of the 174 RCTs analysed, 103 described the method of sample size calculation. Of the five journals, the most reports in JCP (34/50) and the fewest in IJP (13/31) adhered to CONSORT checklist item 7a (sample size calculation). Most reports (171/174) mentioned the statistical methods used for data analysis, as per checklist item 12 (statistical methods used). Analysis of variance (ANOVA) was the most commonly used test (88/174). The software used for statistical analysis was mentioned in 111 RCTs, with SPSS used most frequently (58/111). The exact p value was stated in 108 reports. Certain errors in statistical analysis were also noted (40/174). Conclusion: These findings show inconsistencies and non-adherence to the statistical components of the CONSORT statement, especially with respect to sample size calculation. Special attention must be paid to the statistical accuracy of the reports. PMID:27453829

  4. Optical imaging in a variational Bayesian framework

    NASA Astrophysics Data System (ADS)

    Arhab, S.; Ayasso, H.; Duchêne, B.; Mohammad-Djafari, A.

    2014-10-01

    We are interested in optical imaging of nano-structured man-made objects. Optical imaging is treated as a nonlinear inverse scattering problem in which the goal is to retrieve the dielectric parameters of an unknown object. In addition to being nonlinear, such problems are also known to be ill-posed, which means that regularization is required prior to their resolution. This is done by introducing a priori information, namely that the object is known to be composed of compact homogeneous regions made of a finite number of different materials. This a priori knowledge is appropriately translated into a Bayesian framework by a Gauss-Markov-Potts prior. Hence, a Gauss-Markov random field is used to model the contrast distribution, whereas a hidden Potts-Markov field accounts for the compactness of the regions. The problem is then solved by means of a variational Bayesian approximation, which consists of approximating the joint posterior law of all unknown parameters, in the Kullback-Leibler sense, by a separable free-form distribution. This leads to an implicit parametric optimization scheme which is solved iteratively. This inversion algorithm is applied to laboratory-controlled experimental data.

  5. Mediator effect of statistical process control between Total Quality Management (TQM) and business performance in Malaysian Automotive Industry

    NASA Astrophysics Data System (ADS)

    Ahmad, M. F.; Rasi, R. Z.; Zakuan, N.; Hisyamudin, M. N. N.

    2015-12-01

    In today's highly competitive market, Total Quality Management (TQM) is a vital management tool for ensuring that a company can succeed in its business. In order to survive in the global market, with intense competition amongst regions and enterprises, the adoption of tools and techniques is essential for improving business performance. Previous studies have consistently found a relationship between TQM and business performance. However, only a few previous studies have examined the mediating effect of statistical process control (SPC) between TQM and business performance. A mediator is a third variable that changes the association between an independent variable and an outcome variable. This study proposes a TQM performance model with SPC as a mediator, analysed with structural equation modelling, which is a more comprehensive model for developing countries, specifically for Malaysia. A questionnaire was prepared and sent to 1500 companies from the automotive industry and related vendors in Malaysia, giving a 21.8 per cent response rate. The findings on the significance of the mediating effect between TQM practices and business performance showed that SPC is an important tool and technique in TQM implementation. The results indicate that SPC partially mediates the relationship between TQM and business performance, with an indirect effect (IE) of 0.25, which can be categorised as a high mediating effect.

  6. Application of statistical methods (SPC) for an optimized control of the irradiation process of high-power semiconductors

    NASA Astrophysics Data System (ADS)

    Mittendorfer, J.; Zwanziger, P.

    2000-03-01

    High-power bipolar semiconductor devices (thyristors and diodes) in a disc-type shape are key components (semiconductor switches) for high-power electronic systems. These systems are important for the economic design of energy transmission systems, i.e. high-power drive systems, static compensation and high-voltage DC transmission lines. In its factory in Pretzfeld, Germany, the company eupec GmbH+Co.KG (eupec) produces high-end disc-type devices with ceramic encapsulation for the world market. These elements have to fulfil special customer requirements and therefore deliver tailor-made trade-offs between their on-state voltage and dynamic switching behaviour. This can be achieved by applying a dedicated electron irradiation to the semiconductor pellets, which tunes the trade-off. In this paper, the requirements placed on the irradiation company, Mediscan GmbH, are described from the point of view of the semiconductor manufacturer. The strategy for controlling the irradiation results to fulfil these requirements is presented, together with the choice of relevant parameters from the viewpoint of the irradiation company. The set of process parameters monitored, using statistical process control (SPC) techniques, includes beam current and energy, conveyor speed and irradiation geometry. The results are highlighted and show the successful co-operation in this business. Viewing the process from the opposite perspective, an idea is presented and discussed for developing a highly sensitive dose detection device based on modified diodes, which could serve as accurate yet cheap and easy-to-use routine dosimeters for irradiation institutes.

  7. Latent features in similarity judgments: a nonparametric bayesian approach.

    PubMed

    Navarro, Daniel J; Griffiths, Thomas L

    2008-11-01

    One of the central problems in cognitive science is determining the mental representations that underlie human inferences. Solutions to this problem often rely on the analysis of subjective similarity judgments, on the assumption that recognizing likenesses between people, objects, and events is crucial to everyday inference. One such solution is provided by the additive clustering model, which is widely used to infer the features of a set of stimuli from their similarities, on the assumption that similarity is a weighted linear function of common features. Existing approaches for implementing additive clustering often lack a complete framework for statistical inference, particularly with respect to choosing the number of features. To address these problems, this article develops a fully Bayesian formulation of the additive clustering model, using methods from nonparametric Bayesian statistics to allow the number of features to vary. We use this to explore several approaches to parameter estimation, showing that the nonparametric Bayesian approach provides a straightforward way to obtain estimates of both the number of features and their importance. PMID:18533818

  8. Bayesian networks for evaluation of evidence from forensic entomology.

    PubMed

    Andersson, M Gunnar; Sundström, Anders; Lindström, Anders

    2013-09-01

    In the aftermath of a CBRN incident, there is an urgent need to reconstruct events in order to bring the perpetrators to court and to take preventive actions for the future. The challenge is to discriminate, based on available information, between alternative scenarios. Forensic interpretation is used to evaluate to what extent results from the forensic investigation favor the prosecutors' or the defendants' arguments, using the framework of Bayesian hypothesis testing. Recently, several new scientific disciplines have been used in a forensic context. In the AniBioThreat project, the framework was applied to veterinary forensic pathology, tracing of pathogenic microorganisms, and forensic entomology. Forensic entomology is an important tool for estimating the postmortem interval in, for example, homicide investigations as a complement to more traditional methods. In this article we demonstrate the applicability of the Bayesian framework for evaluating entomological evidence in a forensic investigation through the analysis of a hypothetical scenario involving suspect movement of carcasses from a clandestine laboratory. Probabilities of different findings under the alternative hypotheses were estimated using a combination of statistical analysis of data, expert knowledge, and simulation, and entomological findings are used to update the beliefs about the prosecutors' and defendants' hypotheses and to calculate the value of evidence. The Bayesian framework proved useful for evaluating complex hypotheses using findings from several insect species, accounting for uncertainty about development rate, temperature, and precolonization. The applicability of the forensic statistic approach to evaluating forensic results from a CBRN incident is discussed.
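
    The value-of-evidence calculation at the heart of this framework is a likelihood ratio. The sketch below uses invented placeholder distributions (not entomological models) to show how a finding is evaluated under the prosecution and defence hypotheses and how the resulting ratio updates prior odds.

```python
import numpy as np
from scipy import stats

# Toy forensic-style evaluation: the "evidence" E is an observed insect
# development stage (here summarized as an estimated age in days), and the
# two hypotheses Hp and Hd imply different postmortem-interval distributions.
# All distributions below are invented placeholders, not entomological models.
observed_age = 6.2  # days, hypothetical finding

# Likelihood of the finding under each hypothesis, with uncertainty in
# development rate and temperature folded into the standard deviation.
lik_hp = stats.norm.pdf(observed_age, loc=6.0, scale=0.8)   # prosecution scenario
lik_hd = stats.norm.pdf(observed_age, loc=3.5, scale=0.8)   # defence scenario

# Value of evidence = likelihood ratio; it updates prior odds to posterior odds.
lr = lik_hp / lik_hd
prior_odds = 1.0                      # neutral prior odds for illustration
posterior_odds = lr * prior_odds
print(f"likelihood ratio (value of evidence) = {lr:.1f}")
print(f"posterior odds = {posterior_odds:.1f}, log10 LR = {np.log10(lr):.2f}")
```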

  9. Bayesian analysis of the flutter margin method in aeroelasticity

    DOE PAGES

    Khalil, Mohammad; Poirel, Dominique; Sarkar, Abhijit

    2016-08-27

    A Bayesian statistical framework is presented for the Zimmerman and Weissenburger flutter margin method, which considers the uncertainties in aeroelastic modal parameters. The proposed methodology overcomes the limitations of the previously developed least-squares based estimation technique, which relies on the Gaussian approximation of the flutter margin probability density function (pdf). Using the measured free-decay responses at subcritical (preflutter) airspeeds, the joint non-Gaussian posterior pdf of the modal parameters is sampled using the Metropolis–Hastings (MH) Markov chain Monte Carlo (MCMC) algorithm. The posterior MCMC samples of the modal parameters are then used to obtain the flutter margin pdfs and finally the flutter speed pdf. The usefulness of the Bayesian flutter margin method is demonstrated using synthetic data generated from a two-degree-of-freedom pitch-plunge aeroelastic model. The robustness of the statistical framework is demonstrated using different sets of measurement data. In conclusion, it is shown that the probabilistic (Bayesian) approach reduces the number of test points required to provide a flutter speed estimate of a given accuracy and precision.
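
    The sampling step described here is a standard random-walk Metropolis-Hastings scheme. The sketch below applies it to a toy one-parameter stand-in for the modal-parameter inference; it illustrates the MH acceptance rule only and is not the Zimmerman-Weissenburger aeroelastic model.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(11)

# Toy stand-in: sample the posterior of a damping-like parameter given noisy
# "free-decay" estimates; the model is illustrative only.
true_zeta = 0.04
data = true_zeta + 0.01 * rng.normal(size=20)           # noisy estimates

def log_post(zeta):
    if zeta <= 0 or zeta >= 1:
        return -np.inf                                   # prior support (0, 1)
    return np.sum(stats.norm.logpdf(data, loc=zeta, scale=0.01))

# Random-walk Metropolis-Hastings.
n_iter, step = 20000, 0.005
chain = np.empty(n_iter)
zeta, lp = 0.1, log_post(0.1)
accepted = 0
for i in range(n_iter):
    prop = zeta + step * rng.normal()
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:             # MH acceptance rule
        zeta, lp = prop, lp_prop
        accepted += 1
    chain[i] = zeta

posterior = chain[n_iter // 2:]                          # discard burn-in
print(f"acceptance rate {accepted / n_iter:.2f}, "
      f"posterior mean {posterior.mean():.4f} +/- {posterior.std():.4f}")
```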

  10. Cosmic statistics of statistics

    NASA Astrophysics Data System (ADS)

    Szapudi, István; Colombi, Stéphane; Bernardeau, Francis

    1999-12-01

    The errors on statistics measured in finite galaxy catalogues are exhaustively investigated. The theory of errors on factorial moments by Szapudi & Colombi is applied to cumulants via a series expansion method. All results are subsequently extended to the weakly non-linear regime. Together with previous investigations this yields an analytic theory of the errors for moments and connected moments of counts in cells from highly non-linear to weakly non-linear scales. For non-linear functions of unbiased estimators, such as the cumulants, the phenomenon of cosmic bias is identified and computed. Since it is subdued by the cosmic errors in the range of applicability of the theory, correction for it is inconsequential. In addition, the method of Colombi, Szapudi & Szalay concerning sampling effects is generalized, adapting the theory for inhomogeneous galaxy catalogues. While previous work focused on the variance only, the present article calculates the cross-correlations between moments and connected moments as well for a statistically complete description. The final analytic formulae representing the full theory are explicit but somewhat complicated. Therefore we have made available a fortran program capable of calculating the described quantities numerically (for further details e-mail SC at colombi@iap.fr). An important special case is the evaluation of the errors on the two-point correlation function, for which this should be more accurate than any method put forward previously. This tool will be immensely useful in the future for assessing the precision of measurements from existing catalogues, as well as aiding the design of new galaxy surveys. To illustrate the applicability of the results and to explore the numerical aspects of the theory qualitatively and quantitatively, the errors and cross-correlations are predicted under a wide range of assumptions for the future Sloan Digital Sky Survey. The principal results concerning the cumulants ξ, Q3 and Q4 is that

  11. Bayesian parameter inference and model selection by population annealing in systems biology.

    PubMed

    Murakami, Yohei

    2014-01-01

    Parameter inference and model selection are very important for mathematical modeling in systems biology. Bayesian statistics can be used to conduct both parameter inference and model selection. In particular, the framework known as approximate Bayesian computation is often used for parameter inference and model selection in systems biology. However, Monte Carlo methods need to be used to compute Bayesian posterior distributions. In addition, the posterior distributions of parameters are sometimes almost uniform or very similar to their prior distributions. In such cases, it is difficult to choose one specific parameter value with high credibility as the representative value of the distribution. To overcome these problems, we introduced one of the population Monte Carlo algorithms, population annealing. Although population annealing is usually used in statistical mechanics, we showed that it can be used to compute Bayesian posterior distributions in the approximate Bayesian computation framework. To deal with the non-identifiability of representative parameter values, we proposed running the simulations with a parameter ensemble sampled from the posterior distribution, named the "posterior parameter ensemble". We showed that population annealing is an efficient and convenient algorithm for generating the posterior parameter ensemble. We also showed that simulations with the posterior parameter ensemble can not only reproduce the data used for parameter inference but also capture and predict data that were not used for parameter inference. Lastly, we introduced the marginal likelihood in the approximate Bayesian computation framework for Bayesian model selection. We showed that population annealing enables us to compute the marginal likelihood in the approximate Bayesian computation framework and to conduct model selection based on the Bayes factor.
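
    The basic approximate Bayesian computation ingredients that population annealing builds on (prior, simulator, distance, tolerance) can be shown with plain ABC rejection. The sketch below infers a decay rate in a toy exponential-decay model; population annealing would replace the single fixed tolerance with a weighted particle population pushed through a sequence of shrinking tolerances.

```python
import numpy as np

rng = np.random.default_rng(13)

# Toy systems-biology-flavoured example: infer the decay rate k of
# x(t) = x0 * exp(-k t) from noisy observations, using plain ABC rejection.
t = np.linspace(0.0, 5.0, 10)
x0, k_true = 10.0, 0.7
observed = x0 * np.exp(-k_true * t) + 0.2 * rng.normal(size=t.size)

def simulate(k):
    return x0 * np.exp(-k * t)

def distance(sim, obs):
    return np.sqrt(np.mean((sim - obs) ** 2))

# ABC rejection: draw from the prior, keep parameters whose simulated data
# fall within tolerance eps of the observations.
prior_draws = rng.uniform(0.0, 3.0, size=50_000)
eps = 0.3
kept = np.array([k for k in prior_draws if distance(simulate(k), observed) < eps])

print(f"kept {kept.size} of {prior_draws.size} draws")
print(f"approximate posterior for k: mean {kept.mean():.3f}, sd {kept.std():.3f}")
```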

  12. Bayesian Modeling of Time Trends in Component Reliability Data via Markov Chain Monte Carlo Simulation

    SciTech Connect

    D. L. Kelly

    2007-06-01

    Markov chain Monte Carlo (MCMC) techniques represent an extremely flexible and powerful approach to Bayesian modeling. This work illustrates the application of such techniques to time-dependent reliability of components with repair. The WinBUGS package is used to illustrate, via examples, how Bayesian techniques can be used for parametric statistical modeling of time-dependent component reliability. Additionally, the crucial, but often overlooked subject of model validation is discussed, and summary statistics for judging the model’s ability to replicate the observed data are developed, based on the posterior predictive distribution for the parameters of interest.

  13. On becoming a Bayesian: early correspondences between J. Cornfield and L. J. Savage.

    PubMed

    Greenhouse, Joel B

    2012-10-30

    Jerome Cornfield was arguably the leading proponent for the use of Bayesian methods in biostatistics during the 1960s. Prior to 1963, however, Cornfield had no publications in the area of Bayesian statistics. At a time when frequentist methods were the dominant influence on statistical practice, Cornfield went against the mainstream and embraced Bayes. The goals of this paper are as follows: (i) to explore how and why this transformation came about and (ii) to provide some sense as to who Cornfield was and the context in which he worked.

  14. Merging Digital Surface Models Implementing Bayesian Approaches

    NASA Astrophysics Data System (ADS)

    Sadeq, H.; Drummond, J.; Li, Z.

    2016-06-01

    In this research different DSMs from different sources have been merged. The merging is based on a probabilistic model using a Bayesian Approach. The implemented data have been sourced from very high resolution satellite imagery sensors (e.g. WorldView-1 and Pleiades). It is deemed preferable to use a Bayesian Approach when the data obtained from the sensors are limited and it is difficult to obtain many measurements or it would be very costly, thus the problem of the lack of data can be solved by introducing a priori estimations of data. To infer the prior data, it is assumed that the roofs of the buildings are specified as smooth, and for that purpose local entropy has been implemented. In addition to the a priori estimations, GNSS RTK measurements have been collected in the field which are used as check points to assess the quality of the DSMs and to validate the merging result. The model has been applied in the West-End of Glasgow containing different kinds of buildings, such as flat roofed and hipped roofed buildings. Both quantitative and qualitative methods have been employed to validate the merged DSM. The validation results have shown that the model was successfully able to improve the quality of the DSMs and improving some characteristics such as the roof surfaces, which consequently led to better representations. In addition to that, the developed model has been compared with the well established Maximum Likelihood model and showed similar quantitative statistical results and better qualitative results. Although the proposed model has been applied on DSMs that were derived from satellite imagery, it can be applied to any other sourced DSMs.

  15. Bayesian Blocks: A New Method to Analyze Photon Counting Data

    NASA Technical Reports Server (NTRS)

    Scargle, Jeffrey D.; Bloom, Elliott D.; Young, Richard E. (Technical Monitor)

    1997-01-01

    A Bayesian analysis of photon-counting data leads to a new time-domain algorithm for detecting localized structures (bursts), revealing pulse shapes, and generally characterizing intensity variations. The raw counting data -- time-tag events (TTE), time-to-spill (TTS) data, or binned counts -- is converted to a maximum likelihood segmentation of the observation into time intervals during which the photon arrival rate is perceptibly constant -- i.e. has a fixed intensity without statistically significant variations. The resulting structures, Bayesian Blocks, can be thought of as bins with arbitrary spacing determined by the data. The method itself sets no lower limit to the time scale on which variability can be detected. We have applied the method to RXTE data on Cyg X-1, yielding information on this source's short-time-scale variability.
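
    Assuming the Astropy implementation of this algorithm is available, the segmentation can be tried on synthetic event data in a few lines; the block edges adapt to the data rather than using fixed bin widths.

```python
import numpy as np
from astropy.stats import bayesian_blocks  # assumes astropy is installed

rng = np.random.default_rng(19)

# Synthetic photon arrival times: a constant background plus a short burst.
background = rng.uniform(0.0, 100.0, size=500)
burst = rng.uniform(40.0, 42.0, size=120)
t = np.sort(np.concatenate([background, burst]))

# Bayesian Blocks: a maximum-likelihood segmentation of the observation into
# intervals of perceptibly constant rate, with change points set by the data.
edges = bayesian_blocks(t, fitness='events')
rates = np.histogram(t, bins=edges)[0] / np.diff(edges)
for lo, hi, r in zip(edges[:-1], edges[1:], rates):
    print(f"block [{lo:6.2f}, {hi:6.2f}]  rate = {r:6.2f} events per unit time")
```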

  16. A Bayesian sequential processor approach to spectroscopic portal system decisions

    SciTech Connect

    Sale, K; Candy, J; Breitfeller, E; Guidry, B; Manatt, D; Gosnell, T; Chambers, D

    2007-07-31

    The development of faster, more reliable techniques to detect radioactive contraband in a portal-type scenario is an extremely important problem, especially in this era of constant terrorist threats. Towards this goal, the development of a model-based, Bayesian sequential data processor for the detection problem is discussed. In the sequential processor each datum (detector energy deposit and pulse arrival time) is used to update the posterior probability distribution over the space of model parameters. The nature of the sequential processor approach is that a detection is produced as soon as it is statistically justified by the data rather than waiting for a fixed counting interval before any analysis is performed. In this paper the Bayesian model-based approach, physics and signal processing models and decision functions are discussed along with the first results of our research.
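
    The sequential character of the processor can be illustrated with a toy Poisson counting problem: the posterior probability of 'source present' is updated after every count and a detection is declared as soon as it crosses a threshold, rather than after a fixed counting interval. All rates and thresholds below are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(17)

# Toy sequential detection: photons arrive from background (rate b) or from
# background plus a source (rate b + s). After each one-second count we update
# the posterior probability that a source is present and stop as soon as the
# evidence is strong enough. Rates and thresholds are invented for illustration.
b, s = 5.0, 3.0
prior_source = 0.5
decision_threshold = 0.99

counts = rng.poisson(b + s, size=60)      # simulate a true "source present" case

log_odds = np.log(prior_source / (1 - prior_source))
for i, n in enumerate(counts, start=1):
    # Per-datum Bayes update: add the log likelihood ratio of this count.
    log_odds += stats.poisson.logpmf(n, b + s) - stats.poisson.logpmf(n, b)
    p_source = 1.0 / (1.0 + np.exp(-log_odds))
    if p_source > decision_threshold:
        print(f"declared detection after {i} s, P(source) = {p_source:.4f}")
        break
else:
    print(f"no decision after {counts.size} s, P(source) = {p_source:.4f}")
```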

  17. An Overview of Bayesian Methods for Neural Spike Train Analysis

    PubMed Central

    2013-01-01

    Neural spike train analysis is an important task in computational neuroscience which aims to understand neural mechanisms and gain insights into neural circuits. With the advancement of multielectrode recording and imaging technologies, it has become increasingly demanding to develop statistical tools for analyzing large neuronal ensemble spike activity. Here we present a tutorial overview of Bayesian methods and their representative applications in neural spike train analysis, at both single neuron and population levels. On the theoretical side, we focus on various approximate Bayesian inference techniques as applied to latent state and parameter estimation. On the application side, the topics include spike sorting, tuning curve estimation, neural encoding and decoding, deconvolution of spike trains from calcium imaging signals, and inference of neuronal functional connectivity and synchrony. Some research challenges and opportunities for neural spike train analysis are discussed. PMID:24348527

  18. Bayesian seismology of the Sun

    NASA Astrophysics Data System (ADS)

    Gruberbauer, M.; Guenther, D. B.

    2013-06-01

    We perform a Bayesian grid-based analysis of the solar l = 0, 1, 2 and 3 p modes obtained via BiSON in order to deliver the first Bayesian asteroseismic analysis of the solar composition problem. We do not find decisive evidence to prefer either of the contending chemical compositions, although the revised solar abundances (AGSS09) are more probable in general. We do find indications for systematic problems in standard stellar evolution models, unrelated to the consequences of inadequate modelling of the outer layers on the higher order modes. The seismic observables are best fitted by solar models that are several hundred million years older than the meteoritic age of the Sun. Similarly, meteoritic age calibrated models do not adequately reproduce the observed seismic observables. Our results suggest that these problems will affect any asteroseismic inference that relies on a calibration to the Sun.

  19. Bayesian segmentation of hyperspectral images

    NASA Astrophysics Data System (ADS)

    Mohammadpour, Adel; Féron, Olivier; Mohammad-Djafari, Ali

    2004-11-01

    In this paper we consider the problem of joint segmentation of hyperspectral images in the Bayesian framework. The proposed approach is based on a Hidden Markov Modeling (HMM) of the images with common segmentation, or equivalently with common hidden classification label variables which is modeled by a Potts Markov Random Field. We introduce an appropriate Markov Chain Monte Carlo (MCMC) algorithm to implement the method and show some simulation results.

  20. Elements of Bayesian experimental design

    SciTech Connect

    Sivia, D.S.

    1997-09-01

    We consider some elements of the Bayesian approach that are important for optimal experimental design. While the underlying principles used are very general, and are explained in detail in a recent tutorial text, they are applied here to the specific case of characterising the inferential value of different resolution peakshapes. This particular issue was considered earlier by Silver, Sivia and Pynn (1989, 1990a, 1990b), and the following presentation confirms and extends the conclusions of their analysis.