ERIC Educational Resources Information Center
Meyer, Donald L.
Bayesian statistical methodology and its possible uses in the behavioral sciences are discussed in relation to the solution of problems in both the use and teaching of fundamental statistical methods, including confidence intervals, significance tests, and sampling. The Bayesian model explains these statistical methods and offers a consistent…
Information geometry of Bayesian statistics
NASA Astrophysics Data System (ADS)
Matsuzoe, Hiroshi
2015-01-01
A survey of the geometry of Bayesian statistics is given. From the viewpoint of differential geometry, a prior distribution in Bayesian statistics is regarded as a volume element on a statistical model. In this paper, properties of Bayesian estimators are studied by applying equiaffine structures of statistical manifolds. In addition, the geometry of anomalous statistics is also studied. Deformed expectations and deformed independences are important in anomalous statistics. After summarizing the geometry of such deformed structures, a generalization of the maximum likelihood method is given. A suitable weight on a parameter space is important in Bayesian statistics, whereas a suitable weight on a sample space is important in anomalous statistics.
Bayesian Statistics for Biological Data: Pedigree Analysis
ERIC Educational Resources Information Center
Stanfield, William D.; Carlton, Matthew A.
2004-01-01
Bayes' formula is applied to the biological problem of pedigree analysis to show that Bayesian and non-Bayesian or "classical" methods of probability calculation give different answers. First-year college biology students can thus be introduced to Bayesian statistics.
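A minimal sketch of the kind of pedigree calculation the abstract describes, using textbook-style numbers that are not taken from the article: a woman whose brother has an autosomal recessive disease is a carrier with prior probability 2/3, and each child she has with a known carrier is unaffected with probability 3/4 if she is a carrier.

```python
# Hypothetical pedigree example (not from the paper): prior carrier
# probability 2/3; each child of two carriers is unaffected with
# probability 3/4; a non-carrier mother always has unaffected children.
def posterior_carrier(prior, n_unaffected):
    """Bayes' formula: update the carrier probability after observing
    n_unaffected healthy children with a known-carrier partner."""
    like_carrier = (3 / 4) ** n_unaffected   # P(data | carrier)
    like_noncarrier = 1.0                    # P(data | non-carrier)
    num = prior * like_carrier
    return num / (num + (1 - prior) * like_noncarrier)

p = posterior_carrier(2 / 3, 3)
print(round(p, 4))  # 27/59, about 0.4576
```

Three unaffected children lower the carrier probability from 2/3 to 27/59, which is exactly the kind of result that differs from a naive "classical" frequency calculation.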
Philosophy and the practice of Bayesian statistics
Gelman, Andrew; Shalizi, Cosma Rohilla
2015-01-01
A substantial school in the philosophy of science identifies Bayesian inference with inductive inference and even rationality as such, and seems to be strengthened by the rise and practical success of Bayesian statistics. We argue that the most successful forms of Bayesian statistics do not actually support that particular philosophy but rather accord much better with sophisticated forms of hypothetico-deductivism. We examine the actual role played by prior distributions in Bayesian models, and the crucial aspects of model checking and model revision, which fall outside the scope of Bayesian confirmation theory. We draw on the literature on the consistency of Bayesian updating and also on our experience of applied work in social science. Clarity about these matters should benefit not just philosophy of science, but also statistical practice. At best, the inductivist view has encouraged researchers to fit and compare models without checking them; at worst, theorists have actively discouraged practitioners from performing model checking because it does not fit into their framework. PMID:22364575
Bayesian versus 'plain-vanilla Bayesian' multitarget statistics
NASA Astrophysics Data System (ADS)
Mahler, Ronald P. S.
2004-08-01
Finite-set statistics (FISST) is a direct generalization of single-sensor, single-target Bayes statistics to the multisensor-multitarget realm, based on random set theory. Various aspects of FISST are being investigated by several research teams around the world. In recent years, however, a few partisans have claimed that a "plain-vanilla Bayesian approach" suffices as down-to-earth, "straightforward," and general "first principles" for multitarget problems, and that FISST is therefore mere mathematical "obfuscation." In this and a companion paper I demonstrate the speciousness of these claims. In this paper I summarize general Bayes statistics, what is required to use it in multisensor-multitarget problems, and why FISST is necessary to make it practical. I then demonstrate that the "plain-vanilla Bayesian approach" is so heedlessly formulated that it is erroneous and not even Bayesian; that it denigrates FISST concepts while unwittingly assuming them; and that it has resulted in a succession of algorithms afflicted by inherent, but less than candidly acknowledged, computational "logjams."
Teaching Bayesian Statistics to Undergraduate Students through Debates
ERIC Educational Resources Information Center
Stewart, Sepideh; Stewart, Wayne
2014-01-01
This paper describes a lecturer's approach to teaching Bayesian statistics to students who were only exposed to the classical paradigm. The study shows how the lecturer extended himself by making use of ventriloquist dolls to grab hold of students' attention and embed important ideas in revealing the differences between the Bayesian and…
Liley, James; Wallace, Chris
2015-01-01
Genome-wide association studies (GWAS) have been successful in identifying single nucleotide polymorphisms (SNPs) associated with many traits and diseases. However, at existing sample sizes, these variants explain only part of the estimated heritability. Leverage of GWAS results from related phenotypes may improve detection without the need for larger datasets. The Bayesian conditional false discovery rate (cFDR) constitutes an upper bound on the expected false discovery rate (FDR) across a set of SNPs whose p values for two diseases are both less than two disease-specific thresholds. Calculation of the cFDR requires only summary statistics and has several advantages over traditional GWAS analysis. However, existing methods require distinct control samples between studies. Here, we extend the technique to allow for some or all controls to be shared, increasing applicability. Several different SNP sets can be defined with the same cFDR value, and we show that the expected FDR across the union of these sets may exceed expected FDR in any single set. We describe a procedure to establish an upper bound for the expected FDR among the union of such sets of SNPs. We apply our technique to pairwise analysis of p values from ten autoimmune diseases with variable sharing of controls, enabling discovery of 59 SNP-disease associations which do not reach GWAS significance after genomic control in individual datasets. Most of the SNPs we highlight have previously been confirmed using replication studies or larger GWAS, a useful validation of our technique; we report eight SNP-disease associations across five diseases not previously reported. Our technique extends and strengthens the previous algorithm, and establishes robust limits on the expected FDR. This approach can improve SNP detection in GWAS, and give insight into shared aetiology between phenotypically related conditions. PMID:25658688
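The core empirical cFDR estimator can be sketched in a few lines. This is the standard empirical form (the p-value threshold rescaled by the enrichment of small p_i among SNPs that are also small for the conditioning trait); the paper's shared-control adjustment and union-of-sets bound are not reproduced here, and the example p values are invented.

```python
import numpy as np

# Empirical conditional FDR:
#   cFDR(p|q) ~= p * #{P_j <= q} / #{P_i <= p and P_j <= q}
# Sketch only; the shared-control correction from the paper is omitted.
def cfdr(p, q, p_i, p_j):
    """Estimate the cFDR at thresholds (p, q) from paired p-value arrays."""
    n_q = np.count_nonzero(p_j <= q)
    n_pq = np.count_nonzero((p_i <= p) & (p_j <= q))
    if n_pq == 0:
        return 1.0
    return min(1.0, p * n_q / n_pq)

# Six invented SNPs with p values for two diseases.
p_i = np.array([1e-6, 2e-4, 0.03, 0.2, 0.6, 0.9])
p_j = np.array([1e-3, 0.04, 0.5, 0.02, 0.7, 0.1])
print(cfdr(2e-4, 0.05, p_i, p_j))
```

Conditioning on the second disease (q = 0.05) shrinks the effective false discovery bound well below the raw threshold p, which is how leverage of a related phenotype improves detection.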
Bayesian statistics in environmental engineering planning
Englehardt, J.D.; Simon, T.W.
1999-07-01
Today's engineer must be able to quantify both uncertainty due to information limitations, and the variability of natural processes, in order to determine risk. Nowhere is this emphasis on risk assessment more evident than in environmental engineering. The use of Bayesian inference for the rigorous assessment of risk based on available information is reviewed in this paper. Several example environmental engineering planning applications are presented: (1) assessment of losses involving the evaluation of proposed revisions to the South Florida Building Code after Hurricane Andrew; (2) development of a model to predict oil spill consequences due to proposed changes in the oil transportation network in the Gulf of Mexico; (3) studies of ambient concentrations of perchloroethylene surrounding dry cleaners and of tire particulates in residential areas near roadways in Miami, FL; (4) risk assessment from contaminated soils at a cleanup of an old transformer dump site.
A BAYESIAN STATISTICAL APPROACH FOR THE EVALUATION OF CMAQ
Bayesian statistical methods are used to evaluate Community Multiscale Air Quality (CMAQ) model simulations of sulfate aerosol over a section of the eastern US for 4-week periods in summer and winter 2001. The observed data come from two U.S. Environmental Protection Agency data ...
Reconstruction in emission tomography via a Bayesian multiscale statistical framework
NASA Astrophysics Data System (ADS)
Kolaczyk, Eric D.; Nowak, Robert D.
2000-12-01
Recently the authors introduced a general Bayesian statistical method for modeling and analysis in linear inverse problems involving certain types of count data. Emission-based tomography in medical imaging is a particularly important and common example of this type of problem. In this paper we provide an overview of the methodology and illustrate its application to problems in emission tomography through a series of simulated and real-data examples. The framework rests on the special manner in which a multiscale representation of recursive dyadic partitions interacts with the statistical likelihood of data with Poisson noise characteristics. In particular, the likelihood function permits a factorization, with respect to location-scale indexing, analogous to the manner in which, say, an arbitrary signal allows a wavelet transform. Recovery of an object from tomographic data is then posed as a problem involving the statistical estimation of a multiscale parameter vector. A type of statistical shrinkage estimation is used, induced by careful choice of a Bayesian prior probability structure for the parameters. Finally, the ill-posedness of the tomographic imaging problem is accounted for by embedding the above-described framework within a larger, but simpler, statistical estimation problem, via the so-called Expectation-Maximization approach. The resulting image reconstruction algorithm is iterative in nature, entailing the calculation of two closed-form algebraic expressions at each iteration. Convergence of the algorithm to a unique solution, under appropriate choice of Bayesian prior, can be assured.
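The multiscale Poisson factorization can be sketched directly on a 1-D count vector: counts on a dyadic partition are split recursively, the left-half total given its parent total is binomial, and a conjugate Beta(s, s) prior on each splitting probability yields posterior-mean shrinkage. The tomographic EM wrapper is omitted, and the Beta(s, s) split prior is an assumed illustrative choice.

```python
import numpy as np

# Multiscale Bayesian estimation for Poisson counts on a dyadic grid:
# each left-child count given its parent total is Binomial(n_parent, rho),
# so the likelihood factorizes over location-scale indices; a Beta(s, s)
# prior on each rho gives conjugate posterior-mean shrinkage.
def split_profile(counts, s=1.0):
    """Posterior-mean probability mass over bins (length a power of 2)."""
    if len(counts) == 1:
        return np.array([1.0])
    half = len(counts) // 2
    nl, nr = counts[:half].sum(), counts[half:].sum()
    rho = (nl + s) / (nl + nr + 2 * s)        # posterior mean split prob
    return np.concatenate([rho * split_profile(counts[:half], s),
                           (1 - rho) * split_profile(counts[half:], s)])

counts = np.array([0, 1, 8, 9])
intensity = counts.sum() * split_profile(counts)  # shrunken bin intensities
print(np.round(intensity, 2))
```

Note how the empty first bin receives a strictly positive estimate while the large bins are pulled slightly together: the shrinkage induced by the prior, without losing total mass.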
Spectral Analysis of B Stars: An Application of Bayesian Statistics
NASA Astrophysics Data System (ADS)
Mugnes, J.-M.; Robert, C.
2012-12-01
To better understand the processes involved in stellar physics, it is necessary to obtain accurate stellar parameters (effective temperature, surface gravity, abundances…). Spectral analysis is a powerful tool for investigating stars, but it is also vital to reduce uncertainties at a decent computational cost. Here we present a spectral analysis method based on a combination of Bayesian statistics and grids of synthetic spectra obtained with TLUSTY. This method simultaneously constrains the stellar parameters by using all the lines accessible in observed spectra and thus greatly reduces uncertainties and improves the overall spectrum fitting. Preliminary results are shown using spectra from the Observatoire du Mont-Mégantic.
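The combination step described above (one joint posterior over stellar parameters from many lines at once) can be sketched on a toy grid. The smooth "model" line depths below are an invented stand-in for a TLUSTY grid, and the noise level and parameter ranges are assumptions; only the per-line likelihood multiplication is illustrated.

```python
import numpy as np

# Grid posterior over (Teff, log g): per-line Gaussian likelihoods are
# multiplied (summed in log space) into a joint posterior. line_depth is
# an invented smooth stand-in for a synthetic-spectrum grid.
teff = np.linspace(15000, 30000, 61)         # K
logg = np.linspace(3.0, 4.5, 31)             # dex
T, G = np.meshgrid(teff, logg, indexing="ij")

def line_depth(T, G, k):
    """Invented smooth model for the depth of line k."""
    return 0.5 + 0.1 * np.sin(k + T / 5000.0) - 0.05 * (G - 3.7)

obs = {k: line_depth(22000.0, 3.9, k) for k in range(8)}   # 8 "observed" lines
log_post = np.zeros_like(T)
for k, depth in obs.items():
    log_post += -0.5 * ((line_depth(T, G, k) - depth) / 0.01) ** 2

post = np.exp(log_post - log_post.max())
post /= post.sum()
i, j = np.unravel_index(post.argmax(), post.shape)
print(f"MAP: Teff = {teff[i]:.0f} K, log g = {logg[j]:.2f}")
```

With eight lines of different sensitivity, the joint posterior is far narrower than any single-line posterior, which is the uncertainty reduction the abstract claims.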
Bayesian statistics and information fusion for GPS-denied navigation
NASA Astrophysics Data System (ADS)
Copp, Brian Lee
It is well known that satellite navigation systems are vulnerable to disruption due to jamming, spoofing, or obstruction of the signal. The desire for robust navigation of aircraft in GPS-denied environments has motivated the development of feature-aided navigation systems, in which measurements of environmental features are used to complement the dead reckoning solution produced by an inertial navigation system. Examples of environmental features which can be exploited for navigation include star positions, terrain elevation, terrestrial wireless signals, and features extracted from photographic data. Feature-aided navigation represents a particularly challenging estimation problem because the measurements are often strongly nonlinear, and the quality of the navigation solution is limited by the knowledge of nuisance parameters which may be difficult to model accurately. As a result, integration approaches based on the Kalman filter and its variants may fail to give adequate performance. This project develops a framework for the integration of feature-aided navigation techniques using Bayesian statistics. In this approach, the probability density function for aircraft horizontal position (latitude and longitude) is approximated by a two-dimensional point mass function defined on a rectangular grid. Nuisance parameters are estimated using a hypothesis-based approach (Multiple Model Adaptive Estimation) which continuously maintains an accurate probability density even in the presence of strong nonlinearities. The effectiveness of the proposed approach is illustrated by the simulated use of terrain referenced navigation and wireless time-of-arrival positioning to estimate a reference aircraft trajectory. Monte Carlo simulations have shown that accurate position estimates can be obtained in terrain referenced navigation even with a strongly nonlinear altitude bias. The integration of terrain referenced and wireless time-of-arrival measurements is described along with
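The grid-based (point-mass) Bayes update at the heart of the approach can be sketched in a few lines: multiply the prior grid by the measurement likelihood and renormalize. The terrain map, grid size, and noise level below are invented for illustration.

```python
import numpy as np

# Point-mass Bayes update on a latitude/longitude grid: the prior grid
# is multiplied by the Gaussian likelihood of an observed terrain
# elevation z and renormalized. Terrain and noise are invented.
def point_mass_update(prior, grid_elev, z, sigma):
    """One measurement update of a 2-D point-mass position density."""
    like = np.exp(-0.5 * ((grid_elev - z) / sigma) ** 2)
    post = prior * like
    return post / post.sum()

n = 50
prior = np.full((n, n), 1.0 / n**2)            # uniform prior over the grid
x, y = np.meshgrid(np.linspace(0, 1, n), np.linspace(0, 1, n))
elev = 100 * np.sin(3 * x) * np.cos(2 * y)     # synthetic smooth terrain
post = point_mass_update(prior, elev, z=elev[30, 20], sigma=5.0)
print(np.unravel_index(post.argmax(), post.shape))
```

Unlike a Kalman filter, the grid carries the full (possibly multimodal) density, so a strongly nonlinear terrain measurement poses no difficulty; a prediction step would convolve the grid with the motion-noise kernel.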
Bayesian Tracking of Emerging Epidemics Using Ensemble Optimal Statistical Interpolation
Cobb, Loren; Krishnamurthy, Ashok; Mandel, Jan; Beezley, Jonathan D.
2014-01-01
We present a preliminary test of the Ensemble Optimal Statistical Interpolation (EnOSI) method for the statistical tracking of an emerging epidemic, with a comparison to its popular relative for Bayesian data assimilation, the Ensemble Kalman Filter (EnKF). The spatial data for this test was generated by a spatial susceptible-infectious-removed (S-I-R) epidemic model of an airborne infectious disease. Both tracking methods in this test employed Poisson rather than Gaussian noise, so as to handle epidemic data more accurately. The EnOSI and EnKF tracking methods worked well on the main body of the simulated spatial epidemic, but the EnOSI was able to detect and track a distant secondary focus of infection that the EnKF missed entirely. PMID:25113590
Defining statistical perceptions with an empirical Bayesian approach
NASA Astrophysics Data System (ADS)
Tajima, Satohiro
2013-04-01
Extracting statistical structures (including textures or contrasts) from a natural stimulus is a central challenge in both biological and engineering contexts. This study interprets the process of statistical recognition in terms of hyperparameter estimations and free-energy minimization procedures with an empirical Bayesian approach. This mathematical interpretation resulted in a framework for relating physiological insights in animal sensory systems to the functional properties of recognizing stimulus statistics. We applied the present theoretical framework to two typical models of natural images that are encoded by a population of simulated retinal neurons, and demonstrated that the resulting cognitive performances could be quantified with the Fisher information measure. The current enterprise yielded predictions about the properties of human texture perception, suggesting that the perceptual resolution of image statistics depends on visual field angles, internal noise, and neuronal information processing pathways, such as the magnocellular, parvocellular, and koniocellular systems. Furthermore, the two conceptually similar natural-image models were found to yield qualitatively different predictions, striking a note of warning against confusing the two models when describing a natural image.
Bayesian statistical approach to binary asteroid orbit determination
NASA Astrophysics Data System (ADS)
Kovalenko, Irina D.; Stoica, Radu S.; Emelyanov, N. V.; Doressoundiram, A.; Hestroffer, D.
2016-01-01
The problem of binary-asteroid orbit determination is of particular interest, since knowledge of the orbit is the best way to derive the mass of the system. Orbit determination from observed points is a classic problem of celestial mechanics. In the case of binary asteroids, however, particularly with a small number of observations, the solution is not straightforward to derive. For resolved binaries the problem consists in determining the relative orbit from observed relative positions of the secondary asteroid with respect to the primary. In this work, the problem is investigated as a statistical inverse problem. Within this context, we propose a method based on Bayesian modelling together with a global optimisation procedure based on the simulated annealing algorithm.
Oakland, J.S.
1986-01-01
Addressing the increasing importance for firms to have a thorough knowledge of statistically based quality control procedures, this book presents the fundamentals of statistical process control (SPC) in a non-mathematical, practical way. It provides real-life examples and data drawn from a wide variety of industries. The foundations of good quality management and process control, and control of conformance and consistency during production are given. Offers clear guidance to those who wish to understand and implement modern SPC techniques.
Exploring aftershock properties with depth using Bayesian statistics
NASA Astrophysics Data System (ADS)
Narteau, Clement; Shebalin, Peter; Holschneider, Matthias
2013-04-01
Stress magnitudes and frictional faulting properties vary with depth and may strongly affect earthquake statistics. Nevertheless, although the Anderson faulting theory may be used to define the relative stress magnitudes, it remains extremely difficult to observe significant variations of earthquake properties from the top to the bottom of the seismogenic layer. Here, we concentrate on aftershock sequences in normal, strike-slip and reverse faulting regimes to isolate specific temporal properties of this major relaxation process with respect to depth. More precisely, we use Bayesian statistics of the Modified Omori Law to characterize the exponent p of the power-law aftershock decay rate and the duration c of the early stage of aftershock activity that does not fit this power-law regime. Preliminary results show that the c-value decreases with depth without any significant variation of the p-value. We then infer that the duration of a non-power-law aftershock decay rate over short times can be related to the level of stress in the seismogenic crust.
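A grid posterior for the Modified Omori parameters (c, p) can be sketched from the point-process likelihood of a catalogue of aftershock times, with the productivity K profiled out at its conditional maximum-likelihood value. The aftershock times below are synthetic, and the flat prior over (c, p) is an assumption, not necessarily the paper's choice.

```python
import numpy as np

# Modified Omori law: rate(t) = K / (t + c)^p over the window [0, T].
# log L = n*log(K) - p*sum(log(t_i + c)) - K*I, with I the integral of
# (t + c)^(-p); setting K to its MLE n/I gives a profile likelihood.
def log_profile_lik(times, T, c, p):
    """Point-process log-likelihood with K at its conditional MLE."""
    n = len(times)
    if abs(p - 1.0) < 1e-9:
        integral = np.log((T + c) / c)
    else:
        integral = ((T + c) ** (1 - p) - c ** (1 - p)) / (1 - p)
    k_hat = n / integral
    return n * np.log(k_hat) - p * np.sum(np.log(times + c)) - n

times = np.array([0.01, 0.02, 0.05, 0.1, 0.2, 0.5, 1.0, 2.0, 5.0])  # days
T = 10.0
cs = np.linspace(0.005, 0.5, 40)
ps = np.linspace(0.6, 1.6, 41)
log_post = np.array([[log_profile_lik(times, T, c, p) for p in ps] for c in cs])
post = np.exp(log_post - log_post.max())
post /= post.sum()                                  # flat prior over the grid
ci, pi = np.unravel_index(post.argmax(), post.shape)
print(f"MAP: c = {cs[ci]:.3f} days, p = {ps[pi]:.3f}")
```

Comparing such posteriors between shallow and deep sequences is the kind of analysis that reveals a depth dependence of c without a corresponding change in p.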
NASA Technical Reports Server (NTRS)
He, Yuning
2015-01-01
The behavior of complex aerospace systems is governed by numerous parameters. For safety analysis it is important to understand how the system behaves with respect to these parameter values. In particular, understanding the boundaries between safe and unsafe regions is of major importance. In this paper, we describe a hierarchical Bayesian statistical modeling approach for the online detection and characterization of such boundaries. Our method for classification with active learning uses a particle filter-based model and a boundary-aware metric for best performance. From a library of candidate shapes incorporating domain expert knowledge, the location and parameters of the boundaries are estimated using advanced Bayesian modeling techniques. The results of our boundary analysis are then provided in a form understandable by the domain expert. We illustrate our approach using a simulation model of a NASA neuro-adaptive flight control system, as well as a system for the detection of separation violations in the terminal airspace.
Bayesian Statistical Approach To Binary Asteroid Orbit Determination
NASA Astrophysics Data System (ADS)
Dmitrievna Kovalenko, Irina; Stoica, Radu S.
2015-08-01
Orbit determination from observations is one of the classical problems of celestial mechanics. Deriving the trajectory of a binary asteroid with high precision is much more complicated than deriving that of a single asteroid. Here we present a method of orbit determination based on Markov chain Monte Carlo (MCMC). This method can be used for preliminary orbit determination with a relatively small number of observations, or for the adjustment of a previously determined orbit. The problem consists in determining a conditional a posteriori probability density given the observations. Applying Bayesian statistics, the a posteriori probability density of the binary asteroid's orbital parameters is proportional to the product of the a priori and likelihood probability densities. The likelihood function is related to the noise probability density and can be calculated from O-C deviations (Observed minus Calculated positions). The optional a priori probability density takes into account information about the population of discovered asteroids and is used to constrain the phase space of possible orbits. As the MCMC method, the Metropolis-Hastings algorithm has been applied, with a globally convergent coefficient added. The sequence of possible orbits is derived through sampling of each orbital parameter and an acceptance criterion. The method allows the determination of the phase space of every possible orbit, and it can also be used to derive the single orbit with the highest posterior probability density of the orbital elements.
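The Metropolis-Hastings sampling step described above can be sketched with the orbit reduced to two stand-in parameters and a synthetic Gaussian O-C residual model; the real likelihood would come from an orbit-propagation model, and all numbers here are invented.

```python
import numpy as np

# Random-walk Metropolis-Hastings on a toy 2-parameter "orbit" posterior.
# log_post stands in for the Gaussian noise model built from O-C
# (Observed minus Calculated) deviations; the true values are (1.0, 0.1).
def log_post(theta):
    a, e = theta
    resid = np.array([a - 1.0, e - 0.1])       # stand-in O-C deviations
    return -0.5 * np.sum((resid / 0.05) ** 2)  # Gaussian noise, sigma = 0.05

rng = np.random.default_rng(0)
theta = np.array([0.8, 0.3])                   # deliberately poor start
lp = log_post(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(scale=0.02, size=2)   # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:        # MH acceptance rule
        theta, lp = prop, lp_prop
    samples.append(theta)
samples = np.array(samples)[5000:]                   # discard burn-in
print(samples.mean(axis=0))
```

The retained samples map out the full posterior over the parameters, so one can report either the whole credible region or the single highest-posterior orbit, as the abstract describes.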
Bayesian statistical ionospheric tomography improved by incorporating ionosonde measurements
NASA Astrophysics Data System (ADS)
Norberg, Johannes; Virtanen, Ilkka I.; Roininen, Lassi; Vierinen, Juha; Orispää, Mikko; Kauristie, Kirsti; Lehtinen, Markku S.
2016-04-01
We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters and use the Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient tomographic inversion algorithm with clear probabilistic interpretation. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero-mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT ultra-high-frequency incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that in comparison to the alternative prior information sources, ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the altitude distribution of electron density. With an ionosonde at continuous disposal, the presented method enhances stand-alone near-real-time ionospheric tomography for the given conditions significantly.
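The linear-Gaussian Bayesian inversion underlying such tomography can be sketched with a toy forward operator: measurements y = A x + noise, a Gaussian prior N(x0, C), and a closed-form posterior. The matrices below are invented stand-ins for ray-path integrals of electron density; the real method's ionosonde-informed prior and sparse GMRF machinery are not reproduced.

```python
import numpy as np

# Linear-Gaussian Bayesian inversion: posterior mean and covariance of x
# given y = A x + noise, prior x ~ N(x0, C), noise covariance R.
def gaussian_inversion(A, y, x0, C, R):
    """Return the posterior mean and covariance of x given y."""
    S = A @ C @ A.T + R                      # innovation covariance
    K = C @ A.T @ np.linalg.inv(S)           # gain
    x_post = x0 + K @ (y - A @ x0)
    C_post = C - K @ A @ C
    return x_post, C_post

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])              # two toy ray-path integrals
x_true = np.array([1.0, 2.0, 3.0])
y = A @ x_true                                # noise-free for illustration
x0 = np.zeros(3)                              # zero-mean prior
C = np.eye(3)
R = 1e-6 * np.eye(2)
x_post, C_post = gaussian_inversion(A, y, x0, C, R)
print(np.round(x_post, 2))
```

With only two measurements for three unknowns the problem is underdetermined, and the reconstruction is pulled toward the prior mean; replacing the zero-mean prior with an ionosonde-informed one is exactly the improvement the paper reports.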
Statistical relationship discovery in SNP data using Bayesian networks
NASA Astrophysics Data System (ADS)
Szlendak, Pawel; Nowak, Robert M.
2009-06-01
The aim of this article is to present an application of Bayesian networks to the discovery of affinity relationships based on genetic data. The presented solution uses a search-and-score algorithm to discover the Bayesian network structure which best fits the data, i.e., the alleles of single nucleotide polymorphisms detected by DNA microarrays. The algorithm treats structure learning as a combinatorial optimization problem: it is a randomized local search algorithm which uses a Bayesian-Dirichlet scoring function. The testing procedure encompasses tests on synthetic data, generated from given Bayesian networks by forward sampling, as well as tests on real-world genetic data. The comparison of the Bayesian networks generated by the application with the genetic evidence data confirms the usability of the presented methods.
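A miniature of the search-and-score idea: three binary SNP-like variables, a handful of candidate structures, and a Bayesian-Dirichlet (BDeu-style) score, keeping the highest-scoring structure. The data are synthetic, and a real run would search the structure space with randomized local moves rather than enumerate candidates.

```python
import numpy as np
from math import lgamma
from itertools import product

# BDeu-style log score of one node given its parent set (binary data):
# a product of Beta-function ratios over parent configurations.
def family_score(child, parents, data, ess=1.0):
    q = 2 ** len(parents)
    a_ij, a_ijk = ess / q, ess / (2 * q)      # Dirichlet pseudo-counts
    score = 0.0
    for config in product([0, 1], repeat=len(parents)):
        mask = (np.all(data[:, parents] == config, axis=1)
                if parents else np.ones(len(data), dtype=bool))
        n1 = int(data[mask, child].sum())
        n0 = int(mask.sum()) - n1
        score += (lgamma(a_ij) - lgamma(a_ij + n0 + n1)
                  + lgamma(a_ijk + n0) - lgamma(a_ijk)
                  + lgamma(a_ijk + n1) - lgamma(a_ijk))
    return score

def network_score(structure, data):
    return sum(family_score(c, ps, data) for c, ps in structure.items())

rng = np.random.default_rng(2)
x0 = rng.integers(0, 2, 500)
x1 = np.where(rng.random(500) < 0.9, x0, 1 - x0)   # x1 mostly copies x0
x2 = rng.integers(0, 2, 500)                        # independent
data = np.column_stack([x0, x1, x2])

candidates = {
    "independent": {0: [], 1: [], 2: []},
    "X0 -> X1":    {0: [], 1: [0], 2: []},
    "X2 -> X1":    {0: [], 1: [2], 2: []},
}
best = max(candidates, key=lambda name: network_score(candidates[name], data))
print(best)
```

The score correctly rewards the structure that models the real dependence and penalizes the spurious edge, which is the behavior the scoring function must have for local search to converge on the right network.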
Ockham's razor and Bayesian analysis. [statistical theory for systems evaluation
NASA Technical Reports Server (NTRS)
Jefferys, William H.; Berger, James O.
1992-01-01
'Ockham's razor', the ad hoc principle enjoining the greatest possible simplicity in theoretical explanations, is presently shown to be justifiable as a consequence of Bayesian inference; Bayesian analysis can, moreover, clarify the nature of the 'simplest' hypothesis consistent with the given data. By choosing the prior probabilities of hypotheses, it becomes possible to quantify the scientific judgment that simpler hypotheses are more likely to be correct. Bayesian analysis also shows that a hypothesis with fewer adjustable parameters intrinsically possesses an enhanced posterior probability, due to the clarity of its predictions.
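A standard textbook instance of this automatic Ockham penalty (not taken from the article itself): comparing a point hypothesis H0 (fair coin, theta = 1/2) against H1 with theta free and uniform on [0, 1]. H1's extra adjustable parameter spreads its prior mass, so its marginal likelihood is diluted unless the data really demand it.

```python
from math import comb

# Bayes factor P(data|H0) / P(data|H1) for k heads in n flips.
# Under H1, integrating the binomial likelihood over a uniform prior on
# theta gives exactly 1/(n+1), independent of k.
def bayes_factor_01(k, n):
    ev0 = comb(n, k) * 0.5 ** n      # point hypothesis, no free parameter
    ev1 = 1.0 / (n + 1)              # marginal likelihood under free theta
    return ev0 / ev1

print(round(bayes_factor_01(10, 20), 3))   # near-fair data favor the simpler H0
print(round(bayes_factor_01(18, 20), 6))   # lopsided data overcome the penalty
```

No explicit simplicity prior is needed: the hypothesis with fewer adjustable parameters wins whenever its sharp prediction fits, exactly as the abstract argues.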
Impaired Bayesian Learning for Cognitive Control in Cocaine Dependence
Ide, Jaime S.; Hu, Sien; Zhang, Sheng; Yu, Angela J.; Li, Chiang-shan R.
2015-01-01
Background Cocaine dependence is associated with cognitive control deficits. Here, we apply a Bayesian model of stop-signal task (SST) performance to further characterize these deficits in a theory-driven framework. Methods A “sequential effect” is commonly observed in SST: encounters with a stop trial tend to prolong reaction time (RT) on subsequent go trials. The Bayesian model accounts for this by assuming that each stop/go trial increases/decreases the subject’s belief about the likelihood of encountering a subsequent stop trial, P(stop), and that P(stop) strategically modulates RT accordingly. Parameters of the model were individually fit, and compared between cocaine-dependent (CD, n=51) and healthy control (HC, n=57) groups, matched in age and gender and both demonstrating a significant sequential effect (p<0.05). Model-free measures of sequential effect, post-error slowing (PES) and post-stop slowing (PSS), were also compared across groups. Results By comparing individually fit Bayesian model parameters, CD were found to utilize a smaller time window of past experiences to anticipate P(stop) (p<0.003), as well as showing less behavioral adjustment in response to P(stop) (p<0.015). PES (p=0.19) and PSS (p=0.14) did not show group differences and were less correlated with the Bayesian account of sequential effect in CD than in HC. Conclusions Cocaine dependence is associated with the utilization of less contextual information to anticipate future events and decreased behavioral adaptation in response to changes in such anticipation. These findings constitute a novel contribution by providing a computationally more refined and statistically more sensitive account of altered cognitive control in cocaine addiction. PMID:25869543
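The paper's sequential-effect mechanism can be caricatured with an exponential-forgetting Beta update of the belief P(stop). This is a deliberately simplified stand-in (the actual model is a dynamic belief model), but it reproduces the two ingredients compared across groups: a limited memory window (the decay factor) and a belief that rises after stop trials.

```python
# Leaky-integrator Beta belief about P(stop): pseudo-counts decay by
# `alpha` each trial, so only a limited window of past trials matters
# (the window cocaine-dependent subjects were found to shorten).
# Hypothetical simplification, not the paper's exact model.
def update_pstop(a, b, is_stop, alpha=0.9):
    """Decay the Beta(a, b) pseudo-counts, then add the new trial."""
    a, b = alpha * a, alpha * b
    return (a + 1, b) if is_stop else (a, b + 1)

a, b = 1.0, 1.0                      # uniform initial belief
trials = [0, 0, 1, 0, 1, 1, 0, 0]   # 1 = stop trial, 0 = go trial
for t in trials:
    a, b = update_pstop(a, b, t)
    print(f"P(stop) = {a / (a + b):.3f}")
```

A smaller alpha shortens the effective window of past experience, and scaling go-trial reaction time with P(stop) yields the sequential effect whose parameters distinguish the CD and HC groups.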
Bayesian reclassification statistics for assessing improvements in diagnostic accuracy.
Huang, Zhipeng; Li, Jialiang; Cheng, Ching-Yu; Cheung, Carol; Wong, Tien-Yin
2016-07-10
We propose a Bayesian approach to the estimation of the net reclassification improvement (NRI) and three versions of the integrated discrimination improvement (IDI) under the logistic regression model. Both NRI and IDI were proposed as numerical characterizations of accuracy improvement for diagnostic tests and were shown to retain certain practical advantages over analysis based on ROC curves, offering information complementary to changes in the area under the curve. Our development is a new contribution towards a Bayesian solution for the estimation of NRI and IDI, which eases the computational burden and increases flexibility. Our simulation results indicate that Bayesian estimation enjoys satisfactory performance comparable with frequentist estimation and achieves point estimation and credible interval construction simultaneously. We apply the methodology to a real dataset from the Singapore Malay Eye Study. Copyright © 2016 John Wiley & Sons, Ltd. PMID:26875442
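For reference, the category-free NRI point estimate that the Bayesian machinery above targets is simple to state: upward risk moves count positively among events, downward moves count positively among non-events. The data below are invented for illustration.

```python
import numpy as np

# Category-free net reclassification improvement:
# NRI = (P(up|D=1) - P(down|D=1)) + (P(down|D=0) - P(up|D=0)).
def nri(risk_old, risk_new, event):
    """NRI for continuous risk scores from two models."""
    up = risk_new > risk_old
    down = risk_new < risk_old
    d = event.astype(bool)
    return ((up[d].mean() - down[d].mean())
            + (down[~d].mean() - up[~d].mean()))

risk_old = np.array([0.1, 0.2, 0.4, 0.3, 0.6, 0.5])   # invented risks
risk_new = np.array([0.2, 0.1, 0.6, 0.2, 0.7, 0.3])
event    = np.array([1, 0, 1, 0, 1, 0])
print(nri(risk_old, risk_new, event))
```

Here every event's risk moves up and every non-event's risk moves down, giving the maximum NRI of 2; the Bayesian treatment adds a posterior (and hence credible intervals) around this quantity.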
Ice Shelf Modeling: A Cross-Polar Bayesian Statistical Approach
NASA Astrophysics Data System (ADS)
Kirchner, N.; Furrer, R.; Jakobsson, M.; Zwally, H. J.
2010-12-01
Ice streams interlink glacial terrestrial and marine environments: embedded in a grounded inland ice such as the Antarctic Ice Sheet or the paleo ice sheets covering extensive parts of the Eurasian and Amerasian Arctic respectively, ice streams are major drainage agents facilitating the discharge of substantial portions of continental ice into the ocean. At their seaward side, ice streams can either extend onto the ocean as floating ice tongues (such as the Drygalski Ice Tongue/East Antarctica), or feed large ice shelves (as is the case for e.g. the Siple Coast and the Ross Ice Shelf/West Antarctica). The flow behavior of ice streams has been recognized to be intimately linked with configurational changes in their attached ice shelves; in particular, ice shelf disintegration is associated with rapid ice stream retreat and increased mass discharge from the continental ice mass, contributing eventually to sea level rise. Investigations of ice stream retreat mechanisms are however incomplete if based on terrestrial records only: rather, the dynamics of ice shelves (and, eventually, the impact of the ocean on the latter) must be accounted for. However, since floating ice shelves leave hardly any traces behind when melting, uncertainty regarding the spatio-temporal distribution and evolution of ice shelves in times prior to instrumented and recorded observation is high, calling thus for a statistical modeling approach. Complementing ongoing large-scale numerical modeling efforts (Pollard & DeConto, 2009), we model the configuration of ice shelves by using a Bayesian Hierarchical Modeling (BHM) approach. We adopt a cross-polar perspective accounting for the fact that currently, ice shelves exist mainly along the coastline of Antarctica (and are virtually non-existent in the Arctic), while Arctic Ocean ice shelves repeatedly impacted the Arctic ocean basin during former glacial periods. Modeled Arctic ocean ice shelf configurations are compared with geological spatial
Postscript: Bayesian Statistical Inference in Psychology: Comment on Trafimow (2003)
ERIC Educational Resources Information Center
Lee, Michael D.; Wagenmakers, Eric-Jan
2005-01-01
This paper comments on the response offered by Trafimow on Lee and Wagenmakers comments on Trafimow's original article. It seems our comment should have made it clear that the objective Bayesian approach we advocate views probabilities neither as relative frequencies nor as belief states, but as degrees of plausibility assigned to propositions in…
Yu, Jihnhee; Hutson, Alan D; Siddiqui, Adnan H; Kedron, Mary A
2016-02-01
In some small clinical trials, toxicity is not a primary endpoint; however, it often has dire effects on patients' quality of life and is even life-threatening. For such clinical trials, rigorous control of the overall incidence of adverse events is desirable, while simultaneously collecting safety information. In this article, we propose group sequential toxicity monitoring strategies to control overall toxicity incidents below a certain level as opposed to performing hypothesis testing, which can be incorporated into an existing study design based on the primary endpoint. We consider two sequential methods: a non-Bayesian approach in which stopping rules are obtained based on the 'future' probability of an excessive toxicity rate; and a Bayesian adaptation modifying the proposed non-Bayesian approach, which can use the information obtained at interim analyses. Through an extensive Monte Carlo study, we show that the Bayesian approach often provides better control of the overall toxicity rate than the non-Bayesian approach. We also investigate adequate toxicity estimation after the studies. We demonstrate the applicability of our proposed methods in controlling the symptomatic intracranial hemorrhage rate for treating acute ischemic stroke patients. PMID:22407172
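The Bayesian monitoring idea can be sketched with a conjugate beta-binomial rule: place a Beta prior on the toxicity rate and stop enrolment when the posterior probability that the rate exceeds the acceptable level crosses a threshold. The prior, acceptable rate, and stopping threshold below are illustrative assumptions, not the paper's calibrated design.

```python
import random

# Monte Carlo tail probability P(theta > p0) for theta ~ Beta(a, b),
# using only the standard library.
def prob_exceeds(a, b, p0, n_draws=200000, seed=7):
    rng = random.Random(seed)
    return sum(rng.betavariate(a, b) > p0 for _ in range(n_draws)) / n_draws

# Stop if the posterior Beta(a + tox, b + non-tox) puts more than `gamma`
# probability above the acceptable toxicity rate p0.
def stop_for_toxicity(n_tox, n_total, p0=0.15, gamma=0.8, a=1.0, b=1.0):
    return prob_exceeds(a + n_tox, b + n_total - n_tox, p0) > gamma

print(stop_for_toxicity(1, 10))   # 1/10 toxicities: continue enrolment
print(stop_for_toxicity(4, 10))   # 4/10 toxicities: stop
```

Evaluating this rule at each interim look gives a group sequential boundary; the paper's contribution is to use the accumulating interim information (and to compare this against a non-Bayesian rule based on the future probability of an excessive rate).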
TOWARDS A BAYESIAN PERSPECTIVE ON STATISTICAL DISCLOSURE LIMITATION
National statistical offices and other organizations collect data on individual subjects (persons, businesses, organizations), typically while assuring the subject that data pertaining to them will be held confidential. These data provide the raw material for statistical data pro...
Bayesian statistics for the calibration of the LISA Pathfinder experiment
NASA Astrophysics Data System (ADS)
Armano, M.; Audley, H.; Auger, G.; Binetruy, P.; Born, M.; Bortoluzzi, D.; Brandt, N.; Bursi, A.; Caleno, M.; Cavalleri, A.; Cesarini, A.; Cruise, M.; Danzmann, K.; Diepholz, I.; Dolesi, R.; Dunbar, N.; Ferraioli, L.; Ferroni, V.; Fitzsimons, E.; Freschi, M.; García Marirrodriga, C.; Gerndt, R.; Gesa, L.; Gibert, F.; Giardini, D.; Giusteri, R.; Grimani, C.; Harrison, I.; Heinzel, G.; Hewitson, M.; Hollington, D.; Hueller, M.; Huesler, J.; Inchauspé, H.; Jennrich, O.; Jetzer, P.; Johlander, B.; Karnesis, N.; Kaune, B.; Korsakova, N.; Killow, C.; Lloro, I.; Maarschalkerweerd, R.; Madden, S.; Mance, D.; Martin, V.; Martin-Porqueras, F.; Mateos, I.; McNamara, P.; Mendes, J.; Mitchell, E.; Moroni, A.; Nofrarias, M.; Paczkowski, S.; Perreur-Lloyd, M.; Pivato, P.; Plagnol, E.; Prat, P.; Ragnit, U.; Ramos-Castro, J.; Reiche, J.; Romera Perez, J. A.; Robertson, D.; Rozemeijer, H.; Russano, G.; Sarra, P.; Schleicher, A.; Slutsky, J.; Sopuerta, C. F.; Sumner, T.; Texier, D.; Thorpe, J.; Trenkel, C.; Tu, H. B.; Vitale, S.; Wanner, G.; Ward, H.; Waschke, S.; Wass, P.; Wealthy, D.; Wen, S.; Weber, W.; Wittchen, A.; Zanoni, C.; Ziegler, T.; Zweifel, P.
2015-05-01
The main goal of the LISA Pathfinder (LPF) mission is to estimate the acceleration noise models of the overall LISA Technology Package (LTP) experiment on-board. This will be of crucial importance for future space-based gravitational-wave (GW) detectors, such as eLISA. Here, we present the Bayesian analysis framework used to process the planned system identification experiments designed for that purpose. In particular, we focus on the analysis strategies to predict the accuracy of the parameters that describe the system in all degrees of freedom. The data sets were generated during the latest operational simulations organised by the data analysis team, and this work is part of the LTPDA Matlab toolbox.
Preferential sampling and Bayesian geostatistics: Statistical modeling and examples.
Cecconi, Lorenzo; Grisotto, Laura; Catelan, Dolores; Lagazio, Corrado; Berrocal, Veronica; Biggeri, Annibale
2016-08-01
Preferential sampling refers to any situation in which the spatial process and the sampling locations are not stochastically independent. In this paper, we present two examples of geostatistical analysis in which the usual assumption of stochastic independence between the point process and the measurement process is violated. To account for preferential sampling, we specify a flexible and general Bayesian geostatistical model that includes a shared spatial random component. We apply the proposed model to two different case studies that allow us to highlight three different modeling and inferential aspects of geostatistical modeling under preferential sampling: (1) continuous or finite spatial sampling frame; (2) underlying causal model and relevant covariates; and (3) inferential goals related to mean prediction surface or prediction uncertainty. PMID:27566774
Statistical Detection of EEG Synchrony Using Empirical Bayesian Inference
Singh, Archana K.; Asoh, Hideki; Takeda, Yuji; Phillips, Steven
2015-01-01
There is growing interest in understanding how the brain utilizes synchronized oscillatory activity to integrate information across functionally connected regions. Computing phase-locking values (PLV) between EEG signals is a popular method for quantifying such synchronizations and elucidating their role in cognitive tasks. However, high dimensionality in PLV data incurs a serious multiple testing problem. Standard multiple testing methods in neuroimaging research (e.g., false discovery rate, FDR) suffer severe loss of power, because they fail to exploit the complex dependence structure between hypotheses that vary in spectral, temporal and spatial dimensions. Previously, we showed that hierarchical FDR and optimal discovery procedures could be effectively applied for PLV analysis to provide better power than FDR. In this article, we revisit the multiple comparison problem from a new Empirical Bayes perspective and propose the application of the local FDR method (locFDR; Efron, 2001) for PLV synchrony analysis to compute FDR as a posterior probability that an observed statistic belongs to the null hypothesis. We demonstrate the application of Efron's Empirical Bayes approach for PLV synchrony analysis for the first time. We use simulations to validate the specificity and sensitivity of locFDR and a real EEG dataset from a visual search study for experimental validation. We also compare locFDR with hierarchical FDR and optimal discovery procedures in both simulation and experimental analyses. Our simulation results showed that locFDR can effectively control false positives without compromising the power of PLV synchrony inference. Our results from applying locFDR to the experimental data detected more significant discoveries than our previously proposed methods, whereas the standard FDR method failed to detect any significant discoveries. PMID:25822617
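The two-groups model behind locFDR can be sketched directly: observed statistics follow a mixture f(z) = p0·f0(z) + p1·f1(z), and locFDR(z) = p0·f0(z)/f(z) is the posterior probability that z comes from the null. A sketch under simplifying assumptions: a theoretical N(0,1) null, a fixed `p0`, and a kernel estimate of the mixture; Efron's method estimates the null and p0 empirically from the data.

```python
import math

def local_fdr(z_scores, p0=0.9, bandwidth=0.3):
    """Two-groups local FDR: locfdr(z) = p0 * f0(z) / f(z), with f0 the
    theoretical N(0,1) null density and f a Gaussian kernel estimate of
    the observed mixture density."""
    n = len(z_scores)
    norm = 1.0 / (n * bandwidth * math.sqrt(2.0 * math.pi))

    def f_hat(z):  # kernel density estimate of the mixture f
        return norm * sum(math.exp(-0.5 * ((z - zi) / bandwidth) ** 2)
                          for zi in z_scores)

    def f0(z):     # theoretical null density
        return math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)

    return [min(1.0, p0 * f0(z) / f_hat(z)) for z in z_scores]
```

Statistics in the bulk near zero get locFDR near 1 (likely null), while outlying statistics get locFDR near 0 and are declared discoveries.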
Cost-sensitive Bayesian control policy in human active sensing
Ahmad, Sheeraz; Huang, He; Yu, Angela J.
2014-01-01
An important but poorly understood aspect of sensory processing is the role of active sensing, the use of self-motion such as eye or head movements to focus sensing resources on the most rewarding or informative aspects of the sensory environment. Here, we present behavioral data from a visual search experiment, as well as a Bayesian model of within-trial dynamics of sensory processing and eye movements. Within this Bayes-optimal inference and control framework, which we call C-DAC (Context-Dependent Active Controller), various types of behavioral costs, such as temporal delay, response error, and sensor repositioning cost, are explicitly minimized. This contrasts with previously proposed algorithms that optimize abstract statistical objectives such as anticipated information gain (Infomax) (Butko and Movellan, 2010) and expected posterior maximum (greedy MAP) (Najemnik and Geisler, 2005). We find that C-DAC captures human visual search dynamics better than previous models, in particular a certain form of “confirmation bias” apparent in the way human subjects utilize prior knowledge about the spatial distribution of the search target to improve search speed and accuracy. We also examine several computationally efficient approximations to C-DAC that may present biologically more plausible accounts of the neural computations underlying active sensing, as well as practical tools for solving active sensing problems in engineering applications. To summarize, this paper makes the following key contributions: human visual search behavioral data, a context-sensitive Bayesian active sensing model, a comparative study between different models of human active sensing, and a family of efficient approximations to the optimal model. PMID:25520640
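The inference half of a model like C-DAC reduces to Bayes' rule over candidate target locations within a trial; the control half (cost-sensitive choice of where to fixate next) is omitted here. A sketch with an assumed binary observation model whose reliability `p_correct` is purely illustrative.

```python
def update_belief(belief, fixated, obs, p_correct=0.9):
    """One within-trial Bayesian belief update: fixating location `fixated`
    yields a noisy binary observation about whether the target is there,
    and Bayes' rule reweights the belief over all locations."""
    post = []
    for loc, prior in enumerate(belief):
        if loc == fixated:
            like = p_correct if obs == 1 else 1.0 - p_correct
        else:
            like = 1.0 - p_correct if obs == 1 else p_correct
        post.append(prior * like)
    z = sum(post)  # normalizing constant
    return [p / z for p in post]
```

A controller such as C-DAC would then weigh this belief against behavioral costs (time, errors, saccades) to decide whether to keep fixating, switch, or respond.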
A BAYESIAN STATISTICAL APPROACH FOR THE EVALUATION OF CMAQ
This research focuses on the application of spatial statistical techniques for the evaluation of the Community Multiscale Air Quality (CMAQ) model. The upcoming release version of the CMAQ model was run for the calendar year 2001 and is in the process of being evaluated by EPA an...
Applications of Bayesian Statistics to Problems in Gamma-Ray Bursts
NASA Technical Reports Server (NTRS)
Meegan, Charles A.
1997-01-01
This presentation will describe two applications of Bayesian statistics to Gamma Ray Bursts (GRBs). The first attempts to quantify the evidence for a cosmological versus galactic origin of GRBs using only the observations of the dipole and quadrupole moments of the angular distribution of bursts. The cosmological hypothesis predicts isotropy, while the galactic hypothesis is assumed to produce a uniform probability distribution over positive values for these moments. The observed isotropic distribution indicates that the Bayes factor for the cosmological hypothesis over the galactic hypothesis is about 300. Another application of Bayesian statistics is in the estimation of chance associations of optical counterparts with galaxies. The Bayesian approach is preferred to frequentist techniques here because the Bayesian approach easily accounts for galaxy mass distributions and because one can incorporate three disjoint hypotheses: (1) bursts come from galactic centers, (2) bursts come from galaxies in proportion to luminosity, and (3) bursts do not come from external galaxies. This technique was used in the analysis of the optical counterpart to GRB 970228.
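The first application reduces to a Bayes factor between a sharp hypothesis (isotropy predicts a zero moment, observed with Gaussian error) and a diffuse one (a uniform prior over positive moment values). The moment scale, measurement error, and prior range below are illustrative assumptions, not the paper's actual values.

```python
import math

def gaussian(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def bayes_factor_isotropy(m_obs, sigma, m_max, grid=10_000):
    """BF = p(data | isotropy) / p(data | galactic): the isotropic hypothesis
    predicts a zero moment, while the galactic hypothesis spreads the true
    moment uniformly over [0, m_max]; its marginal likelihood is the prior-
    averaged Gaussian likelihood, computed by midpoint integration."""
    like_iso = gaussian(m_obs, 0.0, sigma)
    step = m_max / grid
    like_gal = sum(gaussian(m_obs, (i + 0.5) * step, sigma)
                   for i in range(grid)) * step / m_max
    return like_iso / like_gal
```

An observed moment near zero with small measurement error favors the sharp (isotropy) hypothesis, mirroring the Bayes factor of about 300 reported in the abstract.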
Bayesian approach for counting experiment statistics applied to a neutrino point source analysis
NASA Astrophysics Data System (ADS)
Bose, D.; Brayeur, L.; Casier, M.; de Vries, K. D.; Golup, G.; van Eijndhoven, N.
2013-12-01
In this paper we present a model independent analysis method following Bayesian statistics to analyse data from a generic counting experiment and apply it to the search for neutrinos from point sources. We discuss a test statistic defined following a Bayesian framework that will be used in the search for a signal. In case no signal is found, we derive an upper limit without the introduction of approximations. The Bayesian approach allows us to obtain the full probability density function for both the background and the signal rate. As such, we have direct access to any signal upper limit. The upper limit derivation directly compares with a frequentist approach and is robust in the case of low-counting observations. Furthermore, it also allows one to account for previous upper limits obtained by other analyses via the concept of prior information, without the need for the ad hoc application of trial factors. To investigate the validity of the presented Bayesian approach, we have applied this method to the public IceCube 40-string configuration data for 10 nearby blazars and we have obtained a flux upper limit, which is in agreement with the upper limits determined via a frequentist approach. Furthermore, the upper limit obtained compares well with the previously published result of IceCube, using the same data set.
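For a generic counting experiment, the posterior for the signal rate with known background can be integrated directly, yielding an upper limit without approximations, as the abstract describes. A sketch assuming a flat prior on the signal rate and a perfectly known background; `s_max` and the grid size are numerical choices, not part of the method.

```python
import math

def signal_upper_limit(n_obs, background, cl=0.90, s_max=50.0, grid=200_000):
    """Bayesian upper limit on a Poisson signal rate s with known background b:
    posterior p(s | n) ∝ exp(-(s + b)) * (s + b)^n for s >= 0 (flat prior),
    integrated numerically until the credible level cl is reached."""
    step = s_max / grid
    log_post = [n_obs * math.log((i + 0.5) * step + background)
                - ((i + 0.5) * step + background) for i in range(grid)]
    m = max(log_post)                      # stabilize the exponentials
    weights = [math.exp(lp - m) for lp in log_post]
    total = sum(weights)
    acc = 0.0
    for i, w in enumerate(weights):
        acc += w
        if acc / total >= cl:
            return (i + 1) * step
    return s_max
```

With zero observed counts and zero background this reproduces the textbook 90% limit of about 2.3 events; a larger expected background tightens the limit for the same observed count.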
Fine Mapping Causal Variants with an Approximate Bayesian Method Using Marginal Test Statistics.
Chen, Wenan; Larrabee, Beth R; Ovsyannikova, Inna G; Kennedy, Richard B; Haralambieva, Iana H; Poland, Gregory A; Schaid, Daniel J
2015-07-01
Two recently developed fine-mapping methods, CAVIAR and PAINTOR, demonstrate better performance over other fine-mapping methods. They also have the advantage of using only the marginal test statistics and the correlation among SNPs. Both methods leverage the fact that the marginal test statistics asymptotically follow a multivariate normal distribution and are likelihood based. However, their relationship with Bayesian fine mapping, such as BIMBAM, is not clear. In this study, we first show that CAVIAR and BIMBAM are actually approximately equivalent to each other. This leads to a fine-mapping method using marginal test statistics in the Bayesian framework, which we call CAVIAR Bayes factor (CAVIARBF). Another advantage of the Bayesian framework is that it can answer both association and fine-mapping questions. We also used simulations to compare CAVIARBF with other methods under different numbers of causal variants. The results showed that both CAVIARBF and BIMBAM have better performance than PAINTOR and other methods. Compared to BIMBAM, CAVIARBF has the advantage of using only marginal test statistics and takes about one-quarter to one-fifth of the running time. We applied different methods on two independent cohorts of the same phenotype. Results showed that CAVIARBF, BIMBAM, and PAINTOR selected the same top 3 SNPs; however, CAVIARBF and BIMBAM had better consistency in selecting the top 10 ranked SNPs between the two cohorts. Software is available at https://bitbucket.org/Wenan/caviarbf. PMID:25948564
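The connection between marginal statistics and Bayes factors is easiest to see in the single-SNP case via a Wakefield-style approximate Bayes factor; CAVIARBF generalizes this to multiple causal variants using the SNP correlation matrix. The prior effect variance `w` and the one-causal-variant assumption in `posterior_probs` are illustrative simplifications, not the paper's full model.

```python
import math

def wakefield_abf(z, v, w=0.04):
    """Wakefield-style approximate Bayes factor (alternative over null) for
    one SNP: BF = sqrt(v / (v + w)) * exp(z^2 * w / (2 * (v + w))), using
    only the marginal z statistic, its variance v, and prior variance w."""
    r = w / (v + w)
    return math.sqrt(1.0 - r) * math.exp(0.5 * z * z * r)

def posterior_probs(zs, vs, w=0.04):
    """Normalize per-SNP Bayes factors into posterior inclusion probabilities,
    assuming exactly one causal variant and a uniform prior over SNPs."""
    bfs = [wakefield_abf(z, v, w) for z, v in zip(zs, vs)]
    total = sum(bfs)
    return [bf / total for bf in bfs]
```

Ranking SNPs by these posterior probabilities is the fine-mapping step; the multivariate versions replace the per-SNP likelihood with a multivariate normal over the z vector and the LD correlation matrix.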
Bayesian Statistical Analysis Applied to NAA Data for Neutron Flux Spectrum Determination
NASA Astrophysics Data System (ADS)
Chiesa, D.; Previtali, E.; Sisti, M.
2014-04-01
In this paper, we present a statistical method, based on Bayesian statistics, to evaluate the neutron flux spectrum from the activation data of different isotopes. The experimental data were acquired during a neutron activation analysis (NAA) experiment [A. Borio di Tigliole et al., Absolute flux measurement by NAA at the Pavia University TRIGA Mark II reactor facilities, ENC 2012 - Transactions Research Reactors, ISBN 978-92-95064-14-0, 22 (2012)] performed at the TRIGA Mark II reactor of Pavia University (Italy). In order to evaluate the neutron flux spectrum, subdivided into energy groups, we must solve a system of linear equations containing the grouped cross sections and the activation rate data. We solve this problem with Bayesian statistical analysis, including the uncertainties of the coefficients and the a priori information about the neutron flux. A program for the analysis of Bayesian hierarchical models, based on Markov Chain Monte Carlo (MCMC) simulations, is used to define the statistical model of the problem and solve it. The energy group fluxes and their uncertainties are then determined with great accuracy and the correlations between the groups are analyzed. Finally, the dependence of the results on the prior distribution choice and on the group cross section data is investigated to confirm the reliability of the analysis.
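When both the noise on the activation rates and the prior on the group fluxes are taken as Gaussian, the linear system has a closed-form posterior, which makes a useful cross-check on an MCMC analysis of the same model. A conjugate sketch with assumed independent Gaussian errors; the paper's hierarchical MCMC treatment also propagates uncertainty in the cross-section coefficients, which this sketch does not.

```python
def solve_linear(M, v):
    """Solve M x = v by Gaussian elimination with partial pivoting."""
    n = len(v)
    A = [row[:] + [v[i]] for i, row in enumerate(M)]  # augmented matrix
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        for r in range(col + 1, n):
            f = A[r][col] / A[col][col]
            for c in range(col, n + 1):
                A[r][c] -= f * A[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (A[r][n] - sum(A[r][c] * x[c] for c in range(r + 1, n))) / A[r][r]
    return x

def flux_posterior_mean(A, rates, rate_sigma, prior_mean, prior_sigma):
    """Conjugate linear-Gaussian analogue of the MCMC analysis: activation
    rates R = A phi + Gaussian noise, independent Gaussian priors on the
    group fluxes phi.  The posterior mean solves
    (A^T W A + P) phi = A^T W R + P mu0, with W, P diagonal precisions."""
    m, n = len(A), len(A[0])
    w = [1.0 / s ** 2 for s in rate_sigma]    # data precisions
    p = [1.0 / s ** 2 for s in prior_sigma]   # prior precisions
    M = [[sum(A[k][i] * w[k] * A[k][j] for k in range(m))
          + (p[i] if i == j else 0.0) for j in range(n)] for i in range(n)]
    rhs = [sum(A[k][i] * w[k] * rates[k] for k in range(m))
           + p[i] * prior_mean[i] for i in range(n)]
    return solve_linear(M, rhs)
```

With tight measurements and a weak prior the posterior mean reproduces the directly measured rates; as the prior tightens, the estimate is pulled toward the a priori flux.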
Bayesian Bigot? Statistical Discrimination, Stereotypes, and Employer Decision Making
Pager, Devah; Karafin, Diana
2010-01-01
Much of the debate over the underlying causes of discrimination centers on the rationality of employer decision making. Economic models of statistical discrimination emphasize the cognitive utility of group estimates as a means of dealing with the problems of uncertainty. Sociological and social-psychological models, by contrast, question the accuracy of group-level attributions. Although mean differences may exist between groups on productivity-related characteristics, these differences are often inflated in their application, leading to much larger differences in individual evaluations than would be warranted by actual group-level trait distributions. In this study, the authors examine the nature of employer attitudes about black and white workers and the extent to which these views are calibrated against their direct experiences with workers from each group. They use data from fifty-five in-depth interviews with hiring managers to explore employers’ group-level attributions and their direct observations to develop a model of attitude formation and employer learning. PMID:20686633
Bayesian adjustment for exposure misclassification in case-control studies.
Chu, Rong; Gustafson, Paul; Le, Nhu
2010-04-30
Poor measurement of explanatory variables occurs frequently in observational studies. Error-prone observations may lead to biased estimation and loss of power in detecting the impact of explanatory variables on the response. We consider misclassified binary exposure in the context of case-control studies, assuming the availability of validation data to inform the magnitude of the misclassification. A Bayesian adjustment to correct the misclassification is investigated. Simulation studies show that the Bayesian method can have advantages over non-Bayesian counterparts, particularly in the face of a rare exposure, small validation sample sizes, and uncertainty about whether exposure misclassification is differential or non-differential. The method is illustrated via application to several real studies. PMID:20087839
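The adjustment can be sketched by giving sensitivity and specificity Beta posteriors from the validation data and propagating them through the standard correction p_obs = se·p + (1 − sp)(1 − p). A Monte Carlo sketch for non-differential misclassification with illustrative uniform priors; a full case-control analysis would model the case and control groups jointly.

```python
import random

def corrected_prevalence(n_pos, n, val_tp, val_fn, val_tn, val_fp,
                         draws=20_000, seed=1):
    """Monte Carlo sketch of a Bayesian misclassification adjustment:
    sensitivity and specificity get Beta posteriors from validation counts,
    and each draw inverts p_obs = se*p + (1-sp)*(1-p) for the true
    prevalence p, yielding a posterior sample for p."""
    rng = random.Random(seed)
    samples = []
    for _ in range(draws):
        se = rng.betavariate(1 + val_tp, 1 + val_fn)
        sp = rng.betavariate(1 + val_tn, 1 + val_fp)
        p_obs = rng.betavariate(1 + n_pos, 1 + n - n_pos)
        denom = se + sp - 1.0
        if denom <= 0.05:          # skip draws where the test is uninformative
            continue
        p_true = (p_obs + sp - 1.0) / denom
        samples.append(min(1.0, max(0.0, p_true)))
    samples.sort()
    k = len(samples)
    return samples[k // 2], (samples[int(0.025 * k)], samples[int(0.975 * k)])
```

The interval width then honestly reflects how small the validation sample is, which is exactly where the abstract reports Bayesian methods having an advantage.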
Control Theory and Statistical Generalizations.
ERIC Educational Resources Information Center
Powers, William T.
1990-01-01
Contrasts modeling methods in control theory to the methods of statistical generalizations in empirical studies of human or animal behavior. Presents a computer simulation that predicts behavior based on variables (effort and rewards) determined by the invariable (desired reward). Argues that control theory methods better reflect relationships to…
[Statistical process control in healthcare].
Anhøj, Jacob; Bjørn, Brian
2009-05-18
Statistical process control (SPC) is a branch of statistical science which comprises methods for the study of process variation. Common cause variation is inherent in any process and predictable within limits. Special cause variation is unpredictable and indicates change in the process. The run chart is a simple tool for analysis of process variation. Run chart analysis may reveal anomalies that suggest shifts or unusual patterns that are attributable to special cause variation. PMID:19454196
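The run chart analysis described can be sketched with two standard run rules: an unusually long run of points on one side of the median, and unusually few median crossings. The `log2`-based run limit and the binomial crossing bound are common rules of thumb, used here as assumptions rather than the authors' exact criteria.

```python
import math

def run_chart_signals(values, alpha=0.05):
    """Run chart analysis: flag special cause variation when the longest run
    on one side of the median is unusually long, or when the series crosses
    the median unusually seldom (lower binomial bound)."""
    median = sorted(values)[len(values) // 2]        # simple sample median
    sides = [v > median for v in values if v != median]  # drop points on median
    n = len(sides)
    longest = current = 1
    crossings = 0
    for prev, cur in zip(sides, sides[1:]):
        if cur == prev:
            current += 1
            longest = max(longest, current)
        else:
            crossings += 1
            current = 1
    run_limit = round(math.log2(n)) + 3   # runs longer than this are unusual
    # fewest crossings consistent with randomness: lower alpha-quantile
    # of Binomial(n - 1, 1/2)
    cdf, cross_limit = 0.0, 0
    for k in range(n):
        cdf += math.comb(n - 1, k) * 0.5 ** (n - 1)
        if cdf >= alpha:
            cross_limit = k
            break
    signal = longest > run_limit or crossings < cross_limit
    return {"longest_run": longest, "crossings": crossings, "signal": signal}
```

Common cause variation produces frequent crossings and short runs; a sustained process shift produces one long run and few crossings, which trips the signal.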
NASA Astrophysics Data System (ADS)
Albert, Carlo; Ulzega, Simone; Stoop, Ruedi
2016-04-01
Measured time-series of both precipitation and runoff are known to exhibit highly non-trivial statistical properties. For making reliable probabilistic predictions in hydrology, it is therefore desirable to have stochastic models with output distributions that share these properties. When parameters of such models have to be inferred from data, we also need to quantify the associated parametric uncertainty. For non-trivial stochastic models, however, this latter step is typically very demanding, both conceptually and numerically, and almost never done in hydrology. Here, we demonstrate that methods developed in statistical physics make a large class of stochastic differential equation (SDE) models amenable to a full-fledged Bayesian parameter inference. For concreteness we demonstrate these methods by means of a simple yet non-trivial toy SDE model. We consider a natural catchment that can be described by a linear reservoir, at the scale of observation. All the neglected processes are assumed to happen at much shorter time-scales and are therefore modeled with a Gaussian white noise term, the standard deviation of which is assumed to scale linearly with the system state (water volume in the catchment). Even for constant input, the outputs of this simple non-linear SDE model show a wealth of desirable statistical properties, such as fat-tailed distributions and long-range correlations. Standard algorithms for Bayesian inference fail, for models of this kind, because their likelihood functions are extremely high-dimensional intractable integrals over all possible model realizations. The use of Kalman filters is illegitimate due to the non-linearity of the model. Particle filters could be used but become increasingly inefficient with a growing number of data points. Hamiltonian Monte Carlo algorithms allow us to translate this inference problem to the problem of simulating the dynamics of a statistical mechanics system and give us access to the most sophisticated methods
NASA Astrophysics Data System (ADS)
Rubin, D.; Aldering, G.; Barbary, K.; Boone, K.; Chappell, G.; Currie, M.; Deustua, S.; Fagrelius, P.; Fruchter, A.; Hayden, B.; Lidman, C.; Nordin, J.; Perlmutter, S.; Saunders, C.; Sofiatti, C.; Supernova Cosmology Project, The
2015-11-01
While recent supernova (SN) cosmology research has benefited from improved measurements, current analysis approaches are not statistically optimal and will prove insufficient for future surveys. This paper discusses the limitations of current SN cosmological analyses in treating outliers, selection effects, shape- and color-standardization relations, unexplained dispersion, and heterogeneous observations. We present a new Bayesian framework, called UNITY (Unified Nonlinear Inference for Type-Ia cosmologY), that incorporates significant improvements in our ability to confront these effects. We apply the framework to real SN observations and demonstrate smaller statistical and systematic uncertainties. We verify earlier results that SNe Ia require nonlinear shape and color standardizations, but we now include these nonlinear relations in a statistically well-justified way. This analysis was primarily performed blinded, in that the basic framework was first validated on simulated data before transitioning to real data. We also discuss possible extensions of the method.
A Bayesian Formulation of Behavioral Control
ERIC Educational Resources Information Center
Huys, Quentin J. M.; Dayan, Peter
2009-01-01
Helplessness, a belief that the world is not subject to behavioral control, has long been central to our understanding of depression, and has influenced cognitive theories, animal models and behavioral treatments. However, despite its importance, there is no fully accepted definition of helplessness or behavioral control in psychology or…
Predictive data-derived Bayesian statistic-transport model and simulator of sunken oil mass
NASA Astrophysics Data System (ADS)
Echavarria Gregory, Maria Angelica
Sunken oil is difficult to locate because remote sensing techniques cannot as yet provide views of sunken oil over large areas. Moreover, the oil may re-suspend and sink with changes in salinity, sediment load, and temperature, making deterministic fate models difficult to deploy and calibrate when even the presence of sunken oil is difficult to assess. For these reasons, together with the expense of field data collection, there is a need for a statistical technique integrating limited data collection with stochastic transport modeling. Predictive Bayesian modeling techniques have been developed and demonstrated for exploiting limited information for decision support in many other applications. These techniques are brought here to a multi-modal Lagrangian modeling framework, representing a near-real time approach to locating and tracking sunken oil driven by intrinsic physical properties of field data collected following a spill after oil has begun collecting on a relatively flat bay bottom. Methods include (1) development of the conceptual predictive Bayesian model and multi-modal Gaussian computational approach based on theory and literature review; (2) development of an object-oriented programming and combinatorial structure capable of managing data, integration and computation over an uncertain and highly dimensional parameter space; (3) creating a new bi-dimensional approach of the method of images to account for curved shoreline boundaries; (4) confirmation of model capability for locating sunken oil patches using available (partial) real field data and capability for temporal projections near curved boundaries using simulated field data; and (5) development of a stand-alone open-source computer application with graphical user interface capable of calibrating instantaneous oil spill scenarios, obtaining sets of maps of relative probability profiles at different prediction times and user-selected geographic areas and resolution, and capable of performing post
Shen, Jian; Zhao, Yuan
2010-01-01
Nonpoint source load estimation is an essential part of the development of the bacterial total maximum daily load (TMDL) mandated by the Clean Water Act. However, the currently widely used watershed-receiving water modeling approach is usually associated with a high level of uncertainty and requires long-term observational data and intensive training effort. The load duration curve (LDC) method recommended by the EPA provides a simpler way to estimate bacteria loading. This method, however, does not take into consideration the specific fate and transport mechanisms of the pollutant and cannot address the uncertainty. In this study, a Bayesian statistical approach is applied to the Escherichia coli TMDL development of a stream on the Eastern Shore of Virginia to inversely estimate watershed bacteria loads from the in-stream monitoring data. The mechanism of bacteria transport is incorporated. The effects of temperature, bottom slope, and flow on allowable and existing load calculations are discussed. The uncertainties associated with load estimation are also fully described. Our method combines the merits of LDC, mechanistic modeling, and Bayesian statistics, while overcoming some of the shortcomings associated with these methods. It is a cost-effective tool for bacteria TMDL development and can be modified and applied to multi-segment streams as well. PMID:19781737
Li, Ke; Zhang, Qiuju; Wang, Kun; Chen, Peng; Wang, Huaqing
2016-01-01
A new fault diagnosis method for rotating machinery based on an adaptive statistic test filter (ASTF) and a Diagnostic Bayesian Network (DBN) is presented in this paper. ASTF is proposed to obtain weak fault features under background noise; it is based on statistical hypothesis testing in the frequency domain to evaluate the similarity between a reference signal (noise signal) and the original signal, and to remove the components of high similarity. The optimal level of significance α is obtained using particle swarm optimization (PSO). To evaluate the performance of the ASTF, an evaluation factor Ipq is also defined. In addition, a simulation experiment is designed to verify the effectiveness and robustness of ASTF. A sensitivity evaluation method using principal component analysis (PCA) is proposed to evaluate the sensitivity of symptom parameters (SPs) for condition diagnosis. In this way, SPs with high sensitivity for condition diagnosis can be selected. A three-layer DBN is developed to identify the condition of rotating machinery based on Bayesian Belief Network (BBN) theory. A condition diagnosis experiment on rolling element bearings demonstrates the effectiveness of the proposed method. PMID:26761006
Statistical process control for lathes
Barkman, W.E.; Babelay, E.F.; Woodard, L.M.
1986-12-18
The Oak Ridge Y-12 Plant produces large numbers of hemishell workpieces using precision computer-controlled lathes. In order to improve the quality/productivity of these machines, a pilot project is under way to demonstrate the utility of automatic, on-machine measurement of key workpiece features. This system utilizes touch-trigger probes and an automatic tool changer to generate data for a host database that monitors and adjusts the machine's operations for variable machining conditions. This paper discusses the individual components, control software and data communications that are used to achieve an automated machining system which incorporates statistical process control.
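The statistical process control loop described can be sketched with a Shewhart X-bar chart built from subgroups of on-machine measurements: flag the process when a new subgroup mean leaves the 3-sigma limits. A sketch only; the plant's actual control logic and adjustment rules are not specified in the abstract.

```python
import math

def xbar_limits(subgroups):
    """Shewhart X-bar chart sketch: center line and 3-sigma control limits
    estimated from the within-subgroup variation of measured part features."""
    means = [sum(g) / len(g) for g in subgroups]
    grand = sum(means) / len(means)
    n = len(subgroups[0])
    # pooled within-subgroup variance
    var_within = sum(sum((x - m) ** 2 for x in g) / (n - 1)
                     for g, m in zip(subgroups, means)) / len(subgroups)
    sigma_mean = math.sqrt(var_within / n)   # std. dev. of a subgroup mean
    return grand - 3 * sigma_mean, grand, grand + 3 * sigma_mean

def out_of_control(subgroups, new_mean):
    """Flag a newly measured subgroup mean that falls outside the limits."""
    lcl, _, ucl = xbar_limits(subgroups)
    return not (lcl <= new_mean <= ucl)
```

In an automated setup, each new probed subgroup would be checked this way before the controller decides whether to compensate the tool path.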
Ni, Weiping; Yan, Weidong; Bian, Hui; Wu, Junzheng
2014-01-01
A novel fast SAR image change detection method is presented in this paper. Based on a Bayesian approach, the prior information that speckles follow the Nakagami distribution is incorporated into the difference image (DI) generation process. The new DI performs much better than the familiar log ratio (LR) DI as well as the cumulant-based Kullback-Leibler divergence (CKLD) DI. The statistical region merging (SRM) approach is first introduced to the change detection context. A new clustering procedure with the region variance as the statistical inference variable is presented to tailor it to SAR image change detection, with only two classes in the final map: the unchanged and changed classes. The most prominent advantages of the proposed modified SRM (MSRM) method are its ability to cope with noise corruption and its quick implementation. Experimental results show that the proposed method is superior in both change detection accuracy and operational efficiency. PMID:25258740
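The familiar log-ratio difference image that the paper improves upon is simple to state: DI = |log(I2/I1)|, which turns the multiplicative speckle of SAR imagery into an additive disturbance. A sketch on plain nested lists; `eps` guards against zero-valued pixels and is an implementation assumption.

```python
import math

def log_ratio_di(img1, img2, eps=1e-6):
    """Log-ratio difference image, the baseline the paper compares against:
    DI(x) = |log(I2(x) / I1(x))|, computed pixel-wise over two co-registered
    intensity images given as nested lists of floats."""
    return [[abs(math.log((b + eps) / (a + eps))) for a, b in zip(r1, r2)]
            for r1, r2 in zip(img1, img2)]
```

Unchanged pixels map to values near zero regardless of their absolute brightness, while genuinely changed pixels stand out; the paper's Nakagami-based DI refines this by modeling the speckle statistics explicitly.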
Application of Bayesian statistical techniques in the analysis of spacecraft pointing errors
NASA Astrophysics Data System (ADS)
Dungate, D. G.
1993-09-01
A key problem in the statistical analysis of spacecraft pointing performance is the justifiable identification of a Probability Density Function (PDF) for each contributing error source. The drawbacks of Gaussian distributions are well known, and more flexible families of distributions have been identified, but often only limited data is available to support PDF assignment. Two methods based on Bayesian statistical principles, each working from alternative viewpoints, are applied to the problem here, and appear to offer significant advantages in the analysis of many error types. In particular, errors such as time-varying thermal distortions, where data is only available via a small number of Finite Element Analyses, appear to be satisfactorily dealt with via one of these methods, which also explicitly allows for the inclusion of estimated errors in quantities formed from the data available for a particular error source.
Statistical Inference at Work: Statistical Process Control as an Example
ERIC Educational Resources Information Center
Bakker, Arthur; Kent, Phillip; Derry, Jan; Noss, Richard; Hoyles, Celia
2008-01-01
To characterise statistical inference in the workplace this paper compares a prototypical type of statistical inference at work, statistical process control (SPC), with a type of statistical inference that is better known in educational settings, hypothesis testing. Although there are some similarities between the reasoning structure involved in…
Exploring the Connection Between Sampling Problems in Bayesian Inference and Statistical Mechanics
NASA Technical Reports Server (NTRS)
Pohorille, Andrew
2006-01-01
The Bayesian and statistical mechanical communities often share the same objective in their work - estimating and integrating probability distribution functions (pdfs) describing stochastic systems, models or processes. Frequently, these pdfs are complex functions of random variables exhibiting multiple, well-separated local minima. Conventional strategies for sampling such pdfs are inefficient, sometimes leading to apparently non-ergodic behavior. Several recently developed techniques for handling this problem have been successfully applied in statistical mechanics. In the multicanonical and Wang-Landau Monte Carlo (MC) methods, the correct pdfs are recovered from uniform sampling of the parameter space by iteratively establishing proper weighting factors connecting these distributions. Trivial generalizations allow for sampling from any chosen pdf. The closely related transition matrix method relies on estimating transition probabilities between different states. All these methods have proved to generate estimates of pdfs with high statistical accuracy. In another MC technique, parallel tempering, several random walks, each corresponding to a different value of a parameter (e.g. "temperature"), are generated and occasionally exchanged using the Metropolis criterion. This method can be considered a statistically correct version of simulated annealing. An alternative approach is to represent the set of independent variables as a Hamiltonian system. Considerable progress has been made in understanding how to ensure that the system obeys the equipartition theorem or, equivalently, that coupling between the variables is correctly described. Then a host of techniques developed for dynamical systems can be used. Among them, probably the most powerful is the Adaptive Biasing Force method, in which thermodynamic integration and biased sampling are combined to yield very efficient estimates of pdfs. The third class of methods deals with transitions between states described
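The parallel-tempering scheme mentioned above can be illustrated on a toy double-well pdf. This is a minimal sketch under assumed settings (three temperatures, Gaussian proposals, a hand-picked target), not any production sampler: walkers at higher "temperature" cross the barrier easily, and Metropolis-accepted swaps propagate those crossings down to the cold chain.

```python
import math
import random

def log_pdf(x):
    # Double-well target with well-separated modes at x = -2 and x = +2.
    return -2.0 * min((x - 2.0) ** 2, (x + 2.0) ** 2)

def parallel_tempering(n_steps=20000, temps=(1.0, 3.0, 9.0), seed=1):
    rng = random.Random(seed)
    xs = [0.0 for _ in temps]            # one walker per temperature
    cold_samples = []
    for _ in range(n_steps):
        # Metropolis update within each temperature.
        for i, temp in enumerate(temps):
            prop = xs[i] + rng.gauss(0.0, 1.0)
            ratio = (log_pdf(prop) - log_pdf(xs[i])) / temp
            if ratio >= 0 or rng.random() < math.exp(ratio):
                xs[i] = prop
        # Occasionally exchange a random adjacent pair (Metropolis criterion).
        i = rng.randrange(len(temps) - 1)
        delta = (log_pdf(xs[i + 1]) - log_pdf(xs[i])) * (1.0 / temps[i] - 1.0 / temps[i + 1])
        if delta >= 0 or rng.random() < math.exp(delta):
            xs[i], xs[i + 1] = xs[i + 1], xs[i]
        cold_samples.append(xs[0])       # keep only the T = 1 chain
    return cold_samples

samples = parallel_tempering()
frac_right = sum(s > 0 for s in samples) / len(samples)
print(round(frac_right, 2))  # both wells visited; fraction should be near 0.5
```

A single Metropolis chain at T = 1 would almost never cross the barrier at x = 0; the exchanges are what restore ergodic sampling.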
Evaluation of Oceanic Transport Statistics By Use of Transient Tracers and Bayesian Methods
NASA Astrophysics Data System (ADS)
Trossman, D. S.; Thompson, L.; Mecking, S.; Bryan, F.; Peacock, S.
2013-12-01
Key variables that quantify the time scales over which atmospheric signals penetrate into the oceanic interior, and their uncertainties, are computed using Bayesian methods and transient tracers from both models and observations. First, the mean residence times, subduction rates, and formation rates of Subtropical Mode Water (STMW) and Subpolar Mode Water (SPMW) in the North Atlantic and Subantarctic Mode Water (SAMW) in the Southern Ocean are estimated by combining a model and observations of chlorofluorocarbon-11 (CFC-11) via Bayesian Model Averaging (BMA), a statistical technique that weights model estimates according to how closely they agree with observations. Second, a Bayesian method is presented to find two oceanic transport parameters associated with the age distribution of ocean waters, the transit-time distribution (TTD), by combining an eddying global ocean model's estimate of the TTD with hydrographic observations of CFC-11, temperature, and salinity. Uncertainties associated with objectively mapping irregularly spaced bottle data are quantified by making use of a thin-plate spline and then propagated via the two Bayesian techniques. It is found that the subduction of STMW, SPMW, and SAMW is mostly an advective process, but up to about one-third of STMW subduction is likely due to non-advective processes. Also, while the formation of STMW is mostly due to subduction, the formation of SPMW is mostly due to other processes. About half of the formation of SAMW is due to subduction and half is due to other processes. A combination of air-sea flux, acting on relatively short time scales, and turbulent mixing, acting on a wide range of time scales, is likely the dominant SPMW erosion mechanism. Air-sea flux is likely responsible for most STMW erosion, and turbulent mixing is likely responsible for most SAMW erosion. Two oceanic transport parameters, the mean age of a water parcel and the half-variance associated with the TTD, estimated using the model's tracers as
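The core BMA step, weighting model estimates by how closely they agree with observations, can be sketched in miniature. This is an illustrative toy with an assumed Gaussian likelihood and made-up numbers, not the study's implementation:

```python
import math

# Toy Bayesian Model Averaging: each model's weight is proportional to its
# likelihood of the observations (Gaussian with known sigma, an assumption
# made here purely for illustration).

def bma_weights(model_preds, obs, sigma=1.0):
    """model_preds: {name: list of predictions}; obs: observed values."""
    log_liks = {}
    for name, preds in model_preds.items():
        # Gaussian log-likelihood of the observations under this model.
        log_liks[name] = sum(
            -0.5 * ((p - o) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))
            for p, o in zip(preds, obs)
        )
    # Normalize in a numerically stable way (subtract the max log-likelihood).
    m = max(log_liks.values())
    unnorm = {k: math.exp(v - m) for k, v in log_liks.items()}
    z = sum(unnorm.values())
    return {k: v / z for k, v in unnorm.items()}

obs = [1.0, 2.0, 3.0]
preds = {"modelA": [1.1, 2.0, 2.9], "modelB": [2.0, 3.0, 4.0]}
w = bma_weights(preds, obs)
print({k: round(v, 3) for k, v in w.items()})  # → {'modelA': 0.816, 'modelB': 0.184}
```

The model closer to the observations dominates the average, but the poorer model retains nonzero weight, which is what lets BMA propagate structural uncertainty.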
NASA Astrophysics Data System (ADS)
Zhang, Xianliang; Yan, Xiaodong
2015-11-01
A new statistical downscaling method was developed and applied to downscale monthly total precipitation from 583 stations in China. Generally, there are two steps involved in statistical downscaling: first, the predictors (large-scale variables) are selected and transformed; and second, a model between the predictors and the predictand (in this case, precipitation) is established. In the first step, a selection process for the predictor domain, called the optimum correlation method (OCM), was developed to transform the predictors. The transformed series obtained by the OCM showed much better correlation with the predictand than those obtained by the traditional transform method for the same predictor. Moreover, the method combining OCM and linear regression obtained better downscaling results than the traditional linear regression method, suggesting that the OCM could be used to improve the results of statistical downscaling. In the second step, Bayesian model averaging (BMA) was adopted as an alternative to linear regression. The method combining the OCM and BMA showed much better performance than the method combining the OCM and linear regression. Thus, BMA could be used as an alternative to linear regression in the second step of statistical downscaling. In conclusion, the downscaling method combining OCM and BMA produces more accurate results than the multiple linear regression method when used to statistically downscale large-scale variables.
Bayesian Software Health Management for Aircraft Guidance, Navigation, and Control
NASA Technical Reports Server (NTRS)
Schumann, Johann; Mbaya, Timmy; Menghoel, Ole
2011-01-01
Modern aircraft, both piloted fly-by-wire commercial aircraft and UAVs, depend more and more on highly complex, safety-critical software systems with many sensors and computer-controlled actuators. Despite careful design and V&V of the software, severe incidents have happened due to malfunctioning software. In this paper, we discuss the use of Bayesian networks (BNs) to monitor the health of the on-board software and sensor system, and to perform advanced on-board diagnostic reasoning. We focus on the approach to developing reliable and robust health models for the combined software and sensor systems.
Shafieloo, Arman
2012-05-01
By introducing Crossing functions and hyper-parameters, I show that the Bayesian interpretation of the Crossing Statistics [1] can be used trivially for the purpose of model selection among cosmological models. In this approach, to falsify a cosmological model there is no need to compare it with other models or to assume any particular parametrization of cosmological quantities such as the luminosity distance, the Hubble parameter, or the equation of state of dark energy. Instead, the hyper-parameters of the Crossing functions act as discriminators between correct and wrong models. Using this approach one can falsify any assumed cosmological model without putting priors on the underlying actual model of the universe and its parameters; hence, the issue of dark energy parametrization is resolved. It is also shown that the sensitivity of the method to the intrinsic dispersion of the data is small, which is another important characteristic of the method when testing cosmological models against data with high uncertainties.
A Bayesian statistical model for hybrid metrology to improve measurement accuracy
NASA Astrophysics Data System (ADS)
Silver, R. M.; Zhang, N. F.; Barnes, B. M.; Qin, J.; Zhou, H.; Dixson, R.
2011-05-01
We present a method to combine measurements from different techniques that reduces uncertainties and can improve measurement throughput. The approach directly integrates the measurement analysis of multiple techniques that can include different configurations or platforms. This approach has immediate application when performing model-based optical critical dimension (OCD) measurements. When modeling optical measurements, a library of curves is assembled through the simulation of a multi-dimensional parameter space. Parametric correlation and measurement noise lead to measurement uncertainty in the fitting process with fundamental limitations resulting from the parametric correlations. A strategy to decouple parametric correlation and reduce measurement uncertainties is described. We develop the rigorous underlying Bayesian statistical model and apply this methodology to OCD metrology. We then introduce an approach to damp the regression process to achieve more stable and rapid regression fitting. These methods that use a priori information are shown to reduce measurement uncertainty and improve throughput while also providing an improved foundation for comprehensive reference metrology.
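The benefit of combining measurements from two techniques can be illustrated with the simplest Gaussian case: inverse-variance (precision) weighting, which is the elementary building block behind hybrid metrology. The numbers below are hypothetical; the paper's full Bayesian model also handles parametric correlation, which this one-parameter sketch omits.

```python
# Minimal sketch: Bayesian combination of two independent Gaussian
# measurements of the same quantity. Precisions act as weights, and the
# combined variance is always smaller than either input variance.

def combine(m1, var1, m2, var2):
    """Posterior mean and variance for one parameter measured twice."""
    w1, w2 = 1.0 / var1, 1.0 / var2       # precisions
    var = 1.0 / (w1 + w2)                 # combined variance shrinks
    mean = var * (w1 * m1 + w2 * m2)      # precision-weighted mean
    return mean, var

# Hypothetical linewidth values (nm): an optical (OCD-like) measurement
# combined with a more precise reference measurement.
mean, var = combine(32.4, 0.25, 32.0, 0.04)
print(round(mean, 2), round(var, 3))  # → 32.06 0.034
```

The combined estimate is pulled toward the more precise technique, and the posterior variance (0.034) is below both inputs, which is exactly the uncertainty reduction the abstract describes.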
Recovery of gastrointestinal tract motility detection using Naive Bayesian and minimum statistics.
Ulusar, Umit D
2014-08-01
Loss of gastrointestinal motility is a significant medical setback for patients who undergo abdominal surgery and contributes to the most common reason for prolonged hospital stays. Recent clinical studies suggest that initiating feeding early after abdominal surgery is beneficial. Early feeding is possible when patients demonstrate bowel motility in the form of bowel sounds (BS). This work provides a data collection, processing, and analysis methodology for detecting recovery of gastrointestinal tract motility by observing BS in auscultation recordings. The approach is suitable for real-time, long-term continuous monitoring in clinical environments. The system was developed using a Naive Bayesian algorithm for pattern classification, and Minimum Statistics and spectral subtraction for noise attenuation. The solution was tested on 59 h of recordings and 94.15% recognition accuracy was observed. PMID:24971526
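The Naive Bayesian classifier family used above can be sketched with a tiny Gaussian variant. The feature set here (two made-up acoustic features) and the data are purely illustrative, not the paper's features or recordings:

```python
import math
from statistics import mean, pstdev

# Minimal Gaussian Naive Bayes: per class, fit an independent Gaussian to
# each feature, then classify by maximum posterior probability.

class GaussianNB:
    def fit(self, X, y):
        self.stats = {}
        self.priors = {}
        for label in set(y):
            rows = [x for x, lab in zip(X, y) if lab == label]
            cols = list(zip(*rows))
            # Guard against zero spread with a tiny floor.
            self.stats[label] = [(mean(c), pstdev(c) or 1e-9) for c in cols]
            self.priors[label] = len(rows) / len(X)
        return self

    def predict(self, x):
        def log_post(label):
            lp = math.log(self.priors[label])
            for v, (mu, sd) in zip(x, self.stats[label]):
                lp += -0.5 * ((v - mu) / sd) ** 2 - math.log(sd)
            return lp
        return max(self.stats, key=log_post)

# Toy features: (spectral energy, noise floor) per audio segment.
X = [(0.9, 0.1), (0.8, 0.2), (0.1, 0.9), (0.2, 0.8)]
y = ["BS", "BS", "noise", "noise"]
clf = GaussianNB().fit(X, y)
print(clf.predict((0.85, 0.15)))  # → BS
```

In the paper's pipeline, Minimum Statistics and spectral subtraction would clean the audio before features like these are extracted and classified.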
How to construct the optimal Bayesian measurement in quantum statistical decision theory
NASA Astrophysics Data System (ADS)
Tanaka, Fuyuhiko
Recently, much more attention has been paid to studies aiming at applying fundamental properties of quantum theory to information processing and technology. In particular, modern statistical methods have been recognized in quantum state tomography (QST), where we have to estimate a density matrix (a positive semidefinite matrix of trace one) representing a quantum system from finite data collected in a certain experiment. When the dimension of the density matrix gets large (from a few hundred to millions), estimation becomes a nontrivial problem. While a specific measurement is often given and fixed in QST, we are also able to choose a measurement itself according to the purpose of QST by using quantum statistical decision theory. Here we propose a practical method to find the best projective measurement in the Bayesian sense. We assume that a prior distribution (e.g., the uniform distribution) and a convex loss function (e.g., the squared error) are given. In many quantum experiments, these assumptions are not so restrictive. We show that the best projective measurement and the best statistical inference based on the measurement outcome exist, and that they are obtained explicitly by using Monte Carlo optimization. Supported by the Grant-in-Aid for Scientific Research (B) (No. 26280005).
NASA Astrophysics Data System (ADS)
Moradkhani, Hamid
2015-04-01
Drought forecasting is vital for resource management and planning. Both societal and agricultural requirements weigh heavily on water resources, which may become scarce in the event of drought. Although drought forecasts are an important tool for managing water in hydrologic systems, these forecasts are plagued by uncertainties, owing to the complexities of water dynamics and the spatial heterogeneities of pertinent variables. Due to these uncertainties, it is necessary to frame forecasts in a probabilistic manner. Here we present a statistical-dynamical probabilistic drought forecast framework within Bayesian networks. The statistical forecast model applies a family of multivariate distribution functions to forecast future drought conditions given the drought status in the past. The advantage of the statistical forecast model is that it develops conditional probabilities of a given forecast variable, and returns the highest probable forecast along with an assessment of the uncertainty around that value. The dynamical model relies on data assimilation to characterize the initial land surface condition uncertainty, which correspondingly reflects on the drought forecast. In addition, recovery from drought is examined. From these forecasts, it is found that drought recovery is a longer process than suggested in recent literature. Drought in land surface variables (snow, soil moisture) is shown to be persistent up to a year in certain locations, depending on the intensity of the drought. Location within the basin appears to be a driving factor in the ability of the land surface to recover from drought, allowing for differentiation between drought prone and drought resistant regions.
Automated parameter estimation for biological models using Bayesian statistical model checking
2015-01-01
Background Probabilistic models have gained widespread acceptance in the systems biology community as a useful way to represent complex biological systems. Such models are developed using existing knowledge of the structure and dynamics of the system, experimental observations, and inferences drawn from statistical analysis of empirical data. A key bottleneck in building such models is that some system variables cannot be measured experimentally. These variables are incorporated into the model as numerical parameters. Determining values of these parameters that justify existing experiments and provide reliable predictions when model simulations are performed is a key research problem. Domain experts usually estimate the values of these parameters by fitting the model to experimental data. Model fitting is usually expressed as an optimization problem that requires minimizing a cost-function which measures some notion of distance between the model and the data. This optimization problem is often solved by combining local and global search methods that tend to perform well for the specific application domain. When some prior information about parameters is available, methods such as Bayesian inference are commonly used for parameter learning. Choosing the appropriate parameter search technique requires detailed domain knowledge and insight into the underlying system. Results Using an agent-based model of the dynamics of acute inflammation, we demonstrate a novel parameter estimation algorithm by discovering the amount and schedule of doses of bacterial lipopolysaccharide that guarantee a set of observed clinical outcomes with high probability. We synthesized values of twenty-eight unknown parameters such that the parameterized model instantiated with these parameter values satisfies four specifications describing the dynamic behavior of the model. Conclusions We have developed a new algorithmic technique for discovering parameters in complex stochastic models of
NASA Astrophysics Data System (ADS)
Norberg, J.; Virtanen, I. I.; Roininen, L.; Vierinen, J.; Orispää, M.; Kauristie, K.; Lehtinen, M. S.
2015-09-01
We validate two-dimensional ionospheric tomography reconstructions against EISCAT incoherent scatter radar measurements. Our tomography method is based on Bayesian statistical inversion with a prior distribution given by its mean and covariance. We employ ionosonde measurements for the choice of the prior mean and covariance parameters, and use Gaussian Markov random fields as a sparse matrix approximation for the numerical computations. This results in a computationally efficient and statistically clear inversion algorithm for tomography. We demonstrate how this method works with simultaneous beacon satellite and ionosonde measurements obtained in northern Scandinavia. The performance is compared with results obtained with a zero-mean prior and with the prior mean taken from the International Reference Ionosphere 2007 model. In validating the results, we use EISCAT UHF incoherent scatter radar measurements as the ground truth for the ionization profile shape. We find that ionosonde measurements improve the reconstruction by adding accurate information about the absolute value and the height distribution of electron density, and outperform the alternative prior information sources. With an ionosonde at continuous disposal, the presented method significantly enhances stand-alone near-real-time ionospheric tomography for the given conditions.
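The building block of such a Bayesian inversion, updating a prior mean and covariance with one integrated (ray-like) measurement, can be written down directly for the linear-Gaussian case. The two-pixel example below uses invented numbers; the actual tomography works with large GMRF-structured covariances rather than explicit dense matrices.

```python
# Linear-Gaussian Bayesian update for y = A·x + e, e ~ N(0, noise_var),
# with prior x ~ N(mu0, Sigma0). One scalar measurement, so the innovation
# covariance is a scalar and no matrix inversion is needed.

def bayes_update(mu0, Sigma0, A, y, noise_var):
    Ax = sum(a * m for a, m in zip(A, mu0))                      # predicted measurement
    SA = [sum(Sigma0[i][j] * A[j] for j in range(len(A)))        # Sigma0 · A
          for i in range(len(A))]
    s = sum(a * v for a, v in zip(A, SA)) + noise_var            # innovation covariance
    gain = [v / s for v in SA]                                   # Kalman-style gain
    return [m + g * (y - Ax) for m, g in zip(mu0, gain)]         # posterior mean

# Prior mean from an "ionosonde-like" profile; off-diagonal covariance
# encodes smoothness between neighboring electron-density pixels.
mu0 = [1.0, 1.0]
Sigma0 = [[1.0, 0.8], [0.8, 1.0]]
A = [1.0, 1.0]                      # the ray integrates both pixels equally
posterior = bayes_update(mu0, Sigma0, A, y=3.0, noise_var=0.1)
print([round(v, 3) for v in posterior])  # → [1.486, 1.486]
```

The measurement only constrains the sum of the pixels; it is the prior covariance that decides how the update is distributed between them, which is why the choice of prior (ionosonde vs. zero mean vs. IRI 2007) matters so much in the paper.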
Hewett, Paul; Bullock, William H
2014-01-01
For more than 20 years CSX Transportation (CSXT) has collected exposure measurements from locomotive engineers and conductors who are potentially exposed to diesel emissions. The database included measurements for elemental and total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, carbon monoxide, and nitrogen dioxide. This database was statistically analyzed and summarized, and the resulting statistics and exposure profiles were compared to relevant occupational exposure limits (OELs) using both parametric and non-parametric descriptive and compliance statistics. Exposure ratings, using the American Industrial Hygiene Association (AIHA) exposure categorization scheme, were determined using both the compliance statistics and Bayesian Decision Analysis (BDA). The statistical analysis of the elemental carbon data (a marker for diesel particulate) strongly suggests that the majority of levels in the cabs of the lead locomotives (n = 156) were less than the California guideline of 0.020 mg/m(3). The sample 95th percentile was roughly half the guideline, resulting in an AIHA exposure rating of category 2/3 (determined using BDA). The elemental carbon (EC) levels in the trailing locomotives tended to be greater than those in the lead locomotive; however, locomotive crews rarely ride in the trailing locomotive. Lead locomotive EC levels were similar to those reported by other investigators studying locomotive crew exposures and to levels measured in urban areas. Lastly, both the EC sample mean and 95% UCL were less than the Environmental Protection Agency (EPA) reference concentration of 0.005 mg/m(3). With the exception of nitrogen dioxide, the overwhelming majority of the measurements for total carbon, polycyclic aromatic hydrocarbons, aromatics, aldehydes, and combustion gases in the cabs of CSXT locomotives were either non-detects or considerably less than the working OELs for the years represented in the database. When compared to the previous American
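A central compliance statistic in this kind of assessment is the 95th percentile of a lognormal exposure distribution, compared against the OEL. The sketch below uses hypothetical EC concentrations, not the CSXT data, and the simple point-estimate formula rather than the full BDA machinery:

```python
import math
from statistics import mean, stdev

# Point estimate of the lognormal 95th percentile from a sample of
# exposure measurements: exp(mu_log + 1.645 * sigma_log), where mu_log and
# sigma_log are the mean and sd of the log-transformed data.

def lognormal_p95(samples):
    logs = [math.log(x) for x in samples]
    return math.exp(mean(logs) + 1.645 * stdev(logs))

# Hypothetical elemental-carbon measurements, mg/m^3.
data = [0.004, 0.006, 0.009, 0.005, 0.012, 0.007]
p95 = lognormal_p95(data)
guideline = 0.020                     # California EC guideline from the abstract
print(p95 < guideline)                # → True for these illustrative numbers
```

An AIHA-style exposure rating then categorizes the exposure by where the 95th percentile falls relative to the OEL (e.g., category 2 for 10-50% of the OEL); BDA refines this by attaching posterior probabilities to each category.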
NASA Astrophysics Data System (ADS)
Herschtal, A.; Foroudi, F.; Greer, P. B.; Eade, T. N.; Hindson, B. R.; Kron, T.
2012-05-01
Early approaches to characterizing errors in target displacement during a fractionated course of radiotherapy assumed that the underlying fraction-to-fraction variability in target displacement, known as the ‘treatment error’ or ‘random error’, could be regarded as constant across patients. More recent approaches have modelled target displacement allowing for differences in random error between patients. However, until recently it has not been feasible to compare the goodness of fit of alternate models of random error rigorously. This is because the large volumes of real patient data necessary to distinguish between alternative models have only very recently become available. This work uses real-world displacement data collected from 365 patients undergoing radical radiotherapy for prostate cancer to compare five candidate models for target displacement. The simplest model assumes constant random errors across patients, while other models allow for random errors that vary according to one of several candidate distributions. Bayesian statistics and Markov Chain Monte Carlo simulation of the model parameters are used to compare model goodness of fit. We conclude that modelling the random error as inverse gamma distributed provides a clearly superior fit over all alternatives considered. This finding can facilitate more accurate margin recipes and correction strategies.
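The winning model's structure, a patient-specific random-error variance drawn from an inverse-gamma distribution, can be sketched as follows. The shape and scale values are invented for illustration, not the fitted values from the 365-patient dataset:

```python
import random

# Hierarchical sketch: each patient's random-error variance is itself a
# draw from an inverse-gamma distribution, rather than one value shared by
# all patients. If G ~ Gamma(shape, rate=scale) then 1/G ~ InvGamma(shape, scale);
# random.gammavariate takes a *scale* parameter, hence the 1.0/scale below.

def sample_patient_sd(shape, scale, rng):
    g = rng.gammavariate(shape, 1.0 / scale)
    return (1.0 / g) ** 0.5            # standard deviation of that patient's error

rng = random.Random(0)
sds = [sample_patient_sd(shape=3.0, scale=2.0, rng=rng) for _ in range(5)]
print([round(s, 2) for s in sds])      # patient-to-patient spread in random error
```

Under the constant-error model all five values would be identical; the inverse-gamma model's heavier tail is what allows a minority of patients with unusually large day-to-day displacement, and hence the better fit reported above.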
Francis, Royce A; Vanbriesen, Jeanne M; Small, Mitchell J
2010-02-15
Statistical models are developed for bromine incorporation in the trihalomethane (THM), trihaloacetic acid (THAA), dihaloacetic acid (DHAA), and dihaloacetonitrile (DHAN) subclasses of disinfection byproducts (DBPs) using distribution system samples from plants applying only free chlorine as a primary or residual disinfectant in the Information Collection Rule (ICR) database. The objective of this study is to characterize the effect of water quality conditions before, during, and post-treatment on distribution system bromine incorporation into DBP mixtures. Bayesian Markov Chain Monte Carlo (MCMC) methods are used to model individual DBP concentrations and estimate the coefficients of the linear models used to predict the bromine incorporation fraction for distribution system DBP mixtures in each of the four priority DBP classes. The bromine incorporation models achieve good agreement with the data. The most important predictors of bromine incorporation fraction across DBP classes are alkalinity, specific UV absorption (SUVA), and the bromide to total organic carbon ratio (Br:TOC) at the first point of chlorine addition. Free chlorine residual in the distribution system, distribution system residence time, distribution system pH, turbidity, and temperature only slightly influence bromine incorporation. The bromide to applied chlorine (Br:Cl) ratio is not a significant predictor of the bromine incorporation fraction (BIF) in any of the four classes studied. These results indicate that removal of natural organic matter and the location of chlorine addition are important treatment decisions that have substantial implications for bromine incorporation into disinfection byproducts in drinking water. PMID:20095529
NASA Astrophysics Data System (ADS)
Stenning, D. C.; Wagner-Kaiser, R.; Robinson, E.; van Dyk, D. A.; von Hippel, T.; Sarajedini, A.; Stein, N.
2016-07-01
We develop a Bayesian model for globular clusters composed of multiple stellar populations, extending earlier statistical models for open clusters composed of simple (single) stellar populations. Specifically, we model globular clusters with two populations that differ in helium abundance. Our model assumes a hierarchical structuring of the parameters in which physical properties—age, metallicity, helium abundance, distance, absorption, and initial mass—are common to (i) the cluster as a whole or to (ii) individual populations within a cluster, or are unique to (iii) individual stars. An adaptive Markov chain Monte Carlo (MCMC) algorithm is devised for model fitting that greatly improves convergence relative to its precursor non-adaptive MCMC algorithm. Our model and computational tools are incorporated into an open-source software suite known as BASE-9. We use numerical studies to demonstrate that our method can recover parameters of two-population clusters, and also show how model misspecification can potentially be identified. As a proof of concept, we analyze the two stellar populations of globular cluster NGC 5272 using our model and methods. (BASE-9 is available from GitHub: https://github.com/argiopetech/base/releases).
NASA Astrophysics Data System (ADS)
Cunningham, A. C.; Wallinga, J.; Hobo, N.; Versendaal, A. J.; Makaske, B.; Middelkoop, H.
2015-01-01
The optically stimulated luminescence (OSL) signal from fluvial sediment often contains a remnant from the previous deposition cycle, leading to a partially bleached equivalent-dose distribution. Although identification of the burial dose is of primary concern, the degree of bleaching could potentially provide insights into sediment transport processes. However, comparison of bleaching between samples is complicated by sample-to-sample variation in aliquot size and luminescence sensitivity. Here we begin development of an age model to account for these effects. With measurement data from multi-grain aliquots, we use Bayesian computational statistics to estimate the burial dose and bleaching parameters of the single-grain dose distribution. We apply the model to 46 samples taken from fluvial sediment of Rhine branches in the Netherlands, and compare the results with environmental predictor variables (depositional environment, texture, sample depth, depth relative to mean water level, dose rate). Although obvious correlations with predictor variables are absent, there is some suggestion that the best-bleached samples are found close to the modern mean water level, and that the extent of bleaching has changed over the recent past. We hypothesise that sediment deposited near the transition of channel to overbank deposits receives the most sunlight exposure, due to local reworking after deposition. However, nearly all samples are inferred to have at least some well-bleached grains, suggesting that bleaching also occurs during fluvial transport.
Gaggiotti, Oscar E
2010-11-01
Ever since the introduction of allozymes in the 1960s, evolutionary biologists and ecologists have continued to search for more powerful molecular markers to estimate important parameters such as effective population size and migration rates, and to make inferences about the demographic history of populations, the relationships between individuals, and the genetic architecture of phenotypic variation (Bensch & Akesson 2005; Bonin et al. 2007). Choosing a marker requires a thorough consideration of the trade-offs associated with the different techniques and the type of data obtained from them. Some markers can be very informative but require substantial amounts of start-up time (e.g. microsatellites), while others require very little time but are much less polymorphic. Amplified fragment length polymorphism (AFLP) is a firmly established molecular marker technique that falls in this latter category. AFLPs are widely distributed throughout the genome and can be used on organisms for which there is no a priori sequence information (Meudt & Clarke 2007). These properties, together with their moderate cost and short start-up time, have made them the method of choice for many molecular ecology studies of wild species (Bensch & Akesson 2005). However, they have a major disadvantage: they are dominant. This represents a very important limitation because many statistical genetics methods appropriate for molecular ecology studies require the use of codominant markers. In this issue, Foll et al. (2010) present an innovative hierarchical Bayesian method that overcomes this limitation. The proposed approach represents a comprehensive statistical treatment of the fluorescence of AFLP bands and leads to accurate inferences about the genetic structure of natural populations. Besides allowing a quasi-codominant treatment of AFLPs, this new method also solves the difficult problems posed by subjectivity in the scoring of AFLP bands. PMID:20958811
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Harrington, Joseph; Blecic, Jasmina; Stemm, Madison M.; Lust, Nate B.; Foster, Andrew S.; Rojo, Patricio M.; Loredo, Thomas J.
2014-11-01
Multi-wavelength secondary-eclipse and transit depths probe the thermo-chemical properties of exoplanets. In recent years, several research groups have developed retrieval codes to analyze the existing data and study the prospects of future facilities. However, the scientific community has limited access to these packages. Here we premiere the open-source Bayesian Atmospheric Radiative Transfer (BART) code. We discuss the key aspects of the radiative-transfer algorithm and the statistical package. The radiation code includes line databases for all HITRAN molecules and high-temperature H2O, TiO, and VO, and includes a preprocessor for adding additional line databases without recompiling the radiation code. Collision-induced absorption lines are available for H2-H2 and H2-He. The parameterized thermal and molecular abundance profiles can be modified arbitrarily without recompilation. The generated spectra are integrated over arbitrary bandpasses for comparison to data. BART's statistical package, Multi-core Markov-chain Monte Carlo (MC3), is a general-purpose MCMC module. MC3 implements the Differential-Evolution Markov-chain Monte Carlo algorithm (ter Braak 2006, 2009). MC3 converges 20-400 times faster than the usual Metropolis-Hastings MCMC algorithm and, in addition, uses the Message Passing Interface (MPI) to parallelize the MCMC chains. We apply the BART retrieval code to the HD 209458b data set to estimate the planet's temperature profile and molecular abundances. This work was supported by NASA Planetary Atmospheres grant NNX12AI69G and NASA Astrophysics Data Analysis Program grant NNX13AF38G. JB holds a NASA Earth and Space Science Fellowship.
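The Differential-Evolution MCMC idea behind MC3 can be shown on a one-dimensional toy target. This sketch follows the published algorithm's core move (a jump along the difference of two other chains, scaled by γ = 2.38/√(2d), plus a small jitter), but it is a stripped-down illustration, not MC3 itself:

```python
import math
import random

def log_post(x):
    return -0.5 * x * x              # standard normal target, mean 0

def demc(n_chains=6, n_steps=4000, seed=7):
    rng = random.Random(seed)
    chains = [rng.uniform(-3.0, 3.0) for _ in range(n_chains)]
    gamma = 2.38 / math.sqrt(2.0 * 1)    # recommended scale for d = 1 parameter
    samples = []
    for _ in range(n_steps):
        for i in range(n_chains):
            # Propose a jump along the difference of two other chains.
            r1, r2 = rng.sample([j for j in range(n_chains) if j != i], 2)
            prop = chains[i] + gamma * (chains[r1] - chains[r2]) + rng.gauss(0.0, 1e-4)
            ratio = log_post(prop) - log_post(chains[i])
            if ratio >= 0 or rng.random() < math.exp(ratio):
                chains[i] = prop
            samples.append(chains[i])
    return samples

samples = demc()
m = sum(samples) / len(samples)
print(round(abs(m), 1))              # sample mean should land near the target mean 0
```

Because the proposal scale adapts automatically to the spread of the chain population, DE-MC needs no hand-tuned step size, which is the main source of the speedup over plain Metropolis-Hastings.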
Harrison, Jay M; Breeze, Matthew L; Berman, Kristina H; Harrigan, George G
2013-03-01
Bayesian approaches to evaluation of crop composition data allow simpler interpretations than traditional statistical significance tests. An important advantage of Bayesian approaches is that they allow formal incorporation of previously generated data through prior distributions in the analysis steps. This manuscript describes key steps to ensure meaningful and transparent selection and application of informative prior distributions. These include (i) review of previous data in the scientific literature to form the prior distributions, (ii) proper statistical model specification and documentation, (iii) graphical analyses to evaluate the fit of the statistical model to new study data, and (iv) sensitivity analyses to evaluate the robustness of results to the choice of prior distribution. The validity of the prior distribution for any crop component is critical to acceptance of Bayesian approaches to compositional analyses and would be essential for studies conducted in a regulatory setting. Selection and validation of prior distributions for three soybean isoflavones (daidzein, genistein, and glycitein) and two oligosaccharides (raffinose and stachyose) are illustrated in a comparative assessment of data obtained on GM and non-GM soybean seed harvested from replicated field sites at multiple locations in the US during the 2009 growing season. PMID:23261475
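Step (i)-(ii) above, folding literature-derived prior information into the analysis of new data, reduces in the simplest case to the conjugate normal-normal update. The numbers below are hypothetical stand-ins (an assumed prior for a mean isoflavone level and four made-up seed measurements), not values from the 2009 field study:

```python
# Conjugate normal-normal update with known data variance: the posterior
# precision is the sum of the prior precision and the data precision, and
# the posterior mean is the precision-weighted average.

def posterior_mean_var(prior_mu, prior_var, data, data_var):
    n = len(data)
    xbar = sum(data) / n
    post_var = 1.0 / (1.0 / prior_var + n / data_var)
    post_mu = post_var * (prior_mu / prior_var + n * xbar / data_var)
    return post_mu, post_var

# Hypothetical literature prior for mean daidzein (mg/g) and new measurements.
prior_mu, prior_var = 2.0, 0.5
data = [2.3, 2.1, 2.4, 2.2]
mu, var = posterior_mean_var(prior_mu, prior_var, data, data_var=0.2)
print(round(mu, 3), round(var, 3))   # → 2.227 0.045
```

Steps (iii)-(iv) would then check this fit graphically against the new data and rerun the update with alternative priors to confirm that conclusions are robust to the prior choice.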
Bayesian Statistical Analysis of Historical and Late Holocene Rates of Sea-Level Change
NASA Astrophysics Data System (ADS)
Cahill, Niamh; Parnell, Andrew; Kemp, Andrew; Horton, Benjamin
2014-05-01
A fundamental concern associated with climate change is the rate at which sea levels are rising. Studies of past sea level (particularly beyond the instrumental data range) allow modern sea-level rise to be placed in a more complete context. Considering this, we perform a Bayesian statistical analysis on historical and late Holocene rates of sea-level change. The data that form the input to the statistical model are tide-gauge measurements and proxy reconstructions from cores of coastal sediment. The aims are to estimate rates of sea-level rise, to determine when modern rates of sea-level rise began and to observe how these rates have been changing over time. Many of the current methods for doing this use simple linear regression to estimate rates. This is often inappropriate as it is too rigid and can ignore uncertainties that arise as part of the data collection exercise. This can lead to overconfidence in the sea-level trends being characterized. The proposed Bayesian model places a Gaussian process prior on the rate process (i.e. the process that determines how rates of sea level are changing over time). The likelihood of the observed data is the integral of this process. When dealing with proxy reconstructions, this is set in an errors-in-variables framework so as to take account of age uncertainty. It is also necessary, in this case, for the model to account for glacio-isostatic adjustment, which introduces a covariance between individual age and sea-level observations. This method provides a flexible fit and it allows for the direct estimation of the rate process with full consideration of all sources of uncertainty. Analysis of tide-gauge datasets and proxy reconstructions in this way means that changing rates of sea level can be estimated more comprehensively and accurately than previously possible. The model captures the continuous and dynamic evolution of sea-level change and results show that not only are modern sea levels rising but that the rates
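The core modeling idea can be sketched in a few lines, assuming a squared-exponential covariance: rate curves are drawn from a GP prior and modeled sea level is their cumulative integral. The paper's errors-in-variables likelihood and glacio-isostatic terms are omitted here, and all numbers are ours:

```python
import numpy as np

def sq_exp_kernel(t1, t2, sigma=1.0, ell=30.0):
    """Squared-exponential covariance between time points (years)."""
    d = t1[:, None] - t2[None, :]
    return sigma**2 * np.exp(-0.5 * (d / ell) ** 2)

# Draw candidate rate histories r(t) from the GP prior; the modeled sea
# level is the cumulative integral of each sampled rate curve.
t = np.linspace(0, 300, 151)                       # years
K = sq_exp_kernel(t, t) + 1e-8 * np.eye(t.size)    # jitter for stability
rng = np.random.default_rng(1)
rates = rng.multivariate_normal(np.zeros(t.size), K, size=5)
dt = t[1] - t[0]
sea_level = np.cumsum(rates, axis=1) * dt          # crude rectangle rule
```

Conditioning such samples on tide-gauge and proxy observations, rather than just drawing them from the prior, is what turns this sketch into the full posterior analysis.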
A Bayesian statistical assessment of representative samples for asteroidal or meteoritical material
NASA Astrophysics Data System (ADS)
Carter, Jonathan N.; Sephton, Mark A.
2013-06-01
Primitive substances in asteroid and meteorite materials represent a record of early solar system evolution. To allow the study of these materials, they must be collected and transferred to the laboratory. Collection during sample return missions requires an assessment of the size of samples needed. Meteorite falls or finds must be subdivided into appropriate subsamples for analysis by successive generations of scientists. It is essential, therefore, to determine the smallest mass or volume at which a collected or allocated sample is representative of the whole. For the first time, we have used a Bayesian statistical approach and a selected meteorite sample, Murchison, to identify a recommended smallest sample mass that can be used without interferences from sampling bias. Enhancing background knowledge to inform sample selection and analysis is an effective means of increasing the probability of obtaining a positive scientific outcome. The influence of the subdivision mechanism when preparing samples for distribution has also been examined. Assuming a similar size distribution of fragments to that of the Murchison meteorite, cubes can be as representative as fragments, but at orders of magnitude smaller sizes. We find that: (1) at all defined probabilities (90%, 95%, and 99%), nanometer-sized particles (where the axes of a three-dimensional sample are less than a nanometer in length) are never representative of the whole; (2) at the intermediate and highest defined probabilities (95% and 99%), micrometer-sized particles are never representative of the whole; and (3) for micrometer-sized samples, the only sample that is representative of the whole is a cube and then only at a 90% probability. The difference between cubes and fragments becomes less important as sample size increases and any >0.5 mm-sized sample will be representative of the whole with a probability of 99.9%. The results provide guidance for sample return mission planners and curators or
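The flavor of such a calculation can be conveyed with a deliberately crude stand-in: if grains of a trace component were Poisson-distributed through the mass (our simplifying assumption with invented numbers, not the authors' Bayesian model), the chance that a subsample captures at least one grain is:

```python
import math

def prob_representative(mass_g, grains_per_g, min_grains=1):
    """Poisson sketch: probability a subsample of given mass contains at
    least `min_grains` grains of a trace component (illustrative only)."""
    lam = mass_g * grains_per_g          # expected grain count in the subsample
    return 1.0 - sum(math.exp(-lam) * lam**k / math.factorial(k)
                     for k in range(min_grains))

p = prob_representative(mass_g=0.5, grains_per_g=10.0)   # expected 5 grains
```

The qualitative behavior matches the abstract's findings: the probability collapses toward zero for vanishingly small subsamples and approaches certainty once the expected grain count is large.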
An overview of component qualification using Bayesian statistics and energy methods.
Dohner, Jeffrey Lynn
2011-09-01
This overview is designed to give the reader a limited understanding of Bayesian and maximum likelihood (MLE) estimation; a basic understanding of some of the mathematical tools used to evaluate the quality of an estimate; an introduction to energy methods; and a limited discussion of damage potential. The discussion then presents, in a limited way, how energy methods and Bayesian estimation are used together to qualify components. Example problems with solutions have been supplied as a learning aid. Bold letters are used to represent random variables; un-bolded letters represent deterministic values. A concluding section presents a discussion of attributes and concerns.
Statistical Physics for Adaptive Distributed Control
NASA Technical Reports Server (NTRS)
Wolpert, David H.
2005-01-01
A viewgraph presentation on statistical physics for distributed adaptive control is shown. The topics include: 1) The Golden Rule; 2) Advantages; 3) Roadmap; 4) What is Distributed Control? 5) Review of Information Theory; 6) Iterative Distributed Control; 7) Minimizing L(q) Via Gradient Descent; and 8) Adaptive Distributed Control.
NASA Astrophysics Data System (ADS)
Schöniger, Anneli; Wöhling, Thomas; Nowak, Wolfgang
2015-09-01
Bayesian model averaging (BMA) ranks the plausibility of alternative conceptual models according to Bayes' theorem. A prior belief about each model's adequacy is updated to a posterior model probability based on each model's skill in reproducing the observed data and on the principle of parsimony. The posterior model probabilities are then used as model weights for model ranking, selection, or averaging. Despite the statistically rigorous BMA procedure, model weights can become uncertain quantities due to measurement noise in the calibration data set or due to uncertainty in model input. Uncertain weights may in turn compromise the reliability of BMA results. We present a new statistical concept to investigate this weighting uncertainty, and thus to assess the significance of model weights and the confidence in model ranking. Our concept is to resample the uncertain input or output data and then to analyze the induced variability in model weights. In the special case of weighting uncertainty due to measurement noise in the calibration data set, we interpret statistics of Bayesian model evidence to assess the distance of a model's performance from the theoretical upper limit. To illustrate our suggested approach, we investigate the reliability of soil-plant model selection following up on a study by Wöhling et al. (2015). Results show that the BMA routine should be equipped with our suggested upgrade to (1) reveal the significant but otherwise undetected impact of measurement noise on model ranking results and (2) decide whether the considered set of models should be extended with better performing alternatives.
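The weight computation itself is a direct application of Bayes' theorem; a hedged sketch with hypothetical log evidences, worked in log space for numerical stability:

```python
import numpy as np

def bma_weights(log_evidence, log_prior=None):
    """Posterior model weights from log model evidences via Bayes' theorem."""
    log_ev = np.asarray(log_evidence, dtype=float)
    if log_prior is None:
        # uniform prior belief over the considered set of models
        log_prior = np.zeros_like(log_ev) - np.log(log_ev.size)
    log_w = log_ev + log_prior
    log_w -= log_w.max()          # subtract max before exponentiating
    w = np.exp(log_w)
    return w / w.sum()

# hypothetical log evidences for three competing conceptual models
w = bma_weights([-120.3, -118.9, -125.0])
```

Because only evidence differences matter, adding a constant to every log evidence leaves the weights unchanged, which is why the max-subtraction trick is safe.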
Improving Instruction Using Statistical Process Control.
ERIC Educational Resources Information Center
Higgins, Ronald C.; Messer, George H.
1990-01-01
Two applications of statistical process control to the process of education are described. Discussed are the use of prompt feedback to teachers and prompt feedback to students. A sample feedback form is provided. (CW)
Bayesian Statistical Inference in Ion-Channel Models with Exact Missed Event Correction.
Epstein, Michael; Calderhead, Ben; Girolami, Mark A; Sivilotti, Lucia G
2016-07-26
The stochastic behavior of single ion channels is most often described as an aggregated continuous-time Markov process with discrete states. For ligand-gated channels each state can represent a different conformation of the channel protein or a different number of bound ligands. Single-channel recordings show only whether the channel is open or shut: states of equal conductance are aggregated, so transitions between them have to be inferred indirectly. The requirement to filter noise from the raw signal further complicates the modeling process, as it limits the time resolution of the data. The consequence of the reduced bandwidth is that openings or shuttings that are shorter than the resolution cannot be observed; these are known as missed events. Postulated models fitted using filtered data must therefore explicitly account for missed events to avoid bias in the estimation of rate parameters and to allow parameter identifiability to be assessed accurately. In this article, we present the first, to our knowledge, Bayesian modeling of ion channels with exact missed events correction. Bayesian analysis represents uncertain knowledge of the true value of model parameters by considering these parameters as random variables. This allows us to gain a full appreciation of parameter identifiability and uncertainty when estimating values for model parameters. However, Bayesian inference is particularly challenging in this context as the correction for missed events increases the computational complexity of the model likelihood. Nonetheless, we successfully implemented a two-step Markov chain Monte Carlo method that we called "BICME", which performs Bayesian inference in models of realistic complexity. The method is demonstrated on synthetic and real single-channel data from muscle nicotinic acetylcholine channels. We show that parameter uncertainty can be characterized more accurately than with maximum-likelihood methods. Our code for performing inference in these ion channel
Wafer, Lucas; Kloczewiak, Marek; Luo, Yin
2016-07-01
Analytical ultracentrifugation-sedimentation velocity (AUC-SV) is often used to quantify high molar mass species (HMMS) present in biopharmaceuticals. Although these species are often present in trace quantities, they have received significant attention due to their potential immunogenicity. Commonly, AUC-SV data is analyzed as a diffusion-corrected, sedimentation coefficient distribution, or c(s), using SEDFIT to numerically solve Lamm-type equations. SEDFIT also utilizes maximum entropy or Tikhonov-Phillips regularization to further allow the user to determine relevant sample information, including the number of species present, their sedimentation coefficients, and their relative abundance. However, this methodology has several, often unstated, limitations, which may impact the final analysis of protein therapeutics. These include regularization-specific effects, artificial "ripple peaks," and spurious shifts in the sedimentation coefficients. In this investigation, we experimentally verified that an explicit Bayesian approach, as implemented in SEDFIT, can largely correct for these effects. Clear guidelines on how to implement this technique and interpret the resulting data, especially for samples containing micro-heterogeneity (e.g., differential glycosylation), are also provided. In addition, we demonstrated how the Bayesian approach can be combined with F statistics to draw more accurate conclusions and rigorously exclude artifactual peaks. Numerous examples with an antibody and an antibody-drug conjugate were used to illustrate the strengths and drawbacks of each technique. PMID:27184576
NASA Astrophysics Data System (ADS)
Joshi, Deepti; St-Hilaire, André; Daigle, Anik; Ouarda, Taha B. M. J.
2013-04-01
This study compares the performance of two statistical downscaling frameworks in downscaling hydrological indices (descriptive statistics) characterizing the low-flow regimes of three rivers in Eastern Canada - Moisie, Romaine and Ouelle. The statistical models selected are Relevance Vector Machine (RVM), an implementation of Sparse Bayesian Learning, and the Automated Statistical Downscaling tool (ASD), an implementation of Multiple Linear Regression. Inputs to both frameworks involve climate variables significantly (α = 0.05) correlated with the indices. These variables were processed using Canonical Correlation Analysis and the resulting canonical variates scores were used as input to RVM to estimate the selected low flow indices. In ASD, the significantly correlated climate variables were subjected to backward stepwise predictor selection and the selected predictors were subsequently used to estimate the selected low flow indices using Multiple Linear Regression. With respect to the correlation between climate variables and the selected low flow indices, it was observed that all indices are influenced, primarily, by wind components (Vertical, Zonal and Meridional) and humidity variables (Specific and Relative Humidity). The downscaling performance of the framework involving RVM was found to be better than ASD in terms of Relative Root Mean Square Error, Relative Mean Absolute Bias and Coefficient of Determination. In all cases, the former resulted in less variability of the performance indices between calibration and validation sets, implying better generalization ability than for the latter.
An absolute chronology for early Egypt using radiocarbon dating and Bayesian statistical modelling
Dee, Michael; Wengrow, David; Shortland, Andrew; Stevenson, Alice; Brock, Fiona; Girdland Flink, Linus; Bronk Ramsey, Christopher
2013-01-01
The Egyptian state was formed prior to the existence of verifiable historical records. Conventional dates for its formation are based on the relative ordering of artefacts. This approach is no longer considered sufficient for cogent historical analysis. Here, we produce an absolute chronology for Early Egypt by combining radiocarbon and archaeological evidence within a Bayesian paradigm. Our data cover the full trajectory of Egyptian state formation and indicate that the process occurred more rapidly than previously thought. We provide a timeline for the First Dynasty of Egypt of generational-scale resolution that concurs with prevailing archaeological analysis and produce a chronometric date for the foundation of Egypt that distinguishes between historical estimates. PMID:24204188
Statistical process control in nursing research.
Polit, Denise F; Chaboyer, Wendy
2012-02-01
In intervention studies in which randomization to groups is not possible, researchers typically use quasi-experimental designs. Time series designs are strong quasi-experimental designs but are seldom used, perhaps because of technical and analytic hurdles. Statistical process control (SPC) is an alternative analytic approach to testing hypotheses about intervention effects using data collected over time. SPC, like traditional statistical methods, is a tool for understanding variation and involves the construction of control charts that distinguish between normal, random fluctuations (common cause variation), and statistically significant special cause variation that can result from an innovation. The purpose of this article is to provide an overview of SPC and to illustrate its use in a study of a nursing practice improvement intervention. PMID:22095634
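The control-chart idea reduces to estimating common-cause variability and flagging points outside roughly three standard errors of it. A sketch of an individuals chart using the average moving range (invented data; 1.128 is the standard d2 factor for moving ranges of two):

```python
import numpy as np

def shewhart_limits(samples):
    """Individuals (X) chart limits from the average moving range, the
    usual basis for separating common- from special-cause variation."""
    x = np.asarray(samples, dtype=float)
    mr = np.abs(np.diff(x)).mean()     # average moving range of 2
    sigma_hat = mr / 1.128             # d2 constant for n = 2
    center = x.mean()
    return center - 3 * sigma_hat, center, center + 3 * sigma_hat

# made-up baseline measurements collected over time
lcl, center, ucl = shewhart_limits([10.1, 9.8, 10.4, 10.0, 9.9, 10.2])
```

Post-intervention observations falling outside (lcl, ucl), or forming systematic runs, would count as special-cause variation attributable to the innovation.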
Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems
NASA Technical Reports Server (NTRS)
He, Yuning; Davies, Misty Dawn
2014-01-01
The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.
Applied Behavior Analysis and Statistical Process Control?
ERIC Educational Resources Information Center
Hopkins, B. L.
1995-01-01
Incorporating statistical process control (SPC) methods into applied behavior analysis is discussed. It is claimed that SPC methods would likely reduce applied behavior analysts' intimate contacts with problems and would likely yield poor treatment and research decisions. Cases and data presented by Pfadt and Wheeler (1995) are cited as examples.…
Statistical process control for total quality
NASA Astrophysics Data System (ADS)
Ali, Syed W.
1992-06-01
The paper explains the techniques and applications of statistical process control (SPC). Examples of control charts used in the Poseidon program of the NASA ocean topography experiment (TOPEX) and a brief discussion of Taguchi methods are presented. It is noted that SPC involves everyone in process improvement by providing objective, workable data. It permits continuous improvement instead of merely aiming for all parts to be within a tolerance band.
A Dynamic Bayesian Network Model for the Production and Inventory Control
NASA Astrophysics Data System (ADS)
Shin, Ji-Sun; Takazaki, Noriyuki; Lee, Tae-Hong; Kim, Jin-Il; Lee, Hee-Hyol
In general, production quantities and delivered goods vary randomly, and consequently the total stock also varies randomly. This paper deals with production and inventory control using a Dynamic Bayesian Network. A Bayesian Network is a probabilistic model which represents the qualitative dependence between two or more random variables by a graph structure, and indicates the quantitative relations between individual variables by conditional probabilities. The probabilistic distribution of the total stock is calculated through propagation of probability on the network. Moreover, an adjusting rule for the production quantities is shown that maintains the probability of the total stock staying between a lower limit and a ceiling at specified values.
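The propagation step can be illustrated with a toy discrete version (our construction, not the paper's network): the distribution of next-period stock is obtained by summing over all production and demand outcomes.

```python
# Toy dynamic sketch: stock_{t+1} = stock_t + production - demand, with
# production and demand modeled as independent discrete random variables.
def stock_distribution(p_stock, p_prod, p_demand):
    """Propagate the stock probability distribution one period forward.
    Each argument maps an integer level to its probability."""
    out = {}
    for s, ps in p_stock.items():
        for q, pq in p_prod.items():
            for d, pd in p_demand.items():
                nxt = s + q - d
                out[nxt] = out.get(nxt, 0.0) + ps * pq * pd
    return out

# start with 5 units in stock; produce 2 or 3; demand is 1 or 2 (made-up numbers)
p_next = stock_distribution({5: 1.0}, {2: 0.5, 3: 0.5}, {1: 0.3, 2: 0.7})
```

An adjusting rule of the kind the paper describes would then shift the production distribution until, say, the probability of the stock staying within given bounds reaches a target value.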
New Insights into the Genetic Control of Gene Expression using a Bayesian Multi-tissue Approach
Langley, Sarah R.; Heinig, Matthias; McDermott-Roe, Chris; Sarwar, Rizwan; Pravenec, Michal; Hübner, Norbert; Aitman, Timothy J.; Cook, Stuart A.; Richardson, Sylvia
2010-01-01
The majority of expression quantitative trait locus (eQTL) studies have been carried out in single tissues or cell types, using methods that ignore information shared across tissues. Although global analysis of RNA expression in multiple tissues is now feasible, few integrated statistical frameworks for joint analysis of gene expression across tissues combined with simultaneous analysis of multiple genetic variants have been developed to date. Here, we propose Sparse Bayesian Regression models for mapping eQTLs within individual tissues and simultaneously across tissues. Testing these on a set of 2,000 genes in four tissues, we demonstrate that our methods are more powerful than traditional approaches in revealing the true complexity of the eQTL landscape at the systems-level. Highlighting the power of our method, we identified a two-eQTL model (cis/trans) for the Hopx gene that was experimentally validated and was not detected by conventional approaches. We showed common genetic regulation of gene expression across four tissues for ∼27% of transcripts, providing >5 fold increase in eQTLs detection when compared with single tissue analyses at 5% FDR level. These findings provide a new opportunity to uncover complex genetic regulatory mechanisms controlling global gene expression while the generality of our modelling approach makes it adaptable to other model systems and humans, with broad application to analysis of multiple intermediate and whole-body phenotypes. PMID:20386736
Bayesian statistics applied to the location of the source of explosions at Stromboli Volcano, Italy
Saccorotti, G.; Chouet, B.; Martini, M.; Scarpa, R.
1998-01-01
We present a method for determining the location and spatial extent of the source of explosions at Stromboli Volcano, Italy, based on a Bayesian inversion of the slowness vector derived from frequency-slowness analyses of array data. The method searches for source locations that minimize the error between the expected and observed slowness vectors. For a given set of model parameters, the conditional probability density function of slowness vectors is approximated by a Gaussian distribution of expected errors. The method is tested with synthetics using a five-layer velocity model derived for the north flank of Stromboli and a smoothed velocity model derived from a power-law approximation of the layered structure. Application to data from Stromboli allows for a detailed examination of uncertainties in source location due to experimental errors and incomplete knowledge of the Earth model. Although the solutions are not constrained in the radial direction, excellent resolution is achieved in both transverse and depth directions. Under the assumption that the horizontal extent of the source does not exceed the crater dimension, the 90% confidence region in the estimate of the explosive source location corresponds to a small volume extending from a depth of about 100 m to a maximum depth of about 300 m beneath the active vents, with a maximum likelihood source region located in the 120- to 180-m-depth interval.
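Under the stated Gaussian-error assumption, the posterior over candidate source locations is proportional to the exponential of minus half the normalized slowness misfit. A toy sketch of that weighting (hypothetical numbers and names, not the authors' code):

```python
import numpy as np

def location_posterior(predicted, observed, sigma):
    """Normalized posterior weights over candidate sources, assuming
    Gaussian slowness-vector errors with standard deviation sigma.
    predicted: (M, 2) expected slowness vector for each candidate."""
    misfit = np.sum((predicted - observed) ** 2, axis=1)
    log_like = -0.5 * misfit / sigma**2
    w = np.exp(log_like - log_like.max())   # stabilize the exponentials
    return w / w.sum()

# made-up slowness predictions (s/km) for three candidate source positions
predicted = np.array([[0.30, 0.10], [0.29, 0.11], [0.40, 0.05]])
observed = np.array([0.29, 0.11])
post = location_posterior(predicted, observed, sigma=0.02)
```

Confidence regions like the paper's 90% volume follow by accumulating these weights over the candidate grid until the desired probability mass is enclosed.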
Multiple LacI-mediated loops revealed by Bayesian statistics and tethered particle motion
Johnson, Stephanie; van de Meent, Jan-Willem; Phillips, Rob; Wiggins, Chris H.; Lindén, Martin
2014-01-01
The bacterial transcription factor LacI loops DNA by binding to two separate locations on the DNA simultaneously. Despite being one of the best-studied model systems for transcriptional regulation, the number and conformations of loop structures accessible to LacI remain unclear, though the importance of multiple coexisting loops has been implicated in interactions between LacI and other cellular regulators of gene expression. To probe this issue, we have developed a new analysis method for tethered particle motion, a versatile and commonly used in vitro single-molecule technique. Our method, vbTPM, performs variational Bayesian inference in hidden Markov models. It learns the number of distinct states (i.e. DNA–protein conformations) directly from tethered particle motion data with better resolution than existing methods, while easily correcting for common experimental artifacts. Studying short (roughly 100 bp) LacI-mediated loops, we provide evidence for three distinct loop structures, more than previously reported in single-molecule studies. Moreover, our results confirm that changes in LacI conformation and DNA-binding topology both contribute to the repertoire of LacI-mediated loops formed in vitro, and provide qualitatively new input for models of looping and transcriptional regulation. We expect vbTPM to be broadly useful for probing complex protein–nucleic acid interactions. PMID:25120267
Applying statistical process control to the adaptive rate control problem
NASA Astrophysics Data System (ADS)
Manohar, Nelson R.; Willebeek-LeMair, Marc H.; Prakash, Atul
1997-12-01
Due to the heterogeneity and shared-resource nature of today's computer network environments, the end-to-end delivery of multimedia requires adaptive mechanisms to be effective. We present a framework for the adaptive streaming of heterogeneous media. We introduce the application of online statistical process control (SPC) to the problem of dynamic rate control. In SPC, the goal is to establish (and preserve) a state of statistical quality control (i.e., controlled variability around a target mean) over a process. We consider the end-to-end streaming of multimedia content over the internet as the process to be controlled. First, at each client, we measure process performance and apply statistical quality control (SQC) with respect to application-level requirements. Then, we guide an adaptive rate control (ARC) problem at the server based on the statistical significance of trends and departures on these measurements. We show this scheme facilitates handling of heterogeneous media. Last, because SPC is designed to monitor long-term process performance, we show that our online SPC scheme could be used to adapt to various degrees of long-term (network) variability (i.e., statistically significant process shifts as opposed to short-term random fluctuations). We develop several examples and analyze the scheme's statistical behavior and guarantees.
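A minimal sketch of such an SPC-guided adaptation loop, assuming delay measurements with a known target and standard deviation (the function name, thresholds, and numbers are ours, not the paper's):

```python
import numpy as np

def adapt_rate(rate, window, target, sigma, k=3.0, step=0.1):
    """Adjust the server's send rate only on statistically significant
    shifts in measured delay; common-cause fluctuation leaves it alone."""
    mean = np.mean(window)
    se = sigma / np.sqrt(len(window))   # standard error of the window mean
    if mean > target + k * se:
        return rate * (1 - step)        # sustained congestion: back off
    if mean < target - k * se:
        return rate * (1 + step)        # sustained headroom: speed up
    return rate                         # in control: no change

# target delay 50 ms with sigma 5 ms; recent measurements show congestion
r = adapt_rate(100.0, [68, 72, 70, 69], target=50.0, sigma=5.0)
```

The k-sigma band is what distinguishes statistically significant process shifts from the short-term random fluctuations the abstract mentions.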
Statistical Process Control In Photolithography Applications
NASA Astrophysics Data System (ADS)
Pritchard, Lois B.
1987-04-01
Recently there have been numerous papers, articles and books on the benefits and rewards of Statistical Process Control for manufacturing processes. Models are used that quite adequately describe methods appropriate for the factory situation where many discrete and identical items are turned out and where a limited number of parameters are inspected along the line. Photolithographic applications often require different statistical models from the usual factory methods. The difficulties encountered in getting started with SPC lie in determining: (1) what parameters should be tracked; (2) what statistical model is appropriate for each of those parameters; and (3) how to use the models chosen. This paper describes three statistical models that, among them, account for most operations within a photolithographic manufacturing application. The process of determining which model is appropriate is described, along with the basic rules that may be used in making the determination. In addition, the application of each method is shown, and action instructions are covered. Initially the "x-bar, R" model is described. This model is the one most often found in off-the-shelf software packages, and enjoys wide application in equipment tracking as well as general process control. Secondly the "x, moving-R" model is described. This is appropriate where a series of measurements of the same parameter is taken on a single item, perhaps at different locations, such as in dimensional uniformity control for wafers or photomasks. In this case, each "x" is a single observation, or a number of measurements of a single observation, as opposed to a mean value taken in a sampling scheme. Thirdly a model for a Poisson distribution is described, which tends to fit defect density data and particulate counts, where count data is accumulated per unit or per unit time. The purpose of the paper is to briefly describe the included models, for those with little or no background in statistics, to enable them to
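For the third (Poisson) model, control limits for count data follow directly from the Poisson mean-equals-variance property; a sketch of c-chart limits with invented particle counts:

```python
import math

def c_chart_limits(defect_counts):
    """Poisson c-chart limits for defect counts per inspection unit
    (e.g. particles per wafer). Limits are c_bar +/- 3*sqrt(c_bar)."""
    c_bar = sum(defect_counts) / len(defect_counts)
    half_width = 3.0 * math.sqrt(c_bar)
    # a Poisson count cannot go below zero, so clamp the lower limit
    return max(0.0, c_bar - half_width), c_bar, c_bar + half_width

# made-up particle counts from six consecutive wafers
lcl, center, ucl = c_chart_limits([4, 6, 3, 5, 7, 5])
```

The clamped lower limit is common for low-count processes: with a mean of 5 defects the natural lower limit is negative, so only the upper limit is actionable.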
NASA Astrophysics Data System (ADS)
Schöniger, Anneli; Wöhling, Thomas; Nowak, Wolfgang
2014-05-01
Bayesian model averaging ranks the predictive capabilities of alternative conceptual models based on Bayes' theorem. The individual models are weighted with their posterior probability to be the best one in the considered set of models. Finally, their predictions are combined into a robust weighted average and the predictive uncertainty can be quantified. However, this rigorous procedure does not yet account for possible instabilities due to measurement noise in the calibration data set. This is a major drawback, since posterior model weights may suffer a lack of robustness related to the uncertainty in noisy data, which may compromise the reliability of model ranking. We present a new statistical concept to account for measurement noise as a source of uncertainty for the weights in Bayesian model averaging. Our suggested upgrade reflects the limited information content of data for the purpose of model selection. It allows us to assess the significance of the determined posterior model weights, the confidence in model selection, and the accuracy of the quantified predictive uncertainty. Our approach rests on a brute-force Monte Carlo framework. We determine the robustness of model weights against measurement noise by repeatedly perturbing the observed data with random realizations of measurement error. Then, we analyze the induced variability in posterior model weights and introduce this "weighting variance" as an additional term into the overall prediction uncertainty analysis scheme. We further determine the theoretical upper limit in performance of the model set which is imposed by measurement noise. As an extension to the merely relative model ranking, this analysis provides a measure of absolute model performance. To finally decide whether better data or longer time series are needed to ensure a robust basis for model selection, we resample the measurement time series and assess the convergence of model weights for increasing time series length. We illustrate
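The brute-force Monte Carlo scheme can be sketched in a few lines: perturb the calibration data with fresh error realizations and record the spread of the resulting model weights (the toy models and all numbers below are our own, not the study's):

```python
import numpy as np

def weight_variability(log_evidence_fn, data, noise_sd, n_reps=200, rng=None):
    """Repeatedly perturb calibration data with measurement-error
    realizations and collect the induced spread in posterior weights."""
    rng = np.random.default_rng() if rng is None else rng
    weights = []
    for _ in range(n_reps):
        noisy = data + rng.normal(0.0, noise_sd, size=data.shape)
        log_ev = log_evidence_fn(noisy)
        w = np.exp(log_ev - log_ev.max())
        weights.append(w / w.sum())
    return np.array(weights)                     # (n_reps, n_models)

# two toy constant "models" (predicting 0 and 1) scored against data near 0.4
data = np.full(10, 0.4)
log_ev = lambda d: np.array([-0.5 * np.sum((d - m) ** 2) for m in (0.0, 1.0)])
W = weight_variability(log_ev, data, noise_sd=0.2, rng=np.random.default_rng(2))
```

The column-wise variance of `W` is the "weighting variance" described above; a large value signals that the noise level, not the models, is driving the ranking.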
Dolejsi, Erich; Bodenstorfer, Bernhard; Frommlet, Florian
2014-01-01
The prevailing method of analyzing GWAS data is still to test each marker individually, although from a statistical point of view it is quite obvious that in case of complex traits such single marker tests are not ideal. Recently several model selection approaches for GWAS have been suggested, most of them based on LASSO-type procedures. Here we will discuss an alternative model selection approach which is based on a modification of the Bayesian Information Criterion (mBIC2) which was previously shown to have certain asymptotic optimality properties in terms of minimizing the misclassification error. Heuristic search strategies are introduced which attempt to find the model which minimizes mBIC2, and which are efficient enough to allow the analysis of GWAS data. Our approach is implemented in a software package called MOSGWA. Its performance in case control GWAS is compared with the two algorithms HLASSO and d-GWASelect, as well as with single marker tests, where we performed a simulation study based on real SNP data from the POPRES sample. Our results show that MOSGWA performs slightly better than HLASSO, where specifically for more complex models MOSGWA is more powerful with only a slight increase in Type I error. On the other hand according to our simulations GWASelect does not at all control the type I error when used to automatically determine the number of important SNPs. We also reanalyze the GWAS data from the Wellcome Trust Case-Control Consortium and compare the findings of the different procedures, where MOSGWA detects for complex diseases a number of interesting SNPs which are not found by other methods. PMID:25061809
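One published form of the criterion (constants vary slightly across the mBIC literature, so treat this strictly as a sketch) augments classic BIC with a multiple-testing penalty that grows with the number p of candidate markers and is relaxed by a k! term for larger models:

```python
import math

def mbic2(rss, n, k, p, const=4.0):
    """Modified BIC for a model using k of p markers fitted to n subjects.
    One published form; the exact constants differ between papers."""
    penalty = (k * math.log(n)                     # classic BIC penalty
               + 2 * k * math.log(p / const)       # multiple-testing term
               - 2 * math.log(math.factorial(k)))  # relaxation for larger k
    return n * math.log(rss / n) + penalty

# made-up fits: a one-SNP model halving the RSS beats the null despite
# the penalty, while a larger marker pool raises the bar
m1 = mbic2(rss=50.0, n=100, k=1, p=1000)
m0 = mbic2(rss=100.0, n=100, k=0, p=1000)
```

A heuristic search of the kind MOSGWA implements would evaluate this criterion over candidate marker subsets and keep the minimizer.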
Neural network uncertainty assessment using Bayesian statistics: a remote sensing application
NASA Technical Reports Server (NTRS)
Aires, F.; Prigent, C.; Rossow, W. B.
2004-01-01
Neural network (NN) techniques have proved successful for many regression problems, in particular for remote sensing; however, uncertainty estimates are rarely provided. In this article, a Bayesian technique to evaluate uncertainties of the NN parameters (i.e., synaptic weights) is first presented. In contrast to more traditional approaches based on point estimation of the NN weights, we assess uncertainties on such estimates to monitor the robustness of the NN model. These theoretical developments are illustrated by applying them to the problem of retrieving surface skin temperature, microwave surface emissivities, and integrated water vapor content from a combined analysis of satellite microwave and infrared observations over land. The weight uncertainty estimates are then used to compute analytically the uncertainties in the network outputs (i.e., error bars and correlation structure of these errors). Such quantities are very important for evaluating any application of an NN model. The uncertainties on the NN Jacobians are then considered in the third part of this article. Used for regression fitting, NN models can be used effectively to represent highly nonlinear, multivariate functions. In this situation, most emphasis is put on estimating the output errors, but almost no attention has been given to errors associated with the internal structure of the regression model. The complex structure of dependency inside the NN is the essence of the model, and assessing its quality, coherency, and physical character makes all the difference between a blackbox model with small output errors and a reliable, robust, and physically coherent model. Such dependency structures are described to the first order by the NN Jacobians: they indicate the sensitivity of one output with respect to the inputs of the model for given input data. We use a Monte Carlo integration procedure to estimate the robustness of the NN Jacobians. A regularization strategy based on principal component
Optimal speech motor control and token-to-token variability: a Bayesian modeling approach.
Patri, Jean-François; Diard, Julien; Perrier, Pascal
2015-12-01
The remarkable capacity of the speech motor system to adapt to various speech conditions is due to an excess of degrees of freedom, which enables producing similar acoustical properties with different sets of control strategies. To explain how the central nervous system selects one of the possible strategies, a common approach, in line with optimal motor control theories, is to model speech motor planning as the solution of an optimality problem based on cost functions. Despite the success of this approach, one of its drawbacks is the intrinsic contradiction between the concept of optimality and the observed experimental intra-speaker token-to-token variability. The present paper proposes an alternative approach by formulating feedforward optimal control in a probabilistic Bayesian modeling framework. This is illustrated by controlling a biomechanical model of the vocal tract for speech production and by comparing it with an existing optimal control model (GEPPETO). The essential elements of this optimal control model are presented first. From them the Bayesian model is constructed in a progressive way. Performance of the Bayesian model is evaluated based on computer simulations and compared to the optimal control model. This approach is shown to be appropriate for solving the speech planning problem while accounting for variability in a principled way. PMID:26497359
Applied behavior analysis and statistical process control?
Hopkins, B L
1995-01-01
This paper examines Pfadt and Wheeler's (1995) suggestions that the methods of statistical process control (SPC) be incorporated into applied behavior analysis. The research strategies of SPC are examined and compared to those of applied behavior analysis. I argue that the statistical methods that are a part of SPC would likely reduce applied behavior analysts' intimate contacts with the problems with which they deal and would, therefore, likely yield poor treatment and research decisions. Examples of these kinds of results and decisions are drawn from the cases and data Pfadt and Wheeler present. This paper also describes and clarifies many common misconceptions about SPC, including W. Edwards Deming's involvement in its development, its relationship to total quality management, and its confusion with various other methods designed to detect sources of unwanted variability. PMID:7592156
Beginning a statistical process control program
Davis, H.D.; Burnett, M.
1989-01-01
Statistical Process Control (SPC) has in recent years become a "hot" topic in the manufacturing world. It has been touted as the means by which Japanese manufacturers have moved to the forefront of world-class quality, and subsequent financial power. Is SPC a business-saving strategy? What is SPC? What is the cost of quality, and can we afford it? Is SPC applicable to the petroleum refining and petrochemical manufacturing industry, or are these manufacturing operations so deterministic by nature that the statistics only show the accuracy and precision of the laboratory work? If SPC is worthwhile, how do we get started, and what problems can we expect to encounter? If we begin an SPC program, how will it benefit us? These questions are addressed by the author. The view presented here is a management perspective, with emphasis on rationale and implementation methods.
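A minimal sketch may make the SPC idea above concrete: the individuals control chart derives 3-sigma limits from the average moving range. The readings and tolerance behaviour below are invented for illustration; 1.128 is the standard d2 constant for moving ranges of size two.

```python
def control_limits(samples):
    """Centre line and 3-sigma control limits from the average
    moving range (the usual individuals-chart rule)."""
    n = len(samples)
    mean = sum(samples) / n
    # average moving range between consecutive observations
    mr = sum(abs(samples[i] - samples[i - 1]) for i in range(1, n)) / (n - 1)
    sigma = mr / 1.128  # d2 constant for subgroups of size 2
    return mean - 3 * sigma, mean, mean + 3 * sigma

readings = [10.1, 9.8, 10.3, 10.0, 9.9, 10.2, 10.1, 9.7]  # illustrative data
lcl, cl, ucl = control_limits(readings)
out_of_control = [x for x in readings if not lcl <= x <= ucl]
```

A point outside the limits, or a systematic run on one side of the centre line, would signal a special cause worth investigating.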
Giambartolomei, Claudia; Vukcevic, Damjan; Schadt, Eric E; Franke, Lude; Hingorani, Aroon D; Wallace, Chris; Plagnol, Vincent
2014-05-01
Genetic association studies, in particular the genome-wide association study (GWAS) design, have provided a wealth of novel insights into the aetiology of a wide range of human diseases and traits, in particular cardiovascular diseases and lipid biomarkers. The next challenge consists of understanding the molecular basis of these associations. The integration of multiple association datasets, including gene expression datasets, can contribute to this goal. We have developed a novel statistical methodology to assess whether two association signals are consistent with a shared causal variant. An application is the integration of disease scans with expression quantitative trait locus (eQTL) studies, but any pair of GWAS datasets can be integrated in this framework. We demonstrate the value of the approach by re-analysing a gene expression dataset in 966 liver samples with a published meta-analysis of lipid traits including >100,000 individuals of European ancestry. Combining all lipid biomarkers, our re-analysis supported 26 out of 38 reported colocalisation results with eQTLs and identified 14 new colocalisation results, hence highlighting the value of a formal statistical test. In three cases of reported eQTL-lipid pairs (SYPL2, IFT172, TBKBP1) for which our analysis suggests that the eQTL pattern is not consistent with the lipid association, we identify alternative colocalisation results with SORT1, GCKR, and KPNB1, indicating that these genes are more likely to be causal in these genomic intervals. A key feature of the method is the ability to derive the output statistics from single SNP summary statistics, hence making it possible to perform systematic meta-analysis type comparisons across multiple GWAS datasets (implemented online at http://coloc.cs.ucl.ac.uk/coloc/). Our methodology provides information about candidate causal genes in associated intervals and has direct implications for the understanding of complex diseases as well as the design of drugs to
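A key building block for methods that work from single-SNP summary statistics is Wakefield's approximate Bayes factor, which converts an effect estimate and its standard error into evidence for association. The sketch below illustrates that ingredient only, not the colocalisation test itself, and the prior effect variance `prior_var` is an assumed illustrative value.

```python
import math

def approx_bayes_factor(beta_hat, se, prior_var=0.04):
    """Wakefield-style approximate Bayes factor in favour of association,
    computed from an effect estimate and its standard error alone."""
    v = se * se                      # variance of the estimate
    z = beta_hat / se                # usual z-score
    r = prior_var / (v + prior_var)  # shrinkage factor
    return math.sqrt(1 - r) * math.exp(z * z * r / 2)

weak = approx_bayes_factor(0.02, 0.1)    # modest signal, invented numbers
strong = approx_bayes_factor(0.5, 0.05)  # strong signal, invented numbers
```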
Controlling statistical moments of stochastic dynamical networks
NASA Astrophysics Data System (ADS)
Bielievtsov, Dmytro; Ladenbauer, Josef; Obermayer, Klaus
2016-07-01
We consider a general class of stochastic networks and ask which network nodes need to be controlled, and how, to stabilize and switch between desired metastable (target) states in terms of the first and second statistical moments of the system. We first show that it is sufficient to directly interfere with a subset of nodes which can be identified using information about the graph of the network only. Then we develop a suitable method for feedback control which acts on that subset of nodes and preserves the covariance structure of the desired target state. Finally, we demonstrate our theoretical results using a stochastic Hopfield network and a global brain model. Our results are applicable to a variety of (model) networks and further our understanding of the relationship between network structure and collective dynamics for the benefit of effective control.
Two levels of Bayesian model averaging for optimal control of stochastic systems
NASA Astrophysics Data System (ADS)
Darwen, Paul J.
2013-02-01
Bayesian model averaging provides the best possible estimate of a model, given the data. This article uses that approach twice: once to get a distribution of plausible models of the world, and again to find a distribution of plausible control functions. The resulting ensemble gives control instructions different from simply taking the single best-fitting model and using it to find a single lowest-error control function for that single model. The only drawback is, of course, the need for more computer time: this article demonstrates that the required computer time is feasible. The test problem here is from flood control and risk management.
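The two-level averaging described above rests on the basic Bayesian model averaging recipe: weight each candidate model by its (approximate) marginal likelihood and average the predictions under those weights. A minimal sketch, with invented models, predictions, and log-evidences:

```python
import math

def bma_weights(log_evidences):
    """Posterior model weights from log marginal likelihoods,
    assuming equal prior model probabilities."""
    m = max(log_evidences)
    unnorm = [math.exp(le - m) for le in log_evidences]  # stabilised exponentials
    z = sum(unnorm)
    return [u / z for u in unnorm]

# three plausible models: point predictions and log-evidences (illustrative)
predictions  = [4.0, 5.0, 7.0]
log_evidence = [-10.2, -9.8, -12.5]

w = bma_weights(log_evidence)
bma_prediction = sum(wi * p for wi, p in zip(w, predictions))
```

The ensemble prediction sits between the single-model answers, pulled toward the best-supported model rather than committing to it outright.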
Statistical Process Control for KSC Processing
NASA Technical Reports Server (NTRS)
Ford, Roger G.; Delgado, Hector; Tilley, Randy
1996-01-01
The 1996 Summer Faculty Fellowship Program and Kennedy Space Center (KSC) served as the basis for a research effort into statistical process control for KSC processing. The effort entailed several tasks and goals. The first was to develop a customized statistical process control (SPC) course for the Safety and Mission Assurance Trends Analysis Group. The actual teaching of this course took place over several weeks. In addition, an Internet version of the same course, complete with animation and video excerpts from the course as taught at KSC, was developed. The application of SPC to shuttle processing took up the rest of the summer research project. This effort entailed the evaluation of SPC use at KSC, both present and potential, in light of the change in roles for NASA and the Single Flight Operations Contractor (SFOC). Individual consulting on SPC use was accomplished, as well as an evaluation of SPC software for future KSC use. Finally, the author's orientation to NASA changes, terminology, data formats, and new NASA task definitions will allow future consultation when the need arises.
NASA Astrophysics Data System (ADS)
Mugnes, J.-M.; Robert, C.
2015-11-01
Spectral analysis is a powerful tool to investigate stellar properties and it has been widely used for decades now. However, the methods considered to perform this kind of analysis are mostly based on iteration among a few diagnostic lines to determine the stellar parameters. While these methods are often simple and fast, they can lead to errors and large uncertainties due to the required assumptions. Here, we present a method based on Bayesian statistics to find simultaneously the best combination of effective temperature, surface gravity, projected rotational velocity, and microturbulence velocity, using all the available spectral lines. Different tests are discussed to demonstrate the strength of our method, which we apply to 54 mid-resolution spectra of field and cluster B stars obtained at the Observatoire du Mont-Mégantic. We compare our results with those found in the literature. Differences are seen which are well explained by the different methods used. We conclude that the B-star microturbulence velocities are often underestimated. We also confirm the trend that B stars in clusters are on average faster rotators than field B stars.
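The simultaneous fit over all available lines can be pictured as a joint posterior over the stellar parameters evaluated on a grid. The sketch below uses synthetic linear "lines", a flat prior, and made-up sensitivities, so it illustrates the idea rather than the authors' pipeline:

```python
import itertools

# synthetic "observed" line strengths generated from known parameters
true_teff, true_logg = 1.5, 4.0
coeffs = [(1.0, 0.2), (0.5, 0.8), (2.0, -0.3)]   # invented line sensitivities
lines = [(a * true_teff + b * true_logg, a, b, 0.1) for a, b in coeffs]

def log_post(teff, logg):
    """Flat-prior log-posterior: Gaussian likelihood combined over every line."""
    return sum(-0.5 * ((obs - (a * teff + b * logg)) / err) ** 2
               for obs, a, b, err in lines)

grid_t = [1.0 + 0.1 * i for i in range(11)]   # candidate temperatures
grid_g = [3.5 + 0.1 * i for i in range(11)]   # candidate gravities
best = max(itertools.product(grid_t, grid_g), key=lambda p: log_post(*p))
```

Because every line contributes to the same posterior, the best combination of parameters is found jointly rather than by iterating among a few diagnostic lines.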
NASA Astrophysics Data System (ADS)
Hashmi, M. Z.; Shamseldin, A. Y.; Melville, B. W.
2009-10-01
Global Circulation Models (GCMs) are a major tool used for future projections of climate change under different emission scenarios. However, for assessing the hydrological impacts of climate change at the watershed and regional scales, GCM outputs cannot be used directly because of the mismatch in spatial resolution between GCMs and hydrological models. In order to use the output of a GCM for conducting hydrological impact studies, downscaling is used. However, the downscaling results may contain considerable uncertainty, which needs to be quantified before the results are made available. Among the variables usually downscaled, precipitation downscaling is quite challenging and more prone to uncertainty issues than other climatological variables. This paper addresses the uncertainty analysis associated with statistical downscaling of watershed precipitation (Clutha River above Balclutha, New Zealand) using results from three well-established downscaling methods and a Bayesian weighted multi-model ensemble approach. The downscaling methods used in this study belong to the following categories: (1) multiple linear regression; (2) multiple non-linear regression; and (3) stochastic weather generator. The results obtained in this study show that this ensemble strategy is very efficient in combining the results from multiple downscaling methods on the basis of their performance and in quantifying the uncertainty contained in the ensemble output. This should encourage future attempts to quantify downscaling uncertainties within the multi-model ensemble framework.
A statistical process control case study.
Ross, Thomas K
2006-01-01
Statistical process control (SPC) charts can be applied to a wide number of health care applications, yet widespread use has not occurred. The greatest obstacle preventing wider use is the lack of quality management training that health care workers receive. The technical nature of the SPC guarantees that without explicit instruction this technique will not come into widespread use. Reviews of health care quality management texts inform the reader that SPC charts should be used to improve delivery processes and outcomes often without discussing how they are created. Conversely, medical research frequently reports the improved outcomes achieved after analyzing SPC charts. This article is targeted between these 2 positions: it reviews the SPC technique and presents a tool and data so readers can construct SPC charts. After tackling the case, it is hoped that the readers will collect their own data and apply the same technique to improve processes in their own organization. PMID:17047496
Planetary micro-rover operations on Mars using a Bayesian framework for inference and control
NASA Astrophysics Data System (ADS)
Post, Mark A.; Li, Junquan; Quine, Brendan M.
2016-03-01
With the recent progress toward the application of commercially-available hardware to small-scale space missions, it is now becoming feasible for groups of small, efficient robots based on low-power embedded hardware to perform simple tasks on other planets in the place of large-scale, heavy and expensive robots. In this paper, we describe design and programming of the Beaver micro-rover developed for Northern Light, a Canadian initiative to send a small lander and rover to Mars to study the Martian surface and subsurface. For a small, hardware-limited rover to handle an uncertain and mostly unknown environment without constant management by human operators, we use a Bayesian network of discrete random variables as an abstraction of expert knowledge about the rover and its environment, and inference operations for control. A framework for efficient construction and inference into a Bayesian network using only the C language and fixed-point mathematics on embedded hardware has been developed for the Beaver to make intelligent decisions with minimal sensor data. We study the performance of the Beaver as it probabilistically maps a simple outdoor environment with sensor models that include uncertainty. Results indicate that the Beaver and other small and simple robotic platforms can make use of a Bayesian network to make intelligent decisions in uncertain planetary environments.
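At its simplest, the discrete-variable inference the rover performs is Bayes' rule by enumeration over a small network. The toy network below (rough terrain causing wheel slip) shows the operation; the probabilities are invented for illustration and are not the Beaver's.

```python
# one-parent network: P(rough) and P(slip | rough), all values invented
p_rough = 0.3
p_slip_given = {True: 0.8, False: 0.1}   # key: is the terrain rough?

def posterior_rough_given_slip(slip_observed):
    """P(rough | slip observation) by direct enumeration (Bayes' rule)."""
    like_rough = p_slip_given[True] if slip_observed else 1 - p_slip_given[True]
    like_smooth = p_slip_given[False] if slip_observed else 1 - p_slip_given[False]
    joint_rough = p_rough * like_rough
    joint_smooth = (1 - p_rough) * like_smooth
    return joint_rough / (joint_rough + joint_smooth)

belief = posterior_rough_given_slip(True)   # slip observed: rough more likely
```

Enumeration like this needs only multiplications and divisions, which is why such networks can be evaluated in fixed-point arithmetic on minimal embedded hardware.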
NASA Astrophysics Data System (ADS)
Wallace, D. J.; Rosenheim, B. E.; Roberts, M. L.; Burton, J. R.; Donnelly, J. P.; Woodruff, J. D.
2014-12-01
Is a small quantity of high-precision ages more robust than a larger quantity of lower-precision ages for sediment core chronologies? AMS radiocarbon ages have been available to researchers for several decades now, and the precision of the technique has continued to improve. Analysis and time costs are high, though, and projects are often limited in the number of dates that can be used to develop a chronology. The Gas Ion Source at the National Ocean Sciences Accelerator Mass Spectrometry Facility (NOSAMS), while providing lower precision (uncertainty of order 100 14C yr for a sample), is significantly less expensive and far less time consuming than conventional age dating, and offers the unique opportunity to obtain large numbers of ages. Here we couple two approaches, one analytical and one statistical, to investigate the utility of an age model comprised of these lower-precision ages for paleotempestology. We use a gas ion source interfaced to a gas-bench type device to generate radiocarbon dates approximately every 5 minutes, while determining the order of sample analysis using the published Bayesian accumulation histories for deposits (Bacon). During two day-long sessions, several dates were obtained from carbonate shells in living position in a sediment core comprised of sapropel gel from Mangrove Lake, Bermuda. Samples were prepared where large shells were available, and the order of analysis was determined by the depth with the highest uncertainty according to Bacon. We present the results of these analyses as well as a prognosis for a future where such age models can be constructed from many dates that are quickly obtained relative to conventional radiocarbon dates. The technique is currently limited to carbonates, but development of a system for dating organic material is underway. We will demonstrate the extent to which sacrificing some analytical precision in favor of more dates improves age models.
NASA Astrophysics Data System (ADS)
Wahl, E. R.
2008-12-01
A strict process model for pollen as a climate proxy is currently not approachable beyond localized spatial scales; more generally, the canonical model for vegetation-pollen registration itself requires assimilation of empirically-derived information. In this paper, a taxonomically "reduced-space" climate-pollen forward model is developed, based on the performance of a parallel inverse model. The goal is inclusion of the forward model in a Bayesian climate reconstruction framework, following a 4-step process. (1) Ratios of pollen types calibrated to temperature are examined to determine if they can equal or surpass the skill of multi-taxonomic calibrations using the modern analog technique (MAT) optimized with receiver operating characteristic (ROC) analysis. The first phase of this examination, using modern pollen data from SW N America, demonstrates that the ratio method can give calibrations as skillful as the MAT when vegetation representation (and associated climate gradients) are characterized by two dominant pollen taxa, in this case pine and oak. Paleotemperature reconstructions using the ratio method also compare well to MAT reconstructions, showing very minor differences. [Ratio values are defined as pine/(pine + oak), so they vary between 0 and 1.] (2) Uncertainty analysis is carried out in independent steps, which are combined to give overall probabilistic confidence ranges. Monte Carlo (MC) analysis utilizing Poisson distributions to model the inherent variability of pollen representation in relation to climate (assuming defined temperature normals at the modern calibration sites) allows independent statistical estimation of this component of uncertainty, for both the modern calibration and fossil pollen data sets. In turn, MC analysis utilizing normal distributions allows independent estimation of the addition to overall uncertainty from climate variation itself. (3) Because the quality tests in (1) indicate the ratio method has the capacity to carry
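The ratio statistic and its Poisson Monte Carlo uncertainty treatment can be sketched directly. The counts below are invented, the interval level is an assumed 95%, and the Poisson draws use Knuth's simple multiplicative sampler rather than any library routine:

```python
import math
import random

def poisson(rng, lam):
    """Knuth's Poisson sampler (adequate for modest rates)."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while p > limit:
        p *= rng.random()
        k += 1
    return k - 1

def mc_ratio_interval(pine, oak, n=5000, seed=1):
    """Approximate 95% interval for pine/(pine + oak) under Poisson noise."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        p, o = poisson(rng, pine), poisson(rng, oak)
        draws.append(p / (p + o))
    draws.sort()
    return draws[int(0.025 * n)], draws[int(0.975 * n)]

pine_count, oak_count = 120, 80          # invented pollen counts
point = pine_count / (pine_count + oak_count)
lo, hi = mc_ratio_interval(pine_count, oak_count)
```

The Monte Carlo interval captures the counting-noise component of uncertainty; the climate-variation component would be layered on separately, as step (2) above describes.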
Statistical process control for IMRT dosimetric verification
Breen, Stephen L.; Moseley, Douglas J.; Zhang, Beibei; Sharpe, Michael B.
2008-10-15
Patient-specific measurements are typically used to validate the dosimetry of intensity-modulated radiotherapy (IMRT). To evaluate the dosimetric performance over time of our IMRT process, we have used statistical process control (SPC) concepts to analyze the measurements from 330 head and neck (H and N) treatment plans. The objectives of the present work are to: (i) Review the dosimetric measurements of a large series of consecutive head and neck treatment plans to better understand appropriate dosimetric tolerances; (ii) analyze the results with SPC to develop action levels for measured discrepancies; (iii) develop estimates for the number of measurements that are required to describe IMRT dosimetry in the clinical setting; and (iv) evaluate with SPC a new beam model in our planning system. H and N IMRT cases were planned with the PINNACLE3 treatment planning system versions 6.2b or 7.6c (Philips Medical Systems, Madison, WI) and treated on Varian (Palo Alto, CA) or Elekta (Crawley, UK) linacs. As part of regular quality assurance, plans were recalculated on a 20-cm-diam cylindrical phantom, and ion chamber measurements were made in high-dose volumes (the PTV with highest dose) and in low-dose volumes (spinal cord organ-at-risk, OR). Differences between the planned and measured doses were recorded as a percentage of the planned dose. Differences were stable over time. Measurements with PINNACLE3 6.2b and Varian linacs showed a mean difference of 0.6% for PTVs (n=149, range, -4.3% to 6.6%), while OR measurements showed a larger systematic discrepancy (mean 4.5%, range -4.5% to 16.3%) that was due to well-known limitations of the MLC model in the earlier version of the planning system. Measurements with PINNACLE3 7.6c and Varian linacs demonstrated a mean difference of 0.2% for PTVs (n=160, range, -3.0%, to 5.0%) and -1.0% for ORs (range -5.8% to 4.4%). The capability index (ratio of specification range to range of the data) was 1.3 for the PTV
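The capability index quoted in the abstract above (ratio of specification range to range of the data) is a one-line computation. The dose differences and the ±5% tolerance below are invented for illustration, not the article's measurements:

```python
def capability_index(spec_low, spec_high, data):
    """Specification width divided by the observed data range,
    the definition used in the abstract above."""
    return (spec_high - spec_low) / (max(data) - min(data))

dose_diffs = [0.6, -1.2, 0.9, 0.1, -0.4, 1.5, -0.8, 0.3]  # percent, illustrative
cp = capability_index(-5.0, 5.0, dose_diffs)              # assumed +/-5% tolerance
```

A value above 1 means the observed spread fits inside the tolerance; well above 1 suggests a comfortably capable process.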
NASA Astrophysics Data System (ADS)
Eadie, Gwendolyn Marie
This research uses a Bayesian approach to study the biases that may occur when kinematic data is used to estimate the mass of a galaxy. Data is simulated from the Hernquist (1990) distribution functions (DFs) for velocity dispersions of the isotropic, constant anisotropic, and anisotropic Osipkov (1979) and Merritt (1985) type, and then analysed using the isotropic Hernquist model. Biases are explored when i) the model and data come from the same DF, ii) the model and data come from the same DF but tangential velocities are unknown, iii) the model and data come from different DFs, and iv) the model and data come from different DFs and the tangential velocities are unknown. Mock observations are also created from the Gauthier (2006) simulations and analysed with the isotropic Hernquist model. No bias was found in situation (i), a slight positive bias was found in (ii), a negative bias was found in (iii), and a large positive bias was found in (iv). The mass estimate of the Gauthier system when tangential velocities were unknown was nearly correct, but the mass profile was not described well by the isotropic Hernquist model. When the Gauthier data was analysed with the tangential velocities, the mass of the system was overestimated. The code created for the research runs three parallel Markov Chains for each data set, uses the Gelman-Rubin statistic to assess convergence, and combines the converged chains into a single sample of the posterior distribution for each data set. The code also includes two ways to deal with nuisance parameters. One is to marginalize over the nuisance parameter at every step in the chain, and the other is to sample the nuisance parameters using a hybrid-Gibbs sampler. When tangential velocities, v(t), are unobserved in the analyses above, they are sampled as nuisance parameters in the Markov Chain. The v(t) estimates from the Markov chains did a poor job of estimating the true tangential velocities. However, the posterior samples of v
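The Gelman-Rubin statistic used above to assess convergence compares between-chain and within-chain variance across parallel chains. A self-contained sketch, with short toy chains rather than posterior samples:

```python
def gelman_rubin(chains):
    """Potential scale reduction factor (R-hat) for equal-length chains."""
    m, n = len(chains), len(chains[0])
    means = [sum(c) / n for c in chains]
    grand = sum(means) / m
    b = n / (m - 1) * sum((mu - grand) ** 2 for mu in means)    # between-chain
    w = sum(sum((x - mu) ** 2 for x in c) / (n - 1)
            for c, mu in zip(chains, means)) / m                # within-chain
    var_hat = (n - 1) / n * w + b / n   # pooled posterior-variance estimate
    return (var_hat / w) ** 0.5

mixed = gelman_rubin([[1.0, 2.0, 3.0, 4.0],
                      [1.1, 2.1, 2.9, 4.2],
                      [0.9, 1.8, 3.1, 3.9]])   # chains exploring the same region
stuck = gelman_rubin([[0.0, 0.1, -0.1, 0.0],
                      [5.0, 5.1, 4.9, 5.0]])   # chains trapped in different regions
```

Values near 1 indicate the chains agree; a common rule of thumb declares convergence below about 1.1, after which the chains can be pooled into a single posterior sample as described above.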
NASA Astrophysics Data System (ADS)
Beramendi-Orosco, Laura E.; Gonzalez-Hernandez, Galia; Urrutia-Fucugauchi, Jaime; Manzanilla, Linda R.; Soler-Arechalde, Ana M.; Goguitchaishvili, Avto; Jarboe, Nick
2009-03-01
A high-resolution 14C chronology for the Teopancazco archaeological site in the Teotihuacan urban center of Mesoamerica was generated by Bayesian analysis of 33 radiocarbon dates and detailed archaeological information related to occupation stratigraphy, pottery and archaeomagnetic dates. The calibrated intervals obtained using the Bayesian model are up to ca. 70% shorter than those obtained with individual calibrations. For some samples, this is a consequence of plateaus in the part of the calibration curve covered by the sample dates (2500 to 1450 14C yr BP). Effects of outliers are explored by comparing the results from a Bayesian model that incorporates radiocarbon data for two outlier samples with the same model excluding them. The effect of outliers was more significant than expected. Inclusion of radiocarbon dates from two altered contexts, 500 14C yr earlier than those for the first occupational phase, results in ages calculated by the model that are earlier than the archaeological records indicate. The Bayesian chronology excluding these outliers separates the first two Teopancazco occupational phases and suggests that the ending of the Xolalpan phase was around cal AD 550, 100 yr earlier than previously estimated and in accordance with previously reported archaeomagnetic dates from lime plasters for the same site.
NASA Astrophysics Data System (ADS)
Gehrmann, Romina A. S.; Schwalenberg, Katrin; Riedel, Michael; Spence, George D.; Spieß, Volkhard; Dosso, Stan E.
2016-01-01
This paper applies nonlinear Bayesian inversion to marine controlled source electromagnetic (CSEM) data collected near two sites of the Integrated Ocean Drilling Program (IODP) Expedition 311 on the northern Cascadia Margin to investigate subseafloor resistivity structure related to gas hydrate deposits and cold vents. The Cascadia margin, off the west coast of Vancouver Island, Canada, has a large accretionary prism where sediments are under pressure due to convergent plate boundary tectonics. Gas hydrate deposits and cold vent structures have previously been investigated by various geophysical methods and seabed drilling. Here, we invert time-domain CSEM data collected at Sites U1328 and U1329 of IODP Expedition 311 using Bayesian methods to derive subsurface resistivity model parameters and uncertainties. The Bayesian information criterion is applied to determine the amount of structure (number of layers in a depth-dependent model) that can be resolved by the data. The parameter space is sampled with the Metropolis-Hastings algorithm in principal-component space, utilizing parallel tempering to ensure wider and efficient sampling and convergence. Nonlinear inversion allows analysis of uncertain acquisition parameters such as time delays between receiver and transmitter clocks as well as input electrical current amplitude. Marginalizing over these instrument parameters in the inversion accounts for their contribution to the geophysical model uncertainties. One-dimensional inversion of time-domain CSEM data collected at measurement sites along a survey line allows interpretation of the subsurface resistivity structure. The data sets can be generally explained by models with 1 to 3 layers. Inversion results at U1329, at the landward edge of the gas hydrate stability zone, indicate a sediment unconformity as well as potential cold vents which were previously unknown. The resistivities generally increase upslope due to sediment erosion along the slope. Inversion
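The core of the sampling scheme described above is a plain Metropolis-Hastings step; the parallel-tempering and principal-component refinements are omitted in the sketch below, and the target is a toy standard normal rather than a CSEM posterior:

```python
import math
import random

def metropolis_hastings(log_post, x0, steps, scale, seed=0):
    """Random-walk Metropolis-Hastings: propose a Gaussian step,
    accept with probability min(1, posterior ratio)."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(steps):
        prop = x + rng.gauss(0.0, scale)           # symmetric proposal
        lp_prop = log_post(prop)
        if math.log(rng.random()) < lp_prop - lp:  # accept/reject step
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# toy target: standard normal, log-density up to an additive constant
draws = metropolis_hastings(lambda x: -0.5 * x * x, x0=0.0, steps=20000, scale=1.0)
mean = sum(draws) / len(draws)
var = sum((d - mean) ** 2 for d in draws) / len(draws)
```

Nuisance acquisition parameters such as clock delays enter by simply adding dimensions to the state vector, so marginalizing over them falls out of the same sampling loop.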
NASA Technical Reports Server (NTRS)
Denning, Peter J.
1989-01-01
In 1983 and 1984, the Infrared Astronomical Satellite (IRAS) detected 5,425 stellar objects and measured their infrared spectra. In 1987 a program called AUTOCLASS used Bayesian inference methods to discover the classes present in these data and determine the most probable class of each object, revealing unknown phenomena in astronomy. AUTOCLASS has rekindled the old debate on the suitability of Bayesian methods, which are computationally intensive, interpret probabilities as plausibility measures rather than frequencies, and appear to depend on a subjective assessment of the probability of a hypothesis before the data were collected. Modern statistical methods have, however, recently been shown to also depend on subjective elements. These debates bring into question the whole tradition of scientific objectivity and offer scientists a new way to take responsibility for their findings and conclusions.
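A toy version of the Bayesian classification idea behind AUTOCLASS: posterior class probabilities for one object under Gaussian class models. The two classes and all parameters below are invented for illustration, not AUTOCLASS's actual model:

```python
import math

def gauss_pdf(x, mu, sigma):
    """Density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

def class_posteriors(x, classes):
    """P(class | x) proportional to P(class) * P(x | class)."""
    joint = [prior * gauss_pdf(x, mu, sigma) for prior, mu, sigma in classes]
    z = sum(joint)
    return [j / z for j in joint]

# (prior, mean, sd) for two hypothetical spectral classes
classes = [(0.7, 1.0, 0.5), (0.3, 3.0, 0.5)]
post = class_posteriors(2.4, classes)   # posterior for one observed object
```

The prior here encodes belief before the measurement, which is exactly the feature of Bayesian methods the debate described above turns on.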
Applying Statistical Process Control to Clinical Data: An Illustration.
ERIC Educational Resources Information Center
Pfadt, Al; And Others
1992-01-01
Principles of statistical process control are applied to a clinical setting through the use of control charts to detect changes, as part of treatment planning and clinical decision-making processes. The logic of control chart analysis is derived from principles of statistical inference. Sample charts offer examples of evaluating baselines and…
Applying Statistical Process Quality Control Methodology to Educational Settings.
ERIC Educational Resources Information Center
Blumberg, Carol Joyce
A subset of Statistical Process Control (SPC) methodology known as Control Charting is introduced. SPC methodology is a collection of graphical and inferential statistics techniques used to study the progress of phenomena over time. The types of control charts covered are the X-bar (mean), R (Range), X (individual observations), MR (moving…
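As a concrete illustration of the X-bar and R charts mentioned above, the sketch below computes control limits from a few hypothetical subgroups. The measurements are made up; the constants A2, D3, and D4 for subgroup size 5 are the standard tabulated SPC values.

```python
import statistics

# Hypothetical subgroup measurements (e.g., repeated samples of a dimension).
subgroups = [
    [5.02, 4.98, 5.01, 5.00, 4.99],
    [5.03, 5.01, 4.97, 5.02, 5.00],
    [4.99, 5.00, 5.02, 4.98, 5.01],
]
A2, D3, D4 = 0.577, 0.0, 2.114            # control chart constants for n = 5

xbars = [statistics.mean(s) for s in subgroups]
ranges = [max(s) - min(s) for s in subgroups]
xbarbar = statistics.mean(xbars)          # grand mean: X-bar chart centerline
rbar = statistics.mean(ranges)            # mean range: R chart centerline

xbar_limits = (xbarbar - A2 * rbar, xbarbar + A2 * rbar)   # LCL, UCL
r_limits = (D3 * rbar, D4 * rbar)
```

A subgroup mean outside `xbar_limits`, or a range outside `r_limits`, signals a special cause worth investigating.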
Wright, David K; MacEachern, Scott; Lee, Jaeyong
2014-01-01
The locations of diy-geδ-bay (DGB) sites in the Mandara Mountains, northern Cameroon are hypothesized to occur as a function of their ability to see and be seen from points on the surrounding landscape. A series of geostatistical, two-way and Bayesian logistic regression analyses were performed to test two hypotheses related to the intervisibility of the sites to one another and their visual prominence on the landscape. We determine that the intervisibility of the sites to one another is highly statistically significant when compared to 10 stratified-random permutations of DGB sites. Bayesian logistic regression additionally demonstrates that the visibility of the sites to points on the surrounding landscape is statistically significant. The location of sites appears to have also been selected on the basis of lower slope than random permutations of sites. Using statistical measures, many of which are not commonly employed in archaeological research, to evaluate aspects of visibility on the landscape, we conclude that the placement of DGB sites improved their conspicuousness for enhanced ritual, social cooperation and/or competition purposes. PMID:25383883
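The stratified-random permutation comparison described above can be sketched as a Monte Carlo permutation test. Everything below is a simplified, hypothetical stand-in: the sites are toy coordinates, the visibility function is a distance threshold rather than a terrain viewshed, and the permutations are simple random draws rather than stratified ones.

```python
import random

def intervisibility_score(sites, visible):
    """Fraction of site pairs that can see one another."""
    pairs = [(a, b) for i, a in enumerate(sites) for b in sites[i + 1:]]
    return sum(visible(a, b) for a, b in pairs) / len(pairs)

def permutation_p_value(observed_sites, candidate_sites, visible,
                        n_perm=999, seed=0):
    """One-sided Monte Carlo p-value: how often do random placements
    match or beat the observed intervisibility?"""
    rng = random.Random(seed)
    obs = intervisibility_score(observed_sites, visible)
    k = len(observed_sites)
    hits = sum(
        intervisibility_score(rng.sample(candidate_sites, k), visible) >= obs
        for _ in range(n_perm)
    )
    return (hits + 1) / (n_perm + 1)

# Hypothetical example: four clustered sites among a sparse candidate grid.
sites = [(0, 0), (0, 1), (1, 0), (1, 1)]
candidates = sites + [(10 * i, 10 * j) for i in range(1, 6) for j in range(1, 6)]
can_see = lambda a, b: ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5 < 2
p = permutation_p_value(sites, candidates, can_see, n_perm=199, seed=1)
```

A small p-value indicates that the observed sites are more mutually visible than random placements of the same number of sites, which is the logic behind the significance claim in the abstract.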
Artificial Intelligence Approach to Support Statistical Quality Control Teaching
ERIC Educational Resources Information Center
Reis, Marcelo Menezes; Paladini, Edson Pacheco; Khator, Suresh; Sommer, Willy Arno
2006-01-01
Statistical quality control--SQC (consisting of Statistical Process Control, Process Capability Studies, Acceptance Sampling and Design of Experiments) is a very important tool to obtain, maintain and improve the Quality level of goods and services produced by an organization. Despite its importance, and the fact that it is taught in technical and…
Control Statistics Process Data Base V4
Energy Science and Technology Software Center (ESTSC)
1998-05-07
The check standard database program, CSP_CB, is a menu-driven program that can acquire measurement data for check standards having a parameter dependence (such as frequency) or no parameter dependence (for example, mass measurements). The program may be run stand-alone or loaded as a subprogram to a Basic program already in memory. The software was designed to require little additional work on the part of the user. To facilitate this design goal, the program is entirely menu-driven. In addition, the user does have control of file names and parameters within a definition file which sets up the basic scheme of file names.
Blanc, Guillermo A.; Kewley, Lisa; Vogt, Frédéric P. A.; Dopita, Michael A.
2015-01-10
We present a new method for inferring the metallicity (Z) and ionization parameter (q) of H II regions and star-forming galaxies using strong nebular emission lines (SELs). We use Bayesian inference to derive the joint and marginalized posterior probability density functions for Z and q given a set of observed line fluxes and an input photoionization model. Our approach allows the use of arbitrary sets of SELs and the inclusion of flux upper limits. The method provides a self-consistent way of determining the physical conditions of ionized nebulae that is not tied to the arbitrary choice of a particular SEL diagnostic and uses all the available information. Unlike theoretically calibrated SEL diagnostics, the method is flexible and not tied to a particular photoionization model. We describe our algorithm, validate it against other methods, and present a tool that implements it called IZI. Using a sample of nearby extragalactic H II regions, we assess the performance of commonly used SEL abundance diagnostics. We also use a sample of 22 local H II regions having both direct and recombination line (RL) oxygen abundance measurements in the literature to study discrepancies in the abundance scale between different methods. We find that oxygen abundances derived through Bayesian inference using currently available photoionization models in the literature can be in good (∼30%) agreement with RL abundances, although some models perform significantly better than others. We also confirm that abundances measured using the direct method are typically ∼0.2 dex lower than both RL and photoionization-model-based abundances.
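The grid-based Bayesian approach described above can be sketched in a few lines. This is a toy illustration, not the IZI implementation: the "photoionization model" below is a made-up linear function of one line ratio, the Gaussian likelihood and flat prior are assumptions, and the grid bounds are arbitrary.

```python
import numpy as np

# Parameter grids for metallicity Z (as 12 + log(O/H)) and log ionization q.
Z = np.linspace(7.5, 9.5, 81)
q = np.linspace(6.5, 8.5, 81)
ZZ, QQ = np.meshgrid(Z, q, indexing="ij")

def model_flux(ZZ, QQ):
    # Stand-in for an interpolated photoionization model grid of a line ratio.
    return 0.8 * (ZZ - 8.5) - 0.3 * (QQ - 7.5)

obs, sigma = 0.12, 0.05                  # observed line ratio and its error
loglike = -0.5 * ((obs - model_flux(ZZ, QQ)) / sigma) ** 2
post = np.exp(loglike - loglike.max())   # flat prior on the grid
post /= post.sum()                       # joint posterior p(Z, q | data)

pZ = post.sum(axis=1)                    # marginalized posterior for Z
Z_hat = Z[np.argmax(pZ)]                 # marginal posterior mode
```

With several observed lines, the log-likelihoods simply add before normalization, and upper limits enter as one-sided (error-function) likelihood terms instead of Gaussians.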
Statistical approach to linewidth control in a logic fab
NASA Astrophysics Data System (ADS)
Pitter, Michael; Doleschel, Bernhard; Eibl, Ludwig; Steinkirchner, Erwin; Grassmann, Andreas
1999-04-01
We designed an adaptive line-width controller specially tailored to the needs of a highly diversified logic fab. Simulations of different controller types fed with historical CD data show advantages of an SPC-based controller over a run-by-run controller. This result confirms the SPC assumption that as long as a process is in statistical control, changing the process parameters will only increase the variability of the output.
NASA Technical Reports Server (NTRS)
Vangelder, B. H. W.
1978-01-01
Non-Bayesian statistics were used in simulation studies centered around laser range observations to LAGEOS. The capabilities of satellite laser ranging, especially in connection with relative station positioning, are evaluated. The satellite measurement system under investigation may fall short in precise determinations of the earth's orientation (precession and nutation) and the earth's rotation, as opposed to systems such as very long baseline interferometry (VLBI) and lunar laser ranging (LLR). Relative station positioning, determination of (differential) polar motion, positioning of stations with respect to the earth's center of mass, and determination of the earth's gravity field should be easily realized by satellite laser ranging (SLR). The last two features should be considered as best (or solely) determinable by SLR in contrast to VLBI and LLR.
NASA Astrophysics Data System (ADS)
Speegle, Darrin; Steward, Robert
2015-08-01
We propose a semiparametric approach to infer the existence of and estimate the location of a statistical change-point in a nonlinear high dimensional time series contaminated with an additive noise component. In particular, we consider a p-dimensional stochastic process of independent multivariate normal observations where the mean function varies smoothly except at a single change-point. Our approach first involves a dimension reduction of the original time series through a random matrix multiplication. Next, we conduct a Bayesian analysis on the empirical detail coefficients of this dimensionally reduced time series after a wavelet transform. We also present a means to associate confidence bounds to the conclusions of our results. Aside from being computationally efficient and straightforward to implement, the primary advantage of our methods is seen in how these methods apply to a much larger class of time series whose mean functions are subject to only general smoothness conditions.
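The dimension-reduction step described above can be sketched as follows. This is a simplified, hypothetical illustration: the shift size, dimensions, and seed are arbitrary, and a classical CUSUM statistic stands in for the paper's wavelet-plus-Bayes detection step.

```python
import numpy as np

rng = np.random.default_rng(0)

# p-dimensional series with a mean shift at t = 60; the paper first reduces
# dimension by a random matrix multiply, then works on the reduced series.
p, T, tau = 50, 120, 60
X = rng.normal(size=(T, p))
X[tau:] += 0.8                              # mean shift in every coordinate

R = rng.normal(size=(p, 3)) / np.sqrt(p)    # random projection to 3 dims
Y = X @ R

def cusum_changepoint(y):
    """Index maximizing the normalized CUSUM statistic of a 1-D series."""
    n = len(y)
    s = np.cumsum(y - y.mean())
    k = np.arange(1, n)
    stat = np.abs(s[:-1]) / np.sqrt(k * (n - k) / n)
    return int(np.argmax(stat)) + 1

estimates = [cusum_changepoint(Y[:, j]) for j in range(Y.shape[1])]
```

Each projected coordinate is a 1-D series carrying (a random mixture of) the original mean shift, so per-coordinate estimates can be combined, e.g. by a median, to locate the change-point at a fraction of the cost of working in the full p dimensions.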
Towards Validation of an Adaptive Flight Control Simulation Using Statistical Emulation
NASA Technical Reports Server (NTRS)
He, Yuning; Lee, Herbert K. H.; Davies, Misty D.
2012-01-01
Traditional validation of flight control systems is based primarily upon empirical testing. Empirical testing is sufficient for simple systems in which a.) the behavior is approximately linear and b.) humans are in-the-loop and responsible for off-nominal flight regimes. A different possible concept of operation is to use adaptive flight control systems with online learning neural networks (OLNNs) in combination with a human pilot for off-nominal flight behavior (such as when a plane has been damaged). Validating these systems is difficult because the controller is changing during the flight in a nonlinear way, and because the pilot and the control system have the potential to co-adapt in adverse ways; traditional empirical methods are unlikely to provide any guarantees in this case. Additionally, the time it takes to find unsafe regions within the flight envelope using empirical testing means that the time between adaptive controller design iterations is large. This paper describes a new concept for validating adaptive control systems using methods based on Bayesian statistics. This validation framework allows the analyst to build nonlinear models with modal behavior, and to have an uncertainty estimate for the difference between the behaviors of the model and system under test.
NASA Astrophysics Data System (ADS)
Culver, R. Lee; Sibul, Leon H.; Bradley, David L.; Ballard, Jeffrey A.; Camin, H. John
2005-09-01
Our goal is to develop a probabilistic sonar performance prediction methodology that can make use of limited knowledge of random or uncertain environment, target, and sonar system parameters, but does not make unwarranted assumptions. The maximum entropy method (MEM) can be used to construct probability density functions (pdfs) for relevant environmental and source parameters, and an ocean acoustic propagation model can use those pdfs to predict the variability of received signal parameters. At this point, the MEM can be used once again to produce signal parameter pdfs. A Bayesian framework allows these pdfs to be incorporated into the signal processor to produce ROC curves in which, for example, the signal-to-noise ratio (SNR) is a random variable for which a pdf has been calculated. One output of such a processor could be a range-dependent probability of detection for fixed probability of false alarm, which would be more useful than the conventional range of the day that is still in use in some areas. [Work supported by ONR Code 321US.]
Optimal control-based bayesian detection of clinical and behavioral state transitions.
Santaniello, Sabato; Sherman, David L; Thakor, Nitish V; Eskandar, Emad N; Sarma, Sridevi V
2012-09-01
Accurately detecting hidden clinical or behavioral states from sequential measurements is an emerging topic in neuroscience and medicine, which may dramatically impact neural prosthetics, brain-computer interface and drug delivery. For example, early detection of an epileptic seizure from sequential electroencephalographic (EEG) measurements would allow timely administration of anticonvulsant drugs or neurostimulation, thus reducing physical impairment and risks of overtreatment. We develop a Bayesian paradigm for state transition detection that combines optimal control and Markov processes. We define a hidden Markov model of the state evolution and develop a detection policy that minimizes a loss function of both probability of false positives and accuracy (i.e., lag between estimated and actual transition time). Our strategy automatically adapts to each newly acquired measurement based on the state evolution model and the relative loss for false positives and accuracy, thus resulting in a time varying threshold policy. The paradigm was used in two applications: 1) detection of movement onset (behavioral state) from subthalamic single unit recordings in Parkinson's disease patients performing a motor task; 2) early detection of an approaching seizure (clinical state) from multichannel intracranial EEG recordings in rodents treated with pentylenetetrazol chemoconvulsant. Our paradigm performs significantly better than chance and improves over widely used detection algorithms. PMID:22893447
Bauer, Robert; Gharabaghi, Alireza
2015-01-01
Restorative brain-computer interfaces (BCI) are increasingly used to provide feedback of neuronal states in a bid to normalize pathological brain activity and achieve behavioral gains. However, patients and healthy subjects alike often show a large variability, or even inability, of brain self-regulation for BCI control, known as BCI illiteracy. Although current co-adaptive algorithms are powerful for assistive BCIs, their inherent class switching clashes with the operant conditioning goal of restorative BCIs. Moreover, due to the treatment rationale, the classifier of restorative BCIs usually has a constrained feature space, thus limiting the possibility of classifier adaptation. In this context, we applied a Bayesian model of neurofeedback and reinforcement learning for different threshold selection strategies to study the impact of threshold adaptation of a linear classifier on optimizing restorative BCIs. For each feedback iteration, we first determined the thresholds that result in minimal action entropy and maximal instructional efficiency. We then used the resulting vector for the simulation of continuous threshold adaptation. We could thus show that threshold adaptation can improve reinforcement learning, particularly in cases of BCI illiteracy. Finally, on the basis of information-theory, we provided an explanation for the achieved benefits of adaptive threshold setting. PMID:25729347
Using Paper Helicopters to Teach Statistical Process Control
ERIC Educational Resources Information Center
Johnson, Danny J.
2011-01-01
This hands-on project uses a paper helicopter to teach students how to distinguish between common and special causes of variability when developing and using statistical process control charts. It allows the student to experience a process that is out-of-control due to imprecise or incomplete product design specifications and to discover how the…
Statistical Process Control: Going to the Limit for Quality.
ERIC Educational Resources Information Center
Training, 1987
1987-01-01
Defines the concept of statistical process control, a quality control method used especially in manufacturing. Generally, concept users set specific standard levels that must be met. Makes the point that although employees work directly with the method, management is responsible for its success within the plant. (CH)
Manufacturing Squares: An Integrative Statistical Process Control Exercise
ERIC Educational Resources Information Center
Coy, Steven P.
2016-01-01
In the exercise, students in a junior-level operations management class are asked to manufacture a simple product. Given product specifications, they must design a production process, create roles and design jobs for each team member, and develop a statistical process control plan that efficiently and effectively controls quality during…
Statistical Design Model (SDM) of satellite thermal control subsystem
NASA Astrophysics Data System (ADS)
Mirshams, Mehran; Zabihian, Ehsan; Aarabi Chamalishahi, Mahdi
2016-07-01
Satellite thermal control is the subsystem whose main task is keeping the satellite components within their survival and operating temperature ranges. The performance of satellite thermal control plays a key role in satisfying a satellite's operational requirements, and designing this subsystem is a part of satellite design. On the other hand, due to the lack of information provided by companies and designers, this subsystem still does not have a specific design process, even though it is one of the fundamental subsystems. The aim of this paper is to identify and extract statistical design models of the spacecraft thermal control subsystem by using the SDM design method, which analyzes statistical data with a particular procedure. To implement the SDM method, a complete database is required. Therefore, we first collect spacecraft data and create a database, and then we extract statistical graphs using Microsoft Excel, from which we further extract mathematical models. The input parameters of the method are the mass, mission, and lifetime of the satellite. To this end, the thermal control subsystem is first introduced, and the hardware used in this subsystem and its variants is investigated. Next, different statistical models are presented and briefly compared. Finally, the paper's particular statistical model is extracted from the collected statistical data. The accuracy of the method is tested and verified with a case study: comparisons between the specifications of the thermal control subsystem of a fabricated satellite and the analysis results show the methodology to be effective. Key Words: Thermal control subsystem design, Statistical design model (SDM), Satellite conceptual design, Thermal hardware
NASA Astrophysics Data System (ADS)
Iizumi, T.; Nishimori, M.; Yokozawa, M.; Kotera, A.; Khang, N. D.
2008-12-01
Long-term daily global solar radiation (GSR) data of consistent quality over the 20th century have been needed as a baseline to assess the climate change impact on paddy rice production in the Vietnamese Mekong Delta area (MKD: 104.5-107.5°E/8.2-11.2°N). However, though sunshine duration data are available, the accessibility of GSR data is quite poor in the MKD. This study estimated the daily GSR in the MKD for 30 years (1978-2007) by applying a statistical downscaling method (SDM). The estimates of GSR were obtained from four different sources: (1) the combined equations with the corrected reanalysis data of daily maximum/minimum temperatures, relative humidity, sea level pressure, and precipitable water; (2) the correction equation with the reanalysis data of downward shortwave radiation; (3) the empirical equation with the observed sunshine duration; and (4) the observation at one site for a short term. Three reanalysis data sets, i.e., NCEP-R1, ERA-40, and JRA-25, were used. Also, the observed meteorological data, which include many missing values, were obtained from 11 stations of the Vietnamese Meteorological Agency for 28 years and five stations of the Global Summary of the Day for 30 years. The observed GSR data for 1 year were obtained from our station. Considering the use of data with many missing values for analysis, Bayesian inference was used for this study, which has the powerful capability to optimize multiple parameters in a non-linear and hierarchical model. The Bayesian inference provided the posterior distributions of 306 parameter values relating to the combined equations, the empirical equation, and the correction equation. The preliminary result shows that the amplitude of daily fluctuation of modeled GSR was underestimated by the empirical equation and the correction equation. The combination of SDM and Bayesian inference has the potential to estimate long-term daily GSR of consistent quality even in areas where the observed data are quite limited.
Archer, S C; Mc Coy, F; Wapenaar, W; Green, M J
2014-01-01
The aim of this research was to determine budgets for specific management interventions to control heifer mastitis in Irish dairy herds as an example of evidence synthesis and 1-step Bayesian micro-simulation in a veterinary context. Budgets were determined for different decision makers based on their willingness to pay. Reducing the prevalence of heifers with a high milk somatic cell count (SCC) early in the first lactation could be achieved through herd level management interventions for pre- and peri-partum heifers, however the cost effectiveness of these interventions is unknown. A synthesis of multiple sources of evidence, accounting for variability and uncertainty in the available data is invaluable to inform decision makers around likely economic outcomes of investing in disease control measures. One analytical approach to this is Bayesian micro-simulation, where the trajectory of different individuals undergoing specific interventions is simulated. The classic micro-simulation framework was extended to encompass synthesis of evidence from 2 separate statistical models and previous research, with the outcome for an individual cow or herd assessed in terms of changes in lifetime milk yield, disposal risk, and likely financial returns conditional on the interventions being simultaneously applied. The 3 interventions tested were storage of bedding inside, decreasing transition yard stocking density, and spreading of bedding evenly in the calving area. Budgets for the interventions were determined based on the minimum expected return on investment, and the probability of the desired outcome. Budgets for interventions to control heifer mastitis were highly dependent on the decision maker's willingness to pay, and hence minimum expected return on investment. Understanding the requirements of decision makers and their rational spending limits would be useful for the development of specific interventions for particular farms to control heifer mastitis, and other
NASA Astrophysics Data System (ADS)
Granade, Christopher; Combes, Joshua; Cory, D. G.
2016-03-01
In recent years, Bayesian methods have been proposed as a solution to a wide range of issues in quantum state and process tomography. State-of-the-art Bayesian tomography solutions suffer from three problems: numerical intractability, a lack of informative prior distributions, and an inability to track time-dependent processes. Here, we address all three problems. First, we use modern statistical methods, as pioneered by Huszár and Houlsby (2012 Phys. Rev. A 85 052120) and by Ferrie (2014 New J. Phys. 16 093035), to make Bayesian tomography numerically tractable. Our approach allows for practical computation of Bayesian point and region estimators for quantum states and channels. Second, we propose the first priors on quantum states and channels that allow for including useful experimental insight. Finally, we develop a method that allows tracking of time-dependent states and estimates the drift and diffusion processes affecting a state. We provide source code and animated visual examples for our methods.
Maintenance with the use of statistical control charts
NASA Astrophysics Data System (ADS)
Gromakov, E. I.; Aleksandrova, T. V.; Ivanenko, B. P.
2016-04-01
The possibility of using statistical process control methods for detection of an abnormal condition of process equipment at early stages of an emergency is shown in the paper. The authors conclude that with the use of Shewhart charts it is possible to monitor the real dynamics of the process equipment condition and make decisions on its maintenance and repair.
Statistical process control in Deep Space Network operation
NASA Technical Reports Server (NTRS)
Hodder, J. A.
2002-01-01
This report describes how the Deep Space Mission System (DSMS) Operations Program Office at the Jet Propulsion Laboratory (JPL) uses Statistical Process Control (SPC) to monitor performance and evaluate initiatives for improving processes on the National Aeronautics and Space Administration's (NASA) Deep Space Network (DSN).
Statistical Process Control in the Practice of Program Evaluation.
ERIC Educational Resources Information Center
Posavac, Emil J.
1995-01-01
A technique developed to monitor the quality of manufactured products, statistical process control (SPC), incorporates several features that may prove attractive to evaluators. This paper reviews the history of SPC, suggests how the approach can enrich program evaluation, and illustrates its use in a hospital-based example. (SLD)
Statistical Process Control. Impact and Opportunities for Ohio.
ERIC Educational Resources Information Center
Brown, Harold H.
The first purpose of this study is to help the reader become aware of the evolution of Statistical Process Control (SPC) as it is being implemented and used in industry today. This is approached through the presentation of a brief historical account of SPC, from its inception through the technological miracle that has occurred in Japan. The…
Statistical Process Control. A Summary. FEU/PICKUP Project Report.
ERIC Educational Resources Information Center
Owen, M.; Clark, I.
A project was conducted to develop a curriculum and training materials to be used in training industrial operatives in statistical process control (SPC) techniques. During the first phase of the project, questionnaires were sent to 685 companies (215 of which responded) to determine where SPC was being used, what type of SPC firms needed, and how…
Real-time statistical quality control and ARM
Blough, D.K.
1992-05-01
An important component of the Atmospheric Radiation Measurement (ARM) Program is real-time quality control of data obtained from meteorological instruments. It is the goal of the ARM program to enhance the predictive capabilities of global circulation models by incorporating in them more detailed information on the radiative characteristics of the earth's atmosphere. To this end, a number of Cloud and Radiation Testbeds (CART's) will be built at various locations worldwide. Each CART will consist of an array of instruments designed to collect radiative data. The large amount of data obtained from these instruments necessitates real-time processing in order to flag outliers and possible instrument malfunction. The Bayesian dynamic linear model (DLM) proves to be an effective way of monitoring the time series data which each instrument generates. It provides a flexible yet powerful approach to detecting in real-time sudden shifts in a non-stationary multivariate time series. An application of these techniques to data arising from a remote sensing instrument to be used in the CART is provided. Using real data from a wind profiler, the ability of the DLM to detect outliers is studied. 5 refs.
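A minimal sketch of the idea above: a local-level dynamic linear model (a one-state Kalman filter) tracks a series and flags observations whose one-step-ahead forecast error is improbably large. All variances, the threshold, and the data are hypothetical toy values, not ARM settings.

```python
def dlm_flags(ys, obs_var=1.0, state_var=0.1, threshold=3.0):
    """Flag each observation (after the first) whose standardized one-step
    forecast error exceeds `threshold` under a local-level DLM."""
    m, C = ys[0], 1.0           # posterior mean/variance of the latent level
    flags = []
    for y in ys[1:]:
        R = C + state_var       # prior variance after the random-walk step
        Q = R + obs_var         # one-step-ahead forecast variance
        e = y - m               # forecast error
        flags.append(abs(e) / Q ** 0.5 > threshold)
        K = R / Q               # Kalman gain
        m, C = m + K * e, (1 - K) * R
    return flags

data = [0.1, -0.2, 0.0, 0.3, 8.0, 0.2, -0.1]   # spike at index 4
flags = dlm_flags(data)
```

Because the filter updates sequentially, this runs in real time on streaming instrument data; the flagged spike would be marked for review rather than silently absorbed into the level estimate.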
Tool compensation using statistical process control on complex milling operations
Reilly, J.M.
1994-03-01
In today's competitive manufacturing environment, many companies increasingly rely on numerical control (NC) mills to produce products at a reasonable cost. Typically, this is done by producing as many features as possible at each machining operation to minimize the total number of shop hours invested per part. Consequently, the number of cutting tools involved in one operation can become quite large since NC mills have the capacity to use in excess of 100 cutting tools. As the number of cutting tools increases, the difficulty of applying optimum tool compensation grows exponentially, quickly overwhelming machine operators and engineers. A systematic method of managing tool compensation is required. The name statistical process control (SPC) suggests a technique in which statistics are used to stabilize and control a machining operation. Feedback and control theory, the study of the stabilization of electronic and mechanical systems, states that control can be established by way of a feedback network. If these concepts were combined, SPC would stabilize and control manufacturing operations through the incorporation of statistically processed feedback. In its simplest application, SPC has been used as a tool to analyze inspection data. In its most mature application, SPC can be the link that applies process feedback. The approach involves: (1) identifying the significant process variables adjusted by the operator; (2) developing mathematical relationships that convert strategic part measurements into variable adjustments; and (3) implementing SPC charts that record required adjustment to each variable.
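Steps (1)-(3) above can be sketched as a simple damped feedback rule that converts a part measurement into a tool-offset adjustment. The gain, deadband, and dimensions below are hypothetical, chosen only to illustrate the feedback loop.

```python
def tool_adjustment(measured, nominal, current_offset, gain=0.5, deadband=0.01):
    """Return an updated tool offset: damp the measured deviation by `gain`,
    and ignore deviations inside the deadband to avoid chasing noise."""
    deviation = measured - nominal
    if abs(deviation) <= deadband:
        return current_offset
    return current_offset - gain * deviation

# Successive part measurements converging toward the 10.0 nominal dimension.
offset = 0.0
for measured in [10.05, 10.04, 10.02, 10.005]:
    offset = tool_adjustment(measured, nominal=10.0, current_offset=offset)
```

The fractional gain damps the response so that a single noisy measurement cannot swing the offset wildly, which is the SPC rationale: adjust only on evidence of a real shift, not on common-cause variation.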
77 FR 46096 - Statistical Process Controls for Blood Establishments; Public Workshop
Federal Register 2010, 2011, 2012, 2013, 2014
2012-08-02
... HUMAN SERVICES Food and Drug Administration Statistical Process Controls for Blood Establishments... and Drug Administration (FDA) is announcing a public workshop entitled: ``Statistical Process Controls... statistical process controls to validate and monitor manufacturing processes in blood establishments....
A Statistical Project Control Tool for Engineering Managers
NASA Technical Reports Server (NTRS)
Bauch, Garland T.
2001-01-01
This slide presentation reviews the use of a Statistical Project Control Tool (SPCT) for managing engineering projects. A literature review pointed to a definition of project success (i.e., a project is successful when the cost, schedule, technical performance, and quality satisfy the customer), to project success factors, and to the traditional project control tools and performance measures that are detailed in the report. The essential problem is that, with resources becoming more limited and the number of projects increasing, project failure is rising; existing methods are limited, and systematic methods are required. The objective of the work is to provide a new statistical project control tool for project managers. Graphs using the SPCT method plotting the results of three successful projects and three failed projects are reviewed, with success and failure being defined by the owner.
NASA Technical Reports Server (NTRS)
He, Yuning
2015-01-01
Safety of unmanned aerial systems (UAS) is paramount, but the large number of dynamically changing controller parameters makes it hard to determine if the system is currently stable, and the time before loss of control if not. We propose a hierarchical statistical model using Treed Gaussian Processes to predict (i) whether a flight will be stable (success) or become unstable (failure), (ii) the time-to-failure if unstable, and (iii) time series outputs for flight variables. We first classify the current flight input into success or failure types, and then use separate models for each class to predict the time-to-failure and time series outputs. As different inputs may cause failures at different times, we have to model variable length output curves. We use a basis representation for curves and learn the mappings from input to basis coefficients. We demonstrate the effectiveness of our prediction methods on a NASA neuro-adaptive flight control system.
The HONEYPOT Randomized Controlled Trial Statistical Analysis Plan
Pascoe, Elaine Mary; Lo, Serigne; Scaria, Anish; Badve, Sunil V.; Beller, Elaine Mary; Cass, Alan; Hawley, Carmel Mary; Johnson, David W.
2013-01-01
♦ Background: The HONEYPOT study is a multicenter, open-label, blinded-outcome, randomized controlled trial designed to determine whether, compared with standard topical application of mupirocin for nasal staphylococcal carriage, exit-site application of antibacterial honey reduces the rate of catheter-associated infections in peritoneal dialysis patients. ♦ Objective: To make public the pre-specified statistical analysis principles to be adhered to and the procedures to be performed by statisticians who will analyze the data for the HONEYPOT trial. ♦ Methods: Statisticians and clinical investigators who were blinded to treatment allocation and treatment-related study results and who will remain blinded until the central database is locked for final data extraction and analysis determined the statistical methods and procedures to be used for analysis and wrote the statistical analysis plan. The plan describes basic analysis principles, methods for dealing with a range of commonly encountered data analysis issues, and the specific statistical procedures for analyzing the primary, secondary, and safety outcomes. ♦ Results: A statistical analysis plan containing the pre-specified principles, methods, and procedures to be adhered to in the analysis of the data from the HONEYPOT trial was developed in accordance with international guidelines. The structure and content of the plan provide sufficient detail to meet the guidelines on statistical principles for clinical trials produced by the International Conference on Harmonization of Technical Requirements for Registration of Pharmaceuticals for Human Use. ♦ Conclusions: Making public the pre-specified statistical analysis plan for the HONEYPOT trial minimizes the potential for bias in the analysis of trial data and the interpretation and reporting of trial results. PMID:23843589
Statistical physics of human beings in games: Controlled experiments
NASA Astrophysics Data System (ADS)
Liang, Yuan; Huang, Ji-Ping
2014-07-01
It is important to know whether the laws or phenomena in statistical physics for natural systems with non-adaptive agents still hold for social human systems with adaptive agents, because this implies whether it is possible to study or understand social human systems by using statistical physics originating from natural systems. For this purpose, we review the role of human adaptability in four kinds of specific human behaviors, namely, normal behavior, herd behavior, contrarian behavior, and hedge behavior. The approach is based on controlled experiments in the framework of market-directed resource-allocation games. The role of the controlled experiments could be at least two-fold: adopting the real human decision-making process so that the system under consideration could reflect the performance of genuine human beings; making it possible to obtain macroscopic physical properties of a human system by tuning a particular factor of the system, thus directly revealing cause and effect. As a result, both computer simulations and theoretical analyses help to show a few counterparts of some laws or phenomena in statistical physics for social human systems: two-phase phenomena or phase transitions, entropy-related phenomena, and a non-equilibrium steady state. This review highlights the role of human adaptability in these counterparts, and makes it possible to study or understand some particular social human systems by means of statistical physics coming from natural systems.
CRN5EXP: Expert system for statistical quality control
NASA Technical Reports Server (NTRS)
Hentea, Mariana
1991-01-01
The purpose of the Expert System CRN5EXP is to assist in checking the quality of the coils at two very important mills in a steel plant: Hot Rolling and Cold Rolling. The system interprets the statistical quality control charts, and diagnoses and predicts the quality of the steel. Measurements of process control variables are recorded in a database, and sample statistics such as the mean and the range are computed and plotted on a control chart. The chart is analyzed for patterns using the C Language Integrated Production System (CLIPS) and a forward chaining technique to reach a conclusion about the causes of defects and to take management measures for the improvement of the quality control techniques. The Expert System combines the certainty factors associated with the process control variables to predict the quality of the steel. The paper presents the approach to extract data from the database, the reason for combining certainty factors, and the architecture and use of the Expert System. The interpretation of control chart patterns requires the human expert's knowledge, which lends itself to encoding as Expert System rules.
Utilizing effective statistical process control limits for critical dimension metrology
NASA Astrophysics Data System (ADS)
Buser, Joel T.
2002-12-01
To accurately control critical dimension (CD) metrology with a standard real-time solution across a multi-site operation, measure-to-measure and day-to-day variation must be collected at all sites. Each individual site's needs, technologies, and resources can affect the final solution. A preferred statistical process control (SPC) solution for testing measure-to-measure and day-to-day variation is the traditional Mean and Range chart. However, replicating the full measurement process needed for the Mean and Range chart in real time can strain resources. To solve this problem, an initially proposed measurement methodology was to isolate a point of interest, measure the CD feature n times, and continue to the next feature; however, the interdependencies in measure-to-measure variation caused by this methodology resulted in exceedingly narrow control limits. This paper explains how traditional solutions to narrow control limits are statistically problematic and explores the approach of computing control limits for the Mean chart by using the moving range of sample means to estimate sigma instead of the traditional range method. Tool monitoring data from multiple CD metrology tools are reported and compared against control limits calculated by the traditional approach, engineering limits, and the suggested approach. The data indicate that the suggested approach is the most accurate of the three solutions.
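The suggested approach can be illustrated with a small helper: sigma for the Mean chart is estimated from the average moving range of successive sample means, divided by the bias-correction constant d2 = 1.128 for pairs, rather than from within-sample ranges that are inflated downward by correlated repeats. A minimal sketch with hypothetical data (the function name is an assumption, not from the paper):

```python
import statistics

D2_PAIR = 1.128  # d2 bias-correction constant for moving ranges of span 2

def limits_from_moving_range(sample_means, k=3.0):
    """Mean-chart limits with sigma estimated from the moving range of
    successive sample means, avoiding the overly narrow limits produced
    by interdependent within-sample measurements."""
    moving_ranges = [abs(b - a) for a, b in zip(sample_means, sample_means[1:])]
    sigma = statistics.mean(moving_ranges) / D2_PAIR
    center = statistics.mean(sample_means)
    return center - k * sigma, center + k * sigma

# Hypothetical daily CD sample means (nm) from one metrology tool.
lcl, ucl = limits_from_moving_range([10.0, 10.2, 10.0, 10.2, 10.0])
```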
Ma, Ning; Yu, Angela J.
2015-01-01
Response time (RT) is an oft-reported behavioral measure in psychological and neurocognitive experiments, but the high level of observed trial-to-trial variability in this measure has often limited its usefulness. Here, we combine computational modeling and psychophysics to examine the hypothesis that fluctuations in this noisy measure reflect dynamic computations in human statistical learning and corresponding cognitive adjustments. We present data from the stop-signal task (SST), in which subjects respond to a go stimulus on each trial, unless instructed not to by a subsequent, infrequently presented stop signal. We model across-trial learning of stop signal frequency, P(stop), and stop-signal onset time, SSD (stop-signal delay), with a Bayesian hidden Markov model, and within-trial decision-making with an optimal stochastic control model. The combined model predicts that RT should increase with both expected P(stop) and SSD. The human behavioral data (n = 20) bear out this prediction, showing P(stop) and SSD both to be significant, independent predictors of RT, with P(stop) being a more prominent predictor in 75% of the subjects, and SSD being more prominent in the remaining 25%. The results demonstrate that humans indeed readily internalize environmental statistics and adjust their cognitive/behavioral strategy accordingly, and that subtle patterns in RT variability can serve as a valuable tool for validating models of statistical learning and decision-making. More broadly, the modeling tools presented in this work can be generalized to a large body of behavioral paradigms, in order to extract insights about cognitive and neural processing from apparently quite noisy behavioral measures. We also discuss how this behaviorally validated model can then be used to conduct model-based analysis of neural data, in order to help identify specific brain areas for representing and encoding key computational quantities in learning and decision-making. PMID:26321966
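The across-trial learning component can be caricatured with a leaky Beta-Bernoulli update: discounting old evidence approximates the belief, which the paper encodes in a Bayesian hidden Markov model, that the stop-signal frequency can drift over time. The leak value and trial sequence below are made up for illustration; this is a simplification, not the authors' model.

```python
def update_pstop(alpha, beta, stop_trial, leak=0.95):
    """One trial of a leaky Beta-Bernoulli estimate of stop-signal
    frequency; 'leak' discounts old evidence, a crude stand-in for a
    hidden-Markov belief that P(stop) can change."""
    alpha, beta = leak * alpha, leak * beta  # forget a little each trial
    if stop_trial:
        alpha += 1.0
    else:
        beta += 1.0
    return alpha, beta

# Expected P(stop) before a trial is alpha / (alpha + beta).
a, b = 1.0, 1.0                  # uniform prior
for s in [0, 0, 1, 0, 1, 1]:     # made-up go (0) / stop (1) sequence
    a, b = update_pstop(a, b, s)
p_stop = a / (a + b)             # belief after six trials
```

The model's central prediction is then that RT rises with this expected P(stop), since a higher belief in an upcoming stop signal favors slowing down.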
McAloon, Conor G; Doherty, Michael L; Whyte, Paul; O'Grady, Luke; More, Simon J; Messam, Locksley L McV; Good, Margaret; Mullowney, Peter; Strain, Sam; Green, Martin J
2016-06-01
Bovine paratuberculosis is a disease characterised by chronic granulomatous enteritis which manifests clinically as a protein-losing enteropathy causing diarrhoea, hypoproteinaemia, emaciation and, eventually, death. Some evidence exists to suggest a possible zoonotic link, and a national voluntary Johne's Disease Control Programme was initiated by Animal Health Ireland in 2013. The objective of this study was to estimate herd-level true prevalence (HTP) and animal-level true prevalence (ATP) of paratuberculosis in Irish herds enrolled in the national voluntary JD control programme during 2013-14. Two datasets were used in this study. The first dataset had been collected in Ireland during 2005 (5822 animals from 119 herds), and was used to construct model priors. Model priors were updated with a primary (2013-14) dataset which included test records from 99,101 animals in 1039 dairy herds and was generated as part of the national voluntary JD control programme. The posterior estimate of HTP from the final Bayesian model was 0.23-0.34 with a 95% probability. Across all herds, the median ATP was found to be 0.032 (0.009, 0.145). This study represents the first use of Bayesian methodology to estimate the prevalence of paratuberculosis in Irish dairy herds. The HTP estimate was higher than previous Irish estimates but still lower than estimates from other major dairy producing countries. PMID:27237395
Statistical process control using optimized neural networks: a case study.
Addeh, Jalil; Ebrahimzadeh, Ata; Azarbad, Milad; Ranaee, Vahid
2014-09-01
The most common statistical process control (SPC) tools employed for monitoring process changes are control charts. A control chart shows that the process has changed by generating an out-of-control signal. This study investigates the design of an accurate system for control chart pattern (CCP) recognition in two respects. First, an efficient system is introduced that includes two main modules: a feature extraction module and a classifier module. In the feature extraction module, a proper set of shape features and statistical features is proposed as efficient characteristics of the patterns. In the classifier module, several neural networks, such as the multilayer perceptron, probabilistic neural network, and radial basis function network, are investigated. Based on an experimental study, the best classifier is chosen to recognize the CCPs. Second, a hybrid heuristic recognition system is introduced, based on the cuckoo optimization algorithm (COA), to improve the generalization performance of the classifier. The simulation results show that the proposed algorithm has high recognition accuracy. PMID:24210290
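To give a flavour of the feature extraction module, a window of chart values can be reduced to a few statistical and shape features; the mean, standard deviation, and least-squares slope below are generic stand-ins chosen for illustration, not the paper's exact feature set.

```python
def ccp_features(window):
    """Toy feature extraction for control chart pattern recognition:
    mean, standard deviation, and least-squares slope of the window.
    A rising slope hints at a trend pattern, high spread at instability."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    xbar = (n - 1) / 2                      # mean of the time index 0..n-1
    sxx = sum((i - xbar) ** 2 for i in range(n))
    slope = sum((i - xbar) * (x - mean) for i, x in enumerate(window)) / sxx
    return mean, var ** 0.5, slope

# A perfectly linear upward-trend window.
m, s, slope = ccp_features([0.0, 1.0, 2.0, 3.0, 4.0])
```

A classifier module would then consume such feature vectors instead of the raw chart points.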
A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research
ERIC Educational Resources Information Center
van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B.; Neyer, Franz J.; van Aken, Marcel A. G.
2014-01-01
Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are…
Structure Learning in Bayesian Sensorimotor Integration
Genewein, Tim; Hez, Eduard; Razzaghpanah, Zeynab; Braun, Daniel A.
2015-01-01
Previous studies have shown that sensorimotor processing can often be described by Bayesian learning, in particular the integration of prior and feedback information depending on its degree of reliability. Here we test the hypothesis that the integration process itself can be tuned to the statistical structure of the environment. We exposed human participants to a reaching task in a three-dimensional virtual reality environment where we could displace the visual feedback of their hand position in a two-dimensional plane. When introducing statistical structure between the two dimensions of the displacement, we found that over the course of several days participants adapted their feedback integration process in order to exploit this structure for performance improvement. In control experiments we found that this adaptation process critically depended on performance feedback and could not be induced by verbal instructions. Our results suggest that structural learning is an important meta-learning component of Bayesian sensorimotor integration. PMID:26305797
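The reliability-dependent integration that this study builds on is the standard Gaussian cue-combination rule: each source is weighted by its precision (inverse variance). A minimal sketch of that baseline rule, not the authors' structural-learning model:

```python
def integrate(prior_mean, prior_var, feedback_mean, feedback_var):
    """Bayesian cue combination: precision-weighted average of a prior
    estimate and sensory feedback, with the combined variance always
    smaller than either source's alone."""
    w = (1 / feedback_var) / (1 / prior_var + 1 / feedback_var)
    mean = (1 - w) * prior_mean + w * feedback_mean
    var = 1 / (1 / prior_var + 1 / feedback_var)
    return mean, var

# Equally reliable sources split the difference...
mean, var = integrate(0.0, 1.0, 1.0, 1.0)
# ...while more reliable feedback pulls the estimate harder toward itself.
mean2, _ = integrate(0.0, 1.0, 1.0, 0.25)
```

Structural learning, as tested in the paper, would correspond to adapting how such weights couple across the two displacement dimensions.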
Statistical Quality Control of Moisture Data in GEOS DAS
NASA Technical Reports Server (NTRS)
Dee, D. P.; Rukhovets, L.; Todling, R.
1999-01-01
A new statistical quality control algorithm was recently implemented in the Goddard Earth Observing System Data Assimilation System (GEOS DAS). The final step in the algorithm consists of an adaptive buddy check that either accepts or rejects outlier observations based on a local statistical analysis of nearby data. A basic assumption in any such test is that the observed field is spatially coherent, in the sense that nearby data can be expected to confirm each other. However, the buddy check resulted in excessive rejection of moisture data, especially during the Northern Hemisphere summer. The analysis moisture variable in GEOS DAS is water vapor mixing ratio. Observational evidence shows that the distribution of mixing ratio errors is far from normal. Furthermore, spatial correlations among mixing ratio errors are highly anisotropic and difficult to identify. Both factors contribute to the poor performance of the statistical quality control algorithm. To alleviate the problem, we applied the buddy check to relative humidity data instead. This variable explicitly depends on temperature and therefore exhibits a much greater spatial coherence. As a result, reject rates of moisture data are much more reasonable and homogeneous in time and space.
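The core of a buddy check can be sketched in a few lines: an observation is kept only if it agrees with nearby data to within some multiple of the local spread. The tolerance and data below are illustrative; the operational GEOS DAS test is adaptive and uses full analysis error statistics rather than this simple local standard deviation.

```python
import statistics

def buddy_check(obs, buddies, tol=3.0):
    """Accept (True) or reject (False) an outlier observation by
    comparing it with nearby 'buddy' data: reject when it departs from
    the local mean by more than tol local standard deviations."""
    local_mean = statistics.mean(buddies)
    local_sd = statistics.stdev(buddies)
    return abs(obs - local_mean) <= tol * local_sd

# Hypothetical relative-humidity fractions at neighbouring stations.
neighbours = [0.4, 0.5, 0.6, 0.45, 0.55]
```

The abstract's point is that this test presumes spatial coherence, which holds far better for relative humidity than for water vapor mixing ratio.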
NASA Astrophysics Data System (ADS)
Boulanger, Jean-Philippe; Martinez, Fernando; Segura, Enrique C.
2007-02-01
Evaluating the response of climate to greenhouse gas forcing is a major objective of the climate community, and the use of large ensembles of simulations is considered a significant step toward that goal. The present paper thus discusses a new methodology, based on neural networks, for mixing ensembles of climate model simulations. Our analysis uses one simulation from each of seven Atmosphere-Ocean Global Climate Models that participated in the IPCC project and provided at least one simulation for the twentieth century (20c3m) and one simulation for each of three SRES scenarios: A2, A1B and B1. Our statistical method based on neural networks and Bayesian statistics computes a transfer function between models and observations. Such a transfer function was then used to project future conditions and to derive what we would call the optimal ensemble combination for twenty-first century climate change projections. Our approach is therefore based on one statement and one hypothesis. The statement is that an optimal ensemble projection should be built by giving larger weights to models that have more skill in representing present climate conditions. The hypothesis is that our method based on neural networks actually weights the models that way. While the statement is actually an open question, whose answer may vary according to the region or climate signal under study, our results demonstrate that the neural network approach does indeed weight models according to their skill. As such, our method is an improvement over existing Bayesian methods developed to mix ensembles of simulations. However, the generally low skill of climate models in simulating precipitation mean climatology implies that the final projection maps (whatever the method used to compute them) may significantly change in the future as models improve. Therefore, the projection results for late twenty-first century conditions are presented as possible projections based on the “state-of-the-art” of
Statistical process control for hospitals: methodology, user education, and challenges.
Matthes, Nikolas; Ogunbo, Samuel; Pennington, Gaither; Wood, Nell; Hart, Marilyn K; Hart, Robert F
2007-01-01
The health care industry is slowly embracing the use of statistical process control (SPC) to monitor and study causes of variation in health care processes. While the statistics and principles underlying the use of SPC are relatively straightforward, there is a need to be cognizant of the perils that await the user who is not well versed in the key concepts of SPC. This article introduces the theory behind SPC methodology, describes successful tactics for educating users, and discusses the challenges associated with encouraging adoption of SPC among health care professionals. To illustrate these benefits and challenges, this article references the National Hospital Quality Measures, presents critical elements of SPC curricula, and draws examples from hospitals that have successfully embedded SPC into their overall approach to performance assessment and improvement. PMID:17627215
NASA Technical Reports Server (NTRS)
da Silva, Arlindo M.; Norris, Peter M.
2013-01-01
Part I presented a Monte Carlo Bayesian method for constraining a complex statistical model of GCM sub-gridcolumn moisture variability using high-resolution MODIS cloud data, thereby permitting large-scale model parameter estimation and cloud data assimilation. This part performs some basic testing of this new approach, verifying that it does indeed significantly reduce mean and standard deviation biases with respect to the assimilated MODIS cloud optical depth, brightness temperature and cloud top pressure, and that it also improves the simulated rotational-Raman scattering cloud optical centroid pressure (OCP) against independent (non-assimilated) retrievals from the OMI instrument. Of particular interest, the Monte Carlo method does show skill in the especially difficult case where the background state is clear but cloudy observations exist. In traditional linearized data assimilation methods, a subsaturated background cannot produce clouds via any infinitesimal equilibrium perturbation, but the Monte Carlo approach allows finite jumps into regions of non-zero cloud probability. In the example provided, the method is able to restore marine stratocumulus near the Californian coast where the background state has a clear swath. This paper also examines a number of algorithmic and physical sensitivities of the new method and provides guidance for its cost-effective implementation. One obvious difficulty for the method, and other cloud data assimilation methods as well, is the lack of information content in the cloud observables on cloud vertical structure, beyond cloud top pressure and optical thickness, thus necessitating strong dependence on the background vertical moisture structure. It is found that a simple flow-dependent correlation modification due to Riishojgaard (1998) provides some help in this respect, by better honoring inversion structures in the background state.
Bayesian demography 250 years after Bayes
Bijak, Jakub; Bryant, John
2016-01-01
Bayesian statistics offers an alternative to classical (frequentist) statistics. It is distinguished by its use of probability distributions to describe uncertain quantities, which leads to elegant solutions to many difficult statistical problems. Although Bayesian demography, like Bayesian statistics more generally, is around 250 years old, only recently has it begun to flourish. The aim of this paper is to review the achievements of Bayesian demography, address some misconceptions, and make the case for wider use of Bayesian methods in population studies. We focus on three applications: demographic forecasts, limited data, and highly structured or complex models. The key advantages of Bayesian methods are the ability to integrate information from multiple sources and to describe uncertainty coherently. Bayesian methods also allow for including additional (prior) information next to the data sample. As such, Bayesian approaches are complementary to many traditional methods, which can be productively re-expressed in Bayesian terms. PMID:26902889
Bayesian neural adjustment of inhibitory control predicts emergence of problem stimulant use.
Harlé, Katia M; Stewart, Jennifer L; Zhang, Shunan; Tapert, Susan F; Yu, Angela J; Paulus, Martin P
2015-11-01
Bayesian ideal observer models quantify individuals' context- and experience-dependent beliefs and expectations about their environment, which provides a powerful approach (i) to link basic behavioural mechanisms to neural processing; and (ii) to generate clinical predictors for patient populations. Here, we focus on (ii) and determine whether individual differences in the neural representation of the need to stop in an inhibitory task can predict the development of problem use (i.e. abuse or dependence) in individuals experimenting with stimulants. One hundred and fifty-seven non-dependent occasional stimulant users, aged 18-24, completed a stop-signal task while undergoing functional magnetic resonance imaging. These individuals were prospectively followed for 3 years and evaluated for stimulant use and abuse/dependence symptoms. At follow-up, 38 occasional stimulant users met criteria for a stimulant use disorder (problem stimulant users), while 50 had discontinued use (desisted stimulant users). We found that those individuals who showed greater neural responses associated with Bayesian prediction errors, i.e. the difference between actual and expected need to stop on a given trial, in right medial prefrontal cortex/anterior cingulate cortex, caudate, anterior insula, and thalamus were more likely to exhibit problem use 3 years later. Importantly, these computationally based neural predictors outperformed clinical measures and non-model based neural variables in predicting clinical status. In conclusion, young adults who show exaggerated brain processing underlying whether to 'stop' or to 'go' are more likely to develop stimulant abuse. Thus, Bayesian cognitive models provide both a computational explanation and potential predictive biomarkers of belief processing deficits in individuals at risk for stimulant addiction. PMID:26336910
Statistical Process Control of a Kalman Filter Model
Gamse, Sonja; Nobakht-Ersi, Fereydoun; Sharifi, Mohammad A.
2014-01-01
For the evaluation of measurement data, different functional and stochastic models can be used. In the case of time series, a Kalman filtering (KF) algorithm can be implemented. In this case, a very well-known stochastic model is used, which includes statistical tests in the domain of measurements and in the system state domain. Because the output results depend strongly on the input model parameters and the normal distribution of residuals is not always fulfilled, it is very important to perform all possible tests on the output results. In this contribution, we give a detailed description of the evaluation of the Kalman filter model. We describe indicators of inner confidence, such as controllability and observability, the determinant of the state transition matrix, and the properties of the a posteriori system state covariance matrix and of the Kalman gain matrix. The statistical tests include, besides the standard tests, the convergence of the standard deviations of the system state components and the normal distribution of residuals. In particular, computing the controllability and observability matrices and checking the normal distribution of residuals are not standard procedures in the implementation of KF. Practical implementation is done on geodetic kinematic observations. PMID:25264959
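Computing the observability indicator mentioned above is mechanical: stack H, HF, ..., HF^(n-1) and check that the stack has full rank. A pure-Python sketch for a two-state constant-velocity model (an illustrative system, not the paper's geodetic one):

```python
def observability_matrix(F, H):
    """Stack H, HF, ..., HF^(n-1) for an n-state linear system with
    state transition matrix F and measurement matrix H; full rank of
    the stack means the state is observable from the measurements."""
    n = len(F)
    rows, Hk = [], [row[:] for row in H]
    for _ in range(n):
        rows.extend(Hk)
        Hk = [[sum(r[k] * F[k][j] for k in range(n)) for j in range(n)]
              for r in Hk]                # next power: Hk <- Hk @ F
    return rows

# Constant-velocity model: state (position, velocity), position measured.
F = [[1.0, 1.0], [0.0, 1.0]]
H = [[1.0, 0.0]]
O = observability_matrix(F, H)
```

For this 2x2 stack a nonzero determinant confirms full rank, i.e. velocity is recoverable from position measurements alone.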
Yield enhancement in micromechanical sensor fabrication using statistical process control
NASA Astrophysics Data System (ADS)
Borenstein, Jeffrey T.; Preble, Douglas M.
1997-09-01
Statistical process control (SPC) has gained wide acceptance in recent years as an essential tool for yield improvement in the microelectronics industry. In both manufacturing and research and development settings, statistical methods are extremely useful in process control and optimization. Here we describe the recent implementation of SPC in the micromachining fabrication process at Draper. A wide array of micromachined silicon sensors, including gyroscopes, accelerometers, and microphones, are routinely fabricated at Draper, often with rapidly changing designs and processes. In spite of Draper's requirements for rapid turnaround and relatively small, short production runs, SPC has turned out to be a critical component of the product development process. This paper describes the multipronged SPC approach we have developed and tailored to the particular requirements of an R & D micromachining process line. Standard tools such as Pareto charts, histograms, and cause-and-effect diagrams have been deployed to troubleshoot yield and performance problems in the micromachining process, and several examples of their use are described. More rigorous approaches, such as the use of control charts for variables and attributes, have been instituted with considerable success. The software package Cornerstone® was selected to handle the SPC program at Draper. We describe the highly automated process now in place for monitoring key processes, including diffusion, oxidation, photolithography, and etching. In addition to the process monitoring, gauge capability is applied to critical metrology tools on a regular basis. Applying these tools in the process line has resulted in sharply improved yields and shortened process cycles.
Statistical process control program at a ceramics vendor facility
Enke, G.M.
1992-12-01
Development of a statistical process control (SPC) program at a ceramics vendor location was deemed necessary to improve product quality, reduce manufacturing flowtime, and reduce quality costs borne by AlliedSignal Inc., Kansas City Division (KCD), and the vendor. Because of the lack of available KCD manpower and the required time schedule for the project, it was necessary for the SPC program to be implemented by an external contractor. Approximately a year after the program had been installed, the original baseline was reviewed so that the success of the project could be determined.
BIE: Bayesian Inference Engine
NASA Astrophysics Data System (ADS)
Weinberg, Martin D.
2013-12-01
The Bayesian Inference Engine (BIE) is an object-oriented library of tools written in C++, designed explicitly to enable Bayesian update and model comparison for astronomical problems. To facilitate "what if" exploration, BIE provides a command-line interface (written with Bison and Flex) to run input scripts. The output of the code is a simulation of the Bayesian posterior distribution, from which summary statistics can be determined, e.g. by taking moments or computing confidence intervals. All of these quantities are fundamentally integrals, and the Markov chain approach produces variates θ distributed according to P(θ|D), so moments are obtained trivially by summing over the ensemble of variates.
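The last sentence can be made concrete with a toy stand-in for the BIE output: given an ensemble of posterior variates, moments are ensemble averages and credible intervals are empirical quantiles. This sketch is plain Python, not the BIE C++ API, and the Gaussian "posterior" is a placeholder for a real MCMC run.

```python
import random, statistics

random.seed(1)
# Stand-in for an MCMC run: variates theta_i distributed according to P(theta | D).
draws = [random.gauss(2.0, 0.5) for _ in range(20000)]

# Moments are Monte Carlo averages over the ensemble of variates.
mean = statistics.fmean(draws)
var = statistics.fmean((d - mean) ** 2 for d in draws)

# A 95% equal-tail credible interval from the empirical quantiles.
s = sorted(draws)
lo, hi = s[int(0.025 * len(s))], s[int(0.975 * len(s))]
```

The same recipe covers any posterior functional that is an integral: replace `(d - mean) ** 2` with the integrand of interest.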
Larson, Nicholas B; McDonnell, Shannon; Albright, Lisa Cannon; Teerlink, Craig; Stanford, Janet; Ostrander, Elaine A; Isaacs, William B; Xu, Jianfeng; Cooney, Kathleen A; Lange, Ethan; Schleutker, Johanna; Carpten, John D; Powell, Isaac; Bailey-Wilson, Joan; Cussenot, Olivier; Cancel-Tassin, Geraldine; Giles, Graham; MacInnis, Robert; Maier, Christiane; Whittemore, Alice S; Hsieh, Chih-Lin; Wiklund, Fredrik; Catolona, William J; Foulkes, William; Mandal, Diptasri; Eeles, Rosalind; Kote-Jarai, Zsofia; Ackerman, Michael J; Olson, Timothy M; Klein, Christopher J; Thibodeau, Stephen N; Schaid, Daniel J
2016-09-01
Rare variants (RVs) have been shown to be significant contributors to complex disease risk. By definition, these variants have very low minor allele frequencies and traditional single-marker methods for statistical analysis are underpowered for typical sequencing study sample sizes. Multimarker burden-type approaches attempt to identify aggregation of RVs across case-control status by analyzing relatively small partitions of the genome, such as genes. However, it is generally the case that the aggregative measure would be a mixture of causal and neutral variants, and these omnibus tests do not directly provide any indication of which RVs may be driving a given association. Recently, Bayesian variable selection approaches have been proposed to identify RV associations from a large set of RVs under consideration. Although these approaches have been shown to be powerful at detecting associations at the RV level, there are often computational limitations on the total quantity of RVs under consideration and compromises are necessary for large-scale application. Here, we propose a computationally efficient alternative formulation of this method using a probit regression approach specifically capable of simultaneously analyzing hundreds to thousands of RVs. We evaluate our approach to detect causal variation on simulated data and examine sensitivity and specificity in instances of high RV dimensionality as well as apply it to pathway-level RV analysis results from a prostate cancer (PC) risk case-control sequencing study. Finally, we discuss potential extensions and future directions of this work. PMID:27312771
Statistical process control based chart for information systems security
NASA Astrophysics Data System (ADS)
Khan, Mansoor S.; Cui, Lirong
2015-07-01
Intrusion detection systems have a highly significant role in securing computer networks and information systems. To assure the reliability and quality of computer networks and information systems, it is highly desirable to develop techniques that detect intrusions into information systems. We apply the concept of statistical process control (SPC) to intrusions into computer networks and information systems. In this article we propose an exponentially weighted moving average (EWMA)-type quality monitoring scheme. Our proposed scheme has only one parameter, which differentiates it from past versions. We construct the control limits for the proposed scheme and investigate their effectiveness. We provide an industrial example for the sake of clarity for practitioners. We compare the proposed scheme with existing EWMA schemes and the p chart; finally, we provide some recommendations for future work.
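A minimal version of an EWMA monitoring scheme of the kind discussed can be sketched as follows. This is the textbook chart with exact time-varying limits, not necessarily the authors' one-parameter variant; the smoothing constant and limit width are illustrative.

```python
import math

def ewma_alarms(xs, mu0, sigma, lam=0.2, L=3.0):
    """Return indices where the EWMA statistic exits its control limits.
    z_t = lam*x_t + (1-lam)*z_{t-1}; limits use the exact EWMA variance."""
    z, alarms = mu0, []
    for t, x in enumerate(xs, start=1):
        z = lam * x + (1 - lam) * z
        w = sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        if abs(z - mu0) > L * w:
            alarms.append(t - 1)
    return alarms
```

Because the EWMA accumulates evidence across observations, a sustained two-sigma shift (e.g., an elevated rate of suspicious connections) triggers an alarm within a few samples, while an in-control stream stays silent.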
[Statistical Process Control applied to viral genome screening: experimental approach].
Reifenberg, J M; Navarro, P; Coste, J
2001-10-01
During the national multicentric study on the introduction of NAT for HCV and HIV-1 viruses in blood donation screening, supervised by the medical and scientific departments of the French Blood Establishment (Etablissement français du sang, EFS), Transcription-Mediated Amplification (TMA) technology (Chiron/Gen-Probe) was tested in the molecular biology laboratory of Montpellier, EFS Pyrénées-Méditerranée. After a preliminary phase of qualification of the material and training of the technicians, routine screening of homologous blood and apheresis donations using this technology was performed for two months. In order to evaluate the different NAT systems, exhaustive daily operations and data were recorded. Among these, the luminescence results (expressed as RLU) of the positive and negative calibrators and the associated internal controls were analysed using control charts, the Statistical Process Control method, which makes it possible to display process drift rapidly and to anticipate incidents. This study demonstrated the value of these quality control methods, mainly used for industrial purposes, for following and improving the quality of any transfusion process. It also showed the difficulty of post hoc investigation of uncontrolled sources of variation in a process that was experimental. Such tools are fully in accordance with the new version of the ISO 9000 standards, which focus on the use of suitable indicators for process control, and could be extended to other transfusion activities, such as blood collection and component preparation. PMID:11729395
A Bayesian Model of Sensory Adaptation
Sato, Yoshiyuki; Aihara, Kazuyuki
2011-01-01
Recent studies reported two opposite types of adaptation in temporal perception. Here, we propose a Bayesian model of sensory adaptation that exhibits both types of adaptation. We regard adaptation as the adaptive updating of estimations of time-evolving variables, which determine the mean value of the likelihood function and that of the prior distribution in a Bayesian model of temporal perception. On the basis of certain assumptions, we can analytically determine the mean behavior in our model and identify the parameters that determine the type of adaptation that actually occurs. The results of our model suggest that we can control the type of adaptation by controlling the statistical properties of the stimuli presented. PMID:21541346
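The adaptive-updating idea can be illustrated with a conjugate normal sketch: the estimate of a time-evolving variable diffuses a little each step and is then corrected by the new observation. This is an assumed simplification in the spirit of the paper's model, not its actual formulation; all variances are invented.

```python
def update(prior_mean, prior_var, obs, obs_var, drift_var=0.0):
    """One step of conjugate normal updating of a slowly drifting mean.
    The prior first diffuses by drift_var (the variable evolves in time),
    then is combined with the new stimulus via Bayes' rule."""
    v = prior_var + drift_var
    k = v / (v + obs_var)          # weight given to the new stimulus
    return prior_mean + k * (obs - prior_mean), (1 - k) * v

# Repeated exposure to a shifted stimulus drags the estimate toward it,
# producing an adaptation aftereffect when a neutral stimulus follows.
m, v = 0.0, 1.0
for _ in range(20):
    m, v = update(m, v, obs=1.0, obs_var=1.0)
```

With `drift_var > 0` the gain never shrinks to zero, so the observer stays responsive to later changes in the stimulus statistics, which is the mechanism by which the statistics of the presented stimuli control the adaptation.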
NASA Astrophysics Data System (ADS)
Bell, Kenneth L.; Christensen, Lorna D.
1989-07-01
This paper describes a technique used to determine an optimized microlithographic process using statistical methods, including a statistically designed experiment (SDE), a desirability function d(θ*), and a rigorous daily statistical process control (SPC) program.
Statistically Controlling for Confounding Constructs Is Harder than You Think
Westfall, Jacob; Yarkoni, Tal
2016-01-01
Social scientists often seek to demonstrate that a construct has incremental validity over and above other related constructs. However, these claims are typically supported by measurement-level models that fail to consider the effects of measurement (un)reliability. We use intuitive examples, Monte Carlo simulations, and a novel analytical framework to demonstrate that common strategies for establishing incremental construct validity using multiple regression analysis exhibit extremely high Type I error rates under parameter regimes common in many psychological domains. Counterintuitively, we find that error rates are highest—in some cases approaching 100%—when sample sizes are large and reliability is moderate. Our findings suggest that a potentially large proportion of incremental validity claims made in the literature are spurious. We present a web application (http://jakewestfall.org/ivy/) that readers can use to explore the statistical properties of these and other incremental validity arguments. We conclude by reviewing SEM-based statistical approaches that appropriately control the Type I error rate when attempting to establish incremental validity. PMID:27031707
LOWER LEVEL INFERENCE CONTROL IN STATISTICAL DATABASE SYSTEMS
Lipton, D.L.; Wong, H.K.T.
1984-02-01
An inference is the process of transforming unclassified data values into confidential data values. Most previous research in inference control has studied the use of statistical aggregates to deduce individual records. However, several other types of inference are also possible. Unknown functional dependencies may be apparent to users who have 'expert' knowledge about the characteristics of a population. Some correlations between attributes may be concluded from 'commonly known' facts about the world. To counter these threats, security managers should use random sampling of databases of similar populations, as well as expert systems. 'Expert' users of the database system may form inferences from the variable performance of the user interface. Users may observe on-line turnaround time, accounting statistics, the error messages received, and the point at which an interactive protocol sequence fails. From this information one may learn about the frequency distributions of attribute values and the validity of data object names. At the back end of a database system, improved software engineering practices will reduce opportunities to bypass functional units of the database system. The term 'data object' should be expanded to incorporate those data object types which generate new classes of threats. The security of databases and database systems must be recognized as separate but related problems. Thus, by increased awareness of lower-level inferences, system security managers may effectively nullify the threat they pose.
A journey to statistical process control in the development environment
Hanna, M.; Langston, D.
1996-12-31
Over the past 10 years many organizations have undertaken "process reengineering" activities in an attempt to increase their productivity and quality. Unfortunately, the launching point for these reengineering efforts has been the belief that organizational processes either do not exist or are grossly inefficient. It is the position of the authors that these beliefs are typically unfounded. All ongoing organizations have processes, and these processes are effective in the sense that they are producing products (or services) that are being purchased. The issue, therefore, is not to invent or reengineer new processes but to increase the efficiency of existing ones. This paper outlines a process (an organizational journey) for continually improving processes based upon quantitative management techniques and statistical process control methods.
Application of statistical process control to qualitative molecular diagnostic assays.
O'Brien, Cathal P; Finn, Stephen P
2014-01-01
Modern pathology laboratories, and in particular high-throughput laboratories such as clinical chemistry, have developed a reliable system for statistical process control (SPC). Such a system is absent from the majority of molecular laboratories and, where present, is confined to quantitative assays. As the inability to apply SPC to an assay is an obvious disadvantage, this study aimed to solve this problem by using a frequency estimate coupled with a confidence interval calculation to detect deviations from an expected mutation frequency. The results of this study demonstrate the strengths and weaknesses of this approach and highlight minimum sample number requirements. Notably, assays with low mutation frequencies and detection of small deviations from an expected value require greater sample numbers to mitigate a protracted time to detection. Modeled laboratory data were also used to highlight how this approach might be applied in a routine molecular laboratory. This article is the first to describe the application of SPC to qualitative laboratory data. PMID:25988159
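One way to implement the frequency-plus-confidence-interval check the abstract describes is a Wilson score interval around the observed mutation frequency, flagging the assay when the expected frequency falls outside it. The interval choice, the 95% level, and the example frequencies are assumptions for illustration, not the authors' exact procedure.

```python
import math

def frequency_in_control(k, n, expected, z=1.96):
    """Wilson 95% CI for the observed mutation frequency k/n; the assay is
    judged in control when the expected frequency lies inside the interval."""
    p = k / n
    denom = 1 + z * z / n
    centre = (p + z * z / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z * z / (4 * n * n)) / denom
    return centre - half <= expected <= centre + half
```

The minimum-sample-number point falls out directly: with only 20 samples and zero observed mutations, a hypothetical assay expected to find 5% mutants still looks "in control", so small deviations at low frequencies need far more samples before a drift can be declared.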
Bayesian Inference: with ecological applications
Link, William A.; Barker, Richard J.
2010-01-01
This text provides a mathematically rigorous yet accessible and engaging introduction to Bayesian inference, with relevant examples that will be of interest to biologists working in the fields of ecology, wildlife management and environmental studies, as well as students in advanced undergraduate statistics. This text opens the door to Bayesian inference, taking advantage of modern computational efficiencies and easily accessible software to evaluate complex hierarchical models.
Geological Controls on Glacier Surging?: Statistics and Speculation
NASA Astrophysics Data System (ADS)
Flowers, G. E.; Crompton, J. W.
2015-12-01
Glacier surging represents an end-member behavior in the spectrum of ice dynamics, involving marked acceleration and high flow speeds due to abrupt changes in basal mechanics. Though much effort has been devoted to understanding the role of basal hydrology and thermal regime in fast glacier flow, fewer studies have addressed the potential role of the geologic substrate. One interesting observation is that surge-type glaciers appear almost universally associated with unconsolidated (till) beds, and several large-scale statistical studies have revealed correlations between glacier surging and bedrock properties. We revisit this relationship using field measurements. We selected 20 individual glaciers for sampling in a 40x40 km region of the St. Elias Mountains of Yukon, Canada. Eleven of these glaciers are known to surge and nine are not. The 20 study glaciers are underlain by lithologies that we have broadly classified into two types: metasedimentary only and mixed metasedimentary-granodiorite. We characterized geological and geotechnical properties of the bedrock in each basin, and analyzed the hydrochemistry and mineralogy and grain size distribution (GSD) of the suspended sediments in the proglacial streams. Here we focus on some intriguing results of the GSD analysis. Using statistical techniques, including significance testing and principal component analysis, we find that: (1) lithology determines GSD for non-surge-type glaciers, with metasedimentary basins associated with finer mean grain sizes and mixed-lithology basins with coarser mean grain sizes, but (2) the GSDs associated with surge-type glaciers are intermediate between the distributions described above, and are statistically indistinguishable between metasedimentary and mixed lithology basins. The latter suggests either that surge-type glaciers in our study area occur preferentially in basins where various processes conspire to produce a characteristic GSD, or that the surge cycle itself exerts an
Statistical process control testing of electronic security equipment
Murray, D.W.; Spencer, D.D.
1994-06-01
Statistical process control testing of manufacturing processes began in the 1940s with the development of process control charts by Dr. Walter A. Shewhart. Sandia National Laboratories has developed an application of the SPC method for performance testing of electronic security equipment. This paper documents the evaluation of this testing methodology applied to electronic security equipment and an associated laptop-computer-based system for obtaining and analyzing the test data. Sandia developed this SPC sensor performance testing method primarily for use on portal metal detectors, but has evaluated it for testing of an exterior intrusion detection sensor and other electronic security devices. This method is an alternative to the traditional binomial (alarm or no-alarm) performance testing. The limited amount of information in binomial data drives the number of tests necessary to meet regulatory requirements to unnecessarily high levels. For example, a requirement of a 0.85 probability of detection with 90% confidence requires a minimum of 19 alarms out of 19 trials. By extracting and analyzing measurement (variables) data whenever possible instead of the more typical binomial data, the user becomes more informed about equipment health with fewer tests (as few as five per periodic evaluation).
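The 19-out-of-19 figure can be reproduced by asking for the smallest zero-failure test series that rules out a true detection probability below 0.85: we need n such that a detector that is exactly at the limit would pass all n trials with probability at most the allowed tail.

```python
def zero_failure_trials(pd_required, confidence):
    """Smallest n such that n alarms in n trials rules out a true detection
    probability below pd_required at the stated one-sided confidence."""
    n, alpha = 1, 1 - confidence
    while pd_required ** n > alpha:
        n += 1
    return n
```

A strictly one-sided 90% bound would need only 15/15; the quoted 19/19 matches a 5% tail, i.e. the lower limit of a two-sided 90% interval, which appears to be the convention the abstract is using.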
A Bayesian Partitioning Model for Detection of Multilocus Effects in Case-Control Studies
Ray, Debashree; Li, Xiang; Pan, Wei; Pankow, James S; Basu, Saonli
2015-01-01
Background: Genome-wide association studies (GWASs) have identified hundreds of genetic variants associated with complex diseases, but these variants appear to explain very little of the disease heritability. The typical single-locus association analysis in a GWAS fails to detect variants with small effect sizes and to capture higher-order interaction among these variants. Multilocus association analysis provides a powerful alternative by jointly modeling the variants within a gene or a pathway and by reducing the burden of multiple hypothesis testing in a GWAS. Methods: We propose a powerful and flexible dimension-reduction approach to model multilocus association. We use a Bayesian partitioning model which clusters SNPs according to their direction of association, models higher-order interactions using a flexible scoring scheme, and uses posterior marginal probabilities to detect association between the SNP set and the disease. Results: We illustrate our model using extensive simulation studies and apply it to detect multilocus interaction in a GWAS of type 2 diabetes in the Atherosclerosis Risk in Communities (ARIC) study. Conclusion: We demonstrate that our approach has better power to detect multilocus interactions than several existing approaches. When applied to the ARIC dataset of 9328 individuals to study gene-based associations for type 2 diabetes, our method identified some novel variants not detected by conventional single-locus association analyses. PMID:26044550
NASA Astrophysics Data System (ADS)
Loredo, Thomas J.
2004-04-01
I describe a framework for adaptive scientific exploration based on iterating an Observation-Inference-Design cycle that allows adjustment of hypotheses and observing protocols in response to the results of observation on-the-fly, as data are gathered. The framework uses a unified Bayesian methodology for the inference and design stages: Bayesian inference to quantify what we have learned from the available data and predict future data, and Bayesian decision theory to identify which new observations would teach us the most. When the goal of the experiment is simply to make inferences, the framework identifies a computationally efficient iterative ``maximum entropy sampling'' strategy as the optimal strategy in settings where the noise statistics are independent of signal properties. Results of applying the method to two ``toy'' problems with simulated data-measuring the orbit of an extrasolar planet, and locating a hidden one-dimensional object-show the approach can significantly improve observational efficiency in settings that have well-defined nonlinear models. I conclude with a list of open issues that must be addressed to make Bayesian adaptive exploration a practical and reliable tool for optimizing scientific exploration.
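The Observation-Inference-Design cycle for the hidden-object toy problem can be sketched on a grid: at each step, probe the location whose predicted outcome has maximum entropy (the maximum entropy sampling design), observe, and update the posterior. The detection model, grid, and all parameters below are invented for illustration and are not the paper's setup.

```python
import math, random

random.seed(3)
GRID = [i / 20 for i in range(21)]            # candidate positions in [0, 1]

def p_hit(x, theta):
    """Assumed detector: probe at x, hidden object at theta."""
    return 0.05 + 0.9 * math.exp(-50.0 * (x - theta) ** 2)

def predictive(post, x):
    """P(detection | probe at x, data) under the current posterior."""
    return sum(p * p_hit(x, th) for th, p in zip(GRID, post))

def entropy(q):
    return 0.0 if q <= 0.0 or q >= 1.0 else -q * math.log(q) - (1 - q) * math.log(1 - q)

post = [1.0 / len(GRID)] * len(GRID)          # flat prior over positions
true_theta = 0.7                              # unknown to the algorithm
for _ in range(40):
    # Design: probe where the predicted outcome is most uncertain.
    x = max(GRID, key=lambda c: entropy(predictive(post, c)))
    # Observation: simulate the noisy detector.
    y = random.random() < p_hit(x, true_theta)
    # Inference: Bayes update of the posterior over positions.
    like = [p_hit(x, th) if y else 1.0 - p_hit(x, th) for th in GRID]
    post = [l * p for l, p in zip(like, post)]
    norm = sum(post)
    post = [p / norm for p in post]

best = max(zip(post, GRID))[1]                # posterior mode
```

Early probes spread across the grid where everything is uncertain; once the posterior concentrates, the design keeps probing the flanks of the detection kernel, which is what sharpens the localization.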
Impact angle control of interplanetary shock geoeffectiveness: A statistical study
NASA Astrophysics Data System (ADS)
Oliveira, Denny M.; Raeder, Joachim
2015-06-01
We present a survey of interplanetary (IP) shocks using Wind and ACE satellite data from January 1995 to December 2013 to study how IP shock geoeffectiveness is controlled by IP shock impact angles. A shock list covering one and a half solar cycles is compiled. The yearly number of IP shocks is found to correlate well with the monthly sunspot number. We use data from SuperMAG, a large chain of more than 300 geomagnetic stations, to study geoeffectiveness triggered by IP shocks. The SuperMAG SML index, an enhanced version of the familiar AL index, is used in our statistical analysis. The jumps of the SML index triggered by IP shock impacts on the Earth's magnetosphere are investigated in terms of IP shock orientation and speed. We find that, in general, strong (high-speed) and almost frontal (small impact angle) shocks are more geoeffective than inclined shocks with low speed. The strongest correlation (correlation coefficient R = 0.78) occurs for fixed IP shock speed and varied IP shock impact angle. We attribute this result, predicted previously with simulations, to the fact that frontal shocks compress the magnetosphere symmetrically from all sides, a favorable condition for the release of magnetic energy stored in the magnetotail, which in turn can produce moderate to strong auroral substorms that are then observed by ground-based magnetometers.
A Statistical Process Control Method for Semiconductor Manufacturing
NASA Astrophysics Data System (ADS)
Kubo, Tomoaki; Ino, Tomomi; Minami, Kazuhiro; Minami, Masateru; Homma, Tetsuya
To maintain stable operation of semiconductor fabrication lines, statistical process control (SPC) methods are recognized to be effective. However, in semiconductor fabrication lines there exists a huge number of process state signals to be monitored, and these signals contain both normally and non-normally distributed data. Therefore, if we try to apply SPC methods to those signals, we need one which satisfies three requirements: 1) it can deal with both normally and non-normally distributed data, 2) it can be set up automatically, 3) it can be easily understood by engineers and technicians. In this paper, we propose a new SPC method which satisfies these three requirements at the same time. This method uses rules similar to the Shewhart chart, but can deal with non-normally distributed data by introducing “effective standard deviations”. Usefulness of this method is demonstrated by comparing false alarm ratios to those of the Shewhart chart method. In the demonstration, we use various kinds of artificially generated data, and real data observed in a chemical vapor deposition (CVD) process tool in a semiconductor fabrication line.
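The "effective standard deviations" idea can be sketched with an assumed quantile-based construction (the abstract does not spell out its formula): take the distances from the median to the ~15.9th and ~84.1st percentiles, which both reduce to sigma when the data are normal, and build asymmetric Shewhart-like limits from them.

```python
import math, statistics

def effective_sigmas(baseline):
    """Asymmetric 'effective standard deviations' from empirical quantiles
    (an assumed construction, not necessarily the authors' definition)."""
    s = sorted(baseline)
    q = lambda f: s[min(len(s) - 1, int(f * len(s)))]
    med = statistics.median(s)
    return med, med - q(0.1587), q(0.8413) - med

def control_limits(baseline, k=3.0):
    """Shewhart-like limits that widen on the heavy side of a skewed signal."""
    med, sig_lo, sig_hi = effective_sigmas(baseline)
    return med - k * sig_lo, med + k * sig_hi

# A skewed signal (exact Exp(1) quantiles stand in for, e.g., particle counts).
skewed = [-math.log(1 - (i + 0.5) / 1000) for i in range(1000)]
med, sig_lo, sig_hi = effective_sigmas(skewed)
lcl, ucl = control_limits(skewed)
```

For the skewed example the upper effective sigma is more than twice the lower one, so the upper limit sits far from the median where the long tail lives; a plain Shewhart chart with one symmetric sigma would alarm falsely on that tail.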
Statistical models for the control phase of clinical monitoring.
Stevens, Richard J; Oke, Jason; Perera, Rafael
2010-08-01
The rise in the prevalence of chronic conditions means that these are now the leading causes of death and disability worldwide, accounting for almost 60% of all deaths and 43% of the global burden of disease. Management of chronic conditions requires both effective treatment and ongoing monitoring. Although costs related to monitoring are substantial, there is relatively little evidence on its effectiveness. Monitoring is inherently different from diagnosis in its use of regularly repeated tests, and increasing frequency can result in poorer rather than better statistical properties because of multiple testing in the presence of high variability. We present here a general framework for modelling the control phase of a monitoring programme, and for the estimation of quantities of potential clinical interest such as the ratio of false to true positive tests. We show how four recent clinical studies of monitoring cardiovascular disease, hypertension, diabetes and HIV infection can be thought of as special cases of this framework, as well as using this framework to clarify the choice of estimation and calculation methods available. Notably, in each of the presented examples over-frequent monitoring appears to be a greater problem than under-frequent monitoring. We also present recalculations of results under alternative conditions, illustrating conceptual decisions about modelling the true or observed value of a clinical measure. PMID:20442195
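The over-frequent-monitoring point follows from the arithmetic of repeated testing: with a per-test false positive rate alpha, the chance of at least one false alarm on a truly stable patient grows as 1 - (1 - alpha)^k over k visits. The rates and schedules below are chosen purely for illustration.

```python
def false_alarm_prob(alpha, k):
    """Probability of at least one false positive across k repeated tests
    on a patient whose condition is truly stable."""
    return 1 - (1 - alpha) ** k

# Annual vs quarterly review over five years at a nominal 5% per-test rate.
annual = false_alarm_prob(0.05, 5)      # ~0.23
quarterly = false_alarm_prob(0.05, 20)  # ~0.64
```

Quadrupling the testing frequency here nearly triples the cumulative false alarm probability without any change in the underlying disease, which is why the ratio of false to true positives is the quantity the framework targets.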
UNIFORMLY MOST POWERFUL BAYESIAN TESTS
Johnson, Valen E.
2014-01-01
Uniformly most powerful tests are statistical hypothesis tests that provide the greatest power against a fixed null hypothesis among all tests of a given size. In this article, the notion of uniformly most powerful tests is extended to the Bayesian setting by defining uniformly most powerful Bayesian tests to be tests that maximize the probability that the Bayes factor, in favor of the alternative hypothesis, exceeds a specified threshold. Like their classical counterpart, uniformly most powerful Bayesian tests are most easily defined in one-parameter exponential family models, although extensions outside of this class are possible. The connection between uniformly most powerful tests and uniformly most powerful Bayesian tests can be used to provide an approximate calibration between p-values and Bayes factors. Finally, issues regarding the strong dependence of resulting Bayes factors and p-values on sample size are discussed. PMID:24659829
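In the normal-mean case with known sigma (a one-parameter exponential family), the UMPBT(γ) alternative has a closed form, mu1 = sigma * sqrt(2 ln(γ) / n). The scan below verifies numerically that this alternative minimizes the sample-mean threshold at which the Bayes factor exceeds γ, i.e. it maximizes P(BF > γ) whatever the true effect. This is a sketch of the setting the abstract describes, with parameter values chosen for illustration.

```python
import math

def xbar_threshold(mu1, n, sigma, gamma):
    """Smallest sample mean at which BF for H1: mu = mu1 vs H0: mu = 0
    exceeds gamma, since BF = exp(n*(mu1*xbar - mu1**2/2)/sigma**2)."""
    return mu1 / 2 + sigma ** 2 * math.log(gamma) / (n * mu1)

n, sigma, gamma = 25, 1.0, 10.0
# Scan alternatives: the UMPBT alternative minimizes the threshold.
grid = [i / 1000 for i in range(1, 2001)]
best = min(grid, key=lambda m: xbar_threshold(m, n, sigma, gamma))
closed_form = sigma * math.sqrt(2 * math.log(gamma) / n)
```

At the minimizing alternative the threshold equals mu1 itself, so BF > γ exactly when the z statistic exceeds sqrt(2 ln γ); for γ = 10 that is z > about 2.15, roughly a one-sided p-value of 0.016, which is the kind of p-value/Bayes-factor calibration the abstract mentions.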
2012-01-01
Background A statistical analysis plan (SAP) is a critical link between how a clinical trial is conducted and the clinical study report. To secure objective study results, regulatory bodies expect that the SAP will meet requirements in pre-specifying inferential analyses and other important statistical techniques. To write a good SAP for model-based sensitivity and ancillary analyses involves non-trivial decisions on and justification of many aspects of the chosen setting. In particular, trials with longitudinal count data as primary endpoints pose challenges for model choice and model validation. In the random effects setting, frequentist strategies for model assessment and model diagnosis are complex and not easily implemented and have several limitations. Therefore, it is of interest to explore Bayesian alternatives which provide the needed decision support to finalize a SAP. Methods We focus on generalized linear mixed models (GLMMs) for the analysis of longitudinal count data. A series of distributions with over- and under-dispersion is considered. Additionally, the structure of the variance components is modified. We perform a simulation study to investigate the discriminatory power of Bayesian tools for model criticism in different scenarios derived from the model setting. We apply the findings to the data from an open clinical trial on vertigo attacks. These data are seen as pilot data for an ongoing phase III trial. To fit GLMMs we use a novel Bayesian computational approach based on integrated nested Laplace approximations (INLAs). The INLA methodology enables the direct computation of leave-one-out predictive distributions. These distributions are crucial for Bayesian model assessment. We evaluate competing GLMMs for longitudinal count data according to the deviance information criterion (DIC) or probability integral transform (PIT), and by using proper scoring rules (e.g. the logarithmic score). Results The instruments under study provide excellent
NASA Technical Reports Server (NTRS)
Gupta, Pramod; Guenther, Kurt; Hodgkinson, John; Jacklin, Stephen; Richard, Michael; Schumann, Johann; Soares, Fola
2005-01-01
Modern exploration missions require modern control systems: control systems that can handle catastrophic changes in the system's behavior, compensate for slow deterioration during sustained operation, and support fast system identification. Adaptive controllers based upon neural networks have these capabilities, but they can only be used safely if proper verification and validation (V&V) can be done. In this paper we present our V&V approach and simulation results within NASA's Intelligent Flight Control Systems (IFCS).
Bayesian Magic in Asteroseismology
NASA Astrophysics Data System (ADS)
Kallinger, T.
2015-09-01
Only a few years ago asteroseismic observations were so rare that scientists had plenty of time to work on individual data sets. They could tune their algorithms in any possible way to squeeze out the last bit of information. Nowadays this is impossible. With missions like MOST, CoRoT, and Kepler we basically drown in new data every day. To handle this in a sufficient way statistical methods become more and more important. This is why Bayesian techniques started their triumph march across asteroseismology. I will go with you on a journey through Bayesian Magic Land, that brings us to the sea of granulation background, the forest of peakbagging, and the stony alley of model comparison.
Jow, Howsun; Boys, Richard J; Wilkinson, Darren J
2014-10-01
In this paper we develop a Bayesian statistical inference approach to the unified analysis of isobaric labelled MS/MS proteomic data across multiple experiments. An explicit probabilistic model of the log-intensity of the isobaric labels' reporter ions across multiple pre-defined groups and experiments is developed. This is then used to develop a full Bayesian statistical methodology for the identification of differentially expressed proteins, with respect to a control group, across multiple groups and experiments. This methodology is implemented and then evaluated on simulated data and on two model experimental datasets (for which the differentially expressed proteins are known) that use a TMT labelling protocol. PMID:25153608
NASA Technical Reports Server (NTRS)
Gupta, Pramod; Jacklin, Stephen; Schumann, Johann; Guenther, Kurt; Richard, Michael; Soares, Fola
2005-01-01
Modern aircraft, UAVs, and robotic spacecraft pose substantial requirements on controllers in the light of ever-increasing demands for reusability, affordability, and reliability. The individual systems (which are often nonlinear) must be controlled safely and reliably in environments where it is virtually impossible to analyze, ahead of time, all the important and possible scenarios and environmental factors. For example, system components (e.g., gyros, bearings of reaction wheels, valves) may deteriorate or break during autonomous UAV operation or long-lasting space missions, leading to a sudden, drastic change in vehicle performance. Manual repair or replacement is not an option in such cases. Instead, the system must be able to cope with equipment failure and deterioration. Controllability of the system must be retained as well as possible, or re-established as fast as possible, with a minimum of deactivation or shutdown of the system being controlled. In such situations the control engineer has to employ adaptive control systems that automatically sense and correct themselves whenever drastic disturbances and/or severe changes in the plant or environment occur.
Bayesian Analysis of Individual Level Personality Dynamics
Cripps, Edward; Wood, Robert E.; Beckmann, Nadin; Lau, John; Beckmann, Jens F.; Cripps, Sally Ann
2016-01-01
A Bayesian technique with analyses of within-person processes at the level of the individual is presented. The approach is used to examine whether the patterns of within-person responses on a 12-trial simulation task are consistent with the predictions of ITA theory (Dweck, 1999). ITA theory states that the performance of an individual with an entity theory of ability is more likely to spiral down following a failure experience than the performance of an individual with an incremental theory of ability. This is because entity theorists interpret failure experiences as evidence of a lack of ability which they believe is largely innate and therefore relatively fixed; whilst incremental theorists believe in the malleability of abilities and interpret failure experiences as evidence of more controllable factors such as poor strategy or lack of effort. The results of our analyses support ITA theory at both the within- and between-person levels of analyses and demonstrate the benefits of Bayesian techniques for the analysis of within-person processes. These include more formal specification of the theory and the ability to draw inferences about each individual, which allows for more nuanced interpretations of individuals within a personality category, such as differences in the individual probabilities of spiraling. While Bayesian techniques have many potential advantages for the analyses of processes at the level of the individual, ease of use is not one of them for psychologists trained in traditional frequentist statistical techniques. PMID:27486415
A Gentle Introduction to Bayesian Analysis: Applications to Developmental Research
van de Schoot, Rens; Kaplan, David; Denissen, Jaap; Asendorpf, Jens B; Neyer, Franz J; van Aken, Marcel AG
2014-01-01
Bayesian statistical methods are becoming ever more popular in applied and fundamental research. In this study a gentle introduction to Bayesian analysis is provided. It is shown under what circumstances it is attractive to use Bayesian estimation, and how to interpret properly the results. First, the ingredients underlying Bayesian methods are introduced using a simplified example. Thereafter, the advantages and pitfalls of the specification of prior knowledge are discussed. To illustrate Bayesian methods explained in this study, in a second example a series of studies that examine the theoretical framework of dynamic interactionism are considered. In the Discussion the advantages and disadvantages of using Bayesian statistics are reviewed, and guidelines on how to report on Bayesian statistics are provided. PMID:24116396
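The prior-to-posterior updating that such introductions usually begin with can be sketched with the standard beta-binomial conjugate pair; the prior and the counts below are hypothetical illustrations, not taken from the study.

```python
# A Beta(a, b) prior on a success probability combined with k successes in
# n binomial trials gives a Beta(a + k, b + n - k) posterior in closed form.
def beta_binomial_update(a, b, k, n):
    return a + k, b + n - k

def beta_mean(a, b):
    return a / (a + b)

# Hypothetical example: a uniform Beta(1, 1) prior and 7 successes in 10 trials.
a_post, b_post = beta_binomial_update(1, 1, 7, 10)
print(a_post, b_post)                       # -> 8 4
print(round(beta_mean(a_post, b_post), 3))  # -> 0.667
```

The posterior mean 0.667 sits between the prior mean 0.5 and the sample proportion 0.7, the shrinkage behaviour such introductions emphasize.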
Bayesian stable isotope mixing models
In this paper we review recent advances in Stable Isotope Mixing Models (SIMMs) and place them into an over-arching Bayesian statistical framework which allows for several useful extensions. SIMMs are used to quantify the proportional contributions of various sources to a mixtur...
Bayesian inference for an emerging arboreal epidemic in the presence of control.
Parry, Matthew; Gibson, Gavin J; Parnell, Stephen; Gottwald, Tim R; Irey, Michael S; Gast, Timothy C; Gilligan, Christopher A
2014-04-29
The spread of Huanglongbing through citrus groves is used as a case study for modeling an emerging epidemic in the presence of a control. Specifically, the spread of the disease is modeled as a susceptible-exposed-infectious-detected-removed epidemic, where the exposure and infectious times are not observed, detection times are censored, removal times are known, and the disease is spreading through a heterogeneous host population with trees of different age and susceptibility. We show that it is possible to characterize the disease transmission process under these conditions. Two innovations in our work are (i) accounting for control measures via time dependence of the infectious process and (ii) including seasonal and host age effects in the model of the latent period. By estimating parameters in different subregions of a large commercially cultivated orchard, we establish a temporal pattern of invasion, host age dependence of the dispersal parameters, and a close to linear relationship between primary and secondary infectious rates. The model can be used to simulate Huanglongbing epidemics to assess economic costs and potential benefits of putative control scenarios. PMID:24711393
Statistical Approach to Quality Control of Large Thermodynamic Databases
NASA Astrophysics Data System (ADS)
Nyman, Henrik; Talonen, Tarja; Roine, Antti; Hupa, Mikko; Corander, Jukka
2012-10-01
In chemistry and engineering, thermodynamic databases are widely used to obtain the basic properties of pure substances or mixtures. Large and reliable databases are the basis of all thermodynamic modeling of complex chemical processes or systems. However, the effort needed in the establishment, maintenance, and management of a database increases exponentially along with the size and scope of the database. Therefore, we developed a statistical modeling approach to assist an expert in the evaluation and management process, which can pinpoint various types of erroneous records in a database. We have applied this method to investigate the enthalpy, entropy, and heat capacity characteristics in a large commercial database for approximately 25,000 chemical species. Our highly successful results show that a statistical approach is a valuable tool (1) for managing such databases and (2) for creating enthalpy, entropy, and heat capacity estimates for species for which thermochemical data are not available.
Bayesian Integrated Microbial Forensics
Jarman, Kristin H.; Kreuzer-Martin, Helen W.; Wunschel, David S.; Valentine, Nancy B.; Cliff, John B.; Petersen, Catherine E.; Colburn, Heather A.; Wahl, Karen L.
2008-06-01
In the aftermath of the 2001 anthrax letters, researchers have been exploring ways to predict the production environment of unknown source microorganisms. Different mass spectral techniques are being developed to characterize components of a microbe’s culture medium including water, carbon and nitrogen sources, metal ions added, and the presence of agar. Individually, each technique has the potential to identify one or two ingredients in a culture medium recipe. However, by integrating data from multiple mass spectral techniques, a more complete characterization is possible. We present a Bayesian statistical approach to integrated microbial forensics and illustrate its application on spores grown in different culture media.
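Under a strong conditional-independence assumption, this kind of evidence integration reduces to multiplying likelihood ratios into the prior odds; the hypothesis and the likelihood values below are hypothetical illustrations, not the paper's model.

```python
# Combine independent lines of evidence for hypothesis H via Bayes' rule:
# posterior odds = prior odds * product of likelihood ratios.
def posterior(prior, likelihoods_h, likelihoods_not_h):
    odds = prior / (1.0 - prior)
    for lh, ln in zip(likelihoods_h, likelihoods_not_h):
        odds *= lh / ln
    return odds / (1.0 + odds)

# Three mass-spectral techniques, each weakly favouring a hypothetical
# culture-medium hypothesis H (e.g., "grown on agar"):
p = posterior(0.5, [0.8, 0.7, 0.6], [0.4, 0.5, 0.5])
print(round(p, 3))  # -> 0.771
```

Each technique alone is weak evidence, but the product of the three likelihood ratios raises a 50% prior to about 77%, which is the point of integrating the techniques.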
The application of statistical process control to the development of CIS-based photovoltaics
NASA Astrophysics Data System (ADS)
Wieting, R. D.
1996-01-01
This paper reviews the application of Statistical Process Control (SPC) as well as other statistical methods to the development of thin film CuInSe2-based module fabrication processes. These methods have rigorously demonstrated the reproducibility of a number of individual process steps in module fabrication and led to the identification of previously unrecognized sources of process variation. A process exhibiting good statistical control with 11.4% mean module efficiency has been demonstrated.
Bayesian image reconstruction in astronomy
NASA Astrophysics Data System (ADS)
Nunez, Jorge; Llacer, Jorge
1990-09-01
This paper presents the development and testing of a new iterative reconstruction algorithm for astronomy. A maximum a posteriori method of image reconstruction in the Bayesian statistical framework is proposed for the Poisson-noise case. The method uses the entropy with an adjustable 'sharpness parameter' to define the prior probability and the likelihood with 'data increment' parameters to define the conditional probability. The method makes it possible to obtain reconstructions with neither the problem of the 'grey' reconstructions associated with the pure Bayesian reconstructions nor the problem of image deterioration, typical of the maximum-likelihood method. The present iterative algorithm is fast and stable, maintains positivity, and converges to feasible images.
Statistical Process Control Charts for Measuring and Monitoring Temporal Consistency of Ratings
ERIC Educational Resources Information Center
Omar, M. Hafidz
2010-01-01
Methods of statistical process control were briefly investigated in the field of educational measurement as early as 1999. However, only the use of a cumulative sum chart was explored. In this article other methods of statistical quality control are introduced and explored. In particular, methods in the form of Shewhart mean and standard deviation…
Using Statistical Process Control to Make Data-Based Clinical Decisions.
ERIC Educational Resources Information Center
Pfadt, Al; Wheeler, Donald J.
1995-01-01
Statistical process control (SPC), which employs simple statistical tools and problem-solving techniques such as histograms, control charts, flow charts, and Pareto charts to implement continual product improvement procedures, can be incorporated into human service organizations. Examples illustrate use of SPC procedures to analyze behavioral data…
Statistical process control (SPC) for coordinate measurement machines
Escher, R.N.
2000-01-04
The application of process capability analysis, using designed experiments, and gage capability studies as they apply to coordinate measurement machine (CMM) uncertainty analysis and control will be demonstrated. The use of control standards in designed experiments, and the use of range charts and moving range charts to separate measurement error into its discrete components, will be discussed. The method used to monitor and analyze the components of repeatability and reproducibility will be presented, with specific emphasis on how to use control charts to determine and monitor CMM performance and capability and stay within stated uncertainty assumptions.
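The individuals and moving-range charts mentioned here can be sketched with the standard constants for moving ranges of size two (d2 = 1.128, D4 = 3.267); the measurements below are hypothetical, not CMM data from this work.

```python
# Individuals (X) and moving-range (MR) control limits from a run of single
# measurements; sigma is estimated as MR-bar / d2 with d2 = 1.128 for n = 2.
def imr_limits(xs, d2=1.128, d4=3.267):
    mrs = [abs(b - a) for a, b in zip(xs, xs[1:])]
    mr_bar = sum(mrs) / len(mrs)
    x_bar = sum(xs) / len(xs)
    sigma = mr_bar / d2
    return (x_bar - 3 * sigma, x_bar + 3 * sigma), (0.0, d4 * mr_bar)

measurements = [10.01, 10.03, 9.98, 10.02, 10.00, 9.99]  # hypothetical (mm)
(x_lcl, x_ucl), (mr_lcl, mr_ucl) = imr_limits(measurements)
print(round(x_lcl, 3), round(x_ucl, 3), round(mr_ucl, 3))
```

Points outside the X limits indicate a shift in the measurement process; moving ranges above the MR upper limit indicate inflated short-term variation.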
Statistical analysis of static shape control in space structures
NASA Technical Reports Server (NTRS)
Burdisso, Ricardo A.; Haftka, Raphael T.
1990-01-01
The article addresses the problem of efficient analysis of the statistics of initial and corrected shape distortions in space structures. Two approaches for improving efficiency are considered. One is an adjoint technique for calculating distortion shapes; the second is a modal expansion of distortion shapes in terms of pseudo-vibration modes. The two techniques are applied to the problem of optimizing actuator locations on a 55 m radiometer antenna. The adjoint analysis technique is used with a discrete-variable optimization method. The modal approximation technique is coupled with a standard conjugate-gradient continuous optimization method. The agreement between the two sets of results is good, validating both the approximate analysis and the optimality of the results.
Statistical methodologies for the control of dynamic remapping
NASA Technical Reports Server (NTRS)
Saltz, J. H.; Nicol, D. M.
1986-01-01
Following an initial mapping of a problem onto a multiprocessor machine or computer network, system performance often deteriorates with time. In order to maintain high performance, it may be necessary to remap the problem. The decision to remap must take into account measurements of performance deterioration, the cost of remapping, and the estimated benefits achieved by remapping. We examine the tradeoff between the costs and the benefits of remapping two qualitatively different kinds of problems. One problem assumes that performance deteriorates gradually, the other assumes that performance deteriorates suddenly. We consider a variety of policies for governing when to remap. In order to evaluate these policies, statistical models of problem behaviors are developed. Simulation results are presented which compare simple policies with computationally expensive optimal decision policies; these results demonstrate that for each problem type, the proposed simple policies are effective and robust.
Simultaneous Bayesian analysis of contingency tables in genetic association studies.
Dickhaus, Thorsten
2015-08-01
Genetic association studies lead to simultaneous categorical data analysis. The sample for every genetic locus consists of a contingency table containing the numbers of observed genotype-phenotype combinations. Under case-control design, the row counts of every table are identical and fixed, while column counts are random. The aim of the statistical analysis is to test independence of the phenotype and the genotype at every locus. We present an objective Bayesian methodology for these association tests, which relies on the conjugacy of Dirichlet and multinomial distributions. Being based on the likelihood principle, the Bayesian tests avoid looping over all tables with given marginals. Making use of data generated by The Wellcome Trust Case Control Consortium (WTCCC), we illustrate that the ordering of the Bayes factors shows a good agreement with that of frequentist p-values. Furthermore, we deal with specifying prior probabilities for the validity of the null hypotheses, by taking linkage disequilibrium structure into account and exploiting the concept of effective numbers of tests. Application of a Bayesian decision theoretic multiple test procedure to the WTCCC data illustrates the proposed methodology. Finally, we discuss two methods for reconciling frequentist and Bayesian approaches to the multiple association test problem. PMID:26215535
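The conjugacy argument can be sketched as follows. This is a simplified formulation (symmetric Dirichlet(1) priors and a Bayes factor comparing row-specific versus shared genotype distributions), not the paper's exact objective-Bayes construction, and the counts are hypothetical.

```python
from math import lgamma

def log_dirichlet_mult(counts, alpha=1.0):
    # Log of the Dirichlet-multinomial integral of prod_j p_j^{n_j} * Dir(p | alpha) dp.
    # The multinomial coefficient is omitted: it is identical under both
    # hypotheses below and cancels in the Bayes factor.
    a = [alpha] * len(counts)
    big_a, big_n = sum(a), sum(counts)
    return (sum(lgamma(ai + ni) for ai, ni in zip(a, counts)) - lgamma(big_a + big_n)
            - sum(lgamma(ai) for ai in a) + lgamma(big_a))

def log_bf_association(cases, controls, alpha=1.0):
    # H1: cases and controls draw genotypes from separate distributions;
    # H0: both rows share a single distribution (independence).
    pooled = [c + d for c, d in zip(cases, controls)]
    return (log_dirichlet_mult(cases, alpha) + log_dirichlet_mult(controls, alpha)
            - log_dirichlet_mult(pooled, alpha))

# Hypothetical 2x3 genotype table (AA, Aa, aa):
print(round(log_bf_association([50, 35, 15], [30, 45, 25]), 2))
```

Because both marginal likelihoods are available in closed form, no looping over tables with fixed margins is needed, which is the computational point the abstract makes.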
Woldegebriel, Michael; Zomer, Paul; Mol, Hans G J; Vivó-Truyols, Gabriel
2016-08-01
In this work, we introduce an automated, efficient, and elegant model to combine all pieces of evidence (e.g., expected retention times, peak shapes, isotope distributions, fragment-to-parent ratio) obtained from liquid chromatography-tandem mass spectrometry (LC-MS/MS) data for screening purposes. Combining all these pieces of evidence requires a careful assessment of the uncertainties in the analytical system as well as all possible outcomes. To date, the majority of the existing algorithms are highly dependent on user input parameters. Additionally, the screening process is tackled as a deterministic problem. In this work we present a Bayesian framework to deal with the combination of all these pieces of evidence. Contrary to conventional algorithms, the information is treated in a probabilistic way, and a final probability assessment of the presence/absence of a compound feature is computed. Additionally, all the necessary parameters for the method, except the chromatographic band broadening, are learned from the data in the training and learning phase of the algorithm, avoiding the introduction of a large number of user-defined parameters. The proposed method was validated with a large data set and has shown improved sensitivity and specificity in comparison to a threshold-based commercial software package. PMID:27391247
Methods of Statistical Control for Groundwater Quality Indicators
NASA Astrophysics Data System (ADS)
Yankovich, E.; Nevidimova, O.; Yankovich, K.
2016-06-01
The article describes the results of groundwater quality control. Controlled quality indicators included the following microelements: barium, manganese, iron, mercury, iodine, chromium, strontium, etc. Quality control charts - an X-bar chart and an R chart - were built. For the upper and lower threshold limits, the maximum permissible concentration of components in water and the lower limit of their biologically significant concentration, respectively, were selected. The chart analysis has shown that the levels of microelement content in water in the area of study are stable. Most elements in the groundwater are present in concentrations that are biologically significant for the people consuming the water. For example, elements such as Ba, Mn, and Fe have concentrations that exceed maximum permissible levels for drinking water.
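The X-bar and R chart construction can be sketched with the standard subgroup constants (A2 = 0.577, D3 = 0, D4 = 2.114 for subgroups of five). Note that this study set its limits from permissible and biologically significant concentrations rather than from these formulas, and all values below are hypothetical.

```python
# Shewhart X-bar and R chart limits from rational subgroups of size 5.
def xbar_r_limits(subgroups, a2=0.577, d3=0.0, d4=2.114):
    xbars = [sum(g) / len(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    grand_mean = sum(xbars) / len(xbars)
    r_bar = sum(ranges) / len(ranges)
    return ((grand_mean - a2 * r_bar, grand_mean + a2 * r_bar),
            (d3 * r_bar, d4 * r_bar))

# Hypothetical manganese concentrations (mg/L), four subgroups of five samples:
groups = [[0.11, 0.12, 0.10, 0.13, 0.11],
          [0.12, 0.14, 0.11, 0.12, 0.13],
          [0.10, 0.11, 0.12, 0.11, 0.10],
          [0.13, 0.12, 0.14, 0.12, 0.11]]
(x_lcl, x_ucl), (r_lcl, r_ucl) = xbar_r_limits(groups)
print(round(x_lcl, 4), round(x_ucl, 4), round(r_ucl, 4))
```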
Nonparametric Bayesian evaluation of differential protein quantification
Cansizoglu, A. Ertugrul; Käll, Lukas; Steen, Hanno
2013-01-01
Arbitrary cutoffs are ubiquitous in quantitative computational proteomics: maximum acceptable MS/MS PSM or peptide q-value, minimum ion intensity to calculate a fold change, the minimum number of peptides that must be available to trust the estimated protein fold change (or the minimum number of PSMs that must be available to trust the estimated peptide fold change), and the "significant" fold change cutoff. Here we introduce a novel experimental setup and nonparametric Bayesian algorithm for determining the statistical quality of a proposed differential set of proteins or peptides. By comparing putatively non-changing case-control evidence to an empirical null distribution derived from a control-control experiment, we successfully avoid some of these common parameters. We then apply our method to evaluating different fold change rules and find that, for our data, a 1.2-fold change is the most permissive of the plausible fold change rules. PMID:24024742
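The empirical-null idea can be sketched as: compute fold changes in a control-control comparison where nothing should truly change, read off a high quantile, and flag case-control proteins beyond it. The quantile choice, the data, and the protein names below are hypothetical illustrations, not the paper's nonparametric algorithm.

```python
# Linear-interpolation quantile of a sample (the common "linear" definition
# used by most statistics packages).
def quantile(xs, q):
    ys = sorted(xs)
    idx = (len(ys) - 1) * q
    lo = int(idx)
    hi = min(lo + 1, len(ys) - 1)
    return ys[lo] + (ys[hi] - ys[lo]) * (idx - lo)

# Absolute log-fold changes from a control-control run (hypothetical):
null_abs_lfc = [0.05, 0.10, 0.02, 0.08, 0.12, 0.07, 0.03, 0.09, 0.11, 0.06]
cutoff = quantile(null_abs_lfc, 0.95)

# Case-control log-fold changes for three hypothetical proteins:
case_lfc = {"protA": 0.04, "protB": 0.31, "protC": -0.15}
flagged = [name for name, v in case_lfc.items() if abs(v) > cutoff]
print(round(cutoff, 4), flagged)
```

Instead of an arbitrary fixed fold-change rule, the threshold here is derived from the variability the experiment itself exhibits when nothing changes.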
NASA Astrophysics Data System (ADS)
Olivares, G.; Teferle, F. N.
2013-12-01
Geodetic time series provide information which helps to constrain theoretical models of geophysical processes. It is well established that such time series, for example from GPS, superconducting gravity or mean sea level (MSL), contain time-correlated noise which is usually assumed to be a combination of a long-term stochastic process (characterized by a power-law spectrum) and random noise. Therefore, when fitting a model to geodetic time series it is essential to also estimate the stochastic parameters besides the deterministic ones. Often the stochastic parameters include the power amplitudes of both time-correlated and random noise, as well as the spectral index of the power-law process. To date, the most widely used method for obtaining these parameter estimates is based on maximum likelihood estimation (MLE). We present an integration method, the Bayesian Monte Carlo Markov Chain (MCMC) method, which, by using Markov chains, provides a sample of the posterior distribution of all parameters and, thereby, using Monte Carlo integration, all parameters and their uncertainties are estimated simultaneously. This algorithm automatically optimizes the Markov chain step size and estimates the convergence state by spectral analysis of the chain. We assess the MCMC method through comparison with MLE, using the recently released GPS position time series from JPL, and apply it also to the MSL time series from the Revised Local Reference database of the PSMSL. Although the parameter estimates for both methods are fairly equivalent, they suggest that the MCMC method has some advantages over MLE: for example, without further computations it provides the spectral index uncertainty, is computationally stable, and detects multimodality.
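A toy Metropolis sampler conveys the mechanics of the MCMC approach. This sketch estimates only a constant level under known white noise with a flat prior, whereas the geodetic application also samples power-law amplitude and spectral-index parameters; the data and tuning values are hypothetical.

```python
import math
import random

# Log-posterior of a constant level mu given Gaussian observations with
# known sigma and a flat prior (illustration only).
def log_post(mu, data, sigma=1.0):
    return -sum((x - mu) ** 2 for x in data) / (2 * sigma ** 2)

def metropolis(data, n_steps=20000, step=0.5, seed=1):
    rng = random.Random(seed)
    mu, chain = 0.0, []
    for _ in range(n_steps):
        prop = mu + rng.gauss(0, step)           # symmetric random-walk proposal
        if math.log(rng.random()) < log_post(prop, data) - log_post(mu, data):
            mu = prop                            # accept
        chain.append(mu)
    return chain

data = [2.1, 1.9, 2.3, 2.0, 1.8, 2.2]            # hypothetical observations
chain = metropolis(data)
posterior_mean = sum(chain[5000:]) / len(chain[5000:])
print(round(posterior_mean, 2))                  # close to the sample mean 2.05
```

The discarded first 5000 draws are burn-in; the retained chain approximates the posterior, so the mean and any uncertainty interval come from the same sample, which is the advantage over MLE noted in the abstract.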
Bayesian diagnostic theory using a programmable pocket calculator.
Edwards, F H; Graeber, G M
1987-01-01
A programmable pocket calculator program has been written to serve as an aid in diagnosis. The program uses a Bayesian statistical algorithm to calculate the relative probability of two diagnostic alternatives. The ability to carry out Bayesian statistical calculations at the bedside should make the use of such techniques more attractive to clinical practitioners. PMID:3319380
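The core calculation such a program performs is a one-line application of Bayes' rule to two mutually exclusive diagnoses; the prior and the finding frequencies below are hypothetical.

```python
# Relative probability of diagnosis A versus B after observing a finding:
# P(A | finding) = P(A)P(finding|A) / [P(A)P(finding|A) + P(B)P(finding|B)].
def relative_probability(prior_a, p_finding_given_a, p_finding_given_b):
    prior_b = 1.0 - prior_a
    num = prior_a * p_finding_given_a
    return num / (num + prior_b * p_finding_given_b)

# A finding present in 80% of disease A but only 10% of disease B,
# with a 30% prior probability of A:
print(round(relative_probability(0.3, 0.8, 0.1), 3))  # -> 0.774
```

A single discriminating finding raises the probability of A from 30% to about 77%, the kind of bedside update the calculator program automates.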
What Is the Probability You Are a Bayesian?
ERIC Educational Resources Information Center
Wulff, Shaun S.; Robinson, Timothy J.
2014-01-01
Bayesian methodology continues to be widely used in statistical applications. As a result, it is increasingly important to introduce students to Bayesian thinking at early stages in their mathematics and statistics education. While many students in upper level probability courses can recite the differences in the Frequentist and Bayesian…
Hierarchical Bayesian Spatio-Temporal Interpolation including Covariates
NASA Astrophysics Data System (ADS)
Hussain, Ijaz; Mohsin, Muhammad; Spoeck, Gunter; Pilz, Juergen
2010-05-01
The space-time interpolation of precipitation contributes significantly to river control, reservoir operations, forestry interests, flash flood watches, etc. Changes in environmental and spatial covariates make space-time estimation of precipitation a challenging task. In our earlier paper [1], we used a transformed hierarchical Bayesian space-time interpolation method for predicting the amount of precipitation. In the present paper, we modify the method of [2] to include covariates which vary with respect to space and time. The proposed method is applied to estimating space-time monthly precipitation in the monsoon periods during 1974 - 2000. The 27 years of monthly average data for precipitation, temperature, humidity and wind speed are obtained from 51 monitoring stations in Pakistan. The average monthly precipitation is used as the response variable, and temperature, humidity and wind speed are used as time-varying covariates. Moreover, the spatial covariates elevation, latitude and longitude of the same monitoring stations are also included. The cross-validation method is used to compare the results of transformed hierarchical Bayesian spatio-temporal interpolation with and without the environmental and spatial covariates. The software of [3] is modified to incorporate environmental and spatial covariates. It is observed that the transformed hierarchical Bayesian method including covariates provides more accuracy than the transformed hierarchical Bayesian method without covariates. Moreover, five potential monitoring sites are selected based on a maximum entropy sampling design approach. References [1] I. Hussain, J. Pilz, G. Spoeck and H.L. Yu. Spatio-Temporal Interpolation of Precipitation during Monsoon Periods in Pakistan. Submitted to Advances in Water Resources, 2009. [2] N.D. Le, W. Sun, and J.V. Zidek, Bayesian multivariate spatial interpolation with data missing by design. Journal of the Royal Statistical Society. Series B (Methodological
GASP cloud encounter statistics - Implications for laminar flow control flight
NASA Technical Reports Server (NTRS)
Jasperson, W. H.; Nastrom, G. D.; Davis, R. E.; Holdeman, J. D.
1984-01-01
The cloud observation archive from the NASA Global Atmospheric Sampling Program (GASP) is analyzed in order to derive the probability of cloud encounter at altitudes normally flown by commercial airliners, for application to a determination of the feasibility of Laminar Flow Control (LFC) on long-range routes. The probability of cloud encounter is found to vary significantly with season. Several meteorological circulation features are apparent in the latitudinal distribution of cloud cover. The cloud encounter data are shown to be consistent with the classical midlatitude cyclone model with more clouds encountered in highs than in lows. Aircraft measurements of route-averaged time-in-clouds fit a gamma probability distribution model, which is applied to estimate the probability of extended cloud encounter and the associated loss of LFC effectiveness along seven high-density routes. The probability is demonstrated to be low.
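For an integer (Erlang) shape parameter the gamma exceedance probability has a closed form, which is enough to sketch the extended-encounter calculation; the shape and scale below are hypothetical values, not the fitted GASP parameters.

```python
from math import exp, factorial

# Survival function of a gamma distribution with integer shape k (Erlang):
# P(T > t) = exp(-t/theta) * sum_{n=0}^{k-1} (t/theta)^n / n!
def gamma_exceedance(t, k, theta):
    z = t / theta
    return exp(-z) * sum(z ** n / factorial(n) for n in range(k))

# Hypothetical fit for the route-averaged fraction of time in cloud:
# shape k = 2, scale theta = 0.05. Probability the fraction exceeds 0.20:
print(round(gamma_exceedance(0.20, 2, 0.05), 4))  # -> 0.0916
```

Under these illustrative parameters an extended encounter (more than 20% of a route in cloud) has probability below 10%, mirroring the "probability is low" conclusion in form, not in the actual numbers.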
NASA Astrophysics Data System (ADS)
Isakson, Steve Wesley
2001-12-01
Well-known principles of physics explain why resolution restrictions occur in images produced by optical diffraction-limited systems. The limitations involved are present in all diffraction-limited imaging systems, including acoustical and microwave. In most circumstances, however, prior knowledge about the object and the imaging system can lead to resolution improvements. In this dissertation I outline a method to incorporate prior information into the process of reconstructing images to superresolve the object beyond the above limitations. This dissertation research develops the details of this methodology. The approach can provide the most-probable global solution employing a finite number of steps in both far-field and near-field images. In addition, in order to overcome the effects of noise present in any imaging system, this technique provides a weighted image that quantifies the likelihood of various imaging solutions. By utilizing Bayesian probability, the procedure is capable of incorporating prior information about both the object and the noise to overcome the resolution limitation present in many imaging systems. Finally I will present an imaging system capable of detecting the evanescent waves missing from far-field systems, thus improving the resolution further.
ERIC Educational Resources Information Center
Miller, John
1994-01-01
Presents an approach to document numbering, document titling, and process measurement which, when used with fundamental techniques of statistical process control, reveals meaningful process-element variation as well as nominal productivity models. (SR)
Application of statistical process control charts to monitor changes in animal production systems.
De Vries, A; Reneau, J K
2010-04-01
Statistical process control (SPC) is a method of monitoring, controlling, and improving a process through statistical analysis. An important SPC tool is the control chart, which can be used to detect changes in production processes, including animal production systems, with a statistical level of confidence. This paper introduces the philosophy and types of control charts, design and performance issues, and provides a review of control chart applications in animal production systems found in the literature from 1977 to 2009. Primarily Shewhart and cumulative sum control charts have been described in animal production systems, with examples found in poultry, swine, dairy, and beef production systems. Examples include monitoring of growth, disease incidence, water intake, milk production, and reproductive performance. Most applications describe charting outcome variables, but more examples of control charts applied to input variables are needed, such as compliance to protocols, feeding practice, diet composition, and environmental factors. Common challenges for applications in animal production systems are the identification of the best statistical model for the common cause variability, grouping of data, selection of type of control chart, the cost of false alarms and lack of signals, and difficulty identifying the special causes when a change is signaled. Nevertheless, carefully constructed control charts are powerful methods to monitor animal production systems. Control charts might also supplement randomized controlled trials. PMID:20081080
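A one-sided upper CUSUM chart, one of the two chart families reviewed above, can be sketched in a few lines; the allowance k, decision interval h, and standardized observations below are hypothetical.

```python
# One-sided upper CUSUM for detecting an upward mean shift in standardized
# observations; k is the allowance, h the decision interval.
def cusum_upper(xs, k=0.5, h=4.0):
    s, alarms = 0.0, []
    for i, x in enumerate(xs):
        s = max(0.0, s + x - k)   # accumulate deviations above the allowance
        if s > h:
            alarms.append(i)      # signal at observation i
            s = 0.0               # restart after the signal
    return alarms

in_control = [0.2, -0.3, 0.1, -0.1, 0.4, -0.2]
shifted = in_control + [1.5, 1.2, 1.8, 1.4, 1.6]  # upward shift from index 6
print(cusum_upper(in_control))  # -> []
print(cusum_upper(shifted))     # -> [10]
```

Because the statistic accumulates small deviations, the CUSUM signals sustained shifts (here at observation 10) that a Shewhart chart with 3-sigma limits might miss point by point.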
NASA Astrophysics Data System (ADS)
Gomes, Guilherme J. C.; Vrugt, Jasper A.; Vargas, Eurípedes A.
2016-04-01
The depth to bedrock controls a myriad of processes by influencing subsurface flow paths, erosion rates, soil moisture, and water uptake by plant roots. As hillslope interiors are very difficult and costly to illuminate and access, the topography of the bedrock surface is largely unknown. This essay is concerned with the prediction of spatial patterns in the depth to bedrock (DTB) using high-resolution topographic data, numerical modeling, and Bayesian analysis. Our DTB model builds on the bottom-up control on fresh-bedrock topography hypothesis of Rempe and Dietrich (2014) and includes a mass movement and bedrock-valley morphology term to extend the usefulness and general applicability of the model. We reconcile the DTB model with field observations using Bayesian analysis with the DREAM algorithm. We investigate explicitly the benefits of using spatially distributed parameter values to account implicitly, and in a relatively simple way, for rock mass heterogeneities that are very difficult, if not impossible, to characterize adequately in the field. We illustrate our method using an artificial data set of bedrock depth observations and then evaluate our DTB model with real-world data collected at the Papagaio river basin in Rio de Janeiro, Brazil. Our results demonstrate that the DTB model predicts accurately the observed bedrock depth data. The posterior mean DTB simulation is shown to be in good agreement with the measured data. The posterior prediction uncertainty of the DTB model can be propagated forward through hydromechanical models to derive probabilistic estimates of factors of safety.
Able, Charles M.; Bright, Megan; Frizzell, Bart
2013-03-01
Purpose: Statistical process control (SPC) is a quality control method used to ensure that a process is well controlled and operates with little variation. This study determined whether SPC was a viable technique for evaluating the proper operation of a high-dose-rate (HDR) brachytherapy treatment delivery system. Methods and Materials: A surrogate prostate patient was developed using Vyse ordnance gelatin. A total of 10 metal oxide semiconductor field-effect transistors (MOSFETs) were placed from prostate base to apex. Computed tomography guidance was used to accurately position the first detector in each train at the base. The plan consisted of 12 needles with 129 dwell positions delivering a prescribed peripheral dose of 200 cGy. Sixteen accurate treatment trials were delivered as planned. Subsequently, a number of treatments were delivered with errors introduced, including wrong patient, wrong source calibration, wrong connection sequence, single needle displaced inferiorly 5 mm, and entire implant displaced 2 mm and 4 mm inferiorly. Two process behavior charts (PBC), an individual and a moving range chart, were developed for each dosimeter location. Results: There were 4 false positives resulting from 160 measurements from 16 accurately delivered treatments. For the inaccurately delivered treatments, the PBC indicated that measurements made at the periphery and apex (regions of high-dose gradient) were much more sensitive to treatment delivery errors. All errors introduced were correctly identified by either the individual or the moving range PBC in the apex region. Measurements at the urethra and base were less sensitive to errors. Conclusions: SPC is a viable method for assessing the quality of HDR treatment delivery. Further development is necessary to determine the most effective dose sampling, to ensure reproducible evaluation of treatment delivery accuracy.
Bayesian structural equation modeling in sport and exercise psychology.
Stenling, Andreas; Ivarsson, Andreas; Johnson, Urban; Lindwall, Magnus
2015-08-01
Bayesian statistics is on the rise in mainstream psychology, but applications in sport and exercise psychology research are scarce. In this article, the foundations of Bayesian analysis are introduced, and we will illustrate how to apply Bayesian structural equation modeling in a sport and exercise psychology setting. More specifically, we contrasted a confirmatory factor analysis on the Sport Motivation Scale II estimated with the most commonly used estimator, maximum likelihood, and a Bayesian approach with weakly informative priors for cross-loadings and correlated residuals. The results indicated that the model with Bayesian estimation and weakly informative priors provided a good fit to the data, whereas the model estimated with a maximum likelihood estimator did not produce a well-fitting model. The reasons for this discrepancy between maximum likelihood and Bayesian estimation are discussed as well as potential advantages and caveats with the Bayesian approach. PMID:26442771
Bayesian analysis for kaon photoproduction
Marsainy, T.; Mart, T.
2014-09-25
We have investigated the contribution of the nucleon resonances in the kaon photoproduction process by using an established statistical decision making method, i.e. the Bayesian method. This method not only evaluates the model over its entire parameter space, but also takes the prior information and experimental data into account. The result indicates that certain resonances have larger probabilities of contributing to the process.
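The key quantity in this kind of analysis is the evidence: the likelihood averaged over the whole parameter space under the prior, rather than evaluated at a single best fit. A toy sketch with a hypothetical one-parameter "resonance strength" on invented binned yields (not the kaon photoproduction model itself):

```python
# Toy Bayesian model comparison via the evidence (marginal likelihood).
# Model A: background only. Model B: background plus a resonance of unknown
# strength s with a uniform prior. Counts and rates are hypothetical.
import math

def poisson_logpmf(k, lam):
    return k * math.log(lam) - lam - math.lgamma(k + 1)

counts = [4, 6, 12, 11, 5, 4]       # hypothetical binned yields
b = 5.0                             # assumed background rate per bin
signal_shape = [0, 0, 1, 1, 0, 0]   # resonance populates the middle bins

def log_like(s):
    return sum(poisson_logpmf(k, b + s * w)
               for k, w in zip(counts, signal_shape))

# Evidence for model B: average the likelihood over a uniform prior s ~ U(0, 20).
grid = [20.0 * (i + 0.5) / 400 for i in range(400)]
evidence_B = sum(math.exp(log_like(s)) for s in grid) / len(grid)
evidence_A = math.exp(log_like(0.0))

bayes_factor = evidence_B / evidence_A
print(f"Bayes factor B vs A: {bayes_factor:.1f}")
```

The averaging over the prior builds in an automatic Occam penalty: a resonance model only wins if the improved fit outweighs the extra parameter volume.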
Létourneau, Daniel McNiven, Andrea; Keller, Harald; Wang, An; Amin, Md Nurul; Pearce, Jim; Norrlinger, Bernhard; Jaffray, David A.
2014-12-15
Purpose: High-quality radiation therapy using highly conformal dose distributions and image-guided techniques requires optimum machine delivery performance. In this work, a monitoring system for multileaf collimator (MLC) performance, integrating semiautomated MLC quality control (QC) tests and statistical process control tools, was developed. The MLC performance monitoring system was used for almost a year on two commercially available MLC models. Control charts were used to establish MLC performance and assess test frequency required to achieve a given level of performance. MLC-related interlocks and servicing events were recorded during the monitoring period and were investigated as indicators of MLC performance variations. Methods: The QC test developed as part of the MLC performance monitoring system uses 2D megavoltage images (acquired using an electronic portal imaging device) of 23 fields to determine the location of the leaves with respect to the radiation isocenter. The precision of the MLC performance monitoring QC test and the MLC itself was assessed by detecting the MLC leaf positions on 127 megavoltage images of a static field. After initial calibration, the MLC performance monitoring QC test was performed 3–4 times/week over a period of 10–11 months to monitor positional accuracy of individual leaves for two different MLC models. Analysis of test results was performed using individuals control charts per leaf with control limits computed based on the measurements as well as two sets of specifications of ±0.5 and ±1 mm. Out-of-specification and out-of-control leaves were automatically flagged by the monitoring system and reviewed monthly by physicists. MLC-related interlocks reported by the linear accelerator and servicing events were recorded to help identify potential causes of nonrandom MLC leaf positioning variations. Results: The precision of the MLC performance monitoring QC test and the MLC itself was within ±0.22 mm for most MLC leaves
Bayesian classification theory
NASA Technical Reports Server (NTRS)
Hanson, Robin; Stutz, John; Cheeseman, Peter
1991-01-01
The task of inferring a set of classes and class descriptions most likely to explain a given data set can be placed on a firm theoretical foundation using Bayesian statistics. Within this framework and using various mathematical and algorithmic approximations, the AutoClass system searches for the most probable classifications, automatically choosing the number of classes and complexity of class descriptions. A simpler version of AutoClass has been applied to many large real data sets, has discovered new independently-verified phenomena, and has been released as a robust software package. Recent extensions allow attributes to be selectively correlated within particular classes, and allow classes to inherit or share model parameters through a class hierarchy. We summarize the mathematical foundations of AutoClass.
Antal, Péter; Kiszel, Petra Sz.; Gézsi, András; Hadadi, Éva; Virág, Viktor; Hajós, Gergely; Millinghoffer, András; Nagy, Adrienne; Kiss, András; Semsei, Ágnes F.; Temesi, Gergely; Melegh, Béla; Kisfali, Péter; Széll, Márta; Bikov, András; Gálffy, Gabriella; Tamási, Lilla; Falus, András; Szalai, Csaba
2012-01-01
Genetic studies indicate a high number of potential factors related to asthma. Based on earlier linkage analyses, we selected the 11q13 and 14q22 asthma susceptibility regions, for which we designed a partial genome screening study using 145 SNPs in 1201 individuals (436 asthmatic children and 765 controls). The results were evaluated with traditional frequentist methods and we applied a new statistical method, called Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA). This method uses Bayesian network representation to provide detailed characterization of the relevance of factors, such as joint significance, the type of dependency, and multi-target aspects. We estimated posteriors for these relations within the Bayesian statistical framework in order to assess whether a variable is directly relevant or whether its association is only mediated. With frequentist methods one SNP (rs3751464 in the FRMD6 gene) provided evidence for an association with asthma (OR = 1.43 (1.2–1.8); p = 3×10⁻⁴). The possible role of the FRMD6 gene in asthma was also confirmed in an animal model and human asthmatics. In the BN-BMLA analysis, altogether 5 SNPs in 4 genes were found relevant in connection with the asthma phenotype: PRPF19 on chromosome 11, and FRMD6, PTGER2 and PTGDR on chromosome 14. In a subsequent step a partial dataset containing rhinitis and further clinical parameters was used, which allowed the analysis of relevance of SNPs for asthma and multiple targets. These analyses suggested that SNPs in the AHNAK and MS4A2 genes were indirectly associated with asthma. This paper indicates that BN-BMLA explores the relevant factors more comprehensively than traditional statistical methods and extends the scope of strong relevance based methods to include partial relevance, global characterization of relevance and multi-target relevance. PMID:22432035
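The BN-BMLA method itself is beyond a short example, but the basic Bayesian treatment of a case-control association that underlies it can be sketched with a simple beta-binomial model; all counts below are hypothetical, not the study's genotype data:

```python
# Hedged sketch (not BN-BMLA): posterior probability that a marker's carrier
# frequency is higher in cases than controls, using independent Beta(1, 1)
# priors on each group's frequency. Counts are hypothetical.
import random

random.seed(1)
cases_carrier, cases_total = 160, 436
ctrl_carrier, ctrl_total = 210, 765

def posterior_prob_higher(draws=20000):
    """Monte Carlo estimate of P(freq higher in cases | data)."""
    hits = 0
    for _ in range(draws):
        p_case = random.betavariate(1 + cases_carrier,
                                    1 + cases_total - cases_carrier)
        p_ctrl = random.betavariate(1 + ctrl_carrier,
                                    1 + ctrl_total - ctrl_carrier)
        hits += p_case > p_ctrl
    return hits / draws

p = posterior_prob_higher()
print(f"P(carrier freq higher in cases) = {p:.3f}")
```

A posterior probability near 1 plays the role the p-value plays in the frequentist analysis, but admits direct probability statements about the association.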
Bayesianism Versus Confirmation
NASA Astrophysics Data System (ADS)
Strevens, Michael
2014-03-01
The usual Bayesian approach to understanding the confirmation of scientific theories is inadequate. The problem lies not with Bayesian epistemology, but with a simplistic equation of the subjective, individualistic evidential relevance relation that Bayesianism attempts to capture with the more objective relevance relation of confirmation.
Smith, Rebecca Lee; Gröhn, Yrjö Tapio
2015-01-01
Hansen's disease (leprosy) elimination has proven difficult in several countries, including Brazil, and there is a need for a mathematical model that can predict control program efficacy. This study applied the Approximate Bayesian Computation algorithm to fit 6 different proposed models to each of the 5 regions of Brazil, then fitted hierarchical models based on the best-fit regional models to the entire country. The best model proposed for most regions was a simple model. Posterior checks found that the model results were more similar to the observed incidence after fitting than before, and that parameters varied slightly by region. Current control programs were predicted to require additional measures to eliminate Hansen's Disease as a public health problem in Brazil. PMID:26107951
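The Approximate Bayesian Computation algorithm named above can be sketched in its simplest rejection form: draw a parameter from the prior, simulate data, and keep the draw only if the simulated summary is close to the observed one. The incidence model and numbers below are hypothetical, not the paper's transmission models:

```python
# Minimal ABC rejection sampler on a hypothetical binomial incidence model.
import random, statistics

random.seed(0)
observed_cases, population = 12, 1000   # hypothetical surveillance data

def simulate(rate):
    return sum(random.random() < rate for _ in range(population))

accepted = []
while len(accepted) < 200:
    rate = random.uniform(0.0, 0.05)                  # prior on incidence
    if abs(simulate(rate) - observed_cases) <= 4:     # tolerance
        accepted.append(rate)

post_mean = statistics.mean(accepted)
print(f"posterior mean rate ~ {post_mean:.4f}")
```

Shrinking the tolerance tightens the approximation to the true posterior at the cost of more rejected simulations; hierarchical fits like the paper's stack this idea across regions.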
An Automated Statistical Process Control Study of Inline Mixing Using Spectrophotometric Detection
ERIC Educational Resources Information Center
Dickey, Michael D.; Stewart, Michael D.; Willson, C. Grant
2006-01-01
An experiment is described, which is designed for a junior-level chemical engineering "fundamentals of measurements and data analysis" course, where students are introduced to the concept of statistical process control (SPC) through a simple inline mixing experiment. The students learn how to create and analyze control charts in an effort to…
ERIC Educational Resources Information Center
Hantula, Donald A.
1995-01-01
Clinical applications of statistical process control (SPC) in human service organizations are considered. SPC is seen as providing a standard set of criteria that serves as a common interface for data-based decision making, which may bring decision making under the control of established contingencies rather than the immediate contingencies of…
Pulsipher, B.A.; Kuhn, W.L.
1987-02-01
Current planning for liquid high-level nuclear wastes existing in the US includes processing in a liquid-fed ceramic melter to incorporate them into a high-quality glass, and placement in a deep geologic repository. The nuclear waste vitrification process requires assurance of a quality product with little or no final inspection. Statistical process control (SPC) is a quantitative approach to one quality assurance aspect of vitrified nuclear waste. This method for monitoring and controlling a process in the presence of uncertainties provides a statistical basis for decisions concerning product quality improvement. Statistical process control is shown to be a feasible and beneficial tool to help the waste glass producers demonstrate that the vitrification process can be controlled sufficiently to produce an acceptable product. This quantitative aspect of quality assurance could be an effective means of establishing confidence in the claims to a quality product. 2 refs., 4 figs.
The Bayesian bridge between simple and universal kriging
Omre, H.; Halvorsen, K.B.
1989-10-01
Kriging techniques are suited well for evaluation of continuous, spatial phenomena. Bayesian statistics are characterized by using prior qualified guesses on the model parameters. By merging kriging techniques and Bayesian theory, prior guesses may be used in a spatial setting. Partial knowledge of model parameters defines a continuum of models between what is named simple and universal kriging in geostatistical terminology. The Bayesian approach to kriging is developed and discussed, and a case study concerning depth conversion of seismic reflection times is presented.
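The bridge described here can be illustrated with a one-dimensional Gaussian process whose constant mean has a Gaussian prior: zero prior variance recovers simple kriging (mean known exactly), and a very large prior variance approaches ordinary/universal kriging (flat prior on the mean). This is an illustrative sketch, not the paper's formulation, and all coordinates and values are made up:

```python
# Bayesian kriging sketch: Z(x) = m + residual, with m ~ N(m0, tau2) and an
# exponential covariance on the residual. The prior mean uncertainty tau2
# simply adds to every covariance entry.
import numpy as np

def predict(x, y, x0, m0, tau2, sill=1.0, rng=10.0):
    cov = lambda a, b: sill * np.exp(-np.abs(a[:, None] - b[None, :]) / rng)
    K = cov(x, x) + tau2                       # data covariance + tau2 * J
    k0 = cov(x, np.array([x0]))[:, 0] + tau2   # cross-covariance with Z(x0)
    w = np.linalg.solve(K, k0)
    return m0 + w @ (y - m0)

x = np.array([0.0, 1.0, 3.0, 4.0])
y = np.array([1.2, 1.0, 0.4, 0.6])
sk = predict(x, y, 2.0, m0=0.0, tau2=0.0)    # simple kriging
ok = predict(x, y, 2.0, m0=0.0, tau2=1e6)    # ~ ordinary kriging
print(sk, ok)
```

Intermediate values of the prior variance give the continuum of models the abstract refers to: partial confidence in the prior guess of the mean.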
2011-01-01
Background: This study seeks to increase clinical operational efficiency and accelerator beam consistency by retrospectively investigating the application of statistical process control (SPC) to linear accelerator beam steering parameters to determine the utility of such a methodology in detecting changes prior to equipment failure (interlocks actuated). Methods: Steering coil currents (SCC) for the transverse and radial planes are set such that a reproducibly useful photon or electron beam is available. SCC are sampled and stored in the control console computer each day during the morning warm-up. The transverse and radial positioning and angle SCC for photon beam energies were evaluated using average and range (Xbar-R) process control charts (PCC). The weekly average and range values (subgroup n = 5) for each steering coil were used to develop the PCC. SCC from September 2009 (annual calibration) until two weeks following a beam steering failure in June 2010 were evaluated. PCC limits were calculated using the first twenty subgroups. Appropriate action limits were developed using conventional SPC guidelines. Results: The PCC high-alarm action limit was set at 6 standard deviations from the mean. A value exceeding this limit would require beam scanning and evaluation by the physicist and engineer. Two low alarms were used to indicate negative trends. Alarms received following establishment of limits (week 20) are indicative of a non-random cause for deviation (Xbar chart) and/or an uncontrolled process (R chart). Transverse angle SCC for 6 MV and 15 MV indicated a high-alarm 90 and 108 days prior to equipment failure, respectively. A downward trend in this parameter continued, with high-alarm, until failure. Transverse position and radial angle SCC for 6 and 15 MV indicated low-alarms starting as early as 124 and 116 days prior to failure, respectively. Conclusion: Radiotherapy clinical efficiency and accelerator beam consistency may be improved by instituting SPC.
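The Xbar-R charts used here follow the conventional construction with tabulated constants for subgroups of size n = 5 (A2 = 0.577, D3 = 0, D4 = 2.114). A sketch on hypothetical weekly steering-coil values, not the study's currents:

```python
# Xbar-R (average and range) control limits for subgroups of size 5.
# Constants for n = 5: A2 = 0.577, D3 = 0, D4 = 2.114. Data are hypothetical.
def xbar_r_limits(subgroups):
    xbars = [sum(g) / len(g) for g in subgroups]
    ranges = [max(g) - min(g) for g in subgroups]
    xbarbar = sum(xbars) / len(xbars)
    rbar = sum(ranges) / len(ranges)
    return {
        "Xbar": (xbarbar - 0.577 * rbar, xbarbar + 0.577 * rbar),
        "R": (0.0, 2.114 * rbar),
    }

weeks = [   # one subgroup of 5 daily readings per week (arbitrary units)
    [1.02, 1.01, 1.03, 1.00, 1.02],
    [1.01, 1.02, 1.00, 1.03, 1.01],
    [1.03, 1.02, 1.01, 1.02, 1.00],
    [1.00, 1.01, 1.02, 1.01, 1.03],
]
print(xbar_r_limits(weeks))
```

A weekly average drifting past the Xbar limits signals a non-random shift; a range exceeding the R limit signals loss of within-week consistency.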
Advanced statistical process control: controlling sub-0.18-μm lithography and other processes
NASA Astrophysics Data System (ADS)
Zeidler, Amit; Veenstra, Klaas-Jelle; Zavecz, Terrence E.
2001-08-01
access of the analysis to include the external variables involved in CMP, deposition, etc. We then applied yield analysis methods to identify the significant lithography-external process variables from the history of lots, subsequently adding the identified process variables to the signatures database and to the PPC calculations. With these improvements, the authors anticipate a 50% improvement of the process window. This improvement results in a significant reduction of rework and improved yield, depending on process demands and equipment configuration. A statistical theory that explains the PPC is then presented. This theory can be used to simulate a general PPC application. In conclusion, the PPC concept is not limited to lithography or semiconductors; it is applicable to any production process that is signature biased (chemical industry, car industry, etc.). Requirements for the PPC are large-scale data collection, a controllable process that is not too expensive to tune for every lot, and the ability to employ feedback calculations. PPC is a major change in the process management approach and will therefore first be employed where the need is high and the return on investment is fast. The best industry to start with is semiconductors, and the most likely process area to start with is lithography.
Research on statistical process control for solvent residual quantity of packaging materials
NASA Astrophysics Data System (ADS)
Xiao, Yingzhe; Huang, Yanan
2013-03-01
Statistical Process Control (SPC) and its basic control tool, the control chart, are discussed in this paper against the background of the development of quality management, the current state of quality management in Chinese packaging enterprises, and the necessity of applying SPC. On this basis, an X-R control chart is used to analyze and control solvent residue in the compounding process. This work may allow field personnel to identify shortcomings in quality control in time by noticing the corresponding fluctuations and slow variations in the process. In addition, SPC also provides an objective basis for quality management personnel to assess the quality of semi-finished products or final products.
NASA Astrophysics Data System (ADS)
Granderson, Jessica Ann
2007-12-01
The need for sustainable, efficient energy systems is the motivation that drove this research, which targeted the design of an intelligent commercial lighting system. Lighting in commercial buildings consumes approximately 13% of all the electricity generated in the US. Advanced lighting controls intended for use in commercial office spaces have proven to save up to 45% in electricity consumption. However, they currently comprise only a fraction of the market share, resulting in a missed opportunity to conserve energy. The research goals driving this dissertation relate directly to barriers hindering widespread adoption---increase user satisfaction, and provide increased energy savings through more sophisticated control. To satisfy these goals, an influence diagram was developed to perform daylighting actuation. This algorithm was designed to balance the potentially conflicting lighting preferences of building occupants, with the efficiency desires of building facilities management. A supervisory control policy was designed to implement load shedding under a demand response tariff. Such tariffs offer incentives for customers to reduce their consumption during periods of peak demand through price reductions. In developing the value function, occupant user testing was conducted to determine that computer and paper tasks require different illuminance levels, and that user preferences are sufficiently consistent to attain statistical significance. Approximately ten facilities managers were also interviewed and surveyed to isolate their lighting preferences with respect to measures of lighting quality and energy savings. Results from both simulation and physical implementation and user testing indicate that the intelligent controller can increase occupant satisfaction, efficiency, cost savings, and management satisfaction, with respect to existing commercial daylighting systems. Several important contributions were realized by satisfying the research goals. A general
NASA Technical Reports Server (NTRS)
Shewhart, Mark
1991-01-01
Statistical Process Control (SPC) charts are one of several tools used in quality control. Other tools include flow charts, histograms, cause and effect diagrams, check sheets, Pareto diagrams, graphs, and scatter diagrams. A control chart is simply a graph which indicates process variation over time. The purpose of drawing a control chart is to detect any changes in the process signalled by abnormal points or patterns on the graph. The Artificial Intelligence Support Center (AISC) of the Acquisition Logistics Division has developed a hybrid machine learning expert system prototype which automates the process of constructing and interpreting control charts.
Using Alien Coins to Test Whether Simple Inference Is Bayesian
ERIC Educational Resources Information Center
Cassey, Peter; Hawkins, Guy E.; Donkin, Chris; Brown, Scott D.
2016-01-01
Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we…
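The "statistically optimal" benchmark in studies like this is usually conjugate Beta-Binomial updating. A minimal sketch with a uniform prior; the flip counts are illustrative:

```python
# Beta-Binomial updating of belief about a coin's bias, the usual normative
# standard against which human inference is compared. Uniform Beta(1, 1) prior.
from math import comb

def posterior_mean_bias(heads, tails):
    """Posterior mean of the bias under Beta(1 + heads, 1 + tails)."""
    return (1 + heads) / (2 + heads + tails)

def prob_biased_to_heads(heads, tails):
    """P(bias > 0.5) under the posterior, via the binomial identity for
    incomplete beta functions with integer parameters."""
    a, b = 1 + heads, 1 + tails
    n = a + b - 1
    # P(Beta(a, b) <= 0.5) = P(Binomial(n, 0.5) >= a)
    return 1 - sum(comb(n, k) for k in range(a, n + 1)) / 2 ** n

print(posterior_mean_bias(8, 2))                 # -> 0.75
print(round(prob_biased_to_heads(8, 2), 3))
```

Individual participants' judgments can then be compared against these posterior quantities trial by trial.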
Bayesian Inference on Proportional Elections
Brunello, Gabriel Hideki Vatanabe; Nakano, Eduardo Yoshio
2015-01-01
Polls for majoritarian voting systems usually show estimates of the percentage of votes for each candidate. However, proportional vote systems do not necessarily guarantee the candidate with the most percentage of votes will be elected. Thus, traditional methods used in majoritarian elections cannot be applied on proportional elections. In this context, the purpose of this paper was to perform a Bayesian inference on proportional elections considering the Brazilian system of seats distribution. More specifically, a methodology to answer the probability that a given party will have representation on the chamber of deputies was developed. Inferences were made on a Bayesian scenario using the Monte Carlo simulation technique, and the developed methodology was applied on data from the Brazilian elections for Members of the Legislative Assembly and Federal Chamber of Deputies in 2010. A performance rate was also presented to evaluate the efficiency of the methodology. Calculations and simulations were carried out using the free R statistical software. PMID:25786259
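A hedged sketch of the general approach the abstract describes: sample vote shares from a Dirichlet posterior over multinomial poll counts, allocate seats by the D'Hondt proportional method, and count how often each party is represented. The poll counts and seat total below are invented, and this is not the authors' R code:

```python
# Monte Carlo estimate of P(party wins at least one seat) under a
# Dirichlet(1 + counts) posterior and D'Hondt seat allocation.
import random

random.seed(2)
poll = {"A": 420, "B": 310, "C": 180, "D": 90}   # hypothetical poll counts
SEATS = 10

def dirichlet_sample(counts):
    draws = {p: random.gammavariate(1 + c, 1.0) for p, c in counts.items()}
    total = sum(draws.values())
    return {p: v / total for p, v in draws.items()}

def dhondt(shares, seats):
    won = {p: 0 for p in shares}
    for _ in range(seats):   # award each seat to the highest quotient
        best = max(shares, key=lambda p: shares[p] / (won[p] + 1))
        won[best] += 1
    return won

sims = 4000
at_least_one = {p: 0 for p in poll}
for _ in range(sims):
    for p, s in dhondt(dirichlet_sample(poll), SEATS).items():
        at_least_one[p] += s > 0

for p in poll:
    print(p, at_least_one[p] / sims)
```

The smallest party's representation probability is the genuinely uncertain quantity here, which is exactly what raw vote-share estimates cannot convey.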
ERIC Educational Resources Information Center
Karimi, Hamid; O'Brian, Sue; Onslow, Mark; Jones, Mark; Menzies, Ross; Packman, Ann
2013-01-01
Purpose: Stuttering varies between and within speaking situations. In this study, the authors used statistical process control charts with 10 case studies to investigate variability of stuttering frequency. Method: Participants were 10 adults who stutter. The authors counted the percentage of syllables stuttered (%SS) for segments of their speech…
ERIC Educational Resources Information Center
Logue, Alexandra W.; Watanabe-Rose, Mari
2014-01-01
This study used a randomized controlled trial to determine whether students, assessed by their community colleges as needing an elementary algebra (remedial) mathematics course, could instead succeed at least as well in a college-level, credit-bearing introductory statistics course with extra support (a weekly workshop). Researchers randomly…
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills and instructing them in basic calculator…
Analyzing a Mature Software Inspection Process Using Statistical Process Control (SPC)
NASA Technical Reports Server (NTRS)
Barnard, Julie; Carleton, Anita; Stamper, Darrell E. (Technical Monitor)
1999-01-01
This paper presents a cooperative effort where the Software Engineering Institute and the Space Shuttle Onboard Software Project could experiment applying Statistical Process Control (SPC) analysis to inspection activities. The topics include: 1) SPC Collaboration Overview; 2) SPC Collaboration Approach and Results; and 3) Lessons Learned.
ERIC Educational Resources Information Center
Billings, Paul H.
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 6-hour introductory module on statistical process control (SPC), designed to develop competencies in the following skill areas: (1) identification of the three classes of SPC use; (2) understanding a process and how it works; (3)…
Project T.E.A.M. (Technical Education Advancement Modules). Advanced Statistical Process Control.
ERIC Educational Resources Information Center
Dunlap, Dale
This instructional guide, one of a series developed by the Technical Education Advancement Modules (TEAM) project, is a 20-hour advanced statistical process control (SPC) and quality improvement course designed to develop the following competencies: (1) understanding quality systems; (2) knowing the process; (3) solving quality problems; and (4)…
NASA Astrophysics Data System (ADS)
Hobson, Michael P.; Jaffe, Andrew H.; Liddle, Andrew R.; Mukherjee, Pia; Parkinson, David
2014-02-01
Preface; Part I. Methods: 1. Foundations and algorithms John Skilling; 2. Simple applications of Bayesian methods D. S. Sivia and Steve Rawlings; 3. Parameter estimation using Monte Carlo sampling Antony Lewis and Sarah Bridle; 4. Model selection and multi-model inference Andrew R. Liddle, Pia Mukherjee and David Parkinson; 5. Bayesian experimental design and model selection forecasting Roberto Trotta, Martin Kunz, Pia Mukherjee and David Parkinson; 6. Signal separation in cosmology M. P. Hobson, M. A. J. Ashdown and V. Stolyarov; Part II. Applications: 7. Bayesian source extraction M. P. Hobson, Graça Rocha and R. Savage; 8. Flux measurement Daniel Mortlock; 9. Gravitational wave astronomy Neil Cornish; 10. Bayesian analysis of cosmic microwave background data Andrew H. Jaffe; 11. Bayesian multilevel modelling of cosmological populations Thomas J. Loredo and Martin A. Hendry; 12. A Bayesian approach to galaxy evolution studies Stefano Andreon; 13. Photometric redshift estimation: methods and applications Ofer Lahav, Filipe B. Abdalla and Manda Banerji; Index.
NASA Astrophysics Data System (ADS)
Hobson, Michael P.; Jaffe, Andrew H.; Liddle, Andrew R.; Mukherjee, Pia; Parkinson, David
2009-12-01
Preface; Part I. Methods: 1. Foundations and algorithms John Skilling; 2. Simple applications of Bayesian methods D. S. Sivia and Steve Rawlings; 3. Parameter estimation using Monte Carlo sampling Antony Lewis and Sarah Bridle; 4. Model selection and multi-model inference Andrew R. Liddle, Pia Mukherjee and David Parkinson; 5. Bayesian experimental design and model selection forecasting Roberto Trotta, Martin Kunz, Pia Mukherjee and David Parkinson; 6. Signal separation in cosmology M. P. Hobson, M. A. J. Ashdown and V. Stolyarov; Part II. Applications: 7. Bayesian source extraction M. P. Hobson, Graça Rocha and R. Savage; 8. Flux measurement Daniel Mortlock; 9. Gravitational wave astronomy Neil Cornish; 10. Bayesian analysis of cosmic microwave background data Andrew H. Jaffe; 11. Bayesian multilevel modelling of cosmological populations Thomas J. Loredo and Martin A. Hendry; 12. A Bayesian approach to galaxy evolution studies Stefano Andreon; 13. Photometric redshift estimation: methods and applications Ofer Lahav, Filipe B. Abdalla and Manda Banerji; Index.
Implementation of Statistical Process Control for Proteomic Experiments via LC MS/MS
Bereman, Michael S.; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N.; MacCoss, Michael J.
2014-01-01
Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution); and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies. PMID:24496601
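Two of the ideas in the abstract, empirical control limits derived from QC-standard runs and a Pareto-style ranking of metrics by violations, can be sketched independently of Skyline. The metric names and values below are hypothetical, and this is not the SProCoP source:

```python
# Per-metric limits from user-defined QC standard runs (mean +/- 3 SD),
# then a Pareto-style ranking of metrics by out-of-limit counts.
import statistics

qc_runs = {   # baseline values measured on QC standards (hypothetical)
    "retention_time_min": [20.1, 20.0, 20.2, 19.9, 20.1, 20.0],
    "peak_fwhm_s": [8.0, 8.2, 7.9, 8.1, 8.0, 8.3],
}
new_runs = {  # subsequent experimental runs (hypothetical)
    "retention_time_min": [20.1, 20.0, 21.5, 20.2],
    "peak_fwhm_s": [8.1, 8.0, 8.2, 8.1],
}

def limits(values, k=3.0):
    mu, sd = statistics.mean(values), statistics.stdev(values)
    return mu - k * sd, mu + k * sd

violations = {}
for metric, baseline in qc_runs.items():
    lo, hi = limits(baseline)
    violations[metric] = sum(not lo <= v <= hi for v in new_runs[metric])

# Pareto view: metrics sorted worst first, guiding attention to high variance.
for metric, n in sorted(violations.items(), key=lambda kv: -kv[1]):
    print(metric, n)
```

Deriving the limits from QC standards, as the abstract notes, is what separates random noise from systematic error in an experiment-specific way.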
Implementation of Statistical Process Control for Proteomic Experiments Via LC MS/MS
NASA Astrophysics Data System (ADS)
Bereman, Michael S.; Johnson, Richard; Bollinger, James; Boss, Yuval; Shulman, Nick; MacLean, Brendan; Hoofnagle, Andrew N.; MacCoss, Michael J.
2014-04-01
Statistical process control (SPC) is a robust set of tools that aids in the visualization, detection, and identification of assignable causes of variation in any process that creates products, services, or information. A tool has been developed termed Statistical Process Control in Proteomics (SProCoP) which implements aspects of SPC (e.g., control charts and Pareto analysis) into the Skyline proteomics software. It monitors five quality control metrics in a shotgun or targeted proteomic workflow. None of these metrics require peptide identification. The source code, written in the R statistical language, runs directly from the Skyline interface, which supports the use of raw data files from several of the mass spectrometry vendors. It provides real time evaluation of the chromatographic performance (e.g., retention time reproducibility, peak asymmetry, and resolution), and mass spectrometric performance (targeted peptide ion intensity and mass measurement accuracy for high resolving power instruments) via control charts. Thresholds are experiment- and instrument-specific and are determined empirically from user-defined quality control standards that enable the separation of random noise and systematic error. Finally, Pareto analysis provides a summary of performance metrics and guides the user to metrics with high variance. The utility of these charts to evaluate proteomic experiments is illustrated in two case studies.
Tsiamyrtzis, Panagiotis; Sobas, Frédéric; Négrier, Claude
2015-07-01
The present study seeks to demonstrate the feasibility of avoiding the preliminary phase, which is mandatory in all conventional approaches to internal quality control (IQC) management. Apart from savings on the resources consumed by the preliminary phase, the alternative approach described here can detect analytic problems during start-up and provide a foundation for subsequent conventional assessment. A new dynamically updated predictive control chart (PCC) is used. Being Bayesian in concept, it utilizes available prior information: the manufacturer's quality control target value, the manufacturer's maximum acceptable interassay coefficient of variation, and the interassay standard deviation defined during method validation in each laboratory together allow online IQC management. An Excel template, downloadable from the journal website, allows easy implementation of this alternative approach in any laboratory. In the practical case of prothrombin percentage measurement, PCC gave no false alarms with respect to the 1ks rule (with the same 5% false-alarm probability on a single control sample) during an overlap phase between two IQC batches. Moreover, PCCs were as effective as the 1ks rule in detecting increases in both random and systematic error after the minimal preliminary phase required by medical biology guidelines. PCCs can improve efficiency in medical biology laboratories. PMID:25978121
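The PCC idea of replacing a preliminary phase with prior information can be sketched with a conjugate normal model: a prior on the process mean built from the manufacturer's target, updated after each control result, yields a predictive interval for the next result. This is a hypothetical sketch of the general idea, not the paper's exact chart; all numbers are invented.

```python
import math

def pcc_limits(prior_mean, prior_sd, obs, obs_sd, z=1.96):
    """Predictive limits for the next QC result under a conjugate normal
    prior on the process mean, with known measurement SD."""
    # Posterior over the mean after the observations seen so far.
    prec = 1 / prior_sd**2 + len(obs) / obs_sd**2
    post_var = 1 / prec
    post_mean = post_var * (prior_mean / prior_sd**2 + sum(obs) / obs_sd**2)
    # Predictive variance = posterior variance + measurement variance.
    pred_sd = math.sqrt(post_var + obs_sd**2)
    return post_mean - z * pred_sd, post_mean + z * pred_sd

# Manufacturer's target 100, vague prior; two results already observed.
lo, hi = pcc_limits(prior_mean=100.0, prior_sd=5.0, obs=[98.0, 101.0], obs_sd=4.0)
```

A third result falling outside `(lo, hi)` would signal an analytic problem, with no preliminary phase needed to set the limits.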
NASA Astrophysics Data System (ADS)
Belt, John Q.; Rice, Gary K.
2002-02-01
There are four major quality control measures that can apply to geochemical petroleum exploration data: statistical quality control charts, data reproducibility (the Juran approach), the ethane composition index, and hydrocarbon cross plots. Statistical quality control, or SQC, charts reflect the quality performance of the analytical process composed of equipment, instrumentation, and operator technique. An unstable process is detected through assignable causes using SQC charts. Knowing data variability is paramount to tying geochemical data over time for in-fill samples and/or project extensions. The Juran approach is a statistical tool used to help determine the ability of a process to maintain itself within the limits of set specifications for reproducing geochemical data. The ethane composition index, or ECI, is a statistical calculation based on near-surface, light hydrocarbon measurements that helps differentiate thermogenic petroleum sources at depth. The ECI data are integrated with subsurface geological information and/or seismic survey data to determine lower-risk drilling locations. Hydrocarbon cross plots are visual correlation techniques that compare two hydrocarbons within a similar hydrocarbon suite (e.g., ethane versus propane, benzene versus toluene, or 2-ring aromatics versus 3-ring aromatics). Cross plots help determine contamination, multiple petroleum sources, and low-quality versus high-quality data indigenous to different geochemical exploration tools. When integrated with geomorphology, subsurface geology, and seismic survey data, high-quality geochemical data provide beneficial information for developing a petroleum exploration model. High-quality data is the key to the successful application of geochemistry in petroleum exploration modeling. The ability to produce high-quality geochemical data requires the application of quality control measures reflective of a well-managed ISO 9000 quality system. Statistical quality control charts, Juran
Bayesian sperm competition estimates.
Jones, Beatrix; Clark, Andrew G
2003-01-01
We introduce a Bayesian method for estimating parameters for a model of multiple mating and sperm displacement from genotype counts of brood-structured data. The model is initially targeted for Drosophila melanogaster, but is easily adapted to other organisms. The method is appropriate for use with field studies where the number of mates and the genotypes of the mates cannot be controlled, but where unlinked markers have been collected for a set of females and a sample of their offspring. Advantages over previous approaches include full use of multilocus information and the ability to cope appropriately with missing data and ambiguities about which alleles are maternally vs. paternally inherited. The advantages of including X-linked markers are also demonstrated. PMID:12663555
Abdel Wahab, Moataza M; Nofal, Laila M; Guirguis, Wafaa W; Mahdy, Nehad H
2004-01-01
Quality control is the application of statistical techniques to a process in an effort to identify and minimize both random and non-random sources of variation. The present study aimed to apply Statistical Process Control (SPC) to analyze referrals by General Practitioners (GPs) at Health Insurance Organization (HIO) clinics in Alexandria. A retrospective analysis of records and a cross-sectional interview of 180 GPs were done. Using control charts (p charts), the present study confirmed the presence of substantial variation in referral rates from GPs to specialists; more than 60% of the variation was due to special causes, which revealed that the referral process in Alexandria (HIO) was completely out of statistical control. Control charts for referrals by GPs classified by different GP characteristics or organizational factors revealed much variation, which suggested that the variation arose at the level of individual GPs. Furthermore, a p chart for each GP separately yielded fewer points out of control (outliers), with an average of 4 points. For 26 GPs there were no points out of control; those GPs were slightly older than those having points out of control, but otherwise there was no significant difference between them. The revised p chart for those 26 GPs together yielded a centerline of 9.7%, an upper control limit of 12.0%, and a lower control limit of 7.4%. These limits were in good agreement with the limits specified by HIO and can be suggested as the new specification limits after some training programs. PMID:17265609
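The p chart used here has standard three-sigma binomial limits, p̄ ± 3√(p̄(1−p̄)/n). A minimal sketch follows; the subgroup size of 1400 referrals per period is an assumption chosen so that the limits come out near the study's reported 7.4%–12.0%, not a figure taken from the paper.

```python
import math

def p_chart_limits(p_bar, n, k=3.0):
    """k-sigma control limits for a proportion with subgroup size n."""
    half_width = k * math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - half_width), p_bar + half_width

# Centerline 9.7% from the revised chart; n = 1400 is a hypothetical
# subgroup size that roughly reproduces the reported limits.
lcl, ucl = p_chart_limits(0.097, 1400)
```

With these inputs the limits land close to 7.3% and 12.1%, matching the order of the study's revised chart.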
NASA Astrophysics Data System (ADS)
Hu, Hongtao; Jing, Zhongliang; Hu, Shiqiang
2006-12-01
A novel adaptive algorithm for tracking maneuvering targets is proposed. The algorithm is implemented with fuzzy-controlled current statistic model adaptive filtering and the unscented transformation. A fuzzy system allows the filter to tune the magnitude of maximum accelerations to adapt to different target maneuvers, and the unscented transformation can effectively handle nonlinear systems. Simulation results for a bearing-only tracking scenario show that the proposed algorithm is robust over a wide range of maneuvers and overcomes the shortcomings of the traditional current statistic model and adaptive filtering algorithm.
Bayesian second law of thermodynamics.
Bartolotta, Anthony; Carroll, Sean M; Leichenauer, Stefan; Pollack, Jason
2016-08-01
We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_{m},ρ)+〈Q〉_{F|m}≥0, where ΔH(ρ_{m},ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_{m} and 〈Q〉_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples. PMID:27627241
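The cross-entropy quantity ΔH(ρ_m, ρ) entering the bound can be computed for a toy two-state system: a flat original distribution is Bayes-updated by a measurement, and the cross entropy of the updated against the original distribution is evaluated. This illustrates only the quantities in the inequality, not the dynamical derivation.

```python
import math

def cross_entropy(p, q):
    """H(p, q) = -sum_i p_i log q_i, in nats."""
    return -sum(pi * math.log(qi) for pi, qi in zip(p, q) if pi > 0)

def bayes_update(prior, likelihood):
    """Normalized posterior over states given measurement likelihoods."""
    post = [pr * li for pr, li in zip(prior, likelihood)]
    z = sum(post)
    return [x / z for x in post]

rho = [0.5, 0.5]                       # original phase-space distribution
rho_m = bayes_update(rho, [0.9, 0.1])  # measurement-updated distribution
# Cross entropy of the updated against the original distribution:
print(round(cross_entropy(rho_m, rho), 4))  # → 0.6931 (= ln 2)
```

For a uniform ρ the cross entropy H(ρ_m, ρ) equals ln 2 regardless of the update, since every state carries the same log-probability under ρ.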
Bayesian second law of thermodynamics
NASA Astrophysics Data System (ADS)
Bartolotta, Anthony; Carroll, Sean M.; Leichenauer, Stefan; Pollack, Jason
2016-08-01
We derive a generalization of the second law of thermodynamics that uses Bayesian updates to explicitly incorporate the effects of a measurement of a system at some point in its evolution. By allowing an experimenter's knowledge to be updated by the measurement process, this formulation resolves a tension between the fact that the entropy of a statistical system can sometimes fluctuate downward and the information-theoretic idea that knowledge of a stochastically evolving system degrades over time. The Bayesian second law can be written as ΔH(ρ_m, ρ) + ⟨Q⟩_{F|m} ≥ 0, where ΔH(ρ_m, ρ) is the change in the cross entropy between the original phase-space probability distribution ρ and the measurement-updated distribution ρ_m, and ⟨Q⟩_{F|m} is the expectation value of a generalized heat flow out of the system. We also derive refined versions of the second law that bound the entropy increase from below by a non-negative number, as well as Bayesian versions of integral fluctuation theorems. We demonstrate the formalism using simple analytical and numerical examples.
Bayesian Phylogeography Finds Its Roots
Lemey, Philippe; Rambaut, Andrew; Drummond, Alexei J.; Suchard, Marc A.
2009-01-01
As a key factor in endemic and epidemic dynamics, the geographical distribution of viruses has been frequently interpreted in the light of their genetic histories. Unfortunately, inference of historical dispersal or migration patterns of viruses has mainly been restricted to model-free heuristic approaches that provide little insight into the temporal setting of the spatial dynamics. The introduction of probabilistic models of evolution, however, offers unique opportunities to engage in this statistical endeavor. Here we introduce a Bayesian framework for inference, visualization, and hypothesis testing of phylogeographic history. By implementing character mapping in Bayesian software that samples time-scaled phylogenies, we enable the reconstruction of timed viral dispersal patterns while accommodating phylogenetic uncertainty. Standard Markov model inference is extended with a stochastic search variable selection procedure that identifies parsimonious descriptions of the diffusion process. In addition, we propose priors that can incorporate geographical sampling distributions or characterize alternative hypotheses about the spatial dynamics. To visualize the spatial and temporal information, we summarize inferences using virtual globe software. We describe how Bayesian phylogeography compares with previous parsimony analysis in the investigation of the influenza A H5N1 origin and H5N1 epidemiological linkage among sampling localities. Analysis of rabies in West African dog populations reveals how virus diffusion may enable endemic maintenance through continuous epidemic cycles. From these analyses, we conclude that our phylogeographic framework will be an important asset in molecular epidemiology that can easily be generalized to infer biogeography from genetic data for many organisms. PMID:19779555
Theoretical evaluation of the detectability of random lesions in bayesian emission reconstruction
Qi, Jinyi
2003-05-01
Detecting cancerous lesions is an important task in positron emission tomography (PET). Bayesian methods based on the maximum a posteriori principle (also called penalized maximum likelihood methods) have been developed to deal with the low signal-to-noise ratio in the emission data. Similar to the filter cut-off frequency in the filtered backprojection method, the prior parameters in Bayesian reconstruction control the resolution and noise trade-off and hence affect the detectability of lesions in reconstructed images. Bayesian reconstructions are difficult to analyze because the resolution and noise properties are nonlinear and object-dependent. Most research has been based on Monte Carlo simulations, which are very time consuming. Building on the recent progress on the theoretical analysis of image properties of statistical reconstructions and the development of numerical observers, here we develop a theoretical approach for fast computation of lesion detectability in Bayesian reconstruction. The results can be used to choose the optimum hyperparameter for the maximum lesion detectability. New in this work is the use of theoretical expressions that explicitly model the statistical variation of the lesion and background without assuming that the object variation is (locally) stationary. The theoretical results are validated using Monte Carlo simulations. The comparisons show good agreement between the theoretical predictions and the Monte Carlo results.
Performance of cancer cluster Q-statistics for case-control residential histories
Sloan, Chantel D.; Jacquez, Geoffrey M.; Gallagher, Carolyn M.; Ward, Mary H.; Raaschou-Nielsen, Ole; Nordsborg, Rikke Baastrup; Meliker, Jaymie R.
2012-01-01
Few investigations of health event clustering have evaluated residential mobility, though causative exposures for chronic diseases such as cancer often occur long before diagnosis. Recently developed Q-statistics incorporate human mobility into disease cluster investigations by quantifying space- and time-dependent nearest neighbor relationships. Using residential histories from two cancer case-control studies, we created simulated clusters to examine Q-statistic performance. Results suggest the intersection of cases with significant clustering over their life course, Qi, with cases who are constituents of significant local clusters at given times, Qit, yielded the best performance, which improved with increasing cluster size. Upon comparison, a larger proportion of true positives were detected with Kulldorf’s spatial scan method if the time of clustering was provided. We recommend using Q-statistics to identify when and where clustering may have occurred, followed by the scan method to localize the candidate clusters. Future work should investigate the generalizability of these findings. PMID:23149326
Zhu, Shijun; Wang, Fei; Chen, Yahong; Li, Zhenhua; Cai, Yangjian
2014-11-17
Experimental generation of a radially polarized (RP) beam with controllable spatial coherence (i.e., a partially coherent RP beam) was reported recently [Appl. Phys. Lett. 100, 051108 (2012)]. In this paper, we carry out theoretical and experimental studies of the statistical properties in the Young's two-slit interference pattern formed with a partially coherent RP beam. An approximate analytical expression for the cross-spectral density matrix of a partially coherent RP beam in the observation plane is obtained, and it is found that the statistical properties, such as the intensity, the degree of coherence, and the degree of polarization, are strongly affected by the spatial coherence of the incident beam. Our experimental results are consistent with the theoretical predictions, and may be useful in applications where light fields with special statistical properties are required. PMID:25402110
BART: Bayesian Atmospheric Radiative Transfer fitting code
NASA Astrophysics Data System (ADS)
Cubillos, Patricio; Blecic, Jasmina; Harrington, Joseph; Rojo, Patricio; Lust, Nate; Bowman, Oliver; Stemm, Madison; Foster, Andrew; Loredo, Thomas J.; Fortney, Jonathan; Madhusudhan, Nikku
2016-08-01
BART implements a Bayesian, Monte Carlo-driven, radiative-transfer scheme for extracting parameters from spectra of planetary atmospheres. BART combines a thermochemical-equilibrium code, a one-dimensional line-by-line radiative-transfer code, and the Multi-core Markov-chain Monte Carlo statistical module to constrain the atmospheric temperature and chemical-abundance profiles of exoplanets.
A Bayesian Approach to Interactive Retrieval
ERIC Educational Resources Information Center
Tague, Jean M.
1973-01-01
A probabilistic model for interactive retrieval is presented. Bayesian statistical decision theory principles are applied: use of prior and sample information about the relationship of document descriptions to query relevance; maximization of expected value of a utility function, to the problem of optimally restructuring search strategies in an…
Shanahan, K.L.
1990-09-01
A suite of RS/1 procedures for Shewhart control charting in chemical laboratories is described. The suite uses the RS series product QCA (Quality Control Analysis) for chart construction and analysis. The suite prompts users for data in a user-friendly fashion and adds the data to, or creates, the control charts. All activities are time-stamped. Facilities for generating monthly or contiguous time-segment summary charts are included. The suite is currently in use at Westinghouse Savannah River Company.
Software For Multivariate Bayesian Classification
NASA Technical Reports Server (NTRS)
Saul, Ronald; Laird, Philip; Shelton, Robert
1996-01-01
PHD is a general-purpose classifier computer program. It uses Bayesian methods to classify vectors of real numbers, based on a combination of statistical techniques that include multivariate density estimation, Parzen density kernels, and the EM (Expectation Maximization) algorithm. By means of a simple graphical interface, the user trains the classifier to recognize two or more classes of data and then uses it to identify new data. Written in ANSI C for Unix systems and optimized for online classification applications. It can be embedded in another program or run by itself using the simple graphical user interface. Online help files make the program easy to use.
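The combination of Parzen density estimation with Bayes' rule can be sketched compactly: estimate each class-conditional density with Gaussian kernels, then pick the class maximizing prior times likelihood. This is an illustrative one-dimensional sketch of the technique, not the PHD program itself; all names and data are invented.

```python
import math

def parzen_density(x, samples, h=1.0):
    """Gaussian-kernel (Parzen) density estimate at x with bandwidth h."""
    norm = 1 / (len(samples) * h * math.sqrt(2 * math.pi))
    return norm * sum(math.exp(-0.5 * ((x - s) / h) ** 2) for s in samples)

def classify(x, classes, priors, h=1.0):
    """Bayes rule: choose the class maximizing prior * estimated likelihood."""
    scores = {c: priors[c] * parzen_density(x, samples, h)
              for c, samples in classes.items()}
    return max(scores, key=scores.get)

training = {"A": [0.0, 0.5, 1.0], "B": [4.0, 4.5, 5.0]}
priors = {"A": 0.5, "B": 0.5}
print(classify(0.7, training, priors))  # → A
```

The EM step in a fuller implementation would fit mixture components per class; here the raw training samples serve directly as kernel centers.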
NASA Technical Reports Server (NTRS)
Manning, Robert M.
1990-01-01
A static and dynamic rain-attenuation model is presented which describes the statistics of attenuation on an arbitrarily specified satellite link for any location for which there are long-term rainfall statistics. The model may be used in the design of optimal stochastic control algorithms to mitigate the effects of attenuation and maintain link reliability. A rain-statistics database is compiled, which makes it possible to apply the model to any location in the continental U.S. with a resolution of 0.5 degrees in latitude and longitude. The model predictions are compared with experimental observations, showing good agreement.
ERIC Educational Resources Information Center
Yuan, Ying; MacKinnon, David P.
2009-01-01
In this article, we propose Bayesian analysis of mediation effects. Compared with conventional frequentist mediation analysis, the Bayesian approach has several advantages. First, it allows researchers to incorporate prior information into the mediation analysis, thus potentially improving the efficiency of estimates. Second, under the Bayesian…
Mallick, Himel; Yi, Nengjun
2016-01-01
Park and Casella (2008) provided the Bayesian lasso for linear models by assigning scale mixture of normal (SMN) priors on the parameters and independent exponential priors on their variances. In this paper, we propose an alternative Bayesian analysis of the lasso problem. A different hierarchical formulation of Bayesian lasso is introduced by utilizing the scale mixture of uniform (SMU) representation of the Laplace density. We consider a fully Bayesian treatment that leads to a new Gibbs sampler with tractable full conditional posterior distributions. Empirical results and real data analyses show that the new algorithm has good mixing property and performs comparably to the existing Bayesian method in terms of both prediction accuracy and variable selection. An ECM algorithm is provided to compute the MAP estimates of the parameters. Easy extension to general models is also briefly discussed.
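The scale-mixture-of-uniform representation at the heart of this hierarchy can be checked numerically: drawing u ~ Gamma(2, 1) and then β | u ~ Uniform(−u/λ, u/λ) yields Laplace(rate λ) variates. The Monte Carlo check below is an illustrative verification of that identity, not the paper's Gibbs sampler.

```python
import random

def laplace_via_smu(lam, n=200_000, seed=1):
    """Draw Laplace(rate=lam) variates via the SMU representation:
    u ~ Gamma(shape=2, scale=1), then beta | u ~ Uniform(-u/lam, u/lam)."""
    rng = random.Random(seed)
    draws = []
    for _ in range(n):
        u = rng.gammavariate(2.0, 1.0)
        draws.append(rng.uniform(-u / lam, u / lam))
    return draws

betas = laplace_via_smu(1.0)
var = sum(b * b for b in betas) / len(betas)  # Laplace(rate=1) variance is 2
```

Integrating the uniform density against the Gamma(2, 1) mixing density recovers (λ/2)·exp(−λ|β|) exactly, which is what makes the tractable Gibbs conditionals possible.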
Chen, Yu-Pei; Guo, Rui; Liu, Na; Liu, Xu; Mao, Yan-Ping; Tang, Ling-Long; Zhou, Guan-Qun; Lin, Ai-Hua; Sun, Ying; Ma, Jun
2015-01-01
Background: Due to the lack of studies, it remains unclear whether adding neoadjuvant chemotherapy (NACT) to concurrent chemoradiotherapy (CCRT) is superior to CCRT alone for locoregionally advanced nasopharyngeal carcinoma (NPC). The main objective of this Bayesian network meta-analysis was to determine the efficacy of NACT+CCRT as compared with CCRT alone. Methods: We comprehensively searched databases and extracted data from randomized controlled trials involving NPC patients who received NACT+CCRT, CCRT, NACT+radiotherapy (RT), or RT. The outcomes of interest were overall survival (OS) with hazard ratio (HR), and locoregional recurrence rate (LRR) and distant metastasis rate (DMR) with relative risks (RRs). Results: Nine trials involving 1988 patients were analyzed. In the network meta-analysis, there was a significant benefit of NACT+CCRT over CCRT for DMR (RR=0.54, 95% credible interval [CrI]=0.27-0.94). However, NACT+CCRT showed a trend toward worse locoregional control compared with CCRT (RR=1.71, 95%CrI=0.94-2.84), and no significant improvement in OS was found (HR=0.73, 95%CrI=0.40-1.23). Conclusions: NACT+CCRT is associated with reduced distant failure as compared with CCRT alone, and whether the additional NACT can improve survival for locoregionally advanced NPC should be further explored. Optimizing regimens and identifying patients at high risk of metastasis may enhance the efficacy of NACT+CCRT. PMID:26284140
A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC)
Gerard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, Francois; Aletti, Pierre
2009-04-15
The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (±4% of deviation between the calculated and measured doses) by calculating a control process capability (C_pc) index. The C_pc index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should
A comprehensive analysis of the IMRT dose delivery process using statistical process control (SPC).
Gérard, Karine; Grandhaye, Jean-Pierre; Marchesi, Vincent; Kafrouni, Hanna; Husson, François; Aletti, Pierre
2009-04-01
The aim of this study is to introduce tools to improve the security of each IMRT patient treatment by determining action levels for the dose delivery process. To achieve this, the patient-specific quality control results performed with an ionization chamber--and which characterize the dose delivery process--have been retrospectively analyzed using a method borrowed from industry: Statistical process control (SPC). The latter consisted in fulfilling four principal well-structured steps. The authors first quantified the short-term variability of ionization chamber measurements regarding the clinical tolerances used in the cancer center (+/- 4% of deviation between the calculated and measured doses) by calculating a control process capability (C(pc)) index. The C(pc) index was found superior to 4, which implies that the observed variability of the dose delivery process is not biased by the short-term variability of the measurement. Then, the authors demonstrated using a normality test that the quality control results could be approximated by a normal distribution with two parameters (mean and standard deviation). Finally, the authors used two complementary tools--control charts and performance indices--to thoroughly analyze the IMRT dose delivery process. Control charts aim at monitoring the process over time using statistical control limits to distinguish random (natural) variations from significant changes in the process, whereas performance indices aim at quantifying the ability of the process to produce data that are within the clinical tolerances, at a precise moment. The authors retrospectively showed that the analysis of three selected control charts (individual value, moving-range, and EWMA control charts) allowed efficient drift detection of the dose delivery process for prostate and head-and-neck treatments before the quality controls were outside the clinical tolerances. Therefore, when analyzed in real time, during quality controls, they should improve the
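Of the three selected charts, the EWMA chart is the most sensitive to slow drifts: it smooths each new deviation into a running statistic and compares it against time-varying limits. The sketch below is a generic textbook EWMA, not the authors' clinical implementation; the dose-deviation values and the choices λ=0.2, L=3 are illustrative.

```python
import math

def ewma_chart(data, mu0, sigma, lam=0.2, L=3.0):
    """EWMA statistic with time-varying control limits; returns the
    indices of points signalling a drift away from the target mu0."""
    z, signals = mu0, []
    for i, x in enumerate(data, start=1):
        z = lam * x + (1 - lam) * z
        width = L * sigma * math.sqrt(lam / (2 - lam) * (1 - (1 - lam) ** (2 * i)))
        if abs(z - mu0) > width:
            signals.append(i - 1)
    return signals

# Calculated-vs-measured dose deviations (%): a slow upward drift.
devs = [0.2, -0.1, 0.3, 0.8, 1.2, 1.5, 1.9, 2.3, 2.6, 3.0]
print(ewma_chart(devs, mu0=0.0, sigma=1.0))  # → [7, 8, 9]
```

The chart signals while every individual deviation is still inside a ±4% clinical tolerance, which is exactly the drift-before-tolerance behavior the study reports.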
Bayesian parameter estimation for effective field theories
NASA Astrophysics Data System (ADS)
Wesolowski, Sarah; Klco, Natalie; Furnstahl, Richard; Phillips, Daniel; Thapilaya, Arbin
2015-10-01
We present a procedure based on Bayesian statistics for effective field theory (EFT) parameter estimation from experimental or lattice data. The extraction of low-energy constants (LECs) is guided by physical principles such as naturalness in a quantifiable way and various sources of uncertainty are included by the specification of Bayesian priors. Special issues for EFT parameter estimation are demonstrated using representative model problems, and a set of diagnostics is developed to isolate and resolve these issues. We apply the framework to the extraction of the LECs of the nucleon mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
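A naturalness prior made quantifiable can be illustrated with a conjugate linear-model sketch: each LEC gets an independent Gaussian prior of width ā ("natural size"), and the posterior mean follows from the regularized normal equations. The model order, data, and widths below are invented for illustration and are not the paper's EFT or lattice data.

```python
def lec_posterior(xs, ys, sigma, abar):
    """Posterior mean of (a0, a1) in y = a0 + a1*x + N(0, sigma^2),
    under independent naturalness priors a_i ~ N(0, abar^2)."""
    # Posterior precision matrix A = X^T X / sigma^2 + I / abar^2 (2x2).
    s2, p2 = sigma ** 2, abar ** 2
    A00 = len(xs) / s2 + 1 / p2
    A01 = sum(xs) / s2
    A11 = sum(x * x for x in xs) / s2 + 1 / p2
    b0 = sum(ys) / s2
    b1 = sum(x * y for x, y in zip(xs, ys)) / s2
    det = A00 * A11 - A01 * A01
    # Solve A @ a = b for the 2-vector of posterior means.
    return ((A11 * b0 - A01 * b1) / det, (A00 * b1 - A01 * b0) / det)

xs = [0.1, 0.2, 0.3, 0.4]
ys = [1.05, 1.11, 1.14, 1.22]  # synthetic data near y = 1 + 0.5 x
a0, a1 = lec_posterior(xs, ys, sigma=0.05, abar=5.0)
```

Shrinking `abar` pulls the coefficients toward natural size; widening it recovers ordinary least squares, which is how the prior makes naturalness a tunable, quantifiable assumption.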
NASA Astrophysics Data System (ADS)
Wang, Ping; Dai, Xin-Gang
2016-09-01
The term "APEC Blue" was coined to describe the clear-sky days during the Asia-Pacific Economic Cooperation (APEC) summit held in Beijing during November 5-11, 2014. The duration of the APEC Blue is detected as November 1 to November 14 (hereafter the Blue Window) by a statistical moving t test. Observations show that APEC Blue corresponds to low air pollution with respect to PM2.5, PM10, SO2, and NO2 under the strict emission-control measures (ECMs) implemented in Beijing and surrounding areas. Quantitative assessment shows that the ECMs were more effective at reducing aerosols than the chemical constituents. Statistical investigation reveals that the window also resulted from intensified wind variability as well as weakened static stability of the atmosphere (SSA). Wind and the ECMs played key roles in reducing air pollution during November 1-7 and 11-13, while strict ECMs and weak SSA became dominant during November 7-10 under a weak-wind environment. Moving correlations show that the emission reduction for aerosols can amplify the apparent wind cleanup effect, leading to significant negative correlations between the two, and that period-wise changes in emission rate can be identified by multi-scale correlations based on wavelet decomposition. In short, this case study shows statistically how human intervention modified air quality in the megacity by controlling local and surrounding emissions in combination with meteorological conditions.
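The moving t test used to detect the Blue Window slides two adjacent windows along the series and computes a two-sample t statistic at each split; a large |t| marks an abrupt change. A minimal sketch with invented PM2.5-like values follows; the window length and data are illustrative, not the study's observations.

```python
import statistics

def moving_t(series, window):
    """Two-sample t statistic between adjacent windows at each split point."""
    ts = []
    for i in range(window, len(series) - window + 1):
        a, b = series[i - window:i], series[i:i + window]
        sp2 = (statistics.variance(a) + statistics.variance(b)) / 2  # pooled
        ts.append((statistics.fmean(b) - statistics.fmean(a)) /
                  (2 * sp2 / window) ** 0.5)
    return ts

# Daily PM2.5-like values with an abrupt drop starting at index 5.
pm = [80, 85, 78, 90, 82, 40, 35, 42, 38, 36]
ts = moving_t(pm, 3)  # the largest |t| marks the split at the break
```

The split with the largest |t| falls exactly at the onset of the drop, which is how the Blue Window boundaries are located in time.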
NASA Technical Reports Server (NTRS)
Navard, Sharon E.
1989-01-01
In recent years there has been a push within NASA to use statistical techniques to improve the quality of production. Two areas where statistics are used are in establishing product and process quality control of flight hardware and in evaluating the uncertainty of calibration of instruments. The Flight Systems Quality Engineering branch is responsible for developing and assuring the quality of all flight hardware; the statistical process control methods employed are reviewed and evaluated. The Measurement Standards and Calibration Laboratory performs the calibration of all instruments used on-site at JSC as well as those used by all off-site contractors. These calibrations must be performed in such a way as to be traceable to national standards maintained by the National Institute of Standards and Technology, and they must meet a four-to-one ratio of the instrument specifications to calibrating standard uncertainty. In some instances this ratio is not met, and in these cases it is desirable to compute the exact uncertainty of the calibration and determine ways of reducing it. A particular example where this problem is encountered is with a machine which does automatic calibrations of force. The process of force calibration using the United Force Machine is described in detail. The sources of error are identified and quantified when possible. Suggestions for improvement are made.
The DWPF product composition control system at Savannah River: Statistical process control algorithm
Postles, R.L.; Brown, K.G.
1991-01-01
The DWPF Process batch-blends aqueous radwaste (PHA) with solid radwaste (Sludge) in a waste receipt vessel (the SRAT). The resulting SRAT-Batch is transferred to the next process vessel (the SME) and there blended with ground glass (Frit) to produce a batch of feed slurry. The SME-Batch is passed to a subsequent hold tank (the MFT) which feeds a Melter continuously. The Melter produces a molten glass wasteform which is poured into stainless steel canisters for cooling and, ultimately, shipment to and storage in a geologic Repository. The Repository will require that the glass wasteform be resistant to leaching by any underground water that might contact it. In addition, there are processing constraints on Viscosity and Liquidus Temperature of the melt. The Product Composition Control System (PCCS) is the system intended to ensure that the melt will be Processible and that the glass wasteform will be Acceptable. Within the PCCS, the SPC Algorithm is the device which guides control of the DWPF process. The SPC Algorithm is needed to control the multivariate DWPF process in the face of uncertainties (variances and covariances) which arise from this process and its supply, sampling, modeling, and measurement systems.
Crowder, Stephen V.
1999-09-01
In many manufacturing environments such as the nuclear weapons complex, emphasis has shifted from the regular production and delivery of large orders to infrequent small orders. However, the challenge to maintain the same high quality and reliability standards while building much smaller lot sizes remains. To meet this challenge, specific areas need more attention, including fast and on-target process start-up, low volume statistical process control, process characterization with small experiments, and estimating reliability given few actual performance tests of the product. In this paper we address the issue of low volume statistical process control. We investigate an adaptive filtering approach to process monitoring with a relatively short time series of autocorrelated data. The emphasis is on estimation and minimization of mean squared error rather than the traditional hypothesis testing and run length analyses associated with process control charting. We develop an adaptive filtering technique that assumes initial process parameters are unknown, and updates the parameters as more data become available. Using simulation techniques, we study the data requirements (the length of a time series of autocorrelated data) necessary to adequately estimate process parameters. We show that far fewer data values are needed than is typically recommended for process control applications. We also demonstrate the techniques with a case study from the nuclear weapons manufacturing complex.
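The adaptive-filtering idea described above can be sketched in a few lines: estimate the process level and spread recursively, one observation at a time, from a short autocorrelated series. This is a minimal illustration using a simulated AR(1) process and a generic exponentially weighted update, not the authors' exact algorithm; all parameter values are hypothetical.

```python
import random

def simulate_ar1(n, phi=0.5, mu=10.0, sigma=1.0, seed=42):
    """Simulate a short AR(1) series: x_t = mu + phi*(x_{t-1} - mu) + e_t."""
    rng = random.Random(seed)
    x, series = mu, []
    for _ in range(n):
        x = mu + phi * (x - mu) + rng.gauss(0.0, sigma)
        series.append(x)
    return series

def adaptive_estimates(series, lam=0.2):
    """Exponentially weighted recursive estimates of level and spread,
    updated as each observation arrives (a generic adaptive-filter-style
    update, not the authors' exact method)."""
    mean, var = series[0], 0.0
    for x in series[1:]:
        err = x - mean                            # one-step prediction error
        mean += lam * err                         # adapt the level estimate
        var = (1 - lam) * var + lam * err * err   # adapt the spread estimate
    return mean, var

# only 30 observations -- far fewer than classical charting guidelines assume
level, spread = adaptive_estimates(simulate_ar1(30))
```

Because the estimates are updated recursively, monitoring can begin immediately with unknown initial parameters, which is the point of the low-volume setting.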
Crowder, S.V.; Eshleman, L.
1998-08-01
In many manufacturing environments such as the nuclear weapons complex, emphasis has shifted from the regular production and delivery of large orders to infrequent small orders. However, the challenge to maintain the same high quality and reliability standards while building much smaller lot sizes remains. To meet this challenge, specific areas need more attention, including fast and on-target process start-up, low volume statistical process control, process characterization with small experiments, and estimating reliability given few actual performance tests of the product. In this paper the authors address the issue of low volume statistical process control. They investigate an adaptive filtering approach to process monitoring with a relatively short time series of autocorrelated data. The emphasis is on estimation and minimization of mean squared error rather than the traditional hypothesis testing and run length analyses associated with process control charting. The authors develop an adaptive filtering technique that assumes initial process parameters are unknown, and updates the parameters as more data become available. Using simulation techniques, they study the data requirements (the length of a time series of autocorrelated data) necessary to adequately estimate process parameters. They show that far fewer data values are needed than is typically recommended for process control applications. Finally, they demonstrate the techniques with a case study from the nuclear weapons manufacturing complex.
Cheung, Yvonne Y; Jung, Boyoun; Sohn, Jae Ho; Ogrinc, Greg
2012-01-01
Quality improvement (QI) projects are an integral part of today's radiology practice, helping identify opportunities for improving outcomes by refining work processes. QI projects are typically driven by outcome measures, but the data can be difficult to interpret: The numbers tend to fluctuate even before a process is altered, and after a QI intervention takes place, it may be even more difficult to determine the cause of such vacillations. Control chart analysis helps the QI project team identify variations that should be targeted for intervention and avoid tampering with processes in which variation is random or harmless. Statistical control charts make it possible to distinguish among random variation or noise in the data, outlying tendencies that should be targeted for future intervention, and changes that signify the success of previous intervention. The data on control charts are plotted over time and integrated with various graphic devices that represent statistical reasoning (eg, control limits) to allow visualization of the intensity and overall effect (negative or positive) of variability. Even when variability has no substantial negative effect, appropriate intervention based on the results of control chart analysis can help increase the efficiency of a process by optimizing the central tendency of the outcome measure. Different types of control charts may be used to analyze the same outcome dataset: For example, paired charts of individual values (x) and the moving range (mR) allow robust and reliable analyses of most types of data from radiology QI projects. Many spreadsheet programs and templates are available for use in creating x-mR charts and other types of control charts. PMID:23150861
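The paired x-mR chart mentioned above has standard formulas: the individuals chart uses limits at the mean plus or minus 2.66 times the average moving range (2.66 = 3/d2 with d2 = 1.128 for ranges of two), and the mR chart's upper limit is 3.267 times the average moving range. A minimal sketch with hypothetical report-turnaround times (not data from the article):

```python
def xmr_limits(values):
    """Control limits for an individuals (x) and moving-range (mR) chart,
    using the standard n=2 constants 2.66 (= 3/d2) and 3.267 (= D4)."""
    mr = [abs(b - a) for a, b in zip(values, values[1:])]
    xbar = sum(values) / len(values)
    mrbar = sum(mr) / len(mr)
    x_chart = (xbar - 2.66 * mrbar, xbar, xbar + 2.66 * mrbar)   # (LCL, CL, UCL)
    mr_chart = (0.0, mrbar, 3.267 * mrbar)
    return x_chart, mr_chart

# hypothetical monthly turnaround times (hours) from an imagined QI project
times = [5.1, 4.8, 5.6, 5.0, 4.7, 5.3, 4.9, 5.2, 5.0, 4.6]
(x_lcl, x_cl, x_ucl), (mr_lcl, mr_cl, mr_ucl) = xmr_limits(times)
```

Points outside these limits signal special-cause variation worth investigating; points inside them are the random variation the article warns against tampering with.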
Kottner, Jan; Halfens, Ruud
2010-05-01
Institutionally acquired pressure ulcers are used as outcome indicators to assess the quality of pressure ulcer prevention programs. Determining whether quality improvement projects that aim to decrease the proportions of institutionally acquired pressure ulcers lead to real changes in clinical practice depends on the measurement method and statistical analysis used. To examine whether nosocomial pressure ulcer prevalence rates in hospitals in the Netherlands changed, a secondary data analysis using different statistical approaches was conducted of annual (1998-2008) nationwide nursing-sensitive health problem prevalence studies in the Netherlands. Institutions that participated regularly in all survey years were identified. Risk-adjusted nosocomial pressure ulcer prevalence rates, grades 2 to 4 (European Pressure Ulcer Advisory Panel system), were calculated per year and hospital. Descriptive statistics, chi-square trend tests, and P charts based on statistical process control (SPC) were applied and compared. Six of the 905 healthcare institutions participated in every survey year, and 11,444 patients in these six hospitals were identified as being at risk for pressure ulcers. Prevalence rates per year ranged from 0.05 to 0.22. Chi-square trend tests revealed statistically significant downward trends in four hospitals, but based on SPC methods, the prevalence rates of five hospitals varied by chance only. Results of chi-square trend tests and SPC methods were not comparable, making it impossible to decide which approach is more appropriate. P charts provide more valuable information than single P values and are more helpful for monitoring institutional performance. Empirical evidence about the decrease of nosocomial pressure ulcer prevalence rates in the Netherlands is contradictory and limited. PMID:20511685
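A P chart of the kind applied above places 3-sigma binomial limits around the pooled proportion, with per-survey limits that depend on each survey's denominator. A sketch with hypothetical yearly counts (not the study's data):

```python
import math

def p_chart_limits(cases, at_risk):
    """Per-survey P-chart limits for a proportion: pbar +/- 3*sqrt(pbar*(1-pbar)/n),
    clipped to [0, 1]. cases[i] events out of at_risk[i] patients in survey i."""
    pbar = sum(cases) / sum(at_risk)
    limits = []
    for n in at_risk:
        se = math.sqrt(pbar * (1 - pbar) / n)
        limits.append((max(0.0, pbar - 3 * se), pbar, min(1.0, pbar + 3 * se)))
    return limits

# hypothetical pressure ulcer counts and patients at risk per survey year
cases = [40, 35, 28, 22, 18]
at_risk = [300, 310, 295, 305, 290]
limits = p_chart_limits(cases, at_risk)
```

A yearly proportion outside its limits signals a real change; proportions inside them vary by chance only, which is exactly the distinction that separated the SPC reading of the data from the chi-square trend tests above.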
Computationally efficient Bayesian inference for inverse problems.
Marzouk, Youssef M.; Najm, Habib N.; Rahn, Larry A.
2007-10-01
Bayesian statistics provides a foundation for inference from noisy and incomplete data, a natural mechanism for regularization in the form of prior information, and a quantitative assessment of uncertainty in the inferred results. Inverse problems - representing indirect estimation of model parameters, inputs, or structural components - can be fruitfully cast in this framework. Complex and computationally intensive forward models arising in physical applications, however, can render a Bayesian approach prohibitive. This difficulty is compounded by high-dimensional model spaces, as when the unknown is a spatiotemporal field. We present new algorithmic developments for Bayesian inference in this context, showing strong connections with the forward propagation of uncertainty. In particular, we introduce a stochastic spectral formulation that dramatically accelerates the Bayesian solution of inverse problems via rapid evaluation of a surrogate posterior. We also explore dimensionality reduction for the inference of spatiotemporal fields, using truncated spectral representations of Gaussian process priors. These new approaches are demonstrated on scalar transport problems arising in contaminant source inversion and in the inference of inhomogeneous material or transport properties. We also present a Bayesian framework for parameter estimation in stochastic models, where intrinsic stochasticity may be intermingled with observational noise. Evaluation of a likelihood function may not be analytically tractable in these cases, and thus several alternative Markov chain Monte Carlo (MCMC) schemes, operating on the product space of the observations and the parameters, are introduced.
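The surrogate-posterior idea can be illustrated with a toy one-parameter problem: evaluate the "expensive" forward model at a few nodes, build a cheap polynomial interpolant, and then evaluate the likelihood on the surrogate instead. The forward model, nodes, and noise level below are invented for illustration; the paper's actual method uses stochastic spectral (polynomial chaos) expansions, for which this Lagrange interpolant is only a stand-in.

```python
import math

def lagrange_surrogate(f, nodes):
    """Cheap polynomial surrogate of a forward model f, built by Lagrange
    interpolation on a few nodes (the only 'expensive' evaluations)."""
    vals = [f(x) for x in nodes]
    def surrogate(x):
        total = 0.0
        for i, xi in enumerate(nodes):
            w = 1.0
            for j, xj in enumerate(nodes):
                if i != j:
                    w *= (x - xj) / (xi - xj)
            total += w * vals[i]
        return total
    return surrogate

forward = lambda theta: theta * math.exp(-theta)   # toy forward model
surr = lagrange_surrogate(forward, [0.0, 0.5, 1.0, 1.5, 2.0])

def log_posterior(theta, y=0.3, sigma=0.05):
    # Gaussian likelihood around the surrogate prediction, flat prior on [0, 2]
    return -0.5 * ((y - surr(theta)) / sigma) ** 2

# the surrogate makes dense posterior evaluation cheap
grid = [i * 0.02 for i in range(101)]
weights = [math.exp(log_posterior(t)) for t in grid]
post = [w / sum(weights) for w in weights]
```

Every posterior evaluation on the grid costs only a polynomial evaluation; the forward model was called just five times, which is the acceleration the abstract describes.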
Space Shuttle RTOS Bayesian Network
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Beling, Peter A.
2001-01-01
With shrinking budgets and the requirements to increase reliability and operational life of the existing orbiter fleet, NASA has proposed various upgrades for the Space Shuttle that are consistent with national space policy. The cockpit avionics upgrade (CAU), a high priority item, has been selected as the next major upgrade. The primary functions of cockpit avionics include flight control, guidance and navigation, communication, and orbiter landing support. Secondary functions include the provision of operational services for non-avionics systems such as data handling for the payloads and caution and warning alerts to the crew. Recently, a process to select the optimal commercial-off-the-shelf (COTS) real-time operating system (RTOS) for the CAU was conducted by United Space Alliance (USA) Corporation, which is a joint venture between Boeing and Lockheed Martin, the prime contractor for space shuttle operations. In order to independently assess the RTOS selection, NASA has used the Bayesian network-based scoring methodology described in this paper. Our two-stage methodology addresses the issue of RTOS acceptability by incorporating functional, performance and non-functional software measures related to reliability, interoperability, certifiability, efficiency, correctness, business, legal, product history, cost and life cycle. The first stage of the methodology involves obtaining scores for the various measures using a Bayesian network. The Bayesian network incorporates the causal relationships between the various and often competing measures of interest while also assisting the inherently complex decision analysis process with its ability to reason under uncertainty. The structure and selection of prior probabilities for the network is extracted from experts in the field of real-time operating systems. Scores for the various measures are computed using Bayesian probability. In the second stage, multi-criteria trade-off analyses are performed between the scores
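The first-stage scoring can be reduced to its smallest case: Bayes' rule on a two-node network in which a latent quality variable causes an observable outcome. All probabilities below are hypothetical placeholders, not values elicited from the paper's experts; a real CAU network would have many such nodes chained together.

```python
def posterior(prior_good, p_pass_given_good, p_pass_given_bad, passed):
    """Bayes' rule for a minimal two-node network:
    latent RTOS quality -> observed benchmark outcome (hypothetical numbers)."""
    like_good = p_pass_given_good if passed else 1 - p_pass_given_good
    like_bad = p_pass_given_bad if passed else 1 - p_pass_given_bad
    num = prior_good * like_good
    return num / (num + (1 - prior_good) * like_bad)

# prior P(quality good) = 0.6; benchmark passes with prob 0.9 if good, 0.3 if bad
score = posterior(0.6, 0.9, 0.3, passed=True)
```

Observing a passed benchmark raises the belief in good quality from 0.6 to 0.54/0.66, roughly 0.82; this reasoning under uncertainty is what the full network performs across all the competing measures.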
Using statistical process control to make data-based clinical decisions.
Pfadt, A; Wheeler, D J
1995-01-01
Applied behavior analysis is based on an investigation of variability due to interrelationships among antecedents, behavior, and consequences. This permits testable hypotheses about the causes of behavior as well as for the course of treatment to be evaluated empirically. Such information provides corrective feedback for making data-based clinical decisions. This paper considers how a different approach to the analysis of variability based on the writings of Walter Shewhart and W. Edwards Deming in the area of industrial quality control helps to achieve similar objectives. Statistical process control (SPC) was developed to implement a process of continual product improvement while achieving compliance with production standards and other requirements for promoting customer satisfaction. SPC involves the use of simple statistical tools, such as histograms and control charts, as well as problem-solving techniques, such as flow charts, cause-and-effect diagrams, and Pareto charts, to implement Deming's management philosophy. These data-analytic procedures can be incorporated into a human service organization to help to achieve its stated objectives in a manner that leads to continuous improvement in the functioning of the clients who are its customers. Examples are provided to illustrate how SPC procedures can be used to analyze behavioral data. Issues related to the application of these tools for making data-based clinical decisions and for creating an organizational climate that promotes their routine use in applied settings are also considered. PMID:7592154
Bilingualism and Inhibitory Control Influence Statistical Learning of Novel Word Forms
Bartolotti, James; Marian, Viorica; Schroeder, Scott R.; Shook, Anthony
2011-01-01
We examined the influence of bilingual experience and inhibitory control on the ability to learn a novel language. Using a statistical learning paradigm, participants learned words in two novel languages that were based on the International Morse Code. First, participants listened to a continuous stream of words in a Morse code language to test their ability to segment words from continuous speech. Since Morse code does not overlap in form with natural languages, interference from known languages was minimized. Next, participants listened to another Morse code language composed of new words that conflicted with the first Morse code language. Interference in this second language was high due to conflict between languages and due to the presence of two colliding cues (compressed pauses between words and statistical regularities) that competed to define word boundaries. Results suggest that bilingual experience can improve word learning when interference from other languages is low, while inhibitory control ability can improve word learning when interference from other languages is high. We conclude that the ability to extract novel words from continuous speech is a skill that is affected both by linguistic factors, such as bilingual experience, and by cognitive abilities, such as inhibitory control. PMID:22131981
Statistical process control for AR(1) or non-Gaussian processes using wavelets coefficients
NASA Astrophysics Data System (ADS)
Cohen, A.; Tiplica, T.; Kobi, A.
2015-11-01
Autocorrelation and non-normality of process characteristic variables are two main difficulties that industrial engineers must face when implementing control charting techniques. This paper presents new results regarding the probability distribution of wavelet coefficients. First, we highlight that wavelet coefficients strongly decrease the autocorrelation degree of the original data and are approximately normally distributed, especially in the case of the Haar wavelet. We used an AR(1) model with positive autoregressive parameters to simulate autocorrelated data. Illustrative examples are presented to show these properties of the wavelet coefficients. Second, the distributional parameters of the wavelet coefficients are derived, showing that wavelet coefficients have statistical properties that are well suited to SPC purposes.
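The decorrelation property is easy to check numerically: the level-1 Haar detail coefficients over non-overlapping pairs, d_k = (x_{2k} - x_{2k+1}) / sqrt(2), largely remove the lag-1 autocorrelation of an AR(1) series with a positive parameter. This is a sketch, not the paper's code; the AR(1) parameter and sample size are arbitrary choices.

```python
import random

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation."""
    m = sum(x) / len(x)
    num = sum((x[t] - m) * (x[t + 1] - m) for t in range(len(x) - 1))
    den = sum((v - m) ** 2 for v in x)
    return num / den

# simulate a strongly autocorrelated AR(1) series: x_t = 0.8*x_{t-1} + e_t
rng = random.Random(0)
phi, x, series = 0.8, 0.0, []
for _ in range(4000):
    x = phi * x + rng.gauss(0.0, 1.0)
    series.append(x)

# level-1 Haar detail coefficients over non-overlapping pairs
root2 = 2 ** 0.5
details = [(series[2 * k] - series[2 * k + 1]) / root2
           for k in range(len(series) // 2)]
```

For this model the lag-1 autocorrelation of the detail coefficients is -phi*(1-phi)/2 in theory (about -0.08 at phi = 0.8), versus 0.8 for the raw series, which is why charting the coefficients instead of the raw data is attractive.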
Monitoring Actuarial Present Values of Term Life Insurance By a Statistical Process Control Chart
NASA Astrophysics Data System (ADS)
Hafidz Omar, M.
2015-06-01
Tracking the performance of life insurance or similar insurance policies using a standard statistical process control chart is complex because of many factors. In this work, we present the difficulties in doing so. However, with some modifications of the SPC charting framework, the difficulty becomes manageable for actuaries. We therefore propose monitoring a simpler but natural actuarial quantity that is typically found in recursion formulas for reserves, profit testing, and present values. We share some simulation results for the monitoring process, and some advantages of this approach are discussed.
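The kind of recursion-based quantity alluded to above can be illustrated with the standard backward recursion for the actuarial present value of n-year term insurance paying 1 at the end of the year of death: A(x, n) = v*q_x + v*p_x*A(x+1, n-1), with v = 1/(1+i). The mortality rates below are hypothetical, and this is a generic textbook recursion rather than the quantity the author monitors.

```python
def term_insurance_apv(qs, i=0.05):
    """Actuarial present value of n-year term insurance (benefit 1 paid at
    the end of the year of death) via the backward recursion
        A(x, n) = v*q_x + v*(1 - q_x)*A(x+1, n-1),  v = 1/(1+i).
    qs: hypothetical one-year mortality rates q_x, q_{x+1}, ..., q_{x+n-1}."""
    v = 1.0 / (1.0 + i)
    apv = 0.0
    for q in reversed(qs):       # fold from the last policy year backwards
        apv = v * q + v * (1 - q) * apv
    return apv

apv = term_insurance_apv([0.010, 0.012, 0.015], i=0.05)
```

The recursion gives the same value as summing v^(k+1) times the probability of dying in year k+1, but updates naturally year by year, which is what makes such quantities convenient to place on a chart.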
Statistical process control program at a ceramics vendor facility. Final report
Enke, G.M.
1992-12-01
Development of a statistical process control (SPC) program at a ceramics vendor location was deemed necessary to improve product quality, reduce manufacturing flowtime, and reduce quality costs borne by AlliedSignal Inc., Kansas City Division (KCD), and the vendor. Because of the lack of available KCD manpower and the required time schedule for the project, it was necessary for the SPC program to be implemented by an external contractor. Approximately a year after the program had been installed, the original baseline was reviewed so that the success of the project could be determined.
Mezzenga, Emilio; D'Errico, Vincenzo; Sarnelli, Anna; Strigari, Lidia; Menghi, Enrico; Marcocci, Francesco; Bianchini, David; Benassi, Marcello
2016-01-01
The purpose of this study was to retrospectively evaluate the results from a Helical TomoTherapy Hi-Art treatment system relating to quality controls based on daily static and dynamic output checks using statistical process control methods. Individual value X-charts, exponentially weighted moving average charts, and process capability and acceptability indices were used to monitor the treatment system performance. Daily output values measured from January 2014 to January 2015 were considered. The results obtained showed that, although the process was in control, there was an out-of-control situation in the principal maintenance intervention for the treatment system. In particular, process capability indices showed a decreasing percentage of points in control which was, however, acceptable according to AAPM TG148 guidelines. Our findings underline the importance of restricting the acceptable range of daily output checks and suggest a future line of investigation for a detailed process control of daily output checks for the Helical TomoTherapy Hi-Art treatment system. PMID:26848962
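The exponentially weighted moving average chart used above follows standard formulas: z_t = lam*x_t + (1-lam)*z_{t-1}, with limits at the target plus or minus L*sigma*sqrt(lam/(2-lam)*(1-(1-lam)^(2t))). A sketch with hypothetical daily output values, not the study's measurements:

```python
import math

def ewma_chart(values, target, sigma, lam=0.1, L=2.7):
    """EWMA chart points with time-varying control limits.
    Returns a list of (z_t, LCL_t, UCL_t) tuples."""
    z, out = target, []
    for t, x in enumerate(values, start=1):
        z = lam * x + (1 - lam) * z
        half = L * sigma * math.sqrt(
            lam / (2 - lam) * (1 - (1 - lam) ** (2 * t)))
        out.append((z, target - half, target + half))
    return out

# hypothetical normalized daily output checks around a target of 1.00
daily = [1.01, 0.99, 1.02, 1.00, 0.98]
points = ewma_chart(daily, target=1.00, sigma=0.01)
```

Because each z_t carries a memory of earlier values, the EWMA chart detects the small sustained drifts (for example, after a maintenance intervention) that an individuals chart can miss.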
NASA Astrophysics Data System (ADS)
von Toussaint, Udo
2011-07-01
Bayesian inference provides a consistent method for the extraction of information from physics experiments even in ill-conditioned circumstances. The approach provides a unified rationale for data analysis, which both justifies many of the commonly used analysis procedures and reveals some of the implicit underlying assumptions. This review summarizes the general ideas of Bayesian probability theory with emphasis on the application to the evaluation of experimental data. As case studies for Bayesian parameter estimation techniques, examples are discussed that range from extra-solar planet detection and the deconvolution of apparatus functions for improving energy resolution to change-point estimation in time series. Special attention is paid to the numerical techniques suited for Bayesian analysis, with a focus on recent developments of Markov chain Monte Carlo algorithms for high-dimensional integration problems. Bayesian model comparison, the quantitative ranking of models for the explanation of a given data set, is illustrated with examples collected from cosmology, mass spectroscopy, and surface physics, covering problems such as background subtraction and automated outlier detection. Additionally, Bayesian inference techniques for the design and optimization of future experiments are introduced. Experiments, instead of being merely passive recording devices, can now be designed to adapt to measured data and to change the measurement strategy on the fly to maximize the information of an experiment. The applied key concepts and necessary numerical tools which provide the means of designing such inference chains and the crucial aspects of data fusion are summarized and some of the expected implications are highlighted.
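The Markov chain Monte Carlo machinery surveyed above can be reduced to a few lines in its simplest form: a random-walk Metropolis sampler for the mean of Gaussian data under a flat prior. This is a toy illustration, not code from the review; the data and tuning parameters are invented.

```python
import math
import random

def metropolis(loglike, x0, n_samples=5000, step=0.5, seed=1):
    """Minimal random-walk Metropolis sampler for a 1-D target."""
    rng = random.Random(seed)
    x, lp = x0, loglike(x0)
    samples = []
    for _ in range(n_samples):
        prop = x + rng.gauss(0.0, step)        # symmetric proposal
        lp_prop = loglike(prop)
        if math.log(rng.random()) < lp_prop - lp:   # accept/reject step
            x, lp = prop, lp_prop
        samples.append(x)
    return samples

# posterior for the mean mu of data y ~ N(mu, 1) under a flat prior
data = [1.8, 2.2, 2.0, 1.9, 2.1]
loglike = lambda mu: -0.5 * sum((y - mu) ** 2 for y in data)
draws = metropolis(loglike, x0=0.0)
post_mean = sum(draws[1000:]) / len(draws[1000:])   # discard burn-in
```

The same accept/reject skeleton underlies the high-dimensional samplers the review discusses; only the target density and the proposal mechanism become more sophisticated.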
Bayesian nonparametric models for ranked set sampling.
Gemayel, Nader; Stasny, Elizabeth A; Wolfe, Douglas A
2015-04-01
Ranked set sampling (RSS) is a data collection technique that combines measurement with judgment ranking for statistical inference. This paper lays out a formal and natural Bayesian framework for RSS that is analogous to its frequentist justification, and that does not require the assumption of perfect ranking or use of any imperfect ranking models. Prior beliefs about the judgment order statistic distributions and their interdependence are embodied by a nonparametric prior distribution. Posterior inference is carried out by means of Markov chain Monte Carlo techniques, and yields estimators of the judgment order statistic distributions (and of functionals of those distributions). PMID:25326663
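The balanced RSS data-collection scheme described above can be sketched as follows: in each cycle, form k judgment sets of k units, rank each set, and fully measure only the i-th ranked unit of the i-th set. The sketch assumes perfect judgment ranking (ranking by the true values), which is exactly the assumption the paper's Bayesian framework does not require; the population values are hypothetical.

```python
import random

def ranked_set_sample(population, k=3, cycles=10, seed=7):
    """Balanced ranked set sample: per cycle, k sets of k units; measure
    the i-th smallest unit from the i-th set (perfect ranking assumed)."""
    rng = random.Random(seed)
    sample = []
    for _ in range(cycles):
        for i in range(k):
            judgment_set = sorted(rng.sample(population, k))
            sample.append(judgment_set[i])   # measure one order statistic
    return sample

# hypothetical population for illustration
pop_rng = random.Random(1)
population = [pop_rng.gauss(50.0, 10.0) for _ in range(500)]
rss = ranked_set_sample(population)
```

Only k*cycles units are fully measured, yet each measurement carries the extra information of its judgment rank, which is what makes RSS estimators more efficient than simple random sampling of the same size.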
Using a statistical process control chart during the quality assessment of cancer registry data.
Myles, Zachary M; German, Robert R; Wilson, Reda J; Wu, Manxia
2011-01-01
Statistical process control (SPC) charts may be used to detect acute variations in the data while simultaneously evaluating unforeseen aberrations that may warrant further investigation by the data user. Using cancer stage data captured by the Summary Stage 2000 (SS2000) variable, we sought to present a brief report highlighting the utility of the SPC chart during the quality assessment of cancer registry data. Using a county-level caseload for the diagnosis period of 2001-2004 (n=25,648), we found the overall variation of the SS2000 variable to be in control during diagnosis years of 2001 and 2002, exceeded the lower control limit (LCL) in 2003, and exceeded the upper control limit (UCL) in 2004; in situ/localized stages were in control throughout the diagnosis period, regional stage exceeded UCL in 2004, and distant stage exceeded the LCL in 2001 and the UCL in 2004. Our application of the SPC chart with cancer registry data illustrates that the SPC chart may serve as a readily available and timely tool for identifying areas of concern during the data collection and quality assessment of central cancer registry data. PMID:22223059
NASA Technical Reports Server (NTRS)
Oravec, Heather Ann; Daniels, Christopher C.
2014-01-01
The National Aeronautics and Space Administration has been developing a novel docking system to meet the requirements of future exploration missions to low-Earth orbit and beyond. A dynamic gas pressure seal is located at the main interface between the active and passive mating components of the new docking system. This seal is designed to operate in the harsh space environment, but must also perform within strict loading requirements while maintaining an acceptable level of leak rate. In this study, a candidate silicone elastomer seal was designed, and multiple subscale test articles were manufactured for evaluation purposes. The force required to fully compress each test article at room temperature was quantified and found to be below the maximum allowable load for the docking system. However, a significant amount of scatter was observed in the test results. Due to the stochastic nature of the mechanical performance of this candidate docking seal, a statistical process control technique was implemented to isolate unusual compression behavior from typical mechanical performance. The results of this statistical analysis indicated a lack of process control, suggesting a variation in the manufacturing phase of the process. Further investigation revealed that changes in the manufacturing molding process had occurred which may have influenced the mechanical performance of the seal. This knowledge improves the chances that this and future space seals will satisfy or exceed design specifications.
Editorial: Bayesian benefits for child psychology and psychiatry researchers.
Oldehinkel, Albertine J
2016-09-01
For many scientists, performing statistical tests has become an almost automated routine. However, p-values are frequently used and interpreted incorrectly; and even when used appropriately, p-values tend to provide answers that do not match researchers' questions and hypotheses well. Bayesian statistics present an elegant and often more suitable alternative. The Bayesian approach has rarely been applied in child psychology and psychiatry research so far, but the development of user-friendly software packages and tutorials has placed it well within reach now. Because Bayesian analyses require a more refined definition of hypothesized probabilities of possible outcomes than the classical approach, going Bayesian may offer the additional benefit of sparking the development and refinement of theoretical models in our field. PMID:27535649
Bayesian analysis on meta-analysis of case-control studies accounting for within-study correlation.
Chen, Yong; Chu, Haitao; Luo, Sheng; Nie, Lei; Chen, Sining
2015-12-01
In retrospective studies, odds ratio is often used as the measure of association. Under independent beta prior assumption, the exact posterior distribution of odds ratio given a single 2 × 2 table has been derived in the literature. However, independence between risks within the same study may be an oversimplified assumption because cases and controls in the same study are likely to share some common factors and thus to be correlated. Furthermore, in a meta-analysis of case-control studies, investigators usually have multiple 2 × 2 tables. In this article, we first extend the published results on a single 2 × 2 table to allow within study prior correlation while retaining the advantage of closed-form posterior formula, and then extend the results to multiple 2 × 2 tables and regression setting. The hyperparameters, including within study correlation, are estimated via an empirical Bayes approach. The overall odds ratio and the exact posterior distribution of the study-specific odds ratio are inferred based on the estimated hyperparameters. We conduct simulation studies to verify our exact posterior distribution formulas and investigate the finite sample properties of the inference for the overall odds ratio. The results are illustrated through a twin study for genetic heritability and a meta-analysis for the association between the N-acetyltransferase 2 (NAT2) acetylation status and colorectal cancer. PMID:22143403
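The independent-beta baseline that this article extends can be sketched by Monte Carlo: draw the case and control risks from their Beta posteriors and form the odds ratio draw by draw. The 2x2 counts below are hypothetical, and the sketch shows only the independent-prior baseline; the paper's contribution, allowing within-study prior correlation with a closed-form posterior, is not reproduced here.

```python
import random

def posterior_odds_ratio(x1, n1, x0, n0, a=1.0, b=1.0, draws=20000, seed=3):
    """Monte Carlo draws from the posterior of the odds ratio for one
    2x2 table under independent Beta(a, b) priors on the two risks."""
    rng = random.Random(seed)
    ors = []
    for _ in range(draws):
        p1 = rng.betavariate(a + x1, b + n1 - x1)   # case-group risk
        p0 = rng.betavariate(a + x0, b + n0 - x0)   # control-group risk
        ors.append((p1 / (1 - p1)) / (p0 / (1 - p0)))
    return ors

# hypothetical table: 30/100 exposed cases, 15/100 exposed controls
or_draws = posterior_odds_ratio(30, 100, 15, 100)
median_or = sorted(or_draws)[len(or_draws) // 2]
```

Quantiles of the draws give exact-posterior credible intervals for the odds ratio without any large-sample approximation, which is the appeal of the closed-form Bayesian treatment the article builds on.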
Yu, Hongliang; Gu, Dayong; He, Xia; Gao, Xianshu; Bian, Xiuhua
2016-01-01
Whether the addition of induction chemotherapy (IC) or adjuvant chemotherapy (AC) to concurrent chemoradiotherapy (CCRT) is superior to CCRT alone for locally advanced nasopharyngeal cancer is unknown. A Bayesian network meta-analysis was performed to investigate the efficacy of CCRT, IC + CCRT, and CCRT + AC on locally advanced nasopharyngeal cancer. The overall survival (OS) with hazard ratios (HRs) and locoregional recurrence rates (LRRs) and distant metastasis rates (DMRs) with risk ratios (RRs) were investigated. After a comprehensive database search, eleven studies involving 2,626 assigned patients were included in this network meta-analysis. Compared with CCRT alone, IC + CCRT resulted in no significant improvement in OS or LRR and a marginal improvement in DMR (OS: HR = 0.67, 95% credible interval (CrI) 0.32–1.18; LRR: RR = 1.79, 95% CrI 0.80–3.51; DMR: RR = 1.79, 95% CrI 0.24–1.04), and CCRT + AC exhibited no beneficial effects on any of the endpoints of OS, LRR, or DMR (OS: HR = 0.99, 95% CrI 0.64–1.43; LRR: RR = 0.78, 95% CrI 0.43–1.32; DMR: RR = 0.85, 95% CrI 0.57–1.24). In conclusion, for locally advanced nasopharyngeal cancer, no significant differences in the treatment efficacies of CCRT, IC + CCRT, and CCRT + AC were found, with the exception of a marginally significant improvement in distant control observed following IC + CCRT compared with CCRT alone. PMID:26793000
Lee, Young Ho; Song, Gwan Gyu
2016-05-01
The aim of this study was to assess the relative efficacy and tolerability of duloxetine, pregabalin, and milnacipran at the recommended doses in patients with fibromyalgia. Randomized controlled trials (RCTs) examining the efficacy and safety of duloxetine 60 mg, pregabalin 300 mg, pregabalin 150 mg, milnacipran 200 mg, and milnacipran 100 mg compared to placebo in patients with fibromyalgia were included in this Bayesian network meta-analysis. Nine RCTs including 5140 patients met the inclusion criteria. The proportion of patients with >30 % improvement from baseline in pain was significantly higher in the duloxetine 60 mg, pregabalin 300 mg, milnacipran 100 mg, and milnacipran 200 mg groups than in the placebo group [pairwise odds ratio (OR) 2.33, 95 % credible interval (CrI) 1.50-3.67; OR 1.68, 95 % CrI 1.25-2.28; OR 1.62, 95 % CrI 1.16-2.25; and OR 1.61; 95 % CrI 1.15-2.24, respectively]. Ranking probability based on the surface under the cumulative ranking curve (SUCRA) indicated that duloxetine 60 mg had the highest probability of being the best treatment for achieving the response level (SUCRA = 0.9431), followed by pregabalin 300 mg (SUCRA = 0.6300), milnacipran 100 mg (SUCRA = 0.5680), milnacipran 200 mg (SUCRA = 0.5617), pregabalin 150 mg (SUCRA = 0.2392), and placebo (SUCRA = 0.0580). The risk of withdrawal due to adverse events was lower in the placebo group than in the pregabalin 300 mg, duloxetine 60 mg, milnacipran 100 mg, and milnacipran 200 mg groups. Duloxetine 60 mg, pregabalin 300 mg, milnacipran 100 mg, and milnacipran 200 mg were more efficacious than placebo. However, there was no significant difference in the efficacy and tolerability between the medications at the recommended doses. PMID:27000046
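The SUCRA values reported above are the average of a treatment's cumulative ranking probabilities: SUCRA = (sum over k of P(rank better than or equal to k)) / (a - 1) for a treatments. A minimal sketch with hypothetical rank probabilities, not the meta-analysis's posterior output:

```python
def sucra(rank_probs):
    """SUCRA for one treatment from its posterior rank probabilities.
    rank_probs[k] = P(treatment takes rank k+1), rank 1 being best;
    SUCRA = sum of cumulative rank probabilities / (a - 1)."""
    a = len(rank_probs)
    cum, total = 0.0, 0.0
    for p in rank_probs[:-1]:    # cumulative probs up to rank a-1
        cum += p
        total += cum
    return total / (a - 1)

# hypothetical rank probabilities for one treatment among 3 competitors
value = sucra([0.7, 0.2, 0.1])
```

A treatment certain to rank first scores 1, one certain to rank last scores 0, so SUCRA compresses a whole posterior ranking distribution into the single ordering used in the abstract.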
ERIC Educational Resources Information Center
van Krimpen-Stoop, Edith M. L. A.; Meijer, Rob R.
Person-fit research in the context of paper-and-pencil tests is reviewed, and some specific problems regarding person fit in the context of computerized adaptive testing (CAT) are discussed. Some new methods are proposed to investigate person fit in a CAT environment. These statistics are based on Statistical Process Control (SPC) theory. A…
Particle identification in ALICE: a Bayesian approach
NASA Astrophysics Data System (ADS)
Adam, J.; et al. (ALICE Collaboration)
2016-05-01
We present a Bayesian approach to particle identification (PID) within the ALICE experiment. The aim is to more effectively combine the particle identification capabilities of its various detectors. After a brief explanation of the adopted methodology and formalism, the performance of the Bayesian PID approach for charged pions, kaons and protons in the central barrel of ALICE is studied. PID is performed via measurements of specific energy loss (dE/dx) and time of flight. PID efficiencies and misidentification probabilities are extracted and compared with Monte Carlo simulations using high-purity samples of identified particles in the decay channels K0S → π−π+, φ → K−K+, and Λ → pπ− in p-Pb collisions at √(s_NN) = 5.02 TeV. In order to thoroughly assess the validity of the Bayesian approach, this methodology was used to obtain corrected pT spectra of pions, kaons, protons, and D0 mesons in pp collisions at √s = 7 TeV. In all cases, the results using Bayesian PID were found to be consistent with previous measurements performed by ALICE using a standard PID approach. For the measurement of D0 → K−π+, it was found that a Bayesian PID approach gave a higher signal-to-background ratio and a similar or larger statistical significance when compared with standard PID selections, despite a reduced identification efficiency. Finally, we present an exploratory study of the measurement of Λc+ → pK−π+ in pp collisions at √s = 7 TeV, using the Bayesian approach for the identification of its decay products.
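The combination rule behind such a Bayesian PID can be sketched as a textbook application of Bayes' theorem. All species parameters, detector resolutions, and priors below are invented placeholders, not ALICE values:

```python
import numpy as np

species = ["pion", "kaon", "proton"]
priors = np.array([0.7, 0.2, 0.1])          # assumed relative abundances

# Hypothetical expected detector responses per species (illustrative only):
# specific energy loss dE/dx (arbitrary units) and time of flight (ns).
dedx_mu, dedx_sig = np.array([50.0, 60.0, 75.0]), np.array([5.0, 5.0, 6.0])
tof_mu, tof_sig = np.array([10.0, 10.4, 11.0]), np.array([0.2, 0.2, 0.2])

def gauss(x, mu, sig):
    return np.exp(-0.5 * ((x - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))

def pid_posterior(dedx, tof):
    """Posterior species probabilities for one track, assuming the two
    detector signals are conditionally independent given the species."""
    likelihood = gauss(dedx, dedx_mu, dedx_sig) * gauss(tof, tof_mu, tof_sig)
    post = priors * likelihood
    return post / post.sum()

p = pid_posterior(dedx=52.0, tof=10.05)
```

A selection then keeps the species with the highest posterior, or applies a probability threshold; the efficiency/purity trade-off comes from that threshold.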
A Case Study of Ion Implant In-Line Statistical Process Control
NASA Astrophysics Data System (ADS)
Zhao, Zhiyong; Ramczyk, Kenneth; Hall, Darcy; Wang, Linda
2005-09-01
Ion implantation is one of the most critical processes in the front-end-of-line for ULSI manufacturing. With increasing complexity in device layout, fab cycle times can only be expected to grow longer. To ensure yield and consistent device performance, it is very beneficial to have a Statistical Process Control (SPC) practice that can detect tool issues to prevent excursions. Implanters may also abort a process due to run-time issues, which requires human intervention to disposition the lot. Since device wafers have a fixed flow plan and can only be annealed at certain points in the manufacturing process, it is not practical to use four-point probe measurements to check such implants. The pattern recognition option on some metrology tools, such as ThermaWave (TWave), allows the user to check an open area on device wafers for implant information. These two considerations prompted this work to examine the sensitivity of TWave to different implant processes and the possibility of setting up an SPC practice in a high-volume manufacturing fab. In this work, the authors compare test wafer results with those of device wafers under variations in implant conditions such as dose, implant angle, and energy. The intention is to correlate analytical measurements such as sheet resistance (Rs) and Secondary Ion Mass Spectrometry (SIMS) with device data such as electrical testing and sort yield. For the implant processes tested in this work, a ±1.5% TWave control limit translates to about 0.5° of implant angle control or a 2% to 8% dose change. The dose sensitivity is understandably modest, since the tested processes are deep-layer implants. Based on the statistical calculations, we assess that the experimental error bar is within 1% of the measured values.
Spectral likelihood expansions for Bayesian inference
NASA Astrophysics Data System (ADS)
Nagel, Joseph B.; Sudret, Bruno
2016-03-01
A spectral approach to Bayesian inference is presented. It pursues the emulation of the posterior probability density. The starting point is a series expansion of the likelihood function in terms of orthogonal polynomials. From this spectral likelihood expansion all statistical quantities of interest can be calculated semi-analytically. The posterior is formally represented as the product of a reference density and a linear combination of polynomial basis functions. Both the model evidence and the posterior moments are related to the expansion coefficients. This formulation avoids Markov chain Monte Carlo simulation and allows one to make use of linear least squares instead. The pros and cons of spectral Bayesian inference are discussed and demonstrated on the basis of simple applications from classical statistics and inverse modeling.
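A minimal 1-D sketch of the idea, assuming a uniform prior on [-1, 1] and a Gaussian-shaped likelihood (the paper treats general inverse problems; nothing below is the authors' code): expand the likelihood in Legendre polynomials, then read the evidence and posterior mean off the coefficients.

```python
import numpy as np
from numpy.polynomial import legendre as leg

# Likelihood evaluated on a grid over the prior's support [-1, 1].
theta = np.linspace(-1.0, 1.0, 2001)
like = np.exp(-0.5 * ((theta - 0.3) / 0.2) ** 2)

# Spectral likelihood expansion: least-squares fit of Legendre coefficients.
c = leg.legfit(theta, like, deg=20)

# Orthogonality gives semi-analytic quantities under the uniform prior 1/2:
# evidence Z = c_0, posterior mean = c_1 / (3 * c_0).
Z = c[0]
post_mean = c[1] / (3.0 * c[0])

# Brute-force check of the same integrals.
dt = theta[1] - theta[0]
Z_num = float(np.sum(like) * 0.5 * dt)
mean_num = float(np.sum(theta * like) * 0.5 * dt) / Z_num
```

The least-squares fit replaces MCMC sampling entirely, which is the point of the method: every posterior quantity becomes a function of the coefficient vector.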
Toward Bayesian chemometrics--a tutorial on some recent advances.
Chen, Hongshu; Bakshi, Bhavik R; Goel, Prem K
2007-10-17
Chemometrics is increasingly being perceived as a maturing science. While this perception seems to be true with regard to the traditional methods and applications of chemometrics, this article argues that advances in instrumentation, computation, and statistical theory may combine to drive a resurgence in chemometrics research. Previous surges in chemometrics research activity were driven by the development of new ways of making better use of available information. Bayesian statistics can further enhance the ability to use domain-specific information to obtain more accurate and useful models, and presents many research opportunities as well as challenges. Although Bayesian statistics is not new, recent advances via sampling-based Monte Carlo methods make these methods practical for large-scale applications without the assumptions of Gaussian noise and uniform prior distributions made by most chemometric methods. This article provides an overview of traditional chemometric methods from a Bayesian view and a tutorial on some recently developed techniques in Bayesian chemometrics, such as Bayesian PCA and Bayesian latent variable regression. New challenges and opportunities for future work are also identified. PMID:17936101
Prémaud, A; Rousseau, A; Le Meur, Y; Venisse, N; Loichot, C; Turcant, A; Hoizey, G; Compagnon, P; Hary, L; Debruyne, D; Saivin, S; Jacqz-Aigrain, E; Marquet, P
2010-02-01
The aim of this study was to retrospectively and critically analyze the different steps of the individual dose-adjustment procedure employed in the concentration-controlled (CC) versus fixed-dose Apomygre trial, which showed that mycophenolate mofetil (MMF) dose adjustment using a limited sampling strategy significantly reduced the risk of treatment failure and acute rejection in renal transplant recipients at one year posttransplantation. The number of AUCs performed during the study and the circumstances of collection, the times of blood sampling, the Bayesian mycophenolic acid (MPA) area-under-the-curve (AUC) estimation procedures, and the physicians' compliance with MMF dose recommendations were retrospectively analyzed. Of the AUCs scheduled over the study, 92% were actually performed. Sampling times were very well respected. Bayesian estimation of MPA exposure was performed locally by the pharmacologists in accordance with the protocol instructions, and the AUC estimates obtained were virtually all confirmed a posteriori. On the other hand, a second AUC estimated by multiple linear regression could only be provided for 84% of the profiles and showed a large overestimation with respect to the Bayesian estimates for AUC values between 10 and 55 mg·h/L. In the CC arm, physicians' compliance was very good (85%), and application of the dose recommendations led to higher AUC values (42.1 ± 14.6 mg·h/L versus 36.7 ± 16.3 mg·h/L, p = 0.0035) and to more AUCs in the target range (69% versus 56%, p = 0.0343) than when dose recommendations were not applied. By analyzing the feasibility criteria of MMF Bayesian dose adjustment in detail, this study highlighted the requirements for successfully extrapolating the Apomygre trial results to routine practice: (i) respect of the PK sampling time-windows; (ii) use of relevant tools for accurate drug-exposure estimation and dose-adjustment calculation; and (iii) good compliance of the physicians with the recommended doses. PMID:19800973
A Bayesian Model for Determining Levels of Student Mastery.
ERIC Educational Resources Information Center
Schmalz, Steve W.; Cartledge, Carolyn M.
During the last decade the use of Bayesian statistical methods has become quite prevalent in the educational community. Yet, like most statistical techniques, little has been written concerning the application of these methods to the classroom setting. The purpose of this paper is to help correct such a deficiency in the literature by developing a…
A Bayesian belief network (BBN) was developed to characterize the effects of sediment accumulation on the water storage capacity of Lago Lucchetti (located in southwest Puerto Rico) and to forecast the life expectancy (usefulness) of the reservoir under different management scena...
Statistical characterization of negative control data in the Ames Salmonella/microsome test.
Hamada, C; Wada, T; Sakamoto, Y
1994-01-01
A statistical characterization of negative control data in the Ames Salmonella/microsome reverse mutation test was performed using data obtained at Takeda Analytical Research Laboratories from January 1989 to April 1990. The lot-to-lot variability of bacterial stock cultures and the day-to-day variability of experiments were small for Salmonella typhimurium strains TA1535 and TA1537 and Escherichia coli WP2uvrA, but larger for S. typhimurium TA100. The numbers of revertant colonies for all test strains studied here followed Poisson distributions within the same day. The two-fold rule, an empirical method for evaluating Ames Salmonella/microsome test results, is widely used in Japan, and was evaluated statistically here. The comparison-wise type I error rate was less than 0.05 for TA98, TA100, TA1535, TA1537, and WP2uvrA. Moreover, the rule is particularly conservative for TA100, for which the type I error rate was nearly 0. PMID:8187699
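The type I error of the two-fold rule under a Poisson null can be computed directly. The formulation below (a treated-plate count flagged when it reaches twice the control mean) is one common reading of the rule, and the mean counts are illustrative:

```python
from math import exp, log, lgamma, ceil

def poisson_sf(k, mu):
    """P(X >= k) for X ~ Poisson(mu), summed upward in log space to
    avoid overflow for large means."""
    term = exp(k * log(mu) - mu - lgamma(k + 1))   # pmf at k
    total, i = 0.0, k
    while term > 1e-300 and i < k + 10000:
        total += term
        i += 1
        term *= mu / i                              # pmf recurrence
    return total

def twofold_type1_error(mu):
    """Probability a Poisson(mu) count reaches 2*mu under the null."""
    return poisson_sf(ceil(2 * mu), mu)

# Low spontaneous counts (~10 revertants) give a small but real error rate;
# high counts (TA100-like, ~100 revertants) make the rule very conservative.
err_low = twofold_type1_error(10)
err_high = twofold_type1_error(100)
```

This reproduces the qualitative finding of the abstract: the error rate is well below 0.05 for low-count strains and essentially zero for TA100-like counts.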
Bayesian least squares deconvolution
NASA Astrophysics Data System (ADS)
Asensio Ramos, A.; Petit, P.
2015-11-01
Aims: We develop a fully Bayesian least squares deconvolution (LSD) that can be applied to the reliable detection of magnetic signals in noise-limited stellar spectropolarimetric observations using multiline techniques. Methods: We consider LSD under the Bayesian framework and we introduce a flexible Gaussian process (GP) prior for the LSD profile. This prior allows the result to automatically adapt to the presence of signal. We exploit several linear algebra identities to accelerate the calculations. The final algorithm can deal with thousands of spectral lines in a few seconds. Results: We demonstrate the reliability of the method with synthetic experiments and we apply it to real spectropolarimetric observations of magnetic stars. We are able to recover the magnetic signals using a small number of spectral lines, together with the uncertainty at each velocity bin. This allows the user to consider if the detected signal is reliable. The code to compute the Bayesian LSD profile is freely available.
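The core computation (a Gaussian-process prior over the common line profile combined with a linear multiline model) can be sketched as standard GP regression. The line weights, velocity grid, and kernel below are invented, and the paper's actual model is richer:

```python
import numpy as np

rng = np.random.default_rng(3)
nv, nlines = 40, 25
v = np.linspace(-20.0, 20.0, nv)                     # velocity bins (km/s)
true_z = -0.3 * np.exp(-0.5 * (v / 4.0) ** 2)        # common line profile

# Linear multiline model: each spectral line is a scaled copy of the profile.
weights = rng.uniform(0.2, 1.0, nlines)
W = np.kron(weights[:, None], np.eye(nv))            # (nlines*nv, nv)
sigma = 0.05                                         # per-pixel noise
data = W @ true_z + rng.normal(0.0, sigma, nlines * nv)

# Squared-exponential GP prior on the profile enforces smoothness.
K = np.exp(-0.5 * (v[:, None] - v[None, :]) ** 2 / 3.0**2)
A = W @ K @ W.T + sigma**2 * np.eye(nlines * nv)
z_post = K @ W.T @ np.linalg.solve(A, data)          # GP posterior mean
```

The linear-algebra identities mentioned in the abstract amount to rearranging this solve so that the cost scales with the number of velocity bins rather than the number of spectral lines.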
Statistics of plastic events in post-yield strain-controlled amorphous solids
NASA Astrophysics Data System (ADS)
Dubey, Awadhesh K.; Hentschel, H. George E.; Procaccia, Itamar; Singh, Murari
2016-06-01
Amorphous solids yield in strain-controlled protocols at a critical value of the strain. For larger strains the stress and energy display a generic complex serrated signal with elastic segments punctuated by sharp energy and stress plastic drops having a wide range of magnitudes. Here we provide a theory of the scaling properties of such serrated signals taking into account the system-size dependence. We show that the statistics are not homogeneous: they separate sharply to a regime of "small" and "large" drops, each endowed with its own scaling properties. A scaling theory is first derived solely by data analysis, showing a somewhat complex picture. But after considering the physical interpretation one discovers that the scaling behavior and the scaling exponents are in fact very simple and universal.
Statistical Analysis of Crossed Undulator for Polarization Control in a SASE FEL
Ding, Yuantao; Huang, Zhirong; /SLAC
2008-02-01
There is a growing interest in producing intense, coherent x-ray radiation with an adjustable and arbitrary polarization state. In this paper, we study the crossed undulator scheme (K.-J. Kim, Nucl. Instrum. Methods A 445, 329 (2000)) for rapid polarization control in a self-amplified spontaneous emission (SASE) free electron laser (FEL). Because a SASE source is a temporally chaotic light source, we perform a statistical analysis on the state of polarization using FEL theory and simulations. We show that by adding a small phase shifter and a short (about 1.3 times the FEL power gain length), 90°-rotated planar undulator after the main SASE planar undulator, one can obtain circularly polarized light, with over 80% polarization, near the FEL saturation.
Baldewijns, Greet; Luca, Stijn; Nagels, William; Vanrumste, Bart; Croonenborghs, Tom
2015-01-01
It has been shown that gait speed and transfer times are good measures of functional ability in the elderly. However, data currently acquired by systems that measure either gait speed or transfer times in the homes of elderly people require manual review by healthcare workers, and this reviewing process is time-consuming. To alleviate this burden, this paper proposes the use of statistical process control methods to automatically detect both positive and negative changes in transfer times. Three SPC techniques, tabular CUSUM, standardized CUSUM, and EWMA, all known for their ability to detect small shifts in the data, are evaluated on simulated transfer times. This analysis shows that EWMA is the best-suited method, with a detection accuracy of 82% and an average detection time of 9.64 days. PMID:26737425
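An EWMA chart of the kind evaluated in the paper can be sketched in a few lines; the λ, L, and transfer-time numbers below are illustrative, not the study's settings:

```python
import numpy as np

def ewma_alarm(x, target, sigma, lam=0.2, L=3.0):
    """Index of the first out-of-control sample, or -1 if none.

    Uses the exact time-varying EWMA control limits."""
    z = target
    for t, xt in enumerate(x):
        z = lam * xt + (1 - lam) * z
        var = sigma**2 * (lam / (2 - lam)) * (1 - (1 - lam) ** (2 * (t + 1)))
        if abs(z - target) > L * np.sqrt(var):
            return t
    return -1

# Simulated daily transfer times (seconds): stable at 10 s, then a 1.5 s
# upward shift from day 51 onward.
times = [10.0] * 50 + [11.5] * 30
alarm_day = ewma_alarm(times, target=10.0, sigma=1.0)
```

The small λ makes the chart sensitive to sustained small shifts at the cost of reacting slowly to single outliers; here the alarm fires a few days after the shift begins.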
NASA Astrophysics Data System (ADS)
Zhang, Deyi; Bao, Yuequan; Li, Hui; Ou, Jinping
2009-07-01
Structural health monitoring (SHM) is regarded as an effective technique for structural damage diagnosis, safety and integrity assessment, and service-life evaluation. SHM techniques based on vibration modal parameters are ineffective for space-structure health maintenance, whereas statistical process control (SPC) is a simple and effective tool for monitoring the operational state of structures. Therefore, using strain measurements from optical fiber Bragg grating (OFBG) sensors, this paper proposes Johnson-transformation-based SPC to monitor the structural health state and detect unexpected excitations online. The large and complicated space structure of the China National Aquatics Center is used as an example to verify the proposed method both numerically and experimentally. It is found that the Johnson transformation effectively improves the quality of SPC for the SHM process, and that the approach can clearly and effectively monitor the structural health state and detect unexpected external loads on the structure.
NASA Astrophysics Data System (ADS)
Villeta, M.; Sanz-Lobera, A.; González, C.; Sebastián, M. A.
2009-11-01
The implementation of Statistical Process Control (SPC) requires the use of measurement systems. The inherent variability of these systems influences the reliability of the measurement results obtained and, as a consequence, the SPC results. This paper investigates the influence of measurement uncertainty on the analysis of process capability, and seeks to reduce the effect of measurement uncertainty so as to approach the capability that the productive process really has. Both processes centered at a nominal value and off-center processes are considered, and a criterion is proposed that allows the adequacy of the dimensional measurement systems used in an SPC implementation to be validated.
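The effect can be illustrated with the standard variance decomposition σ_obs² = σ_proc² + σ_meas²; the paper's specific validation criterion is not reproduced here, and the numbers are invented:

```python
from math import sqrt

def observed_cp(lsl, usl, sigma_obs):
    """Capability index computed naively from observed variation."""
    return (usl - lsl) / (6.0 * sigma_obs)

def corrected_cp(lsl, usl, sigma_obs, sigma_meas):
    """Capability after removing measurement variance (assumes the
    measurement error is independent of the process variation)."""
    sigma_proc = sqrt(sigma_obs**2 - sigma_meas**2)
    return (usl - lsl) / (6.0 * sigma_proc)

# Tolerance 9.7-10.3, observed sigma 0.11, gauge sigma 0.05 (all invented).
cp_obs = observed_cp(9.7, 10.3, sigma_obs=0.11)
cp_corr = corrected_cp(9.7, 10.3, sigma_obs=0.11, sigma_meas=0.05)
# cp_corr > cp_obs: the process is more capable than the raw data suggest.
```

The gap between the two indices is one way to quantify whether a gauge is adequate for capability studies on a given tolerance.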
Simulation on a car interior aerodynamic noise control based on statistical energy analysis
NASA Astrophysics Data System (ADS)
Chen, Xin; Wang, Dengfeng; Ma, Zhengdong
2012-09-01
Accurately simulating interior aerodynamic noise is an important problem in car interior noise reduction. The unsteady aerodynamic pressure on body surfaces is shown to be the key factor in controlling high-frequency car interior aerodynamic noise at high speed. In this paper, a detailed statistical energy analysis (SEA) model is built, and vibro-acoustic power inputs are loaded on the model to obtain valid car interior noise results. The model is a solid foundation for further optimization of car interior noise control. After SEA analysis identifies the subsystems whose power contributions to car interior noise are most sensitive, the sound pressure level of car interior aerodynamic noise can be reduced by improving their sound and damping characteristics. Further vehicle testing shows that interior acoustic performance can be improved by using the detailed SEA model, which comprises more than 80 subsystems, together with unsteady aerodynamic pressure calculations on body surfaces and improved sound/damping materials. A reduction of more than 2 dB is achieved at center frequencies above 800 Hz. The proposed optimization method can serve as a reference for car interior aerodynamic noise control using a detailed SEA model integrated with unsteady computational fluid dynamics (CFD) and acoustic-contribution sensitivity analysis.
Statistical process control in the hybrid microelectronics manufacturing industry: A Navy view point
NASA Astrophysics Data System (ADS)
Azu, Charles C., Jr.
1993-04-01
The U.S. Navy is concerned with receiving high-quality hybrid microelectronic circuits. The Navy recognizes that in order to obtain high-quality circuits a manufacturer must have an effective statistical process control (SPC) program implemented. The implementation of effective SPC programs is an objective of the military hybrid microelectronics industry. Often the smaller manufacturers in the industry have little SPC implementation, while the larger manufacturers have practices originally developed for the control of other product lines outside the hybrid technology area. The industry recognizes that SPC programs will result in the high-quality hybrid microcircuits the U.S. Navy requires. In the past, the goal of the industry had not been to put effective process control methods in place, but merely to meet the government military standards on the quality of the hybrids produced. This practice, at best, resulted in a 'hit or miss' situation with respect to hybrid microcircuit assemblies meeting military standards. The U.S. Navy, through its MicroCIM program, has been challenging and working with the industry on SPC practice methods. The major limitation so far has been a lack of available sensors for the real-time collection of effective SPC data on the factory floor. This paper discusses the Navy's efforts to bring about effective SPC programs in the military hybrid manufacturing industry.
Eisenberg, Dan T.A.; Kuzawa, Christopher W.; Hayes, M. Geoffrey
2015-01-01
Objectives: Telomere length (TL) is commonly measured using quantitative PCR (qPCR). Although easier than the Southern blot terminal restriction fragment (TRF) method of TL measurement, one drawback of qPCR is that it introduces greater measurement error and thus reduces the statistical power of analyses. To address a potential source of measurement error, we consider the effect of well position on qPCR TL measurements. Methods: qPCR TL data from 3,638 people run on a Bio-Rad iCycler iQ are reanalyzed here. To evaluate measurement validity, correspondence with TRF measures, with age, and between mother and offspring is examined. Results: First, we present evidence for systematic variation in qPCR TL measurements in relation to thermocycler well position. Controlling for these well-position effects consistently improves measurement validity and yields estimated improvements in statistical power equivalent to increasing sample sizes by 16%. We additionally evaluated the linearity of the relationships between telomere and single-copy-gene control amplicons and between qPCR and TRF measures. We find that, unlike some previous reports, our data exhibit linear relationships. We introduce the standard error in percent, a superior method for quantifying measurement error compared to the commonly used coefficient of variation. Using this measure, we find that excluding samples with high measurement error does not improve measurement validity. Conclusions: Future studies using block-based thermocyclers should consider well-position effects. Since additional information can be gleaned from well-position corrections, re-running analyses of previous results with well-position correction could serve as an independent test of the validity of those results. PMID:25757675
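A simple form of a well-position correction subtracts each position's mean deviation from the plate-wide mean. This is an assumed, minimal version of the adjustment (the study's actual model may differ), shown on synthetic data:

```python
import numpy as np

def correct_well_effects(tl, well):
    """Remove each well position's mean offset from the measurements."""
    tl = np.asarray(tl, dtype=float)
    out = tl.copy()
    grand = tl.mean()
    for w in np.unique(well):
        mask = well == w
        out[mask] -= tl[mask].mean() - grand
    return out

# Synthetic T/S ratios with an artificial +/-0.5 offset by well block.
rng = np.random.default_rng(4)
well = np.repeat([0, 1], 50)
tl = rng.normal(7.0, 1.0, 100) + np.where(well == 0, 0.5, -0.5)
tl_corr = correct_well_effects(tl, well)
```

In practice the correction would be estimated from control samples rather than from the study samples themselves, so that real biological differences between groups are not removed along with the plate artifact.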
NASA Astrophysics Data System (ADS)
Cheong, Kwang-Ho; Lee, Me-Yeon; Kang, Sei-Kwon; Yoon, Jai-Woong; Park, Soah; Hwang, Taejin; Kim, Haeyoung; Kim, Kyoung Ju; Han, Tae Jin; Bae, Hoonsik
2015-07-01
The aim of this study is to set up statistical quality control for monitoring volumetric modulated arc therapy (VMAT) delivery error by using the machine's log data. Eclipse and a Clinac iX linac with the RapidArc system (Varian Medical Systems, Palo Alto, USA) are used for delivery of the VMAT plan. During the delivery of the RapidArc fields, the machine determines the delivered monitor units (MUs) and the gantry angle's position accuracy, and the standard deviations of the MU (σMU: dosimetric error) and the gantry angle (σGA: geometric error) are displayed on the console monitor after completion of the RapidArc delivery. In the present study, the log data were first analyzed to confirm their validity and usability; then, statistical process control (SPC) was applied to monitor the σMU and the σGA in a timely manner for all RapidArc fields: a total of 195 arc fields for 99 patients. The MU and the GA were determined twice for all fields, first during the patient-specific plan QA and then again during the first treatment. The σMU and σGA time series were quite stable irrespective of the treatment site; however, the σGA strongly depended on the gantry's rotation speed. The σGA of the RapidArc delivery for stereotactic body radiation therapy (SBRT) was smaller than that for typical VMAT; therefore, SPC was applied to SBRT cases and general cases separately. Moreover, the accuracy of the potentiometer for the gantry rotation is important because the σGA can change dramatically depending on its condition. By applying SPC to the σMU and σGA, we could monitor the delivery error efficiently. However, the upper and lower limits of SPC need to be determined carefully, with full knowledge of the machine and its log data.
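A minimal individuals (I) chart for a per-field σMU series uses a center line from the mean and limits from the average moving range (the standard I-MR construction; the σMU values below are invented, not the study's data):

```python
import numpy as np

def i_chart_limits(x):
    """Individuals-chart limits from the average moving range (d2 = 1.128)."""
    x = np.asarray(x, dtype=float)
    avg_mr = np.abs(np.diff(x)).mean()
    center = x.mean()
    half_width = 3.0 * avg_mr / 1.128
    return center - half_width, center + half_width

# Baseline per-field sigma_MU values (MU), then one suspicious new field.
sigma_mu = [0.21, 0.19, 0.22, 0.20, 0.18, 0.23, 0.21, 0.20, 0.19, 0.22]
lo, hi = i_chart_limits(sigma_mu)
out_of_control = [v for v in sigma_mu + [0.35] if not lo <= v <= hi]
```

Deriving the limits from a baseline period, as here, matches the abstract's caution that SPC limits must be set with knowledge of the machine's normal log-data behavior.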
NASA Astrophysics Data System (ADS)
Ji, Xinye; Shen, Chaopeng; Riley, William J.
2015-12-01
The soil moisture statistical fractal is an important tool for downscaling remotely sensed observations and has the potential to play a key role in multi-scale hydrologic modeling. The fractal was first introduced two decades ago, but relatively little is known regarding how its scaling exponents evolve in time in response to climatic forcings. Previous studies have neglected the process of moisture redistribution due to regional groundwater flow. In this study we used a physically based surface-subsurface processes model and numerical experiments to elucidate the patterns and controls of fractal temporal evolution in two U.S. Midwest basins. Groundwater flow was found to introduce large-scale spatial structure, thereby reducing the scaling exponents (τ), which has implications for the transferability of calibrated parameters to predict τ. However, the groundwater effects depend on complex interactions with other physical controls such as soil texture and land use. The fractal scaling exponents, while in general showing a seasonal mode that correlates with mean moisture content, display hysteresis after storm events that can be divided into three phases, consistent with literature findings: (a) wetting, (b) re-organizing, and (c) dry-down. Modeling experiments clearly show that the hysteresis is attributable to soil texture, whose "patchiness" is the primary contributing factor. We generalized phenomenological rules for the impacts of rainfall, soil texture, groundwater flow, and land use on τ evolution. Grid resolution has a mild influence on the results, and there is a strong correlation between predictions of τ from different resolutions. Overall, our results suggest that groundwater flow should be given more consideration in studies of the soil moisture statistical fractal, especially in regions with a shallow water table.
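The scaling-exponent estimate itself is a log-log regression of a spatial moment against aggregation scale. The sketch below uses a synthetic 1-D white-noise field (for which the block-average variance scales as 1/s), not the paper's model output:

```python
import numpy as np

def scaling_exponent(field, scales):
    """Slope of log(variance of block-averaged field) versus log(scale)."""
    variances = []
    for s in scales:
        n = (len(field) // s) * s             # trim to a multiple of s
        blocks = field[:n].reshape(-1, s).mean(axis=1)
        variances.append(blocks.var())
    slope, _ = np.polyfit(np.log(scales), np.log(variances), 1)
    return slope

rng = np.random.default_rng(1)
white = rng.normal(size=4096)
tau = scaling_exponent(white, [2, 4, 8, 16, 32])
# For uncorrelated noise the slope is near -1; spatially organized fields
# (e.g. groundwater-induced structure) flatten the decay toward 0.
```

This is the sense in which groundwater flow "reduces the scaling exponents": large-scale structure keeps variance from decaying under aggregation.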
Bayesian Networks for Social Modeling
Whitney, Paul D.; White, Amanda M.; Walsh, Stephen J.; Dalton, Angela C.; Brothers, Alan J.
2011-03-28
This paper describes a body of work developed over the past five years. The work addresses the use of Bayesian network (BN) models for representing and predicting social/organizational behaviors. The topics covered include model construction, validation, and use. These topics span the bulk of the lifetime of such a model, beginning with construction, moving to validation and other aspects of model 'critiquing', and finally demonstrating how the modeling approach might be used to inform policy analysis. To conclude, we discuss limitations of using BNs for this activity and suggest remedies to address those limitations. The primary benefits of using a well-developed computational, mathematical, and statistical modeling structure such as BNs are 1) there are significant computational, theoretical, and capability bases on which to build, and 2) the ability to empirically critique the model, and potentially evaluate competing models, for a social/behavioral phenomenon.
Song, Guo-Min; Tian, Xu; Zhang, Lei; Ou, Yang-Xiang; Yi, Li-Juan; Shuai, Ting; Zhou, Jian-Guo; Zeng, Zi; Yang, Hong-Ling
2015-07-01
Enteral immunonutrition (EIN) has been established as an important modality to prevent postoperative infectious and noninfectious complications, enhance host immunity, and eventually improve the prognosis of gastrointestinal (GI) cancer patients undergoing surgery. However, which support route is the optimum option remains unclear. To evaluate the effects of different EIN support regimes for patients who underwent selective surgery for resectable GI malignancy, a Bayesian network meta-analysis (NMA) of randomized controlled trials (RCTs) was conducted. PubMed, EMBASE, and the Cochrane Central Register of Controlled Trials (CENTRAL) were electronically searched through the end of December 2014. Moreover, we manually checked the reference lists of eligible trials and reviews and retrieved unpublished literature. RCTs that investigated the comparative effects of EIN versus standard enteral nutrition (EN), or of different EIN regimes, were included if clinical outcome information could be extracted. A total of 27 RCTs were incorporated into this study. Pair-wise meta-analyses suggested that preoperative (relative risk [RR], 0.58; 95% confidence interval [CI], 0.43-0.78), postoperative (RR, 0.63; 95% CI, 0.52-0.76), and perioperative EIN methods (RR, 0.46; 95% CI, 0.34-0.62) reduced the incidence of postoperative infectious complications compared with standard EN. Moreover, perioperative EIN (RR, 0.65; 95% CI, 0.44-0.95) reduced the incidence of postoperative noninfectious complications, and postoperative (mean difference [MD], -2.38; 95% CI, -3.4 to -1.31) and perioperative EIN (MD, -2.64; 95% CI, -3.28 to -1.99) also shortened the length of postoperative hospitalization compared with standard EN. The NMA found that EIN support effectively improved the clinical outcomes of patients who underwent selective surgery for GI cancer compared with standard EN. Our results suggest EIN support is a promising alternative for
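The pair-wise effect measures quoted above are standard relative risks with log-scale confidence intervals; the sketch below shows that computation on invented counts, not the trial data:

```python
from math import exp, log, sqrt

def relative_risk(events_trt, n_trt, events_ctl, n_ctl, z=1.96):
    """Relative risk with a normal-approximation CI on the log scale."""
    rr = (events_trt / n_trt) / (events_ctl / n_ctl)
    se = sqrt(1 / events_trt - 1 / n_trt + 1 / events_ctl - 1 / n_ctl)
    lo, hi = exp(log(rr) - z * se), exp(log(rr) + z * se)
    return rr, lo, hi

# Invented example: 30/200 infections with EIN vs 52/200 with standard EN.
rr, lo, hi = relative_risk(30, 200, 52, 200)
# An upper CI bound below 1 indicates a significant reduction at the 5% level.
```

A Bayesian NMA generalizes this by modeling all such contrasts jointly on the log-RR scale, which lets indirect comparisons (e.g. preoperative versus perioperative EIN) borrow strength from the common EN comparator.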
Bayesian Exploratory Factor Analysis
Conti, Gabriella; Frühwirth-Schnatter, Sylvia; Heckman, James J.; Piatek, Rémi
2014-01-01
This paper develops and applies a Bayesian approach to Exploratory Factor Analysis that improves on ad hoc classical approaches. Our framework relies on dedicated factor models and simultaneously determines the number of factors, the allocation of each measurement to a unique factor, and the corresponding factor loadings. Classical identification criteria are applied and integrated into our Bayesian procedure to generate models that are stable and clearly interpretable. A Monte Carlo study confirms the validity of the approach. The method is used to produce interpretable low dimensional aggregates from a high dimensional set of psychological measurements. PMID:25431517
A study of finite mixture model: Bayesian approach on financial time series data
NASA Astrophysics Data System (ADS)
Phoong, Seuk-Yen; Ismail, Mohd Tahir
2014-07-01
Recently, statisticians have emphasized fitting finite mixture models by using Bayesian methods. A finite mixture model is a mixture of distributions used to model a statistical distribution, while the Bayesian method is a statistical approach used to fit the mixture model. The Bayesian method is widely used because it has asymptotic properties that provide remarkable results, and it also shows a consistency characteristic, meaning that the parameter estimates are close to the predictive distributions. In the present paper, the number of components for the mixture model is studied by using the Bayesian Information Criterion; identifying the number of components is important because a wrong choice may lead to an invalid result. The Bayesian method is then utilized to fit the k-component mixture model in order to explore the relationship between rubber prices and stock market prices for Malaysia, Thailand, the Philippines, and Indonesia. The results showed that there is a negative relationship between rubber prices and stock market prices for all selected countries.
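A compact (frequentist) stand-in for the component-count step: fit 1-D Gaussian mixtures by EM and pick k with the Bayesian Information Criterion. The data are synthetic, not the rubber/stock series:

```python
import numpy as np

def gmm_loglik(x, k, iters=200, seed=0):
    """Fit a k-component 1-D Gaussian mixture by EM; return log-likelihood."""
    rng = np.random.default_rng(seed)
    mu = rng.choice(x, size=k, replace=False)
    sig = np.full(k, x.std())
    w = np.full(k, 1.0 / k)
    for _ in range(iters):
        dens = np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
        resp = w * dens
        resp /= resp.sum(axis=1, keepdims=True)        # E-step
        nk = resp.sum(axis=0)                           # M-step
        w = nk / len(x)
        mu = (resp * x[:, None]).sum(axis=0) / nk
        sig = np.maximum(np.sqrt((resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk), 1e-2)
    dens = np.exp(-0.5 * ((x[:, None] - mu) / sig) ** 2) / (sig * np.sqrt(2 * np.pi))
    return float(np.log((w * dens).sum(axis=1)).sum())

def bic(x, k):
    n_params = 3 * k - 1                    # weights + means + variances
    return n_params * np.log(len(x)) - 2 * gmm_loglik(x, k)

rng = np.random.default_rng(2)
x = np.concatenate([rng.normal(-3, 1, 400), rng.normal(3, 1, 400)])
best_k = min([1, 2, 3], key=lambda k: bic(x, k))
```

A fully Bayesian fit would place priors on the weights, means, and variances and sample the posterior, but the BIC-based model selection step plays the same role in both settings.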
Decision generation tools and Bayesian inference
NASA Astrophysics Data System (ADS)
Jannson, Tomasz; Wang, Wenjian; Forrester, Thomas; Kostrzewski, Andrew; Veeris, Christian; Nielsen, Thomas
2014-05-01
Digital Decision Generation (DDG) tools are important software sub-systems of Command and Control (C2) systems and technologies. In this paper, we present a special type of DDGs based on Bayesian Inference, related to adverse (hostile) networks, including such important applications as terrorism-related networks and organized crime ones.
Bayesian networks in neuroscience: a survey.
Bielza, Concha; Larrañaga, Pedro
2014-01-01
Bayesian networks are a type of probabilistic graphical model that lies at the intersection of statistics and machine learning. They have been shown to be powerful tools for encoding dependence relationships among the variables of a domain under uncertainty. Thanks to their generality, Bayesian networks can accommodate continuous and discrete variables, as well as temporal processes. In this paper we review Bayesian networks and how they can be learned automatically from data by means of structure learning algorithms. We also examine how a user can take advantage of these networks for reasoning by exact or approximate inference algorithms that propagate the given evidence through the graphical structure. Despite their applicability in many fields, they have been little used in neuroscience, where applications have focused on specific problems, like functional connectivity analysis from neuroimaging data. Here we survey key research in neuroscience where Bayesian networks have been used with different aims: to discover associations between variables, to perform probabilistic reasoning over the model, and to classify new observations with and without supervision. The networks are learned from data of any kind (morphological, electrophysiological, -omics and neuroimaging), thereby broadening the scope (molecular, cellular, structural, functional, cognitive and medical) of the brain aspects to be studied. PMID:25360109
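The exact inference by evidence propagation that the survey describes can be sketched on a toy two-parent network; the structure and all probabilities below are illustrative, not taken from the paper.

```python
from itertools import product

# Toy Bayesian network: Rain -> WetGrass <- Sprinkler
# (all probabilities are made up for illustration).
P_rain = {True: 0.2, False: 0.8}
P_sprinkler = {True: 0.1, False: 0.9}
P_wet = {  # P(WetGrass=True | Rain, Sprinkler)
    (True, True): 0.99, (True, False): 0.9,
    (False, True): 0.8, (False, False): 0.0,
}

def joint(r, s, w):
    """Full joint probability from the factored network."""
    pw = P_wet[(r, s)]
    return P_rain[r] * P_sprinkler[s] * (pw if w else 1.0 - pw)

# Exact inference by enumeration: P(Rain=True | WetGrass=True) is the joint
# summed over the hidden variable (Sprinkler), normalized over Rain.
num = sum(joint(True, s, True) for s in (True, False))
den = sum(joint(r, s, True) for r, s in product((True, False), repeat=2))
posterior = num / den
```

Observing wet grass raises the belief in rain from the prior 0.2 to roughly 0.74, which is the "propagate the given evidence" behavior described above in miniature.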
A Bayesian approach to reliability and confidence
NASA Technical Reports Server (NTRS)
Barnes, Ron
1989-01-01
The historical evolution of NASA's interest in quantitative measures of reliability assessment is outlined. The introduction of some quantitative methodologies into the Vehicle Reliability Branch of the Safety, Reliability and Quality Assurance (SR and QA) Division at Johnson Space Center (JSC) was noted along with the development of the Extended Orbiter Duration--Weakest Link study which will utilize quantitative tools for a Bayesian statistical analysis. Extending the earlier work of NASA sponsor, Richard Heydorn, researchers were able to produce a consistent Bayesian estimate for the reliability of a component and hence by a simple extension for a system of components in some cases where the rate of failure is not constant but varies over time. Mechanical systems in general have this property since the reliability usually decreases markedly as the parts degrade over time. While they have been able to reduce the Bayesian estimator to a simple closed form for a large class of such systems, the form for the most general case needs to be attacked by the computer. Once a table is generated for this form, researchers will have a numerical form for the general solution. With this, the corresponding probability statements about the reliability of a system can be made in the most general setting. Note that the utilization of uniform Bayesian priors represents a worst case scenario in the sense that as researchers incorporate more expert opinion into the model, they will be able to improve the strength of the probability calculations.
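The "uniform prior as worst case" remark can be illustrated with the standard Beta-Bernoulli reliability posterior: a uniform Beta(1, 1) prior plus s successes in n trials gives a Beta(1 + s, 1 + n - s) posterior. The counts and the informative prior below are hypothetical, not NASA's data.

```python
# With a Beta(a, b) prior on component reliability and s successes in
# n independent demonstrations, the posterior is Beta(a + s, b + n - s);
# a = b = 1 is the uniform ("worst case") prior mentioned above.
def posterior_mean_reliability(successes, trials, a=1.0, b=1.0):
    """Posterior mean of reliability under a Beta(a, b) prior."""
    return (a + successes) / (a + b + trials)

uniform = posterior_mean_reliability(48, 50)                 # uniform prior
informative = posterior_mean_reliability(48, 50, a=9, b=1)   # expert-opinion prior
```

Folding in an optimistic expert prior (a=9, b=1) raises the posterior mean above the uniform-prior value, matching the abstract's point that incorporating expert opinion strengthens the probability statements.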
Explanation mode for Bayesian automatic object recognition
NASA Astrophysics Data System (ADS)
Hazlett, Thomas L.; Cofer, Rufus H.; Brown, Harold K.
1992-09-01
One of the more useful techniques to emerge from AI is the provision of an explanation modality used by the researcher to understand and subsequently tune the reasoning of an expert system. Such a capability, missing in the arena of statistical object recognition, is not that difficult to provide. Long-standing results show that the paradigm of Bayesian object recognition is truly optimal in a minimum-probability-of-error sense. To a large degree, the Bayesian paradigm achieves optimality through adroit fusion of a wide range of lower-information data sources to give a higher quality decision, a very 'expert system'-like capability. When various sources of incoming data are represented by C++ classes, it becomes possible to automatically backtrack the Bayesian data fusion process, assigning relative weights to the more significant datums and their combinations. A C++ object-oriented engine is then able to synthesize an 'English'-like textual description of the Bayesian reasoning suitable for generalized presentation. Key concepts and examples are provided based on an actual object recognition problem.
Predicting coastal cliff erosion using a Bayesian probabilistic model
Hapke, C.; Plant, N.
2010-01-01
Regional coastal cliff retreat is difficult to model due to the episodic nature of failures and the along-shore variability of retreat events. There is a growing demand, however, for predictive models that can be used to forecast areas vulnerable to coastal erosion hazards. Increasingly, probabilistic models are being employed that require data sets of high temporal density to define the joint probability density function that relates forcing variables (e.g. wave conditions) and initial conditions (e.g. cliff geometry) to erosion events. In this study we use a multi-parameter Bayesian network to investigate correlations between key variables that control and influence variations in cliff retreat processes. The network uses Bayesian statistical methods to estimate event probabilities using existing observations. Within this framework, we forecast the spatial distribution of cliff retreat along two stretches of cliffed coast in Southern California. The input parameters are the height and slope of the cliff, a descriptor of material strength based on the dominant cliff-forming lithology, and the long-term cliff erosion rate that represents prior behavior. The model is forced using predicted wave impact hours. Results demonstrate that the Bayesian approach is well-suited to the forward modeling of coastal cliff retreat, with the correct outcomes forecast in 70-90% of the modeled transects. The model also performs well in identifying specific locations of high cliff erosion, thus providing a foundation for hazard mapping. This approach can be employed to predict cliff erosion at time-scales ranging from storm events to the impacts of sea-level rise at the century-scale. © 2010.
Childhood autism in India: A case-control study using tract-based spatial statistics analysis
Assis, Zarina Abdul; Bagepally, Bhavani Shankara; Saini, Jitender; Srinath, Shoba; Bharath, Rose Dawn; Naidu, Purushotham R.; Gupta, Arun Kumar
2015-01-01
Context: Autism is a serious behavioral disorder among young children that now occurs at epidemic rates in developing countries like India. We used tract-based spatial statistics (TBSS) of diffusion tensor imaging (DTI) measures to investigate the microstructure of the primary neurocircuitry involved in autistic spectrum disorders as compared to typically developing children. Objective: To evaluate the various white matter tracts in Indian autistic children as compared to controls using TBSS. Materials and Methods: A prospective, case-control, voxel-based, whole-brain DTI analysis using TBSS was performed. The study included 19 autistic children (mean age 8.7 years ± 3.84, 16 males and 3 females) and 34 controls (mean age 12.38 ± 3.76, all males). Fractional anisotropy (FA), mean diffusivity (MD), radial diffusivity (RD), and axial diffusivity (AD) values were used as outcome variables. Results: Compared to the control group, TBSS demonstrated multiple areas of markedly reduced FA involving multiple long white matter tracts, the entire corpus callosum, bilateral posterior thalami, and bilateral optic tracts (OTs). Notably, there were no voxels where FA was significantly increased in the autism group. Increased RD was also noted in these regions, suggesting an underlying myelination defect. The MD was elevated in many of the projection and association fibers, notably in the OTs. There were no significant changes in the AD in these regions, indicating no significant axonal injury. There was no significant correlation between the FA values and the Childhood Autism Rating Scale. Conclusion: This is a first-of-its-kind study evaluating DTI findings in autistic children in India. In our study, DTI has shown a significant fault in the underlying intricate brain wiring system in autism. The OT abnormality is a novel finding and needs further research. PMID:26600581
NASA Astrophysics Data System (ADS)
García-Díaz, J. Carlos
2009-11-01
Fault detection and diagnosis is an important problem in process engineering, as process equipment is subject to malfunctions during operation. Galvanized steel is a value-added product, furnishing effective performance by combining the corrosion resistance of zinc with the strength and formability of steel. Fault detection and diagnosis is particularly important in continuous hot-dip galvanizing, and the increasingly stringent quality requirements of the automotive industry have also demanded ongoing efforts in process control to make the process more robust. When faults occur, they change the relationships among the observed variables. This work compares different statistical regression models proposed in the literature for estimating the quality of galvanized steel coils on the basis of short time histories. Data for 26 batches were available. Five variables were selected for monitoring the process: the steel strip velocity, four bath temperatures and the bath level. The entire data set, consisting of 48 galvanized steel coils, was divided into two sets: the first, a training set of 25 conforming coils, and the second, a set of 23 nonconforming coils. Logistic regression is a modeling tool in which the dependent variable is categorical; in most applications, the dependent variable is binary. The results show that logistic generalized linear models provide good estimates of coil quality and can be useful for quality control in the manufacturing process.
NASA Astrophysics Data System (ADS)
Ma, Chao; An, Wei; Deng, Xinpu
2015-10-01
Ground Control Points (GCPs) are an important source of fundamental data in geometric correction for remote sensing imagery. The quantity, accuracy and distribution of GCPs are three factors that may affect the accuracy of geometric correction. It is generally required that the distribution of GCPs be uniform, so they can fully control the accuracy of mapping regions. In this paper, we establish an objective standard for evaluating the uniformity of the GCPs' distribution based on regional statistical information (RSI), and obtain an optimal distribution of GCPs. This sampling method is called RSIS for short in this work. The numbers of GCPs in different regions, obtained by equally partitioning the image into regions in different manners, are counted, forming a vector called the RSI vector in this work. The uniformity of the GCPs' distribution can be evaluated by a mathematical quantity of the RSI vector. An optimal distribution of GCPs is obtained by searching for the RSI vector with the minimum mathematical quantity. In this paper, simulated annealing is employed to search for the distribution of GCPs that minimizes the mathematical quantity of the RSI vector. Experiments are carried out to test the proposed method, with simple random sampling and universal kriging model-based sampling as the compared designs. The experiments indicate that this method is highly recommended as a new GCP sampling design method for geometric correction of remotely sensed imagery.
Matheny, Michael E; Morrow, David A; Ohno-Machado, Lucila; Cannon, Christopher P; Resnic, Frederic S
2007-01-01
We sought to validate an automated outcomes surveillance system (DELTA) using OPUS (TIMI-16), a multi-center randomized, controlled trial that was stopped early due to elevated mortality in one of the two intervention arms. Methodologies that were incorporated into the application (Statistical Process Control [SPC] and Bayesian Updating Statistics [BUS]) were compared with standard Data Safety Monitoring Board (DSMB) protocols. PMID:18694141
Spectral Bayesian Knowledge Tracing
ERIC Educational Resources Information Center
Falakmasir, Mohammad; Yudelson, Michael; Ritter, Steve; Koedinger, Ken
2015-01-01
Bayesian Knowledge Tracing (BKT) has been in wide use for modeling student skill acquisition in Intelligent Tutoring Systems (ITS). BKT tracks and updates a student's latent mastery of a skill as a probability distribution over a binary variable. BKT does so by accounting for observed student successes in applying the skill correctly, where success is…
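The BKT update the abstract refers to has a standard two-step form: condition the latent mastery probability on the observed response, then apply the learning transition. The sketch below uses that standard form with illustrative parameter values, not the values fitted in the paper.

```python
def bkt_update(p_mastery, correct, p_slip=0.1, p_guess=0.2, p_transit=0.15):
    """One BKT step: condition latent mastery on the observed response,
    then apply the learning-transition probability."""
    if correct:
        evidence = p_mastery * (1 - p_slip)
        conditioned = evidence / (evidence + (1 - p_mastery) * p_guess)
    else:
        evidence = p_mastery * p_slip
        conditioned = evidence / (evidence + (1 - p_mastery) * (1 - p_guess))
    return conditioned + (1 - conditioned) * p_transit

# Track one (hypothetical) student across four practice opportunities.
p = 0.3  # prior probability the skill is already mastered
for obs in [True, True, False, True]:
    p = bkt_update(p, obs)
```

Note how the single incorrect response lowers mastery without resetting it, because the slip probability keeps the evidence soft.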
ERIC Educational Resources Information Center
Gustafson, S. C.; Costello, C. S.; Like, E. C.; Pierce, S. J.; Shenoy, K. N.
2009-01-01
Bayesian estimation of a threshold time (hereafter simply threshold) for the receipt of impulse signals is accomplished given the following: 1) data, consisting of the number of impulses received in a time interval from zero to one and the time of the largest time impulse; 2) a model, consisting of a uniform probability density of impulse time…
Exploring the use of statistical process control methods to assess course changes
NASA Astrophysics Data System (ADS)
Vollstedt, Ann-Marie
This dissertation pertains to the field of Engineering Education. The Department of Mechanical Engineering at the University of Nevada, Reno (UNR) is hosting this dissertation under a special agreement. This study was motivated by the desire to find an improved, quantitative measure of student quality that is both convenient to use and easy to evaluate. While traditional statistical analysis tools such as ANOVA (analysis of variance) are useful, they are somewhat time consuming and are subject to error because they are based on grades, which are influenced by numerous variables independent of student ability and effort (e.g., inflation and curving). Additionally, grades are currently the only measure of quality in most engineering courses even though most faculty agree that grades do not accurately reflect student quality. Based on a literature search, quality was defined in this study as content knowledge, cognitive level, self-efficacy, and critical thinking. Nineteen treatments were applied to a pair of freshman classes in an effort to increase these qualities. The qualities were measured via quiz grades, essays, surveys, and online critical thinking tests. Results from the quality tests were adjusted and filtered prior to analysis. All test results were subjected to Chauvenet's criterion in order to detect and remove outlying data. In addition to removing outliers from data sets, individual course grades needed adjustment to accommodate the large portion of the grade that was defined by group work. A new method was developed to adjust grades within each group based on the residual of the individual grades within the group and the portion of the course grade defined by group work. It was found that the grade adjustment method agreed 78% of the time with the manual grade changes instructors made in 2009, and also increased the correlation between group grades and individual grades. Using these adjusted grades, Statistical Process Control
Badr, Lina Kurdahi
2009-01-01
By adopting more appropriate statistical methods to appraise data from a previously published randomized controlled trial (RCT), we evaluated the statistical and clinical significance of an intervention on the 18 month neurodevelopmental outcome of infants with suspected brain injury. The intervention group (n =32) received extensive, individualized cognitive/sensorimotor stimulation by public health nurses (PHNs) while the control group (n = 30) received standard follow-up care. At 18 months 43 infants remained in the study (22 = intervention, 21 = control). The results indicate that there was a significant statistical change within groups and a clinical significance whereby more infants in the intervention group improved in mental, motor and neurological functioning at 18 months compared to the control group. The benefits of looking at clinical significance from a meaningful aspect for practitioners are emphasized. PMID:19276403
A Bayesian Measurement Error Model for Misaligned Radiographic Data
Lennox, Kristin P.; Glascoe, Lee G.
2013-09-06
An understanding of the inherent variability in micro-computed tomography (micro-CT) data is essential to tasks such as statistical process control and the validation of radiographic simulation tools. The data present unique challenges to variability analysis due to the relatively low resolution of radiographs, and also due to minor variations from run to run which can result in misalignment or magnification changes between repeated measurements of a sample. Positioning changes artificially inflate the variability of the data in ways that mask true physical phenomena. We present a novel Bayesian nonparametric regression model that incorporates both additive and multiplicative measurement error in addition to heteroscedasticity to address this problem. We also use this model to assess the effects of sample thickness and sample position on measurement variability for an aluminum specimen. Supplementary materials for this article are available online.
Quantum-Like Representation of Non-Bayesian Inference
NASA Astrophysics Data System (ADS)
Asano, M.; Basieva, I.; Khrennikov, A.; Ohya, M.; Tanaka, Y.
2013-01-01
This research is related to the problem of "irrational decision making or inference" that has been discussed in cognitive psychology. There are experimental studies whose statistical data cannot be described by classical probability theory; the process of decision making generating these data cannot be reduced to classical Bayesian inference. For this problem, a number of quantum-like cognitive models of decision making have been proposed. Our previous work represented classical Bayesian inference in a natural way within the framework of quantum mechanics. Using this representation, in this paper we discuss non-Bayesian (irrational) inference that is biased by effects like quantum interference. Further, we describe the "psychological factor" disturbing "rationality" as an "environment" correlating with the "main system" of usual Bayesian inference.
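The contrast between classical and quantum-like inference can be sketched numerically: the quantum-like literature generically adds an interference term to the classical law of total probability, which vanishes when the phase is π/2. The probabilities below are illustrative, not taken from the experiments the abstract cites.

```python
import math

# Classical law of total probability for P(B) given a binary condition A.
p_a = 0.4
p_b_given_a, p_b_given_not_a = 0.7, 0.2
classical = p_a * p_b_given_a + (1 - p_a) * p_b_given_not_a

def quantum_like(theta):
    """Quantum-like perturbation of total probability: add the interference
    term 2*sqrt(P(A)P(B|A)P(~A)P(B|~A))*cos(theta).  theta = pi/2 recovers
    the classical value (no interference)."""
    interference = 2 * math.sqrt(
        p_a * p_b_given_a * (1 - p_a) * p_b_given_not_a
    ) * math.cos(theta)
    return classical + interference
```

A nonzero cos(theta) models the experimentally observed violations of total probability that classical Bayesian updating cannot reproduce.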
Bayesian multimodel inference for dose-response studies
Link, W.A.; Albers, P.H.
2007-01-01
Statistical inference in dose-response studies is model-based: the analyst posits a mathematical model of the relation between exposure and response, estimates parameters of the model, and reports conclusions conditional on the model. Such analyses rarely include any accounting for the uncertainties associated with model selection. The Bayesian inferential system provides a convenient framework for model selection and multimodel inference. In this paper we briefly describe the Bayesian paradigm and Bayesian multimodel inference. We then present a family of models for multinomial dose-response data and apply Bayesian multimodel inferential methods to the analysis of data on the reproductive success of American kestrels (Falco sparverius) exposed to various sublethal dietary concentrations of methylmercury.
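The multimodel step can be sketched as posterior-probability-weighted averaging: under equal prior model probabilities, posterior weights are proportional to marginal likelihoods, and predictions are averaged by those weights. The model names, log marginal likelihoods, and predictions below are hypothetical, not the kestrel models of the paper.

```python
import math

# Hypothetical log marginal likelihoods and predicted response probabilities
# for three candidate dose-response link functions.
log_ml = {"logistic": -120.3, "probit": -121.0, "cloglog": -124.8}
preds = {"logistic": 0.62, "probit": 0.60, "cloglog": 0.55}

# Posterior model probabilities under equal priors (log-sum-exp for stability).
m = max(log_ml.values())
weights = {k: math.exp(v - m) for k, v in log_ml.items()}
z = sum(weights.values())
post = {k: w / z for k, w in weights.items()}

# Model-averaged prediction: selection uncertainty is carried, not discarded.
averaged = sum(post[k] * preds[k] for k in preds)
```

The averaged prediction sits between the individual models, dominated by the best-supported one, which is exactly the accounting for model-selection uncertainty the abstract argues for.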
Advances in Bayesian Multiple QTL Mapping in Experimental Crosses
Yi, Nengjun; Shriner, Daniel
2016-01-01
Many complex human diseases and traits of biological and/or economic importance are determined by interacting networks of multiple quantitative trait loci (QTL) and environmental factors. Mapping QTL is critical for understanding the genetic basis of complex traits, and for ultimate identification of responsible genes. A variety of sophisticated statistical methods for QTL mapping have been developed. Among these developments, the evolution of Bayesian approaches for multiple QTL mapping over the past decade has been remarkable. Bayesian methods can jointly infer the number of QTL, their genomic positions, and their genetic effects. Here, we review recently developed and still developing Bayesian methods and associated computer software for mapping multiple QTL in experimental crosses. We compare and contrast these methods to clearly describe the relationships among different Bayesian methods. We conclude this review by highlighting some areas of future research. PMID:17987056
On Bayesian analysis of on-off measurements
NASA Astrophysics Data System (ADS)
Nosek, Dalibor; Nosková, Jana
2016-06-01
We propose an analytical solution to the on-off problem within the framework of Bayesian statistics. Both the statistical significance for the discovery of new phenomena and credible intervals on model parameters are presented in a consistent way. We use a sufficiently large family of prior distributions of the relevant parameters. The proposed analysis is designed to provide Bayesian solutions that can be used for any number of observed on-off events, including zero. The procedure is checked using Monte Carlo simulations. The usefulness of the method is demonstrated on examples from γ-ray astronomy.
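A Monte Carlo sketch of the on-off setting (not the paper's analytical solution): with flat gamma-type priors, the posterior of the off-region rate given N_off counts is Gamma(N_off + 1, 1) and the on-region total rate given N_on is Gamma(N_on + 1, 1), so the posterior probability of a positive signal can be estimated by sampling. The counts and exposure ratio are illustrative.

```python
import random

random.seed(1)

def prob_signal_positive(n_on, n_off, alpha, draws=20000):
    """Estimate P(signal rate > 0 | data) by drawing the on-region total
    rate and the exposure-scaled background rate from their posteriors.
    alpha is the on/off exposure ratio."""
    hits = 0
    for _ in range(draws):
        total = random.gammavariate(n_on + 1, 1.0)
        background = alpha * random.gammavariate(n_off + 1, 1.0)
        hits += total > background
    return hits / draws

# 30 on-region counts against 100 off-region counts with alpha = 0.1
# (expected background in the on region ~ 10 counts).
p = prob_signal_positive(n_on=30, n_off=100, alpha=0.1)
```

For these counts the posterior probability of a positive signal is very close to 1, consistent with an obvious excess; the method also behaves sensibly for zero observed events, which is the regime the paper emphasizes.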
Bayesian truthing and experimental validation in homeland security and defense
NASA Astrophysics Data System (ADS)
Jannson, Tomasz; Forrester, Thomas; Wang, Wenjian; Kostrzewski, Andrew; Pradhan, Ranjit
2014-05-01
In this paper we discuss relations between Bayesian Truthing (experimental validation), Bayesian statistics, and Binary Sensing in the context of selected Homeland Security and Intelligence, Surveillance, Reconnaissance (ISR) optical and nonoptical application scenarios. The basic Figure of Merit (FoM) is Positive Predictive Value (PPV), as well as false positives and false negatives. By using these simple binary statistics, we can analyze, classify, and evaluate a broad variety of events including: ISR; natural disasters; QC; and terrorism-related, GIS-related, law enforcement-related, and other C3I events.
Rah, Jeong-Eun; Oh, Do Hoon; Shin, Dongho; Kim, Tae Hyun; Kim, Gwe-Ya
2014-09-15
Purpose: To evaluate and improve the reliability of proton quality assurance (QA) processes and, to provide an optimal customized tolerance level using the statistical process control (SPC) methodology. Methods: The authors investigated the consistency check of dose per monitor unit (D/MU) and range in proton beams to see whether it was within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to improve the patient-specific QA process in proton beams by using process capability indices. Results: The authors established a customized tolerance level of ±2% for D/MU and ±0.5 mm for beam range in the daily proton QA process. In the authors’ analysis of the process capability indices, the patient-specific range measurements were capable of a specification limit of ±2% in clinical plans. Conclusions: SPC methodology is a useful tool for customizing the optimal QA tolerance levels and improving the quality of proton machine maintenance, treatment delivery, and ultimately patient safety.
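The process capability analysis used above to customize tolerance levels can be sketched with the standard Cpk index, which compares the process spread against the nearer specification limit. The daily D/MU readings below are invented for illustration, not the authors' measurements.

```python
import statistics

def cpk(samples, lsl, usl):
    """Process capability index: distance from the process mean to the
    nearer specification limit, in units of three standard deviations.
    Cpk >= 1.33 is a common threshold for a capable process."""
    mu = statistics.mean(samples)
    sigma = statistics.stdev(samples)
    return min(usl - mu, mu - lsl) / (3 * sigma)

# Hypothetical daily D/MU differences (%) against a +/-2% tolerance.
readings = [0.1, -0.3, 0.4, 0.0, 0.2, -0.1, 0.3, -0.2]
capability = cpk(readings, lsl=-2.0, usl=2.0)
```

A Cpk well above 1.33 indicates the ±2% tolerance is comfortably achievable; a low Cpk would instead argue for loosening the limit or improving the process.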
Practical guidelines for applying statistical process control to blood component production.
Beckman, N; Nightingale, M J; Pamphilon, D
2009-12-01
Legislation, guidelines and recommendations for blood components related to statistical process control (SPC) and the selection of a quality monitoring (QM) sampling regimen are subject to misinterpretation and lack practical guidance on implementation. The aim of this article is: to review and interpret applicable European legislation and guidelines and to develop an SPC strategy that meets these requirements; and to provide practical guidance on the selection and application of appropriate techniques and the interpretation of resultant blood component QM data. A methodology is presented which utilizes: an algorithm to select an appropriate quality-monitoring strategy for the blood component parameter under consideration; a range of straightforward, validated SPC techniques for variable data and an assessment of process capability (Cpk) and blood component parameter 'criticality' to determine the sampling regimen. The methodology was applied to routine National Health Service Blood and Transplant (NHSBT) blood component data for 2005-2006. Cpk values ranged from 0.22 to >3 and their predicted non-conformance rates were close to those observed (23 to <0.001%). Required sample size ranged from 0.01 to 10%. Chosen techniques identified significant deviation from 'as validated' performance within an appropriate time-scale. Thus the methodology was straightforward to apply and prompted the choice of a clinically and operationally appropriate sampling regimen and analysis for each blood component parameter. This evidence-based, targeted use of SPC for blood component monitoring provides an essential focus on processes with a low capability in achieving their specifications. PMID:19761545
Babamoradi, Hamid; van den Berg, Frans; Rinnan, Åsmund
2016-02-18
In Multivariate Statistical Process Control (MSPC), when a fault is expected or detected in the process, contribution plots are essential for operators and optimization engineers in identifying those process variables that were affected by or might be the cause of the fault. The traditional way of interpreting a contribution plot is to examine the largest contributing process variables as the most probable faulty ones. This might result in false readings purely due to differences in natural variation, measurement uncertainties, etc. It is more reasonable to compare variable contributions for new process runs with historical results achieved under Normal Operating Conditions, where confidence limits (CLs) for contribution plots estimated from training data are used to judge new production runs. Asymptotic methods cannot provide confidence limits for contribution plots, leaving re-sampling methods as the only option. We suggest bootstrap re-sampling to build confidence limits for all contribution plots in online PCA-based MSPC. The new strategy for estimating CLs is compared to previously reported CLs for contribution plots. An industrial batch process dataset is used to illustrate the concepts. PMID:26826689
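The bootstrap idea can be sketched with percentile confidence limits for a single variable's contribution statistic: resample the training contributions with replacement and take empirical percentiles as the limits against which new runs are judged. The training data and new-run value below are illustrative, not the paper's batch process.

```python
import random
import statistics

random.seed(0)

# Hypothetical contribution values for one process variable across
# training runs under Normal Operating Conditions.
training = [0.8, 1.1, 0.9, 1.3, 0.7, 1.0, 1.2, 0.95, 1.05, 0.85]

def bootstrap_limits(data, n_boot=5000, lower=2.5, upper=97.5):
    """Percentile-bootstrap confidence limits for the mean contribution."""
    means = sorted(
        statistics.mean(random.choices(data, k=len(data)))
        for _ in range(n_boot)
    )
    return means[int(n_boot * lower / 100)], means[int(n_boot * upper / 100)]

lo, hi = bootstrap_limits(training)
new_run_contribution = 1.6  # judge a new production run against the limits
out_of_control = not (lo <= new_run_contribution <= hi)
```

Judging new runs against data-driven limits, rather than eyeballing the largest bar, is what protects against the false readings from natural variation that the abstract warns about.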
Takahashi, Fumitake; Kida, Akiko; Shimaoka, Takayuki
2010-10-15
Although representative removal efficiencies of gaseous mercury for air pollution control devices (APCDs) are important for preparing more reliable atmospheric emission inventories of mercury, they are still uncertain because they depend sensitively on many factors, such as the type of APCD, gas temperature, and mercury speciation. In this study, representative removal efficiencies of gaseous mercury for several types of APCDs used in municipal solid waste incineration (MSWI) were derived using a statistical method. 534 data points on mercury removal efficiencies for APCDs used in MSWI were collected. APCDs were categorized as fixed-bed absorber (FA), wet scrubber (WS), electrostatic precipitator (ESP), and fabric filter (FF), and their hybrid systems. The data series of all APCD types had Gaussian log-normality. The average removal efficiency with a 95% confidence interval for each APCD was estimated. The FA, WS, and FF with carbon and/or dry sorbent injection systems had average removal efficiencies of 75% to 82%. On the other hand, the ESP with or without dry sorbent injection had lower removal efficiencies of up to 22%. The type of dry sorbent injection in the FF system, dry or semi-dry, did not make more than a 1% difference to the removal efficiency. The injection of activated carbon and carbon-containing fly ash in the FF system made less than a 3% difference. Estimation errors of removal efficiency were especially high for the ESP. The national average removal efficiency of APCDs in Japanese MSWI plants was estimated on the basis of incineration capacity. Owing to the replacement of old APCDs for dioxin control, the national average removal efficiency increased from 34.5% in 1991 to 92.5% in 2003. This resulted in an additional emission reduction of about 0.86 Mg in 2003. Further study applying this methodology to other important emission sources, such as coal-fired power plants, will contribute to better emission inventories. PMID:20713298
Structural damage detection using extended Kalman filter combined with statistical process control
NASA Astrophysics Data System (ADS)
Jin, Chenhao; Jang, Shinae; Sun, Xiaorong
2015-04-01
Traditional modal-based methods, which identify damage based upon changes in the vibration characteristics of the structure on a global basis, have received considerable attention in the past decades. However, the effectiveness of modal-based methods depends on the type of damage and the accuracy of the structural model, and these methods may also have difficulties when applied to complex structures. The extended Kalman filter (EKF) algorithm, which has the capability to estimate parameters and catch abrupt changes, is currently used in continuous and automatic structural damage detection to overcome the disadvantages of traditional methods. Structural parameters are typically slow-changing variables under the effects of operational and environmental conditions, so it would be difficult to observe structural damage and quantify the damage in real time with the EKF alone. In this paper, Statistical Process Control (SPC) is combined with the EKF method in order to overcome this difficulty. Based on historical measurements of the damage-sensitive features involved in the state-space dynamic models, the EKF algorithm is used to produce real-time estimates of these features as well as their standard deviations, which can then be used to form control ranges for SPC to detect any abnormality in the selected features. Moreover, confidence levels of the detection can be adjusted by choosing different multiples of sigma and different numbers of adjacent out-of-range points. The proposed method is tested using simulated data of a three-story linear building in different damage scenarios, and numerical results demonstrate the high damage detection accuracy and light computational load of the presented method.
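The SPC stage described above can be sketched with mean ± 3-sigma control limits on a damage-sensitive feature and an alarm rule requiring several consecutive out-of-range points. All values below are simulated for illustration; they are not the paper's building model.

```python
import statistics

def control_limits(history, n_sigma=3.0):
    """Control range from baseline (undamaged) feature estimates."""
    mu = statistics.mean(history)
    sd = statistics.stdev(history)
    return mu - n_sigma * sd, mu + n_sigma * sd

def detect(stream, lo, hi, consecutive=3):
    """Flag damage when `consecutive` adjacent points fall out of range;
    returns the index where the alarm is raised, or None."""
    run = 0
    for i, x in enumerate(stream):
        run = run + 1 if not (lo <= x <= hi) else 0
        if run >= consecutive:
            return i
    return None

# Hypothetical baseline stiffness estimates (e.g., from an EKF), then a
# monitoring stream where stiffness drops, simulating damage.
baseline = [100.2, 99.8, 100.1, 99.9, 100.0, 100.3, 99.7, 100.1]
lo, hi = control_limits(baseline)
alarm = detect([100.1, 99.9, 97.0, 96.8, 96.5, 96.4], lo, hi)
```

Raising `n_sigma` or `consecutive` trades detection speed for fewer false alarms, which is the confidence-level adjustment the abstract describes.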
Hatjimihail, Aristides T.
2009-01-01
Background: An open problem in clinical chemistry is the estimation of the optimal sampling time intervals for the application of statistical quality control (QC) procedures that are based on the measurement of control materials. This is a probabilistic risk assessment problem that requires reliability analysis of the analytical system and estimation of the risk caused by the measurement error. Methodology/Principal Findings: Assuming that the states of the analytical system are the reliability state, the maintenance state, the critical-failure modes, and their combinations, we can define risk functions based on the mean time of the states, their measurement error, and the medically acceptable measurement error. Consequently, a residual risk measure rr can be defined for each sampling time interval. The rr depends on the state probability vectors of the analytical system, the state transition probability matrices before and after each application of the QC procedure, and the state mean time matrices. The optimal sampling time intervals can be defined as those that minimize a QC-related cost measure while keeping the rr acceptable. I developed an algorithm that estimates the rr for any QC sampling time interval of a QC procedure applied to analytical systems with an arbitrary number of critical-failure modes, assuming any failure time and measurement error probability density function for each mode. Furthermore, given the acceptable rr, it can estimate the optimal QC sampling time intervals. Conclusions/Significance: It is possible to rationally estimate the optimal QC sampling time intervals of an analytical system to sustain an acceptable residual risk with the minimum QC-related cost. The optimization requires reliability analysis of the analytical system and risk analysis of the measurement error. PMID:19513124
An improvement in IMRT QA results and beam matching in linacs using statistical process control.
Gagneur, Justin D; Ezzell, Gary A
2014-01-01
The purpose of this study is to apply the principles of statistical process control (SPC) in the context of patient-specific intensity-modulated radiation therapy (IMRT) QA to set clinic-specific action limits and to evaluate the impact of changes to the multileaf collimator (MLC) calibrations on IMRT QA results. Ten months of IMRT QA data with 247 patient QAs collected on three beam-matched linacs were retrospectively analyzed with a focus on the gamma pass rate (GPR) and the average ratio between the measured and planned doses. Initial control charts and action limits were calculated. Based on these data, changes were made to the leaf gap parameter for the MLCs to improve the consistency between linacs. This leaf gap parameter is tested monthly using an MLC sweep test. A follow-up dataset with 424 unique QAs was used to evaluate the impact of the leaf gap parameter change. The average GPR in the initial data was 98.6% with an SPC action limit of 93.7%. The average ratio of doses was 1.003, with an upper action limit of 1.017 and a lower action limit of 0.989. The sweep test results for the linacs were -1.8%, 0%, and +1.2% from nominal. After the adjustment of the leaf gap parameter, all sweep test results were within 0.4% of nominal. Subsequently, the average GPR was 99.4% with an SPC action limit of 97.3%. The average ratio of doses was 0.997 with an upper action limit of 1.011 and a lower action limit of 0.981. Applying the principles of SPC to IMRT QA allowed small differences between closely matched linacs to be identified and reduced. Ongoing analysis will monitor the process and be used to refine the clinical action limits for IMRT QA. PMID:25207579
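Clinic-specific action limits of the kind described can be sketched with the standard individuals-chart convention (mean ± 3σ, with σ estimated from the average moving range divided by 1.128). The GPR values below are invented, and the paper's exact SPC formulation may differ:

```python
def action_limits(values):
    """Individuals-chart center line and 3-sigma limits (moving-range sigma)."""
    n = len(values)
    mean = sum(values) / n
    mr = sum(abs(values[i] - values[i - 1]) for i in range(1, n)) / (n - 1)
    sigma = mr / 1.128                 # d2 constant for moving ranges of size 2
    return mean, mean - 3 * sigma, mean + 3 * sigma

gpr = [98.6, 99.1, 97.8, 99.5, 98.2, 99.0, 98.8, 97.9, 99.3, 98.5]  # invented
mean, lal, ual = action_limits(gpr)
print(round(mean, 2), round(lal, 2))   # for a pass rate only the lower limit matters
```

Using the moving range rather than the overall standard deviation makes the limits reflect short-term process noise, so a slow drift between linacs shows up as out-of-limit points rather than inflating σ.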
Bayesian parameter estimation for effective field theories
NASA Astrophysics Data System (ADS)
Wesolowski, S.; Klco, N.; Furnstahl, R. J.; Phillips, D. R.; Thapaliya, A.
2016-07-01
We present procedures based on Bayesian statistics for estimating, from data, the parameters of effective field theories (EFTs). The extraction of low-energy constants (LECs) is guided by theoretical expectations in a quantifiable way through the specification of Bayesian priors. A prior for natural-sized LECs reduces the possibility of overfitting, and leads to a consistent accounting of different sources of uncertainty. A set of diagnostic tools is developed that analyzes the fit and ensures that the priors do not bias the EFT parameter estimation. The procedures are illustrated using representative model problems, including the extraction of LECs for the nucleon-mass expansion in SU(2) chiral perturbation theory from synthetic lattice data.
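The role of a naturalness prior can be illustrated with a toy polynomial "EFT": a Gaussian prior a_i ~ N(0, ā²) on the coefficients turns the chi-squared fit into ridge regression, whose MAP values solve (XᵀX/σ² + I/ā²) a = Xᵀy/σ². This is a generic sketch, not the authors' code; the model, data, and constants are invented:

```python
def map_lecs(xs, ys, order, sigma=0.1, abar=1.0):
    """MAP coefficients for y = sum_i a_i x^i with prior a_i ~ N(0, abar^2)."""
    n = order + 1
    X = [[x ** i for i in range(n)] for x in xs]
    # Ridge-regression normal equations: (X^T X / s^2 + I / abar^2) a = X^T y / s^2
    A = [[sum(row[i] * row[j] for row in X) / sigma ** 2
          + (1.0 / abar ** 2 if i == j else 0.0) for j in range(n)] for i in range(n)]
    b = [sum(X[k][i] * ys[k] for k in range(len(xs))) / sigma ** 2 for i in range(n)]
    for i in range(n):                       # Gaussian elimination
        for j in range(i + 1, n):
            f = A[j][i] / A[i][i]
            A[j] = [aj - f * ai for aj, ai in zip(A[j], A[i])]
            b[j] -= f * b[i]
    a = [0.0] * n
    for i in reversed(range(n)):             # back substitution
        a[i] = (b[i] - sum(A[i][j] * a[j] for j in range(i + 1, n))) / A[i][i]
    return a

xs = [0.1 * k for k in range(10)]
ys = [0.5 + 1.2 * x - 0.8 * x * x for x in xs]   # noiseless toy "data"
coeffs = map_lecs(xs, ys, order=2, sigma=0.05)
print([round(c, 2) for c in coeffs])
```

The prior term I/ā² is what suppresses unnaturally large coefficients along directions the data constrain poorly, which is the overfitting protection the abstract refers to.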
Advanced Bayesian Method for Planetary Surface Navigation
NASA Technical Reports Server (NTRS)
Center, Julian
2015-01-01
Autonomous Exploration, Inc., has developed an advanced Bayesian statistical inference method that leverages current computing technology to produce a highly accurate surface navigation system. The method combines dense stereo vision and high-speed optical flow to implement visual odometry (VO) to track faster rover movements. The Bayesian VO technique improves performance by using all image information rather than corner features only. The method determines what can be learned from each image pixel and weighs the information accordingly. This capability improves performance in shadowed areas that yield only low-contrast images. The error characteristics of the visual processing are complementary to those of a low-cost inertial measurement unit (IMU), so the combination of the two capabilities provides highly accurate navigation. The method increases NASA mission productivity by enabling faster rover speed and accuracy. On Earth, the technology will permit operation of robots and autonomous vehicles in areas where the Global Positioning System (GPS) is degraded or unavailable.
Quantum-like Representation of Bayesian Updating
NASA Astrophysics Data System (ADS)
Asano, Masanari; Ohya, Masanori; Tanaka, Yoshiharu; Khrennikov, Andrei; Basieva, Irina
2011-03-01
Recently, applications of quantum mechanics to cognitive psychology have been discussed; see [1]-[11]. It is known that statistical data obtained in some experiments of cognitive psychology cannot be described by the classical (Kolmogorov) probability model [12]-[15]. Quantum probability is one of the most advanced mathematical models for non-classical probability. In [11], we proposed a quantum-like model describing the decision-making process in a two-player game, using the generalized quantum formalism based on lifting of density operators [16]. In this paper, we discuss the quantum-like representation of Bayesian inference, which has been used to calculate probabilities for decision making under uncertainty. The uncertainty is described in the form of a quantum superposition, and Bayesian updating is explained as a state reduction by quantum measurement.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-02-08
...The Food and Drug Administration (FDA) is announcing the availability of the guidance entitled ``Guidance for the Use of Bayesian Statistics in Medical Device Clinical Trials.'' This guidance summarizes FDA's current thoughts on the appropriate use of Bayesian statistical methods in the design and analysis of medical device clinical...
NASA Astrophysics Data System (ADS)
Goodman, Joseph W.
2000-07-01
The Wiley Classics Library consists of selected books that have become recognized classics in their respective fields. With these new unabridged and inexpensive editions, Wiley hopes to extend the life of these important works by making them available to future generations of mathematicians and scientists. Currently available in the Series: T. W. Anderson The Statistical Analysis of Time Series T. S. Arthanari & Yadolah Dodge Mathematical Programming in Statistics Emil Artin Geometric Algebra Norman T. J. Bailey The Elements of Stochastic Processes with Applications to the Natural Sciences Robert G. Bartle The Elements of Integration and Lebesgue Measure George E. P. Box & Norman R. Draper Evolutionary Operation: A Statistical Method for Process Improvement George E. P. Box & George C. Tiao Bayesian Inference in Statistical Analysis R. W. Carter Finite Groups of Lie Type: Conjugacy Classes and Complex Characters R. W. Carter Simple Groups of Lie Type William G. Cochran & Gertrude M. Cox Experimental Designs, Second Edition Richard Courant Differential and Integral Calculus, Volume I Richard Courant Differential and Integral Calculus, Volume II Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume I Richard Courant & D. Hilbert Methods of Mathematical Physics, Volume II D. R. Cox Planning of Experiments Harold S. M. Coxeter Introduction to Geometry, Second Edition Charles W. Curtis & Irving Reiner Representation Theory of Finite Groups and Associative Algebras Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume I Charles W. Curtis & Irving Reiner Methods of Representation Theory with Applications to Finite Groups and Orders, Volume II Cuthbert Daniel Fitting Equations to Data: Computer Analysis of Multifactor Data, Second Edition Bruno de Finetti Theory of Probability, Volume I Bruno de Finetti Theory of Probability, Volume 2 W. Edwards Deming Sample Design in Business Research
Emmert-Streib, Frank; de Matos Simoes, Ricardo; Tripathi, Shailesh; Glazko, Galina V; Dehmer, Matthias
2012-01-01
In this paper, we present a Bayesian approach to estimating a chromosome network and a disorder network from the Online Mendelian Inheritance in Man (OMIM) database. In contrast to other approaches, we obtain statistical rather than deterministic networks, enabling parametric control of the uncertainty in the underlying disorder-disease gene associations contained in OMIM, on which the networks are based. From a structural investigation of the chromosome network, we identify three chromosome subgroups that reflect architectural differences in chromosome-disorder associations that can be predictively exploited for a functional analysis of diseases. PMID:22822426
A critique of statistical hypothesis testing in clinical research
Raha, Somik
2011-01-01
Many have documented the difficulty of using the current paradigm of Randomized Controlled Trials (RCTs) to test and validate the effectiveness of alternative medical systems such as Ayurveda. This paper critiques the applicability of RCTs to all clinical knowledge-seeking endeavors, of which Ayurveda research is a part. This is done by examining statistical hypothesis testing, the underlying foundation of RCTs, from practical and philosophical perspectives. In the philosophical critique, the two main worldviews of probability are the Bayesian and the frequentist. The frequentist worldview is a special case of the Bayesian worldview, requiring the unrealistic assumptions of knowing nothing about the universe and believing that all observations are unrelated to each other. Many have claimed that the first belief is necessary for science, and this claim is debunked by comparing variations in learning under different prior beliefs. Moving beyond the Bayesian and frequentist worldviews, the notion of hypothesis testing itself is challenged on the grounds that a hypothesis is an unclear distinction, and assigning a probability to an unclear distinction is an exercise that does not lead to clarity of action. This critique is of the theory itself and not of any particular application of statistical hypothesis testing. A decision-making frame is proposed as a way of both addressing this critique and transcending ideological debates on probability. An example of a Bayesian decision-making approach is shown as an alternative to statistical hypothesis testing, utilizing data from a past clinical trial that studied the effect of aspirin on heart attacks in a sample population of doctors. Because a major reason for the prevalence of RCTs in academia is legislation requiring them, the ethics of legislating the use of statistical methods for clinical research is also examined. PMID:22022152
ERIC Educational Resources Information Center
Fienup, Daniel M.; Critchfield, Thomas S.
2010-01-01
Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction…
Statistical examination of laser therapy effects in controlled double-blind clinical trial
NASA Astrophysics Data System (ADS)
Boerner, Ewa; Podbielska, Halina
2001-10-01
For the evaluation of therapy effects, a double-blind clinical trial followed by statistical analysis was performed. The statistical calculations showed that laser therapy with IR radiation has a significant influence on the decrease of the pain level in the examined group of patients suffering from various locomotor diseases: the pain level of patients undergoing laser therapy was statistically lower than that of patients undergoing placebo therapy. The same tests were performed to evaluate range of movement. Although placebo therapy also contributed to an increase in the range of movement, a statistically significant influence was found only in the group treated with laser.
Probability, statistics, and computational science.
Beerenwinkel, Niko; Siebourg, Juliane
2012-01-01
In this chapter, we review basic concepts from probability theory and computational statistics that are fundamental to evolutionary genomics. We provide a very basic introduction to statistical modeling and discuss general principles, including maximum likelihood and Bayesian inference. Markov chains, hidden Markov models, and Bayesian network models are introduced in more detail as they occur frequently and in many variations in genomics applications. In particular, we discuss efficient inference algorithms and methods for learning these models from partially observed data. Several simple examples are given throughout the text, some of which point to models that are discussed in more detail in subsequent chapters. PMID:22407706
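A minimal worked example of the Bayesian inference this chapter introduces is conjugate beta-binomial updating; the numbers are illustrative:

```python
def posterior_mean(a, b, heads, tails):
    """Beta(a, b) prior updated by binomial data -> Beta(a+heads, b+tails)."""
    return (a + heads) / (a + heads + b + tails)

# Uniform prior Beta(1, 1); observe 7 heads in 10 flips.
print(posterior_mean(1, 1, 7, 3))   # 8/12 = 0.666...
```

The posterior mean of 2/3 sits between the maximum-likelihood estimate (0.7) and the prior mean (0.5), which is the shrinkage behavior that distinguishes the Bayesian estimate from the pure likelihood-based one.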
Efficient Bayesian Phase Estimation
NASA Astrophysics Data System (ADS)
Wiebe, Nathan; Granade, Chris
2016-07-01
We introduce a new method called rejection filtering that we use to perform adaptive Bayesian phase estimation. Our approach has several advantages: it is classically efficient, easy to implement, achieves Heisenberg limited scaling, resists depolarizing noise, tracks time-dependent eigenstates, recovers from failures, and can be run on a field programmable gate array. It also outperforms existing iterative phase estimation algorithms such as Kitaev's method.
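A rough sketch of the rejection-filtering idea: draw samples from the current Gaussian posterior over the phase, accept each with probability equal to the measurement likelihood, and refit a Gaussian to the survivors. The likelihood model, the experiment-design heuristic, and all constants below are illustrative simplifications, not the authors' algorithm:

```python
import math
import random

def update(mu, sigma, M, theta, outcome, n=2000, rng=random):
    """One rejection-filtering update of a Gaussian posterior over phi."""
    acc = []
    while len(acc) < 50:                       # ensure enough accepted samples
        for _ in range(n):
            phi = rng.gauss(mu, sigma)
            p0 = (1 + math.cos(M * phi + theta)) / 2   # Pr(outcome 0 | phi)
            p = p0 if outcome == 0 else 1 - p0
            if rng.random() < p:
                acc.append(phi)
    m = sum(acc) / len(acc)
    s = math.sqrt(sum((a - m) ** 2 for a in acc) / len(acc))
    return m, max(s, 1e-6)

random.seed(2)
true_phi, mu, sigma = 0.7, 0.0, 1.0
for _ in range(50):
    M = 1.25 / sigma                           # sharper posterior -> longer evolution
    theta = -M * random.gauss(mu, sigma)       # inversion from a sampled guess
    p0 = (1 + math.cos(M * true_phi + theta)) / 2
    outcome = 0 if random.random() < p0 else 1
    mu, sigma = update(mu, sigma, M, theta, outcome)
print(round(mu, 2), round(sigma, 4))
```

Scaling M with 1/σ is the crude analogue of adaptive experiment design: each round interrogates the phase at roughly the current uncertainty scale, which is what drives the rapid shrinkage of σ.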
Model parameter updating using Bayesian networks
Treml, C. A.; Ross, Timothy J.
2004-01-01
This paper outlines a model parameter updating technique for a new method of model validation using a modified model reference adaptive control (MRAC) framework with Bayesian Networks (BNs). The model parameter updating within this method is generic in the sense that the model/simulation to be validated is treated as a black box. It must have updateable parameters to which its outputs are sensitive, and those outputs must have metrics that can be compared to that of the model reference, i.e., experimental data. Furthermore, no assumptions are made about the statistics of the model parameter uncertainty, only upper and lower bounds need to be specified. This method is designed for situations where a model is not intended to predict a complete point-by-point time domain description of the item/system behavior; rather, there are specific points, features, or events of interest that need to be predicted. These specific points are compared to the model reference derived from actual experimental data. The logic for updating the model parameters to match the model reference is formed via a BN. The nodes of this BN consist of updateable model input parameters and the specific output values or features of interest. Each time the model is executed, the input/output pairs are used to adapt the conditional probabilities of the BN. Each iteration further refines the inferred model parameters to produce the desired model output. After parameter updating is complete and model inputs are inferred, reliabilities for the model output are supplied. Finally, this method is applied to a simulation of a resonance control cooling system for a prototype coupled cavity linac. The results are compared to experimental data.
NASA Astrophysics Data System (ADS)
Wiegerinck, Wim; Schoenaker, Christiaan; Duane, Gregory
2016-04-01
Recently, methods for model fusion by dynamically combining model components in an interactive ensemble have been proposed. In these proposals, fusion parameters have to be learned from data, so one can view these systems as parametrized dynamical systems. We address the question of the learnability of dynamical systems with respect to both short-term (vector field) and long-term (attractor) behavior. In particular, we are interested in learning in the imperfect-model-class setting, in which the ground truth has a higher complexity than the models, e.g. due to unresolved scales. We take a Bayesian point of view and define a joint log-likelihood that consists of two terms: one is the vector field error, and the other is the attractor error, for which we take the L1 distance between the stationary distributions of the model and the assumed ground truth. In the context of linear models (such as so-called weighted supermodels), and assuming a Gaussian error model in the vector fields, vector field learning leads to a tractable Gaussian solution. This solution can then be used as a prior for the next step, Bayesian attractor learning, in which the attractor error is used as a log-likelihood term. Bayesian attractor learning is implemented by elliptical slice sampling, a sampling method for systems with a Gaussian prior and a non-Gaussian likelihood. Simulations with a partially observed driven Lorenz 63 system illustrate the approach.
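Elliptical slice sampling itself is compact enough to sketch. The version below follows the standard algorithm (a Gaussian prior and an arbitrary log-likelihood), with a deliberately simple one-dimensional test likelihood chosen so the posterior is known in closed form; it is a generic sketch, not the authors' implementation:

```python
import math
import random

def ess_step(f, log_lik, rng=random):
    """One elliptical slice sampling step for prior f ~ N(0, 1)."""
    nu = rng.gauss(0.0, 1.0)                   # auxiliary draw from the prior
    log_y = log_lik(f) + math.log(1.0 - rng.random())   # slice height
    theta = rng.uniform(0.0, 2 * math.pi)
    lo, hi = theta - 2 * math.pi, theta
    while True:
        fp = f * math.cos(theta) + nu * math.sin(theta)
        if log_lik(fp) > log_y:
            return fp                          # accept the proposal
        if theta < 0.0:                        # otherwise shrink the bracket
            lo = theta
        else:
            hi = theta
        theta = rng.uniform(lo, hi)

random.seed(3)
log_lik = lambda f: -0.5 * (f - 2.0) ** 2      # N(2, 1) likelihood for testing
f, samples = 0.0, []
for i in range(6000):
    f = ess_step(f, log_lik)
    if i >= 1000:
        samples.append(f)                      # posterior should be N(1, 0.5)
print(round(sum(samples) / len(samples), 1))
```

The bracket-shrinking loop always terminates (θ → 0 recovers the current state, which lies on the slice), and the method has no tuning parameters, which is why it suits the Gaussian-prior attractor-learning step described above.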
Bayesian ARTMAP for regression.
Sasu, L M; Andonie, R
2013-10-01
Bayesian ARTMAP (BA) is a recently introduced neural architecture which uses a combination of Fuzzy ARTMAP competitive learning and Bayesian learning. Training is generally performed online, in a single-epoch. During training, BA creates input data clusters as Gaussian categories, and also infers the conditional probabilities between input patterns and categories, and between categories and classes. During prediction, BA uses Bayesian posterior probability estimation. So far, BA was used only for classification. The goal of this paper is to analyze the efficiency of BA for regression problems. Our contributions are: (i) we generalize the BA algorithm using the clustering functionality of both ART modules, and name it BA for Regression (BAR); (ii) we prove that BAR is a universal approximator with the best approximation property. In other words, BAR approximates arbitrarily well any continuous function (universal approximation) and, for every given continuous function, there is one in the set of BAR approximators situated at minimum distance (best approximation); (iii) we experimentally compare the online trained BAR with several neural models, on the following standard regression benchmarks: CPU Computer Hardware, Boston Housing, Wisconsin Breast Cancer, and Communities and Crime. Our results show that BAR is an appropriate tool for regression tasks, both for theoretical and practical reasons. PMID:23665468
Hu, T.A.; Lo, J.C.
1994-11-01
A Quality Assurance independent assessment has brought about continued improvement in the PUREX Plant surveillance program at the Department of Energy's Hanford Site. After the independent assessment, Quality Assurance personnel were closely involved in improving the surveillance program, specifically regarding storage tank monitoring. The independent assessment activities included reviewing procedures, analyzing surveillance data, conducting personnel interviews, and communicating with management. Process improvement efforts included: (1) designing data collection methods; (2) gaining concurrence between engineering and management; (3) revising procedures; and (4) interfacing with shift surveillance crews. Through this process, Statistical Process Control (SPC) was successfully implemented and surveillance management was improved. The independent assessment identified several deficiencies within the surveillance system. These deficiencies can be grouped into two areas: (1) data recording and analysis and (2) handling off-normal conditions. By using several independent assessment techniques, Quality Assurance was able to point out program weaknesses to senior management and present suggestions for improvements. SPC charting, as implemented by Quality Assurance, is an excellent tool for diagnosing the process, improving communication between the team members, and providing a scientific database for management decisions. In addition, the surveillance procedure was substantially revised. The goals of this revision were to (1) strengthen the role of surveillance management, engineering, and operators and (2) emphasize the importance of teamwork for each individual who performs a task. In this instance we believe that the value independent assessment adds to the system is the continuous improvement activities that follow the independent assessment. Excellence in teamwork between the independent assessment organization and the auditee is the key to continuing improvement.
NASA Astrophysics Data System (ADS)
Flach, Milan; Mahecha, Miguel; Gans, Fabian; Rodner, Erik; Bodesheim, Paul; Guanche-Garcia, Yanira; Brenning, Alexander; Denzler, Joachim; Reichstein, Markus
2016-04-01
The number of available Earth observations (EOs) is currently increasing substantially. Detecting anomalous patterns in these multivariate time series is an important step in identifying changes in the underlying dynamical system. Likewise, data quality issues may result in anomalous multivariate data constellations, which have to be identified before they corrupt subsequent analyses. In industrial applications, a common strategy is to monitor production chains with several sensors coupled to a statistical process control (SPC) algorithm. The basic idea is to raise an alarm when the sensor data depict an anomalous pattern according to the SPC, i.e. the production chain is considered 'out of control'. These industrial applications are conceptually similar to the on-line monitoring of EOs. However, algorithms used in the context of SPC or process monitoring are rarely considered for supervising multivariate spatio-temporal Earth observations. The objective of this study is to exploit the potential and transferability of SPC concepts to Earth system applications. We compare a range of algorithms typically applied in SPC systems and evaluate their capability to detect, for example, known extreme events in land surface processes. Specifically, two main issues are addressed: (1) identifying the most suitable combination of data pre-processing and detection algorithm for a specific type of event, and (2) analyzing the limits of the individual approaches with respect to the magnitude and spatio-temporal size of the event as well as the data's signal-to-noise ratio. Extensive artificial data sets that represent the typical properties of Earth observations are used in this study. Our results show that the majority of the algorithms used can be considered for the detection of multivariate spatio-temporal events and can be directly transferred to real Earth observation data as currently assembled in different projects at the European scale, e.g. http://baci-h2020.eu
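One of the simplest multivariate SPC monitors of the kind compared in such studies is a Hotelling T² chart: estimate mean and covariance from a calibration period, then flag time steps whose Mahalanobis distance exceeds a control limit. The two-variable sketch below, with invented data and a chi-squared control limit, only illustrates the general monitoring pattern:

```python
import random

def invert_2x2(m):
    det = m[0][0] * m[1][1] - m[0][1] * m[1][0]
    return [[m[1][1] / det, -m[0][1] / det],
            [-m[1][0] / det, m[0][0] / det]]

def t2_chart(X, train, limit):
    """Flag rows whose Hotelling T^2 exceeds `limit` (2 variables only)."""
    mu = [sum(r[j] for r in X[:train]) / train for j in range(2)]
    cov = [[sum((r[i] - mu[i]) * (r[j] - mu[j]) for r in X[:train]) / (train - 1)
            for j in range(2)] for i in range(2)]
    inv = invert_2x2(cov)
    flags = []
    for r in X:
        v = [r[0] - mu[0], r[1] - mu[1]]
        t2 = (v[0] * (inv[0][0] * v[0] + inv[0][1] * v[1])
              + v[1] * (inv[1][0] * v[0] + inv[1][1] * v[1]))
        flags.append(t2 > limit)
    return flags

random.seed(4)
normal = [[random.gauss(0, 1), random.gauss(0, 1)] for _ in range(300)]
extreme = [[random.gauss(5, 1), random.gauss(-5, 1)] for _ in range(10)]
flags = t2_chart(normal + extreme, train=300, limit=13.82)  # ~chi2(2) 99.9% quantile
print(sum(flags[:300]), sum(flags[300:]))
```

Because T² accounts for the covariance between variables, it can flag constellations that are anomalous jointly even when each variable looks unremarkable on its own, which is the advantage over running a univariate chart per sensor.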
Capability index--a statistical process control tool to aid in udder health control in dairy herds.
Niza-Ribeiro, J; Noordhuizen, J P T M; Menezes, J C
2004-08-01
Bulk milk somatic cell count (BMSCC) averages have been used to evaluate udder health at both the individual and the herd level, as well as milk quality and hygiene. The authors show that the BMSCC average is not the best tool for udder health control programs and that it can be replaced with advantage by the capability index (Cpk). The Cpk is a statistical process control tool traditionally used by engineers to validate, monitor, and predict the expected behavior of processes or machines. The BMSCC data of 13 consecutive months of production from 414 dairy herds, as well as SCC from all cows in the DHI program from 264 herds in the same period, were collected. The Cpk and the annual BMSCC average (AAVG) of all the herds were calculated. Comparing the herds' performance as captured by the Cpk and the AAVG against the European Union (EU) official limit of 400,000 cells/mL for BMSCC, the Cpk accurately classified the compliance of the 414 farms, whereas the AAVG misclassified 166 (40%) of them. The annual prevalence of subclinical mastitis (SMP) of each herd was calculated with individual SCC data from the same 13-mo period. Cows with more than 200,000 SCC/mL were considered to have subclinical mastitis. A logistic regression model relating the Cpk to the herd's subclinical mastitis prevalence was calculated: SMPe = 0.475 e^(-0.5286 x Cpk). The model was validated by evaluating the relation between the observed SMP and the predicted SMPe in terms of the linear correlation coefficient (R2) and the mean difference between SMP and SMPe (i.e., the mean square error of prediction). The validation suggests that our model can be used to estimate a herd's SMP from its Cpk. The Cpk equation relates the herd's BMSCC to the EU official SCC limit; thus the logistic regression model enables the adoption of critical limits for subclinical mastitis that take the legal standard for SCC into consideration.
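The two quantities in the abstract can be sketched together: a one-sided capability index against the EU upper limit, Cpk = (USL − mean)/(3·sd), and the fitted prevalence model SMPe = 0.475·e^(−0.5286·Cpk). Whether the paper computes Cpk on raw or transformed counts is not stated here, so raw monthly values (invented) are assumed:

```python
import math

def cpk_upper(values, usl=400_000):
    """One-sided capability index against an upper specification limit."""
    n = len(values)
    mean = sum(values) / n
    sd = math.sqrt(sum((v - mean) ** 2 for v in values) / (n - 1))
    return (usl - mean) / (3 * sd)

def predicted_smp(cpk):
    # Fitted model quoted in the abstract.
    return 0.475 * math.exp(-0.5286 * cpk)

# Invented monthly BMSCC values (cells/mL) for one herd, 13 months.
bmscc = [210_000, 250_000, 190_000, 230_000, 270_000, 220_000, 240_000,
         200_000, 260_000, 230_000, 210_000, 250_000, 240_000]
cpk = cpk_upper(bmscc)
print(round(cpk, 2), round(predicted_smp(cpk), 3))
```

Unlike the plain average, Cpk penalizes month-to-month variability: two herds with the same mean BMSCC but different spreads get different Cpk values, which is why it classifies compliance with the 400,000 cells/mL limit more accurately.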
ERIC Educational Resources Information Center
Hill, Stephen E.; Schvaneveldt, Shane J.
2011-01-01
This article presents an educational exercise in which statistical process control charts are constructed and used to identify the Steroids Era in American professional baseball. During this period (roughly 1993 until the present), numerous baseball players were alleged or proven to have used banned, performance-enhancing drugs. Also observed…
Caballero Morales, Santiago Omar
2013-01-01
The application of Preventive Maintenance (PM) and Statistical Process Control (SPC) is an important practice for achieving high product quality, a low frequency of failures, and cost reduction in a production process. However, some aspects of their joint application have not been explored in depth. First, most SPC is performed with the X-bar control chart, which does not fully consider the variability of the production process. Second, many studies on the design of control charts consider only the economic aspect, whereas statistical restrictions must also be considered to achieve charts with low probabilities of false detection of failures. Third, the effect of PM on processes with different failure probability distributions has not been studied. Hence, this paper covers these points, presenting the Economic Statistical Design (ESD) of joint X-bar-S control charts with a cost model that integrates PM with a general failure distribution. Experiments showed statistically significant reductions in costs when PM is performed on processes with high failure rates, along with reductions in the sampling frequency of units for testing under SPC. PMID:23527082
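Plain X-bar and S chart limits, the non-economic baseline that the ESD extends, can be sketched with the standard tabulated constants for subgroups of five; the data below are invented:

```python
import math
import random

def xbar_s_limits(subgroups):
    """Center lines and 3-sigma limits for joint X-bar and S charts (n = 5)."""
    n = len(subgroups[0])
    A3, B3, B4 = 1.427, 0.0, 2.089            # tabulated constants for n = 5
    xbars = [sum(g) / n for g in subgroups]
    sds = [math.sqrt(sum((x - sum(g) / n) ** 2 for x in g) / (n - 1))
           for g in subgroups]
    xbb = sum(xbars) / len(xbars)             # grand mean
    sbar = sum(sds) / len(sds)                # average subgroup std dev
    return {"x": (xbb - A3 * sbar, xbb, xbb + A3 * sbar),
            "s": (B3 * sbar, sbar, B4 * sbar)}

random.seed(5)
groups = [[random.gauss(10.0, 0.2) for _ in range(5)] for _ in range(25)]
lim = xbar_s_limits(groups)
print(round(lim["x"][1], 3), round(lim["s"][1], 3))
```

Monitoring the S chart alongside the X-bar chart is what addresses the abstract's first point: a shift in process variability is caught directly rather than only through its indirect effect on subgroup means.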
ERIC Educational Resources Information Center
Coast Community Coll. District, Costa Mesa, CA.
This instructor's manual for workplace trainers contains the materials required to conduct a course in pre-statistical process control. The course consists of six lessons for workers and two lessons for supervisors that discuss the following: concepts taught in the six lessons; workers' progress in the individual lessons; and strategies for…
ERIC Educational Resources Information Center
Averitt, Sallie D.
This instructor guide, which was developed for use in a manufacturing firm's advanced technical preparation program, contains the materials required to present a learning module that is designed to prepare trainees for the program's statistical process control module by improving their basic math skills in working with line graphs and teaching…
Bayesian approach for neural networks--review and case studies.
Lampinen, J; Vehtari, A
2001-04-01
We give a short review of the Bayesian approach to neural network learning and demonstrate the advantages of the approach in three real applications. We discuss the Bayesian approach with emphasis on the role of prior knowledge in Bayesian models and in classical error minimization approaches. The generalization capability of a statistical model, classical or Bayesian, is ultimately based on the prior assumptions. The Bayesian approach permits propagation of uncertainty in quantities that are unknown to other assumptions in the model, which may be more generally valid or easier to guess in the problem. The case problems studied in this paper include a regression, a classification, and an inverse problem. In the most thoroughly analyzed regression problem, the best models were those with less restrictive priors. This emphasizes the major advantage of the Bayesian approach: we are not forced to guess attributes that are unknown, such as the number of degrees of freedom in the model, the non-linearity of the model with respect to each input variable, or the exact form of the distribution of the model residuals. PMID:11341565
Bayesian Quantitative Electrophysiology and Its Multiple Applications in Bioengineering
Barr, Roger C.; Nolte, Loren W.; Pollard, Andrew E.
2014-01-01
Bayesian interpretation of observations began in the early 1700s, and scientific electrophysiology began in the late 1700s. For two centuries these two fields developed mostly separately. In part that was because quantitative Bayesian interpretation, in principle a powerful method of relating measurements to their underlying sources, often required too many steps to be feasible with hand calculation in real applications. As computer power became widespread in the later 1900s, Bayesian models and interpretation moved rapidly but unevenly from the domain of mathematical statistics into applications. Use of Bayesian models now is growing rapidly in electrophysiology. Bayesian models are well suited to the electrophysiological environment, allowing a direct and natural way to express what is known (and unknown) and to evaluate which one of many alternatives is most likely the source of the observations, and the closely related receiver operating characteristic (ROC) curve is a powerful tool in making decisions. Yet, in general, many people would ask what such models are for, in electrophysiology, and what particular advantages such models provide. So to examine this question in particular, this review identifies a number of electrophysiological papers in bio-engineering arising from questions in several organ systems to see where Bayesian electrophysiological models or ROC curves were important to the results that were achieved. PMID:22275206
Bonangelino, Pablo; Irony, Telba; Liang, Shengde; Li, Xuefeng; Mukhi, Vandana; Ruan, Shiling; Xu, Yunling; Yang, Xiting; Wang, Chenguang
2011-09-01
Challenging statistical issues often arise in the design and analysis of clinical trials to assess safety and effectiveness of medical devices in the regulatory setting. The use of Bayesian methods in the design and analysis of medical device clinical trials has been increasing significantly in the past decade, not only due to the availability of prior information, but mainly due to the appealing nature of Bayesian clinical trial designs. The Center for Devices and Radiological Health at the Food and Drug Administration (FDA) has gained extensive experience with the use of Bayesian statistical methods and has identified some important issues that need further exploration. In this article, we discuss several topics relating to the use of Bayesian statistical methods in medical device trials, based on our experience and real applications. We illustrate the benefits and challenges of Bayesian approaches when incorporating prior information to evaluate the effectiveness and safety of a medical device. We further present an example of a Bayesian adaptive clinical trial and compare it to a traditional frequentist design. Finally, we discuss the use of Bayesian hierarchical models for multiregional trials and highlight the advantages of the Bayesian approach when specifying clinically relevant study hypotheses. PMID:21830924
Statistical Characteristics of Experimental Geysers: Factors Controlling Mass and Style of Eruption
NASA Astrophysics Data System (ADS)
Toramaru, A.; Maeda, K.
2011-12-01
to volume of empty at the conduit top. Once the eruption is triggered, mass proportional to the number fraction φD of parcels with temperature higher than the decompression boiling point under free of conduit water is erupted. Assuming that the rectangular area from the top to the depth n(φS + φD) of the square flask is erupted, the erupted mass is calculated by n × n(φS + φD). The sum φS + φD and the enclosed area is defined as the explosive mass, which contributes to the explosivity of the eruption, and the explosivity index (EI) is calculated as the mass ratio (explosive mass)/(erupted mass). Presuming a specific system, we carried out Monte Carlo simulations with the average and variance of the temperature PDF as parameters to obtain the statistical properties of erupted mass and EI. As a result, we find that a system with higher average temperature and smaller variance produces mostly explosive eruptions with larger mass, with a Gaussian type of frequency distribution. Thus, from the results of laboratory experiments and simulations, we conclude that the spatial heterogeneity of the supersaturated state or temperature in the chamber is a key factor controlling the eruption style and erupted mass.
Using Bayesian statistical methods to quantify uncertainty and variability in human physiologically-based pharmacokinetic (PBPK) model predictions for use in risk assessments requires prior distributions (priors), which characterize what is known or believed about parameters’ val...
Using Bayesian statistical methods to quantify uncertainty and variability in human PBPK model predictions for use in risk assessments requires prior distributions (priors), which characterize what is known or believed about parameters’ values before observing in vivo data. Expe...
BNFinder2: Faster Bayesian network learning and Bayesian classification
Dojer, Norbert; Bednarz, Paweł; Podsiadło, Agnieszka; Wilczyński, Bartek
2013-01-01
Summary: Bayesian Networks (BNs) are versatile probabilistic models applicable to many different biological phenomena. In biological applications the structure of the network is usually unknown and needs to be inferred from experimental data. BNFinder is a fast software implementation of an exact algorithm for finding the optimal structure of the network given a number of experimental observations. Its second version, presented in this article, represents a major improvement over the previous version. The improvements include (i) a parallelized learning algorithm leading to an order of magnitude speed-up in BN structure learning time; (ii) inclusion of an additional scoring function based on mutual information criteria; (iii) the possibility of choosing the resulting network specificity based on statistical criteria; and (iv) a new module for classification by BNs, including a cross-validation scheme and classifier quality measurements with receiver operating characteristic scores. Availability and implementation: BNFinder2 is implemented in Python and freely available under the GNU General Public License at the project Web site https://launchpad.net/bnfinder, together with a user's manual, introductory tutorial and supplementary methods. Contact: dojer@mimuw.edu.pl or bartek@mimuw.edu.pl Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23818512
Evaluating the performances of statistical and neural network based control charts
NASA Astrophysics Data System (ADS)
Teoh, Kok Ban; Ong, Hong Choon
2015-10-01
The control chart is used widely in many fields, but the traditional control chart is no longer adequate in detecting a sudden change in a particular process. Run rules are therefore built into the Shewhart X̄ control chart, while the Exponentially Weighted Moving Average (EWMA) control chart, the Cumulative Sum (CUSUM) control chart, and neural network based control charts have been introduced to overcome this limitation in the sensitivity of the traditional control chart. In this study, the average run length (ARL) and median run length (MRL) under shifts in the process mean are computed for the control charts mentioned. We show that interpretations based only on the ARL can be misleading; thus, the MRL is also used to evaluate the performance of the control charts. From this study, the neural network based control chart is found to perform better than the run rules of the Shewhart X̄ control chart and the EWMA and CUSUM control charts.
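The ARL versus MRL distinction this study turns on can be illustrated with a short simulation (a hedged sketch, not the authors' code; the 1σ mean shift and ±3σ Shewhart limits are illustrative assumptions). Because run lengths are roughly geometrically distributed and hence right-skewed, the MRL sits well below the ARL, which is why ARL-only comparisons can mislead.

```python
import random
import statistics

def run_length(shift, limit=3.0, rng=random):
    """Count samples until a standardized observation falls outside the
    +/- `limit` sigma Shewhart control limits, with the process mean
    shifted by `shift` sigma."""
    n = 0
    while True:
        n += 1
        if abs(rng.gauss(shift, 1.0)) > limit:
            return n

def arl_mrl(shift, reps=2000, seed=0):
    """Average and median run length estimated by simulation."""
    rng = random.Random(seed)
    lengths = [run_length(shift, rng=rng) for _ in range(reps)]
    return statistics.mean(lengths), statistics.median(lengths)

arl, mrl = arl_mrl(shift=1.0)
# the right-skewed run-length distribution makes MRL < ARL
```

For a 1σ shift the ARL is around 44 samples while the MRL is near 30, so the "typical" detection delay is noticeably shorter than the average one.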
Model Comparison of Bayesian Semiparametric and Parametric Structural Equation Models
ERIC Educational Resources Information Center
Song, Xin-Yuan; Xia, Ye-Mao; Pan, Jun-Hao; Lee, Sik-Yum
2011-01-01
Structural equation models have wide applications. One of the most important issues in analyzing structural equation models is model comparison. This article proposes a Bayesian model comparison statistic, namely the "L[subscript nu]"-measure for both semiparametric and parametric structural equation models. For illustration purposes, we consider…
Personalized Multi-Student Improvement Based on Bayesian Cybernetics
ERIC Educational Resources Information Center
Kaburlasos, Vassilis G.; Marinagi, Catherine C.; Tsoukalas, Vassilis Th.
2008-01-01
This work presents innovative cybernetics (feedback) techniques based on Bayesian statistics for drawing questions from an Item Bank towards personalized multi-student improvement. A novel software tool, namely "Module for Adaptive Assessment of Students" (or, "MAAS" for short), implements the proposed (feedback) techniques. In conclusion, a pilot…
Model Criticism of Bayesian Networks with Latent Variables.
ERIC Educational Resources Information Center
Williamson, David M.; Mislevy, Robert J.; Almond, Russell G.
This study investigated statistical methods for identifying errors in Bayesian networks (BN) with latent variables, as found in intelligent cognitive assessments. BN, commonly used in artificial intelligence systems, are promising mechanisms for scoring constructed-response examinations. The success of an intelligent assessment or tutoring system…
Augmenting Data with Published Results in Bayesian Linear Regression
ERIC Educational Resources Information Center
de Leeuw, Christiaan; Klugkist, Irene
2012-01-01
In most research, linear regression analyses are performed without taking into account published results (i.e., reported summary statistics) of similar previous studies. Although the prior density in Bayesian linear regression could accommodate such prior knowledge, formal models for doing so are absent from the literature. The goal of this…
A Comparison of Imputation Methods for Bayesian Factor Analysis Models
ERIC Educational Resources Information Center
Merkle, Edgar C.
2011-01-01
Imputation methods are popular for the handling of missing data in psychology. The methods generally consist of predicting missing data based on observed data, yielding a complete data set that is amiable to standard statistical analyses. In the context of Bayesian factor analysis, this article compares imputation under an unrestricted…
Stan: A Probabilistic Programming Language for Bayesian Inference and Optimization
ERIC Educational Resources Information Center
Gelman, Andrew; Lee, Daniel; Guo, Jiqiang
2015-01-01
Stan is a free and open-source C++ program that performs Bayesian inference or optimization for arbitrary user-specified models and can be called from the command line, R, Python, Matlab, or Julia and has great promise for fitting large and complex statistical models in many areas of application. We discuss Stan from users' and developers'…
NASA Technical Reports Server (NTRS)
Wong, K. W.
1974-01-01
In lunar phototriangulation, there is a complete lack of accurate ground control points. The accuracy analysis of the results of lunar phototriangulation must, therefore, be completely dependent on statistical procedures. It was the objective of this investigation to examine the validity of the commonly used statistical procedures, and to develop both mathematical techniques and computer software for evaluating (1) the accuracy of lunar phototriangulation; (2) the contribution of the different types of photo support data to the accuracy of lunar phototriangulation; (3) the accuracy of absolute orientation as a function of the accuracy and distribution of both the ground and model points; and (4) the relative slope accuracy between any triangulated pass points.
NASA Astrophysics Data System (ADS)
Desa, Nor Hasliza Mat; Jemain, Abdul Aziz
2013-11-01
A key assumption in traditional statistical process control (SPC) techniques is that observations or time series data are normally and independently distributed. The presence of serial autocorrelation results in a number of problems, including an increase in the type I error rate and thereby an increase in the expected number of false alarms in the process observations. The independence assumption is, however, often violated in practice owing to serial correlation in the observations. The aim of this paper is therefore to demonstrate, with hospital admission data, the influence of serial correlation on statistical control charts. The trend-free pre-whitening (TFPW) method has been applied as an alternative method to obtain residual series that are statistically uncorrelated with each other. In this study, a data set of daily hospital admissions for respiratory and cardiovascular diseases was used, covering the period 1 January 2009 to 31 December 2009 (365 days). Results showed that the TFPW method is an easy and useful way to remove the influence of serial correlation from the hospital admission data. It can be concluded that a statistical control chart based on the residual series performs better than one based on the original hospital admission series, which is influenced by serial correlation.
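The residual idea behind pre-whitening can be sketched in a few lines (our illustration with illustrative function names, not the paper's TFPW implementation, which also handles the trend component): estimate the lag-1 autocorrelation of the detrended series and remove the AR(1) part, leaving residuals that are approximately serially uncorrelated.

```python
def lag1_autocorr(series):
    """Sample lag-1 autocorrelation coefficient of a series."""
    mean = sum(series) / len(series)
    dev = [x - mean for x in series]
    num = sum(a * b for a, b in zip(dev[:-1], dev[1:]))
    den = sum(d * d for d in dev)
    return num / den

def prewhiten(series):
    """Remove the estimated AR(1) component so the residual series is
    approximately serially uncorrelated (the core of the TFPW idea)."""
    phi = lag1_autocorr(series)
    mean = sum(series) / len(series)
    dev = [x - mean for x in series]
    return [dev[t] - phi * dev[t - 1] for t in range(1, len(dev))]
```

A control chart built on the residuals from `prewhiten` then avoids the inflated false-alarm rate that serial correlation causes on the raw series.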
Fienup, Daniel M; Critchfield, Thomas S
2010-01-01
Computerized lessons that reflect stimulus equivalence principles were used to teach college students concepts related to inferential statistics and hypothesis decision making. Lesson 1 taught participants concepts related to inferential statistics, and Lesson 2 taught them to base hypothesis decisions on a scientific hypothesis and the direction of an effect. Lesson 3 taught the conditional influence of inferential statistics over decisions regarding the scientific and null hypotheses. Participants entered the study with low scores on the targeted skills and left the study demonstrating a high level of accuracy on these skills, which involved mastering more relations than were taught formally. This study illustrates the efficiency of equivalence-based instruction in establishing academic skills in sophisticated learners. PMID:21358904
Computationally efficient Bayesian tracking
NASA Astrophysics Data System (ADS)
Aughenbaugh, Jason; La Cour, Brian
2012-06-01
In this paper, we describe the progress we have achieved in developing a computationally efficient, grid-based Bayesian fusion tracking system. In our approach, the probability surface is represented by a collection of multidimensional polynomials, each computed adaptively on a grid of cells representing state space. Time evolution is performed using a hybrid particle/grid approach and knowledge of the grid structure, while sensor updates use a measurement-based sampling method with a Delaunay triangulation. We present an application of this system to the problem of tracking a submarine target using a field of active and passive sonar buoys.
Searching Algorithm Using Bayesian Updates
ERIC Educational Resources Information Center
Caudle, Kyle
2010-01-01
In late October 1967, the USS Scorpion was lost at sea, somewhere between the Azores and Norfolk Virginia. Dr. Craven of the U.S. Navy's Special Projects Division is credited with using Bayesian Search Theory to locate the submarine. Bayesian Search Theory is a straightforward and interesting application of Bayes' theorem which involves searching…
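The heart of Bayesian search theory is a one-line application of Bayes' theorem: an unsuccessful search of a cell shifts probability mass toward the unsearched cells. The cell priors and detection probability below are hypothetical illustrations, not the actual Scorpion search values.

```python
def update_after_failed_search(priors, searched, p_detect):
    """Posterior cell probabilities after an unsuccessful search of
    cell `searched`, where `p_detect` is the probability of detecting
    the object given that it is actually in that cell."""
    p_miss = 1.0 - priors[searched] * p_detect   # P(no detection)
    posterior = [p / p_miss for p in priors]
    posterior[searched] = priors[searched] * (1.0 - p_detect) / p_miss
    return posterior

priors = [0.4, 0.3, 0.2, 0.1]   # hypothetical seabed cells
posterior = update_after_failed_search(priors, searched=0, p_detect=0.8)
# the searched cell loses probability mass; every other cell gains
```

Iterating this update, always searching the currently most probable cell, is exactly the strategy credited with locating the submarine.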
Bayesian Approach for Inconsistent Information
Stein, M.; Beer, M.; Kreinovich, V.
2013-01-01
In engineering situations, we usually have a large amount of prior knowledge that needs to be taken into account when processing data. Traditionally, the Bayesian approach is used to process data in the presence of prior knowledge. Sometimes, when we apply the traditional Bayesian techniques to engineering data, we get inconsistencies between the data and prior knowledge. These inconsistencies are usually caused by the fact that in the traditional approach, we assume that we know the exact sample values, that the prior distribution is exactly known, etc. In reality, the data is imprecise due to measurement errors, the prior knowledge is only approximately known, etc. So, a natural way to deal with the seemingly inconsistent information is to take this imprecision into account in the Bayesian approach – e.g., by using fuzzy techniques. In this paper, we describe several possible scenarios for fuzzifying the Bayesian approach. Particular attention is paid to the interaction between the estimated imprecise parameters. In this paper, to implement the corresponding fuzzy versions of the Bayesian formulas, we use straightforward computations of the related expression – which makes our computations reasonably time-consuming. Computations in the traditional (non-fuzzy) Bayesian approach are much faster – because they use algorithmically efficient reformulations of the Bayesian formulas. We expect that similar reformulations of the fuzzy Bayesian formulas will also drastically decrease the computation time and thus, enhance the practical use of the proposed methods. PMID:24089579
Deductive updating is not Bayesian.
Markovits, Henry; Brisson, Janie; de Chantal, Pier-Luc
2015-07-01
One of the major debates concerning the nature of inferential reasoning is between counterexample-based theories such as mental model theory and probabilistic theories. This study looks at conclusion updating after the addition of statistical information to examine the hypothesis that deductive reasoning cannot be explained by probabilistic inferences. In Study 1, participants were given an initial "If P then Q rule" for a phenomenon on a recently discovered planet, told that "Q was true," and asked to make a judgment of either deductive validity or probabilistic likelihood of the putative conclusion that "P is true." They were then told the results of 1,000 observations. In the low-probability problem, 950 times P was false and Q was true, whereas 50 times P was true and Q was true. In the high-probability problem, these proportions were inverted. On the low-probability problem, probabilistic ratings and judgments of logical validity decreased. However, on the high-probability problem, probabilistic ratings remained high whereas judgments of logical validity significantly decreased. Confidence ratings were consistent with this different pattern for probabilistic and for deductive inferences. Study 2 replicated this result with another form of inference, "If P then Q. P is false." These results show that deductive updating is not explicable by Bayesian updating. PMID:25603167
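The "statistical information" given to participants reduces to a simple conditional probability computed from the reported observation counts (the function and variable names are ours, added for illustration):

```python
def prob_p_given_q(n_p_true_q_true, n_p_false_q_true):
    """Estimate P(P | Q) from the reported counts of the two
    observation types in which Q was true."""
    return n_p_true_q_true / (n_p_true_q_true + n_p_false_q_true)

low = prob_p_given_q(50, 950)    # low-probability problem
high = prob_p_given_q(950, 50)   # high-probability problem
```

A fully Bayesian reasoner would let these values (0.05 and 0.95) drive probabilistic and validity judgments in parallel, which is exactly the pattern the participants did not show on the high-probability problem.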
Khondker, Zakaria S; Zhu, Hongtu; Chu, Haitao; Lin, Weili; Ibrahim, Joseph G.
2012-01-01
Estimation of sparse covariance matrices and their inverses subject to positive definiteness constraints has drawn a lot of attention in recent years. The abundance of high-dimensional data, where the sample size (n) is less than the dimension (d), requires shrinkage estimation methods since the maximum likelihood estimator is not positive definite in this case. Furthermore, when n is larger than d but not sufficiently larger, shrinkage estimation is more stable than maximum likelihood as it reduces the condition number of the precision matrix. Frequentist methods have utilized penalized likelihood methods, whereas Bayesian approaches rely on matrix decompositions or Wishart priors for shrinkage. In this paper we propose a new method, called the Bayesian Covariance Lasso (BCLASSO), for the shrinkage estimation of a precision (covariance) matrix. We consider a class of priors for the precision matrix that leads to the popular frequentist penalties as special cases, develop a Bayes estimator for the precision matrix, and propose an efficient sampling scheme that does not precalculate boundaries for positive definiteness. The proposed method is permutation invariant and performs shrinkage and estimation simultaneously for non-full rank data. Simulations show that the proposed BCLASSO performs similarly to frequentist methods for non-full rank data. PMID:24551316
Adaptive Dynamic Bayesian Networks
Ng, B M
2007-10-26
A discrete-time Markov process can be compactly modeled as a dynamic Bayesian network (DBN)--a graphical model with nodes representing random variables and directed edges indicating causality between variables. Each node has a probability distribution, conditional on the variables represented by the parent nodes. A DBN's graphical structure encodes fixed conditional dependencies between variables. But in real-world systems, conditional dependencies between variables may be unknown a priori or may vary over time. Model errors can result if the DBN fails to capture all possible interactions between variables. Thus, we explore the representational framework of adaptive DBNs, whose structure and parameters can change from one time step to the next: a distribution's parameters and its set of conditional variables are dynamic. This work builds on recent work in nonparametric Bayesian modeling, such as hierarchical Dirichlet processes, infinite-state hidden Markov networks and structured priors for Bayes net learning. In this paper, we will explain the motivation for our interest in adaptive DBNs, show how popular nonparametric methods are combined to formulate the foundations for adaptive DBNs, and present preliminary results.
A Bayesian Approach to Identifying New Risk Factors for Dementia
Wen, Yen-Hsia; Wu, Shihn-Sheng; Lin, Chun-Hung Richard; Tsai, Jui-Hsiu; Yang, Pinchen; Chang, Yang-Pei; Tseng, Kuan-Hua
2016-01-01
Dementia is one of the most disabling and burdensome health conditions worldwide. In this study, we identified new potential risk factors for dementia from nationwide longitudinal population-based data by using Bayesian statistics. We first tested the consistency of the results obtained using Bayesian statistics with those obtained using classical frequentist probability for 4 recognized risk factors for dementia, namely severe head injury, depression, diabetes mellitus, and vascular diseases. Then, we used Bayesian statistics to verify 2 new potential risk factors for dementia, namely hearing loss and senile cataract, determined from Taiwan's National Health Insurance Research Database. We included a total of 6546 (6.0%) patients diagnosed with dementia. We observed older age, female sex, and lower income as independent risk factors for dementia. Moreover, we verified the 4 recognized risk factors for dementia in the older Taiwanese population; their odds ratios (ORs) ranged from 3.469 to 1.207. Furthermore, we observed that hearing loss (OR = 1.577) and senile cataract (OR = 1.549) were associated with an increased risk of dementia. We found that the results obtained using Bayesian statistics for assessing risk factors for dementia, such as head injury, depression, diabetes mellitus, and vascular diseases, were consistent with those obtained using classical frequentist probability. Moreover, hearing loss and senile cataract were found to be potential risk factors for dementia in the older Taiwanese population. Bayesian statistics could help clinicians explore other potential risk factors for dementia and develop appropriate treatment strategies for these patients. PMID:27227925
Chatziioannou, Aristotelis A.; Moulos, Panagiotis
2011-01-01
StRAnGER is a web application for the automated statistical analysis of annotated gene profiling experiments, exploiting controlled biological vocabularies, such as the Gene Ontology or the KEGG pathway terms. Starting from annotated lists of differentially expressed genes and gene enrichment scores for the terms of each vocabulary, StRAnGER repartitions and reorders the initial distribution of terms to define a new distribution of elements, where each element pools terms holding the same enrichment score. The new distribution thus derived is reordered in decreasing order to the right, according to the observation score of the elements, while elements with the same score are sorted again in decreasing order of their enrichment scores. By applying bootstrapping techniques, a corrected measure of the statistical significance of these elements is derived, which enables the selection of terms mapped to these elements that are unambiguously associated with respective significant gene sets. The selected terms are thus immunized against the bias that infiltrates statistical enrichment analyses and produces artificially high statistical scores due to the finite nature of the data population. Besides a high statistical score, another selection criterion for the terms is the number of their members, which incurs a biological prioritization in line with a Systems Biology context. The output is a detailed ranked list of significant terms, which constitutes a starting point for further functional analysis. PMID:21293737
Fu, Zhibiao; Baker, Daniel; Cheng, Aili; Leighton, Julie; Appelbaum, Edward; Aon, Juan
2016-05-01
The principle of quality by design (QbD) has been widely applied to biopharmaceutical manufacturing processes. Process characterization is an essential step to implement the QbD concept to establish the design space and to define the proven acceptable ranges (PAR) for critical process parameters (CPPs). In this study, we present characterization of a Saccharomyces cerevisiae fermentation process using risk assessment analysis, statistical design of experiments (DoE), and the multivariate Bayesian predictive approach. The critical quality attributes (CQAs) and CPPs were identified with a risk assessment. The statistical model for each attribute was established using the results from the DoE study with consideration given to interactions between CPPs. Both the conventional overlapping contour plot and the multivariate Bayesian predictive approaches were used to establish the region of process operating conditions where all attributes met their specifications simultaneously. The quantitative Bayesian predictive approach was chosen to define the PARs for the CPPs, which apply to the manufacturing control strategy. Experience from the 10,000 L manufacturing scale process validation, including 64 continued process verification batches, indicates that the CPPs remain under a state of control and within the established PARs. The end product quality attributes were within their drug substance specifications. The probability generated with the Bayesian approach was also used as a tool to assess CPP deviations. This approach can be extended to develop other production process characterization and quantify a reliable operating region. © 2016 American Institute of Chemical Engineers Biotechnol. Prog., 32:799-812, 2016. PMID:27095416
Embedding the results of focussed Bayesian fusion into a global context
NASA Astrophysics Data System (ADS)
Sander, Jennifer; Heizmann, Michael
2014-05-01
Bayesian statistics offers a well-founded and powerful fusion methodology also for the fusion of heterogeneous information sources. However, except in special cases, the needed posterior distribution is not analytically derivable. As a consequence, Bayesian fusion may cause unacceptably high computational and storage costs in practice. Local Bayesian fusion approaches aim at reducing the complexity of the Bayesian fusion methodology significantly. This is done by concentrating the actual Bayesian fusion on the potentially most task-relevant parts of the domain of the Properties of Interest. Our research on these approaches is motivated by an analogy to criminal investigations, where criminalists pursue clues only locally. This publication follows previous publications on a special local Bayesian fusion technique called focussed Bayesian fusion, in which the actual calculation of the posterior distribution is completely restricted to a suitably chosen local context. By this, the global posterior distribution is not completely determined, so strategies for using the results of a focussed Bayesian analysis appropriately are needed. In this publication, we primarily contrast different ways of embedding the results of focussed Bayesian fusion explicitly into a global context. To obtain a unique global posterior distribution, we analyze the application of the Maximum Entropy Principle, which has been shown to be successfully applicable in metrology and in several other areas. To address the special need for making further decisions subsequent to the actual fusion task, we further analyze criteria for decision making under partial information.
A Bayesian Framework for Single Image Dehazing considering Noise
Nan, Dong; Bi, Du-yan; Liu, Chang; Ma, Shi-ping; He, Lin-yuan
2014-01-01
The single image dehazing algorithms in existence can only satisfy the demand for dehazing efficiency, not for denoising. In order to solve the problem, a Bayesian framework for single image dehazing considering noise is proposed. Firstly, the Bayesian framework is transformed to meet the dehazing algorithm. Then, the probability density function of the improved atmospheric scattering model is estimated by using the statistical prior and objective assumption of degraded image. Finally, the reflectance image is achieved by an iterative approach with feedback to reach the balance between dehazing and denoising. Experimental results demonstrate that the proposed method can remove haze and noise simultaneously and effectively. PMID:25215327
Bayesian Decision Theory for Multi-Category Adaptive Testing
NASA Astrophysics Data System (ADS)
Marinagi, Catherine C.; Kaburlasos, Vassilis G.
2008-09-01
This work presents a method for item selection in adaptive tests based on Bayesian Decision Theory (BDT). Multiple categories of examinee's competence level are assumed. The method determines the probability an examinee belongs to each category using Bayesian statistics. Before starting a test, prior probabilities of an examinee are assumed. Then, each time an examinee responds to a single item, a new competence level is estimated "a-posteriori" using item response and prior probabilities values. A customized focus-of-attention vector of probabilities is estimated, which is used to draw the next item from the Item Bank. The latter vector considers both Personalized Cost and content balancing percentages of items.
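The per-item posterior update this method relies on can be sketched in a few lines (a simplified illustration under assumed item-response probabilities; the paper's focus-of-attention vector additionally folds in Personalized Cost and content balancing, which we omit):

```python
def update_competence(prior, p_correct, answered_correctly):
    """One Bayesian update of the examinee's competence-category
    probabilities after a single item response. `p_correct[k]` is the
    (assumed) chance a category-k examinee answers this item correctly."""
    likelihood = [p if answered_correctly else 1.0 - p for p in p_correct]
    joint = [pr * l for pr, l in zip(prior, likelihood)]
    z = sum(joint)                      # normalizing constant
    return [j / z for j in joint]

prior = [1/3, 1/3, 1/3]                 # low / medium / high competence
post = update_competence(prior, p_correct=[0.2, 0.5, 0.9],
                         answered_correctly=True)
# a correct answer shifts probability mass toward the high category
```

Repeating this update after every item, with `post` fed back as the next prior, is exactly the a-posteriori estimation loop the abstract describes.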
NOTE FROM THE EDITOR: Bayesian and Maximum Entropy Methods Bayesian and Maximum Entropy Methods
NASA Astrophysics Data System (ADS)
Dobrzynski, L.
2008-10-01
The Bayesian and Maximum Entropy Methods are now standard routines in various data analyses, irrespective of one's own preference for the more conventional approach based on the so-called frequentist understanding of the notion of probability. It is not the purpose of the Editor to show all achievements of these methods in various branches of science, technology and medicine. In the case of condensed matter physics, most of the oldest examples of Bayesian analysis can be found in the excellent tutorial textbooks by Sivia and Skilling [1] and Bretthorst [2], while applications of the Maximum Entropy Methods were described in 'Maximum Entropy in Action' [3]. On the list of questions addressed one finds such problems as deconvolution and reconstruction of complicated spectra, e.g. counting the number of lines hidden within a spectrum observed with always finite resolution, reconstruction of charge, spin and momentum density distributions from incomplete sets of data, etc. On the theoretical side one might find problems like estimation of interatomic potentials [4], application of the MEM to quantum Monte Carlo data [5], the Bayesian approach to inverse quantum statistics [6], a very general approach to statistical mechanics [7], etc. Obviously, in spite of the power of the Bayesian and Maximum Entropy Methods, it is not possible for everything to be solved in a unique way by application of these particular methods of analysis, and one of the problems often raised is connected not only with the uniqueness of a reconstruction of a given distribution (map) but also with its accuracy (error maps). In this 'Comments' section we present a few papers showing more recent advances and views, highlighting some of the aforementioned problems. References [1] Sivia D S and Skilling J 2006 Data Analysis: A Bayesian Tutorial 2nd edn (Oxford: Oxford University Press) [2] Bretthorst G L 1988 Bayesian Spectrum Analysis and Parameter Estimation (Berlin: Springer) [3] Buck B and
Carvalho, Pedro; Marques, Rui Cunha
2016-02-15
This study searches for economies of size and scope in the Portuguese water sector by applying Bayesian and classical statistics to make inference in stochastic frontier analysis (SFA). It demonstrates the usefulness and advantages of applying Bayesian statistics for inference in SFA over traditional SFA, which uses only classical statistics. The resulting Bayesian methods allow overcoming some problems that arise in the application of traditional SFA, such as bias in small samples and skewness of residuals. In the present case study of the water sector in Portugal, these Bayesian methods provide more plausible and acceptable results. Based on the results obtained, we found that there are important economies of output density, economies of size, economies of vertical integration, and economies of scope in the Portuguese water sector, pointing to the considerable advantages of undertaking mergers by joining the retail and wholesale components and by joining the drinking water and wastewater services. PMID:26674686
NASA Astrophysics Data System (ADS)
Grégoire, G.
2016-05-01
This chapter is devoted to two objectives. The first is to answer the request, expressed by attendees of the first Astrostatistics School (Annecy, October 2013), to be provided with an elementary vademecum of statistics that would facilitate understanding of the courses given. In this spirit we recall very basic notions, that is, definitions and properties that we think sufficient to benefit from the courses given in the Astrostatistics School. Thus we briefly give definitions and elementary properties of random variables and vectors, distributions, estimation and tests, and maximum likelihood methodology. We intend to present basic ideas in a hopefully comprehensible way. We do not try to give a rigorous presentation and, given the space devoted to this chapter, can cover only a rather limited field of statistics. The second aim is to focus on some statistical tools that are useful in classification: a basic introduction to Bayesian statistics, maximum likelihood methodology, Gaussian vectors, and Gaussian mixture models.
Using alien coins to test whether simple inference is Bayesian.
Cassey, Peter; Hawkins, Guy E; Donkin, Chris; Brown, Scott D
2016-03-01
Reasoning and inference are well-studied aspects of basic cognition that have been explained as statistically optimal Bayesian inference. Using a simplified experimental design, we conducted quantitative comparisons between Bayesian inference and human inference at the level of individuals. In 3 experiments, with more than 13,000 participants, we asked people for prior and posterior inferences about the probability that 1 of 2 coins would generate certain outcomes. Most participants' inferences were inconsistent with Bayes' rule. Only in the simplest version of the task did the majority of participants adhere to Bayes' rule, but even in that case, there was a significant proportion that failed to do so. The current results highlight the importance of close quantitative comparisons between Bayesian inference and human data at the individual-subject level when evaluating models of cognition. (PsycINFO Database Record) PMID:26461034
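For readers unfamiliar with the benchmark, the Bayes-rule posterior against which participants' responses are compared is a one-line calculation. The sketch below is illustrative only: the coin biases, the uniform prior, and the function name are our own choices, not the study's materials.

```python
# Illustrative sketch (not the study's code): Bayes' rule for inferring
# which of two coins produced an observed sequence of flips.
from math import prod

def posterior_coin_a(flips, p_a, p_b, prior_a=0.5):
    """Posterior probability that coin A (heads bias p_a) generated
    `flips`, a list of 1 (heads) / 0 (tails) outcomes."""
    like_a = prod(p_a if f else 1 - p_a for f in flips)
    like_b = prod(p_b if f else 1 - p_b for f in flips)
    return like_a * prior_a / (like_a * prior_a + like_b * (1 - prior_a))

# Three heads in a row strongly favour the heads-biased coin:
print(round(posterior_coin_a([1, 1, 1], p_a=0.8, p_b=0.2), 3))  # → 0.985
```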
Bayesian estimation of isotopic age differences
Curl, R.L.
1988-08-01
Isotopic dating is subject to uncertainties arising from counting statistics and experimental errors. These uncertainties are additive when an isotopic age difference is calculated. If large, they can lead to no significant age difference by classical statistics. In many cases, relative ages are known because of stratigraphic order or other clues. Such information can be used to establish a Bayes estimate of age difference which will include prior knowledge of age order. Age measurement errors are assumed to be log-normal and a noninformative but constrained bivariate prior for two true ages in known order is adopted. True-age ratio is distributed as a truncated log-normal variate. Its expected value gives an age-ratio estimate, and its variance provides credible intervals. Bayesian estimates of ages are different and in correct order even if measured ages are identical or reversed in order. For example, age measurements on two samples might both yield 100 ka with coefficients of variation of 0.2. Bayesian estimates are 22.7 ka for age difference with a 75% credible interval of (4.4, 43.7) ka.
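The effect of the order constraint can be illustrated with a small rejection-sampling sketch. This is our illustration, not Curl's derivation: the paper works analytically with a truncated log-normal age ratio, whereas the sketch below simply discards sampled age pairs that violate the known stratigraphic order.

```python
# Hedged sketch of the order-constraint idea: two noisy log-normal age
# measurements, keeping only draws consistent with the known age order.
import math
import random

random.seed(0)

def order_constrained_age_diff(m1, m2, cv, n=100_000):
    """Mean age difference t1 - t2 given measurements m1 (known older)
    and m2 (known younger), both with coefficient of variation cv."""
    sigma = math.sqrt(math.log(1 + cv ** 2))  # log-normal shape parameter
    diffs = []
    for _ in range(n):
        t1 = m1 * math.exp(random.gauss(0, sigma))
        t2 = m2 * math.exp(random.gauss(0, sigma))
        if t1 >= t2:              # prior knowledge: t1 is the older sample
            diffs.append(t1 - t2)
    return sum(diffs) / len(diffs)

# Two identical 100 ka measurements still yield a positive age difference,
# in the spirit of the 22.7 ka example above:
est = order_constrained_age_diff(100.0, 100.0, 0.2)
print(round(est, 1))
```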
Bayesian inference in geomagnetism
NASA Technical Reports Server (NTRS)
Backus, George E.
1988-01-01
The inverse problem in empirical geomagnetic modeling is investigated, with critical examination of recently published studies. Particular attention is given to the use of Bayesian inference (BI) to select the damping parameter lambda in the uniqueness portion of the inverse problem. The mathematical bases of BI and stochastic inversion are explored, with consideration of bound-softening problems and resolution in linear Gaussian BI. The problem of estimating the radial magnetic field B(r) at the earth core-mantle boundary from surface and satellite measurements is then analyzed in detail, with specific attention to the selection of lambda in the studies of Gubbins (1983) and Gubbins and Bloxham (1985). It is argued that the selection method is inappropriate and leads to lambda values much larger than those that would result if a reasonable bound on the heat flow at the CMB were assumed.
How can we change beliefs? A Bayesian perspective.
Rutten, A L B
2008-10-01
How can Randomised Controlled Trials (RCTs) change our beliefs? The fact that they do update prior beliefs to different posterior beliefs is explained by Bayesian philosophy. Crucial points in Bayesian analysis include setting the first prior expectation right and sequentially updating the prior in the light of new evidence. Bayesian analysis depends highly on the evidence included. RCT evidence can only falsify the placebo hypothesis; it cannot indicate which mechanism of action could be responsible for an intrinsic effect, and therefore cannot overturn existing beliefs. Bayesian reasoning could structure further discussion, but subjectivity is an inherent element of this process. In the case of homeopathy the first prior is not a common prior shared by all parties to the debate but a paradigm, and this prevents common updating of beliefs. Only by keeping an open mind towards other paradigms and all possible hypotheses can a low Bayesian prior be elevated to the point of accepting a new paradigm; this is more relevant than Bayesian calculations. PMID:19371571
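The sequential-updating point is easiest to see in odds form: posterior odds equal prior odds multiplied by a Bayes factor for each new piece of evidence. The numbers below are invented for illustration and do not come from the article.

```python
# Illustrative only: sequential Bayesian belief updating in odds form.
def update_belief(prior_prob, bayes_factors):
    """Posterior probability after multiplying prior odds by a sequence
    of Bayes factors (one per piece of evidence)."""
    odds = prior_prob / (1 - prior_prob)
    for bf in bayes_factors:
        odds *= bf
    return odds / (1 + odds)

# A very low first prior (the abstract's point) moves little even after
# two moderately supportive results (Bayes factor 3 each):
print(round(update_belief(0.01, [3, 3]), 3))  # → 0.083
```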
NASA Astrophysics Data System (ADS)
Stefani, Jerry A.; Poarch, Scott; Saxena, Sharad; Mozumder, P. K.
1994-09-01
An advanced multivariable off-line process control system, which combines traditional Statistical Process Control (SPC) with feedback control, has been applied to the CVD tungsten process on an Applied Materials Centura reactor. The goal of the model-based controller is to compensate for shifts in the process and maintain the wafer state responses on target. In the present application the controller employs measurements made on test wafers by off-line metrology tools to track the process behavior. This is accomplished by using model-based SPC, which compares the measurements with predictions obtained from empirically derived process models. For CVD tungsten, a physically based modeling approach was employed based on the kinetically limited H2 reduction of WF6. On detecting a statistically significant shift in the process, the controller calculates adjustments to the settings to bring the process responses back on target. To achieve this, a few additional test wafers are processed at slightly different settings than the nominal. This local experiment allows the models to be updated to reflect the current process performance. The model updates are expressed as multiplicative or additive changes in the process inputs and a change in the model constant. This approach to model updating not only tracks the present process/equipment state, but also provides some diagnostic capability regarding the cause of the process shift. The updated models are used by an optimizer to compute new settings to bring the responses back to target. The optimizer is capable of incrementally entering controllables into the strategy, reflecting the degree to which the engineer desires to manipulate each setting. The capability of the controller to compensate for shifts in the CVD tungsten process has been demonstrated. Targets for film bulk resistivity and deposition rate were maintained while satisfying constraints on film stress and WF6 conversion efficiency.
Hong, Peilong; Li, Liming; Liu, Jianji; Zhang, Guoquan
2016-01-01
Young's double-slit or two-beam interference is of fundamental importance for understanding various interference effects, in which the stationary phase difference between two beams plays the key role in first-order coherence. In contrast, in high-order optical coherence it is the statistical behavior of the optical phase that plays the key role. In this article, by employing a fundamental interfering configuration with two classical point sources, we showed that the high-order optical coherence between two classical point sources can be actively designed by controlling the statistical behavior of the relative phase difference between the two point sources. Synchronous-position Nth-order subwavelength interference with an effective wavelength of λ/M was demonstrated, where λ is the wavelength of the point sources and M is an integer not larger than N. Interestingly, we found that the synchronous-position Nth-order interference fringe fingerprints the statistical trace of the random phase fluctuation of the two classical point sources; it therefore provides an effective way to characterize the statistical properties of phase fluctuation for incoherent light sources. PMID:27021589
A statistical learning strategy for closed-loop control of fluid flows
NASA Astrophysics Data System (ADS)
Guéniat, Florimond; Mathelin, Lionel; Hussaini, M. Yousuff
2016-04-01
This work discusses a closed-loop control strategy for complex systems utilizing scarce and streaming data. A discrete embedding space is first built using hash functions applied to the sensor measurements, from which a Markov process model is derived, approximating the complex system's dynamics. A control strategy is then learned using reinforcement learning once rewards relevant to the control objective are identified. This method is designed for experimental configurations, requiring neither computations nor prior knowledge of the system, and enjoys intrinsic robustness. It is illustrated on two systems: the control of the transitions of a Lorenz'63 dynamical system, and the control of the drag of a cylinder flow. The method is shown to perform well.
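The reinforcement-learning step can be sketched in miniature. The hashing/embedding and Markov-model stages of the paper are not reproduced here; a tabular Q-learning toy on an invented five-state chain stands in for "learn a control strategy from rewards," which is an assumption of this sketch rather than the paper's setup.

```python
# Tabular Q-learning sketch: a tiny deterministic chain MDP with states
# 0..4; action 1 moves right, action 0 moves left; reward 1 on entering
# (or staying in) state 4. Off-policy: behavior is random, but the update
# bootstraps from the greedy value max(Q[s2]).
import random

random.seed(3)

def step(s, a):
    s2 = max(0, min(4, s + (1 if a == 1 else -1)))
    return s2, (1.0 if s2 == 4 else 0.0)

Q = [[0.0, 0.0] for _ in range(5)]
for _ in range(1000):                 # short episodes from random starts
    s = random.randrange(5)
    for _ in range(10):
        a = random.randrange(2)       # purely exploratory behavior policy
        s2, r = step(s, a)
        Q[s][a] += 0.5 * (r + 0.9 * max(Q[s2]) - Q[s][a])
        s = s2

policy = [max((0, 1), key=lambda a: Q[s][a]) for s in range(5)]
print(policy)
```

With the reward on the right end of the chain, the learned greedy policy moves right in every state.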
Statistical process control applied to the liquid-fed ceramic melter process
Pulsipher, B.A.; Kuhn, W.L.
1987-09-01
In this report, an application of control charts to the apparent feed composition of a Liquid-Fed Ceramic Melter (LFCM) is demonstrated by using results from a simulation of the LFCM system. Usual applications of control charts require the assumption of uncorrelated observations over time. This assumption is violated in the LFCM system because of the heels left in tanks from previous batches. Methods for dealing with this problem have been developed to create control charts for individual batches sent to the feed preparation tank (FPT). These control charts are capable of detecting changes in the process average as well as changes in the process variation. All numbers reported in this document were derived from a simulated demonstration of a plausible LFCM system. In practice, site-specific data must be used as input to a simulation tailored to that site. These data directly affect all variance estimates used to develop control charts. 64 refs., 3 figs., 2 tabs.
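For orientation, a plain individuals control chart (which assumes uncorrelated observations, exactly the assumption the abstract notes is violated by tank heels) computes limits from the average moving range. The data and function name below are invented for illustration; the LFCM-specific batch charts are not reproduced.

```python
# Minimal individuals (X) control chart sketch: limits at centre line
# +/- 2.66 * mean moving range (the standard constant for n=2 ranges).
def individuals_chart_limits(xs):
    """Return (LCL, centre, UCL) for an individuals chart."""
    mr = [abs(a - b) for a, b in zip(xs[1:], xs[:-1])]   # moving ranges
    mr_bar = sum(mr) / len(mr)
    centre = sum(xs) / len(xs)
    return centre - 2.66 * mr_bar, centre, centre + 2.66 * mr_bar

lcl, centre, ucl = individuals_chart_limits([10.1, 9.8, 10.3, 10.0, 9.9, 10.2])
print(round(lcl, 3), round(centre, 3), round(ucl, 3))  # → 9.252 10.05 10.848
```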
Bayesian networks as a tool for epidemiological systems analysis
NASA Astrophysics Data System (ADS)
Lewis, F. I.
2012-11-01
Bayesian network analysis is a form of probabilistic modeling which derives from empirical data a directed acyclic graph (DAG) describing the dependency structure between random variables. Bayesian networks are increasingly finding application in areas such as computational and systems biology, and more recently in epidemiological analyses. The key distinction between standard empirical modeling approaches, such as generalised linear modeling, and Bayesian network analyses is that the latter attempts not only to identify statistically associated variables, but also to separate these, empirically, into those directly and those indirectly dependent with one or more outcome variables. Such discrimination is vastly more ambitious but has the potential to reveal far more about key features of complex disease systems. Applying Bayesian network modeling to biological and medical data has considerable computational demands, combined with the need to ensure robust model selection given the vast model space of possible DAGs. These challenges require the use of approximation techniques, such as the Laplace approximation, Markov chain Monte Carlo simulation and parametric bootstrapping, along with computational parallelization. A case study in structure discovery - identification of an optimal DAG for given data - is presented which uses additive Bayesian networks to explore veterinary disease data of industrial and medical relevance.
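The core of structure discovery is scoring candidate DAGs against data. A deliberately tiny sketch of that idea, using a BIC score rather than the additive-Bayesian-network machinery of the case study (the score choice, data, and function name are our own assumptions), compares the one-edge DAG X → Y against the empty DAG on binary data:

```python
# Toy structure-discovery sketch: BIC(X->Y) - BIC(empty graph).
# Positive values favour including the edge.
import math
from collections import Counter

def bic_edge_vs_empty(pairs):
    n = len(pairs)
    cx = Counter(x for x, _ in pairs)
    cy = Counter(y for _, y in pairs)
    cxy = Counter(pairs)
    # Empty graph: X and Y modeled independently by their marginals.
    ll_empty = sum(c * math.log(c / n) for c in cx.values())
    ll_empty += sum(c * math.log(c / n) for c in cy.values())
    # Edge X->Y: marginal of X times conditional of Y given X.
    ll_edge = sum(c * math.log(c / n) for c in cx.values())
    ll_edge += sum(c * math.log(c / cx[x]) for (x, y), c in cxy.items())
    # The edge model has one extra free parameter here, penalized by BIC.
    return (ll_edge - ll_empty) - 0.5 * math.log(n)

data = [(0, 0)] * 40 + [(1, 1)] * 40 + [(0, 1)] * 10 + [(1, 0)] * 10
print(bic_edge_vs_empty(data) > 0)  # strongly dependent data: edge wins
```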
Statistical Comparison of Far-Field Noise Events in a Controlled Flow Ma = 0.6 Jet
NASA Astrophysics Data System (ADS)
Freedland, Graham; Lewalle, Jacques
2013-11-01
We compare distributions of acoustic events in controlled and uncontrolled high speed jets. By examining far-field acoustic signals from three microphones and using continuous wavelets, sources of noise can be identified through cross-correlation of the different far-field signals. From the events found, four properties (wavelet magnitude, Strouhal number and lags between two pairs of microphones) were tabulated. Each test gives over 10,000 events, which were sorted into histograms that approximate the statistical distributions of properties. This is used to determine what influence the addition of synthetic jet actuators has on the properties of the flow of the jet. A qualitative analysis of the distributions using quantile-quantile plots helps in the understanding of the distributions of sources. A quantitative analysis using the Anderson-Darling and Kolmogorov-Smirnov tests establishes statistically significant differences between the baseline and control cases. The authors thank Dr. Mark Glauser, Dr. Kerwin Low and the Syracuse Jet Group for the use of their data, Professor Dongliang Wang of Upstate Medical University for his suggestion of statistical methods, and Spectral Energies LLC (through an SBIR grant from AFRL) for their support.
NASA Astrophysics Data System (ADS)
Chernyak, Vladimir Y.; Chertkov, Michael; Bierkens, Joris; Kappen, Hilbert J.
2014-01-01
In stochastic optimal control (SOC) one minimizes the average cost-to-go, that consists of the cost-of-control (amount of efforts), cost-of-space (where one wants the system to be) and the target cost (where one wants the system to arrive), for a system participating in forced and controlled Langevin dynamics. We extend the SOC problem by introducing an additional cost-of-dynamics, characterized by a vector potential. We propose derivation of the generalized gauge-invariant Hamilton-Jacobi-Bellman equation as a variation over density and current, suggest hydrodynamic interpretation and discuss examples, e.g., ergodic control of a particle-within-a-circle, illustrating non-equilibrium space-time complexity.
Sedlack, Jeffrey D
2010-01-01
Surgeons have been slow to incorporate industrial reliability techniques. Process control methods were applied to surgeon waiting time between cases, and to length of stay (LOS) after colon surgery. Waiting times between surgeries were evaluated by auditing the operating room records of a single hospital over a 1-month period. The medical records of 628 patients undergoing colon surgery over a 5-year period were reviewed. The average surgeon wait time between cases was 53 min, and the busiest surgeon spent 29.5 hr in 1 month waiting between surgeries. Process control charting demonstrated poor overall control of the room turnover process. Average LOS after colon resection also demonstrated very poor control. Mean LOS was 10 days. Weibull's conditional analysis revealed a conditional LOS of 9.83 days. Serious process management problems were identified in both analyses. These process issues are both expensive and adversely affect the quality of service offered by the institution. Process control mechanisms were suggested or implemented to improve these surgical processes. Industrial reliability and quality management tools can easily and effectively identify process control problems that occur on surgical services. PMID:20946422
NASA Technical Reports Server (NTRS)
Hutchens, Dale E.; Doan, Patrick A.; Boothe, Richard E.
1997-01-01
Bonding labs at both MSFC and the northern Utah production plant prepare bond test specimens which simulate or witness the production of NASA's Reusable Solid Rocket Motor (RSRM). The current process for preparing the bonding surfaces employs 1,1,1-trichloroethane vapor degreasing, which simulates the current RSRM process. Government regulations (e.g., the 1990 Amendments to the Clean Air Act) have mandated a production phase-out of a number of ozone depleting compounds (ODC) including 1,1,1-trichloroethane. In order to comply with these regulations, the RSRM Program is qualifying a spray-in-air (SIA) precision cleaning process using Brulin 1990, an aqueous blend of surfactants. Accordingly, surface preparation prior to bonding process simulation test specimens must reflect the new production cleaning process. The Bonding Lab Statistical Process Control (SPC) program monitors the progress of the lab and its capabilities, as well as certifies the bonding technicians, by periodically preparing D6AC steel tensile adhesion panels with EA-913NA epoxy adhesive using a standardized process. SPC methods are then used to ensure the process is statistically in control, thus producing reliable data for bonding studies, and to identify any problems which might develop. Since the specimen cleaning process is being changed, new SPC limits must be established. This report summarizes side-by-side testing of D6AC steel tensile adhesion witness panels and tapered double cantilevered beams (TDCBs) using both the current baseline vapor degreasing process and a lab-scale spray-in-air process. A Proceco 26-inch Typhoon dishwasher cleaned both tensile adhesion witness panels and TDCBs in a process which simulates the new production process. The tests were performed six times during 1995; subsequent statistical analysis of the data established new upper control limits (UCL) and lower control limits (LCL). The data also demonstrated that the new process was equivalent to the vapor
Sparsity and the Bayesian perspective
NASA Astrophysics Data System (ADS)
Starck, J.-L.; Donoho, D. L.; Fadili, M. J.; Rassat, A.
2013-04-01
Sparsity has recently been introduced in cosmology for weak-lensing and cosmic microwave background (CMB) data analysis for different applications such as denoising, component separation, or inpainting (i.e., filling the missing data or the mask). Although it gives very nice numerical results, CMB sparse inpainting has been severely criticized by top researchers in cosmology using arguments derived from a Bayesian perspective. In an attempt to understand their point of view, we realize that interpreting a regularization penalty term as a prior in a Bayesian framework can lead to erroneous conclusions. This paper is by no means against the Bayesian approach, which has proven to be very useful for many applications, but warns against a Bayesian-only interpretation in data analysis, which can be misleading in some cases.
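The prior-versus-penalty correspondence at issue can be made concrete in one dimension: the MAP estimate under a Laplace prior with Gaussian noise is exactly the soft-thresholding operator used by sparse (ℓ1-regularized) methods. This is a textbook illustration of the connection the paper debates, not an analysis from the paper itself.

```python
# MAP estimate of x from noisy observation y under a Laplace prior:
#   argmin_x 0.5 * (y - x)**2 + lam * |x|
# which is the soft-thresholding operator of sparse regularization.
def soft_threshold(y, lam):
    if y > lam:
        return y - lam
    if y < -lam:
        return y + lam
    return 0.0   # small observations are shrunk exactly to zero

print(soft_threshold(3.0, 1.0), soft_threshold(0.5, 1.0))  # → 2.0 0.0
```

The exact-zero behaviour is what makes the penalty "sparse"; whether that licenses a literal Bayesian-prior interpretation is precisely what the abstract cautions against.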
Integrative Bayesian analysis of neuroimaging-genetic data with application to cocaine dependence.
Azadeh, Shabnam; Hobbs, Brian P; Ma, Liangsuo; Nielsen, David A; Moeller, F Gerard; Baladandayuthapani, Veerabhadran
2016-01-15
Neuroimaging and genetic studies provide distinct and complementary information about the structural and biological aspects of a disease. Integrating the two sources of data facilitates the investigation of the links between genetic variability and brain mechanisms among different individuals for various medical disorders. This article presents a general statistical framework for integrative Bayesian analysis of neuroimaging-genetic (iBANG) data, which is motivated by a neuroimaging-genetic study in cocaine dependence. Statistical inference necessitated the integration of spatially dependent voxel-level measurements with various patient-level genetic and demographic characteristics under an appropriate probability model to account for the multiple inherent sources of variation. Our framework uses Bayesian model averaging to integrate genetic information into the analysis of voxel-wise neuroimaging data, accounting for spatial correlations in the voxels. Using multiplicity controls based on the false discovery rate, we delineate voxels associated with genetic and demographic features that may impact diffusion as measured by fractional anisotropy (FA) obtained from DTI images. We demonstrate the benefits of accounting for model uncertainties in both model fit and prediction. Our results suggest that cocaine consumption is associated with FA reduction in most white matter regions of interest in the brain. Additionally, gene polymorphisms associated with GABAergic, serotonergic and dopaminergic neurotransmitters and receptors were associated with FA. PMID:26484829
NASA Astrophysics Data System (ADS)
Verma, Harish Kumar; Jain, Cheshta
2015-07-01
In this article, a hybrid algorithm of particle swarm optimization (PSO) with a statistical parameter (HSPSO) is proposed. Basic PSO has low search precision on shifted multimodal problems because it falls into local minima. The proposed approach uses statistical characteristics to update the velocity of the particle to avoid local minima and help particles search for the global optimum with improved convergence. The performance of the newly developed algorithm is verified using various standard multimodal, multivariable, shifted hybrid composition benchmark problems. Further, a comparative analysis of HSPSO with variants of PSO is tested on controlling the frequency of a hybrid renewable energy system which comprises a solar system, wind system, diesel generator, aqua electrolyzer and ultracapacitor. A significant improvement in the convergence characteristic of the HSPSO algorithm over other variants of PSO is observed in solving benchmark optimization and renewable hybrid system problems.
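For context, here is plain ("basic") PSO, the baseline the HSPSO variant improves on. The statistical velocity modification of HSPSO is not shown; the parameter values and test function below are conventional choices, not taken from the article.

```python
# Basic particle swarm optimization sketch (the baseline, not HSPSO):
# each particle's velocity blends inertia, attraction to its personal
# best, and attraction to the global best.
import random

random.seed(1)

def pso(f, dim=2, n=20, iters=200, w=0.7, c1=1.5, c2=1.5, lo=-5.0, hi=5.0):
    xs = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n)]
    vs = [[0.0] * dim for _ in range(n)]
    pbest = [x[:] for x in xs]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n):
            for d in range(dim):
                vs[i][d] = (w * vs[i][d]
                            + c1 * random.random() * (pbest[i][d] - xs[i][d])
                            + c2 * random.random() * (gbest[d] - xs[i][d]))
                xs[i][d] += vs[i][d]
            if f(xs[i]) < f(pbest[i]):
                pbest[i] = xs[i][:]
                if f(xs[i]) < f(gbest):
                    gbest = xs[i][:]
    return gbest

def sphere(x):
    return sum(v * v for v in x)

best = pso(sphere)
print(sphere(best))
```

On a unimodal function like the sphere this converges readily; the abstract's point is that on shifted multimodal benchmarks this basic scheme stalls in local minima.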
Bayesian Calibration of Microsimulation Models.
Rutter, Carolyn M; Miglioretti, Diana L; Savarino, James E
2009-12-01
Microsimulation models that describe disease processes synthesize information from multiple sources and can be used to estimate the effects of screening and treatment on cancer incidence and mortality at a population level. These models are characterized by simulation of individual event histories for an idealized population of interest. Microsimulation models are complex and invariably include parameters that are not well informed by existing data. Therefore, a key component of model development is the choice of parameter values. Microsimulation model parameter values are selected to reproduce expected or known results through the process of model calibration. Calibration may be done by perturbing model parameters one at a time or by using a search algorithm. As an alternative, we propose a Bayesian method to calibrate microsimulation models that uses Markov chain Monte Carlo. We show that this approach converges to the target distribution and use a simulation study to demonstrate its finite-sample performance. Although computationally intensive, this approach has several advantages over previously proposed methods, including the use of statistical criteria to select parameter values, simultaneous calibration of multiple parameters to multiple data sources, incorporation of information via prior distributions, description of parameter identifiability, and the ability to obtain interval estimates of model parameters. We develop a microsimulation model for colorectal cancer and use our proposed method to calibrate model parameters. The microsimulation model provides a good fit to the calibration data. We find evidence that some parameters are identified primarily through prior distributions. Our results underscore the need to incorporate multiple sources of variability (i.e., due to calibration data, unknown parameters, and estimated parameters and predicted values) when calibrating and applying microsimulation models. PMID:20076767
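The MCMC machinery behind such calibration can be sketched in its simplest form: a Metropolis sampler calibrating a single parameter to one summary datum. This is a minimal stand-in under assumptions of our own (one parameter, a binomial calibration target, a flat prior on the logit scale); the paper calibrates many parameters against multiple data sources.

```python
# Minimal Metropolis-Hastings sketch: calibrate a rate parameter so the
# model reproduces an observed count of 30 events out of 100.
import math
import random

random.seed(0)

def metropolis(log_post, start, step=0.5, n=5000):
    x, lp, samples = start, log_post(start), []
    for _ in range(n):
        prop = x + random.gauss(0, step)          # random-walk proposal
        lp_prop = log_post(prop)
        if math.log(random.random()) < lp_prop - lp:
            x, lp = prop, lp_prop                 # accept
        samples.append(x)
    return samples

def log_post(theta):
    # Binomial log-likelihood of 30/100 with p = logistic(theta);
    # flat prior on theta contributes nothing.
    p = 1 / (1 + math.exp(-theta))
    return 30 * math.log(p) + 70 * math.log(1 - p)

draws = metropolis(log_post, start=0.0)
p_draws = [1 / (1 + math.exp(-t)) for t in draws[1000:]]  # drop burn-in
print(round(sum(p_draws) / len(p_draws), 2))
```

The posterior mean of p settles near the calibration target of 0.30, and the spread of `p_draws` gives the interval estimates the abstract highlights as an advantage.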
Applications of Bayesian spectrum representation in acoustics
NASA Astrophysics Data System (ADS)
Botts, Jonathan M.
framework. The application to reflection data is useful for representing frequency-dependent impedance boundaries in finite difference acoustic simulations. Furthermore, since the filter transfer function is a parametric model, it can be modified to incorporate arbitrary frequency weighting and account for the band-limited nature of measured reflection spectra. Finally, the model is modified to compensate for dispersive error in the finite difference simulation as part of the filter design process. Stemming from the filter boundary problem, the implementation of pressure sources in finite difference simulation is addressed in order to ensure that schemes properly converge. A class of parameterized source functions is proposed and shown to offer straightforward control of residual error in the simulation. Guided by the notion that the solution to be approximated affects the approximation error, sources are designed which reduce residual dispersive error to the size of round-off errors. The early part of a room impulse response can be characterized by a series of isolated plane waves. Measured with an array of microphones, plane waves map to a directional response of the array, or spatial intensity map. Probabilistic inversion of this response results in estimates of the number and directions of image-source arrivals. The model-based inversion is shown to avoid ambiguities associated with peak-finding or inspection of the spatial intensity map. For this problem, determining the number of arrivals in a given frame is critical for properly inferring the state of the sound field. This analysis is effectively a compression of the spatial room response, which is useful for analysis or encoding of the spatial sound field. Parametric, model-based formulations of these problems enhance the solution in all cases, and a Bayesian interpretation provides a principled approach to model comparison and parameter estimation.
Kernel approximate Bayesian computation in population genetic inferences.
Nakagome, Shigeki; Fukumizu, Kenji; Mano, Shuhei
2013-12-01
Approximate Bayesian computation (ABC) is a likelihood-free approach for Bayesian inference based on a rejection algorithm that applies a tolerance of dissimilarity between summary statistics from observed and simulated data. Although several improvements to the algorithm have been proposed, none avoids the following two sources of approximation: 1) lack of sufficient statistics: sampling is not from the true posterior density given data but from an approximate posterior density given summary statistics; and 2) non-zero tolerance: sampling from the posterior density given summary statistics is achieved only in the limit of zero tolerance. The first source of approximation can be improved by adding a summary statistic, but an increase in the number of summary statistics could introduce additional variance caused by the low acceptance rate. Consequently, many researchers have attempted to develop techniques to choose informative summary statistics. The present study evaluated the utility of a kernel-based ABC method [Fukumizu, K., L. Song and A. Gretton (2010): "Kernel Bayes' rule: Bayesian inference with positive definite kernels," arXiv:1009.5736; Fukumizu, K., L. Song and A. Gretton (2011): "Kernel Bayes' rule," in Advances in Neural Information Processing Systems 24 (NIPS 24), J. Shawe-Taylor, R. S. Zemel, P. Bartlett, F. Pereira and K. Q. Weinberger (Eds.), pp. 1549-1557] for complex problems that demand many summary statistics. Specifically, kernel ABC was applied to population genetic inference. We demonstrate that, in contrast to conventional ABCs, kernel ABC can incorporate a large number of summary statistics while maintaining high performance of the inference. PMID:24150124
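The conventional rejection-ABC scheme the abstract contrasts against can be sketched directly; both of its approximation sources (the summary statistic and the non-zero tolerance) are visible in the code. The model, prior, and tolerance below are our own toy choices, and the kernel variant is not shown.

```python
# Rejection ABC sketch: infer the mean of a normal model using one
# summary statistic (the sample mean) and a dissimilarity tolerance.
import random

random.seed(2)

def rejection_abc(observed_mean, n_accept=200, tol=0.1, n_obs=100):
    accepted = []
    while len(accepted) < n_accept:
        theta = random.uniform(-5, 5)                     # prior draw
        sim = [random.gauss(theta, 1) for _ in range(n_obs)]
        summary = sum(sim) / n_obs                        # summary statistic
        if abs(summary - observed_mean) < tol:            # non-zero tolerance
            accepted.append(theta)
    return sum(accepted) / len(accepted)                  # posterior mean

est = rejection_abc(1.0)
print(round(est, 1))
```

The acceptance rate falls quickly as the tolerance shrinks or as more summary statistics are matched at once, which is exactly the variance problem motivating the kernel approach.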
Identifying the controls of wildfire activity in Namibia using multivariate statistics
NASA Astrophysics Data System (ADS)
Mayr, Manuel; Le Roux, Johan; Samimi, Cyrus
2015-04-01
data mining techniques to select a conceivable set of variables by their explanatory value and to remove redundancy. We will then apply two multivariate statistical methods suitable to a large variety of data types and frequently used for (non-linear) causative factor identification: Non-metric Multidimensional Scaling (NMDS) and Regression Trees. The assumed value of these analyses is i) to determine the most important predictor variables of fire activity in Namibia, ii) to decipher their complex interactions in driving fire variability in Namibia, and iii) to compare the performance of two state-of-the-art statistical methods. References: Le Roux, J. (2011): The effect of land use practices on the spatial and temporal characteristics of savanna fires in Namibia. Doctoral thesis at the University of Erlangen-Nuremberg/Germany - 155 pages.
Rediscovery of Good-Turing estimators via Bayesian nonparametrics.
Favaro, Stefano; Nipoti, Bernardo; Teh, Yee Whye
2016-03-01
The problem of estimating discovery probabilities originated in the context of statistical ecology, and in recent years it has become popular due to its frequent appearance in challenging applications arising in genetics, bioinformatics, linguistics, design of experiments, machine learning, etc. A full range of statistical approaches, parametric and nonparametric as well as frequentist and Bayesian, has been proposed for estimating discovery probabilities. In this article, we investigate the relationships between the celebrated Good-Turing approach, a frequentist nonparametric approach developed in the 1940s, and a Bayesian nonparametric approach recently introduced in the literature. Specifically, under the assumption of a two-parameter Poisson-Dirichlet prior, we show that Bayesian nonparametric estimators of discovery probabilities are asymptotically equivalent, for a large sample size, to suitably smoothed Good-Turing estimators. As a by-product of this result, we introduce and investigate a methodology for deriving exact and asymptotic credible intervals to be associated with the Bayesian nonparametric estimators of discovery probabilities. The proposed methodology is illustrated through a comprehensive simulation study and the analysis of Expressed Sequence Tags data generated by sequencing a benchmark complementary DNA library. PMID:26224325
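The classical Good-Turing idea is simple enough to sketch: the probability that the next observation is a previously unseen species is estimated by N1/n, where N1 is the number of species seen exactly once. This is the unsmoothed baseline, not the smoothed or Bayesian nonparametric estimators the abstract studies.

```python
from collections import Counter

def good_turing_missing_mass(sample):
    """Good-Turing estimate of the discovery probability: N1 / n,
    where N1 is the number of species observed exactly once and
    n is the total sample size."""
    counts = Counter(sample)
    n = len(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / n

# Species = letters; 'c' and 'd' each appear once, so N1 = 2, n = 11.
estimate = good_turing_missing_mass(list("abracadabra"))
print(estimate)  # → 2/11 ≈ 0.1818
```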
Bayesian Analysis of Underground Flooding
NASA Astrophysics Data System (ADS)
Bogardi, Istvan; Duckstein, Lucien; Szidarovszky, Ferenc
1982-08-01
An event-based stochastic model is used to describe the spatial phenomenon of water inrush into underground works located under a karstic aquifer, and a Bayesian analysis is performed because of high parameter uncertainty. The random variables of the model are inrush yield per event, distance between events, number of events per unit underground space, maximum yield, and total yield over mine lifetime. Physically based hypotheses on the types of distributions are made and reinforced by observations. High parameter uncertainty stems from the random characteristics of karstic limestone and the limited amount of observation data. Thus, during the design stage, only indirect data such as regional information and geological analogies are available; updating of this information should then be done as the construction progresses and inrush events are observed and recorded. A Bayes simulation algorithm is developed and applied to estimate the probability distributions of inrush event characteristics used in the design of water control facilities in underground mining. A real-life example in the Transdanubian region of Hungary is used to illustrate the methodology.
An Assessment of Statistical Process Control-Based Approaches for Charting Student Evaluation Scores
ERIC Educational Resources Information Center
Ding, Xin; Wardell, Don; Verma, Rohit
2006-01-01
We compare three control charts for monitoring data from student evaluations of teaching (SET) with the goal of improving student satisfaction with teaching performance. The two charts that we propose are a modified "p" chart and a z-score chart. We show that these charts overcome some of the shortcomings of the more traditional charts…
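For context, a standard (unmodified) p chart places its control limits at p̄ ± 3·sqrt(p̄(1−p̄)/n). The sketch below shows that textbook baseline, not the modified charts the paper proposes, and the evaluation numbers are hypothetical.

```python
import math

def p_chart_limits(p_bar, n):
    """Standard p-chart limits for a monitored proportion with samples
    of size n: p_bar +/- 3*sqrt(p_bar*(1-p_bar)/n), truncated to [0, 1]."""
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)
    return max(0.0, p_bar - 3 * sigma), min(1.0, p_bar + 3 * sigma)

# Hypothetical: fraction of students rating teaching "satisfactory or above",
# with roughly 40 evaluations per course offering.
lcl, ucl = p_chart_limits(0.85, 40)
print(lcl, ucl)
```

A semester whose satisfaction proportion falls below the lower limit would signal a genuine shift rather than ordinary sampling variation; the paper's modified charts address shortcomings of exactly this kind of baseline chart.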
NASA Astrophysics Data System (ADS)
Ghosal, Kajal; Chandra, Aniruddha
2010-10-01
Different concentrations of hydrophobically modified hydroxypropyl methylcellulose (HPMC, 60 M grade) and conventional hydrophilic hydroxypropyl methylcellulose (50 cPs) were used to prepare four topical hydrogel formulations using a model nonsteroidal anti-inflammatory drug (NSAID), diclofenac potassium (DP). For all the formulations, the suitability of different common empirical (zero-order, first-order, and Higuchi), semi-empirical (Ritger-Peppas and Peppas-Sahlin), and some newer statistical (logistic, log-logistic, Weibull, Gumbel, and generalized extreme value distribution) models to describe the drug release profile was tested through non-linear least-squares curve fitting. The general-purpose mathematical analysis tool MATLAB was used for this purpose. Further, instead of the widely used transformed linear fit method, direct fitting was used to avoid truncation and transformation errors. The results revealed that the log-logistic distribution, among all the models investigated, was the best fit for the hydrophobic formulations. For the hydrophilic cases, the semi-empirical models and the Weibull distribution worked best, although the log-logistic also showed a close fit.
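Direct (untransformed) least-squares fitting of a log-logistic release model can be sketched as below. This is a toy illustration on synthetic data using a coarse grid search, not the authors' MATLAB routines; the time points and parameter values are invented.

```python
import numpy as np

def log_logistic(t, alpha, beta):
    """Log-logistic CDF as a cumulative-release model:
    F(t) = 1 / (1 + (t/alpha)^(-beta))."""
    return 1.0 / (1.0 + (t / alpha) ** (-beta))

# Hypothetical fractional-release data at sampling times (hours); the true
# curve uses alpha = 3, beta = 1.8, lightly perturbed to mimic measurement noise.
t = np.array([0.5, 1.0, 2.0, 4.0, 6.0, 8.0, 10.0, 12.0])
release = log_logistic(t, 3.0, 1.8) + 0.01 * np.sin(t)

# Direct least-squares fit via grid search: no linearizing transform, which
# avoids the truncation/transformation errors the abstract warns about.
alphas = np.linspace(0.5, 6.0, 111)   # step 0.05
betas = np.linspace(0.5, 4.0, 71)     # step 0.05
sse = np.array([[np.sum((release - log_logistic(t, a, b)) ** 2)
                 for b in betas] for a in alphas])
i, j = np.unravel_index(np.argmin(sse), sse.shape)
alpha_hat, beta_hat = alphas[i], betas[j]
print(alpha_hat, beta_hat)
```

In practice a proper optimizer (e.g. Levenberg-Marquardt) would replace the grid, but the objective — minimizing squared error on the untransformed release curve — is the same.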
Li, Junning; Shi, Yonggang; Toga, Arthur W.
2015-01-01
Thresholding statistical maps with appropriate correction for multiple testing remains a critical and challenging problem in brain mapping. Since the false discovery rate (FDR) criterion was introduced to the neuroimaging community a decade ago, various improvements have been proposed. However, a highly desirable feature, transformation invariance, has not been adequately addressed, especially for voxel-based FDR. Thresholding applied after spatial transformation is not necessarily equivalent to transformation applied after thresholding in the original space. We find this problem closely related to another important issue: spatial correlation of signals. A Gaussian random vector-valued image after normalization is a random map from a Euclidean space to a high-dimensional unit sphere. Instead of defining the FDR measure in the image's Euclidean space, we define it in the signals' hyper-spherical space, whose measure not only reflects the intrinsic "volume" of the signals' randomness but also remains invariant under spatial transformation of the images. Experiments with synthetic and real images demonstrate that our method achieves transformation invariance and significantly minimizes the bias introduced by the choice of template images. PMID:26213450
Gutierrez-Gonzalez, Juan Jose; Wu, Xiaolei; Zhang, Juan; Lee, Jeong-Dong; Ellersieck, Mark; Shannon, J Grover; Yu, Oliver; Nguyen, Henry T; Sleper, David A
2009-10-01
A major objective for geneticists is to decipher genetic architecture of traits associated with agronomic importance. However, a majority of such traits are complex, and their genetic dissection has been traditionally hampered not only by the number of minor-effect quantitative trait loci (QTL) but also by genome-wide interacting loci with little or no individual effect. Soybean (Glycine max [L.] Merr.) seed isoflavonoids display a broad range of variation, even in genetically stabilized lines that grow in a fixed environment, because their synthesis and accumulation are affected by many biotic and abiotic factors. Due to this complexity, isoflavone QTL mapping has often produced conflicting results especially with variable growing conditions. Herein, we comparatively mapped soybean seed isoflavones genistein, daidzein, and glycitein by using several of the most commonly used mapping approaches: interval mapping, composite interval mapping, multiple interval mapping and a mixed-model based composite interval mapping. In total, 26 QTLs, including many novel regions, were found bearing additive main effects in a population of RILs derived from the cross between Essex and PI 437654. Our comparative approach demonstrates that statistical mapping methodologies are crucial for QTL discovery in complex traits. Despite a previous understanding of the influence of additive QTL on isoflavone production, the role of epistasis is not well established. Results indicate that epistasis, although largely dependent on the environment, is a very important genetic component underlying seed isoflavone content, and suggest epistasis as a key factor causing the observed phenotypic variability of these traits in diverse environments. PMID:19626310
Bayesian Inference for Functional Dynamics Exploring in fMRI Data.
Guo, Xuan; Liu, Bing; Chen, Le; Chen, Guantao; Pan, Yi; Zhang, Jing
2016-01-01
This paper aims to review state-of-the-art Bayesian-inference-based methods applied to functional magnetic resonance imaging (fMRI) data. Particularly, we focus on one specific long-standing challenge in the computational modeling of fMRI datasets: how to effectively explore typical functional interactions from fMRI time series and the corresponding boundaries of temporal segments. Bayesian inference is a method of statistical inference which has been shown to be a powerful tool to encode dependence relationships among the variables with uncertainty. Here we provide an introduction to a group of Bayesian-inference-based methods for fMRI data analysis, which were designed to detect magnitude or functional connectivity change points and to infer their functional interaction patterns based on corresponding temporal boundaries. We also provide a comparison of three popular Bayesian models, that is, Bayesian Magnitude Change Point Model (BMCPM), Bayesian Connectivity Change Point Model (BCCPM), and Dynamic Bayesian Variable Partition Model (DBVPM), and give a summary of their applications. We envision that more delicate Bayesian inference models will be emerging and play increasingly important roles in modeling brain functions in the years to come. PMID:27034708
Ayala, Guillermo; Díez, Fernando; Gassó, María T; Jones, Brian E; Martín-Portugués, Rafael; Ramiro-Aparicio, Juan
2016-04-30
The internal lubricant content (ILC) of inhalation-grade HPMC capsules is a key factor in ensuring good powder release when the patient inhales a medicine from a dry powder inhaler (DPI). Powder release from capsules has been shown to be influenced by the ILC; the characteristics used to measure this are the emitted dose, fine particle fraction and mass median aerodynamic diameter. In addition, the ILC level is critical for capsule shell manufacture because the process cannot work without it. A designed experiment was applied to the manufacture of inhalation capsules with the required ILC. A full factorial model was used to identify the controlling factors, and from this a linear model has been proposed to improve control of the process. PMID:26899981
Social Self-Control Is a Statistically Nonredundant Correlate of Adolescent Substance Use.
Sussman, Steve; Chou, Chih-Ping; Pang, Raina D; Kirkpatrick, Matthew; Guillot, Casey R; Stone, Matthew; Khoddam, Rubin; Riggs, Nathaniel R; Unger, Jennifer B; Leventhal, Adam M
2016-05-11
The social self-control scale (SSCS), which taps provocative behavior in social situations, was compared with five potentially overlapping measures (i.e., temperament-related impulsivity, psychomotor agitation-related self-control, perceived social competence, and rash action in response to negative and positive affectively charged states) as correlates of tobacco use and other drug use among a sample of 3,356 ninth-grade youth in Southern California high schools. Although the measures shared considerable variance, the SSCS was incrementally associated with both categories of drug use over and above alternate constructs previously implicated in adolescent drug use. Hence, SSC may relate to adolescent drug use through an etiological pathway distinct from other risk constructs. Given that youth who tend to alienate others through provocative social behavior are at risk for multiple drug use, prevention programming to modify low SSC may be warranted. PMID:27070833
Bayesian Error Estimation Functionals
NASA Astrophysics Data System (ADS)
Jacobsen, Karsten W.
The challenge of approximating the exchange-correlation functional in Density Functional Theory (DFT) has led to the development of numerous approximations of varying accuracy on different calculated properties. There is therefore a need for reliable estimation of prediction errors within the different approximation schemes to DFT. The Bayesian Error Estimation Functionals (BEEF) have been developed with this in mind. The functionals are constructed by fitting to experimental and high-quality computational databases for molecules and solids, including chemisorption and van der Waals systems. This leads to reasonably accurate general-purpose functionals with a particular focus on surface science. The fitting procedure involves considerations of how to combine different types of data, and applies Tikhonov regularization and bootstrap cross validation. The methodology has been applied to construct GGA and metaGGA functionals with and without inclusion of long-ranged van der Waals contributions. The error estimation is made possible by generating not just a single functional but a probability distribution of functionals, represented by a functional ensemble. The use of the functional ensemble is illustrated on compound heats of formation and by investigations of the reliability of calculated catalytic ammonia synthesis rates.
Approximate Bayesian multibody tracking.
Lanz, Oswald
2006-09-01
Visual tracking of multiple targets is a challenging problem, especially when efficiency is an issue. Occlusions, if not properly handled, are a major source of failure. Solutions supporting principled occlusion reasoning have been proposed but remain impractical for online applications. This paper presents a new solution which effectively manages the trade-off between reliable modeling and computational efficiency. The Hybrid Joint-Separable (HJS) filter is derived from a joint Bayesian formulation of the problem, and shown to be efficient while optimal in terms of compact belief representation. Computational efficiency is achieved by employing a Markov random field approximation to joint dynamics and an incremental algorithm for posterior update with an appearance likelihood that implements a physically based model of the occlusion process. A particle filter implementation is proposed which achieves accurate tracking during partial occlusions, while in cases of complete occlusion, tracking hypotheses are bound to estimated occlusion volumes. Experiments show that the proposed algorithm is efficient, robust, and able to resolve long-term occlusions between targets with identical appearance. PMID:16929730
Bayesian Spatial Quantile Regression
Reich, Brian J.; Fuentes, Montserrat; Dunson, David B.
2013-01-01
Tropospheric ozone is one of the six criteria pollutants regulated by the United States Environmental Protection Agency under the Clean Air Act and has been linked with several adverse health effects, including mortality. Due to the strong dependence on weather conditions, ozone may be sensitive to climate change and there is great interest in studying the potential effect of climate change on ozone, and how this change may affect public health. In this paper we develop a Bayesian spatial model to predict ozone under different meteorological conditions, and use this model to study spatial and temporal trends and to forecast ozone concentrations under different climate scenarios. We develop a spatial quantile regression model that does not assume normality and allows the covariates to affect the entire conditional distribution, rather than just the mean. The conditional distribution is allowed to vary from site-to-site and is smoothed with a spatial prior. For extremely large datasets our model is computationally infeasible, and we develop an approximate method. We apply the approximate version of our model to summer ozone from 1997–2005 in the Eastern U.S., and use deterministic climate models to project ozone under future climate conditions. Our analysis suggests that holding all other factors fixed, an increase in daily average temperature will lead to the largest increase in ozone in the Industrial Midwest and Northeast. PMID:23459794
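The pinball (check) loss underlying quantile regression can be minimized directly. The sketch below fits linear conditional quantiles by subgradient descent on synthetic data; it is a non-spatial, non-Bayesian toy version of the idea that covariates shift the whole conditional distribution, not the authors' spatial model.

```python
import numpy as np

rng = np.random.default_rng(2)

def fit_quantile(x, y, tau, lr=0.05, steps=4000):
    """Linear quantile regression by subgradient descent on the pinball
    (check) loss rho_tau(u) = u * (tau - 1{u < 0})."""
    a, b = 0.0, 0.0                              # intercept, slope
    for _ in range(steps):
        u = y - (a + b * x)                      # residuals
        g = np.where(u < 0, 1.0 - tau, -tau)     # d(loss)/d(prediction)
        a -= lr * g.mean()
        b -= lr * (g * x).mean()
    return a, b

# Synthetic "ozone vs. temperature" data: the covariate shifts the entire
# conditional distribution, so different quantiles give different lines.
x = rng.uniform(0.0, 3.0, 300)
y = 1.0 + 2.0 * x + rng.normal(0.0, 1.0, 300)
a50, b50 = fit_quantile(x, y, tau=0.5)
a90, b90 = fit_quantile(x, y, tau=0.9)
print(b50, a90 - a50)
```

Because the noise here is homoskedastic, the 0.5 and 0.9 quantile lines share a slope and differ mainly by an intercept shift of about one upper-decile standard normal unit; heteroskedastic data would produce genuinely different slopes, which is the flexibility the paper exploits.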
Bayesian approach for network modeling of brain structural features
NASA Astrophysics Data System (ADS)
Joshi, Anand A.; Joshi, Shantanu H.; Leahy, Richard M.; Shattuck, David W.; Dinov, Ivo; Toga, Arthur W.
2010-03-01
Brain connectivity patterns are useful in understanding brain function and organization. Anatomical brain connectivity is largely determined using the physical synaptic connections between neurons. In contrast, statistical brain connectivity in a given brain population refers to the interaction and interdependencies of statistics of multitudes of brain features, including cortical area, volume, and thickness. Traditionally, this dependence has been studied by statistical correlations of cortical features. In this paper, we propose the use of Bayesian network modeling for inferring statistical brain connectivity patterns that relate to causal (directed) as well as non-causal (undirected) relationships between cortical surface areas. We argue that for multivariate cortical data, the Bayesian model provides a more accurate representation by removing the effect of confounding correlations that are introduced due to canonical dependence between the data. Results are presented for a population of 466 brains, where a structural equation modeling (SEM) approach is used to generate a Bayesian network model, as well as a dependency graph for the joint distribution of cortical areas.
NASA Technical Reports Server (NTRS)
Robinson, Michael; Steiner, Matthias; Wolff, David B.; Ferrier, Brad S.; Kessinger, Cathy; Einaudi, Franco (Technical Monitor)
2000-01-01
The primary function of the TRMM Ground Validation (GV) Program is to create GV rainfall products that provide basic validation of satellite-derived precipitation measurements for select primary sites. A fundamental and extremely important step in creating high-quality GV products is radar data quality control. Quality control (QC) processing of TRMM GV radar data is based on some automated procedures, but the current QC algorithm is not fully operational and requires significant human interaction to assure satisfactory results. Moreover, the TRMM GV QC algorithm, even with continuous manual tuning, still cannot completely remove all types of spurious echoes. In an attempt to improve the current operational radar data QC procedures of the TRMM GV effort, an intercomparison of several QC algorithms has been conducted. This presentation will demonstrate how various radar data QC algorithms affect accumulated radar rainfall products. In all, six different QC algorithms will be applied to two months of WSR-88D radar data from Melbourne, Florida. Daily, five-day, and monthly accumulated radar rainfall maps will be produced for each quality-controlled data set. The QC algorithms will be evaluated and compared based on their ability to remove spurious echoes without removing significant precipitation. Strengths and weaknesses of each algorithm will be assessed based on their ability to mitigate both erroneous additions and reductions in rainfall accumulation from spurious echo contamination and true precipitation removal, respectively. Contamination from individual spurious echo categories will be quantified to further diagnose the abilities of each radar QC algorithm. Finally, a cost-benefit analysis will be conducted to determine whether a more automated QC algorithm is a viable alternative to the current, labor-intensive QC algorithm employed by TRMM GV.
Dimova, Rositsa B; Allison, David B
2016-01-01
The conclusions of Cassani et al. in the January 2015 issue of Nutrition Journal (doi: 10.1186/1475-2891-14-5 ) cannot be substantiated by the analysis reported nor by the data themselves. The authors ascribed the observed decrease in inflammatory markers to the components of flaxseed and based their conclusions on within-group comparisons made between the final and the baseline measurements separately in each arm of the randomized controlled trial. However, this is an improper approach and the conclusions of the paper are invalid. A correct analysis of the data shows no such effects. PMID:27265269
A program for the Bayesian Neural Network in the ROOT framework
NASA Astrophysics Data System (ADS)
Zhong, Jiahang; Huang, Run-Sheng; Lee, Shih-Chang
2011-12-01
We present a Bayesian Neural Network algorithm implemented in the TMVA package (Hoecker et al., 2007 [1]), within the ROOT framework (Brun and Rademakers, 1997 [2]). Compared to the conventional use of a Neural Network as a discriminator, this new implementation offers advantages as a non-parametric regression tool, particularly for fitting probabilities. It provides functionalities including cost function selection, complexity control and uncertainty estimation. An example of such an application in High Energy Physics is shown. The algorithm is available with ROOT releases later than 5.29.
Program summary:
Program title: TMVA-BNN
Catalogue identifier: AEJX_v1_0
Program summary URL: http://cpc.cs.qub.ac.uk/summaries/AEJX_v1_0.html
Program obtainable from: CPC Program Library, Queen's University, Belfast, N. Ireland
Licensing provisions: BSD license
No. of lines in distributed program, including test data, etc.: 5094
No. of bytes in distributed program, including test data, etc.: 1,320,987
Distribution format: tar.gz
Programming language: C++
Computer: Any computer system or cluster with a C++ compiler and a UNIX-like operating system
Operating system: Most UNIX/Linux systems. The application programs were thoroughly tested under Fedora and Scientific Linux CERN.
Classification: 11.9
External routines: ROOT package version 5.29 or higher (http://root.cern.ch)
Nature of problem: Non-parametric fitting of multivariate distributions
Solution method: An implementation of a Neural Network following the Bayesian statistical interpretation. Uses the Laplace approximation for Bayesian marginalizations. Provides automatic complexity control and uncertainty estimation.
Running time: Time consumption for the training depends substantially on the size of the input sample, the NN topology, the number of training iterations, etc. For the example in this manuscript, about 7 min was used on a PC/Linux with 2.0 GHz processors.
On the evolution of statistical methods as applied to clinical trials.
Machin, D
2004-05-01
This paper describes how statistical methods have evolved in parallel with activities associated with randomized controlled trials. In particular, we emphasize the pivotal role of two papers published in the British Journal of Cancer, and the paper describing the Cox proportional hazards model. In addition, the importance of early papers on estimating the sample size required for trials is highlighted. Later developments, including the increasing roles of competing risks, multilevel modelling and Bayesian methodologies, are described. The interplay between computer software and statistical methodological developments is stressed. Finally, some future directions are indicated. PMID:15078495
Liu, Yang; He, Kebin; Li, Shenshen; Wang, Zhaoxi; Christiani, David C; Koutrakis, Petros
2012-09-01
A statistical model was developed using satellite remote sensing data and meteorological parameters to evaluate the effectiveness of air pollution control measures during the 2008 Beijing Olympic Games. Custom satellite retrievals under hazy conditions were included in the modeling dataset to represent the air pollution levels more accurately. This model explained 70% of the PM2.5 variability during the modeling period from June to October 2008. Using this tool, we estimate that the aggressive emission reduction measures alone effectively lowered PM2.5 levels by 20-24 μg/m³, or 27-33% on average, during the Games period, which is substantially greater than those reported previously. Since the parameters required to develop this model are readily available in most cities of the world, it can be quickly applied after other major events to evaluate air pollution control policy. PMID:22406019
Lekone, Phenyo E; Finkenstädt, Bärbel F
2006-12-01
A stochastic discrete-time susceptible-exposed-infectious-recovered (SEIR) model for infectious diseases is developed with the aim of estimating parameters from daily incidence and mortality time series for an outbreak of Ebola in the Democratic Republic of Congo in 1995. The incidence time series exhibit many low integers as well as zero counts requiring an intrinsically stochastic modeling approach. In order to capture the stochastic nature of the transitions between the compartmental populations in such a model we specify appropriate conditional binomial distributions. In addition, a relatively simple temporally varying transmission rate function is introduced that allows for the effect of control interventions. We develop Markov chain Monte Carlo methods for inference that are used to explore the posterior distribution of the parameters. The algorithm is further extended to integrate numerically over state variables of the model, which are unobserved. This provides a realistic stochastic model that can be used by epidemiologists to study the dynamics of the disease and the effect of control interventions. PMID:17156292
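The conditional-binomial transition structure of such a discrete-time SEIR model can be sketched directly. The parameter values below are illustrative, not the estimates obtained from the Ebola data, and the simple constant transmission rate omits the time-varying intervention term the paper introduces.

```python
import numpy as np

rng = np.random.default_rng(1)

def seir_step(S, E, I, R, beta, sigma, gamma, N, dt=1.0):
    """One discrete-time step with conditional binomial transitions:
    S->E with prob 1-exp(-beta*I/N*dt), E->I with prob 1-exp(-sigma*dt),
    I->R with prob 1-exp(-gamma*dt)."""
    p_exp = 1.0 - np.exp(-beta * I / N * dt)
    p_inf = 1.0 - np.exp(-sigma * dt)
    p_rec = 1.0 - np.exp(-gamma * dt)
    new_E = rng.binomial(S, p_exp)   # new exposures
    new_I = rng.binomial(E, p_inf)   # onset of infectiousness
    new_R = rng.binomial(I, p_rec)   # recoveries/removals
    return S - new_E, E + new_E - new_I, I + new_I - new_R, R + new_R

# Simulate a small outbreak: 5-day mean latent period, 7-day infectious period.
S, E, I, R = 995, 0, 5, 0
N = S + E + I + R
for day in range(120):
    S, E, I, R = seir_step(S, E, I, R, beta=0.3, sigma=1/5, gamma=1/7, N=N)
print(S, E, I, R)
```

Because the low counts make each transition genuinely random, repeated runs diverge; in the paper these binomial transition probabilities are exactly what the MCMC inference targets.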
NASA Astrophysics Data System (ADS)
Bhattacharyya, Sidhakam; Bandyopadhyay, Gautam
2010-10-01
The council of most Urban Local Bodies (ULBs) has limited scope for decision making in the absence of an appropriate financial control mechanism. Information about the expected amount of own fund during a particular period is of great importance for decision making. Therefore, in this paper, we present a set of findings and establish models for estimating receipts from own sources and payments thereof using multiple regression analysis. Data for sixty months from a reputed ULB in West Bengal have been considered for estimating the regression models. This can be used as part of a financial management and control procedure by the council to estimate the effect on own fund. In our study we have considered two models using multiple regression analysis. "Model I" comprises total adjusted receipts as the dependent variable and selected individual receipts as the independent variables. Similarly, "Model II" consists of total adjusted payments as the dependent variable and selected individual payments as the independent variables. The resultant of Model I and Model II is the surplus or deficit affecting own fund, which may be applied for decision-making purposes by the council.
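A minimal illustration of the "Model I" setup — total receipts regressed on individual receipt heads — using ordinary least squares on synthetic monthly data. All figures and receipt-head names are hypothetical, not the ULB's actual data.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical 60 months of data: two receipt heads (X) and total adjusted
# receipts (y), mirroring the "Model I" structure described above.
months = 60
x1 = rng.uniform(10, 50, months)   # e.g. property tax receipts
x2 = rng.uniform(5, 20, months)    # e.g. fees and user charges
y = 4.0 + 1.2 * x1 + 0.8 * x2 + rng.normal(0, 2, months)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(months), x1, x2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
print(coef)  # roughly recovers the true [4.0, 1.2, 0.8]
```

The fitted coefficients give the expected contribution of each receipt head; a parallel "Model II" regression on payment heads, subtracted from this one, yields the projected surplus or deficit on own fund.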
El-Say, Khalid M; El-Helw, Abdel-Rahim M; Ahmed, Osama A A; Hosny, Khaled M; Ahmed, Tarek A; Kharshoum, Rasha M; Fahmy, Usama A; Alsawahli, Majed
2015-01-01
The purpose was to improve the encapsulation efficiency of cetirizine hydrochloride (CTZ) microspheres as a model for water-soluble drugs and to control its release by applying response surface methodology. A 3³ Box-Behnken design was used to determine the effect of drug/polymer ratio (X1), surfactant concentration (X2) and stirring speed (X3) on the mean particle size (Y1), percentage encapsulation efficiency (Y2) and cumulative percent drug released over 12 h (Y3). The emulsion solvent evaporation (ESE) technique was applied, utilizing Eudragit RS100 as the coating polymer and Span 80 as the surfactant. All formulations were evaluated for micromeritic properties and morphologically characterized by scanning electron microscopy (SEM). The relative bioavailability of the optimized microspheres was compared with a marketed CTZ product after oral administration in healthy human volunteers using a double-blind, randomized, cross-over design. The results revealed that the mean particle sizes of the microspheres ranged from 62 to 348 µm and the entrapment efficiency ranged from 36.3% to 70.1%. The optimized CTZ microspheres exhibited a slow and controlled release over 12 h. The pharmacokinetic data of the optimized CTZ microspheres showed prolonged tmax, decreased Cmax and an AUC0-∞ value of 3309 ± 211 ng·h/ml, indicating improved relative bioavailability of 169.4% compared with marketed tablets. PMID:24856961
Ung, N M; Wee, L
2011-12-01
Portal imaging of implanted fiducial markers has been in use for image-guided radiotherapy (IGRT) of prostate cancer, with ample attention to localization accuracy and organ motion. The geometric uncertainties in point-based rigid-body matching algorithms during localization of prostate fiducial markers can be quantified in terms of a fiducial registration error (FRE). In this study, the aim is to demonstrate how statistical process control (SPC) can be used to intercept potential problems with rigid-body matching algorithms in a retrospective study of FRE for a pilot cohort of 34 patients with fiducial markers. A procedure for estimating control parameters of a SPC control chart (x-chart) from a small number of initial observations (N) of FRE was implemented. The sensitivity analysis of N on the number of 'in-control' and 'out-of-control' x-charts was also performed. Uncorrected rotational offsets of an individual patient were examined to elucidate possible correlations with the behaviours of an x-chart. Four specific types of qualitative x-chart behaviour have been observed. The number of out-of-control processes was insensitive to the choice of N, provided N ≥ 5. Residual errors of rigid-body registration were contributed from uncorrected rotational offsets in 5 out of 15 'out-of-control' x-charts. Out-of-control x-charts were also shown to be correlated with potential changes in the IGRT processes, which may compromise the quality of the radiation treatment delivery. The SPC methodology, implemented in the form of individually customized x-charts, has been shown to be a useful tool for monitoring process reliability during fiducial-based IGRT for prostate cancer. PMID:22080792
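An individuals (x) chart can be parameterized from a small number of initial observations via the average moving range (CL = mean, limits = mean ± 2.66·MRbar). This is the standard moving-range construction, which may differ from the authors' exact estimation procedure, and the FRE values below are hypothetical.

```python
import numpy as np

def x_chart_limits(obs):
    """Individuals (x) chart limits from initial observations, using the
    average moving range: CL = mean, UCL/LCL = mean +/- 2.66 * MRbar,
    where 2.66 = 3/d2 with d2 = 1.128 for moving ranges of size 2."""
    obs = np.asarray(obs, dtype=float)
    mr_bar = np.mean(np.abs(np.diff(obs)))   # average moving range
    center = obs.mean()
    return center - 2.66 * mr_bar, center, center + 2.66 * mr_bar

# Hypothetical fiducial registration errors (mm) from the first N = 8 fractions.
fre = [0.8, 1.1, 0.9, 1.3, 1.0, 0.7, 1.2, 0.9]
lcl, cl, ucl = x_chart_limits(fre)
print(lcl, cl, ucl)
# A later FRE falling outside (lcl, ucl) would flag a potentially
# out-of-control registration process for investigation.
```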
Organism-level models: When mechanisms and statistics fail us
NASA Astrophysics Data System (ADS)
Phillips, M. H.; Meyer, J.; Smith, W. P.; Rockhill, J. K.
2014-03-01
Purpose: To describe the unique characteristics of models that represent the entire course of radiation therapy at the organism level and to highlight the uses to which such models can be put. Methods: At the level of an organism, traditional model-building runs into severe difficulties. We do not have sufficient knowledge to devise a complete biochemistry-based model. Statistical model-building fails due to the vast number of variables and the inability to control many of them in any meaningful way. Finally, building surrogate models, such as animal-based models, can result in excluding some of the most critical variables. Bayesian probabilistic models (Bayesian networks) provide a useful alternative that has the advantages of being mathematically rigorous, incorporating the knowledge that we do have, and being practical. Results: Bayesian networks representing radiation therapy pathways for prostate cancer and head & neck cancer were used to highlight the important aspects of such models and some techniques of model-building. A more specific model representing the treatment of occult lymph nodes in head & neck cancer was provided as an example of how such a model can inform clinical decisions. A model of the possible role of PET imaging in brain cancer was used to illustrate the means by which clinical trials can be modelled in order to arrive at a trial design that will have meaningful outcomes. Conclusions: Probabilistic models are currently the most useful approach to representing the entire therapy outcome process.
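Exact inference in a small Bayesian network of the kind described can be done by enumeration over the unobserved variables. The toy network below (occult nodal disease, elective treatment, recurrence) and all of its conditional probabilities are invented for illustration; it is not the paper's model.

```python
# Toy CPTs (all probabilities invented for illustration).
p_occult = {True: 0.25, False: 0.75}   # P(occult nodal disease)
p_treat = {True: 0.5, False: 0.5}      # P(elective treatment)
# P(recurrence | occult, treated)
p_rec = {(True, True): 0.10, (True, False): 0.40,
         (False, True): 0.05, (False, False): 0.05}

def posterior_occult(recurred):
    """P(occult | recurrence evidence) by exhaustive enumeration:
    sum out the treatment variable, then normalise."""
    weights = {}
    for occ in (True, False):
        total = 0.0
        for tr in (True, False):
            pr = p_rec[(occ, tr)]
            like = pr if recurred else 1 - pr
            total += p_occult[occ] * p_treat[tr] * like
        weights[occ] = total
    z = sum(weights.values())
    return {k: v / z for k, v in weights.items()}

post = posterior_occult(recurred=True)
print(round(post[True], 3))  # → 0.625
```

Real clinical networks have many more nodes, at which point enumeration is replaced by junction-tree or sampling-based inference, but the probabilistic logic is the same.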
Day, N E; Byar, D P
1979-09-01
The two approaches in common use for the analysis of case-control studies are cross-classification by confounding variables, and modeling the logarithm of the odds ratio as a function of exposure and confounding variables. We show here that score statistics derived from the likelihood function in the latter approach are identical to the Mantel-Haenszel test statistics appropriate for the former approach. This identity holds in the most general situation considered, testing for marginal homogeneity in mK tables. This equivalence is demonstrated by a permutational argument which leads to a general likelihood expression in which the exposure variable may be a vector of discrete and/or continuous variables and in which more than two comparison groups may be considered. This likelihood can be used in analyzing studies in which there are multiple controls for each case or in which several disease categories are being compared. The possibility of including continuous variables makes this likelihood useful in situations that cannot be treated using the Mantel-Haenszel cross-classification approach. PMID:497345
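The Mantel-Haenszel test statistic for a series of 2×2 tables mentioned above can be computed directly from the stratum counts. A minimal sketch (continuity correction omitted; the table values are invented):

```python
def mantel_haenszel_chi2(tables):
    """Mantel-Haenszel chi-square for 2x2 tables given as (a, b, c, d) =
    (exposed cases, unexposed cases, exposed controls, unexposed controls)
    per stratum; no continuity correction."""
    num = 0.0   # sum of observed-minus-expected counts for cell a
    var = 0.0   # sum of hypergeometric variances
    for a, b, c, d in tables:
        n = a + b + c + d
        r1, r2 = a + b, c + d     # row totals
        c1, c2 = a + c, b + d     # column totals
        num += a - r1 * c1 / n
        var += r1 * r2 * c1 * c2 / (n * n * (n - 1))
    return num * num / var

tables = [(10, 5, 4, 10), (6, 6, 3, 8)]   # two illustrative strata
stat = mantel_haenszel_chi2(tables)
print(round(stat, 2))
```

The equivalence shown in the paper means this same number arises as a score test from the conditional logistic likelihood, which is why the model-based approach subsumes the cross-classification one.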
Shaviv, Avi; Raban, Smadar; Zaidel, Elina
2003-05-15
A statistically based model for describing the release from a population of polymer coated controlled release fertilizer (CRF) granules by the diffusion mechanism was constructed. The model is based on a mathematical-mechanistic description of the release from a single granule of a coated CRF accounting for its complex and nonlinear nature. The large variation within populations of coated CRFs poses the need for a statistically based approach to integrate over the release from the individual granules within a given population for which the distribution and range of granule radii and coating thickness are known. The model was constructed and verified using experimentally determined parameters and release curves of polymer-coated CRFs. A sensitivity analysis indicated the importance of water permeability in controlling the lag period and that of solute permeability in governing the rate of linear release and the total duration of the release. Increasing the mean values of normally distributed granule radii or coating thickness increases the lag period and the period of linear release. The variation of radii and coating thickness, within realistic ranges, affects the release only when the standard deviation is very large or when water permeability is reduced without affecting solute permeability. The model provides an effective tool for designing and improving agronomic and environmental effectiveness of polymer-coated CRFs. PMID:12785533
Bayesian design strategies for synthetic biology
Barnes, Chris P.; Silk, Daniel; Stumpf, Michael P. H.
2011-01-01
We discuss how statistical inference techniques can be applied in the context of designing novel biological systems. Bayesian techniques have found widespread application and acceptance in the systems biology community, where they are used for both parameter estimation and model selection. Here we show that the same approaches can also be used in order to engineer synthetic biological systems by inferring the structure and parameters that are most likely to give rise to the dynamics that we require a system to exhibit. Problems that are shared between applications in systems and synthetic biology include the vast potential spaces that need to be searched for suitable models and model parameters; the complex forms of likelihood functions; and the interplay between noise at the molecular level and nonlinearity in the dynamics owing to often complex feedback structures. In order to meet these challenges, we have to develop suitable inferential tools and here, in particular, we illustrate the use of approximate Bayesian computation and unscented Kalman filtering-based approaches. These partly complementary methods allow us to tackle a number of recurring problems in the design of biological systems. After a brief exposition of these two methodologies, we focus on their application to oscillatory systems. PMID:23226588
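Rejection-sampling approximate Bayesian computation, one of the two methods named, can be sketched in a few lines for a toy Gaussian-mean problem. All names, priors and tolerances here are illustrative, not the authors' implementation:

```python
import random

def abc_rejection(observed_mean, n_obs, prior_draw, n_sims, eps):
    """Rejection ABC: draw a parameter from the prior, simulate data,
    and keep the draw if the simulated summary statistic lands within
    eps of the observed one."""
    accepted = []
    for _ in range(n_sims):
        theta = prior_draw()
        sim = [random.gauss(theta, 1.0) for _ in range(n_obs)]
        if abs(sum(sim) / n_obs - observed_mean) < eps:
            accepted.append(theta)
    return accepted

random.seed(1)
post = abc_rejection(observed_mean=2.0, n_obs=30,
                     prior_draw=lambda: random.uniform(-5, 5),
                     n_sims=2000, eps=0.3)
print(len(post), sum(post) / len(post))
```

In design applications the "observed" summary is replaced by the desired behaviour (e.g. an oscillation period), so the accepted draws describe the parameter region most likely to produce the required dynamics.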
Bayesian response adaptive randomization using longitudinal outcomes.
Hatayama, Tomoyoshi; Morita, Satoshi; Sakamaki, Kentaro
2015-01-01
The response adaptive randomization (RAR) method is used to increase the number of patients assigned to more efficacious treatment arms in clinical trials. In many trials evaluating longitudinal patient outcomes, RAR methods based only on the final measurement may not benefit significantly from RAR because of its delayed initiation. We propose a Bayesian RAR method to improve RAR performance by accounting for longitudinal patient outcomes (longitudinal RAR). We use a Bayesian linear mixed effects model to analyze longitudinal continuous patient outcomes for calculating a patient allocation probability. In addition, we aim to mitigate the loss of statistical power because of large patient allocation imbalances by embedding adjusters into the patient allocation probability calculation. Using extensive simulation we compared the operating characteristics of our proposed longitudinal RAR method with those of the RAR method based only on the final measurement and with an equal randomization method. Simulation results showed that our proposed longitudinal RAR method assigned more patients to the presumably superior treatment arm compared with the other two methods. In addition, the embedded adjuster effectively worked to prevent extreme patient allocation imbalances. However, our proposed method may not function adequately when the treatment effect difference is moderate or less, and still needs to be modified to deal with unexpectedly large departures from the presumed longitudinal data model. PMID:26099995
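A minimal sketch of a Bayesian RAR allocation rule for binary outcomes: sample the two posterior success rates (Beta posteriors under uniform priors), estimate P(p_A > p_B), and damp early imbalance with a power-transform adjuster. The exponent c = n/(2N) is a common illustrative choice, not necessarily the adjuster proposed in the paper, and the longitudinal mixed-model component is omitted here:

```python
import random

def rar_allocation(succ_a, n_a, succ_b, n_b, n_total, draws=5000, seed=0):
    """Allocation probability for arm A under a simple Bayesian RAR rule
    with a stabilising power-transform adjuster (illustrative form)."""
    rng = random.Random(seed)
    wins = 0
    for _ in range(draws):
        # Beta(1 + successes, 1 + failures) posteriors under uniform priors.
        pa = rng.betavariate(1 + succ_a, 1 + n_a - succ_a)
        pb = rng.betavariate(1 + succ_b, 1 + n_b - succ_b)
        wins += pa > pb
    prob = wins / draws                 # estimate of P(p_A > p_B | data)
    c = (n_a + n_b) / (2 * n_total)    # adjuster: small early, grows to 1/2
    w_a, w_b = prob ** c, (1 - prob) ** c
    return w_a / (w_a + w_b)

p = rar_allocation(succ_a=12, n_a=20, succ_b=7, n_b=20, n_total=100)
print(p)
```

The paper's longitudinal extension replaces the Beta posterior with one from a Bayesian linear mixed model fitted to the interim repeated measurements, which is what allows adaptation to begin before final outcomes accrue.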
Stellar Parameter Determination Using Bayesian Techniques.
NASA Astrophysics Data System (ADS)
Ekanayake, Gemunu B.; Wilhelm, Ronald J.
2015-01-01
Spectral energy distributions of stars covering the wavelength range from the far UV to the far IR can be used to derive stellar atmospheric parameters (effective temperature, surface gravity and iron abundance) with high reliability. For this purpose we are using a method based on Bayesian statistics, which makes use of all available photometric data for a given star to construct a stellar parameter probability distribution function (PDF) in order to determine the expectation values of the stellar parameters and their uncertainties. The marginalized probabilities allow us to characterize the constraint on each parameter and estimate the influence of the quantity and quality of the photometric data on the resulting parameter values. We have obtained low resolution spectroscopy of blue horizontal branch, blue straggler and normal main sequence A, B, G and F stellar parameter standard stars using the McDonald Observatory 2.1 m telescope to constrain both synthetic and empirical stellar libraries such as Atlas9, MARCS, MILES and Pickles across a wide range in parameter space. This calibration process helps to evaluate the correlations between different stellar libraries and observed data, especially in the UV part of the spectrum. When the calibration is complete, the Bayesian analysis can be applied to large samples of data from GALEX, SDSS, 2MASS, WISE, etc. We expect significant improvements to luminosity classification, distances and interstellar extinction using this technique.
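The grid-based construction of a parameter PDF from photometry can be illustrated with a one-parameter toy: a flat prior over a Teff grid and a Gaussian likelihood for a single colour index. The `model_colour` relation below is invented, not a real calibration, and marginalisation over the other parameters is omitted in this one-parameter sketch:

```python
import math

def model_colour(teff):
    """Hypothetical model grid: predicted colour index vs Teff (K)."""
    return 10000.0 / teff - 0.5   # toy relation, not a real calibration

def posterior_teff(obs_colour, sigma, grid):
    """Unnormalised Gaussian likelihood on a Teff grid under a flat prior,
    normalised so the returned probabilities sum to 1."""
    like = [math.exp(-0.5 * ((model_colour(t) - obs_colour) / sigma) ** 2)
            for t in grid]
    z = sum(like)
    return [l / z for l in like]

grid = list(range(4000, 9001, 100))
pdf = posterior_teff(obs_colour=1.0, sigma=0.05, grid=grid)
best = grid[pdf.index(max(pdf))]
print(best)
```

With several photometric bands the per-band likelihoods multiply, and the expectation value and spread of this PDF give the quoted parameter estimate and uncertainty.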
Bayesian statistical techniques have proven useful in clinical and environmental epidemiological applications to evaluate and integrate available information, and in regulatory applications such as the National Ambient Air Quality Assessment for Nitrogen Oxides. A recent special...
Statistical analysis of failure data on controllers and SSME turbine blade failures
NASA Technical Reports Server (NTRS)
Patil, S. A.
1986-01-01
The expressions for the maximum likelihood functions are given when the failure data are censored at a given point or at multiple points, or when the data come in groups. Different models applicable to failure data are presented with their characteristics. A graphical method of distinguishing different models by using the cumulative hazard function is discussed. For the failure data on controllers, the model is determined by the cumulative hazard function and a chi-square goodness-of-fit test. Using the Weibull model, the maximum likelihood estimators of the shape parameter and the failure rate parameter are obtained. The confidence intervals, mean time between failures, and B1 are determined. Similarly, for the data on Space Shuttle Main Engine (SSME) blade failures, the maximum likelihood estimators are obtained for the Weibull parameters. The variances, confidence intervals, mean time between failures, and reliability are determined. The analysis is performed under the assumption of grouped data as well as randomly placed data.
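Weibull maximum likelihood estimation with right-censored failure times, of the kind applied to the controller and turbine-blade data, reduces to a one-dimensional root-find for the shape parameter. A sketch with invented failure times, solving the standard profile-likelihood equation by bisection:

```python
import math

def weibull_mle(failures, censored=()):
    """Weibull shape/scale MLE with right-censored times. The shape k
    solves sum(t^k ln t)/sum(t^k) - 1/k - mean(ln t_fail) = 0, where the
    sums run over all times and the mean over failures only; this g(k)
    is increasing in k, so bisection applies."""
    allt = list(failures) + list(censored)
    r = len(failures)
    sum_log_f = sum(math.log(t) for t in failures)

    def g(k):
        s = sum(t ** k for t in allt)
        s_log = sum(t ** k * math.log(t) for t in allt)
        return s_log / s - 1.0 / k - sum_log_f / r

    lo, hi = 0.01, 50.0
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        if g(mid) > 0:
            hi = mid
        else:
            lo = mid
    shape = 0.5 * (lo + hi)
    scale = (sum(t ** shape for t in allt) / r) ** (1.0 / shape)
    return shape, scale

# Illustrative data: five observed failures, two units still running at 700 h.
shape, scale = weibull_mle([105, 240, 330, 450, 610], censored=[700, 700])
print(shape, scale)
```

The mean time between failures then follows as scale × Γ(1 + 1/shape), and confidence intervals come from the observed information at the MLE.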
Fortune, Mary D.; Guo, Hui; Burren, Oliver; Schofield, Ellen; Walker, Neil M.; Ban, Maria; Sawcer, Stephen J.; Bowes, John; Worthington, Jane; Barton, Ann; Eyre, Steve; Todd, John A.; Wallace, Chris
2015-01-01
Identifying whether potential causal variants for related diseases are shared can identify overlapping etiologies of multifactorial disorders. Colocalization methods disentangle shared and distinct causal variants. However, existing approaches require independent datasets. Here we extend two colocalization methods to allow for the shared control design commonly used in comparison of genome-wide association study results across diseases. Our analysis of four autoimmune diseases, type 1 diabetes (T1D), rheumatoid arthritis, celiac disease and multiple sclerosis, revealed 90 regions that were associated with at least one disease, 33 (37%) of which with two or more disorders. Nevertheless, for 14 of these 33 shared regions there was evidence that causal variants differed. We identified novel disease associations in 11 regions previously associated with one or more of the other three disorders. Four of eight T1D-specific regions contained known type 2 diabetes candidate genes: COBL, GLIS3, RNLS and BCAR1, suggesting a shared cellular etiology. PMID:26053495
Bayesian methods for the design and interpretation of clinical trials in very rare diseases
Hampson, Lisa V; Whitehead, John; Eleftheriou, Despina; Brogan, Paul
2014-01-01
This paper considers the design and interpretation of clinical trials comparing treatments for conditions so rare that worldwide recruitment efforts are likely to yield total sample sizes of 50 or fewer, even when patients are recruited over several years. For such studies, the sample size needed to meet a conventional frequentist power requirement is clearly infeasible. Rather, the expectation of any such trial has to be limited to the generation of an improved understanding of treatment options. We propose a Bayesian approach for the conduct of rare-disease trials comparing an experimental treatment with a control where patient responses are classified as a success or failure. A systematic elicitation from clinicians of their beliefs concerning treatment efficacy is used to establish Bayesian priors for unknown model parameters. The process of determining the prior is described, including the possibility of formally considering results from related trials. As sample sizes are small, it is possible to compute all possible posterior distributions of the two success rates. A number of allocation ratios between the two treatment groups can be considered with a view to maximising the prior probability that the trial concludes recommending the new treatment when in fact it is non-inferior to control. Consideration of the extent to which opinion can be changed, even by data from the best feasible design, can help to determine whether such a trial is worthwhile. © 2014 The Authors. Statistics in Medicine published by John Wiley & Sons, Ltd. PMID:24957522
Bayesian Model Selection for Group Studies
Stephan, Klaas Enno; Penny, Will D.; Daunizeau, Jean; Moran, Rosalyn J.; Friston, Karl J.
2009-01-01
Bayesian model selection (BMS) is a powerful method for determining the most likely among a set of competing hypotheses about the mechanisms that generated observed data. BMS has recently found widespread application in neuroimaging, particularly in the context of dynamic causal modelling (DCM). However, so far, combining BMS results from several subjects has relied on simple (fixed effects) metrics, e.g. the group Bayes factor (GBF), that do not account for group heterogeneity or outliers. In this paper, we compare the GBF with two random effects methods for BMS at the between-subject or group level. These methods provide inference on model-space using a classical and Bayesian perspective respectively. First, a classical (frequentist) approach uses the log model evidence as a subject-specific summary statistic. This enables one to use analysis of variance to test for differences in log-evidences over models, relative to inter-subject differences. We then consider the same problem in Bayesian terms and describe a novel hierarchical model, which is optimised to furnish a probability density on the models themselves. This new variational Bayes method rests on treating the model as a random variable and estimating the parameters of a Dirichlet distribution which describes the probabilities for all models considered. These probabilities then define a multinomial distribution over model space, allowing one to compute how likely it is that a specific model generated the data of a randomly chosen subject as well as the exceedance probability of one model being more likely than any other model. Using empirical and synthetic data, we show that optimising a conditional density of the model probabilities, given the log-evidences for each model over subjects, is more informative and appropriate than both the GBF and frequentist tests of the log-evidences. In particular, we found that the hierarchical Bayesian approach is considerably more robust than either of the other
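The exceedance probability defined above, the probability that one model is more likely than any other, can be estimated by sampling from the fitted Dirichlet. A sketch with illustrative Dirichlet parameters (a Dirichlet draw is obtained from normalised Gamma variates; for the argmax the normalisation cancels, so it is skipped):

```python
import random

def exceedance_probs(alpha, draws=20000, seed=42):
    """Monte Carlo exceedance probabilities for a Dirichlet over model
    space: P(model k has the largest model frequency)."""
    rng = random.Random(seed)
    wins = [0] * len(alpha)
    for _ in range(draws):
        # Gamma(alpha_k, 1) draws; normalising would give a Dirichlet
        # sample, but the argmax is unchanged, so we skip the divide.
        g = [rng.gammavariate(a, 1.0) for a in alpha]
        wins[g.index(max(g))] += 1
    return [w / draws for w in wins]

# Dirichlet parameters as might be returned by variational BMS (illustrative).
xp = exceedance_probs([8.0, 3.0, 1.0])
print(xp)
```

By construction the exceedance probabilities sum to one, which makes them directly comparable across any number of candidate models.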
Transit-Depth Metallicity Correlation: A Bayesian Approach
NASA Astrophysics Data System (ADS)
Sarkis, P.; Nehmé, C.
2015-12-01
A negative correlation was previously reported between the transit depth of Kepler's Q1-Q12 gas giant candidates and the stellar metallicity. In the present work, we revisit this correlation to better understand the role of the stellar metallicity in the formation of giant planets, in particular, to investigate the effect of the metallicity on the transit depth. We selected the 82 confirmed giant planets from the cumulative catalogue. This is the first large and homogeneous sample of confirmed giant planets used to study this correlation. Such samples are suitable for performing robust statistical analysis. We present the first hierarchical Bayesian linear regression model to revise this correlation. The advantages of using a Bayesian framework are to incorporate measurement errors in the model and to quantify both the intrinsic scatter and the uncertainties on the parameters of the model. Our statistical analysis reveals no correlation between the transit depth of confirmed giant planets and the stellar metallicity.
Bayesian shared frailty models for regional inference about wildlife survival
Heisey, D.M.
2012-01-01
One can joke that 'exciting statistics' is an oxymoron, but it is neither a joke nor an exaggeration to say that these are exciting times to be involved in statistical ecology. As Halstead et al.'s (2012) paper nicely exemplifies, recently developed Bayesian analyses can now be used to extract insights from data using techniques that would have been unavailable to the ecological researcher just a decade ago. Some object to this, implying that the subjective priors of the Bayesian approach are the pathway to perdition (e.g. Lele & Dennis, 2009). It is reasonable to ask whether these new approaches are really giving us anything that we could not obtain with traditional tried-and-true frequentist approaches. I believe the answer is a clear yes.
ERIC Educational Resources Information Center
Meijer, Rob R.; van Krimpen-Stoop, Edith M. L. A.
In this study a cumulative-sum (CUSUM) procedure from the theory of Statistical Process Control was modified and applied in the context of person-fit analysis in a computerized adaptive testing (CAT) environment. Six person-fit statistics were proposed using the CUSUM procedure, and three of them could be used to investigate the CAT in online test…
A Bayesian Estimator of Protein-Protein Association Probabilities
Gilmore, Jason M.; Auberry, Deanna L.; Sharp, Julia L.; White, Amanda M.; Anderson, Kevin K.; Daly, Don S.
2008-07-01
The Bayesian Estimator of Protein-Protein Association Probabilities (BEPro3) is a software tool for estimating probabilities of protein-protein association between bait and prey protein pairs using data from multiple-bait, multiple-replicate, protein pull-down LC-MS assay experiments. BEPro3 is open source software that runs on both Windows XP and Mac OS 10.4 or newer versions, and is freely available from http://www.pnl.gov/statistics/BEPro3.
Goodrich, Jaclyn M; Sánchez, Brisa N; Dolinoy, Dana C; Zhang, Zhenzhen; Hernández-Ávila, Mauricio; Hu, Howard; Peterson, Karen E; Téllez-Rojo, Martha M
2015-01-01
DNA methylation data assayed using pyrosequencing techniques are increasingly being used in human cohort studies to investigate associations between epigenetic modifications at candidate genes and exposures to environmental toxicants and to examine environmentally-induced epigenetic alterations as a mechanism underlying observed toxicant-health outcome associations. For instance, in utero lead (Pb) exposure is a neurodevelopmental toxicant of global concern that has also been linked to altered growth in human epidemiological cohorts; a potential mechanism of this association is through alteration of DNA methylation (e.g., at growth-related genes). However, because the associations between toxicants and DNA methylation might be weak, using appropriate quality control and statistical methods is important to increase reliability and power of such studies. Using a simulation study, we compared potential approaches to estimate toxicant-DNA methylation associations that varied by how methylation data were analyzed (repeated measures vs. averaging all CpG sites) and by method to adjust for batch effects (batch controls vs. random effects). We demonstrate that correcting for batch effects using plate controls yields unbiased associations, and that explicitly modeling the CpG site-specific variances and correlations among CpG sites increases statistical power. Using the recommended approaches, we examined the association between DNA methylation (in LINE-1 and growth related genes IGF2, H19 and HSD11B2) and 3 biomarkers of Pb exposure (Pb concentrations in umbilical cord blood, maternal tibia, and maternal patella), among mother-infant pairs of the Early Life Exposures in Mexico to Environmental Toxicants (ELEMENT) cohort (n = 247). Those with 10 μg/g higher patella Pb had, on average, 0.61% higher IGF2 methylation (P = 0.05). Sex-specific trends between Pb and DNA methylation (P < 0.1) were observed among girls including a 0.23% increase in HSD11B2 methylation with 10
NASA Astrophysics Data System (ADS)
Rajabi, Mohammad Mahdi; Ataie-Ashtiani, Behzad
2016-05-01
Bayesian inference has traditionally been conceived as the proper framework for the formal incorporation of expert knowledge in parameter estimation of groundwater models. However, conventional Bayesian inference is incapable of taking into account the imprecision inherently embedded in expert-provided information. In order to solve this problem, a number of extensions to conventional Bayesian inference have been introduced in recent years. One of these extensions is 'fuzzy Bayesian inference', which is the result of integrating fuzzy techniques into Bayesian statistics. Fuzzy Bayesian inference has a number of desirable features which make it an attractive approach for incorporating expert knowledge in the parameter estimation process of groundwater models: (1) it is well adapted to the nature of expert-provided information, (2) it allows uncertainty and imprecision to be modelled distinctly, and (3) it presents a framework for fusing expert-provided information regarding the various inputs of the Bayesian inference algorithm. However, an important obstacle to employing fuzzy Bayesian inference in groundwater numerical modeling applications is the computational burden, as the required number of numerical model simulations often becomes prohibitively large and computationally infeasible. In this paper, a novel approach to accelerating the fuzzy Bayesian inference algorithm is proposed which is based on using approximate posterior distributions derived from surrogate modeling as a screening tool in the computations. The proposed approach is first applied to a synthetic test case of seawater intrusion (SWI) in a coastal aquifer. It is shown that for this synthetic test case, the proposed approach decreases the number of required numerical simulations by an order of magnitude. Then the proposed approach is applied to a real-world test case involving three-dimensional numerical modeling of SWI in Kish Island, located in the Persian Gulf. An expert
Variational Bayesian method for Retinex.
Wang, Liqian; Xiao, Liang; Liu, Hongyi; Wei, Zhihui
2014-08-01
In this paper, we propose a variational Bayesian method for Retinex to simulate and interpret how the human visual system perceives color. To construct a hierarchical Bayesian model, we use the Gibbs distributions as prior distributions for the reflectance and the illumination, and the gamma distributions for the model parameters. By assuming that the reflection function is piecewise continuous and illumination function is spatially smooth, we define the energy functions in the Gibbs distributions as a total variation function and a smooth function for the reflectance and the illumination, respectively. We then apply the variational Bayes approximation to obtain the approximation of the posterior distribution of unknowns so that the unknown images and hyperparameters are estimated simultaneously. Experimental results demonstrate the efficiency of the proposed method for providing competitive performance without additional information about the unknown parameters, and when prior information is added the proposed method outperforms the non-Bayesian-based Retinex methods we compared. PMID:24846606
Bayesian QTL mapping using skewed Student-t distributions
von Rohr, Peter; Hoeschele, Ina
2002-01-01
In most QTL mapping studies, phenotypes are assumed to follow normal distributions. Deviations from this assumption may lead to detection of false positive QTL. To improve the robustness of Bayesian QTL mapping methods, the normal distribution for residuals is replaced with a skewed Student-t distribution. The latter distribution is able to account for both heavy tails and skewness, and both components are each controlled by a single parameter. The Bayesian QTL mapping method using a skewed Student-t distribution is evaluated with simulated data sets under five different scenarios of residual error distributions and QTL effects. PMID:11929622
Bayesian planet searches in radial velocity data
NASA Astrophysics Data System (ADS)
Gregory, Phil
2015-08-01
Intrinsic stellar variability caused by magnetic activity and convection has become the main limiting factor for planet searches in both transit and radial velocity (RV) data. New spectrographs are under development, like ESPRESSO and EXPRES, that aim to improve RV precision by a factor of approximately 100 over the current best spectrographs, HARPS and HARPS-N. This will greatly exacerbate the challenge of distinguishing planetary signals from stellar activity induced RV signals. At the same time, good progress has been made in simulating stellar activity signals. At the Porto 2014 meeting, “Towards Other Earths II,” Xavier Dumusque challenged the community to a large-scale blind test using simulated RV data to understand the limitations of present solutions for dealing with stellar signals and to select the best approach. My talk will focus on some of the statistical lessons learned from this challenge, with an emphasis on Bayesian methodology.
2012-01-01
Background We carried out a candidate gene association study in pediatric acute lymphoblastic leukemia (ALL) to identify possible genetic risk factors in a Hungarian population. Methods The results were evaluated with traditional statistical methods and with our newly developed Bayesian network based Bayesian multilevel analysis of relevance (BN-BMLA) method. We collected genomic DNA and clinical data from 543 children, who underwent chemotherapy due to ALL, and 529 healthy controls. Altogether 66 single nucleotide polymorphisms (SNPs) in 19 candidate genes were genotyped. Results With logistic regression, we identified 6 SNPs in the ARID5B and IKZF1 genes associated with increased risk of B-cell ALL, and two SNPs in the STAT3 gene, which decreased the risk of hyperdiploid ALL. Because the associated SNPs were in linkage in each gene, these associations corresponded to one signal per gene. The odds ratios (OR) associated with the tag SNPs were: OR = 1.69, P = 2.22×10^-7 for rs4132601 (IKZF1), OR = 1.53, P = 1.95×10^-5 for rs10821936 (ARID5B) and OR = 0.64, P = 2.32×10^-4 for rs12949918 (STAT3). With the BN-BMLA we confirmed the findings of the frequentist-based method and obtained additional information about the nature of the relations between the SNPs and the disease. For example, rs10821936 in ARID5B and rs17405722 in STAT3 showed a weak interaction, and in the T-cell lineage sample group, gender showed a weak interaction with three SNPs in three genes. In the hyperdiploid patient group the BN-BMLA detected a strong interaction among SNPs in the NOTCH1, STAT1, STAT3 and BCL2 genes. Evaluating the survival rate of the patients with ALL, the BN-BMLA showed that besides risk groups and subtypes, genetic variations in the BAX and CEBPA genes might also influence the probability of survival of the patients. Conclusions In the present study we confirmed the roles of genetic variations in ARID5B and IKZF1 in the susceptibility to B-cell ALL
Bayesian phylogenetic estimation of fossil ages.
Drummond, Alexei J; Stadler, Tanja
2016-07-19
Recent advances have allowed for both morphological fossil evidence and molecular sequences to be integrated into a single combined inference of divergence dates under the rule of Bayesian probability. In particular, the fossilized birth-death tree prior and the Lewis-Mk model of discrete morphological evolution allow for the estimation of both divergence times and phylogenetic relationships between fossil and extant taxa. We exploit this statistical framework to investigate the internal consistency of these models by producing phylogenetic estimates of the age of each fossil in turn, within two rich and well-characterized datasets of fossil and extant species (penguins and canids). We find that the estimation accuracy of fossil ages is generally high with credible intervals seldom excluding the true age and median relative error in the two datasets of 5.7% and 13.2%, respectively. The median relative standard error (RSD) was 9.2% and 7.2%, respectively, suggesting good precision, although with some outliers. In fact, in the two datasets we analyse, the phylogenetic estimate of fossil age is on average less than 2 Myr from the mid-point age of the geological strata from which it was excavated. The high level of internal consistency found in our analyses suggests that the Bayesian statistical model employed is an adequate fit for both the geological and morphological data, and provides evidence from real data that the framework used can accurately model the evolution of discrete morphological traits coded from fossil and extant taxa. We anticipate that this approach will have diverse applications beyond divergence time dating, including dating fossils that are temporally unconstrained, testing of the 'morphological clock', and for uncovering potential model misspecification and/or data errors when controversial phylogenetic hypotheses are obtained based on combined divergence dating analyses.This article is part of the themed issue 'Dating species divergences using
Bayesian phylogenetic estimation of fossil ages
Drummond, Alexei J.; Stadler, Tanja
2016-01-01
Recent advances have allowed for both morphological fossil evidence and molecular sequences to be integrated into a single combined inference of divergence dates under the rule of Bayesian probability. In particular, the fossilized birth–death tree prior and the Lewis-Mk model of discrete morphological evolution allow for the estimation of both divergence times and phylogenetic relationships between fossil and extant taxa. We exploit this statistical framework to investigate the internal consistency of these models by producing phylogenetic estimates of the age of each fossil in turn, within two rich and well-characterized datasets of fossil and extant species (penguins and canids). We find that the estimation accuracy of fossil ages is generally high, with credible intervals seldom excluding the true age and median relative error in the two datasets of 5.7% and 13.2%, respectively. The median relative standard error (RSD) was 9.2% and 7.2%, respectively, suggesting good precision, although with some outliers. In fact, in the two datasets we analyse, the phylogenetic estimate of fossil age is on average less than 2 Myr from the mid-point age of the geological strata from which it was excavated. The high level of internal consistency found in our analyses suggests that the Bayesian statistical model employed is an adequate fit for both the geological and morphological data, and provides evidence from real data that the framework used can accurately model the evolution of discrete morphological traits coded from fossil and extant taxa. We anticipate that this approach will have diverse applications beyond divergence time dating, including dating fossils that are temporally unconstrained, testing of the 'morphological clock', and for uncovering potential model misspecification and/or data errors when controversial phylogenetic hypotheses are obtained based on combined divergence dating analyses. This article is part of the themed issue 'Dating species divergences…
Turnidge, John; Bordash, Gerry
2007-01-01
Quality control (QC) ranges for antimicrobial agents against QC strains for both dilution and disk diffusion testing are currently set by the Clinical and Laboratory Standards Institute (CLSI), using data gathered in predefined structured multilaboratory studies, so-called tier 2 studies. The ranges are finally selected by the relevant CLSI subcommittee, based largely on visual inspection and a few simple rules. We have developed statistical methods for analyzing the data from tier 2 studies and applied them to QC strain-antimicrobial agent combinations from 178 dilution testing data sets and 48 disk diffusion data sets, including a method for identifying possible outlier data from individual laboratories. The methods are based on the fact that dilution testing MIC data were log-normally distributed and disk diffusion zone diameter data were normally distributed. For dilution testing, compared to QC ranges actually set by CLSI, calculated ranges were identical in 68% of cases, narrower in 7% of cases, and wider in 14% of cases. For disk diffusion testing, calculated ranges were identical to CLSI ranges in 33% of cases, narrower in 8% of cases, and 1 to 2 mm wider in 58% of cases. Possible outliers were detected in 8% of the dilution testing data sets but in none of the disk diffusion data sets. Application of statistical techniques to the analysis of QC tier 2 data and the setting of QC ranges is relatively simple to perform on spreadsheets, and the output enhances the current CLSI methods for setting of QC ranges. PMID:17438045
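The distributional assumptions described above (log-normal MIC data, normal zone diameters) suggest a simple range calculation. A minimal sketch follows; the MIC values and the rounding-to-whole-dilution-steps rule are illustrative assumptions, not the CLSI procedure:

```python
import math
import statistics

def qc_range_dilution(mics, coverage=3.0):
    """Estimate a QC range for dilution (MIC) data, treated as
    log-normal: work on log2(MIC), take mean +/- coverage*SD,
    then round outward to whole two-fold dilution steps."""
    logs = [math.log2(m) for m in mics]
    mu = statistics.mean(logs)
    sd = statistics.stdev(logs)
    lo = 2 ** math.floor(mu - coverage * sd)
    hi = 2 ** math.ceil(mu + coverage * sd)
    return lo, hi

# hypothetical multilaboratory MICs (ug/mL) for one QC strain/agent pair
mics = [0.25, 0.5, 0.5, 0.5, 1.0, 0.5, 0.25, 1.0, 0.5, 0.5]
print(qc_range_dilution(mics))
```

Zone-diameter data would be handled the same way but on the raw (normal) scale rather than the log2 scale.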
Bayesian Model Averaging for Propensity Score Analysis
ERIC Educational Resources Information Center
Kaplan, David; Chen, Jianshen
2013-01-01
The purpose of this study is to explore Bayesian model averaging in the propensity score context. Previous research on Bayesian propensity score analysis does not take into account model uncertainty. In this regard, an internally consistent Bayesian framework for model building and estimation must also account for model uncertainty. The…
NASA Technical Reports Server (NTRS)
Safford, Robert R.; Jackson, Andrew E.; Swart, William W.; Barth, Timothy S.
1994-01-01
Successful ground processing at KSC requires that flight hardware and ground support equipment conform to specifications at tens of thousands of checkpoints. Knowledge of conformance is an essential requirement for launch. That knowledge of conformance at every requisite point does not, however, enable identification of past problems with equipment, or potential problem areas. This paper describes how the introduction of Statistical Process Control and Process Capability Analysis identification procedures into existing shuttle processing procedures can enable identification of potential problem areas and candidates for improvements to increase processing performance measures. Results of a case study describing application of the analysis procedures to Thermal Protection System processing are used to illustrate the benefits of the approaches described in the paper.
Mohammed, Mohammed A; Panesar, Jagdeep S; Laney, David B; Wilson, Richard
2013-04-01
The use of statistical process control (SPC) charts in healthcare is increasing. The primary purpose of SPC is to distinguish between common-cause variation, which is attributable to the underlying process, and special-cause variation, which is extrinsic to the underlying process. This is important because improvement under common-cause variation requires action on the process, whereas special-cause variation merits an investigation to first find the cause. Nonetheless, when dealing with attribute or count data (eg, number of emergency admissions) involving very large sample sizes, traditional SPC charts often produce tight control limits with most of the data points appearing outside the control limits. This can give a false impression of common- and special-cause variation, and potentially misguide the user into taking the wrong actions. Given the growing availability of large datasets from routinely collected databases in healthcare, there is a need to present a review of this problem (which arises because traditional attribute charts only consider within-subgroup variation) and its solutions (which consider within- and between-subgroup variation), which involve the use of the well-established measurements chart and the more recently developed attribute charts based on Laney's innovative approach. We close by making some suggestions for practice. PMID:23365140
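Laney's approach mentioned above widens attribute-chart limits by a between-subgroup variation factor estimated from z-scores. A minimal p'-chart sketch (the counts and subgroup sizes are hypothetical):

```python
import math

def laney_p_prime_limits(counts, sizes):
    """Control limits for a Laney p'-chart: convert subgroup proportions
    to z-scores under the Binomial model, estimate between-subgroup
    variation (sigma_z) from the moving range of z, and inflate the
    usual p-chart limits by that factor."""
    p = [c / n for c, n in zip(counts, sizes)]
    pbar = sum(counts) / sum(sizes)
    sig = [math.sqrt(pbar * (1 - pbar) / n) for n in sizes]
    z = [(pi - si_p) / si for pi, si_p, si in zip(p, [pbar] * len(p), sig)]
    mr = [abs(z[i] - z[i - 1]) for i in range(1, len(z))]
    sigma_z = (sum(mr) / len(mr)) / 1.128  # average moving range -> sigma
    return [(pbar - 3 * si * sigma_z, pbar + 3 * si * sigma_z) for si in sig]

# hypothetical monthly event counts and admissions
counts = [45, 52, 60, 38]
sizes = [1200, 950, 1400, 800]
limits = laney_p_prime_limits(counts, sizes)
```

When sigma_z is near 1 the limits coincide with the ordinary p-chart; with large subgroups and real between-month variation sigma_z exceeds 1 and the limits widen accordingly.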
Evidence cross-validation and Bayesian inference of MAST plasma equilibria
NASA Astrophysics Data System (ADS)
von Nessi, G. T.; Hole, M. J.; Svensson, J.; Appel, L.
2012-01-01
In this paper, current profiles for plasma discharges on the Mega Ampere Spherical Tokamak (MAST) are directly calculated from pickup coil, flux loop, and motional Stark effect observations via methods based in the statistical theory of Bayesian analysis. By representing toroidal plasma current as a series of axisymmetric current beams with rectangular cross-section and inferring the current for each one of these beams, flux-surface geometry and q-profiles are subsequently calculated by elementary application of the Biot-Savart law. The use of this plasma model in the context of Bayesian analysis was pioneered by Svensson and Werner on the Joint European Torus [Svensson and Werner, Plasma Phys. Controlled Fusion 50(8), 085002 (2008)]. In this framework, linear forward models are used to generate diagnostic predictions, and the probability distribution for the currents in the collection of plasma beams is subsequently calculated directly via application of Bayes' formula. In this work, we introduce a new diagnostic technique to identify and remove outlier observations associated with diagnostics falling out of calibration or suffering from an unidentified malfunction. These modifications enable good agreement between Bayesian inference of the last-closed flux-surface and other corroborating data, such as that from force balance considerations using EFIT++ [Appel et al., "A unified approach to equilibrium reconstruction," Proceedings of the 33rd EPS Conference on Plasma Physics (Rome, Italy, 2006)]. In addition, this analysis also yields errors on the plasma current profile and flux-surface geometry, as well as directly predicting the Shafranov shift of the plasma core.
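The linear-Gaussian structure described (linear forward models plus Bayes' formula) admits a closed-form posterior over beam currents. A toy sketch, with a random matrix standing in for the real diagnostic response geometry:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward model: d = R @ I + noise, where I holds the
# currents in a few axisymmetric beams and R maps them to magnetic
# diagnostic predictions (pickup coils, flux loops).
n_beams, n_obs = 4, 12
R = rng.normal(size=(n_obs, n_beams))
I_true = np.array([0.8, 1.2, 0.5, 0.9])
sigma_d, sigma_I = 0.05, 10.0  # observation noise SD, broad Gaussian prior SD
d = R @ I_true + rng.normal(0.0, sigma_d, n_obs)

# With a Gaussian prior and Gaussian likelihood, Bayes' formula gives the
# posterior in closed form (a linear-Gaussian inverse problem):
P = np.linalg.inv(R.T @ R / sigma_d**2 + np.eye(n_beams) / sigma_I**2)
I_post = P @ (R.T @ d) / sigma_d**2  # posterior mean; P is the covariance
```

The diagonal of P supplies exactly the kind of uncertainty on the current profile that the abstract refers to; flux surfaces would then follow from the inferred beam currents via Biot-Savart.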
Using Bayesian analysis in repeated preclinical in vivo studies for a more effective use of animals.
Walley, Rosalind; Sherington, John; Rastrick, Joe; Detrait, Eric; Hanon, Etienne; Watt, Gillian
2016-05-01
Whilst innovative Bayesian approaches are increasingly used in clinical studies, in the preclinical area Bayesian methods appear to be rarely used in the reporting of pharmacology data. This is particularly surprising in the context of regularly repeated in vivo studies where there is a considerable amount of data from historical control groups, which has potential value. This paper describes our experience with introducing Bayesian analysis for such studies using a Bayesian meta-analytic predictive approach. This leads naturally either to an informative prior for a control group as part of a full Bayesian analysis of the next study, or to a predictive distribution that replaces a control group entirely. We use quality control charts to illustrate study-to-study variation to the scientists and describe informative priors in terms of their approximate effective numbers of animals. We describe two case studies of animal models: the lipopolysaccharide-induced cytokine release model used in inflammation and the novel object recognition model used to screen cognitive enhancers, both of which show the advantage of a Bayesian approach over the standard frequentist analysis. We conclude that using Bayesian methods in stable repeated in vivo studies can result in a more effective use of animals, either by reducing the total number of animals used or by increasing the precision of key treatment differences. This will lead to clearer results and supports the "3Rs initiative" to Refine, Reduce and Replace animals in research. Copyright © 2016 John Wiley & Sons, Ltd. PMID:27028721
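Describing an informative prior by its approximate effective number of animals has a simple normal-approximation form. This is a sketch under our own assumptions (a normal prior on a control-group mean), not the authors' exact calculation:

```python
def effective_n_animals(prior_sd, residual_sd):
    """Approximate effective number of animals carried by a normal prior
    on a control-group mean: a prior with SD tau carries about
    (sigma/tau)^2 animals' worth of information, where sigma is the
    between-animal SD, because the SE of a mean of n animals is
    sigma/sqrt(n)."""
    return (residual_sd / prior_sd) ** 2

# hypothetical values: MAP prior SD 0.5 on the log-cytokine scale,
# between-animal SD 1.5 -> the prior is worth about 9 animals
print(effective_n_animals(0.5, 1.5))
```

This is the quantity that lets scientists judge how much a historical-control prior can reduce the concurrent control group size.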
Determination of absolute structure using Bayesian statistics on Bijvoet differences
Hooft, Rob W. W.; Straver, Leo H.; Spek, Anthony L.
2008-01-01
A new probabilistic approach is introduced for the determination of the absolute structure of a compound which is known to be enantiopure based on Bijvoet-pair intensity differences. The new method provides relative probabilities for different models of the chiral composition of the structure. The outcome of this type of analysis can also be cast in the form of a new value, along with associated standard uncertainty, that resembles the value of the well known Flack x parameter. The standard uncertainty we obtain is often about half of the standard uncertainty in the value of the Flack x parameter. The proposed formalism is suited in particular to absolute configuration determination from diffraction data of biologically active (pharmaceutical) compounds where the strongest resonant scattering signal often comes from oxygen. It is shown that a reliable absolute configuration assignment in such cases can be made on the basis of Cu Kα data, and in some cases even with carefully measured Mo Kα data. PMID:19461838
Universal efficiency at optimal work with Bayesian statistics.
Johal, Ramandeep S
2010-12-01
If the work per cycle of a quantum heat engine is averaged over an appropriate prior distribution for an external parameter a, the work becomes optimal at the Curzon-Ahlborn (CA) efficiency. More general priors of the form Π(a) ∝ 1/a^γ yield optimal work at an efficiency which stays close to the CA value; in particular, near equilibrium the efficiency scales as one-half of the Carnot value. This feature is analogous to one recently observed in the literature for certain models of finite-time thermodynamics. Further, the use of Bayes' theorem implies that the work estimated with posterior probabilities also bears close analogy with the classical formula. These findings suggest that the notion of prior information can be used to reveal thermodynamic features in quantum systems, thus pointing to a connection between thermodynamic behavior and the concept of information. PMID:21230650
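For reference, the Curzon-Ahlborn efficiency invoked above and its near-equilibrium expansion, whose leading term gives the one-half-of-Carnot scaling (standard textbook results, not restated from this abstract):

```latex
\eta_{\mathrm{CA}} = 1-\sqrt{\frac{T_c}{T_h}}
 = 1-\sqrt{1-\eta_C}
 \approx \frac{\eta_C}{2}+\frac{\eta_C^{2}}{8}+\cdots,
\qquad \eta_C = 1-\frac{T_c}{T_h}.
```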
Gao, S; Meyer, R; Shi, L; D'Souza, W; Zhang, H
2014-06-15
Purpose: To apply a statistical modeling approach, threshold modeling (TM), for quality control of intensity-modulated radiation therapy (IMRT) treatment plans. Methods: A quantitative measure, the weighted sum of violations of dose/dose-volume constraints, was first developed to represent the quality of each IMRT plan. The threshold modeling approach, which is an extension of extreme value theory in statistics and an effective way to model extreme values, was then applied to analyze the quality of the plans as summarized by our quantitative measure. Our approach modeled the plans generated by planners as a series of independent and identically distributed random variables and described their behavior when plan quality was controlled below a certain threshold. We tested our approach retrospectively with five locally advanced head and neck cancer patients. Two statistics were incorporated for numerical analysis: the probability of quality improvement (PQI) of the plans and the expected amount of improvement on the quantitative measure (EQI). Results: After clinical planners generated 15 plans for each patient, we applied our approach to obtain the PQI and EQI as if planners were to generate an additional 15 plans. For two of the patients, the PQI was significantly higher than for the other three (0.17 and 0.18 compared to 0.08, 0.01 and 0.01). The actual percentage of the additional 15 plans that outperformed the best of the initial 15 plans was 20% and 27%, compared to 11%, 0% and 0%. EQI for the two patients with improvement potential was 34.5 and 32.9, while for the remaining three patients it was 9.9, 1.4 and 6.6. The actual improvements obtained were 28.3 and 20.5, compared to 6.2, 0 and 0. Conclusion: TM is capable of reliably identifying the potential quality improvement of IMRT plans. It provides clinicians an effective tool to assess the trade-off between extra planning effort and achievable plan quality. This work was supported in part by NIH/NCI grant CA130814.
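Threshold modeling of this kind is typically done by fitting a generalized Pareto distribution to exceedances past a threshold. The sketch below is a hypothetical illustration of that generic technique (invented scores, our own threshold choice and tail orientation), not the authors' implementation:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical plan-quality scores (weighted constraint violations; lower
# is better) for 15 plans. Extreme value theory models the favorable tail:
# fit a generalized Pareto distribution (GPD) to exceedances below a low
# threshold, then extrapolate how much better additional plans might get.
scores = rng.gamma(shape=4.0, scale=5.0, size=15)
u = np.quantile(scores, 0.4)        # threshold: model the best (lowest) 40%
exceed = u - scores[scores < u]     # flip sign so the good tail is positive
shape, loc, scale = stats.genpareto.fit(exceed, floc=0.0)

# probability that one new plan beats the current best plan, under the
# fitted tail model (analogue of a PQI-style calculation)
p_tail = np.mean(scores < u)
best_gap = u - scores.min()
p_beat_best = p_tail * stats.genpareto.sf(best_gap, shape, loc=0.0, scale=scale)
```

With only 15 plans the fit is rough; the point is the mechanism, extrapolating beyond the observed best via the fitted tail.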
Bayesian Integration of Spatial Information
ERIC Educational Resources Information Center
Cheng, Ken; Shettleworth, Sara J.; Huttenlocher, Janellen; Rieser, John J.
2007-01-01
Spatial judgments and actions are often based on multiple cues. The authors review a multitude of phenomena on the integration of spatial cues in diverse species to consider how nearly optimally animals combine the cues. Under the banner of Bayesian perception, cues are sometimes combined and weighted in a near optimal fashion. In other instances…
Word Learning as Bayesian Inference
ERIC Educational Resources Information Center
Xu, Fei; Tenenbaum, Joshua B.
2007-01-01
The authors present a Bayesian framework for understanding how adults and children learn the meanings of words. The theory explains how learners can generalize meaningfully from just one or a few positive examples of a novel word's referents, by making rational inductive inferences that integrate prior knowledge about plausible word meanings with…
Bayesian analysis of rare events
NASA Astrophysics Data System (ADS)
Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang
2016-06-01
In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
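The rejection-sampling reinterpretation behind BUS can be shown in a few lines. This is a toy normal-mean example with our own prior and data; in real BUS the raw accept/reject step is replaced by FORM, importance sampling or Subset Simulation for efficiency:

```python
import math
import random

random.seed(0)

# BUS recasts Bayesian updating as a rare-event problem: draw (theta, u)
# from prior x Uniform(0,1) and accept when u <= L(theta)/c, where c is an
# upper bound on the likelihood. Accepted thetas are posterior draws, and
# the acceptance event is exactly the "failure domain" that rare-event
# methods are built to sample.
def likelihood(theta, data, sigma=1.0):
    return math.exp(-sum((x - theta) ** 2 for x in data) / (2 * sigma**2))

data = [0.9, 1.1, 1.3]
c = 1.0  # valid bound: this unnormalized likelihood never exceeds 1
posterior = []
for _ in range(200_000):
    theta = random.gauss(0.0, 2.0)  # prior N(0, 2^2)
    if random.random() <= likelihood(theta, data) / c:
        posterior.append(theta)

print(sum(posterior) / len(posterior))  # near the conjugate posterior mean
```

For this conjugate setup the exact posterior mean is (3 x 1.1)/(3 + 1/4) ≈ 1.015, which the accepted samples recover.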
Mohammed, Mohammed A; Worthington, Peter
2013-03-01
The use of statistical process control (SPC) charts in healthcare is increasing. The general advice when plotting SPC charts is to begin by selecting the right chart. This advice, in the case of attribute data, may be limiting our insights into the underlying process and consequently be potentially misleading. Given the general lack of awareness that additional insights may be obtained by using more than one SPC chart, there is a need to review this issue and make some recommendations. Under purely common cause variation the control limits on the xmr-chart and traditional attribute charts (eg, p-chart, c-chart, u-chart) will be in close agreement, indicating that the observed variation (xmr-chart) is consistent with the underlying Binomial model (p-chart) or Poisson model (c-chart, u-chart). However, when there is a material difference between the limits from the xmr-chart and the attribute chart then this also constitutes a signal of an underlying systematic special cause of variation. We use one simulation and two case studies to demonstrate these ideas and show the utility of plotting the SPC chart for attribute data alongside an xmr-chart. We conclude that the combined use of attribute charts and xmr-charts, which requires little additional effort, is a useful strategy because it is less likely to mislead us and more likely to give us the insight to do the right thing. PMID:23104897
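The side-by-side comparison advocated above is easy to compute. A minimal sketch with hypothetical monthly proportions (the 1.128 constant is the standard d2 value for moving ranges of two):

```python
import math

def p_chart_limits(counts, sizes):
    """Binomial-model (p-chart) limits, evaluated at the average subgroup size."""
    pbar = sum(counts) / sum(sizes)
    nbar = sum(sizes) / len(sizes)
    s = math.sqrt(pbar * (1 - pbar) / nbar)
    return pbar - 3 * s, pbar + 3 * s

def xmr_limits(values):
    """Empirical (xmr-chart) limits from the moving range of plotted values."""
    mean = sum(values) / len(values)
    mr = [abs(values[i] - values[i - 1]) for i in range(1, len(values))]
    sigma = (sum(mr) / len(mr)) / 1.128
    return mean - 3 * sigma, mean + 3 * sigma

# hypothetical monthly event counts and denominators
counts = [30, 28, 35, 26, 33, 31, 29, 34]
sizes = [400, 380, 420, 390, 410, 400, 395, 415]
props = [c / n for c, n in zip(counts, sizes)]
p_lo, p_hi = p_chart_limits(counts, sizes)
x_lo, x_hi = xmr_limits(props)
# close agreement between (p_lo, p_hi) and (x_lo, x_hi) is consistent with
# a pure Binomial process; a material difference signals a systematic
# between-subgroup special cause, per the abstract's argument
```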
NASA Technical Reports Server (NTRS)
1982-01-01
A FORTRAN coded computer program and method to predict the reaction control fuel consumption statistics for a three-axis stabilized rocket vehicle upper stage is described. A Monte Carlo approach is used, made more efficient by closed-form estimates of impulses. The effects of rocket motor thrust misalignment, static unbalance, aerodynamic disturbances, and deviations in trajectory, mass properties and control system characteristics are included. This routine can be applied to many types of on-off reaction controlled vehicles. The pseudorandom number generation and statistical analysis subroutines, including the output histograms, can be used for other Monte Carlo analysis problems.
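The Monte Carlo scheme described, sampling the disturbances and summarizing fuel consumption via closed-form impulse estimates, might be sketched as follows. Every disturbance model and coefficient here is an invented placeholder, not from the program:

```python
import random
import statistics

random.seed(42)

def fuel_per_flight(thrust_misalign_deg, static_unbalance, aero_torque):
    """Hypothetical closed-form impulse estimate: fuel use grows with each
    sampled disturbance instead of being integrated through a full
    attitude-control simulation, which is what makes the Monte Carlo
    approach efficient."""
    return (5.0 + 40.0 * abs(thrust_misalign_deg)
            + 2.0 * static_unbalance + 0.5 * aero_torque)

# sample the disturbance parameters and accumulate consumption statistics
samples = [
    fuel_per_flight(random.gauss(0.0, 0.25),     # misalignment (deg)
                    random.uniform(0.0, 1.0),    # static unbalance
                    random.expovariate(1 / 4.0)) # aero disturbance torque
    for _ in range(10_000)
]
mean_fuel = statistics.mean(samples)
p99_fuel = statistics.quantiles(samples, n=100)[98]  # ~99th percentile budget
```

The mean sizes the nominal load while a high percentile sizes the fuel budget margin, the kind of statistic the histogram output supports.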
ERIC Educational Resources Information Center
Press, S. James; Tanur, Judith M.
1991-01-01
Relevance of the intersection of sociology, statistics, and public policy to the study of quality control in three family assistance programs--food stamps, Aid to Families with Dependent Children (AFDC), and Medicaid--is reviewed using a study by the National Academy of Sciences of methods for improving quality control systems. (SLD)
SU-E-CAMPUS-T-04: Statistical Process Control for Patient-Specific QA in Proton Beams
LAH, J; SHIN, D; Kim, G
2014-06-15
Purpose: To evaluate and improve the reliability of the proton QA process and to provide an optimal customized tolerance level using statistical process control (SPC) methodology, with the aim of suggesting suitable guidelines for the patient-specific QA process. Methods: We investigated the constancy of the dose output and range to see whether they were within the tolerance level of the daily QA process. This study analyzed the difference between the measured and calculated ranges along the central axis to suggest suitable guidelines for patient-specific QA in proton beams by using process capability indices. In this study, patient QA plans were classified into 6 treatment sites: head and neck (41 cases), spinal cord (29 cases), lung (28 cases), liver (30 cases), pancreas (26 cases), and prostate (24 cases). Results: The deviations for the dose output and range of the daily QA process were ±0.84% and ±0.19%, respectively. Our results show that the patient-specific range measurements are capable at a specification limit of ±2% in all treatment sites except spinal cord cases. In spinal cord cases, comparison of process capability indices (Cp, Cpm, Cpk ≥1, but Cpmk ≤1) indicated that the process is capable but not centered: the process mean deviates from its target value. The UCL (upper control limit), CL (center line) and LCL (lower control limit) for spinal cord cases were 1.37%, −0.27% and −1.89%, respectively. On the other hand, the range differences in prostate cases showed good agreement between calculated and measured values. The UCL, CL and LCL for prostate cases were 0.57%, −0.11% and −0.78%, respectively. Conclusion: SPC methodology has potential as a useful tool to customize optimal tolerance levels and to suggest suitable guidelines for patient-specific QA in clinical proton beams.
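The process capability indices cited (Cp, Cpk, Cpm) have standard definitions. A minimal sketch with hypothetical range-difference data and an illustrative ±2% specification:

```python
import math
import statistics

def capability_indices(data, lsl, usl, target=0.0):
    """Standard capability indices: Cp ignores centering, Cpk penalizes an
    off-center mean relative to the nearer spec limit, and Cpm penalizes
    deviation of the mean from the target value via the Taguchi sigma."""
    mu = statistics.mean(data)
    sd = statistics.stdev(data)
    cp = (usl - lsl) / (6 * sd)
    cpk = min(usl - mu, mu - lsl) / (3 * sd)
    tau = math.sqrt(sd**2 + (mu - target) ** 2)
    cpm = (usl - lsl) / (6 * tau)
    return cp, cpk, cpm

# hypothetical measured-minus-calculated range differences (%) at one site
diffs = [-0.4, -0.2, 0.0, 0.2, 0.4]
cp, cpk, cpm = capability_indices(diffs, lsl=-2.0, usl=2.0)
```

A pattern like Cp, Cpk ≥ 1 with Cpm (or Cpmk) < 1 is exactly the "capable but not centered" signature the abstract reports for spinal cord cases.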
Predictive Bayesian inference and dynamic treatment regimes.
Saarela, Olli; Arjas, Elja; Stephens, David A; Moodie, Erica E M
2015-11-01
While optimal dynamic treatment regimes (DTRs) can be estimated without specification of a predictive model, a model-based approach, combined with dynamic programming and Monte Carlo integration, enables direct probabilistic comparisons between the outcomes under the optimal DTR and alternative (dynamic or static) treatment regimes. The Bayesian predictive approach also circumvents problems related to frequentist estimators under the nonregular estimation problem. However, the model-based approach is susceptible to misspecification, in particular of the "null-paradox" type, which is due to the model parameters not having a direct causal interpretation in the presence of latent individual-level characteristics. Because it is reasonable to insist on correct inferences under the null of no difference between the alternative treatment regimes, we discuss how to achieve this through a "null-robust" reparametrization of the problem in a longitudinal setting. Since we argue that causal inference can be entirely understood as posterior predictive inference in a hypothetical population without covariate imbalances, we also discuss how controlling for confounding through inverse probability of treatment weighting can be justified and incorporated in the Bayesian setting. PMID:26259996
Bayesian networks for evaluation of evidence from forensic entomology.
Andersson, M Gunnar; Sundström, Anders; Lindström, Anders
2013-09-01
In the aftermath of a CBRN incident, there is an urgent need to reconstruct events in order to bring the perpetrators to court and to take preventive actions for the future. The challenge is to discriminate, based on available information, between alternative scenarios. Forensic interpretation is used to evaluate to what extent results from the forensic investigation favor the prosecutors' or the defendants' arguments, using the framework of Bayesian hypothesis testing. Recently, several new scientific disciplines have been used in a forensic context. In the AniBioThreat project, the framework was applied to veterinary forensic pathology, tracing of pathogenic microorganisms, and forensic entomology. Forensic entomology is an important tool for estimating the postmortem interval in, for example, homicide investigations as a complement to more traditional methods. In this article we demonstrate the applicability of the Bayesian framework for evaluating entomological evidence in a forensic investigation through the analysis of a hypothetical scenario involving suspect movement of carcasses from a clandestine laboratory. Probabilities of different findings under the alternative hypotheses were estimated using a combination of statistical analysis of data, expert knowledge, and simulation, and entomological findings are used to update the beliefs about the prosecutors' and defendants' hypotheses and to calculate the value of evidence. The Bayesian framework proved useful for evaluating complex hypotheses using findings from several insect species, accounting for uncertainty about development rate, temperature, and precolonization. The applicability of the forensic statistic approach to evaluating forensic results from a CBRN incident is discussed. PMID:23971824
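The value-of-evidence calculation in the Bayesian hypothesis-testing framework described is a likelihood ratio that updates prior odds. A minimal sketch with hypothetical finding probabilities (not values from the case study):

```python
def value_of_evidence(p_e_given_hp, p_e_given_hd):
    """Likelihood ratio V = P(E|Hp) / P(E|Hd): the factor by which a
    finding E shifts the odds between the prosecution hypothesis Hp and
    the defence hypothesis Hd (Bayes' rule in odds form)."""
    return p_e_given_hp / p_e_given_hd

def posterior_odds(prior_odds, *lrs):
    """Conditionally independent findings (e.g. several insect species)
    contribute multiplicatively to the odds."""
    odds = prior_odds
    for lr in lrs:
        odds *= lr
    return odds

# hypothetical: entomological finding A is 10x more probable under Hp,
# finding B is 2x more probable under Hp; prior odds of 1 become about 20
v_a = value_of_evidence(0.30, 0.03)
v_b = value_of_evidence(0.20, 0.10)
odds = posterior_odds(1.0, v_a, v_b)
```

In practice the conditional probabilities themselves come from the Bayesian network, which handles the dependence structure and the uncertainty about development rate, temperature and precolonization that the abstract describes.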